Sample records for remote computational resources

  1. Impact of remote sensing upon the planning, management, and development of water resources

    NASA Technical Reports Server (NTRS)

    Loats, H. L.; Fowler, T. R.; Frech, S. L.

    1974-01-01

    A survey of the principal water resource users was conducted to determine the impact of new remote data streams on hydrologic computer models. Analysis of the responses and direct contact demonstrated that: (1) the majority of water resource effort of the type suited to remote sensing inputs is conducted by major federal water resources agencies or through federally stimulated research; (2) the federal government develops most of the hydrologic models used in this effort; and (3) federal computer power is extensive. The computers, computer power, and hydrologic models in current use were determined.

  2. Impact of remote sensing upon the planning, management, and development of water resources

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A.; Loats, H. L.; Fowler, T. R.; Frech, S. L.

    1975-01-01

    Principal water resources users were surveyed to determine the impact of remote data streams on hydrologic computer models. Analysis of responses demonstrated that most water resources effort suitable to remote sensing inputs is conducted through federal agencies or through federally stimulated research, and most hydrologic models suitable to remote sensing data are federally developed. Computer usage by major water resources users was analyzed to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of data processing loads were described and applied to project future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era.

  3. On-demand provisioning of HEP compute resources on cloud sites and shared HPC centers

    NASA Astrophysics Data System (ADS)

    Erli, G.; Fischer, F.; Fleig, G.; Giffels, M.; Hauth, T.; Quast, G.; Schnepf, M.; Heese, J.; Leppert, K.; Arnaez de Pedro, J.; Sträter, R.

    2017-10-01

    This contribution reports on solutions, experiences, and recent developments in the dynamic, on-demand provisioning of remote computing resources for analysis and simulation workflows. Local resources of a physics institute are extended by private and commercial cloud sites, ranging from desktop clusters through institute clusters to HPC centers. Rather than relying on dedicated HEP computing centers, it is nowadays more reasonable and flexible to utilize remote computing capacity via virtualization techniques or container concepts. We report on recent experience from incorporating a remote HPC center (NEMO Cluster, Freiburg University) and resources dynamically requested from the commercial provider 1&1 Internet SE into our institute's computing infrastructure. The Freiburg HPC resources are requested via the standard batch system, allowing HPC and HEP applications to be executed simultaneously, such that regular batch jobs run side by side with virtual machines managed via OpenStack [1]. For the inclusion of the 1&1 commercial resources, a Python API and SDK as well as the possibility to upload images were available. Large-scale tests proved the capability to serve the scientific use case in the European 1&1 datacenters. The described environment at the Institute of Experimental Nuclear Physics (IEKP) at KIT serves the needs of researchers participating in the CMS and Belle II experiments. In total, resources exceeding half a million CPU hours have been provided by remote sites.

  4. System design and implementation of digital-image processing using computational grids

    NASA Astrophysics Data System (ADS)

    Shen, Zhanfeng; Luo, Jiancheng; Zhou, Chenghu; Huang, Guangyu; Ma, Weifeng; Ming, Dongping

    2005-06-01

    As a special type of digital image, remotely sensed images are playing increasingly important roles in our daily lives. Because of the enormous amounts of data involved and the difficulties of data processing and transfer, an important issue for current computer and geoscience experts is developing internet technology to enable rapid remotely sensed image processing. Computational grids can solve this problem effectively. These networks of computer workstations enable the sharing of data and resources, and are used by computer experts to address imbalances and lopsided usage of network resources. In China, computational grids combined with spatial-information-processing technology have formed a new technology: spatial-information grids. In the field of remotely sensed images, spatial-information grids work more effectively for network computing, data processing, resource sharing, task cooperation, and so on. This paper focuses mainly on the application of computational grids to digital-image processing. Firstly, we describe the architecture of digital-image processing based on computational grids; its implementation is then discussed in detail with respect to middleware technology. The whole network-based intelligent image-processing system is evaluated through experimental analysis of remotely sensed image-processing tasks; the results confirm the feasibility of applying computational grids to digital-image processing.

  5. Dynamic provisioning of local and remote compute resources with OpenStack

    NASA Astrophysics Data System (ADS)

    Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.

    2015-12-01

    Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events and for Monte Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rising complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed there. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution for virtualizing hardware and offering additional services like storage and virtual machine management. This contribution reports on the incorporation of the institute's desktop machines into a private OpenStack cloud. The additional compute resources provisioned via the virtual machines have been used for Monte Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point of entry for the user. Evaluations of the performance and stability of this setup, as well as operational experiences, will be discussed.
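
    The single-point-of-entry idea described above can be illustrated with a minimal scheduler sketch. This is not the EKP implementation (which uses OpenStack and the site batch system); pool names and capacities here are purely hypothetical, and the example only shows the scheduling concept of merging local and remote slots behind one submission interface.

```python
class Pool:
    """A compute pool with a fixed number of free job slots
    (names and capacities are illustrative, not the EKP setup)."""
    def __init__(self, name, slots):
        self.name, self.slots = name, slots

def submit(job, pools):
    """Single point of entry: place the job on the first pool that
    still has a free slot, regardless of where it physically runs."""
    for pool in pools:
        if pool.slots > 0:
            pool.slots -= 1
            return f"{job} -> {pool.name}"
    return f"{job} -> queued"

# A uniform virtual cluster built from a local desktop cloud
# and remotely provisioned HPC virtual machines:
cluster = [Pool("desktop-cloud", 1), Pool("hpc-vm", 1)]
print(submit("mc-sim-1", cluster))  # mc-sim-1 -> desktop-cloud
print(submit("mc-sim-2", cluster))  # mc-sim-2 -> hpc-vm
print(submit("mc-sim-3", cluster))  # mc-sim-3 -> queued
```

    The user always calls the same `submit`; whether the job lands on a desktop machine, a provisioned VM, or the queue is decided transparently, which is the essence of the uniform virtual cluster described in the abstract.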

  6. Information Power Grid Posters

    NASA Technical Reports Server (NTRS)

    Vaziri, Arsi

    2003-01-01

    This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provide seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high-data-rate instruments, and an exploratory grid environment.

  7. Controlling user access to electronic resources without password

    DOEpatents

    Smith, Fred Hewitt

    2015-06-16

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes pre-determining an association of the restricted computer resource and computer-resource-proximal environmental information. Indicia of user-proximal environmental information are received from a user requesting access to the restricted computer resource. Received indicia of user-proximal environmental information are compared to the associated computer-resource-proximal environmental information. User access to the restricted computer resource is selectively granted responsive to a favorable comparison, in which the user-proximal environmental information is sufficiently similar to the computer-resource-proximal environmental information. In at least some embodiments, the process further includes receiving a user-supplied biometric measure and comparing it with a predetermined association of at least one biometric measure of an authorized user. Access to the restricted computer resource is granted in response to a favorable comparison.
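
    The patent's "sufficiently similar" comparison step can be sketched as a set-similarity check. The patent does not specify a similarity metric or indicia format; the Jaccard similarity, the threshold, and the Wi-Fi-style identifiers below are illustrative assumptions only.

```python
def jaccard(a, b):
    """Similarity of two sets of environmental indicia, in [0.0, 1.0]."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 1.0

def grant_access(user_indicia, resource_indicia, threshold=0.6):
    """Grant access only when the user-proximal indicia are
    sufficiently similar to the resource-proximal information.
    The 0.6 threshold is a hypothetical policy choice."""
    return jaccard(user_indicia, resource_indicia) >= threshold

# A user physically near the resource observes mostly the same
# environmental identifiers (hypothetical access-point names):
near = grant_access({"ap-lab-1", "ap-lab-2", "ap-hall"},
                    {"ap-lab-1", "ap-lab-2", "ap-hall", "ap-stair"})
far = grant_access({"ap-cafe"}, {"ap-lab-1", "ap-lab-2"})
print(near, far)  # True False
```

    A real embodiment would also fold in the biometric comparison the abstract mentions; here only the environmental-similarity gate is sketched.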

  8. An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing

    NASA Astrophysics Data System (ADS)

    Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.

    2015-07-01

    Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. Providing cloud users with an effective way to access and analyse these massive spatiotemporal data from web clients has become an urgent issue. In this paper, we propose a new scalable, interactive, web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform that provides the end user with a safe and convenient way to access massive remote sensing data stored in the cloud. A lightweight cloud storage system, built on an open-source distributed file system, holds both public data and users' private data: massive remote sensing data are stored as public data, while intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight container technology for the Linux operating system. Open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed in the Docker containers. Users write scripts in IPython Notebook pages through the web browser to process data, and the scripts are submitted to an IPython kernel for execution. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines, and physical machines, we conclude that the Docker-based cloud computing environment makes the best use of host system resources and can handle more concurrent spatiotemporal computing tasks. Docker provides resource isolation for I/O, CPU, and memory, which offers a security guarantee when processing remote sensing data in IPython Notebook. Users can write complex data processing code directly on the web, and can thus design their own data processing algorithms.
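
    As an illustration of the kind of short script a user might run in such a notebook, here is a pure-Python NDVI (normalized difference vegetation index) calculation over two bands. The paper's platform would use NumPy and GDAL for this; the list-based version below is a self-contained stand-in, and the band values are made up.

```python
def ndvi(red, nir):
    """Per-pixel NDVI: (NIR - RED) / (NIR + RED),
    defined as 0.0 where both bands are zero."""
    out = []
    for r, n in zip(red, nir):
        out.append((n - r) / (n + r) if (n + r) else 0.0)
    return out

# Toy reflectance values for three pixels:
red = [0.1, 0.2, 0.3]
nir = [0.5, 0.4, 0.3]
print(ndvi(red, nir))  # approximately [0.667, 0.333, 0.0]
```

    In the platform described above, such a script would be typed into an IPython Notebook cell and executed against rasters read from the cloud storage layer rather than hard-coded lists.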

  9. A computer software system for integration and analysis of grid-based remote sensing data with other natural resource data. Remote Sensing Project

    NASA Technical Reports Server (NTRS)

    Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.

    1977-01-01

    A computer-based information system designed to assist in the integration of commonly available spatial data for regional planning and resource analysis is described. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single-factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. Soil, land cover/use, topographic, and geological maps were used as a data base to develop an eleven-map portfolio. The major themes of the portfolio are land cover/use, non-point water pollution, waste disposal, and ground water recharge.

  10. Remote sensing image ship target detection method based on visual attention model

    NASA Astrophysics Data System (ADS)

    Sun, Yuejiao; Lei, Wuhu; Ren, Xiaodong

    2017-11-01

    Traditional methods of detecting ship targets in remote sensing images mostly use a sliding window to search the whole image exhaustively. However, the target usually occupies only a small fraction of the image, so this approach has high computational complexity for large-format visible image data. The bottom-up selective attention mechanism can selectively allocate computing resources according to visual stimuli, thus improving computational efficiency and reducing the difficulty of analysis. With this in mind, a method of ship target detection in remote sensing images based on a visual attention model is proposed in this paper. The experimental results show that the proposed method reduces computational complexity while improving detection accuracy, and improves the detection efficiency of ship targets in remote sensing images.
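
    The efficiency argument above can be made concrete with a toy saliency stage: instead of scoring every sliding-window position, rank image cells by a simple contrast score and inspect only the most salient ones. This is a drastically simplified stand-in for the paper's visual attention model, and the "ship on sea" values are invented.

```python
def salient_cells(image, k=2):
    """Rank grid cells by contrast against the global mean and return
    the k most salient ones -- a toy stand-in for the bottom-up
    attention stage that replaces exhaustive sliding-window search."""
    mean = sum(sum(row) for row in image) / (len(image) * len(image[0]))
    scores = [(abs(v - mean), (i, j))
              for i, row in enumerate(image)
              for j, v in enumerate(row)]
    return [cell for _, cell in sorted(scores, reverse=True)[:k]]

# Bright 'ship' pixels on a dark 'sea' background (toy values):
sea = [[0, 0, 0, 0],
       [0, 9, 0, 0],
       [0, 0, 0, 8],
       [0, 0, 0, 0]]
print(salient_cells(sea))  # the two bright cells are inspected first
```

    Only the returned candidate cells would then be passed to a (more expensive) ship classifier, which is where the computational savings over full-image search come from.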

  11. Impact of remote sensing upon the planning, management and development of water resources, appendix

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A.; Loats, H. L.; Fowler, T. R.; Frech, S. L.

    1975-01-01

    Lists are presented of water resource agencies from the federal, state, Water Resources Research Institute, university, local, and private sectors. Information is provided on their water resource activities, computers, and models used. For Basic doc., see N75-25263.

  12. mGrid: A load-balanced distributed computing environment for the remote execution of the user-defined Matlab code

    PubMed Central

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-01-01

    Background: Matlab, a powerful and productive language that allows for rapid prototyping, modeling, and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G, and others allow for the remote (and possibly parallel) execution of Matlab commands, with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. Results: mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e., the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open-source environment. Apart from the programming language itself, all other components are also open-source, freely available tools: lightweight PHP scripts and the Apache web server. Conclusion: Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, lightweight PHP scripts, and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over the Internet. PMID:16539707
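
    The "pack run-time variables with the user-defined code and execute remotely" step can be sketched in a few lines. mGrid itself is a Matlab toolbox shipping bundles to PHP-fronted workers; the Python stand-in below only illustrates the pack-ship-run idea, and the serialization format and function names are assumptions for the sketch.

```python
import pickle

def pack_task(source, variables):
    """Bundle user-defined code together with its run-time variables,
    mirroring mGrid's pack-and-distribute step (illustrative stand-in
    for the Matlab mechanism described in the paper)."""
    return pickle.dumps({"source": source, "vars": variables})

def run_task(blob):
    """What a worker node would do: unpack the bundle and execute the
    code in an environment holding the shipped variables.
    Note: exec on untrusted input is unsafe; demo use only."""
    task = pickle.loads(blob)
    env = dict(task["vars"])
    exec(task["source"], env)
    return env["result"]

blob = pack_task("result = x * y", {"x": 6, "y": 7})
print(run_task(blob))  # → 42
```

    In the real system the blob would travel over HTTP to a remote machine chosen by the load balancer; here `run_task` simply plays the worker's role in the same process.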

  13. mGrid: a load-balanced distributed computing environment for the remote execution of the user-defined Matlab code.

    PubMed

    Karpievitch, Yuliya V; Almeida, Jonas S

    2006-03-15

    Matlab, a powerful and productive language that allows for rapid prototyping, modeling, and simulation, is widely used in computational biology. Modeling and simulation of large biological systems often require more computational resources than are available on a single computer. Existing distributed computing environments like the Distributed Computing Toolbox, MatlabMPI, Matlab*G, and others allow for the remote (and possibly parallel) execution of Matlab commands, with varying support for features like an easy-to-use application programming interface, load-balanced utilization of resources, extensibility over the wide area network, and minimal system administration skill requirements. However, all of these environments require some level of access to participating machines to manually distribute the user-defined libraries that the remote call may invoke. mGrid augments the usual process distribution seen in other similar distributed systems by adding facilities for user code distribution. mGrid's client-side interface is an easy-to-use native Matlab toolbox that transparently executes user-defined code on remote machines (i.e., the user is unaware that the code is executing somewhere else). Run-time variables are automatically packed and distributed with the user-defined code, and automated load-balancing of remote resources enables smooth concurrent execution. mGrid is an open-source environment. Apart from the programming language itself, all other components are also open-source, freely available tools: lightweight PHP scripts and the Apache web server. Transparent, load-balanced distribution of user-defined Matlab toolboxes and rapid prototyping of many simple parallel applications can now be done with a single easy-to-use Matlab command. Because mGrid utilizes only Matlab, lightweight PHP scripts, and the Apache web server, installation and configuration are very simple. Moreover, the web-based infrastructure of mGrid allows it to be easily extended over the Internet.

  14. Natural Resource Information System. Volume I. Overall Description.

    ERIC Educational Resources Information Center

    Boeing Computer Services, Inc., Seattle, WA.

    Recognizing the need for the development of a computer based information system which would handle remote sensing as well as conventional mapping data, the Bureau of Indian Affairs and the Bureau of Land Management contracted with Boeing Computer Services for the design and construction of a prototype Natural Resource Information System. The…

  15. Smoke and Air Resource Management-Peering Through the Haze

    Treesearch

    A. R. Fox Riebau

    1987-01-01

    This paper presents a vision of the future rooted in consideration of the past 20 years in the smoke and air resource management field. This future is characterized by rapid technological development of computers for computation, communications, and remote sensing capabilities and of the possible societal responses to these advances. We discuss intellectual...

  16. Remote sensing programs and courses in engineering and water resources

    NASA Technical Reports Server (NTRS)

    Kiefer, R. W.

    1981-01-01

    The content of typical basic and advanced remote sensing and image interpretation courses is described, and typical remote sensing graduate programs of study in civil engineering and in interdisciplinary environmental remote sensing and water resources management programs are outlined. Ideally, graduate programs with an emphasis on remote sensing and image interpretation should be built around a core of five courses: (1) a basic course in fundamentals of remote sensing upon which the more specialized advanced remote sensing courses can build; (2) a course dealing with visual image interpretation; (3) a course dealing with quantitative (computer-based) image interpretation; (4) a basic photogrammetry course; and (5) a basic surveying course. These five courses comprise up to one-half of the course work required for the M.S. degree. The nature of other course work and thesis requirements varies greatly, depending on the department in which the degree is being awarded.

  17. Computer User's Guide to the Protection of Information Resources. NIST Special Publication 500-171.

    ERIC Educational Resources Information Center

    Helsing, Cheryl; And Others

    Computers have changed the way information resources are handled. Large amounts of information are stored in one central place and can be accessed from remote locations. Users have a personal responsibility for the security of the system and the data stored in it. This document outlines the user's responsibilities and provides security and control…

  18. Application of remote sensing to land and water resource planning: The Pocomoke River Basin, Maryland

    NASA Technical Reports Server (NTRS)

    Wildesen, S. E.; Phillips, E. P.

    1981-01-01

    Because of the size of the Pocomoke River Basin, the inaccessibility of certain areas, and study time constraints, several remote sensing techniques were used to collect base information on the river corridor (a 23.2 km channel) and on a 1.2 km wooded floodplain. This information provided an adequate understanding of the environment and its resources, thus enabling effective management options to be designed. The remote sensing techniques used for assessment included manual analysis of high-altitude color-infrared photography, computer-assisted analysis of LANDSAT-2 imagery, and the application of airborne oceanographic lidar for topographic mapping. Results show that each technique was valuable in providing the base data necessary for resource planning.

  19. Controlling user access to electronic resources without password

    DOEpatents

    Smith, Fred Hewitt

    2017-08-22

    Described herein are devices and techniques for remotely controlling user access to a restricted computer resource. The process includes obtaining an image from a communication device of a user. An individual and a landmark are identified within the image. Determinations are made that the individual is the user and that the landmark is a predetermined landmark. Access to a restricted computing resource is granted based on the determining that the individual is the user and that the landmark is the predetermined landmark. Other embodiments are disclosed.

  20. Using multi-level remote sensing and ground data to estimate forest biomass resources in remote regions: a case study in the boreal forests of interior Alaska

    Treesearch

    Hans-Erik Andersen; Strunk Jacob; Hailemariam Temesgen; Donald Atwood; Ken Winterberger

    2012-01-01

    The emergence of a new generation of remote sensing and geopositioning technologies, as well as increased capabilities in image processing, computing, and inferential techniques, have enabled the development and implementation of increasingly efficient and cost-effective multilevel sampling designs for forest inventory. In this paper, we (i) describe the conceptual...

  1. The Penn State ORSER system for processing and analyzing ERTS and other MSS data

    NASA Technical Reports Server (NTRS)

    Mcmurtry, G. J.; Petersen, G. W. (Principal Investigator); Borden, F. Y.; Weeden, H. A.

    1974-01-01

    The author has identified the following significant results. The Office for Remote Sensing of Earth Resources (ORSER) of the Space Science and Engineering Laboratory at the Pennsylvania State University has developed an extensive operational system for processing and analyzing ERTS-1 and similar multispectral data. The ORSER system was developed for use by a wide variety of researchers working in remote sensing. Both photointerpretive techniques and automatic computer processing methods have been developed and used, separately and in a combined approach. A Remote Job Entry system permits use of an IBM 370/168 computer from any compatible remote terminal, including equipment tied in by long-distance telephone connections. An elementary cost analysis has been prepared for the processing of ERTS data.

  2. Remote Data Retrieval for Bioinformatics Applications: An Agent Migration Approach

    PubMed Central

    Gao, Lei; Dai, Hua; Zhang, Tong-Liang; Chou, Kuo-Chen

    2011-01-01

    Several approaches have been developed to retrieve data automatically from one or multiple remote biological data sources. However, most of them require researchers to remain online and wait for returned results, which not only demands a highly available network connection but may also cause network overload. Moreover, so far none of the existing approaches has been designed to address the following problems when retrieving remote data in a mobile network environment: (1) the resources of mobile devices are limited; (2) the network connection is of relatively low quality; and (3) mobile users are not always online. To address these problems, we integrate an agent migration approach with a multi-agent system, overcoming the high-latency and limited-bandwidth problems by moving computations to the required resources or services. More importantly, the approach is well suited to mobile computing environments. Also presented in this paper are the system architecture, the migration strategy, and the security authentication of agent migration. As a demonstration, remote data retrieval from GenBank was used to illustrate the feasibility of the proposed approach. PMID:21701677

  3. Assessing Information on the Internet: Toward Providing Library Services for Computer-Mediated Communication. A Final Report.

    ERIC Educational Resources Information Center

    Dillon, Martin; And Others

    The Online Computer Library Center Internet Resource project focused on the nature of electronic textual information available through remote access using the Internet and the problems associated with creating machine-readable cataloging (MARC) records for these objects using current USMARC format for computer files and "Anglo-American…

  4. A Novel Resource Management Method of Providing Operating System as a Service for Mobile Transparent Computing

    PubMed Central

    Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing that extends PC transparent computing to mobile terminals. Since the resources, which include different kinds of operating systems and user data, are stored on a remote server, managing these network resources effectively is essential. In this paper, we apply quick emulator (QEMU) virtualization and mobile agent technologies for mobile transparent computing (MTC) to devise a method of shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable. PMID:24883353

  5. A novel resource management method of providing operating system as a service for mobile transparent computing.

    PubMed

    Xiong, Yonghua; Huang, Suzhen; Wu, Min; Zhang, Yaoxue; She, Jinhua

    2014-01-01

    This paper presents a framework for mobile transparent computing that extends PC transparent computing to mobile terminals. Since the resources, which include different kinds of operating systems and user data, are stored on a remote server, managing these network resources effectively is essential. In this paper, we apply quick emulator (QEMU) virtualization and mobile agent technologies for mobile transparent computing (MTC) to devise a method of shared resources and services management (SRSM). It has three layers: a user layer, a management layer, and a resource layer. A mobile virtual terminal in the user layer and virtual resource management in the management layer cooperate to maintain the SRSM function accurately according to the user's requirements. An example of SRSM is used to validate this method. Experiment results show that the strategy is effective and stable.

  6. Study on identifying deciduous forest by the method of feature space transformation

    NASA Astrophysics Data System (ADS)

    Zhang, Xuexia; Wu, Pengfei

    2009-10-01

    Extraction of thematic information from remotely sensed imagery has long been one of the puzzling problems facing remote sensing science, and many remote sensing scientists have devoted themselves to research in this domain. Methods of thematic information extraction fall into two kinds, visual interpretation and computer interpretation, whose development is moving toward intelligent and comprehensively modular approaches. This paper develops an intelligent feature space transformation method for extracting deciduous forest thematic information in the Changping district of Beijing. China-Brazil Earth Resources Satellite images received in 2005 are used to extract the deciduous forest coverage area by the feature space transformation method and a linear spectral decomposition method, and the result from remote sensing is similar to the woodland resource census data published by the Chinese forestry bureau in 2004.

  7. A Simple Solution to Providing Remote Access to CD-ROM.

    ERIC Educational Resources Information Center

    Garnham, Carla T.; Brodie, Kent

    1990-01-01

    A pilot project at the Medical College of Wisconsin illustrates how even small computing organizations with limited financial and staff resources can provide remote access to CD-ROM (Compact Disc-Read-Only-Memory) databases, and that providing such convenient access to a vast array of useful information can greatly benefit faculty and students.…

  8. Cross stratum resources protection in fog-computing-based radio over fiber networks for 5G services

    NASA Astrophysics Data System (ADS)

    Guo, Shaoyong; Shao, Sujie; Wang, Yao; Yang, Hui

    2017-09-01

    In order to meet the requirements of the internet of things (IoT) and 5G, the cloud radio access network is a paradigm that converges all base stations' computational resources into a cloud baseband unit (BBU) pool, while the distributed radio frequency signals are collected by remote radio heads (RRH). A precondition for centralized processing in the BBU pool is an interconnecting fronthaul network with high capacity and low delay. However, the interaction between RRH and BBU, and resource scheduling among BBUs in the cloud, have become more complex and frequent. A cloud radio over fiber network was proposed in our previous work. In order to overcome this complexity and latency, in this paper we first present a novel cross stratum resources protection (CSRP) architecture in fog-computing-based radio over fiber networks (F-RoFN) for 5G services. Additionally, a cross stratum protection (CSP) scheme considering network survivability is introduced in the proposed architecture. The CSRP architecture with the CSP scheme can effectively pull remote processing resources local to implement cooperative radio resource management, enhance the responsiveness and resilience to dynamic end-to-end 5G service demands, and globally optimize optical network, wireless, and fog resources. The feasibility and efficiency of the proposed architecture with the CSP scheme are verified on our software defined networking testbed in terms of service latency, transmission success rate, resource occupation rate, and blocking probability.

  9. Cyber-workstation for computational neuroscience.

    PubMed

    Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C

    2010-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.
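
    The abstract names a recursive least-squares (RLS) regressor as an example of a model wired into the CW's block diagram. As a hedged sketch of what such an adaptive block computes (the class name, parameters, and dimensions are illustrative, not the CW's API):

```python
# Minimal recursive least-squares (RLS) regressor sketch, illustrating
# the kind of adaptive model the CW connects in a block diagram.
# Names and default parameters are assumptions for illustration.

class RLSRegressor:
    def __init__(self, n_features, forgetting=0.99, delta=100.0):
        self.w = [0.0] * n_features                     # weight vector
        # Inverse correlation matrix, initialized to delta * I.
        self.P = [[delta if i == j else 0.0 for j in range(n_features)]
                  for i in range(n_features)]
        self.lam = forgetting

    def update(self, x, d):
        """One RLS step: adapt weights so that w . x tracks target d."""
        n = len(x)
        Px = [sum(self.P[i][j] * x[j] for j in range(n)) for i in range(n)]
        denom = self.lam + sum(x[i] * Px[i] for i in range(n))
        k = [p / denom for p in Px]                     # gain vector
        err = d - sum(self.w[i] * x[i] for i in range(n))
        self.w = [self.w[i] + k[i] * err for i in range(n)]
        self.P = [[(self.P[i][j] - k[i] * Px[j]) / self.lam
                   for j in range(n)] for i in range(n)]
        return err

# Fit the noiseless linear target d = 2*x0 - x1.
model = RLSRegressor(2)
samples = [([1.0, 0.0], 2.0), ([0.0, 1.0], -1.0), ([1.0, 1.0], 1.0)] * 20
for x, d in samples:
    model.update(x, d)
```

    In the CW, such a block would receive its inputs from upstream blocks in the user's diagram, with the middleware handling execution on the remote grid.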

  10. Cyber-Workstation for Computational Neuroscience

    PubMed Central

    DiGiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C.; Fortes, Jose; Sanchez, Justin C.

    2009-01-01

    A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface. PMID:20126436

  11. Spatial information technologies for remote sensing today and tomorrow; Proceedings of the Ninth Pecora Symposium, Sioux Falls, SD, October 2-4, 1984

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Topics discussed at the symposium include hardware, geographic information system (GIS) implementation, processing remotely sensed data, spatial data structures, and NASA programs in remote sensing information systems. Attention is also given to GIS applications, advanced techniques, artificial intelligence, graphics, spatial navigation, and classification. Papers are included on the design of computer software for geographic image processing, concepts for a global resource information system, algorithm development for spatial operators, and an application of expert systems technology to remotely sensed image analysis.

  12. Energy and remote sensing. [satellite exploration, monitoring, siting

    NASA Technical Reports Server (NTRS)

    Summers, R. A.; Smith, W. L.; Short, N. M.

    1977-01-01

    Exploration for uranium, thorium, oil, gas and geothermal activity through remote sensing techniques is considered; satellite monitoring of coal-derived CO2 in the atmosphere, and the remote assessment of strip mining and land restoration are also mentioned. Reference is made to color ratio composites based on Landsat data, which may aid in the detection of uranium deposits, and to computer-enhanced black and white airborne scanning imagery, which may locate geothermal anomalies. Other applications of remote sensing to energy resources management, including mapping of transportation networks and power plant siting, are discussed.

  13. Remote sensing impact on corridor selection and placement

    NASA Technical Reports Server (NTRS)

    Thomson, F. J.; Sellman, A. N.

    1975-01-01

    Computer-aided corridor selection techniques, utilizing digitized data bases of socio-economic, census, and cadastral data, and developed for highway corridor routing are considered. Land resource data generated from various remote sensing data sources were successfully merged with the ancillary data files of a corridor selection model and prototype highway corridors were designed using the combined data set. Remote sensing derived information considered useful for highway corridor location, special considerations in geometric correction of remote sensing data to facilitate merging it with ancillary data files, and special interface requirements are briefly discussed.

  14. Mobile-Cloud Assisted Video Summarization Framework for Efficient Management of Remote Sensing Data Generated by Wireless Capsule Sensors

    PubMed Central

    Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook

    2014-01-01

    Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially in remote health-monitoring services. However, during the WCE process, the large amount of captured video data demands a significant amount of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing tasks, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data. PMID:25225874
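
    The redundancy-elimination step above compares the color histograms of consecutive frames with the Jeffrey divergence. A minimal sketch of that comparison, using the numerically stabilized form with the average histogram in the denominator; the redundancy threshold here is illustrative, not the paper's:

```python
import math

def jeffrey_divergence(h1, h2, eps=1e-12):
    """Jeffrey divergence between two normalized histograms.

    Stabilized form: d(p, q) = sum_i p_i*log(p_i/m_i) + q_i*log(q_i/m_i),
    with m_i = (p_i + q_i)/2, so empty bins do not blow up.
    Identical histograms give 0; larger values mean less similar frames.
    """
    d = 0.0
    for p, q in zip(h1, h2):
        m = (p + q) / 2.0 + eps
        d += p * math.log((p + eps) / m) + q * math.log((q + eps) / m)
    return d

def is_redundant(hist_a, hist_b, threshold=0.1):
    # Frames whose color histograms are this close are treated as
    # redundant and dropped from the summary (threshold is illustrative).
    return jeffrey_divergence(hist_a, hist_b) < threshold
```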

  15. Mobile-cloud assisted video summarization framework for efficient management of remote sensing data generated by wireless capsule sensors.

    PubMed

    Mehmood, Irfan; Sajjad, Muhammad; Baik, Sung Wook

    2014-09-15

    Wireless capsule endoscopy (WCE) has great advantages over traditional endoscopy because it is portable and easy to use, especially in remote health-monitoring services. However, during the WCE process, the large amount of captured video data demands a significant amount of computation to analyze and retrieve informative video frames. In order to facilitate efficient WCE data collection and browsing tasks, we present a resource- and bandwidth-aware WCE video summarization framework that extracts the representative keyframes of the WCE video contents by removing redundant and non-informative frames. For redundancy elimination, we use Jeffrey-divergence between color histograms and inter-frame Boolean series-based correlation of color channels. To remove non-informative frames, multi-fractal texture features are extracted to assist the classification using an ensemble-based classifier. Owing to the limited WCE resources, it is impossible for the WCE system to perform computationally intensive video summarization tasks. To resolve computational challenges, mobile-cloud architecture is incorporated, which provides resizable computing capacities by adaptively offloading video summarization tasks between the client and the cloud server. The qualitative and quantitative results are encouraging and show that the proposed framework saves information transmission cost and bandwidth, as well as the valuable time of data analysts in browsing remote sensing data.

  16. Geospatial Analysis and Remote Sensing from Airplanes and Satellites for Cultural Resources Management

    NASA Technical Reports Server (NTRS)

    Giardino, Marco J.; Haley, Bryan S.

    2005-01-01

    Cultural resource management consists of research to identify, evaluate, document and assess cultural resources; planning to assist in decision-making; and stewardship to implement the preservation, protection and interpretation of these decisions and plans. One technique that may be useful in cultural resource management archaeology is remote sensing. It is the acquisition of data and derivative information about objects or materials (targets) located on the Earth's surface or in its atmosphere, using sensors mounted on platforms located at a distance from the targets to measure interactions between the targets and electromagnetic radiation. Included in this definition are systems that acquire imagery by photographic methods and digital multispectral sensors. Data collected by digital multispectral sensors on aircraft and satellite platforms play a prominent role in many earth science applications, including land cover mapping, geology, soil science, agriculture, forestry, water resource management, urban and regional planning, and environmental assessments. Inherent in the analysis of remotely sensed data is the use of computer-based image processing techniques. Geographic information systems (GIS), designed for collecting, managing, and analyzing spatial information, are also useful in the analysis of remotely sensed data. A GIS can be used to integrate diverse types of spatially referenced digital data, including remotely sensed and map data. In archaeology, these tools have been used in various ways to aid in cultural resource projects. For example, they have been used to predict the presence of archaeological resources using modern environmental indicators. Remote sensing techniques have also been used to directly detect the presence of unknown sites based on the impact of past occupation on the Earth's surface.
Additionally, remote sensing has been used as a mapping tool aimed at delineating the boundaries of a site or mapping previously unknown features. All of these applications are pertinent to the goals of site discovery and assessment in cultural resource management.

  17. A Proposal for a Computer Network for the Indonesian Air Force’s Remote Site Radar System

    DTIC Science & Technology

    1989-03-01

    This thesis proposes two alternatives for a preliminary design of a computer network to support this need. It suggests how existing communication resources such as telephones, microwave links and satellite systems can be used to support the network. The first…telephone, radio-link, microwave-link and satellite systems. The goal of this thesis is to suggest how to utilize or implement these resources to support…

  18. Environmental Remote Sensing for Natural Resources Management: A Workshop in Collaboration with Faculdade de Agronomia e Engenharia Florestal, Universidade Eduardo Mondlane

    NASA Astrophysics Data System (ADS)

    Washington-Allen, R. A.; Fatoyinbo, T. E.; Ribeiro, N. S.; Shugart, H. H.; Therrell, M. D.; Vaz, K. T.; von Schill, L.

    2006-12-01

    A workshop titled "Environmental Remote Sensing for Natural Resources Management" was held from June 12-23, 2006 at Eduardo Mondlane University in Maputo, Mozambique. The workshop was initiated through an invitation and pre-course evaluation form sent to interested NGOs, universities, and government organizations. The purpose of the workshop was to provide training to interested professionals, graduate students, faculty and researchers at Mozambican institutions on the research and practical uses of remote sensing for natural resource management. The course had 24 participants who were predominantly professionals in remote sensing and GIS from various NGOs, governmental and academic institutions in Mozambique. The course taught remote sensing from an ecological perspective; specifically, it focused on the application of new remote sensing technology [the Shuttle Radar Topography Mission (SRTM) C-band radar data] to carbon accounting research in Miombo woodlands and Mangrove forests. The 2-week course was free to participants and consisted of lectures, laboratories, and a field trip to the mangrove forests of Inhaca Island, Maputo. The field trip consisted of training in the use of forest inventory techniques in support of remote sensing studies. Specifically, the field workshop centered on the use of Global Positioning Systems (GPS) and the collection of forest inventory data on tree height, structure [leaf area index (LAI)], and productivity. Productivity studies were enhanced with the teaching of introductory dendrochronology, including sample collection of tree rings from four different mangrove species.
    Students were provided with all course materials, including a DVD that contained satellite data (e.g., Landsat and SRTM imagery), ancillary data, lectures, exercises, and remote sensing publications used in the course, as well as a CD from the Environmental Protection Agency's Environmental Photographic Interpretation Center (EPA-EPIC) program for teaching remote sensing and data CDs from NASA's SAFARI 2000 field campaign. Nineteen participants evaluated the effectiveness of the course with regard to the lectures, instructors, and the field trip. Future workshops should focus more on the individual projects that students are engaged with in their jobs, replace the laboratory computers with workstations suited to computation-intensive image processing software, and purchase field remote sensing instrumentation for practical exercises.

  19. Design and Verification of Remote Sensing Image Data Center Storage Architecture Based on Hadoop

    NASA Astrophysics Data System (ADS)

    Tang, D.; Zhou, X.; Jing, Y.; Cong, W.; Li, C.

    2018-04-01

    The data center is a new concept of data processing and application proposed in recent years. It is a new method of processing technologies based on data, parallel computing, and compatibility with different hardware clusters. While optimizing the data storage management structure, it fully utilizes cluster resource computing nodes and improves the efficiency of data-parallel applications. This paper used mature Hadoop technology to build a large-scale distributed image management architecture for remote sensing imagery. Using MapReduce parallel processing technology, it called many computing nodes to process image storage blocks and pyramids in the background to improve the efficiency of image reading and application, and solved the need for concurrent multi-user high-speed access to remotely sensed data. The rationality, reliability and superiority of the system design were verified by testing the storage efficiency for different image data and multiple users, and by analyzing how the distributed storage architecture improves the application efficiency of remote sensing images in an actual Hadoop service system.
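
    The MapReduce pattern described above, in which many computing nodes process independent image blocks to build pyramid levels, can be sketched in miniature. The tile layout and the 2x2-mean downsampling are assumptions for illustration, not the paper's actual Hadoop job:

```python
# Map/reduce sketch: mappers downsample independent image tiles,
# the reducer assembles the next pyramid level keyed by tile position.
# In the paper's system this runs as a Hadoop MapReduce job over
# distributed storage blocks; here it is a single-process illustration.

def map_downsample(tile_id, tile):
    """Mapper: average each 2x2 block of one image tile (list of rows)."""
    h, w = len(tile), len(tile[0])
    small = [[(tile[r][c] + tile[r][c + 1] +
               tile[r + 1][c] + tile[r + 1][c + 1]) / 4.0
              for c in range(0, w, 2)]
             for r in range(0, h, 2)]
    return tile_id, small

def reduce_assemble(mapped):
    """Reducer: collect the downsampled tiles into the next level."""
    return dict(mapped)

# Two 2x2 tiles of a toy image, keyed by (row, col) position.
tiles = {(0, 0): [[0, 0], [0, 0]], (0, 1): [[4, 4], [4, 4]]}
level1 = reduce_assemble(map_downsample(k, t) for k, t in tiles.items())
```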

  20. Earth Survey Applications Division. [a bibliography

    NASA Technical Reports Server (NTRS)

    Carpenter, L. (Editor)

    1981-01-01

    Accomplishments of research and data analysis conducted to study physical parameters and processes inside the Earth and on the Earth's surface, to define techniques and systems for remotely sensing the processes and measuring the parameters of scientific and applications interest, and the transfer of promising operational applications techniques to the user community of Earth resources monitors, managers, and decision makers are described. Research areas covered include: geobotany, magnetic field modeling, crustal studies, crustal dynamics, sea surface topography, land resources, remote sensing of vegetation and soils, and hydrological sciences. Major accomplishments include: production of global maps of magnetic anomalies using Magsat data; computation of the global mean sea surface using GEOS-3 and Seasat altimetry data; delineation of the effects of topography on the interpretation of remotely-sensed data; application of snowmelt runoff models to water resources management; and mapping of snow depth over wheat growing areas using Nimbus microwave data.

  1. 1994 ASPRS/ACSM annual convention exposition. Volume 2

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1994-01-01

    This report is Volume II of presented papers at the joint 1994 convention of the American Society for Photogrammetry and Remote Sensing and the American Congress on Surveying and Mapping. Topic areas covered include the following: Data Base/GPS Issues; Survey Management Issues; Surveying Computations; Surveying Education; Digital Mapping; Global Change, EOS and NALC Issues; GPS Issues; Battelle Research in Remote Sensing and in GIS; Advanced Image Processing; GIS Issues; Surveying and Geodesy Issues; Water Resource Issues; Advanced Applications of Remote Sensing; Landsat Pathfinder I.

  2. Interfacing geographic information systems and remote sensing for rural land-use analysis

    NASA Technical Reports Server (NTRS)

    Nellis, M. Duane; Lulla, Kamlesh; Jensen, John

    1990-01-01

    Recent advances in computer-based geographic information systems (GISs) are briefly reviewed, with an emphasis on the incorporation of remote-sensing data in GISs for rural applications. Topics addressed include sampling procedures for rural land-use analyses; GIS-based mapping of agricultural land use and productivity; remote sensing of land use and agricultural, forest, rangeland, and water resources; monitoring the dynamics of irrigation agriculture; GIS methods for detecting changes in land use over time; and the development of land-use modeling strategies.

  3. Design of Remote Monitoring System of Irrigation based on GSM and ZigBee Technology

    NASA Astrophysics Data System (ADS)

    Xiao xi, Zheng; Fang, Zhao; Shuaifei, Shao

    2018-03-01

    To solve the problems of a low level of irrigation management and the waste of water resources, a remote monitoring system for farmland irrigation based on GSM communication technology and ZigBee technology was designed. The system is composed of sensors, a GSM communication module, a ZigBee module, a host computer, valves, and so on. The system opens and closes the pump and the electromagnetic valve as needed, and transmits the monitoring information to the host computer or the user's mobile phone through the GSM communication network. Experiments show that the system has low power consumption, a friendly man-machine interface, and is convenient and simple to use. It can monitor the agricultural environment remotely, control related irrigation equipment at any time and place, and better meet the needs of remote monitoring of farmland irrigation.
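
    The control loop in such a system can be sketched as a simple hysteresis rule. The moisture thresholds and units below are assumptions for illustration; the paper's controller, GSM framing and ZigBee addressing are not modeled:

```python
# Hysteresis sketch of an irrigation valve/pump controller: start
# irrigating when the soil is dry, stop when it is wet enough, and
# hold the current state inside the dead band to avoid rapid toggling.
# Thresholds (percent volumetric moisture) are illustrative assumptions.

def pump_command(moisture_pct, pump_on, dry=30.0, wet=60.0):
    """Return the next pump/valve state given a soil-moisture reading."""
    if moisture_pct < dry:
        return True          # soil too dry: open valve, start pump
    if moisture_pct > wet:
        return False         # soil wet enough: stop pump, close valve
    return pump_on           # in the dead band: keep the current state
```

    A monitoring node would evaluate this rule on each sensor report and forward both the reading and the resulting state over the GSM link.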

  4. Impact of remote sensing upon the planning, management and development of water resources. Summary of computers and computer growth trends for hydrologic modeling and the input of ERTS image data processing load

    NASA Technical Reports Server (NTRS)

    Castruccio, P. A.; Loats, H. L., Jr.

    1975-01-01

    An analysis of current computer usage by major water resources users was made to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era. The analysis shows significant impact due to the utilization and processing of ERTS CCT data.

  5. Dynamic Extension of a Virtualized Cluster by using Cloud Resources

    NASA Astrophysics Data System (ADS)

    Oberst, Oliver; Hauth, Thomas; Kernert, David; Riedel, Stephan; Quast, Günter

    2012-12-01

    The specific requirements concerning the software environment within the HEP community constrain the choice of resource providers for the outsourcing of computing infrastructure. The use of virtualization in HPC clusters and in the context of cloud resources is therefore a subject of recent developments in scientific computing. The dynamic virtualization of worker nodes in common batch systems provided by ViBatch serves each user with a dynamically virtualized subset of worker nodes on a local cluster. Now it can be transparently extended by the use of common open source cloud interfaces like OpenNebula or Eucalyptus, launching a subset of the virtual worker nodes within the cloud. This paper demonstrates how a dynamically virtualized computing cluster is combined with cloud resources by attaching remotely started virtual worker nodes to the local batch system.

  6. Research in remote sensing of agriculture, earth resources, and man's environment

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A.

    1974-01-01

    Research performed on NASA and USDA remote sensing projects is reviewed, including: (1) the 1971 Corn Blight Watch Experiment; (2) crop identification; (3) soil mapping; (4) land use inventories; (5) geologic mapping; and (6) forest and water resources data collection. The extent to which ERTS images and airborne data were used is indicated along with computer implementation. A field and laboratory spectroradiometer system is described together with the LARSYS software system, both of which were widely used during the research. Abstracts are included of 160 technical reports published as a result of the work.

  7. Planning for distributed workflows: constraint-based coscheduling of computational jobs and data placement in distributed environments

    NASA Astrophysics Data System (ADS)

    Makatun, Dzmitry; Lauret, Jérôme; Rudová, Hana; Šumbera, Michal

    2015-05-01

    When running data-intensive applications on distributed computational resources, long I/O overheads may be observed as access to remotely stored data is performed. Latencies and bandwidth can become the major limiting factor for the overall computation performance and can reduce the CPU/WallTime ratio due to excessive I/O wait. Reusing the knowledge of our previous research, we propose a constraint programming based planner that schedules computational jobs and data placements (transfers) in a distributed environment in order to optimize resource utilization and reduce the overall processing completion time. The optimization is achieved by ensuring that none of the resources (network links, data storages and CPUs) are oversaturated at any moment of time and either (a) that the data is pre-placed at the site where the job runs or (b) that the jobs are scheduled where the data is already present. Such an approach eliminates the idle CPU cycles occurring when the job is waiting for the I/O from a remote site and would have wide application in the community. Our planner was evaluated and simulated based on data extracted from log files of batch and data management systems of the STAR experiment. The results of evaluation and estimation of performance improvements are discussed in this paper.
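
    The two placement rules in the abstract, sending jobs to sites that already hold the data (b) or pre-placing the data where the job will run (a), can be sketched with a greedy heuristic. Site names, capacities and the greedy strategy are invented for illustration; the paper's planner solves this with full constraint programming, not a greedy pass:

```python
# Greedy sketch of data-aware job placement: prefer a site that already
# hosts the job's input and has a free CPU slot; otherwise fall back to
# the least-loaded site and record a data transfer (pre-placement).
# Assumes at least one site always has a free slot.

def plan(jobs, sites, data_location):
    load = {s: 0 for s in sites}          # CPU slots used per site
    schedule, transfers = {}, []
    for job, needed_file in jobs:
        # Rule (b): sites that already host the input and have capacity.
        local = [s for s in data_location.get(needed_file, [])
                 if load[s] < sites[s]]
        if local:
            site = min(local, key=lambda s: load[s])
        else:
            # Rule (a): least-loaded site with capacity; plan a transfer.
            site = min((s for s in sites if load[s] < sites[s]),
                       key=lambda s: load[s])
            transfers.append((needed_file, site))
        load[site] += 1
        schedule[job] = site
    return schedule, transfers

sites = {"BNL": 2, "LBL": 1}              # CPU slots per site (assumed)
data = {"run1.root": ["BNL"]}             # where the input already lives
sched, xfers = plan([("j1", "run1.root"), ("j2", "run1.root"),
                     ("j3", "run1.root")], sites, data)
```

    A real planner would also model link bandwidths and storage capacities as constraints rather than ignoring them as this sketch does.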

  8. VirTUal remoTe labORatories managEment System (TUTORES): Using Cloud Computing to Acquire University Practical Skills

    ERIC Educational Resources Information Center

    Caminero, Agustín C.; Ros, Salvador; Hernández, Roberto; Robles-Gómez, Antonio; Tobarra, Llanos; Tolbaños Granjo, Pedro J.

    2016-01-01

    The use of practical laboratories is key in engineering education in order to provide our students with the resources needed to acquire practical skills. This is especially true in the case of distance education, where no physical interactions between lecturers and students take place, so virtual or remote laboratories must be used. UNED has…

  9. Network Basics.

    ERIC Educational Resources Information Center

    Tennant, Roy

    1992-01-01

    Explains how users can find and access information resources available on the Internet. Highlights include network information centers (NICs); lists, both formal and informal; computer networking protocols, including international standards; electronic mail; remote log-in; and file transfer. (LRW)

  10. Natural resources information system.

    NASA Technical Reports Server (NTRS)

    Leachtenauer, J. C.; Woll, A. M.

    1972-01-01

    A computer-based Natural Resources Information System was developed for the Bureaus of Indian Affairs and Land Management. The system stores, processes and displays data useful to the land manager in the decision-making process. Emphasis is placed on the use of remote sensing as a data source. Data input consists of maps, imagery overlays, and on-site data. Maps and overlays are entered using a digitizer and stored as irregular polygons, lines and points. Processing functions include set intersection, union, and difference, as well as area, length, and value computations. Data output consists of computer tabulations and overlays prepared on a drum plotter.
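
    The overlay operations the abstract lists (set intersection, union, difference, and area computation) can be sketched on a rasterized layer where each map polygon is reduced to the set of grid cells it covers. The cell size and layer names below are assumptions for illustration, not the system's actual polygon algorithms:

```python
# Raster sketch of map-overlay set operations: each layer is the set of
# grid cells its polygons cover, so intersection/union/difference are
# plain set operations and area is a cell count times cell size.

CELL_AREA = 0.25  # hectares per grid cell (assumed)

def overlay_intersection(a, b):
    return a & b           # cells present in both layers

def overlay_union(a, b):
    return a | b           # cells present in either layer

def overlay_difference(a, b):
    return a - b           # cells in layer a but not in layer b

def area(cells):
    return len(cells) * CELL_AREA

# Two toy layers as sets of (row, col) grid cells.
forest = {(0, 0), (0, 1), (1, 0), (1, 1)}
tribal_land = {(1, 1), (1, 2)}
```

    The original system stores true polygons, lines and points; rasterizing to cells trades geometric precision for very simple set algebra.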

  11. A remote laboratory for USRP-based software defined radio

    NASA Astrophysics Data System (ADS)

    Gandhinagar Ekanthappa, Rudresh; Escobar, Rodrigo; Matevossian, Achot; Akopian, David

    2014-02-01

    Electrical and computer engineering graduates need practical working skills with real-world electronic devices, which are addressed to some extent by hands-on laboratories. Deployment capacity of hands-on laboratories is typically constrained due to insufficient equipment availability, facility shortages, and lack of human resources for in-class support and maintenance. At the same time, at many sites, existing experimental systems are usually underutilized due to class scheduling bottlenecks. Nowadays, online education gains popularity and remote laboratories have been suggested to broaden access to experimentation resources. Remote laboratories resolve many problems as various costs can be shared, and student access to instrumentation is facilitated in terms of access time and locations. Labs are converted to homework assignments that can be done without physical presence in laboratories. Even though they do not provide the full sense of hands-on experimentation, remote labs are a viable alternative for underserved educational sites. This paper studies the remote modality of USRP-based radio-communication labs offered by National Instruments (NI). The labs are offered to graduate and undergraduate students, and tentative assessments support the feasibility of remote deployments.

  12. Grid workflow validation using ontology-based tacit knowledge: A case study for quantitative remote sensing applications

    NASA Astrophysics Data System (ADS)

    Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi

    2017-01-01

    Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow averts low-level implementation details of the Grid and hence enables users to focus on higher levels of application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale complicated applications of remote sensing science. The validation of workflow is important in order to support large-scale sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To research the semantic correctness of user-defined workflows, in this paper, we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting the domain tacit knowledge and expressing the knowledge with ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.
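
    The two checks the study evaluates, data-source consistency and parameter matching between workflow steps, can be sketched as follows. The step names, field names and the ontology layout are assumed for illustration; the paper encodes such rules in a Protégé ontology rather than Python dictionaries:

```python
# Sketch of semantic workflow validation: verify that each step's output
# type feeds the next step's input (parameter matching), and that each
# step reads only data sources the domain ontology declares valid for it.

def validate(steps, ontology):
    errors = []
    # Parameter-matching check between consecutive steps.
    for a, b in zip(steps, steps[1:]):
        if a["output"] != b["input"]:
            errors.append("type mismatch: %s -> %s" % (a["name"], b["name"]))
    # Data-source consistency check against the ontology.
    for s in steps:
        if s["source"] not in ontology.get(s["name"], ()):
            errors.append("bad source for %s" % s["name"])
    return errors

# Hypothetical two-step aerosol retrieval workflow.
ontology = {"aod_retrieval": {"MODIS_L1B"}, "aerosol_map": {"AOD_grid"}}
steps = [
    {"name": "aod_retrieval", "source": "MODIS_L1B",
     "input": "radiance", "output": "AOD_grid"},
    {"name": "aerosol_map", "source": "AOD_grid",
     "input": "AOD_grid", "output": "png"},
]
```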

  13. Integrating remote sensing, geographic information systems and global positioning system techniques with hydrological modeling

    NASA Astrophysics Data System (ADS)

    Thakur, Jay Krishna; Singh, Sudhir Kumar; Ekanthalu, Vicky Shettigondahalli

    2017-07-01

    The integration of remote sensing (RS), geographic information systems (GIS) and the global positioning system (GPS) is an emerging research area in the fields of groundwater hydrology, resource management, environmental monitoring and emergency response. Recent advancements in the fields of RS, GIS, GPS and higher levels of computation will help in providing and handling a range of data simultaneously in a time- and cost-efficient manner. This review paper deals with hydrological modeling, the uses of remote sensing and GIS in hydrological modeling, models of integration and the need for them, and finally the conclusion. After dealing with these issues conceptually and technically, we can develop better methods and novel approaches to handle large data sets and to better communicate information related to rapidly diminishing societal resources, i.e. groundwater.

  14. ARACHNE: A neural-neuroglial network builder with remotely controlled parallel computing

    PubMed Central

    Rusakov, Dmitri A.; Savtchenko, Leonid P.

    2017-01-01

    Creating and running realistic models of neural networks has hitherto been a task for computing professionals rather than experimental neuroscientists. This is mainly because such networks usually engage substantial computational resources, the handling of which requires specific programming skills. Here we put forward a newly developed simulation environment, ARACHNE: it enables an investigator to build and explore cellular networks of arbitrary biophysical and architectural complexity using the logic of NEURON and a simple interface on a local computer or a mobile device. The interface can control, through the internet, an optimized computational kernel installed on a remote computer cluster. ARACHNE can combine neuronal (wired) and astroglial (extracellular volume-transmission driven) network types and adopt realistic cell models from the NEURON library. The program and documentation (current version) are available at GitHub repository https://github.com/LeonidSavtchenko/Arachne under the MIT License (MIT). PMID:28362877

  15. Investigation of forestry resources and other remote sensing data. 1: LANDSAT. 2: Remote sensing of volcanic emissions

    NASA Technical Reports Server (NTRS)

    Birnie, R. W.; Stoiber, R. E. (Principal Investigator)

    1983-01-01

    Computer classification of LANDSAT data was used for forest type mapping in New England. The ability to classify areas of hardwood, softwood, and mixed tree types was assessed along with determining clearcut regions and gypsy moth defoliation. Applications of the information to forest management and locating potential deer yards were investigated. The principal activities concerned with remote sensing of volcanic emissions centered around the development of remote sensors for SO2 and HCl gas, and their use at appropriate volcanic sites. Two major areas were investigated (Masaya, Nicaragua, and St. Helens, Washington) along with several minor ones.

  16. Developing an undergraduate geography course on digital image processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Baumann, P. R.

    1981-01-01

    Problems relating to the development of a digital image processing course in an undergraduate geography environment are discussed. Computer resource requirements, course prerequisites, and the size of the study area are addressed.

  17. Computing and Communications Infrastructure for Network-Centric Warfare: Exploiting COTS, Assuring Performance

    DTIC Science & Technology

    2004-06-01

    remote databases, has seen little vendor acceptance. Each database (Oracle, DB2, MySQL, etc.) has its own client-server protocol. Therefore each...existing standards – SQL, X.500/LDAP, FTP, etc. • View information dissemination as selective replication – State-oriented vs. message-oriented...allowing the application to start. The resource management system would serve as a broker to the resources, making sure that resources are not

  18. Open Marketplace for Simulation Software on the Basis of a Web Platform

    NASA Astrophysics Data System (ADS)

    Kryukov, A. P.; Demichev, A. P.

    2016-02-01

    The focus in development of a new generation of middleware is shifting from global grid systems to building convenient and efficient web platforms for remote access to individual computing resources. A further line of their development, suggested in this work, involves not only a quantitative increase in their number and an expansion of the scientific, engineering, and manufacturing areas in which they are used, but also improved technology for remote deployment of application software on the resources interacting with the web platforms. Currently, services for providers of application software in the context of science-oriented web platforms are not sufficiently developed. The new application-software marketplace web platforms proposed in this work should have all the features of existing web platforms for submitting jobs to remote resources, plus specific web services for market-based interaction between providers and consumers of application packages. The suggested approach will be validated on the example of simulation applications in the field of nonlinear optics.

  19. Publications of the Jet Propulsion Laboratory 1982

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A bibliography of articles concerning topics on the deep space network, data acquisition, telecommunication, and related aerospace studies is presented. A sample of the diverse subjects includes solar energy, remote sensing, computer science, Earth resources, astronomy, and satellite communication.

  20. Spline function approximation techniques for image geometric distortion representation. [for registration of multitemporal remote sensor imagery

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.

    1975-01-01

    Least squares approximation techniques were developed for use in computer aided correction of spatial image distortions for registration of multitemporal remote sensor imagery. Polynomials were first used to define image distortion over the entire two dimensional image space. Spline functions were then investigated to determine if the combination of lower order polynomials could approximate a higher order distortion with less computational difficulty. Algorithms for generating approximating functions were developed and applied to the description of image distortion in aircraft multispectral scanner imagery. Other applications of the techniques were suggested for earth resources data processing areas other than geometric distortion representation.
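    The tradeoff this abstract describes, i.e. replacing one higher-order global polynomial with piecewise lower-order least-squares fits, can be illustrated with a small sketch. The synthetic distortion field, segment boundaries, and polynomial degrees below are invented for illustration; only the general technique comes from the abstract.

    ```python
    import numpy as np

    # Synthetic 1-D "geometric distortion" along one image axis.
    x = np.linspace(0.0, 1.0, 101)
    distortion = 0.05 * np.sin(2 * np.pi * x) + 0.01 * x**3

    # Global fit: a single cubic polynomial over the whole axis.
    global_coeffs = np.polyfit(x, distortion, deg=3)
    global_fit = np.polyval(global_coeffs, x)

    # Piecewise (spline-like) fit: independent quadratics on four segments.
    piecewise_fit = np.empty_like(distortion)
    for lo, hi in [(0, 26), (25, 51), (50, 76), (75, 101)]:
        c = np.polyfit(x[lo:hi], distortion[lo:hi], deg=2)
        piecewise_fit[lo:hi] = np.polyval(c, x[lo:hi])

    rms_global = np.sqrt(np.mean((global_fit - distortion) ** 2))
    rms_piecewise = np.sqrt(np.mean((piecewise_fit - distortion) ** 2))
    # The piecewise quadratics track the distortion markedly better than the
    # global cubic, while each individual fit stays cheap to compute.
    print(rms_global, rms_piecewise)
    ```

    A production registration scheme would also enforce continuity at the segment boundaries, which is what distinguishes true spline functions from the independent piecewise fits sketched here.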

  1. WinHPC System User Basics | High-Performance Computing | NREL

    Science.gov Websites

    Guidance for starting to use this high-performance computing (HPC) system at NREL (also see the WinHPC policies). Log in with your NREL.gov username/password, and remember to log out when you are finished: simply quitting Remote Desktop will keep your session active and consuming resources on the node.

  2. A study of Minnesota land and water resources using remote sensing, volume 13

    NASA Technical Reports Server (NTRS)

    1980-01-01

    Progress in the use of LANDSAT data to classify wetlands in the Upper Mississippi River Valley and efforts to evaluate stress in corn and soybean crops are described. Satellite remote sensing data was used to measure particle concentrations in Lake Superior and several different kinds of remote sensing data were synergistically combined in order to identify near surface bedrock in Minnesota. Data analysis techniques which separate those activities requiring extensive computing from those involving a great deal of user interaction were developed to allow the latter to be done in the user's office or in the field.

  3. Recommendations concerning satellite-acquired earth resource data: 1982 report of the Data Management Subcommittee of the GEOSAT Committee, Incorporated

    NASA Technical Reports Server (NTRS)

    1982-01-01

    End user concerns about the content and accessibility of libraries of remote sensing data in general are addressed. Recommendations pertaining to the United States' satellite remote sensing programs urge: (1) the continuation of the NASA/EROS Data Center program to convert pre-1979 scenes to computer readable tapes and create a historical archive of this valuable data; (2) improving the EROS archive by adding geologically interesting scenes, data from other agencies (including previously classified data), and by adopting a policy to retire data from the archive; (3) establishing a computer data base inquiry system that includes remote sensing data from all publicly available sources; (4) capability for prepurchase review and evaluation; (5) a flexible price structure; and (6) adoption of standard digital data products format. Information about LANDSAT 4, the status of worldwide LANDSAT receiving stations, future non-U.S. remote sensing satellites, a list of sources for LANDSAT data, and the results of a survey of GEOSAT members' remote sensing data processing systems are also considered.

  4. Integrated Sustainable Planning for Industrial Region Using Geospatial Technology

    NASA Astrophysics Data System (ADS)

    Tiwari, Manish K.; Saxena, Aruna; Katare, Vivek

    2012-07-01

    Geospatial techniques and their scope of applications have undergone an order-of-magnitude change since their advent, and they are now universally accepted as the most important modern tools for mapping and monitoring various natural resources as well as amenities and infrastructure. The voluminous spatial database generated from various remote sensing platforms needs proper management (storage, retrieval, manipulation and analysis) to extract the desired information, which is beyond the capability of the human brain. This is where computer-aided GIS technology came into existence. A GIS with major input from remote sensing satellites for natural resource management applications must be able to handle spatiotemporal data, supporting spatiotemporal queries and other spatial operations. Software and computer-based tools are designed to make things easier for the user and to improve the efficiency and quality of information processing tasks. The natural resources are a common heritage, which we have shared with past generations, and our future generations will inherit these resources from us. Our greed for resources and our tremendous technological capacity to exploit them on a much larger scale have created a situation in which we have started withdrawing from future stocks. The Bhopal capital region had attracted the attention of planners from the beginning of the five-year-plan strategy for industrial development. A number of projects were carried out in the individual districts (Bhopal, Rajgarh, Shajapur, Raisen, Sehore), and these gave fruitful results, but no serious efforts have been made to involve the entire region, and no use has been made of the latest geospatial techniques (remote sensing, GIS, GPS) to prepare a well-structured computerized database, without which it is very difficult to retrieve, analyze and compare the data for monitoring and for planning developmental activities in the future.

  5. Practical applications of remote sensing technology

    NASA Technical Reports Server (NTRS)

    Whitmore, Roy A., Jr.

    1990-01-01

    Land managers increasingly are becoming dependent upon remote sensing and automated analysis techniques for information gathering and synthesis. Remote sensing and geographic information system (GIS) techniques provide quick and economical information gathering for large areas. The outputs of remote sensing classification and analysis are most effective when combined with a total natural resources data base within the capabilities of a computerized GIS. Some examples are presented of the successes, as well as the problems, in integrating remote sensing and geographic information systems. The need to exploit remotely sensed data and the potential that geographic information systems offer for managing and analyzing such data continues to grow. New microcomputers with vastly enlarged memory, multi-fold increases in operating speed, and storage capacity previously available only on mainframe computers are a reality. Improved raster GIS software systems have been developed for these high-performance microcomputers. Vector GIS systems previously reserved for mini and mainframe systems are available to operate on these enhanced microcomputers. One of the more exciting areas beginning to emerge is the integration of both raster and vector formats on a single computer screen. This technology will allow satellite imagery or digital aerial photography to be presented as a background to a vector display.

  6. Natural Resource Information System. Volume 1: Overall description

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A prototype computer-based Natural Resource Information System was designed which could store, process, and display data of maximum usefulness to land management decision making. The system includes graphic input and display, the use of remote sensing as a data source, and it is useful at multiple management levels. A survey established current decision making processes and functions, information requirements, and data collection and processing procedures. The applications of remote sensing data and processing requirements were established. Processing software was constructed and a data base established using high-altitude imagery and map coverage of selected areas of SE Arizona. Finally a demonstration of system processing functions was conducted utilizing material from the data base.

  7. Hydrological research in Ethiopia

    NASA Astrophysics Data System (ADS)

    Gebremichael, M.

    2012-12-01

    Almost all major development problems in Ethiopia are water-related: food insecurity, low economic development, recurrent droughts, disastrous floods, poor health conditions, and low energy availability. In order to develop and manage existing water resources in a sustainable manner, knowledge is required about water availability, water quality, water demand in various sectors, and the impacts of water resource projects on health and the environment. The lack of ground-based data has been a major challenge for generating this knowledge. Current advances in remote sensing and computer simulation technology could provide an alternative source of data. In this talk, I will present the challenges and opportunities in using remote sensing datasets and hydrological models in regions such as Africa where ground-based datasets are scarce.

  8. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 39: The role of computer networks in aerospace engineering

    NASA Technical Reports Server (NTRS)

    Bishop, Ann P.; Pinelli, Thomas E.

    1994-01-01

    This paper presents selected results from an empirical investigation into the use of computer networks in aerospace engineering. Such networks allow aerospace engineers to communicate with people and access remote resources through electronic mail, file transfer, and remote log-in. The study drew its subjects from private sector, government and academic organizations in the U.S. aerospace industry. Data presented here were gathered in a mail survey, conducted in Spring 1993, that was distributed to aerospace engineers performing a wide variety of jobs. Results from the mail survey provide a snapshot of the current use of computer networks in the aerospace industry, suggest factors associated with the use of networks, and identify perceived impacts of networks on aerospace engineering work and communication.

  9. Architecture for Control of the K9 Rover

    NASA Technical Reports Server (NTRS)

    Bresina, John L.; Bualat, Maria; Fair, Michael; Wright, Anne; Washington, Richard

    2006-01-01

    Software featuring a multilevel architecture is used to control the hardware on the K9 Rover, which is a mobile robot used in research on robots for scientific exploration and autonomous operation in general. The software consists of five types of modules: Device Drivers - These modules, at the lowest level of the architecture, directly control motors, cameras, data buses, and other hardware devices. Resource Managers - Each of these modules controls several device drivers. Resource managers can be commanded by either a remote operator or the pilot or conditional-executive modules described below. Behaviors and Data Processors - These modules perform computations for such functions as planning paths, avoiding obstacles, visual tracking, and stereoscopy. These modules can be commanded only by the pilot. Pilot - The pilot receives a possibly complex command from the remote operator or the conditional executive, then decomposes the command into (1) more-specific commands to the resource managers and (2) requests for information from the behaviors and data processors. Conditional Executive - This highest-level module interprets a command plan sent by the remote operator, determines whether resources required for execution of the plan are available, monitors execution, and, if necessary, selects an alternate branch of the plan.
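    The multilevel command flow described above (remote operator or conditional executive, then pilot, then resource managers, then device drivers) can be sketched in miniature. The class and method names below are hypothetical stand-ins for illustration, not the actual K9 flight software API.

    ```python
    class DeviceDriver:
        """Lowest level: directly controls one hardware device (here, simulated)."""
        def __init__(self, name):
            self.name = name
            self.log = []

        def actuate(self, action):
            self.log.append(action)

    class ResourceManager:
        """Controls several device drivers; commanded by the pilot or an operator."""
        def __init__(self, drivers):
            self.drivers = {d.name: d for d in drivers}

        def command(self, driver_name, action):
            self.drivers[driver_name].actuate(action)

    class Pilot:
        """Decomposes a possibly complex command into more-specific
        resource-manager commands, as in the architecture described above."""
        def __init__(self, manager):
            self.manager = manager

        def execute(self, command):
            if command == "drive_forward":
                self.manager.command("left_motor", "spin+1")
                self.manager.command("right_motor", "spin+1")

    motors = [DeviceDriver("left_motor"), DeviceDriver("right_motor")]
    pilot = Pilot(ResourceManager(motors))
    pilot.execute("drive_forward")
    print(motors[0].log, motors[1].log)  # ['spin+1'] ['spin+1']
    ```

    The real architecture adds a conditional executive above the pilot that checks resource availability and monitors plan execution; this sketch shows only the downward decomposition of commands.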

  10. CloVR: a virtual machine for automated and portable sequence analysis from the desktop using cloud computing.

    PubMed

    Angiuoli, Samuel V; Matalka, Malcolm; Gussman, Aaron; Galens, Kevin; Vangala, Mahesh; Riley, David R; Arze, Cesar; White, James R; White, Owen; Fricke, W Florian

    2011-08-30

    Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines pre-packaged with pre-configured software. We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports the use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. The CloVR VM and associated architecture lowers the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing.

  11. Optimisation of the usage of LHC and local computing resources in a multidisciplinary physics department hosting a WLCG Tier-2 centre

    NASA Astrophysics Data System (ADS)

    Barberis, Stefano; Carminati, Leonardo; Leveraro, Franco; Mazza, Simone Michele; Perini, Laura; Perlz, Francesco; Rebatto, David; Tura, Ruggero; Vaccarossa, Luca; Villaplana, Miguel

    2015-12-01

    We present the approach of the University of Milan Physics Department and the local unit of INFN to allow and encourage the sharing among different research areas of computing, storage and networking resources (the largest ones being those composing the Milan WLCG Tier-2 centre and tailored to the needs of the ATLAS experiment). Computing resources are organised as independent HTCondor pools, with a global master in charge of monitoring them and optimising their usage. The configuration has to provide satisfactory throughput for both serial and parallel (multicore, MPI) jobs. A combination of local, remote and cloud storage options are available. The experience of users from different research areas operating on this shared infrastructure is discussed. The promising direction of improving scientific computing throughput by federating access to distributed computing and storage also seems to fit very well with the objectives listed in the European Horizon 2020 framework for research and development.

  12. International Symposium on Remote Sensing of Environment, 9th, University of Michigan, Ann Arbor, Mich., April 15-19, 1974, Proceedings. Volumes 1, 2 & 3

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The present work gathers together numerous papers describing the use of remote sensing technology for mapping, monitoring, and management of earth resources and man's environment. Studies using various types of sensing equipment are described, including multispectral scanners, radar imagery, spectrometers, lidar, and aerial photography, and both manual and computer-aided data processing techniques are described. Some of the topics covered include: estimation of population density in Tokyo districts from ERTS-1 data, a clustering algorithm for unsupervised crop classification, passive microwave sensing of moist soils, interactive computer processing for land use planning, the use of remote sensing to delineate floodplains, moisture detection from Skylab, scanning thermal plumes, electrically scanning microwave radiometers, oil slick detection by X-band synthetic aperture radar, and the use of space photos for search of oil and gas fields. Individual items are announced in this issue.

  13. Remote control system for high-performance computer simulation of crystal growth by the PFC method

    NASA Astrophysics Data System (ADS)

    Pavlyuk, Evgeny; Starodumov, Ilya; Osipov, Sergei

    2017-04-01

    Modeling of the crystallization process by the phase field crystal (PFC) method is one of the important directions of modern computational materials science. In this paper, the practical side of computer simulation of the crystallization process by the PFC method is investigated. To solve problems using this method, it is necessary to use high-performance computing clusters, data storage systems and other, often expensive, complex computer systems. Access to such resources is often limited, unstable and accompanied by various administrative problems. In addition, the variety of software and settings of different computing clusters sometimes does not allow researchers to use unified program code; the code must be adapted to each configuration of the computer complex. The practical experience of the authors has shown that the creation of a special control system for computing, with the possibility of remote use, can greatly simplify the implementation of simulations and increase the productivity of scientific research. In the current paper we present the principal idea of such a system and justify its efficiency.

  14. GIS Integration for Quantitatively Determining the Capabilities of Five Remote Sensors for Resource Exploration

    NASA Technical Reports Server (NTRS)

    Pascucci, R. F.; Smith, A.

    1982-01-01

    To assist the U.S. Geological Survey in carrying out a Congressional mandate to investigate the use of side-looking airborne radar (SLAR) for resources exploration, a research program was conducted to define the contribution of SLAR imagery to structural geologic mapping and to compare this with contributions from other remote sensing systems. Imagery from two SLAR systems and from three other remote sensing systems was interpreted, and the resulting information was digitized, quantified and intercompared using a computer-assisted geographic information system (GIS). The study area covers approximately 10,000 square miles within the Naval Petroleum Reserve, Alaska, and is situated between the foothills of the Brooks Range and the North Slope. The principal objectives were: (1) to establish quantitatively, the total information contribution of each of the five remote sensing systems to the mapping of structural geology; (2) to determine the amount of information detected in common when the sensors are used in combination; and (3) to determine the amount of unique, incremental information detected by each sensor when used in combination with others. The remote sensor imagery that was investigated included real-aperture and synthetic-aperture radar imagery, standard and digitally enhanced LANDSAT MSS imagery, and aerial photos.

  15. Atmospheric transmission computer program CP

    NASA Technical Reports Server (NTRS)

    Pitts, D. E.; Barnett, T. L.; Korb, C. L.; Hanby, W.; Dillinger, A. E.

    1974-01-01

    A computer program is described which allows for calculation of the effects of carbon dioxide, water vapor, methane, ozone, carbon monoxide, and nitrous oxide on earth resources remote sensing techniques. A flow chart of the program and operating instructions are provided. Comparisons are made between the atmospheric transmission obtained from laboratory and spacecraft spectrometer data and that obtained from a computer prediction using a model atmosphere and radiosonde data. Limitations of the model atmosphere are discussed. The computer program listings, input card formats, and sample runs for both radiosonde data and laboratory data are included.

  16. Approaches in highly parameterized inversion-PESTCommander, a graphical user interface for file and run management across networks

    USGS Publications Warehouse

    Karanovic, Marinko; Muffels, Christopher T.; Tonkin, Matthew J.; Hunt, Randall J.

    2012-01-01

    Models of environmental systems have become increasingly complex, incorporating increasingly large numbers of parameters in an effort to represent physical processes on a scale approaching that at which they occur in nature. Consequently, the inverse problem of parameter estimation (specifically, model calibration) and subsequent uncertainty analysis have become increasingly computation-intensive endeavors. Fortunately, advances in computing have made computational power equivalent to that of dozens to hundreds of desktop computers accessible through a variety of alternate means: modelers have various possibilities, ranging from traditional Local Area Networks (LANs) to cloud computing. Commonly used parameter estimation software is well suited to take advantage of the availability of such increased computing power. Unfortunately, logistical issues become increasingly important as an increasing number and variety of computers are brought to bear on the inverse problem. To facilitate efficient access to disparate computer resources, the PESTCommander program documented herein has been developed to provide a Graphical User Interface (GUI) that facilitates the management of model files ("file management") and remote launching and termination of "slave" computers across a distributed network of computers ("run management"). In version 1.0 described here, PESTCommander can access and ascertain resources across traditional Windows LANs; however, the architecture of PESTCommander has been developed with the intent that future releases will be able to access computing resources (1) via trusted domains established in Wide Area Networks (WANs) in multiple remote locations and (2) via heterogeneous networks of Windows- and Unix-based operating systems. The design of PESTCommander also makes it suitable for extension to other computational resources, such as those that are available via cloud computing.
Version 1.0 of PESTCommander was developed primarily to work with the parameter estimation software PEST; the discussion presented in this report focuses on the use of the PESTCommander together with Parallel PEST. However, PESTCommander can be used with a wide variety of programs and models that require management, distribution, and cleanup of files before or after model execution. In addition to its use with the Parallel PEST program suite, discussion is also included in this report regarding the use of PESTCommander with the Global Run Manager GENIE, which was developed simultaneously with PESTCommander.

  17. LINCS: Livermore's network architecture. [Octopus computing network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fletcher, J.G.

    1982-01-01

    Octopus, a local computing network that has been evolving at the Lawrence Livermore National Laboratory for over fifteen years, is currently undergoing a major revision. The primary purpose of the revision is to consolidate and redefine the variety of conventions and formats, which have grown up over the years, into a single standard family of protocols, the Livermore Interactive Network Communication Standard (LINCS). This standard treats the entire network as a single distributed operating system such that access to a computing resource is obtained in a single way, whether that resource is local (on the same computer as the accessing process) or remote (on another computer). LINCS encompasses not only communication but also such issues as the relationship of customer to server processes and the structure, naming, and protection of resources. The discussion includes: an overview of the Livermore user community and computing hardware, the functions and structure of each of the seven layers of LINCS protocol, and the reasons why we have designed our own protocols and why we are dissatisfied with the directions that current protocol standards are taking.

  18. Interfacing External Quantum Devices to a Universal Quantum Computer

    PubMed Central

    Lagana, Antonio A.; Lohe, Max A.; von Smekal, Lorenz

    2011-01-01

    We present a scheme to use external quantum devices using the universal quantum computer previously constructed. We thereby show how the universal quantum computer can utilize networked quantum information resources to carry out local computations. Such information may come from specialized quantum devices or even from remote universal quantum computers. We show how to accomplish this by devising universal quantum computer programs that implement well known oracle based quantum algorithms, namely the Deutsch, Deutsch-Jozsa, and the Grover algorithms using external black-box quantum oracle devices. In the process, we demonstrate a method to map existing quantum algorithms onto the universal quantum computer. PMID:22216276

  19. Interfacing external quantum devices to a universal quantum computer.

    PubMed

    Lagana, Antonio A; Lohe, Max A; von Smekal, Lorenz

    2011-01-01

    We present a scheme to use external quantum devices using the universal quantum computer previously constructed. We thereby show how the universal quantum computer can utilize networked quantum information resources to carry out local computations. Such information may come from specialized quantum devices or even from remote universal quantum computers. We show how to accomplish this by devising universal quantum computer programs that implement well known oracle based quantum algorithms, namely the Deutsch, Deutsch-Jozsa, and the Grover algorithms using external black-box quantum oracle devices. In the process, we demonstrate a method to map existing quantum algorithms onto the universal quantum computer. © 2011 Lagana et al.
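    As a rough illustration of running an oracle-based algorithm against a black-box device, here is a plain NumPy state-vector simulation of Deutsch's algorithm, in which the oracle unitary is built externally and handed to the circuit. This is a textbook sketch under standard conventions, not the authors' universal-quantum-computer programs.

    ```python
    import numpy as np

    H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
    I2 = np.eye(2)

    def oracle(f):
        """Build the black-box unitary U_f |x, y> = |x, y XOR f(x)> as a 4x4 matrix."""
        U = np.zeros((4, 4))
        for x in (0, 1):
            for y in (0, 1):
                U[2 * x + (y ^ f(x)), 2 * x + y] = 1
        return U

    def deutsch(U_f):
        """Decide whether f: {0,1} -> {0,1} is constant with one oracle call."""
        state = np.kron(np.array([1, 0]), np.array([0, 1]))  # |0>|1>
        state = np.kron(H, H) @ state       # Hadamard on both qubits
        state = U_f @ state                 # single query to the black box
        state = np.kron(H, I2) @ state      # Hadamard on the first qubit
        p0 = state[0] ** 2 + state[1] ** 2  # P(first qubit measures |0>)
        return "constant" if p0 > 0.5 else "balanced"

    print(deutsch(oracle(lambda x: 0)))  # constant
    print(deutsch(oracle(lambda x: x)))  # balanced
    ```

    The point mirrored from the abstract is the interface: `deutsch` never inspects f, only the unitary supplied by the external oracle device.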

  20. Computerized data reduction techniques for nadir viewing remote sensors

    NASA Technical Reports Server (NTRS)

    Tiwari, S. N.; Gormsen, Barbara B.

    1985-01-01

    Computer resources have been developed for the analysis and reduction of MAPS experimental data from the OSTA-1 payload. The MAPS Research Project is concerned with the measurement of the global distribution of mid-tropospheric carbon monoxide. The measurement technique for the MAPS instrument is based on non-dispersive gas filter radiometer operating in the nadir viewing mode. The MAPS experiment has two passive remote sensing instruments, the prototype instrument which is used to measure tropospheric air pollution from aircraft platforms and the third generation (OSTA) instrument which is used to measure carbon monoxide in the mid and upper troposphere from space platforms. Extensive effort was also expended in support of the MAPS/OSTA-3 shuttle flight. Specific capabilities and resources developed are discussed.

  1. New techniques for the quantification and modeling of remotely sensed alteration and linear features in mineral resource assessment studies

    USGS Publications Warehouse

    Trautwein, C.M.; Rowan, L.C.

    1987-01-01

    Linear structural features and hydrothermally altered rocks that were interpreted from Landsat data have been used by the U.S. Geological Survey (USGS) in regional mineral resource appraisals for more than a decade. In the past, linear features and alterations have been incorporated into models for assessing mineral resources potential by manually overlaying these and other data sets. Recently, USGS research into computer-based geographic information systems (GIS) for mineral resources assessment programs has produced several new techniques for data analysis, quantification, and integration to meet assessment objectives.

  2. Optimizing Scheme for Remote Preparation of Four-particle Cluster-like Entangled States

    NASA Astrophysics Data System (ADS)

    Wang, Dong; Ye, Liu

    2011-09-01

    Recently, Ma et al. (Opt. Commun. 283:2640, 2010) have proposed a novel scheme for preparing a class of cluster-like entangled states based on a four-particle projective measurement. In this paper, we put forward a new and optimal scheme to realize the remote preparation of this class of cluster-like states with the aid of two bipartite partially entangled channels. Different from the previous scheme, we employ a two-particle projective measurement instead of the four-particle projective measurement during the preparation. In addition, the resource consumption of our scheme, including the classical communication cost and the quantum resource consumption, is computed. Moreover, we discuss the features of our scheme and compare it with the previous scheme in terms of resource consumption and operation complexity. The results show that our scheme is more economical and feasible than the previous one.

  3. RESTful M2M Gateway for Remote Wireless Monitoring for District Central Heating Networks

    PubMed Central

    Cheng, Bo; Wei, Zesan

    2014-01-01

    In recent years, the increased interest in energy conservation and environmental protection, combined with the development of modern communication and computer technology, has resulted in the replacement of distributed heating by central heating in urban areas. This paper proposes a Representational State Transfer (REST) Machine-to-Machine (M2M) gateway for wireless remote monitoring of a district central heating network. In particular, we focus on the resource-oriented RESTful M2M gateway architecture; present a uniform device abstraction approach based on Open Service Gateway Initiative (OSGi) technology; implement the address mapping mechanism between RESTful resources and the physical sensor devices; combine a buffer queue with a polling method to implement data scheduling and Quality of Service (QoS) guarantees; and give the RESTful M2M gateway's open service Application Programming Interface (API) set. The performance has been measured and analyzed. Finally, conclusions and future work are presented. PMID:25436650
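
    The buffer-queue-plus-polling scheduling idea mentioned in this record can be sketched in a few lines. This is a minimal illustration, not the authors' implementation; the class name, the drop-oldest overflow policy, and the sample readings are all assumptions.

```python
import queue

class GatewayBuffer:
    """Minimal sketch of a buffer queue drained by polling: sensor
    readings accumulate in a bounded queue, and a periodic poller
    drains them in batches so bursts cannot overwhelm the uplink."""

    def __init__(self, maxsize=1000):
        self.buffer = queue.Queue(maxsize=maxsize)
        self.delivered = []

    def enqueue(self, reading):
        # Drop the oldest reading when full -- one possible QoS policy
        # (hypothetical; the record does not specify an overflow rule).
        if self.buffer.full():
            self.buffer.get_nowait()
        self.buffer.put_nowait(reading)

    def poll_once(self):
        # One polling pass: drain everything currently buffered.
        while not self.buffer.empty():
            self.delivered.append(self.buffer.get_nowait())

gw = GatewayBuffer()
for t in range(5):
    gw.enqueue({"sensor": "supply-temp", "value": 70 + t})
gw.poll_once()
print(len(gw.delivered))  # 5: all buffered readings delivered in one pass
```

    Decoupling the sensor producers from the uplink this way is what lets a gateway bound its outbound rate and enforce a QoS policy at the buffer.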

  4. RESTful M2M gateway for remote wireless monitoring for district central heating networks.

    PubMed

    Cheng, Bo; Wei, Zesan

    2014-11-27

    In recent years, the increased interest in energy conservation and environmental protection, combined with the development of modern communication and computer technology, has resulted in the replacement of distributed heating by central heating in urban areas. This paper proposes a Representational State Transfer (REST) Machine-to-Machine (M2M) gateway for wireless remote monitoring of a district central heating network. In particular, we focus on the resource-oriented RESTful M2M gateway architecture; present a uniform device abstraction approach based on Open Service Gateway Initiative (OSGi) technology; implement the address mapping mechanism between RESTful resources and the physical sensor devices; combine a buffer queue with a polling method to implement data scheduling and Quality of Service (QoS) guarantees; and give the RESTful M2M gateway's open service Application Programming Interface (API) set. The performance has been measured and analyzed. Finally, conclusions and future work are presented.

  5. JIP: Java image processing on the Internet

    NASA Astrophysics Data System (ADS)

    Wang, Dongyan; Lin, Bo; Zhang, Jun

    1998-12-01

    In this paper, we present JIP - Java Image Processing on the Internet, a new Internet-based application for remote education and software presentation. JIP offers an integrated learning environment on the Internet where remote users not only can share static HTML documents and lecture notes, but also can run and reuse dynamic distributed software components, without having the source code or any extra work of software compilation, installation, and configuration. By implementing a platform-independent distributed computational model, local computational resources are consumed instead of the resources on a central server. As an extended Java applet, JIP allows users to select local image files on their computers or specify any image on the Internet using a URL as input. Multimedia lectures such as streaming video/audio and digital images are integrated into JIP and intelligently associated with specific image processing functions. Watching demonstrations and practicing the functions with user-selected input data dramatically encourages learning interest, while promoting the understanding of image processing theory. The JIP framework can be easily applied to other subjects in education or software presentation, such as digital signal processing, business, mathematics, physics, or other areas such as employee training and charged software consumption.

  6. CloVR: A virtual machine for automated and portable sequence analysis from the desktop using cloud computing

    PubMed Central

    2011-01-01

    Background Next-generation sequencing technologies have decentralized sequence acquisition, increasing the demand for new bioinformatics tools that are easy to use, portable across multiple platforms, and scalable for high-throughput applications. Cloud computing platforms provide on-demand access to computing infrastructure over the Internet and can be used in combination with custom-built virtual machines to distribute pre-packaged, pre-configured software. Results We describe the Cloud Virtual Resource, CloVR, a new desktop application for push-button automated sequence analysis that can utilize cloud computing resources. CloVR is implemented as a single portable virtual machine (VM) that provides several automated analysis pipelines for microbial genomics, including 16S, whole genome, and metagenome sequence analysis. The CloVR VM runs on a personal computer, utilizes local computer resources, and requires minimal installation, addressing key challenges in deploying bioinformatics workflows. In addition, CloVR supports use of remote cloud computing resources to improve performance for large-scale sequence processing. In a case study, we demonstrate the use of CloVR to automatically process next-generation sequencing data on multiple cloud computing platforms. Conclusion The CloVR VM and associated architecture lower the barrier of entry for utilizing complex analysis protocols on both local single- and multi-core computers and cloud systems for high-throughput data processing. PMID:21878105

  7. Mobile Cloud Computing with SOAP and REST Web Services

    NASA Astrophysics Data System (ADS)

    Ali, Mushtaq; Fadli Zolkipli, Mohamad; Mohamad Zain, Jasni; Anwar, Shahid

    2018-05-01

    Mobile computing in conjunction with mobile web services offers a strong approach by which the limitations of mobile devices may be tackled. Mobile Web services are based on two technologies, SOAP and REST, which work with existing protocols to develop Web services. Both approaches carry their own distinct features; yet, keeping the constrained nature of mobile devices in mind, the better of the two is the one that minimizes computation and transmission overhead while offloading. Transferring load from a mobile device to remote servers for execution is called computational offloading. There are numerous approaches to implementing computational offloading as a viable solution for eradicating the resource constraints of mobile devices, yet a dynamic method of computational offloading is always required for a smooth and simple migration of complex tasks. The intention of this work is to present a distinctive approach that does not engage mobile resources for long periods. The concept of web services is utilized in our work to delegate computationally intensive tasks for remote execution. We tested both the SOAP and REST Web services approaches for mobile computing. Two parameters were considered in our lab experiments: execution time and energy consumption. The results show that RESTful Web services execution is far better than executing the same application by the SOAP Web services approach, in terms of both execution time and energy consumption. In experiments with the developed prototype matrix multiplication app, REST execution time was about 200% better than the SOAP approach; in the case of energy consumption, REST execution was about 250% better.
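
    The transmission-overhead difference between the two styles is easy to see by comparing payloads for the same remote call. The sketch below is illustrative only: the resource path, element names, and envelope layout are assumptions, not the paper's actual service definitions.

```python
import json

def rest_payload(a, b):
    # REST style: a compact JSON body POSTed to a hypothetical
    # /multiply resource.
    return json.dumps({"a": a, "b": b})

def soap_payload(a, b):
    # SOAP style: the same call wrapped in an XML envelope
    # (element names here are invented for illustration).
    def rows(m):
        return "".join(f"<row>{' '.join(map(str, r))}</row>" for r in m)
    return (
        '<?xml version="1.0"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        "<soap:Body><Multiply>"
        f"<A>{rows(a)}</A><B>{rows(b)}</B>"
        "</Multiply></soap:Body></soap:Envelope>"
    )

a = [[1, 2], [3, 4]]
b = [[5, 6], [7, 8]]
print(len(rest_payload(a, b)), len(soap_payload(a, b)))
```

    For every call, the JSON body is a fraction of the size of the XML envelope, which is one reason the REST approach costs less in transmission time and radio energy on a constrained device.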

  8. Field Tests of the Hybrid Remotely Operated Vehicle (HROV) Light Fiber Optic Tether

    DTIC Science & Technology

    2006-09-01

    ROV, and does not have on-board computational resources necessary to operate autonomously. A different application was the Theseus vehicle [3...OCEANS '99, September 13-19. pp 1307-1311. Seattle, WA. [3] Ferguson, J., Pope, A., Butler, B., Verrall, R. 1999. Theseus AUV - Two Record

  9. All Aboard the Internet.

    ERIC Educational Resources Information Center

    Descy, Don E.

    1993-01-01

    This introduction to the Internet with examples for Macintosh computer users demonstrates the ease of using e-mail, participating on discussion group listservs, logging in to remote sites using Telnet, and obtaining resources using the File Transfer Protocol (FTP). Included are lists of discussion groups, Telnet sites, and FTP Archive sites. (EA)

  10. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research

    PubMed Central

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C.

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource interoperability in a transparent manner for the end user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, and Multiple Sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions. PMID:24904400

  11. CBRAIN: a web-based, distributed computing platform for collaborative neuroimaging research.

    PubMed

    Sherif, Tarek; Rioux, Pierre; Rousseau, Marc-Etienne; Kassis, Nicolas; Beck, Natacha; Adalat, Reza; Das, Samir; Glatard, Tristan; Evans, Alan C

    2014-01-01

    The Canadian Brain Imaging Research Platform (CBRAIN) is a web-based collaborative research platform developed in response to the challenges raised by data-heavy, compute-intensive neuroimaging research. CBRAIN offers transparent access to remote data sources, distributed computing sites, and an array of processing and visualization tools within a controlled, secure environment. Its web interface is accessible through any modern browser and uses graphical interface idioms to reduce the technical expertise required to perform large-scale computational analyses. CBRAIN's flexible meta-scheduling has allowed the incorporation of a wide range of heterogeneous computing sites, currently including nine national research High Performance Computing (HPC) centers in Canada, one in Korea, one in Germany, and several local research servers. CBRAIN leverages remote computing cycles and facilitates resource interoperability in a transparent manner for the end user. Compared with typical grid solutions available, our architecture was designed to be easily extendable and deployed on existing remote computing sites with no tool modification, administrative intervention, or special software/hardware configuration. As of October 2013, CBRAIN serves over 200 users spread across 53 cities in 17 countries. The platform is built as a generic framework that can accept data and analysis tools from any discipline. However, its current focus is primarily on neuroimaging research and studies of neurological diseases such as Autism, Parkinson's and Alzheimer's diseases, and Multiple Sclerosis, as well as on normal brain structure and development. This technical report presents the CBRAIN Platform, its current deployment and usage, and future directions.

  12. Advanced Optical Burst Switched Network Concepts

    NASA Astrophysics Data System (ADS)

    Nejabati, Reza; Aracil, Javier; Castoldi, Piero; de Leenheer, Marc; Simeonidou, Dimitra; Valcarenghi, Luca; Zervas, Georgios; Wu, Jian

    In recent years, as the bandwidth and the speed of networks have increased significantly, a new generation of network-based applications using the concept of distributed computing and collaborative services is emerging (e.g., Grid computing applications). The use of the available fiber and DWDM infrastructure for these applications is a logical choice, offering huge amounts of cheap bandwidth and ensuring global reach of computing resources [230]. Currently, there is a great deal of interest in deploying optical circuit (wavelength) switched network infrastructure for distributed computing applications that require long-lived wavelength paths and address the specific needs of a small number of well-known users. Typical users are particle physicists who, due to their international collaborations and experiments, generate enormous amounts of data (petabytes per year). These users require network infrastructures that can support processing and analysis of large datasets through globally distributed computing resources [230]. However, providing wavelength-granularity bandwidth services is not an efficient and scalable solution for applications and services that address a wider base of user communities with different traffic profiles and connectivity requirements. Examples of such applications are: scientific collaboration on a smaller scale (e.g., bioinformatics, environmental research), distributed virtual laboratories (e.g., remote instrumentation), e-health, national security and defense, personalized learning environments and digital libraries, and evolving broadband user services (e.g., high-resolution home video editing, real-time rendering, high-definition interactive TV). As a specific example, in e-health services, and in particular mammography applications, the size and quantity of images produced by remote mammography impose stringent network requirements. Initial calculations have shown that for 100 patients to be screened remotely, the network would have to securely transport 1.2 GB of data every 30 s [230]. From the above it is clear that these types of applications need a new network infrastructure and transport technology that makes large amounts of bandwidth at subwavelength granularity, together with storage, computation, and visualization resources, potentially available to a wide user base for specified time durations. As these types of collaborative and network-based applications evolve to address a wide range and large number of users, it is infeasible to build dedicated networks for each application type or category. Consequently, there should be an adaptive network infrastructure able to support all application types, each with their own access, network, and resource usage patterns. This infrastructure should offer flexible and intelligent network elements and control mechanisms able to deploy new applications quickly and efficiently.
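
    The 1.2 GB-per-30-seconds figure quoted above implies a concrete sustained throughput, which a quick back-of-envelope calculation recovers (assuming decimal gigabytes):

```python
# Sustained throughput implied by screening 100 patients remotely:
# 1.2 GB of mammography data must move securely every 30 seconds.
data_bytes = 1.2e9   # 1.2 GB per 30-second window (decimal GB assumed)
interval_s = 30.0

throughput_bps = data_bytes * 8 / interval_s  # bits per second
print(f"{throughput_bps / 1e6:.0f} Mbit/s")   # 320 Mbit/s
```

    A sustained 320 Mbit/s for a single screening service illustrates why subwavelength-granularity bandwidth allocation, rather than dedicated wavelength paths, is attractive for this class of application.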

  13. Virtual collaborative environments: programming and controlling robotic devices remotely

    NASA Astrophysics Data System (ADS)

    Davies, Brady R.; McDonald, Michael J., Jr.; Harrigan, Raymond W.

    1995-12-01

    This paper describes a technology for remote sharing of intelligent electro-mechanical devices. An architecture and actual system have been developed and tested, based on the proposed National Information Infrastructure (NII) or Information Highway, to facilitate programming and control of intelligent programmable machines (like robots, machine tools, etc.). Using appropriate geometric models, integrated sensors, video systems, and computing hardware; computer controlled resources owned and operated by different (in a geographic sense as well as legal sense) entities can be individually or simultaneously programmed and controlled from one or more remote locations. Remote programming and control of intelligent machines will create significant opportunities for sharing of expensive capital equipment. Using the technology described in this paper, university researchers, manufacturing entities, automation consultants, design entities, and others can directly access robotic and machining facilities located across the country. Disparate electro-mechanical resources will be shared in a manner similar to the way supercomputers are accessed by multiple users. Using this technology, it will be possible for researchers developing new robot control algorithms to validate models and algorithms right from their university labs without ever owning a robot. Manufacturers will be able to model, simulate, and measure the performance of prospective robots before selecting robot hardware optimally suited for their intended application. Designers will be able to access CNC machining centers across the country to fabricate prototypic parts during product design validation. An existing prototype architecture and system has been developed and proven. Programming and control of a large gantry robot located at Sandia National Laboratories in Albuquerque, New Mexico, was demonstrated from such remote locations as Washington D.C., Washington State, and Southern California.

  14. Multiple node remote messaging

    DOEpatents

    Blumrich, Matthias A.; Chen, Dong; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Ohmacht, Martin; Salapura, Valentina; Steinmacher-Burow, Burkhard; Vranas, Pavlos

    2010-08-31

    A method for passing remote messages in a parallel computer system formed as a network of interconnected compute nodes, in which a first compute node (A) sends a single remote message to a remote second compute node (B) in order to control the remote second compute node (B) to send at least one remote message. The method includes controlling a DMA engine at the first compute node (A) to prepare the single remote message to include a first message descriptor and at least one remote message descriptor for controlling the remote second compute node (B) to send at least one remote message, including putting the first message descriptor into an injection FIFO at the first compute node (A) and sending the single remote message and the at least one remote message descriptor to the second compute node (B).
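
    The descriptor-forwarding pattern in the claim can be modeled with plain data structures. The sketch below is a toy model under stated assumptions: the node and field names are invented, and the "DMA engine" is reduced to a function that replays forwarded descriptors.

```python
from collections import deque
from dataclasses import dataclass, field

@dataclass
class MessageDescriptor:
    # Toy stand-in for a DMA message descriptor: who sends, to whom, what.
    source: str
    target: str
    payload: str

@dataclass
class ComputeNode:
    name: str
    injection_fifo: deque = field(default_factory=deque)
    received: list = field(default_factory=list)

def send_remote_message(network, a, b, forwarded):
    """Node A sends node B one message whose embedded descriptors
    control what B sends next (the pattern claimed above)."""
    first = MessageDescriptor(a.name, b.name, "control")
    a.injection_fifo.append(first)         # A puts its descriptor in the FIFO
    b.received.append((first, forwarded))  # B gets the message + descriptors
    for d in forwarded:                    # B's "DMA engine" replays them
        network[d.target].received.append((d, []))

nodes = {n: ComputeNode(n) for n in "ABC"}
fwd = [MessageDescriptor("B", "C", "data")]
send_remote_message(nodes, nodes["A"], nodes["B"], fwd)
print(len(nodes["C"].received))  # 1: C got the message B was told to send
```

    The point of the pattern is that A triggers a whole fan-out of messages from B with a single send, rather than issuing each one itself.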

  15. Accessing and Visualizing scientific spatiotemporal data

    NASA Technical Reports Server (NTRS)

    Katz, Daniel S.; Bergou, Attila; Berriman, Bruce G.; Block, Gary L.; Collier, Jim; Curkendall, David W.; Good, John; Husman, Laura; Jacob, Joseph C.; Laity, Anastasia; hide

    2004-01-01

    This paper discusses work done by JPL's Parallel Applications Technologies Group in helping scientists access and visualize very large data sets through the use of multiple computing resources, such as parallel supercomputers, clusters, and grids. These tools do one or more of the following tasks: visualize local data sets for local users, visualize local data sets for remote users, and access and visualize remote data sets. The tools are used for various types of data, including remotely sensed image data, digital elevation models, and astronomical surveys. The paper attempts to pull some common elements out of these tools that may be useful for others who have to work with similarly large data sets.

  16. The ORSER System for the Analysis of Remotely Sensed Digital Data

    NASA Technical Reports Server (NTRS)

    Myers, W. L.; Turner, B. J.

    1981-01-01

    The main effort of The Pennsylvania State University's Office for Remote Sensing of Earth Resources (ORSER) is the processing, analysis, and interpretation of multispectral data, most often supplied by NASA in the form of imagery and digital data. The facilities used for data reduction and image enhancement are described, as well as the development of algorithms for producing a computer map showing various environmental and land use characteristics of data points in the analyzed scenes. The application of an ORSER capability for statewide monitoring of gypsy moth defoliation is discussed.

  17. Vibroacoustic Payload Environment Prediction System (VAPEPS): VAPEPS management center remote access guide

    NASA Technical Reports Server (NTRS)

    Fernandez, J. P.; Mills, D.

    1991-01-01

    A Vibroacoustic Payload Environment Prediction System (VAPEPS) Management Center was established at JPL. The center utilizes the VAPEPS software package to manage a database of Space Shuttle and expendable launch vehicle payload flight and ground test data. Remote terminal access over telephone lines to the computer system where the program resides was established to provide the payload community a convenient means of querying the global VAPEPS database. This guide describes the functions of the VAPEPS Management Center and contains instructions for utilizing the resources of the center.

  18. Deterministic Remote Entanglement of Superconducting Circuits through Microwave Two-Photon Transitions

    NASA Astrophysics Data System (ADS)

    Campagne-Ibarcq, P.; Zalys-Geller, E.; Narla, A.; Shankar, S.; Reinhold, P.; Burkhart, L.; Axline, C.; Pfaff, W.; Frunzio, L.; Schoelkopf, R. J.; Devoret, M. H.

    2018-05-01

    Large-scale quantum information processing networks will most probably require the entanglement of distant systems that do not interact directly. This can be done by performing entangling gates between standing information carriers, used as memories or local computational resources, and flying ones, acting as quantum buses. We report the deterministic entanglement of two remote transmon qubits by Raman stimulated emission and absorption of a traveling photon wave packet. We achieve a Bell state fidelity of 73%, well explained by losses in the transmission line and decoherence of each qubit.

  19. Deterministic Remote Entanglement of Superconducting Circuits through Microwave Two-Photon Transitions.

    PubMed

    Campagne-Ibarcq, P; Zalys-Geller, E; Narla, A; Shankar, S; Reinhold, P; Burkhart, L; Axline, C; Pfaff, W; Frunzio, L; Schoelkopf, R J; Devoret, M H

    2018-05-18

    Large-scale quantum information processing networks will most probably require the entanglement of distant systems that do not interact directly. This can be done by performing entangling gates between standing information carriers, used as memories or local computational resources, and flying ones, acting as quantum buses. We report the deterministic entanglement of two remote transmon qubits by Raman stimulated emission and absorption of a traveling photon wave packet. We achieve a Bell state fidelity of 73%, well explained by losses in the transmission line and decoherence of each qubit.

  20. Biomedical image analysis and processing in clouds

    NASA Astrophysics Data System (ADS)

    Bednarz, Tomasz; Szul, Piotr; Arzhaeva, Yulia; Wang, Dadong; Burdett, Neil; Khassapov, Alex; Chen, Shiping; Vallotton, Pascal; Lagerstrom, Ryan; Gureyev, Tim; Taylor, John

    2013-10-01

    The Cloud-based Image Analysis and Processing Toolbox project runs on the Australian National eResearch Collaboration Tools and Resources (NeCTAR) cloud infrastructure and gives researchers access to biomedical image processing and analysis services via remotely accessible user interfaces. By providing user-friendly access to cloud computing resources and new workflow-based interfaces, our solution enables researchers to carry out various challenging image analysis and reconstruction tasks. Several case studies will be presented during the conference.

  1. An integrated system for land resources supervision based on the IoT and cloud computing

    NASA Astrophysics Data System (ADS)

    Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie

    2017-01-01

    Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.

  2. The application of LANDSAT remote sensing technology to natural resources management. Section 1: Introduction to VICAR - Image classification module. Section 2: Forest resource assessment of Humboldt County.

    NASA Technical Reports Server (NTRS)

    Fox, L., III (Principal Investigator); Mayer, K. E.

    1980-01-01

    A teaching module on image classification procedures using the VICAR computer software package was developed to optimize the training benefits for users of the VICAR programs. The field test of the module is discussed. An intensive forest land inventory strategy was developed for Humboldt County. The results indicate that LANDSAT data can be computer classified to yield site specific forest resource information with high accuracy (82%). The "Douglas-fir 80%" category was found to cover approximately 21% of the county and "Mixed Conifer 80%" covering about 13%. The "Redwood 80%" resource category, which represented dense old growth trees as well as large second growth, comprised 4.0% of the total vegetation mosaic. Furthermore, the "Brush" and "Brush-Regeneration" categories were found to be a significant part of the vegetative community, with area estimates of 9.4 and 10.0%.

  3. An Assessment of Remote Laboratory Experiments in Radio Communication

    ERIC Educational Resources Information Center

    Gampe, Andreas; Melkonyan, Arsen; Pontual, Murillo; Akopian, David

    2014-01-01

    Today's electrical and computer engineering graduates need marketable skills to work with electronic devices. Hands-on experiments prepare students to deal with real-world problems and help them to comprehend theoretical concepts and relate these to practical tasks. However, shortage of equipment, high costs, and a lack of human resources for…

  4. Development of a digital geographic data base for resource planning in a wildland environment

    NASA Technical Reports Server (NTRS)

    Ritter, P. R.; Benson, A. S.; Nedeff, N. E.

    1981-01-01

    Multiple resource planning requires the ability to access information for several parameters in a coordinated way. Attempts to do this manually, through the use of multiple transparent overlays or similar techniques, can become awkward if there are more than a few parameters under consideration. One solution to this problem is to use a computer system to collect and organize the information into a data base that will make access and analysis easier, even for large numbers of parameters. The increase in the types and forms of remote sensing data and the decrease in costs for computer systems over the last decade have made this approach more popular than in the past. This paper describes the development of one such data base for the Big Basin Redwoods State Park in the Santa Cruz Mountains in California. The data base contains information for satellite spectral data, soil type, vegetation type, and hypsographic data, and was developed for use in a cooperative project conducted by personnel from the Remote Sensing Research Program and the California Department of Parks and Recreation.

  5. NQS - NETWORK QUEUING SYSTEM, VERSION 2.0 (UNIX VERSION)

    NASA Technical Reports Server (NTRS)

    Walter, H.

    1994-01-01

    The Network Queuing System (NQS) is a versatile batch and device queuing facility for a single Unix computer or a group of networked computers. With the Unix operating system as a common interface, the user can invoke the NQS collection of user-space programs to move batch and device jobs freely among the different computer hardware tied into the network. NQS provides facilities for remote queuing, request routing, remote status, queue status controls, batch request resource quota limits, and remote output return. The program was developed as part of an effort to tie diverse UNIX-based machines into NASA's Numerical Aerodynamic Simulator Processing System Network. This revision of NQS allows for creating, deleting, adding, and setting complexes that aid in limiting the number of requests to be handled at one time. It also has improved device-oriented queues, along with some revision of the displays. NQS was designed to meet the following goals: (1) provide full support for both batch and device requests; (2) support all of the resource quotas enforceable by the underlying UNIX kernel implementation that are relevant to any particular batch request and its corresponding batch queue; (3) support remote queuing and routing of batch and device requests throughout the NQS network; (4) support queue access restrictions through user and group access lists for all queues; (5) enable networked return of both output and error files to possibly remote machines; (6) allow mapping of accounts across machine boundaries; (7) provide friendly configuration and modification mechanisms for each installation; (8) support status operations across the network without requiring a user to log in on remote target machines; and (9) provide for file staging, or copying of files, for movement to the actual execution machine. To support batch and device requests, NQS v.2 implements three queue types: batch, device, and pipe. 
Batch queues hold and prioritize batch requests; device queues hold and prioritize device requests; pipe queues transport both batch and device requests to other batch, device, or pipe queues at local or remote machines. Unique to batch queues are resource quota limits that restrict the amounts of different resources a batch request can consume during execution. Unique to each device queue is a set of one or more devices, such as a line printer, to which requests can be sent for execution. Pipe queues have associated destinations to which they route and deliver requests. If the destination machine is down or unreachable, a pipe queue requeues the request and delivers it later when the destination becomes available. All NQS network conversations are performed using the Berkeley socket mechanism as ported into the respective vendor kernels. NQS is written in the C language. The generic UNIX version (ARC-13179) has been successfully implemented on a variety of UNIX platforms, including Sun3 and Sun4 series computers, SGI IRIS computers running IRIX 3.3, DEC computers running ULTRIX 4.1, AMDAHL computers running UTS 1.3 and 2.1, and platforms running BSD 4.3 UNIX. The IBM RS/6000 AIX version (COS-10042) is a vendor port. NQS 2.0 will also communicate with the Cray Research, Inc. and Convex, Inc. versions of NQS. The standard distribution medium for either machine version of NQS 2.0 is a 60Mb, QIC-24, .25 inch streaming magnetic tape cartridge in UNIX tar format. Upon request, the generic UNIX version (ARC-13179) can be provided in UNIX tar format on alternate media. Please contact COSMIC to discuss the availability and cost of media to meet your specific needs. An electronic copy of the NQS 2.0 documentation is included on the program media. NQS 2.0 was released in 1991; the IBM RS/6000 port of NQS was developed in 1992. IRIX is a trademark of Silicon Graphics Inc. IRIS is a registered trademark of Silicon Graphics Inc. 
UNIX is a registered trademark of UNIX System Laboratories Inc. Sun3 and Sun4 are trademarks of Sun Microsystems Inc. DEC and ULTRIX are trademarks of Digital Equipment Corporation.
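
    The queue taxonomy above, batch queues with resource quota limits and pipe queues that requeue requests when their destination is unreachable, can be sketched in miniature. This is an illustrative Python model with hypothetical names, not NQS's actual C implementation:

    ```python
    from collections import deque

    class Queue:
        """Base NQS-style queue: holds and prioritizes requests."""
        def __init__(self, name):
            self.name = name
            self.requests = deque()

        def submit(self, request):
            self.requests.append(request)

    class BatchQueue(Queue):
        """Batch queue: enforces a per-request resource quota limit."""
        def __init__(self, name, cpu_limit_seconds):
            super().__init__(name)
            self.cpu_limit_seconds = cpu_limit_seconds

        def submit(self, request):
            # Reject requests whose declared quota exceeds the queue's limit.
            if request.get("cpu_seconds", 0) > self.cpu_limit_seconds:
                raise ValueError("request exceeds queue resource quota")
            super().submit(request)

    class PipeQueue(Queue):
        """Pipe queue: transports requests to a (possibly remote) destination,
        requeueing them when the destination is down or unreachable."""
        def __init__(self, name, destination, reachable):
            super().__init__(name)
            self.destination = destination
            self.reachable = reachable  # callable probing the destination

        def deliver(self):
            undelivered = deque()
            while self.requests:
                req = self.requests.popleft()
                if self.reachable():
                    self.destination.submit(req)
                else:
                    undelivered.append(req)  # keep for a later retry
            self.requests = undelivered
    ```

    Here `reachable` stands in for the network probe; a real pipe queue would retry delivery periodically rather than on demand.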

  6. Cloud CPFP: a shotgun proteomics data analysis pipeline using cloud and high performance computing.

    PubMed

    Trudgian, David C; Mirzaei, Hamid

    2012-12-07

    We have extended the functionality of the Central Proteomics Facilities Pipeline (CPFP) to allow use of remote cloud and high performance computing (HPC) resources for shotgun proteomics data processing. CPFP has been modified to include modular local and remote scheduling for data processing jobs. The pipeline can now be run on a single PC or server, a local cluster, a remote HPC cluster, and/or the Amazon Web Services (AWS) cloud. We provide public images that allow easy deployment of CPFP in its entirety in the AWS cloud. This significantly reduces the effort necessary to use the software, and allows proteomics laboratories to pay for compute time ad hoc, rather than obtaining and maintaining expensive local server clusters. Alternatively, the Amazon cloud can be used to increase the throughput of a local installation of CPFP as necessary. We demonstrate that cloud CPFP allows users to process data at higher speed than local installations, but with similar cost and lower staff requirements. In addition to the computational improvements, the web interface to CPFP is simplified, and other functionalities are enhanced. The software is under active development at two leading institutions and continues to be released under an open-source license at http://cpfp.sourceforge.net.
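
    The modular local/remote scheduling described above amounts to dispatching each job to a pluggable backend chosen by configuration. A minimal sketch follows; the class and function names are hypothetical, not CPFP's actual API:

    ```python
    class LocalScheduler:
        """Runs jobs on the local PC or server."""
        def submit(self, job):
            return f"local:{job}"

    class ClusterScheduler:
        """Submits jobs to a local or remote HPC batch system."""
        def submit(self, job):
            return f"cluster:{job}"

    class CloudScheduler:
        """Launches jobs on cloud instances paid for ad hoc."""
        def submit(self, job):
            return f"aws:{job}"

    # Backends plug in by registering a scheduler class under a name.
    SCHEDULERS = {
        "local": LocalScheduler,
        "cluster": ClusterScheduler,
        "cloud": CloudScheduler,
    }

    def dispatch(job, backend="local"):
        # The pipeline picks a backend per job from its configuration.
        return SCHEDULERS[backend]().submit(job)
    ```

    The same job description can thus run locally during development and on the cloud when throughput must scale, without changes to the pipeline itself.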

  7. Environmental application of remote sensing methods to coastal zone land use and marine resource management. Appendix F: User's guide for advection, convection prototype. [southeastern Virginia

    NASA Technical Reports Server (NTRS)

    1972-01-01

    A user's manual is provided for the environmental computer model proposed for the Richmond-Cape Henry Environmental Laboratory (RICHEL) application project for coastal zone land use investigations and marine resources management. The model was developed around the hydrologic cycle and includes two data bases consisting of climate and land use variables. The main program is described, along with control parameters to be set and pertinent subroutines.

  8. Triple-server blind quantum computation using entanglement swapping

    NASA Astrophysics Data System (ADS)

    Li, Qin; Chan, Wai Hong; Wu, Chunhui; Wen, Zhonghua

    2014-04-01

    Blind quantum computation allows a client who does not have enough quantum resources or technologies to achieve quantum computation on a remote quantum server such that the client's input, output, and algorithm remain unknown to the server. Up to now, single- and double-server blind quantum computation have been considered. In this work, we propose a triple-server blind quantum computation protocol in which the client can delegate quantum computation to three quantum servers by the use of entanglement swapping. Furthermore, the three quantum servers can communicate with each other, and the client is almost classical, since it does not require any quantum computational power or quantum memory, nor the ability to prepare any quantum states; it only needs to be capable of accessing quantum channels.

  9. Emulating a million machines to investigate botnets.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rudish, Donald W.

    2010-06-01

    Researchers at Sandia National Laboratories in Livermore, California, are creating what is in effect a vast digital petri dish able to hold one million operating systems at once, in an effort to study the behavior of rogue programs known as botnets. Botnets are used extensively by malicious computer hackers to steal computing power from Internet-connected computers. The hackers harness the stolen resources into a scattered but powerful computer that can be used to send spam, execute phishing scams, or steal digital information. These remote-controlled 'distributed computers' are difficult to observe and track. Botnets may take over parts of tens of thousands, or in some cases even millions, of computers, making them among the world's most powerful computers for some applications.

  10. Low cost infrared and near infrared sensors for UAVs

    NASA Astrophysics Data System (ADS)

    Aden, S. T.; Bialas, J. P.; Champion, Z.; Levin, E.; McCarty, J. L.

    2014-11-01

    Thermal remote sensing has a wide range of applications, though the extent of its use is inhibited by cost. Robotic and computer components are now widely available to consumers on a scale that makes thermal data a readily accessible resource. In this project, thermal imagery collected via a lightweight remote sensing Unmanned Aerial Vehicle (UAV) was used to create a surface temperature map for the purpose of providing wildland firefighting crews with a cost-effective and time-saving resource. The UAV system proved to be flexible, allowing for customized sensor packages to be designed that could include visible or infrared cameras, GPS, temperature sensors, and rangefinders, in addition to many data management options. Altogether, such a UAV system could be used to rapidly collect thermal and aerial data, with a geographic accuracy of less than one meter.

  11. The ATLAS Event Service: A new approach to event processing

    NASA Astrophysics Data System (ADS)

    Calafiura, P.; De, K.; Guan, W.; Maeno, T.; Nilsson, P.; Oleynik, D.; Panitkin, S.; Tsulaia, V.; Van Gemmeren, P.; Wenaus, T.

    2015-12-01

    The ATLAS Event Service (ES) implements a new fine-grained approach to HEP event processing, designed to be agile and efficient in exploiting transient, short-lived resources such as HPC hole-filling, spot-market commercial clouds, and volunteer computing. Input and output control and data flows, bookkeeping, monitoring, and data storage are all managed at the event level in an implementation capable of supporting ATLAS-scale distributed processing throughputs (about 4M CPU-hours/day). Input data flows utilize remote data repositories with no data locality or pre-staging requirements, minimizing the use of costly storage in favor of strongly leveraging powerful networks. Object stores provide a highly scalable means of remotely storing the quasi-continuous, fine-grained outputs that give ES-based applications a very light data footprint on a processing resource, and ensure negligible losses should the resource suddenly vanish. We will describe the motivations for the ES system, its unique features and capabilities, its architecture and the highly scalable tools and technologies employed in its implementation, and its applications in ATLAS processing on HPCs, commercial cloud resources, volunteer computing, and grid resources. Notice: This manuscript has been authored by employees of Brookhaven Science Associates, LLC under Contract No. DE-AC02-98CH10886 with the U.S. Department of Energy. The publisher, by accepting the manuscript for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes.
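
    The event-level output model described above can be illustrated in a few lines: each event's output is committed to a remote object store as soon as it is produced, so a suddenly reclaimed resource loses at most the event in flight. This is an illustrative sketch with invented names, not the ES implementation:

    ```python
    def process_events(events, process, object_store, preempted=lambda: False):
        """Process events one at a time, committing each output immediately.

        object_store: dict-like remote store for fine-grained per-event outputs.
        preempted: returns True when the transient resource is reclaimed.
        Returns the event ids that still need processing elsewhere.
        """
        remaining = list(events)
        while remaining:
            if preempted():
                break  # resource vanished; nothing buffered is lost
            event_id = remaining.pop(0)
            object_store[event_id] = process(event_id)  # quasi-continuous output
        return remaining  # bookkeeping: events to reassign
    ```

    Because outputs leave the worker as they are produced, the bookkeeping layer only needs the list of unprocessed events to resume work on another resource.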

  12. Vibroacoustic payload environment prediction system (VAPEPS): Data base management center remote access guide

    NASA Technical Reports Server (NTRS)

    Thomas, V. C.

    1986-01-01

    A Vibroacoustic Data Base Management Center has been established at the Jet Propulsion Laboratory (JPL). The center utilizes the Vibroacoustic Payload Environment Prediction System (VAPEPS) software package to manage a data base of shuttle and expendable launch vehicle flight and ground test data. Remote terminal access over telephone lines to a dedicated VAPEPS computer system has been established to provide the payload community a convenient means of querying the global VAPEPS data base. This guide describes the functions of the JPL Data Base Management Center and contains instructions for utilizing the resources of the center.

  13. Semantics-based distributed I/O with the ParaMEDIC framework.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balaji, P.; Feng, W.; Lin, H.

    2008-01-01

    Many large-scale applications simultaneously rely on multiple resources for efficient execution. For example, such applications may require both large compute and storage resources; however, very few supercomputing centers can provide large quantities of both. Thus, data generated at the compute site oftentimes has to be moved to a remote storage site for either storage or visualization and analysis. Clearly, this is not an efficient model, especially when the two sites are distributed over a wide-area network. Thus, we present a framework called 'ParaMEDIC: Parallel Metadata Environment for Distributed I/O and Computing', which uses application-specific semantic information to convert the generated data to orders-of-magnitude smaller metadata at the compute site, transfer the metadata to the storage site, and re-process the metadata at the storage site to regenerate the output. Specifically, ParaMEDIC trades a small amount of additional computation (in the form of data post-processing) for a potentially significant reduction in the data that needs to be transferred in distributed environments.
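
    The ParaMEDIC trade, extra computation in exchange for far less data on the wire, follows a simple encode/transfer/regenerate pattern. A toy sketch follows; the range example and all names are invented for illustration, since the real framework plugs in application-specific semantic processors:

    ```python
    def compute_site(raw_output, to_metadata):
        """Post-process bulk output into small, semantically rich metadata."""
        return to_metadata(raw_output)

    def storage_site(metadata, regenerate):
        """Re-process metadata to reconstruct the full output locally."""
        return regenerate(metadata)

    # Toy semantic processors for a trivially compressible output: only the
    # start value and length cross the (simulated) wide-area link, not the
    # full sequence itself.
    def sequence_metadata(seq):
        return (seq[0], len(seq))

    def sequence_regenerate(meta):
        start, n = meta
        return list(range(start, start + n))
    ```

    In the real framework the payoff depends on how much semantic structure the application exposes; for outputs with little exploitable structure, the metadata is not much smaller than the data.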

  14. Processing for spaceborne synthetic aperture radar imagery

    NASA Technical Reports Server (NTRS)

    Lybanon, M.

    1973-01-01

    The data handling and processing involved in using synthetic aperture radar as a satellite-borne earth resources remote sensor are considered. The discussion covers the nature of the problem, the theory, both conventional and potential advanced processing techniques, and a complete computer simulation. It is shown that digital processing is a real possibility, and some future directions for research are suggested.

  15. A regional land use survey based on remote sensing and other data: A report on a LANDSAT and computer mapping project, volume 2

    NASA Technical Reports Server (NTRS)

    Nez, G. (Principal Investigator); Mutter, D.

    1977-01-01

    The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.

  16. Use of remote sensing techniques for inventorying and planning utilization of land resources in South Dakota

    NASA Technical Reports Server (NTRS)

    Myers, V. I.; Frazee, C. J.; Rusche, A. E.; Moore, D. G.; Nelson, G. D.; Westin, F. C.

    1974-01-01

    The basic procedures for interpreting remote sensing imagery to rapidly develop general soils and land use inventories were developed and utilized in Pennington County, South Dakota. These procedures and remote sensing data products were illustrated and explained to many user groups, some of whom are interested in obtaining similar data. The general soils data were integrated with land soils data supplied by the county director of equalization to prepare a land value map. A computer print-out of this map indicating a land value for each quarter section is being used in tax reappraisal of Pennington County. The land use data provided the land use planners with the present use of land in Pennington County. Additional uses of remote sensing applications are also discussed including tornado damage assessment, hail damage evaluation, and presentation of soil and land value information on base maps assembled from ERTS-1 imagery.

  17. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2002-07

    USGS Publications Warehouse

    Pearson, D.K.; Gary, R.H.; Wilson, Z.D.

    2007-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is particularly useful when analyzing a wide variety of spatial data such as with remote sensing and spatial analysis. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This document presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup from 2002 through 2007.

  18. Robotic vehicles for planetary exploration

    NASA Astrophysics Data System (ADS)

    Wilcox, Brian; Matthies, Larry; Gennery, Donald; Cooper, Brian; Nguyen, Tam; Litwin, Todd; Mishkin, Andrew; Stone, Henry

    A program to develop planetary rover technology is underway at the Jet Propulsion Laboratory (JPL) under sponsorship of the National Aeronautics and Space Administration. Developmental systems with the necessary sensing, computing, power, and mobility resources to demonstrate realistic forms of control for various missions have been developed, and initial testing has been completed. These testbed systems and the associated navigation techniques used are described. Particular emphasis is placed on three technologies: Computer-Aided Remote Driving (CARD), Semiautonomous Navigation (SAN), and behavior control. It is concluded that, through the development and evaluation of such technologies, research at JPL has expanded the set of viable planetary rover mission possibilities beyond the limits of remotely teleoperated systems such as Lunakhod. These are potentially applicable to exploration of all the solid planetary surfaces in the solar system, including Mars, Venus, and the moons of the gas giant planets.

  19. Robotic vehicles for planetary exploration

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian; Matthies, Larry; Gennery, Donald; Cooper, Brian; Nguyen, Tam; Litwin, Todd; Mishkin, Andrew; Stone, Henry

    1992-01-01

    A program to develop planetary rover technology is underway at the Jet Propulsion Laboratory (JPL) under sponsorship of the National Aeronautics and Space Administration. Developmental systems with the necessary sensing, computing, power, and mobility resources to demonstrate realistic forms of control for various missions have been developed, and initial testing has been completed. These testbed systems and the associated navigation techniques used are described. Particular emphasis is placed on three technologies: Computer-Aided Remote Driving (CARD), Semiautonomous Navigation (SAN), and behavior control. It is concluded that, through the development and evaluation of such technologies, research at JPL has expanded the set of viable planetary rover mission possibilities beyond the limits of remotely teleoperated systems such as Lunakhod. These are potentially applicable to exploration of all the solid planetary surfaces in the solar system, including Mars, Venus, and the moons of the gas giant planets.

  20. Computer Aided Teaching in Photogrammetry, Remote Sensing, and Geomatics - A Status Review

    NASA Astrophysics Data System (ADS)

    Vyas, A.; Koenig, G.

    2014-04-01

    Education and training play a vital role in the utilization of technology. The shared and coordinated knowledge that geospatial technology and GIS deliver provides a deeper understanding of our present and will also help us better understand our future development. But it is not enough to explain new technological developments during congresses or workshops; it is also necessary to promote these new ideas and to distribute the knowledge by applying new learning strategies. This paper will review the status of computer-aided teaching advances during the last decade, with particular emphasis on photogrammetry, remote sensing, and geomatics. Some best-practice examples will be presented, prominently featuring recent Massive Open Online Courses (MOOCs) related to our fields. The consideration of mainly free online learning resources will include a commentary on quality and perceived effectiveness.

  1. Pilots 2.0: DIRAC pilots for all the skies

    NASA Astrophysics Data System (ADS)

    Stagni, F.; Tsaregorodtsev, A.; McNab, A.; Luzzi, C.

    2015-12-01

    In the last few years, new types of computing infrastructures, such as IaaS (Infrastructure as a Service) and IaaC (Infrastructure as a Client), have gained popularity. New resources may come as part of pledged resources, while others are opportunistic. Most of these new infrastructures are based on virtualization techniques. Meanwhile, some concepts, such as distributed queues, have lost appeal while still supporting a vast amount of resources. Virtual Organizations are therefore facing heterogeneity of the available resources, and the use of an interware software layer like DIRAC to hide the diversity of the underlying resources has become essential. The DIRAC WMS is based on the concept of pilot jobs, introduced back in 2004. A pilot is what creates the possibility to run jobs on a worker node. Within DIRAC, we developed a new generation of pilot jobs that we dubbed Pilots 2.0. Pilots 2.0 are not tied to a specific infrastructure; rather, they are generic, fully configurable, and extensible pilots. A Pilot 2.0 can be sent as a script to be run, or it can be fetched from a remote location. A Pilot 2.0 can run on every computing resource, e.g. on CREAM Computing Elements, on DIRAC Computing Elements, on Virtual Machines as part of the contextualization script, or on IaaC resources, provided that these machines are properly configured, hiding all the details of the Worker Node (WN) infrastructure. Pilots 2.0 can be generated server-side and client-side. Pilots 2.0 are the “pilots to fly in all the skies”, aiming at easy use of computing power in whatever form it is presented. Another aim is the unification and simplification of the monitoring infrastructure for all kinds of computing resources, by using pilots as a network of distributed sensors coordinated by a central resource monitoring system. Pilots 2.0 have been developed using the command pattern. VOs using DIRAC can tune Pilots 2.0 as they need, and can extend or replace each and every pilot command in an easy way. 
In this paper we describe how Pilots 2.0 work with distributed and heterogeneous resources, providing the necessary abstraction to deal with different kinds of computing resources.
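
    The command pattern mentioned above can be sketched briefly: a generic pilot body executes an ordered, configurable list of command objects, and a VO extends or replaces individual commands. This is an illustrative Python model with invented names, not DIRAC's actual pilot code:

    ```python
    class PilotCommand:
        """Base class: one configurable step of the pilot's work."""
        def execute(self, context):
            raise NotImplementedError

    class CheckEnvironment(PilotCommand):
        def execute(self, context):
            context["checked"] = True    # e.g. probe the WN, hide its details

    class InstallDIRAC(PilotCommand):
        def execute(self, context):
            context["installed"] = True  # e.g. fetch and configure the client

    class LaunchJobAgent(PilotCommand):
        def execute(self, context):
            context["matching"] = True   # start pulling real jobs to run

    def run_pilot(commands, context=None):
        """Generic pilot body: the same loop runs on any computing resource."""
        context = context or {}
        for command in commands:
            command.execute(context)
        return context
    ```

    A VO would customize behaviour by subclassing a command or substituting its own into the list, leaving the generic loop untouched.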

  2. Value of Landsat in urban water resources planning

    NASA Technical Reports Server (NTRS)

    Jackson, T. J.; Ragan, R. M.

    1977-01-01

    The reported investigation had the objective of evaluating the utility of satellite multispectral remote sensing in urban water resources planning. The results are presented of a study conducted to determine the economic impact of Landsat data. The use of Landsat data to estimate hydrologic model parameters employed in urban water resources planning is discussed. A decision regarding employment of the Landsat data has to consider the tradeoff between data accuracy and cost; Bayesian decision theory is used in this connection. It is concluded that computer-aided interpretation of Landsat data is a highly cost-effective method of estimating the percentage of impervious area.

  3. Image Processing

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The Computer Graphics Center of North Carolina State University uses LAS, a COSMIC program, to analyze and manipulate data from Landsat and SPOT providing information for government and commercial land resource application projects. LAS is used to interpret aircraft/satellite data and enables researchers to improve image-based classification accuracies. The system is easy to use and has proven to be a valuable remote sensing training tool.

  4. Changes in the Arctic: Background and Issues for Congress

    DTIC Science & Technology

    2014-04-28

    knowledge of the physical environment. Data must be obtained by a suite of remote sensors (satellites, radars), autonomous sensors (data buoys...unmanned vehicles), and manned sensors (shipboard, coastal observing stations). Computer-based ocean and atmospheric models must be adjusted to the... soot ). 6. Implementation: In carrying out this policy as it relates to environmental protection and conservation of natural resources, the

  5. Information Power Grid: Distributed High-Performance Computing and Large-Scale Data Management for Science and Engineering

    NASA Technical Reports Server (NTRS)

    Johnston, William E.; Gannon, Dennis; Nitzberg, Bill

    2000-01-01

    We use the term "Grid" to refer to distributed, high performance computing and data handling infrastructure that incorporates geographically and organizationally dispersed, heterogeneous resources that are persistent and supported. This infrastructure includes: (1) Tools for constructing collaborative, application oriented Problem Solving Environments / Frameworks (the primary user interfaces for Grids); (2) Programming environments, tools, and services providing various approaches for building applications that use aggregated computing and storage resources, and federated data sources; (3) Comprehensive and consistent set of location independent tools and services for accessing and managing dynamic collections of widely distributed resources: heterogeneous computing systems, storage systems, real-time data sources and instruments, human collaborators, and communications systems; (4) Operational infrastructure including management tools for distributed systems and distributed resources, user services, accounting and auditing, strong and location independent user authentication and authorization, and overall system security services. The vision for NASA's Information Power Grid - a computing and data Grid - is that it will provide significant new capabilities to scientists and engineers by facilitating routine construction of information based problem solving environments / frameworks. Such Grids will knit together widely distributed computing, data, instrument, and human resources into just-in-time systems that can address complex and large-scale computing and data analysis problems. 
Examples of these problems include: (1) Coupled, multidisciplinary simulations too large for single systems (e.g., multi-component NPSS turbomachine simulation); (2) Use of widely distributed, federated data archives (e.g., simultaneous access to meteorological, topological, aircraft performance, and flight path scheduling databases supporting a National Air Space Simulation system); (3) Coupling large-scale computing and data systems to scientific and engineering instruments (e.g., realtime interaction with experiments through real-time data analysis and interpretation presented to the experimentalist in ways that allow direct interaction with the experiment, instead of just with instrument control); (4) Highly interactive, augmented reality and virtual reality remote collaborations (e.g., the Ames / Boeing Remote Help Desk providing field maintenance use of coupled video and NDI to a remote, on-line airframe structures expert, who uses this data to index into detailed design databases and returns 3D internal aircraft geometry to the field); (5) Single computational problems too large for any single system (e.g., the rotorcraft reference calculation). Grids also have the potential to provide pools of resources that could be called on in extraordinary / rapid response situations (such as disaster response) because they can provide common interfaces and access mechanisms, standardized management, and uniform user authentication and authorization, for large collections of distributed resources (whether or not they normally function in concert). IPG development and deployment is addressing requirements obtained by analyzing a number of different application areas, in particular from the NASA Aero-Space Technology Enterprise. This analysis has focused primarily on two types of users: the scientist / design engineer whose primary interest is problem solving (e.g. 
determining wing aerodynamic characteristics in many different operating environments), and whose primary interface to IPG will be through various sorts of problem solving frameworks. The second type of user is the tool designer: the computational scientist who converts physics and mathematics into code that can simulate the physical world. These are the two primary users of IPG, and they have rather different requirements. The results of the analysis of the needs of these two types of users provide a broad set of requirements that gives rise to a general set of required capabilities. The IPG project is intended to address all of these requirements. In some cases the required computing technology exists, and in some cases it must be researched and developed. The project is using available technology to provide a prototype set of capabilities in a persistent distributed computing testbed. Beyond this, there are required capabilities that are not immediately available, and whose development spans the range from near-term engineering development (one to two years) to much longer-term R&D (three to six years). Additional information is contained in the original.

  6. Research on cloud-based remote measurement and analysis system

    NASA Astrophysics Data System (ADS)

    Gao, Zhiqiang; He, Lingsong; Su, Wei; Wang, Can; Zhang, Changfan

    2015-02-01

    The promising potential of cloud computing, and its convergence with technologies such as cloud storage, cloud push, and mobile computing, allows for the creation and delivery of new types of cloud service. Following this idea, this paper presents a cloud-based remote measurement and analysis system. The system consists of three main parts: a signal acquisition client, a web server deployed as a cloud service, and a remote client. The system is a website developed using asp.net and Flex RIA technology, which resolves the trade-off between the two monitoring modes, B/S (browser/server) and C/S (client/server). The platform supplies condition monitoring and data analysis services over the Internet and is deployed on a cloud server. The signal acquisition device is responsible for data collection (sensor data, audio, video, etc.) and regularly pushes the monitoring data to the cloud storage database. Data acquisition equipment in this system needs only data collection and network capability, as found in smartphones and smart sensors. The system's scale can adjust dynamically according to the number of applications and users, so it does not waste resources. As a representative case study, we developed a prototype system based on the Ali cloud service, using a rotor test rig as the research object. Experimental results demonstrate that the proposed system architecture is feasible.

  7. Community-driven computational biology with Debian Linux.

    PubMed

    Möller, Steffen; Krabbenhöft, Hajo Nils; Tille, Andreas; Paleino, David; Williams, Alan; Wolstencroft, Katy; Goble, Carole; Holland, Richard; Belhachemi, Dominique; Plessy, Charles

    2010-12-21

    The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers.

  8. An Experimental Framework for Executing Applications in Dynamic Grid Environments

    NASA Technical Reports Server (NTRS)

    Huedo, Eduardo; Montero, Ruben S.; Llorente, Ignacio M.; Bushnell, Dennis M. (Technical Monitor)

    2002-01-01

    The Grid opens up opportunities for resource-starved scientists and engineers to harness highly distributed computing resources. A number of Grid middleware projects are currently available to support the simultaneous exploitation of heterogeneous resources distributed in different administrative domains. However, efficient job submission and management remain far from accessible to ordinary scientists and engineers due to the dynamic and complex nature of the Grid. This report describes a new Globus framework that allows an easier and more efficient execution of jobs in a 'submit and forget' fashion. Adaptation to dynamic Grid conditions is achieved by supporting automatic application migration following performance degradation, 'better' resource discovery, requirement change, owner decision, or remote resource failure. The report also includes experimental results of the behavior of our framework on the TRGP testbed.

  9. Final Scientific Report: A Scalable Development Environment for Peta-Scale Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karbach, Carsten; Frings, Wolfgang

    2013-02-22

    This document is the final scientific report of the project DE-SC000120 (A scalable Development Environment for Peta-Scale Computing). The objective of this project is the extension of the Parallel Tools Platform (PTP) for applying it to peta-scale systems. PTP is an integrated development environment for parallel applications. It comprises code analysis, performance tuning, parallel debugging and system monitoring. The contribution of the Juelich Supercomputing Centre (JSC) aims to provide a scalable solution for system monitoring of supercomputers. This includes the development of a new communication protocol for exchanging status data between the target remote system and the client running PTP.more » The communication has to work for high latency. PTP needs to be implemented robustly and should hide the complexity of the supercomputer's architecture in order to provide a transparent access to various remote systems via a uniform user interface. This simplifies the porting of applications to different systems, because PTP functions as abstraction layer between parallel application developer and compute resources. The common requirement for all PTP components is that they have to interact with the remote supercomputer. E.g. applications are built remotely and performance tools are attached to job submissions and their output data resides on the remote system. Status data has to be collected by evaluating outputs of the remote job scheduler and the parallel debugger needs to control an application executed on the supercomputer. The challenge is to provide this functionality for peta-scale systems in real-time. The client server architecture of the established monitoring application LLview, developed by the JSC, can be applied to PTP's system monitoring. LLview provides a well-arranged overview of the supercomputer's current status. 
A set of statistics, a list of running and queued jobs, and a node display mapping running jobs to their compute resources form the user display of LLview. These monitoring features have to be integrated into the development environment. Besides showing the current status, PTP's monitoring also needs to allow for submitting and canceling user jobs. Monitoring peta-scale systems especially means presenting the large amount of status data in a useful manner. Users need to select arbitrary levels of detail. The monitoring views have to provide a quick overview of the system state, but also need to allow for zooming into the specific parts of the system in which the user is interested. At present, the major batch systems running on supercomputers are PBS, TORQUE, ALPS and LoadLeveler, all of which have to be supported by both the monitoring and the job-controlling component. Finally, PTP needs to be designed as generically as possible, so that it can be extended for future batch systems.
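The status-collection step described in this record — evaluating the output of a remote batch scheduler — can be sketched as a small parser that turns a job listing into structured records. The sample text, field layout, and job attributes below are illustrative assumptions, not the actual PTP/LLview protocol:

```python
# Hypothetical PBS/TORQUE-style job listing (whitespace-separated fields:
# job id, name, owner, state, node count). Real qstat output is richer.
SAMPLE = """\
1001.master  sim_a  alice  R  64
1002.master  sim_b  bob    Q  128
"""

def parse_jobs(text):
    """Parse a scheduler listing into a list of job-status dictionaries."""
    jobs = []
    for line in text.strip().splitlines():
        job_id, name, owner, state, nodes = line.split()
        jobs.append({"id": job_id, "name": name, "owner": owner,
                     "state": state, "nodes": int(nodes)})
    return jobs

# A monitoring client would poll periodically and filter by state.
running = [j for j in parse_jobs(SAMPLE) if j["state"] == "R"]
print(running)
```

A client like the one described would ship such parsed records over its protocol to the IDE, which renders them as the statistics and job-list views.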

  10. The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath

    PubMed Central

    Ellisman, M.; Hutton, T.; Kirkland, A.; Lin, A.; Lin, C.; Molina, T.; Peltier, S.; Singh, R.; Tang, K.; Trefethen, A.E.; Wallom, D.C.H.; Xiong, X.

    2009-01-01

    The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients. PMID:19487201

  11. The OptIPuter microscopy demonstrator: enabling science through a transatlantic lightpath.

    PubMed

    Ellisman, M; Hutton, T; Kirkland, A; Lin, A; Lin, C; Molina, T; Peltier, S; Singh, R; Tang, K; Trefethen, A E; Wallom, D C H; Xiong, X

    2009-07-13

    The OptIPuter microscopy demonstrator project has been designed to enable concurrent and remote usage of world-class electron microscopes located in Oxford and San Diego. The project has constructed a network consisting of microscopes and computational and data resources that are all connected by a dedicated network infrastructure using the UK Lightpath and US Starlight systems. Key science drivers include examples from both materials and biological science. The resulting system is now a permanent link between the Oxford and San Diego microscopy centres. This will form the basis of further projects between the sites and expansion of the types of systems that can be remotely controlled, including optical, as well as electron, microscopy. Other improvements will include the updating of the Microsoft cluster software to the high performance computing (HPC) server 2008, which includes the HPC basic profile implementation that will enable the development of interoperable clients.

  12. Remote sensing of natural resources: Quarterly literature review

    NASA Technical Reports Server (NTRS)

    1976-01-01

A quarterly review of technical literature concerning remote sensing techniques is presented. The format contains indexed and abstracted materials, with emphasis on data gathering techniques performed or obtained remotely from space, aircraft, or ground-based stations. Remote sensor applications, including the remote sensing of natural resources, are presented.

  13. Earth System Grid II, Turning Climate Datasets into Community Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Middleton, Don

    2006-08-01

The Earth System Grid (ESG) II project, funded by the Department of Energy’s Scientific Discovery through Advanced Computing program, has transformed climate data into community resources. ESG II has accomplished this goal by creating a virtual collaborative environment that links climate centers and users around the world to models and data via a computing Grid, which is based on the Department of Energy’s supercomputing resources and the Internet. Our project’s success stems from partnerships between climate researchers and computer scientists to advance basic and applied research in the terrestrial, atmospheric, and oceanic sciences. By interfacing with other climate science projects, we have learned that commonly used methods to manage and remotely distribute data among related groups lack infrastructure and under-utilize existing technologies. Knowledge and expertise gained from ESG II have helped the climate community plan strategies to manage a rapidly growing data environment more effectively. Moreover, approaches and technologies developed under the ESG project have influenced data-simulation integration in other disciplines, such as astrophysics, molecular biology, and materials science.

  14. Telescience workstation

    NASA Technical Reports Server (NTRS)

    Brown, Robert L.; Doyle, Dee; Haines, Richard F.; Slocum, Michael

    1989-01-01

As part of the Telescience Testbed Pilot Program, the Universities Space Research Association/Research Institute for Advanced Computer Science (USRA/RIACS) proposed to support remote communication by providing a network of human/machine interfaces, computer resources, and experimental equipment which allows remote science, collaboration, technical exchange, and multimedia communication. The telescience workstation is intended to provide a local computing environment for telescience. The purposes of the program are as follows: (1) to provide a suitable environment to integrate existing and new software for a telescience workstation; (2) to provide a suitable environment to develop new software in support of telescience activities; (3) to provide an interoperable environment so that a wide variety of workstations may be used in the telescience program; (4) to provide a supportive infrastructure and a common software base; and (5) to advance, apply, and evaluate the telescience technology base. A prototype telescience computing environment designed to bring practicing scientists in domains other than computer science into a modern style of doing their computing was created and deployed. This environment, the Telescience Windowing Environment, Phase 1 (TeleWEn-1), met some, but not all, of the goals stated above. The TeleWEn-1 provided a window-based workstation environment and a set of tools for text editing, document preparation, electronic mail, multimedia mail, raster manipulation, and system management.

  15. Groundwater resource exploration in Salem district, Tamil Nadu using GIS and remote sensing

    NASA Astrophysics Data System (ADS)

    Maheswaran, G.; Selvarani, A. Geetha; Elangovan, K.

    2016-03-01

Over the last decade, the value of a barrel of potable groundwater has outpaced the value of a barrel of oil in many areas of the world. Hence, proper assessment of groundwater potential and sound management practices are the needs of the day. Establishing relationships between remote sensing data and hydrologic phenomena can maximize the efficiency of water resources development projects. The present study focuses on assessing groundwater resource potential in Salem district, Tamil Nadu. All thematic layers important from the point of view of groundwater occurrence and movement were digitized and integrated in the GIS environment. The weights of the different parameters/themes were computed using weighted index overlay analysis (WIOA), the analytic hierarchy process (AHP), and a fuzzy logic technique. Through this integrated GIS analysis, a qualitative groundwater prospect map of the study area was prepared. Identified potential zones were verified in the field against the depth of water measured at observation wells. The map generated from the weighted overlay using AHP performed very well in predicting the groundwater surface, and hence this methodology proves to be a promising tool for the future.
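The AHP weighting step used in studies like this one can be illustrated with a short sketch: theme weights are taken as the normalized principal eigenvector of a Saaty-style pairwise comparison matrix, and a consistency ratio checks the judgments. The matrix values and the implied themes below are hypothetical, not taken from the study:

```python
import numpy as np

# Pairwise comparison matrix for three hypothetical themes (e.g. geology,
# lineament density, slope); entries are illustrative Saaty-scale judgments.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 3.0],
    [1/5, 1/3, 1.0],
])

def ahp_weights(A):
    """Return normalized AHP priority weights and the consistency ratio."""
    eigvals, eigvecs = np.linalg.eig(A)
    k = np.argmax(eigvals.real)            # index of principal eigenvalue
    w = np.abs(eigvecs[:, k].real)
    w /= w.sum()                           # normalize weights to sum to 1
    n = A.shape[0]
    ci = (eigvals[k].real - n) / (n - 1)   # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12}[n]  # Saaty's random index
    cr = ci / ri if ri else 0.0
    return w, cr

weights, cr = ahp_weights(A)
print(weights, cr)  # CR below 0.1 is conventionally acceptable
```

The resulting weights would then multiply the reclassified thematic layers in the weighted overlay to produce the prospect map.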

  16. Application of remote sensing to water resources problems

    NASA Technical Reports Server (NTRS)

    Clapp, J. L.

    1972-01-01

    The following conclusions were reached concerning the applications of remote sensing to water resources problems: (1) Remote sensing methods provide the most practical method of obtaining data for many water resources problems; (2) the multi-disciplinary approach is essential to the effective application of remote sensing to water resource problems; (3) there is a correlation between the amount of suspended solids in an effluent discharged into a water body and reflected energy; (4) remote sensing provides for more effective and accurate monitoring, discovery and characterization of the mixing zone of effluent discharged into a receiving water body; and (5) it is possible to differentiate between blue and blue-green algae.

  17. EarthExplorer

    USGS Publications Warehouse

    Houska, Treva

    2012-01-01

    The EarthExplorer trifold provides basic information for on-line access to remotely-sensed data from the U.S. Geological Survey Earth Resources Observation and Science (EROS) Center archive. The EarthExplorer (http://earthexplorer.usgs.gov/) client/server interface allows users to search and download aerial photography, satellite data, elevation data, land-cover products, and digitized maps. Minimum computer system requirements and customer service contact information also are included in the brochure.

  18. Assessing climate change impacts on water resources in remote mountain regions

    NASA Astrophysics Data System (ADS)

    Buytaert, Wouter; De Bièvre, Bert

    2013-04-01

From a water resources perspective, remote mountain regions are often considered a basket case. They are regions where poverty is often interlocked with multiple threats to water supply, data scarcity, and high uncertainties. In these environments, it is paramount to generate locally relevant knowledge about water resources and how they impact local livelihoods. This is often problematic. Existing environmental data collection tends to be geographically biased towards more densely populated regions, and prioritized towards strategic economic activities. Data may also be locked behind institutional and technological barriers. These issues create a "knowledge trap" for data-poor regions, which is especially acute in remote and hard-to-reach mountain regions. We present lessons learned from a decade of water resources research in remote mountain regions of the Andes, Africa and South Asia. We review the entire tool chain of assessing climate change impacts on water resources, including the interrogation and downscaling of global circulation models, the translation of climate variables into water availability and access, and the assessment of local vulnerability. In global circulation models, mountain regions often stand out as regions of high uncertainty and lack of agreement on future trends. This is partly a technical artifact of the resolution and representation of mountain topography, but it also highlights fundamental uncertainties in climate impacts on mountain climate. This problem also affects downscaling efforts, because regional climate models must be run at very high spatial resolution to resolve local gradients, which is computationally very expensive. At the same time, statistical downscaling methods may fail to find significant relations between local climate properties and synoptic processes.
Further uncertainties are introduced when downscaled climate variables such as precipitation and temperature are translated into hydrologically relevant variables such as streamflow and groundwater recharge. Fundamental limitations in both the understanding of hydrological processes in mountain regions (e.g., glacier melt, wetland attenuation, groundwater flows) and in data availability introduce large uncertainties. Lastly, assessing access to water resources is a major challenge. Topographical gradients and barriers, as well as strong spatiotemporal variations in hydrological processes, make it particularly difficult to assess which parts of the mountain population are most vulnerable to future perturbations of the water cycle.
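The statistical downscaling idea referred to above can be sketched minimally: regress a local station variable on a synoptic-scale predictor, then apply the fitted transfer function to a future scenario. The data here are synthetic and the predictor choice is purely illustrative; real studies would first screen for a significant relation, which in mountain regions may not exist:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic example: a large-scale GCM predictor (e.g. a geopotential-height
# anomaly series) and a local station temperature that partly tracks it.
gcm = rng.normal(size=200)                                   # predictor series
local = 12.0 + 2.5 * gcm + rng.normal(scale=1.0, size=200)   # station record

# Fit the transfer function local = a + b * gcm by ordinary least squares.
X = np.column_stack([np.ones_like(gcm), gcm])
a, b = np.linalg.lstsq(X, local, rcond=None)[0]

# "Downscale" a future GCM scenario with the fitted relation.
future_gcm = rng.normal(loc=0.8, size=50)    # shifted large-scale state
future_local = a + b * future_gcm
print(round(a, 2), round(b, 2))
```

The weak link the abstract points to is exactly this transfer function: if the local-synoptic correlation is weak, the regression carries little real information into the projection.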

  19. Extensible Computational Chemistry Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2012-08-09

ECCE provides a sophisticated graphical user interface, scientific visualization tools, and the underlying data management framework enabling scientists to efficiently set up calculations and store, retrieve, and analyze the rapidly growing volumes of data produced by computational chemistry studies. ECCE was conceived as part of the Environmental Molecular Sciences Laboratory construction to solve the problem of enabling researchers to effectively utilize complex computational chemistry codes and massively parallel high performance compute resources. Bringing the power of these codes and resources to the desktops of researchers, and thus enabling world-class research without users needing a detailed understanding of the inner workings of either the theoretical codes or the supercomputers needed to run them, was a grand challenge problem in the original version of the EMSL. ECCE allows collaboration among researchers using a web-based data repository where the inputs and results for all calculations done within ECCE are organized. ECCE is a first-of-its-kind end-to-end problem solving environment for all phases of computational chemistry research: setting up calculations with sophisticated GUI and direct-manipulation visualization tools, submitting and monitoring calculations on remote high performance supercomputers without having to be familiar with the details of using these compute resources, and performing results visualization and analysis, including creating publication-quality images. ECCE is a suite of tightly integrated applications that are employed as the user moves through the modeling process.

  20. A real-time remote video streaming platform for ultrasound imaging.

    PubMed

    Ahmadi, Mehdi; Gross, Warren J; Kadoury, Samuel

    2016-08-01

Ultrasound is a viable imaging technology in remote and resource-limited areas. Ultrasonography is a user-dependent skill which requires a high degree of training and hands-on experience. However, only a limited number of skilled sonographers are located in remote areas. In this work, we aim to develop a real-time video streaming platform which allows specialist physicians to remotely monitor ultrasound exams. To this end, an ultrasound stream is captured and transmitted through a wireless network to remote computers, smartphones, and tablets. In addition, the system is equipped with a camera to track the position of the ultrasound probe. The main advantage of our work is the use of an open-source platform for video streaming, which gives us more control over streaming parameters than the available commercial products. The transmission delays of the system were evaluated for several ultrasound video resolutions, and the results show that ultrasound videos close to high-definition (HD) resolution can be received and displayed on an Android tablet with a delay of 0.5 seconds, which is acceptable for accurate real-time diagnosis.

  1. Measurement of Hydrologic Resource Parameters Through Remote Sensing in the Feather River Headwaters Area

    NASA Technical Reports Server (NTRS)

    Thorley, G. A.; Draeger, W. C.; Lauer, D. T.; Lent, J.; Roberts, E.

    1971-01-01

The four problem areas being investigated are: (1) determination of the feasibility of providing the resource manager with operationally useful information through the use of remote sensing techniques; (2) definition of the spectral characteristics of earth resources and the optimum procedures for calibrating tone and color characteristics of multispectral imagery; (3) determination of the extent to which humans can extract useful earth resource information from remote sensing imagery; and (4) determination of the extent to which automatic classification and data processing can extract useful information from remote sensing data.

  2. "One-Stop Shopping" for Ocean Remote-Sensing and Model Data

    NASA Technical Reports Server (NTRS)

    Li, P. Peggy; Vu, Quoc; Chao, Yi; Li, Zhi-Jin; Choi, Jei-Kook

    2006-01-01

OurOcean Portal 2.0 (http://ourocean.jpl.nasa.gov) is a software system designed to enable users to easily gain access to ocean observation data, both remote-sensing and in-situ, to configure and run an ocean model with observation data assimilated on a remote computer, and to visualize both the observation data and the model outputs. At present, the observation data and models focus on the California coastal regions and Prince William Sound in Alaska. This system can be used to perform both real-time and retrospective analyses of remote-sensing data and model outputs. OurOcean Portal 2.0 incorporates state-of-the-art information technologies (IT) such as a MySQL database, a Java web server (Apache/Tomcat), the Live Access Server (LAS), interactive graphics with a Java applet at the client site and MatLab/GMT at the server site, and distributed computing. OurOcean currently serves over 20 real-time or historical ocean data products. The data are served as pre-generated plots or in their native data format. For some of the datasets, users can choose different plotting parameters and produce customized graphics. OurOcean also serves 3D ocean model outputs generated by ROMS (Regional Ocean Model System) using LAS. The Live Access Server (LAS) software, developed by the Pacific Marine Environmental Laboratory (PMEL) of the National Oceanic and Atmospheric Administration (NOAA), is a configurable web-server program designed to provide flexible access to geo-referenced scientific data. The model output can be viewed as plots in horizontal slices, depth profiles, or time sequences, or can be downloaded as raw data in different formats, such as NetCDF, ASCII, or binary. The interactive visualization is provided by the graphics package Ferret, also developed by PMEL. In addition, OurOcean allows users with minimal computing resources to configure and run an ocean model with data assimilation on a remote computer.
Users may select the forcing input, the data to be assimilated, the simulation period, and the output variables and submit the model to run on a backend parallel computer. When the run is complete, the output will be added to the LAS server for

  3. Promoting Sustainable Agricultural Practices Through Remote Sensing Education and Outreach

    NASA Astrophysics Data System (ADS)

    Driese, K. L.; Sivanpillai, R.

    2007-12-01

Ever-increasing demand for food and fiber calls for farm management strategies, such as effective use of chemicals and efficient water use, that will maximize productivity while reducing adverse impacts on the environment. Remotely sensed data collected by satellites are a valuable resource for farmers and ranchers seeking insights about farm and ranch productivity. While researchers in universities and agencies have made tremendous advances, technology transfer to end-users has lagged, preventing farmers from taking advantage of this valuable resource. To overcome this barrier, the Upper Midwest Aerospace Consortium (UMAC), a NASA-funded program headed by the University of North Dakota, has been working with end-users to promote the use of remote sensing technology for sustainable agricultural practices. We will highlight the UMAC activities in Wyoming aimed at promoting this technology to sugar-beet farmers in the Big Horn Basin. To assist farmers who might not have a computer at home, we provide computers at local county Cooperative Extension Offices pre-loaded with relevant imagery. Our targeted outreach activities have resulted in farmers requesting and using new and old Landsat images to identify growth anomalies and trends, which have enabled them to develop management zones within their croplands.

  4. Expanding Your Laboratory by Accessing Collaboratory Resources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoyt, David W.; Burton, Sarah D.; Peterson, Michael R.

    2004-03-01

The Environmental Molecular Sciences Laboratory (EMSL) in Richland, Washington, is the home of a research facility set up by the United States Department of Energy (DOE). The facility is atypical because it houses over 100 cutting-edge research systems for the use of researchers all over the United States and the world. Access to the lab is requested through a peer-review proposal process, and the scientists who use the facility are generally referred to as ‘users’. There are six main research facilities housed in EMSL, all of which host visiting researchers. Several of these facilities also participate in the EMSL Collaboratory, a remote access capability supported by EMSL operations funds. Of these, the High-Field Magnetic Resonance Facility (HFMRF) and the Molecular Science Computing Facility (MSCF) have a significant number of their users performing remote work. The HFMRF in EMSL currently houses 12 NMR spectrometers that range in magnet field strength from 7.05T to 21.1T. Staff associated with the NMR facility offer scientific expertise in the areas of structural biology, solid-state materials/catalyst characterization, and magnetic resonance imaging (MRI) techniques. The way in which the HFMRF operates, with a high level of dedication to remote operation across the full suite of high-field NMR spectrometers, has earned it the name “Virtual NMR Facility”. This review focuses on the operational aspects of remote research done in the High-Field Magnetic Resonance Facility and the computer tools that make remote experiments possible.

  5. Findings and Challenges in Fine-Resolution Large-Scale Hydrological Modeling

    NASA Astrophysics Data System (ADS)

    Her, Y. G.

    2017-12-01

Fine-resolution large-scale (FL) modeling can provide an overall picture of the hydrological cycle and transport while taking unique local conditions into account in the simulation. It can also help develop water resources management plans that are consistent across spatial scales by describing the spatial consequences of decisions and hydrological events extensively. FL modeling is expected to become common in the near future as global-scale remotely sensed data emerge and computing resources advance rapidly. There are several spatially distributed models available for hydrological analyses. Some of them rely on numerical methods such as finite difference/element methods (FDM/FEM), which, to describe two-dimensional overland processes, require excessive computing resources to manipulate large matrices (implicit scheme) or small simulation time intervals to maintain the stability of the solution (explicit scheme). Others make unrealistic assumptions, such as a constant overland flow velocity, to reduce the computational load of the simulation. Thus, simulation efficiency often comes at the expense of precision and reliability in FL modeling. Here, we introduce a new FL continuous hydrological model and its application to four watersheds in different landscapes, ranging in size from 3.5 km2 to 2,800 km2, at a spatial resolution of 30 m on an hourly basis. The model provided acceptable accuracy statistics in reproducing hydrological observations made in the watersheds. The modeling outputs, including maps of simulated travel time, runoff depth, soil water content, and groundwater recharge, were animated, visualizing the dynamics of hydrological processes occurring in the watersheds during and between storm events.
Findings and challenges were discussed in the context of modeling efficiency, accuracy, and reproducibility, which we found can be improved, respectively, by employing advanced computing techniques and hydrological understanding, by using remotely sensed hydrological observations such as soil moisture and radar rainfall depth, and by sharing the model and its code in the public domain.

  6. Characterization and classification of vegetation canopy structure and distribution within the Great Smoky Mountains National Park using LiDAR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Kumar, Jitendra; Hargrove Jr., William Walter; Norman, Steven P

Vegetation canopy structure is a critically important habitat characteristic for many threatened and endangered birds and other animal species, and it is key information needed by forest and wildlife managers for monitoring and managing forest resources, conservation planning, and fostering biodiversity. Advances in Light Detection and Ranging (LiDAR) technologies have enabled remote sensing-based studies of vegetation canopies by capturing three-dimensional structures, yielding information not available in the two-dimensional images of the landscape provided by traditional multi-spectral remote sensing platforms. However, the large-volume data sets produced by airborne LiDAR instruments pose a significant computational challenge, requiring algorithms to identify and analyze patterns of interest buried within LiDAR point clouds in a computationally efficient manner, utilizing state-of-the-art computing infrastructure. We developed and applied a computationally efficient approach to analyze a large volume of LiDAR data and to characterize and map the vegetation canopy structures for 139,859 hectares (540 sq. miles) in the Great Smoky Mountains National Park. This study helps improve our understanding of the distribution of vegetation and animal habitats in this extremely diverse ecosystem.
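One simple canopy-structure metric that can be derived from a LiDAR point cloud is a gridded canopy height estimate: per grid cell, the highest return minus the lowest return (a crude ground proxy). This is a toy sketch of the general idea, not the algorithm used in the study; the cell size and synthetic points are assumptions:

```python
import numpy as np

def canopy_height_grid(points, cell=5.0):
    """Grid (x, y, z) LiDAR returns and estimate canopy height per cell as
    highest return minus lowest return within the cell."""
    x, y, z = points.T
    ix = np.floor(x / cell).astype(int)
    iy = np.floor(y / cell).astype(int)
    extremes = {}
    for cx, cy, cz in zip(ix, iy, z):
        lo, hi = extremes.get((cx, cy), (np.inf, -np.inf))
        extremes[(cx, cy)] = (min(lo, cz), max(hi, cz))
    return {cell_id: hi - lo for cell_id, (lo, hi) in extremes.items()}

# Tiny synthetic cloud: ground returns near z = 0, canopy returns near z = 20.
pts = np.array([
    [1.0, 1.0, 0.2], [2.0, 2.0, 19.8], [3.0, 1.0, 20.1],   # cell (0, 0): forest
    [7.0, 1.0, 0.1], [8.0, 2.0, 0.3],                      # cell (1, 0): open ground
])
print(canopy_height_grid(pts))
```

Production pipelines replace the min-return ground proxy with a proper ground classification and process tiles in parallel, which is where the computational challenge described above arises.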

  7. REMOTE: Modem Communicator Program for the IBM personal computer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McGirt, F.

    1984-06-01

    REMOTE, a Modem Communicator Program, was developed to provide full duplex serial communication with arbitrary remote computers via either dial-up telephone modems or direct lines. The latest version of REMOTE (documented in this report) was developed for the IBM Personal Computer.

  8. Application of Remote Sensing Techniques for Appraising Changes in Wildlife Habitat

    NASA Technical Reports Server (NTRS)

    Nelson, H. K.; Klett, A. T.; Johnston, J. E.

    1971-01-01

An attempt was made to investigate the potential of airborne, multispectral, line scanner data acquisition and computer-implemented automatic recognition techniques for providing useful information about waterfowl breeding habitat in North Dakota. The spectral characteristics of the components of a landscape containing waterfowl habitat can be detected with airborne scanners. By analyzing these spectral characteristics it is possible to identify and map the landscape components through analog and digital processing methods. At the present stage of development, multispectral remote sensing techniques are not ready for operational application to surveys of migratory bird habitat and other such resources. Further developments are needed to: (1) increase accuracy; (2) decrease retrieval and processing time; and (3) reduce costs.

  9. A procedure for automated land use mapping using remotely sensed multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Whitley, S. L.

    1975-01-01

    A system of processing remotely sensed multispectral scanner data by computer programs to produce color-coded land use maps for large areas is described. The procedure is explained, the software and the hardware are described, and an analogous example of the procedure is presented. Detailed descriptions of the multispectral scanners currently in use are provided together with a summary of the background of current land use mapping techniques. The data analysis system used in the procedure and the pattern recognition software used are functionally described. Current efforts by the NASA Earth Resources Laboratory to evaluate operationally a less complex and less costly system are discussed in a separate section.

  10. Remote Sensing of Earth and Environment

    ERIC Educational Resources Information Center

    Schertler, Ronald J.

    1974-01-01

    Discusses basic principles of remote sensing applications and five areas of the earth resources survey program: agriculture and forestry production; geography, cartography, cultural resources; geology and mineral resources; hydrology and water resources; and oceanography and marine resources. Indicates that information acquisition is the first…

  11. Telescience Support Center Data System Software

    NASA Technical Reports Server (NTRS)

    Rahman, Hasan

    2010-01-01

The Telescience Support Center (TSC) team has developed a database-driven, increment-specific Data Requirement Document (DRD) generation tool that automates much of the work required for generating and formatting the DRD. It creates a database to load the required changes to configure the TSC data system, thus eliminating a substantial amount of labor in database entry and formatting. The TSC database contains the TSC systems configuration along with the experimental data, in which human physiological data must be de-commutated in real time. The data for each experiment must also be cataloged and archived for future retrieval. TSC software provides tools and resources for ground operation and data distribution to remote users consisting of principal investigators (PIs), biomedical engineers, scientists, engineers, payload specialists, and computer scientists. Operations support is provided for computer systems access, detailed networking, and mathematical and computational problems of the International Space Station telemetry data. User training is provided for on-site staff, biomedical researchers, and other remote personnel in the usage of the space-bound services via the Internet, which enables significant resource savings for the physical facility along with the time savings versus traveling to NASA sites. The software used in support of the TSC could easily be adapted to other control center applications. This would include not only other NASA payload monitoring facilities, but also other types of control activities, such as monitoring and control of electric grid, chemical, or nuclear plant processes, air traffic control, and the like.

  12. Test Bed For Telerobots

    NASA Technical Reports Server (NTRS)

    Matijevic, Jacob R.; Zimmerman, Wayne F.; Dolinsky, Shlomo

    1990-01-01

Assembly of electromechanical and electronic equipment (including computers) constitutes test bed for development of advanced robotic systems for remote manipulation. Combines features not found in commercial systems. Its architecture allows easy growth in complexity and level of automation. System is national resource for validation of new telerobotic technology. Intended primarily for robots used in outer space, test bed adapted to development of advanced terrestrial telerobotic systems for handling radioactive materials, dangerous chemicals, and explosives.

  13. The ORSER LANDSAT Data Base of Pennsylvania

    NASA Technical Reports Server (NTRS)

    Turner, B. J.; Williams, D. L.

    1982-01-01

    A mosaicked LANDSAT data base for Pennsylvania, installed at the computation center of the Pennsylvania State University, is described. Initially constructed by Penn State's Office for Remote Sensing of Earth Resources (ORSER) for the purpose of assisting in state-wide mapping of gypsy moth defoliation, the data base will be available to a variety of potential users. It will provide geometrically correct LANDSAT data accessible by political, jurisdictional, or arbitrary boundaries.

  14. AgRISTARS: Renewable resources inventory. Land information support system implementation plan and schedule. [San Juan National Forest pilot test

    NASA Technical Reports Server (NTRS)

    Yao, S. S. (Principal Investigator)

    1981-01-01

    The planning and scheduling of the use of remote sensing and computer technology to support the land management planning effort at the national forests level are outlined. The task planning and system capability development were reviewed. A user evaluation is presented along with technological transfer methodology. A land management planning pilot test of the San Juan National Forest is discussed.

  15. Mars rover local navigation and hazard avoidance

    NASA Technical Reports Server (NTRS)

    Wilcox, B. H.; Gennery, D. B.; Mishkin, A. H.

    1989-01-01

    A Mars rover sample return mission has been proposed for the late 1990's. Due to the long speed-of-light delays between Earth and Mars, some autonomy on the rover is highly desirable. JPL has been conducting research in two possible modes of rover operation, Computer-Aided Remote Driving and Semiautonomous Navigation. A recently-completed research program used a half-scale testbed vehicle to explore several of the concepts in semiautonomous navigation. A new, full-scale vehicle with all computational and power resources on-board will be used in the coming year to demonstrate relatively fast semiautonomous navigation. The computational and power requirements for Mars rover local navigation and hazard avoidance are discussed.

  16. Mars Rover Local Navigation And Hazard Avoidance

    NASA Astrophysics Data System (ADS)

    Wilcox, B. H.; Gennery, D. B.; Mishkin, A. H.

    1989-03-01

    A Mars rover sample return mission has been proposed for the late 1990's. Due to the long speed-of-light delays between Earth and Mars, some autonomy on the rover is highly desirable. JPL has been conducting research in two possible modes of rover operation, Computer-Aided Remote Driving and Semiautonomous Navigation. A recently-completed research program used a half-scale testbed vehicle to explore several of the concepts in semiautonomous navigation. A new, full-scale vehicle with all computational and power resources on-board will be used in the coming year to demonstrate relatively fast semiautonomous navigation. The computational and power requirements for Mars rover local navigation and hazard avoidance are discussed.

  17. Squid - a simple bioinformatics grid.

    PubMed

    Carvalho, Paulo C; Glória, Rafael V; de Miranda, Antonio B; Degrave, Wim M

    2005-08-03

    BLAST is a widely used genetic research tool for analysis of similarity between nucleotide and protein sequences. This paper presents a software application entitled "Squid" that makes use of grid technology. The current version, as an example, is configured for BLAST applications, but adaptation for other computing-intensive repetitive tasks can be easily accomplished in the open source version. This enables the allocation of remote resources to perform distributed computing, making large BLAST queries viable without the need for high-end computers. Most distributed computing/grid solutions have complex installation procedures requiring a computer specialist, or have limitations regarding operating systems. Squid is a multi-platform, open-source program designed to "keep things simple" while offering high-end computing power for large scale applications. Squid also has an efficient fault tolerance and crash recovery system against data loss, being able to re-route jobs upon node failure and recover even if the master machine fails. Our results show that a Squid application, working with N nodes and proper network resources, can process BLAST queries almost N times faster than if working with only one computer. Squid offers high-end computing, even for the non-specialist, and is freely available at the project web site. Its open-source and binary Windows distributions contain detailed instructions and a "plug-n-play" installation containing a pre-configured example.
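    The near-linear speedup reported above comes from splitting a large query set into independent batches, one per grid node. A minimal sketch of that partitioning step, using hypothetical names (`split_fasta` is illustrative only and not part of Squid's actual interface):

    ```python
    def split_fasta(fasta_text, n_nodes):
        """Split a multi-sequence FASTA string into n_nodes roughly equal batches,
        which a grid scheduler could then BLAST independently in parallel."""
        records, current = [], []
        for line in fasta_text.strip().splitlines():
            if line.startswith(">") and current:
                records.append("\n".join(current))   # flush previous record
                current = []
            current.append(line)
        if current:
            records.append("\n".join(current))
        # Round-robin assignment keeps batch sizes within one record of each other;
        # a fault-tolerant scheduler would re-route a batch if its node fails.
        batches = [[] for _ in range(n_nodes)]
        for i, rec in enumerate(records):
            batches[i % n_nodes].append(rec)
        return batches
    ```

    For example, five query sequences distributed over two nodes yield batches of three and two records.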

  18. Orchestrating Distributed Resource Ensembles for Petascale Science

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Baldin, Ilya; Mandal, Anirban; Ruth, Paul

    2014-04-24

    Distributed, data-intensive computational science applications of interest to DOE scientific communities move large amounts of data for experiment data management, distributed analysis steps, remote visualization, and accessing scientific instruments. These applications need to orchestrate ensembles of resources from multiple resource pools and interconnect them with high-capacity multi-layered networks across multiple domains. It is highly desirable that mechanisms are designed that provide this type of resource provisioning capability to a broad class of applications. It is also important to have coherent monitoring capabilities for such complex distributed environments. In this project, we addressed these problems by designing an abstract API, enabled by novel semantic resource descriptions, for provisioning complex and heterogeneous resources from multiple providers using their native provisioning mechanisms and control planes: computational, storage, and multi-layered high-speed network domains. We used an extensible resource representation based on semantic web technologies to afford maximum flexibility to applications in specifying their needs. We evaluated the effectiveness of provisioning using representative data-intensive applications. We also developed mechanisms for providing feedback about resource performance to the application, to enable closed-loop feedback control and dynamic adjustments to resource allocations (elasticity). This was enabled through development of a novel persistent query framework that consumes disparate sources of monitoring data, including perfSONAR, and provides scalable distribution of asynchronous notifications.

  19. Remote Sensing and the Earth

    NASA Technical Reports Server (NTRS)

    Brosius, C. A.; Gervin, J. C.; Ragusa, J. M.

    1977-01-01

    A textbook on remote sensing, prepared as part of the Skylab earth resources program, is presented. The fundamentals of remote sensing and its application to agriculture, land use, geology, water and marine resources, and environmental monitoring are summarized.

  20. EIAGRID: In-field optimization of seismic data acquisition by real-time subsurface imaging using a remote GRID computing environment.

    NASA Astrophysics Data System (ADS)

    Heilmann, B. Z.; Vallenilla Ferrara, A. M.

    2009-04-01

    The constant growth of contaminated sites, the unsustainable use of natural resources, and, last but not least, the hydrological risk related to extreme meteorological events and increased climate variability are major environmental issues of today. Finding solutions to these complex problems requires an integrated, cross-disciplinary approach providing a unified basis for environmental science and engineering. In computer science, grid computing is emerging worldwide as a formidable tool allowing distributed computation and data management with administratively-distant resources. Utilizing these modern High Performance Computing (HPC) technologies, the GRIDA3 project bundles several applications from different fields of geoscience, aiming to support decision making for reasonable and responsible land use and resource management. In this abstract we present a geophysical application called EIAGRID that uses grid computing facilities to perform real-time subsurface imaging by on-the-fly processing of seismic field data and fast optimization of the processing workflow. Although seismic reflection profiling has a broad application range, spanning from shallow targets a few meters deep to targets at depths of several kilometers, it is used primarily by the hydrocarbon industry and rarely for environmental purposes. The complexity of data acquisition and processing poses severe problems for environmental and geotechnical engineering: professional seismic processing software is expensive to buy and demands considerable experience from the user, and the in-field processing equipment needed for real-time data Quality Control (QC) and immediate optimization of the acquisition parameters is often not available for studies of this kind. As a result, the data quality will be suboptimal. In the worst case, a crucial parameter such as receiver spacing, maximum offset, or recording time turns out later to be inappropriate and the complete acquisition campaign has to be repeated. 
The EIAGRID portal provides an innovative solution to this problem, combining state-of-the-art data processing methods and modern remote grid computing technology. In-field processing equipment is replaced by remote access to high-performance grid computing facilities, which can be controlled ubiquitously through a user-friendly web-browser interface accessed from the field by any mobile computer using wireless data transmission technology such as UMTS (Universal Mobile Telecommunications System) or HSUPA/HSDPA (High-Speed Uplink/Downlink Packet Access). The complexity of data manipulation and processing, and thus also the time-demanding user interaction, is minimized by a data-driven and highly automated velocity analysis and imaging approach based on the Common-Reflection-Surface (CRS) stack. Furthermore, the huge computing power provided by the grid deployment allows parallel testing of alternative processing sequences and parameter settings, a feature which considerably reduces turnaround times. A shared data storage facility using georeferencing tools and data grid technology is under development. It will allow users to publish completed projects, making results, processing workflows, and parameter settings available in a transparent and reproducible way. Creating a unified database shared by all users will facilitate complex studies and enable the use of data-crossing techniques to incorporate results of other environmental applications hosted on the GRIDA3 portal.
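The "parallel testing of alternative processing sequences and parameter settings" described above amounts to a parameter sweep scored against a quality measure. A hedged sketch of that idea, with invented names (`sweep` is illustrative only; a grid deployment like EIAGRID would dispatch each setting to a separate node rather than loop over them in one process):

```python
def sweep(process, data, param_grid, score):
    """Try each parameter setting on the same input data and keep the
    best-scoring result. 'process' is any processing step (e.g. a stack
    with a given velocity parameter); 'score' rates its output quality."""
    # On a grid, each of these calls would run on a different node in parallel.
    results = {p: process(data, p) for p in param_grid}
    best = max(results, key=lambda p: score(results[p]))
    return best, results[best]
```

For instance, sweeping a scaling parameter over a toy trace and scoring by total amplitude picks the largest scale factor.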

  1. Designing for Virtual Windows in a Deep Space Habitat

    NASA Technical Reports Server (NTRS)

    Howe, A. Scott; Howard, Robert L.; Moore, Nathan; Amoroso, Michael

    2013-01-01

    This paper discusses configurations and test analogs toward the design of a virtual window capability in a Deep Space Habitat. Long-duration space missions will require crews to remain in the confines of a spacecraft for extended periods of time, with possible harmful effects if a crewmember cannot cope with the small habitable volume. Virtual windows expand perceived volume using a minimal amount of image projection equipment and computing resources, and allow a limited immersion in remote environments. Uses for the virtual window include: live or augmented reality views of the external environment; flight deck, piloting, observation, or other participation in remote missions through live transmission of cameras mounted to remote vehicles; pre-recorded background views of nature areas, seasonal occurrences, or cultural events; and pre-recorded events such as birthdays, anniversaries, and other meaningful events prepared by ground support and families of the crewmembers.

  2. The acquisition, storage, and dissemination of LANDSAT and other LACIE support data

    NASA Technical Reports Server (NTRS)

    Abbotts, L. F.; Nelson, R. M. (Principal Investigator)

    1979-01-01

    Activities performed at the LACIE physical data library are described. These include the researching, acquisition, indexing, maintenance, distribution, tracking, and control of LACIE operational data and documents. Much of the data available can be incorporated into an Earth resources data base. Elements of the data collection that can support future remote sensing programs include: (1) the LANDSAT full-frame image files; (2) the microfilm file of aerial and space photographic and multispectral maps and charts that encompasses a large portion of the Earth's surface; (3) the map/chart collection that includes various scale maps and charts for a good portion of the U.S. and the LACIE area in foreign countries; (4) computer-compatible tapes of good quality LANDSAT scenes; (5) basic remote sensing data, project data, reference material, and associated publications; (6) visual aids to support presentation on remote sensing projects; and (7) research acquisition and handling procedures for managing data.

  3. Multi-scale Computational Electromagnetics for Phenomenology and Saliency Characterization in Remote Sensing

    DTIC Science & Technology

    2016-07-15

    AFRL-AFOSR-JP-TR-2016-0068, Multi-scale Computational Electromagnetics for Phenomenology and Saliency Characterization in Remote Sensing (Hean-Teik…). …the application of multi-scale computational electromagnetics to microwave remote sensing, as well as extension of modelling capability with computational flexibility to study…

  5. Community-driven computational biology with Debian Linux

    PubMed Central

    2010-01-01

    Background The Open Source movement and its technologies are popular in the bioinformatics community because they provide freely available tools and resources for research. In order to feed the steady demand for updates on software and associated data, a service infrastructure is required for sharing and providing these tools to heterogeneous computing environments. Results The Debian Med initiative provides ready and coherent software packages for medical informatics and bioinformatics. These packages can be used together in Taverna workflows via the UseCase plugin to manage execution on local or remote machines. If such packages are available in cloud computing environments, the underlying hardware and the analysis pipelines can be shared along with the software. Conclusions Debian Med closes the gap between developers and users. It provides a simple method for offering new releases of software and data resources, thus provisioning a local infrastructure for computational biology. For geographically distributed teams it can ensure they are working on the same versions of tools, in the same conditions. This contributes to the world-wide networking of researchers. PMID:21210984

  6. Remote Sensing For Water Resources And Hydrology. Recommended research emphasis for the 1980's

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The problems and areas of activity that the Panel believes should be emphasized in work on remote sensing for water resources and hydrology in the 1980's are set forth. The Panel addresses only those activities and problems in water resources and hydrology that it considers important and where, in its opinion, application of current or advanced remote sensing capability can help meet urgent problems and provide large returns in practical benefits.

  7. Develop feedback system for intelligent dynamic resource allocation to improve application performance.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gentile, Ann C.; Brandt, James M.; Tucker, Thomas

    2011-09-01

    This report provides documentation for the completion of the Sandia Level II milestone 'Develop feedback system for intelligent dynamic resource allocation to improve application performance'. This milestone demonstrates the use of a scalable data collection, analysis, and feedback system that enables insight into how an application is utilizing the hardware resources of a high performance computing (HPC) platform in a lightweight fashion. Further, we demonstrate utilizing the same mechanisms used for transporting data for remote analysis and visualization to provide low-latency run-time feedback to applications. The ultimate goal of this body of work is performance optimization in the face of the ever-increasing size and complexity of HPC systems.

  8. Remote photonic metrology in the conservation of cultural heritage

    NASA Astrophysics Data System (ADS)

    Tornari, Vivi; Pedrini, G.; Osten, W.

    2013-05-01

    Photonic technologies play a leading, innovative research role in the conservation, preservation, and digitisation of Cultural Heritage (CH). In particular, photonic technologies have introduced a new and indispensable era of research in the conservation of cultural artefacts, expanding from decorative objects, paintings, sculptures, and monuments to archaeological sites, and including fields of application as diverse as materials characterisation, restoration practices, defect topography, and 3D artwork reconstruction. Over the last two decades, photonic technologies have thus emerged as a unique answer, or the most competitive alternative, in many long-standing disputes in the conservation and restoration of Cultural Heritage. Despite the impressive advances in the state of the art, ranging from custom-made system development to new methods and practices, photonic research and technological developments remain incoherently scattered and fragmented, with a significant amount of duplicated work and misuse of resources. In this context, further progress should aim to capitalise on the milestones achieved so far across the diverse applications that have flourished in the field of CH. Embedding experimental facilities and conclusions seems the only way to secure progress beyond the existing state of the art and to prevent its misuse. Such embedding appears possible through new computing environments. Cloud computing and remote laboratory access hold the missing research objective: to bring the leading research together and integrate the achievements. A cloud environment would allow experts from museums, galleries, and historical sites, art historians, conservators, scientists and technologists, conservation and technical laboratories, and SMEs to connect their research, communicate their achievements, and share data and resources. 
The main instrument of this integration is the creation of a common research platform, termed here a Virtual Laboratory, allowing not only remote research, inspection, and evaluation, but also providing results to members and the public with instant and simultaneous access to the necessary information, knowledge, and technologies. This paper presents the concept and first results confirming the potential of implementing metrology techniques as remote digital laboratory facilities in artwork structural assessment. The method paves the way toward the general objective of introducing remote photonic technologies into the sensitive field of Cultural Heritage.

  9. The Montage architecture for grid-enabled science processing of large, distributed datasets

    NASA Technical Reports Server (NTRS)

    Jacob, Joseph C.; Katz, Daniel S .; Prince, Thomas; Berriman, Bruce G.; Good, John C.; Laity, Anastasia C.; Deelman, Ewa; Singh, Gurmeet; Su, Mei-Hui

    2004-01-01

    Montage is an Earth Science Technology Office (ESTO) Computational Technologies (CT) Round III Grand Challenge investigation to deploy a portable, compute-intensive, custom astronomical image mosaicking service for the National Virtual Observatory (NVO). Although Montage is developing a compute- and data-intensive service for the astronomy community, we are also helping to address a problem that spans both Earth and Space science, namely how to efficiently access and process multi-terabyte, distributed datasets. In both communities, the datasets are massive, and are stored in distributed archives that are, in most cases, remote from the available computational resources. Therefore, state-of-the-art computational grid technologies are a key element of the Montage portal architecture. This paper describes the aspects of the Montage design that are applicable to both the Earth and Space science communities.

  10. [Construction and analysis of a monitoring system with remote real-time multiple physiological parameters based on cloud computing].

    PubMed

    Zhu, Lingyun; Li, Lianjie; Meng, Chunyan

    2014-12-01

    There have been problems in existing multiple-physiological-parameter real-time monitoring systems, such as insufficient server capacity for physiological data storage and analysis so that data consistency cannot be guaranteed, poor real-time performance, and other issues caused by the growing scale of data. We therefore proposed a new solution for multiple physiological parameters, based on cloud computing, with clustered background data storage and processing. Through our studies, batch processing for longitudinal analysis of patients' historical data was introduced. The work covered the resource virtualization of the IaaS layer of the cloud platform, the construction of the real-time computing platform of the PaaS layer, the reception and analysis of data streams at the SaaS layer, and the bottleneck problem of multi-parameter data transmission. The result was real-time transmission, storage, and analysis of large amounts of physiological information. The simulation test results showed that the remote multiple-physiological-parameter monitoring system based on the cloud platform had obvious advantages in processing time and load balancing over the traditional server model. This architecture solves problems that exist in traditional remote medical services, including long turnaround time, poor real-time analysis performance, and lack of extensibility. Technical support was thus provided for a "wearable wireless sensor plus mobile wireless transmission plus cloud computing service" mode moving toward home health monitoring of multiple physiological parameters.

  11. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 8: User's Mission and System Requirements Data (appendix A of Volume 3)

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A computer printout is presented of the mission requirements for the TERSSE missions and their associated user tasks. The data included in the data base represents a broad-based attempt to define the amount, extent, and type of information needed for an earth resources management program in the era of the space shuttle. An effort was made to consider all aspects of remote sensing and resource management; because of its broad scope, it is not intended that the data be used without verification for in-depth studies of particular missions and/or users. The data base represents the quantitative structure necessary to define the TERSSE architecture and requirements, and to provide an overall integrated view of the earth resources technology requirements of the 1980's.

  12. On-Board, Real-Time Preprocessing System for Optical Remote-Sensing Imagery

    PubMed Central

    Qi, Baogui; Zhuang, Yin; Chen, He; Chen, Liang

    2018-01-01

    With the development of remote-sensing technology, optical remote-sensing imagery processing has played an important role in many application fields, such as geological exploration and natural disaster prevention. However, relative radiation correction and geometric correction are key steps in preprocessing because raw image data without preprocessing will cause poor performance during application. Traditionally, remote-sensing data are downlinked to the ground station, preprocessed, and distributed to users. This process generates long delays, which is a major bottleneck in real-time applications for remote-sensing data. Therefore, on-board, real-time image preprocessing is greatly desired. In this paper, a real-time processing architecture for on-board imagery preprocessing is proposed. First, a hierarchical optimization and mapping method is proposed to realize the preprocessing algorithm in a hardware structure, which can effectively reduce the computation burden of on-board processing. Second, a co-processing system using a field-programmable gate array (FPGA) and a digital signal processor (DSP; altogether, FPGA-DSP) based on optimization is designed to realize real-time preprocessing. The experimental results demonstrate the potential application of our system to an on-board processor, for which resources and power consumption are limited. PMID:29693585

  13. On-Board, Real-Time Preprocessing System for Optical Remote-Sensing Imagery.

    PubMed

    Qi, Baogui; Shi, Hao; Zhuang, Yin; Chen, He; Chen, Liang

    2018-04-25

    With the development of remote-sensing technology, optical remote-sensing imagery processing has played an important role in many application fields, such as geological exploration and natural disaster prevention. However, relative radiation correction and geometric correction are key steps in preprocessing because raw image data without preprocessing will cause poor performance during application. Traditionally, remote-sensing data are downlinked to the ground station, preprocessed, and distributed to users. This process generates long delays, which is a major bottleneck in real-time applications for remote-sensing data. Therefore, on-board, real-time image preprocessing is greatly desired. In this paper, a real-time processing architecture for on-board imagery preprocessing is proposed. First, a hierarchical optimization and mapping method is proposed to realize the preprocessing algorithm in a hardware structure, which can effectively reduce the computation burden of on-board processing. Second, a co-processing system using a field-programmable gate array (FPGA) and a digital signal processor (DSP; altogether, FPGA-DSP) based on optimization is designed to realize real-time preprocessing. The experimental results demonstrate the potential application of our system to an on-board processor, for which resources and power consumption are limited.

  14. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems

    PubMed Central

    Wu, Jun; Su, Zhou; Li, Jianhua

    2017-01-01

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, efficient social organization can enable more flexible, secure, and collaborative networking. These advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system based on a social model. Meanwhile, social networking increases the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems. PMID:28758943

  15. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems.

    PubMed

    Wu, Jun; Su, Zhou; Wang, Shen; Li, Jianhua

    2017-07-30

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential of providing low-latency communication from sensing data sources to users. For the objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, efficient social organization can enable more flexible, secure, and collaborative networking. These advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which the services of fog are provisioned based on "friend" relationships. To the best of our knowledge, this is the first attempt at an organized fog computing system based on a social model. Meanwhile, social networking increases the complexity and security risks of fog computing services, creating difficulties for security service recommendation in social fog computing. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems.

  16. Interactive Scripting for Analysis and Visualization of Arbitrarily Large, Disparately Located Climate Data Ensembles Using a Progressive Runtime Server

    NASA Astrophysics Data System (ADS)

    Christensen, C.; Summa, B.; Scorzelli, G.; Lee, J. W.; Venkat, A.; Bremer, P. T.; Pascucci, V.

    2017-12-01

    Massive datasets are becoming more common due to increasingly detailed simulations and higher resolution acquisition devices. Yet accessing and processing these huge data collections for scientific analysis is still a significant challenge. Solutions that rely on extensive data transfers are increasingly untenable and often impossible due to lack of sufficient storage at the client side as well as insufficient bandwidth to conduct such large transfers, which in some cases could entail petabytes of data. Large-scale remote computing resources can be useful, but utilizing such systems typically entails some form of offline batch processing with long delays, data replications, and substantial cost for any mistakes. Both types of workflows can severely limit the flexible exploration and rapid evaluation of new hypotheses that are crucial to the scientific process and thereby impede scientific discovery. In order to facilitate interactivity in both analysis and visualization of these massive data ensembles, we introduce a dynamic runtime system suitable for progressive computation and interactive visualization of arbitrarily large, disparately located spatiotemporal datasets. Our system includes an embedded domain-specific language (EDSL) that allows users to express a wide range of data analysis operations in a simple and abstract manner. The underlying runtime system transparently resolves issues such as remote data access and resampling while at the same time maintaining interactivity through progressive and interruptible processing. Computations involving large amounts of data can be performed remotely in an incremental fashion that dramatically reduces data movement, while the client receives updates progressively, thereby remaining robust to fluctuating network latency or limited bandwidth. This system facilitates interactive, incremental analysis and visualization of massive remote datasets up to petabytes in size. 
Our system is now available for general use in the community through both docker and anaconda.
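The progressive, interruptible processing described above can be sketched in a few lines: the server-side computation yields a refined partial result after every chunk, so a client can display it or abandon the computation at any point. This is a minimal illustration of the idea, not the system's actual runtime; all names are invented.

```python
import numpy as np

def progressive_mean(stream, chunk_size=1000):
    """Yield successively refined estimates of the mean of a large
    dataset, one chunk at a time, so a client can interrupt the
    computation or display partial results at any point."""
    total, count = 0.0, 0
    for start in range(0, len(stream), chunk_size):
        chunk = stream[start:start + chunk_size]
        total += float(chunk.sum())
        count += len(chunk)
        yield total / count  # current best estimate

data = np.linspace(0.0, 1.0, 10_000)
estimates = list(progressive_mean(data))
print(round(estimates[-1], 3))  # converges to the true mean, 0.5
```

Because the generator is lazy, stopping iteration early costs nothing: the remaining chunks are simply never read, which is what keeps a client responsive on a slow link.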

  17. The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce

    NASA Astrophysics Data System (ADS)

    Chen, Xi; Zhou, Liqing

    2015-12-01

    With the development of satellite remote sensing technology, remote sensing image data volumes have grown to the point where traditional segmentation techniques can no longer meet the processing and storage requirements of massive imagery. This article applies cloud computing and parallel computing technology to the remote sensing image segmentation process, building an inexpensive and efficient computer cluster that parallelizes the mean shift segmentation algorithm under the MapReduce model. The approach not only preserves segmentation quality but also improves segmentation speed, better meeting real-time requirements. The MapReduce-based parallel mean shift segmentation algorithm is therefore of both theoretical significance and practical value.
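The tile-parallel scheme can be sketched as follows: the "map" step runs mean shift on the intensity values of one image tile, and the "reduce" step merges nearby modes from all tiles. This is a hedged, single-machine illustration; a real MapReduce job would distribute the mappers across cluster nodes, and the 1-D mean shift and all names here are simplifications.

```python
import numpy as np

def mean_shift_1d(values, bandwidth=2.0, iters=20):
    # Shift every value toward the local density mode (flat kernel).
    pts = values.astype(float).copy()
    for _ in range(iters):
        for i, p in enumerate(pts):
            window = values[np.abs(values - p) <= bandwidth]
            pts[i] = window.mean()
    return pts

def map_tile(tile):
    # "Map": run mean shift independently on one image tile's intensities.
    modes = mean_shift_1d(tile)
    return np.unique(np.round(modes, 1))

def reduce_modes(all_modes, bandwidth=2.0):
    # "Reduce": merge nearby modes from different tiles into final clusters.
    merged = []
    for m in np.sort(np.concatenate(all_modes)):
        if not merged or m - merged[-1] > bandwidth:
            merged.append(float(m))
    return merged

tiles = [np.array([10, 11, 10, 30, 31]), np.array([29, 30, 11, 10, 30])]
print(reduce_modes([map_tile(t) for t in tiles]))  # two clusters: ~10 and ~30
```

The split into an embarrassingly parallel per-tile phase and a cheap merge phase is what makes mean shift a natural fit for MapReduce.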

  18. A streaming-based solution for remote visualization of 3D graphics on mobile devices.

    PubMed

    Lamberti, Fabrizio; Sanna, Andrea

    2007-01-01

    Mobile devices such as Personal Digital Assistants, Tablet PCs, and cellular phones have greatly enhanced user capability to connect to remote resources. Although a large set of applications is now available bridging the gap between desktop and mobile devices, visualization of complex 3D models is still a task that is hard to accomplish without specialized hardware. This paper proposes a system where a cluster of PCs, equipped with accelerated graphics cards managed by the Chromium software, is able to handle remote visualization sessions based on MPEG video streaming involving complex 3D models. The proposed framework allows mobile devices such as smart phones, Personal Digital Assistants (PDAs), and Tablet PCs to visualize objects consisting of millions of textured polygons and voxels at a frame rate of 30 fps or more, depending on hardware resources at the server side and on multimedia capabilities at the client side. The server is able to concurrently manage multiple clients, computing a video stream for each one; the resolution and quality of each stream are tailored according to the screen resolution and bandwidth of the client. The paper investigates in depth issues related to latency, bit rate and quality of the generated stream, screen resolutions, and frames per second displayed.
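The per-client tailoring the server performs can be illustrated with a toy policy function; the resolution ladder and the 80% bandwidth rule here are assumptions for the sketch, not parameters from the paper.

```python
def tailor_stream(screen_w, screen_h, bandwidth_kbps):
    """Pick encoding parameters for one client: never exceed the client's
    screen size, and scale bitrate to roughly 80% of available bandwidth."""
    # Hypothetical server-side resolution ladder, largest first.
    ladder = [(1024, 768), (640, 480), (320, 240), (176, 144)]
    for w, h in ladder:
        if w <= screen_w and h <= screen_h:
            return {"width": w, "height": h,
                    "bitrate_kbps": int(bandwidth_kbps * 0.8)}
    # Fall back to the smallest stream for very constrained clients.
    return {"width": 176, "height": 144,
            "bitrate_kbps": int(bandwidth_kbps * 0.8)}

print(tailor_stream(320, 240, 500))  # PDA-class client gets a 320x240 stream
```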

  19. Study on Karst Information Identification of Qiandongnan Prefecture Based on RS and GIS Technology

    NASA Astrophysics Data System (ADS)

    Yao, M.; Zhou, G.; Wang, W.; Wu, Z.; Huang, Y.; Huang, X.

    2018-04-01

    Karst areas are natural resource bases; at the same time, their special geological environment gives rise to alternating droughts and floods, frequent karst collapse, rocky desertification, and other resource and environmental problems that seriously restrict sustainable economic and social development. This paper therefore identifies and studies karst terrain and clarifies its distribution, providing basic data for the rational development of resources in the karst region and the control of rocky desertification. Because of the uniqueness of the karst landscape, karst cannot be directly recognized and extracted by computer from remote sensing images alone. This paper therefore adopts an "RS + DEM" approach: based on Landsat-5 TM imagery from 2010 and DEM data, it identifies karst information using slope maps, vegetation distribution maps, karst rocky desertification distribution maps, and other auxiliary data, combined with interpretation signs, for human-computer interactive interpretation and extraction of peak forests, peak clusters, and isolated peaks, with further extraction of karst depressions. Experiments show that this method, through a reasonable combination of remote sensing images and DEM data, not only effectively extracts karst areas covered with vegetation but also quickly and accurately delimits the karst area, greatly improving the efficiency and precision of visual interpretation. The interpretation accuracy for karst information in the study area is 86.73 %.
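The slope layer used as auxiliary data in the "RS + DEM" approach is straightforward to derive from the DEM; a minimal sketch, assuming a hypothetical 30 m cell size:

```python
import numpy as np

def slope_degrees(dem, cell_size=30.0):
    """Per-cell slope (degrees) from a DEM via central differences --
    the kind of slope layer used alongside imagery to flag steep
    peak-cluster terrain."""
    dzdy, dzdx = np.gradient(dem.astype(float), cell_size)
    return np.degrees(np.arctan(np.hypot(dzdx, dzdy)))

# Tilted plane rising 10 m per 30 m cell in x: slope = atan(10/30) ≈ 18.4°
dem = np.tile(np.arange(5) * 10.0, (5, 1))
print(round(float(slope_degrees(dem)[2, 2]), 1))  # 18.4
```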

  20. Satellite Remote Sensing for Coastal Management: A Review of Successful Applications

    NASA Astrophysics Data System (ADS)

    McCarthy, Matthew J.; Colna, Kaitlyn E.; El-Mezayen, Mahmoud M.; Laureano-Rosario, Abdiel E.; Méndez-Lázaro, Pablo; Otis, Daniel B.; Toro-Farmer, Gerardo; Vega-Rodriguez, Maria; Muller-Karger, Frank E.

    2017-08-01

    Management of coastal and marine natural resources presents a number of challenges as a growing global population and a changing climate require us to find better strategies to conserve the resources on which our health, economy, and overall well-being depend. To evaluate the status and trends in changing coastal resources over larger areas, managers in government agencies and private stakeholders around the world have increasingly turned to remote sensing technologies. A surge in collaborative and innovative efforts between resource managers, academic researchers, and industry partners is becoming increasingly vital to keep pace with evolving changes of our natural resources. Synoptic capabilities of remote sensing techniques allow assessments that are impossible to do with traditional methods. Sixty years of remote sensing research have paved the way for resource management applications, but uncertainties regarding the use of this technology have hampered its use in management fields. Here we review examples of remote sensing applications in the sectors of coral reefs, wetlands, water quality, public health, and fisheries and aquaculture that have successfully contributed to management and decision-making goals.

  1. Satellite Remote Sensing for Coastal Management: A Review of Successful Applications.

    PubMed

    McCarthy, Matthew J; Colna, Kaitlyn E; El-Mezayen, Mahmoud M; Laureano-Rosario, Abdiel E; Méndez-Lázaro, Pablo; Otis, Daniel B; Toro-Farmer, Gerardo; Vega-Rodriguez, Maria; Muller-Karger, Frank E

    2017-08-01

    Management of coastal and marine natural resources presents a number of challenges as a growing global population and a changing climate require us to find better strategies to conserve the resources on which our health, economy, and overall well-being depend. To evaluate the status and trends in changing coastal resources over larger areas, managers in government agencies and private stakeholders around the world have increasingly turned to remote sensing technologies. A surge in collaborative and innovative efforts between resource managers, academic researchers, and industry partners is becoming increasingly vital to keep pace with evolving changes of our natural resources. Synoptic capabilities of remote sensing techniques allow assessments that are impossible to do with traditional methods. Sixty years of remote sensing research have paved the way for resource management applications, but uncertainties regarding the use of this technology have hampered its use in management fields. Here we review examples of remote sensing applications in the sectors of coral reefs, wetlands, water quality, public health, and fisheries and aquaculture that have successfully contributed to management and decision-making goals.

  2. Advances in U.S. Land Imaging Capabilities

    NASA Astrophysics Data System (ADS)

    Stryker, T. S.

    2017-12-01

    Advancements in Earth observations, cloud computing, and data science are improving everyday life. Information from land-imaging satellites, such as the U.S. Landsat system, helps us to better understand the changing landscapes where we live, work, and play. This understanding builds capacity for improved decision-making about our lands, waters, and resources, driving economic growth, protecting lives and property, and safeguarding the environment. The USGS is fostering the use of land remote sensing technology to meet local, national, and global challenges. A key dimension to meeting these challenges is the full, free, and open provision of land remote sensing observations for both public and private sector applications. To achieve maximum impact, these data must also be easily discoverable, accessible, and usable. The presenter will describe the USGS Land Remote Sensing Program's current capabilities and future plans to collect and deliver land remote sensing information for societal benefit. He will discuss these capabilities in the context of national plans and policies, domestic partnerships, and international collaboration. The presenter will conclude with examples of how Landsat data is being used on a daily basis to improve lives and livelihoods.

  3. Indicators of international remote sensing activities

    NASA Technical Reports Server (NTRS)

    Spann, G. W.

    1977-01-01

    The extent of worldwide remote sensing activities, including the use of satellite and high/medium altitude aircraft data was studied. Data were obtained from numerous individuals and organizations with international remote sensing responsibilities. Indicators were selected to evaluate the nature and scope of remote sensing activities in each country. These indicators ranged from attendance at remote sensing workshops and training courses to the establishment of earth resources satellite ground stations and plans for the launch of earth resources satellites. Results indicate that this technology constitutes a rapidly increasing component of environmental, land use, and natural resources investigations in many countries, and most of these countries rely on the LANDSAT satellites for a major portion of their data.

  4. The EROS Data Center

    USGS Publications Warehouse

    ,

    1977-01-01

    The Earth Resources Observation Systems (EROS) Program of the U.S. Department of the Interior, administered by the Geological Survey, was established in 1966 to apply remote-sensing techniques to the inventory, monitoring, and management of natural resources. To meet its primary objective, the EROS Program includes research and training in the interpretation and application of remotely sensed data and provides remotely sensed data at nominal cost to scientists, resource planners, managers, and the public.

  5. The EROS Data Center

    USGS Publications Warehouse

    ,

    1981-01-01

    The Earth Resources Observation Systems (EROS) Program of the U.S. Department of the Interior, administered by the Geological Survey, was established in 1966 to apply remote-sensing techniques to the inventory, monitoring, and management of natural resources. To meet its primary objective, the EROS Program includes research and training in the interpretation and application of remotely sensed data and provides remotely sensed data at nominal cost to scientists, resource planners, managers, and the public.

  6. Master Software Requirements Specification

    NASA Technical Reports Server (NTRS)

    Hu, Chaumin

    2003-01-01

    A basic function of a computational grid such as the NASA Information Power Grid (IPG) is to allow users to execute applications on remote computer systems. The Globus Resource Allocation Manager (GRAM) provides this functionality in the IPG and many other grids at this time. While the functionality provided by GRAM clients is adequate, GRAM does not support useful features such as staging several sets of files, running more than one executable in a single job submission, and maintaining historical information about execution operations. This specification is intended to provide the environmental and software functional requirements for the IPG Job Manager V2.0 being developed by AMTI for NASA.

  7. A radiative transfer model for remote sensing of laser induced fluorescence of phytoplankton in non-homogeneous turbid water

    NASA Technical Reports Server (NTRS)

    Venable, D. D.

    1980-01-01

    A radiative transfer computer model was developed to characterize the total flux of chlorophyll a fluoresced or backscattered photons when laser radiation is incident on turbid water that contains a non-homogeneous suspension of inorganic sediments and phytoplankton. The radiative transfer model is based on the Monte Carlo technique and assumes that: (1) the aquatic medium can be represented by a stratified concentration profile; and (2) that appropriate optical parameters can be defined for each layer. The model was designed to minimize the required computer resources and run time. Results are presented for an anacystis marinus culture.
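The layered Monte Carlo scheme can be sketched as follows. This is a deliberately 1-D toy (photons move only straight up or down, and scattering picks one of those two directions at random); the layer profile and all parameters are illustrative, not the paper's.

```python
import math, random

def mc_backscatter(layers, n_photons=20000, seed=1):
    """Monte Carlo estimate of the fraction of photons leaving a stratified
    water column back through the surface.  layers: list of
    (thickness_m, attenuation_per_m, single_scatter_albedo), surface first."""
    rng = random.Random(seed)

    def layer_at(z):
        # Find the optical parameters of the layer containing depth z.
        bottom = 0.0
        for thickness, atten, albedo in layers:
            bottom += thickness
            if z <= bottom:
                return atten, albedo
        return layers[-1][1], layers[-1][2]

    depth_max = sum(t for t, _, _ in layers)
    returned = 0
    for _ in range(n_photons):
        z, direction = 0.0, +1          # +1 down, -1 up
        while True:
            atten, albedo = layer_at(z)
            z += direction * (-math.log(rng.random()) / atten)  # free path
            if z < 0.0:
                returned += 1           # escaped upward: the remote signal
                break
            if z > depth_max:
                break                   # lost out of the bottom
            if rng.random() >= albedo:
                break                   # absorbed
            direction = rng.choice((+1, -1))   # scatter up or down
    return returned / n_photons

# A clearer upper layer over a turbid, more absorbing lower layer.
profile = [(5.0, 0.2, 0.8), (20.0, 1.0, 0.4)]
print(0.0 < mc_backscatter(profile) < 1.0)  # True
```

A real model would also track the fluorescence wavelength shift and full 3-D scattering angles, but the stratified-profile bookkeeping above is the core of the technique.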

  8. Distributed chemical computing using ChemStar: an open source java remote method invocation architecture applied to large scale molecular data from PubChem.

    PubMed

    Karthikeyan, M; Krishnan, S; Pandey, Anil Kumar; Bender, Andreas; Tropsha, Alexander

    2008-04-01

    We present the application of a Java remote method invocation (RMI) based open source architecture to distributed chemical computing. This architecture was previously employed for distributed data harvesting of chemical information from the Internet via the Google application programming interface (API; ChemXtreme). Due to its open source character and its flexibility, the underlying server/client framework can be quickly adapted to virtually every computational task that can be parallelized. Here, we present the server/client communication framework as well as an application to distributed computing of chemical properties on a large scale (currently the size of PubChem; about 18 million compounds), using both the Marvin toolkit as well as the open source JOELib package. As an application, for this set of compounds, the agreement between the packages on log P and TPSA was compared. Outliers were found to be mostly non-druglike compounds, and differences could usually be explained by differences in the underlying algorithms. ChemStar is the first open source distributed chemical computing environment built on Java RMI, which is also easily adaptable to user demands due to its "plug-in architecture". The complete source codes as well as calculated properties along with links to PubChem resources are available on the Internet via a graphical user interface at http://moltable.ncl.res.in/chemstar/.
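The package-agreement analysis reduces to a simple disagreement filter; a sketch with invented log P values standing in for the Marvin and JOELib outputs:

```python
def flag_outliers(logp_a, logp_b, threshold=1.0):
    """Given log P values for the same compounds from two independent
    packages, return the indices where the packages disagree by more than
    `threshold` log units -- the compounds worth inspecting by hand."""
    return [i for i, (a, b) in enumerate(zip(logp_a, logp_b))
            if abs(a - b) > threshold]

package_a = [1.2, 3.4, -0.5, 7.9, 2.1]   # hypothetical Marvin-like values
package_b = [1.1, 3.6, -0.4, 5.2, 2.0]   # hypothetical JOELib-like values
print(flag_outliers(package_a, package_b))  # [3]
```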

  9. Integrated instrumentation & computation environment for GRACE

    NASA Astrophysics Data System (ADS)

    Dhekne, P. S.

    2002-03-01

    The project GRACE (Gamma Ray Astrophysics with Coordinated Experiments) aims at setting up a state-of-the-art gamma ray observatory at Mt. Abu, Rajasthan for undertaking comprehensive scientific exploration over a wide spectral window (10's of keV to 100's of TeV) from a single location through four coordinated experiments. The cumulative data collection rate of all the telescopes is expected to be about 1 GB/hr, necessitating innovations in the data management environment. Real-time data acquisition and control, as well as off-line data processing, analysis, and visualization for these systems, are based on the use of cutting-edge and affordable technologies in the fields of computers, communications, and the Internet. We propose to provide a single, unified environment by seamless integration of instrumentation and computation, taking advantage of recent advancements in Web-based technologies. This new environment will allow researchers better access to facilities, improve resource utilization, and enhance collaboration by providing identical environments for online as well as offline use of this facility from any location. We present here a proposed implementation strategy for a platform-independent, Web-based system that supplements automated functions with video-guided interactive and collaborative remote viewing, remote control through a virtual instrumentation console, remote acquisition of telescope data, data analysis, data visualization, and an active imaging system. This end-to-end Web-based solution will enhance collaboration among researchers at the national and international level for undertaking scientific studies using the telescope systems of the GRACE project.

  10. Generic Divide and Conquer Internet-Based Computing

    NASA Technical Reports Server (NTRS)

    Follen, Gregory J. (Technical Monitor); Radenski, Atanas

    2003-01-01

    The growth of Internet-based applications and the proliferation of networking technologies have been transforming traditional commercial application areas as well as computer and computational sciences and engineering. This growth stimulates the exploration of Peer to Peer (P2P) software technologies that can open new research and application opportunities not only for the commercial world, but also for the scientific and high-performance computing applications community. The general goal of this project is to achieve better understanding of the transition to Internet-based high-performance computing and to develop solutions for some of the technical challenges of this transition. In particular, we are interested in creating long-term motivation for end users to provide their idle processor time to support computationally intensive tasks. We believe that a practical P2P architecture should provide useful service to both clients with high-performance computing needs and contributors of lower-end computing resources. To achieve this, we are designing a dual-service architecture for P2P high-performance divide-and-conquer computing; we are also experimenting with a prototype implementation. Our proposed architecture incorporates a master server, utilizes dual satellite servers, and operates on the Internet in a dynamically changing large configuration of lower-end nodes provided by volunteer contributors. A dual satellite server comprises a high-performance computing engine and a lower-end contributor service engine. The computing engine provides generic support for divide-and-conquer computations. The service engine is intended to provide free, useful HTTP-based services to contributors of lower-end computing resources. Our proposed architecture is complementary to and accessible from computational grids, such as Globus, Legion, and Condor. Grids provide remote access to existing higher-end computing resources; in contrast, our goal is to utilize idle processor time of lower-end Internet nodes. Our project is focused on a generic divide-and-conquer paradigm and on mobile applications of this paradigm that can operate on a loose and ever-changing pool of lower-end Internet nodes.
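The generic divide-and-conquer support the computing engine provides can be sketched with a thread pool standing in for the ever-changing pool of volunteer nodes; the function names are invented for illustration.

```python
from concurrent.futures import ThreadPoolExecutor

def divide_and_conquer(task, split, solve, combine, pool, threshold=2):
    """Generic divide-and-conquer: split a task until it is small enough,
    solve the leaves on (stand-ins for) contributed nodes, then combine."""
    if len(task) <= threshold:
        return pool.submit(solve, task).result()   # ship a leaf to a node
    left, right = split(task)
    return combine(divide_and_conquer(left, split, solve, combine, pool),
                   divide_and_conquer(right, split, solve, combine, pool))

with ThreadPoolExecutor(max_workers=4) as pool:    # "volunteer" workers
    total = divide_and_conquer(
        list(range(1, 101)),
        split=lambda t: (t[:len(t) // 2], t[len(t) // 2:]),
        solve=sum,
        combine=lambda a, b: a + b,
        pool=pool)
print(total)  # 5050
```

Passing `split`, `solve`, and `combine` as parameters is what makes the engine generic: any divide-and-conquer computation plugs in the same way.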

  11. NASA high performance computing and communications program

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Smith, Paul; Hunter, Paul

    1993-01-01

    The National Aeronautics and Space Administration's HPCC program is part of a new Presidential initiative aimed at producing a 1000-fold increase in supercomputing speed and a 100-fold improvement in available communications capability by 1997. As more advanced technologies are developed under the HPCC program, they will be used to solve NASA's 'Grand Challenge' problems, which include improving the design and simulation of advanced aerospace vehicles, allowing people at remote locations to communicate more effectively and share information, increasing scientists' ability to model the Earth's climate and forecast global environmental trends, and improving the development of advanced spacecraft. NASA's HPCC program is organized into three projects which are unique to the agency's mission: the Computational Aerosciences (CAS) project, the Earth and Space Sciences (ESS) project, and the Remote Exploration and Experimentation (REE) project. An additional project, the Basic Research and Human Resources (BRHR) project, exists to promote long-term research in computer science and engineering and to increase the pool of trained personnel in a variety of scientific disciplines. This document presents an overview of the objectives and organization of these projects as well as summaries of individual research and development programs within each project.

  12. Application of Multi-Source Remote Sensing Image in Yunnan Province Grassland Resources Investigation

    NASA Astrophysics Data System (ADS)

    Li, J.; Wen, G.; Li, D.

    2018-04-01

    To master background information on the utilization and ecological condition of Yunnan Province's grassland resources and to improve refined grassland management capacity, the Yunnan provincial agriculture department carried out a grassland resource investigation in 2017. The traditional grassland resource investigation method is ground-based survey, which is time-consuming and inefficient, and especially unsuitable for large-scale or hard-to-reach areas. Remote sensing, in contrast, is low-cost, wide-ranging, and efficient, and can objectively reflect the present state of grassland resources; it has become an indispensable grassland monitoring technology and data source and has gained increasing recognition and application in grassland resources monitoring research. This paper studies the application of multi-source remote sensing imagery in the Yunnan Province grassland resources investigation. First, it extracts grassland thematic information and conducts field investigation through segmentation of high-spatial-resolution BJ-2 imagery. Second, it classifies grassland types and evaluates grassland degradation using Landsat 8 imagery. Third, it derives a grass yield model and quality classification from MODIS imagery, with its wide swath and frequent coverage, together with sample survey data. Finally, it performs qualitative field analysis of grassland using UAV remote sensing imagery. Implementation over the project area shows that multi-source remote sensing data can be applied to the grassland resources investigation in Yunnan Province and is an indispensable method.

  13. Skylab experiments. Volume 2: Remote sensing of earth resources

    NASA Technical Reports Server (NTRS)

    1973-01-01

    This volume covers the broad area of earth resources in which Skylab experiments will be performed. A brief description of the Skylab program, its objectives, and vehicles is included. Section 1 introduces the concept and historical significance of remote sensing, and discusses the major scientific considerations involved in remotely sensing the earth's resources. Sections 2 through 6 provide a description of the individual earth resource sensors and experiments to be performed. Each description includes a discussion of the experiment background and scientific objectives, the equipment involved, and a discussion of significant experiment performance areas.

  14. Data handling and analysis for the 1971 corn blight watch experiment.

    NASA Technical Reports Server (NTRS)

    Anuta, P. E.; Phillips, T. L.; Landgrebe, D. A.

    1972-01-01

    Review of the data handling and analysis methods used in the near-operational test of remote sensing systems provided by the 1971 corn blight watch experiment. The general data analysis techniques and, particularly, the statistical multispectral pattern recognition methods for automatic computer analysis of aircraft scanner data are described. Some of the results obtained are examined, and the implications of the experiment for future data communication requirements of earth resource survey systems are discussed.

  15. A Web-Based Development Environment for Collaborative Data Analysis

    NASA Astrophysics Data System (ADS)

    Erdmann, M.; Fischer, R.; Glaser, C.; Klingebiel, D.; Komm, M.; Müller, G.; Rieger, M.; Steggemann, J.; Urban, M.; Winchen, T.

    2014-06-01

    Visual Physics Analysis (VISPA) is a web-based development environment addressing high energy and astroparticle physics. It covers the entire analysis spectrum from the design and validation phase to the execution of analyses and the visualization of results. VISPA provides a graphical steering of the analysis flow, which consists of self-written, re-usable Python and C++ modules for more demanding tasks. All common operating systems are supported since a standard internet browser is the only software requirement for users. Even access via mobile and touch-compatible devices is possible. In this contribution, we present the most recent developments of our web application concerning technical, state-of-the-art approaches as well as practical experiences. One of the key features is the use of workspaces, i.e. user-configurable connections to remote machines supplying resources and local file access. Thereby, workspaces enable the management of data, computing resources (e.g. remote clusters or computing grids), and additional software either centralized or individually. We further report on the results of an application with more than 100 third-year students using VISPA for their regular particle physics exercises during the winter term 2012/13. Besides the ambition to support and simplify the development cycle of physics analyses, new use cases such as fast, location-independent status queries, the validation of results, and the ability to share analyses within worldwide collaborations with a single click become conceivable.

  16. Technology for organization of the onboard system for processing and storage of ERS data for ultrasmall spacecraft

    NASA Astrophysics Data System (ADS)

    Strotov, Valery V.; Taganov, Alexander I.; Konkin, Yuriy V.; Kolesenkov, Aleksandr N.

    2017-10-01

    Processing and analysis of Earth remote sensing data on board an ultra-small spacecraft is a topical task, given the significant energy expenditure of data transfer and the low performance of onboard computers. There is thus a need for effective and reliable storage of the overall information flow obtained from onboard data collection systems, including Earth remote sensing data, in a specialized database. The paper considers the peculiarities of database management system operation with a multilevel memory structure. For data storage, a format has been developed that describes the physical structure of the database and contains the parameters required for loading information. This structure reduces the memory occupied by the database, because key values need not be stored separately. The paper presents the architecture of a relational database management system designed for embedding into onboard ultra-small spacecraft software. A database for storing various information, including Earth remote sensing data, can be built with this database management system for subsequent processing. The suggested architecture places low demands on the processing power and memory resources available on board an ultra-small spacecraft, and data integrity is ensured during input and modification of the structured information.

  17. Quo vadis, remote sensing. [use of satellite data for resource management

    NASA Technical Reports Server (NTRS)

    Billingsley, F. C.

    1977-01-01

    The use of satellite remote sensing data for resource management is discussed. The evaluation of the need for management data is reviewed, and some legislative programs which require the monitoring of environmental resources are summarized. Several characteristics of data used in the monitoring of dynamic processes are analyzed, and the implications of routine processing of extensive remote sensing data for the development of a new world view are considered.

  18. Remote sensing in Michigan for land resource management

    NASA Technical Reports Server (NTRS)

    Lowe, D. S.; Istvan, L. B.; Roller, N. E.; Sattinger, I. J.; Sellman, A. N.; Wagner, T. W.

    1974-01-01

    The application of NASA earth resource survey technology to resource management and environmental protection in Michigan was investigated. Remote sensing techniques to aid Michigan government agencies were applied in the following activities: (1) land use inventory and management, (2) great lakes shorelands protection and management, (3) wetlands protection and management, and (4) soil survey. In addition, information was disseminated on remote sensing technology, and advice and assistance was provided to a number of users.

  19. Global, Persistent, Real-time Multi-sensor Automated Satellite Image Analysis and Crop Forecasting in Commercial Cloud

    NASA Astrophysics Data System (ADS)

    Brumby, S. P.; Warren, M. S.; Keisler, R.; Chartrand, R.; Skillman, S.; Franco, E.; Kontgis, C.; Moody, D.; Kelton, T.; Mathis, M.

    2016-12-01

    Cloud computing, combined with recent advances in machine learning for computer vision, is enabling understanding of the world at a scale and at a level of space and time granularity never before feasible. Multi-decadal Earth remote sensing datasets at the petabyte scale (8×10^15 bits) are now available in commercial cloud, and new satellite constellations will generate daily global coverage at a few meters per pixel. Public and commercial satellite observations now provide a wide range of sensor modalities, from traditional visible/infrared to dual-polarity synthetic aperture radar (SAR). This provides the opportunity to build a continuously updated map of the world supporting the academic community and decision-makers in government, finance, and industry. We report on work demonstrating country-scale agricultural forecasting and global-scale land cover/land use mapping using a range of public and commercial satellite imagery. We describe processing over a petabyte of compressed raw data from 2.8 quadrillion pixels (2.8 petapixels) acquired by the US Landsat and MODIS programs over the past 40 years. Using commodity cloud computing resources, we convert the imagery to a calibrated, georeferenced, multiresolution tiled format suited for machine-learning analysis. We believe ours is the first application to process, in less than a day, on generally available resources, over a petabyte of scientific image data. We report on work combining this imagery with time-series SAR collected by ESA Sentinel 1. We report on work using this reprocessed dataset for experiments demonstrating country-scale food production monitoring, an indicator for famine early warning. We apply remote sensing science and machine learning algorithms to detect and classify agricultural crops and then estimate crop yields and detect threats to food security (e.g., flooding, drought).
The software platform and analysis methodology also support monitoring water resources, forests and other general indicators of environmental health, and can detect growth and changes in cities that are displacing historical agricultural zones.
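The multiresolution tiled format rests on repeated downsampling; a minimal sketch of building a pyramid by 2x2 block averaging (the tiling and georeferencing steps are omitted):

```python
import numpy as np

def build_pyramid(image, levels=3):
    """Build a power-of-two multiresolution pyramid by 2x2 block averaging,
    the basic operation behind tiled formats suited to ML analysis."""
    pyramid = [image.astype(float)]
    for _ in range(levels - 1):
        a = pyramid[-1]
        h, w = a.shape[0] // 2 * 2, a.shape[1] // 2 * 2   # trim odd edges
        a = a[:h, :w].reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        pyramid.append(a)
    return pyramid

levels = build_pyramid(np.arange(64, dtype=float).reshape(8, 8))
print([lvl.shape for lvl in levels])  # [(8, 8), (4, 4), (2, 2)]
```

Because each level is precomputed once, an analysis or visualization client can fetch only the resolution it needs, which is what makes petabyte archives browsable.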

  20. Role of remote sensing in documenting living resources

    NASA Technical Reports Server (NTRS)

    Wagner, P. E.; Anderson, R. R.; Brun, B.; Eisenberg, M.; Genys, J. B.; Lear, D. W., Jr.; Miller, M. H.

    1978-01-01

    Specific cases of known or potentially useful applications of remote sensing in assessing biological resources are discussed. It is concluded that the more usable remote sensing techniques relate to the measurement of population fluctuations in aquatic systems. Sensing of the flora and the fauna of the Bay is considered with emphasis on direct sensing of aquatic plant populations and of water quality. Recommendations for remote sensing projects are given.

  1. UNIX-based operating systems robustness evaluation

    NASA Technical Reports Server (NTRS)

    Chang, Yu-Ming

    1996-01-01

    Robust operating systems are required for reliable computing. Techniques for robustness evaluation of operating systems not only enhance the understanding of the reliability of computer systems, but also provide valuable feedback to system designers. This thesis presents results from robustness evaluation experiments on five UNIX-based operating systems: Digital Equipment's OSF/1, Hewlett Packard's HP-UX, Sun Microsystems' Solaris and SunOS, and Silicon Graphics' IRIX. Three sets of experiments were performed. The methodology for evaluation tested (1) the exception handling mechanism, (2) system resource management, and (3) system capacity under high workload stress. An exception generator was used to evaluate the exception handling mechanism of the operating systems. Results included the exit status of the exception generator and the system state. Resource management techniques used by individual operating systems were tested using programs designed to usurp system resources such as physical memory and process slots. Finally, the workload stress testing evaluated the effect of the workload on system performance by running a synthetic workload and recording the response time of local and remote user requests. Moderate to severe performance degradations were observed on the systems under stress.
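The exception-generator methodology boils down to running fault-injection probes in isolated processes and recording their exit status. A hedged sketch, with the probes injected into a child Python interpreter rather than a real OS interface; the probe set is invented for illustration:

```python
import subprocess, sys

def probe(snippet):
    """Run one fault-injection snippet in a child interpreter and record
    how it terminates -- the exit-status bookkeeping an exception
    generator for robustness testing relies on."""
    proc = subprocess.run([sys.executable, "-c", snippet],
                          capture_output=True, timeout=10)
    return proc.returncode

# Tiny probes: operations the runtime must trap, plus a clean exit.
probes = {
    "divide_by_zero": "1 / 0",
    "bad_file":       "open('/no/such/path')",
    "clean_exit":     "import sys; sys.exit(0)",
}
results = {name: probe(code) for name, code in probes.items()}
print(results)  # nonzero status for the faults, 0 for the clean exit
```

Running each probe in its own process is essential: a fault that corrupts or kills the probe cannot take the harness (or the bookkeeping) down with it.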

  2. Remote sensing applications to Missouri environmental resources information system

    NASA Technical Reports Server (NTRS)

    Myers, R. E.

    1977-01-01

    An efficient system for retrieval of remotely sensed data to be used by natural resources oriented agencies, and a natural resources data system that can meet the needs of state agencies, were studied. To accomplish these objectives, natural resources data sources were identified, and a study was prepared of systems already in operation that address the more efficient utilization of natural resources oriented data.

  3. Remote sensing of wildland resources: A state-of-the-art review

    Treesearch

    Robert C. Aldrich

    1979-01-01

    A review, with literature citations, of current remote sensing technology, applications, and costs for wildland resource management, including collection, interpretation, and processing of data gathered through photographic and nonphotographic techniques for classification and mapping, interpretive information for specific applications, measurement of resource...

  4. Natural Resource Information System. Remote Sensing Studies.

    ERIC Educational Resources Information Center

    Leachtenauer, J.; And Others

    A major design objective of the Natural Resource Information System entailed the use of remote sensing data as an input to the system. Potential applications of remote sensing data were therefore reviewed and available imagery interpreted to provide input to a demonstration data base. A literature review was conducted to determine the types and…

  5. On the feasibility of benefit-cost analysis applied to remote sensing projects. [California water resources

    NASA Technical Reports Server (NTRS)

    Merewitz, L.

    1973-01-01

    The following step-wise procedure for making a benefit-cost analysis of using remote sensing techniques could be used either in the limited context of California water resources, or a context as broad as the making of integrated resource surveys of the entire earth resource complex on a statewide, regional, national, or global basis. (1) Survey all data collection efforts which can be accomplished by remote sensing techniques. (2) Carefully inspect the State of California budget and the Budget of the United States Government to find annual cost of data collection efforts. (3) Decide the extent to which remote sensing can obviate each of the collection efforts. (4) Sum the annual costs of all data collection which can be equivalently accomplished through remote sensing. (5) Decide what additional data could and would be collected through remote sensing. (6) Estimate the value of this information. It is not harmful to do a benefit-cost analysis so long as its severe limitations are recalled and it is supplemented with socio-economic impact studies.

  6. An integrated study of earth resources in the state of California using remote sensing techniques. [planning and management of water resources

    NASA Technical Reports Server (NTRS)

    Colwell, R. N.; Churchman, C. W.; Burgy, R. H.; Schubert, G.; Estes, J. E.; Bowden, L. W.; Algazi, R.; Coulson, K. L. (Principal Investigator)

    1973-01-01

    The University of California has been conducting an investigation which seeks to determine the usefulness of modern remote sensing techniques for studying various components of California's earth resources complex. Most of the work has concentrated on California's water resources, but with some attention being given to other earth resources as well and to the interplay between them and California's water resources.

  7. Intrusion Prevention and Detection in Grid Computing - The ALICE Case

    NASA Astrophysics Data System (ADS)

    Gomez, Andres; Lara, Camilo; Kebschull, Udo

    2015-12-01

    Grids allow users flexible on-demand usage of computing resources through remote communication networks. A remarkable example of a Grid in High Energy Physics (HEP) research is the one used in the ALICE experiment at CERN, the European Organization for Nuclear Research. Physicists can submit jobs that process the huge amount of particle collision data produced by the Large Hadron Collider (LHC). Grids face complex security challenges: they are attractive targets for attackers seeking large computational resources. Since users can execute arbitrary code on the worker nodes of the Grid sites, special care must be taken in this environment, and automatic tools to harden and monitor it are required. Currently, no integrated solution meets this requirement. This paper describes a new security framework that allows execution of job payloads in a sandboxed context. It also supports process behavior monitoring to detect intrusions, even when new attack methods or zero-day vulnerabilities are exploited, through a machine learning approach. We plan to implement the proposed framework as a software prototype that will be tested as a component of the ALICE Grid middleware.

  8. Remote sensing strategies for global resource exploration and environmental management

    NASA Astrophysics Data System (ADS)

    Henderson, Frederick B.

    Since 1972, satellite remote sensing, when integrated with other exploration techniques, has demonstrated operational exploration and engineering cost savings and reduced exploration risks through improved geological mapping. Land and ocean remote sensing satellite systems under development for the 1990's by the United States, France, Japan, Canada, ESA, Russia, China, and others, will significantly increase our ability to explore for, develop, and manage energy and mineral resources worldwide. A major difference between these systems is the "Open Skies" and "Non-Discriminatory Access to Data" policies as have been practiced by the U.S. and France and the restrictive nationalistic data policies as have been practiced by Russia and India. Global exploration will use satellite remote sensing to better map regional structural and basin-like features that control the distribution of energy and mineral resources. Improved sensors will better map lithologic and stratigraphic units and identify alteration effects in rocks, soils, and vegetation cover indicative of undiscovered subsurface resources. These same sensors will also map and monitor resource development. The use of satellite remote sensing data will grow substantially through increasing integration with other geophysical, geochemical, and geologic data using improved geographic information systems (GIS). International exploration will focus on underdeveloped countries rather than on mature exploration areas such as the United States, Europe, and Japan. Energy and mineral companies and government agencies in these countries and others will utilize available remote sensing data to acquire economic intelligence on global resources. If the "Non-Discriminatory Access to Data" principle is observed by satellite producing countries, exploration will remain competitive "on the ground". 
In this manner, remote sensing technology will continue to be developed to better explore for and manage the world's needed resources. If, however, satellite producing countries follow the Russian and Indian lead and restrict civil satellite data as tools of their national security and economic policies, remote sensing technology may become internationally competitive in space, redundant, prohibitively expensive, and generally unavailable to the world community.

  9. An evaluation of a UAV guidance system with consumer grade GPS receivers

    NASA Astrophysics Data System (ADS)

    Rosenberg, Abigail Stella

    Remote sensing has been demonstrated to be an important tool in agricultural and natural resource management and research applications; however, limitations exist with traditional platforms (i.e., hand-held sensors, linear moves, vehicle mounts, airplanes, remotely piloted vehicles (RPVs), unmanned aerial vehicles (UAVs), and satellites). Rapid technological advances in electronics, computers, software applications, and the aerospace industry have dramatically reduced the cost and increased the availability of remote sensing technologies. Remote sensing imagery varies in spectral, spatial, and temporal resolution and is available from numerous providers. Appendix A presented results of a test project that acquired high-resolution aerial photography with a RPV to map the boundary of a 0.42 km2 fire area. The project mapped the boundaries of the fire area from a mosaic of the aerial images collected and compared this with ground-based measurements. The project achieved a 92.4% correlation between the aerial assessment and the ground truth data. Appendix B used multi-objective analysis to quantitatively assess the tradeoffs between different sensor platform attributes to identify the best overall technology. Experts were surveyed to identify the best overall technology at three different pixel sizes. Appendix C evaluated the positional accuracy of a relatively low cost UAV designed for high resolution remote sensing of small areas in order to determine the positional accuracy of sensor readings. The study evaluated the accuracy and uncertainty of a UAV flight route with respect to the programmed waypoints and of the UAV's GPS position, respectively. In addition, the potential displacement of sensor data was evaluated based on (1) GPS measurements on board the aircraft and (2) the autopilot's circuit board with 3-axis gyros and accelerometers (i.e., roll, pitch, and yaw). The accuracies were estimated based on a 95% confidence interval or similar methods. 
The accuracy achieved in the second and third manuscripts demonstrates that reasonably priced, high resolution remote sensing via RPVs and UAVs is practical for agriculture and natural resource professionals.

  10. State of the Art of Network Security Perspectives in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Oh, Tae Hwan; Lim, Shinyoung; Choi, Young B.; Park, Kwang-Roh; Lee, Heejo; Choi, Hyunsang

    Cloud computing is now regarded as a social phenomenon that satisfies customers' needs. Arguably, the realization of cloud computing reflects both those needs and the primary principle of economy: gaining maximum benefit from minimum investment. We live in a connected society with a flood of information; without computers connected to the Internet, our daily activities and work would be impossible. Cloud computing can provide customers with custom-tailored application software and user environments based on their needs, by adopting on-demand outsourcing of computing resources through the Internet. It also provides cloud computing users with high-end computing power and expensive application software packages, and accordingly users access their data and application software at the remote system where they are hosted. As the cloud computing system is connected to the Internet, its network security issues must be considered before real-world service. In this paper, a survey of, and issues in, network security in cloud computing is presented from the perspective of real-world service environments.

  11. Optimization of PV/WIND/DIESEL Hybrid Power System in HOMER for Rural Electrification

    NASA Astrophysics Data System (ADS)

    Hassan, Q.; Jaszczur, M.; Abdulateef, J.

    2016-09-01

    A large proportion of the world's population lives in remote rural areas that are geographically isolated and sparsely populated. The present study is based on modeling, computer simulation and optimization of a hybrid power generation system in a rural area in the Muqdadiyah district of Diyala state, Iraq. Two renewable resources, namely solar photovoltaic (PV) and wind turbine (WT), are considered. The HOMER software is used to study and design the proposed hybrid energy system model. Based on simulation results, it has been found that renewable energy sources could replace conventional energy sources and would be a feasible solution for the generation of electric power at remote locations with a reasonable investment. The hybrid power system solution to electrify the selected area resulted in a least-cost combination that can meet the demand in a dependable manner at a cost of about 0.321/kWh. Because the wind resource in the study area is low, generating electricity with a wind turbine there is not economically viable.
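    The least-cost figure quoted above is a levelized cost of energy of the kind HOMER reports: total annualized system cost divided by annual energy served. A minimal sketch of that calculation follows; the function names and the example numbers are illustrative, not values from the study.

    ```python
    def crf(i, n):
        """Capital recovery factor: converts a capital cost into an
        equivalent uniform annual payment at interest rate i over n years."""
        return i * (1 + i) ** n / ((1 + i) ** n - 1)

    def levelized_cost(capital, om_per_year, i, n, kwh_per_year):
        """Cost of energy = (annualized capital + annual O&M) / energy served."""
        return (capital * crf(i, n) + om_per_year) / kwh_per_year

    # Illustrative example: a 50,000-unit capital cost, 2,000/year O&M,
    # 6% discount rate, 25-year lifetime, 20,000 kWh served per year.
    coe = levelized_cost(50_000, 2_000, 0.06, 25, 20_000)
    print(round(coe, 3))
    ```

    In an optimization run, this metric is evaluated for every candidate PV/WT/diesel/battery combination, and the configuration with the lowest cost that still meets the load is selected.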

  12. A feasibility study of using remotely sensed data for water resource models

    NASA Technical Reports Server (NTRS)

    Ruff, J. F.

    1973-01-01

    Remotely sensed data were collected to demonstrate the feasibility of applying the results to water resource problems. Photographs of the Wolf Creek watershed in southwestern Colorado were collected over a one year period. Cloud top temperatures were measured using a radiometer. Thermal imagery of the Wolf Creek Pass area was obtained during one pre-dawn flight. Remote sensing studies of water resource problems for user agencies were also conducted. The results indicated that: (1) remote sensing techniques could be used to assist in the solution of water resource problems; (2) photogrammetric determination of snow depths is feasible; (3) changes in turbidity or suspended material concentration can be observed; (4) surface turbulence can be related to bed scour; and (5) thermal effluents into rivers can be monitored.

  13. Implementation of Multispectral Image Classification on a Remote Adaptive Computer

    NASA Technical Reports Server (NTRS)

    Figueiredo, Marco A.; Gloster, Clay S.; Stephens, Mark; Graves, Corey A.; Nakkar, Mouna

    1999-01-01

    As the demand for higher performance computers for the processing of remote sensing science algorithms increases, the need to investigate new computing paradigms is justified. Field Programmable Gate Arrays (FPGAs) enable the implementation of algorithms at the hardware gate level, leading to orders-of-magnitude performance increases over microprocessor-based systems. The automatic classification of spaceborne multispectral images is an example of a computation-intensive application that can benefit from implementation on an FPGA-based custom computing machine (adaptive or reconfigurable computer). A probabilistic neural network is used here to classify pixels of a multispectral LANDSAT-2 image. The implementation described utilizes Java client/server application programs to access the adaptive computer from a remote site. Results verify that a remote hardware version of the algorithm (implemented on an adaptive computer) is significantly faster than a local software version of the same algorithm implemented on a typical general-purpose computer.
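    The probabilistic neural network named above is, at its core, a Parzen-window classifier: each class's activation is a sum of Gaussian kernels centered on its training exemplars, and a pixel is assigned to the class with the largest activation. The sketch below shows that decision rule in software form, under the assumption of a shared smoothing parameter sigma; it is not the paper's FPGA implementation.

    ```python
    import numpy as np

    def pnn_classify(x, train, labels, sigma=1.0):
        """Probabilistic neural network decision rule: for each class, sum
        Gaussian kernels centered on that class's exemplars; return the
        class with the largest summed activation for pixel vector x."""
        scores = {}
        for c in np.unique(labels):
            d = train[labels == c] - x          # offsets to class exemplars
            scores[c] = np.exp(-np.sum(d * d, axis=1)
                               / (2.0 * sigma ** 2)).sum()
        return max(scores, key=scores.get)

    # Illustrative two-class, two-band example.
    train = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
    labels = np.array([0, 0, 1, 1])
    print(pnn_classify(np.array([0.2, 0.3]), train, labels))
    ```

    The per-class kernel sums are independent of one another, which is what makes the PNN attractive for gate-level parallelism on an FPGA.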

  14. Airborne remote sensing for geology and the environment; present and future

    USGS Publications Warehouse

    Watson, Ken; Knepper, Daniel H.

    1994-01-01

    In 1988, a group of leading experts from government, academia, and industry attended a workshop on airborne remote sensing sponsored by the U.S. Geological Survey (USGS) and hosted by the Branch of Geophysics. The purpose of the workshop was to examine the scientific rationale for airborne remote sensing in support of government earth science in the next decade. This report has arranged the six resulting working-group reports under two main headings: (1) Geologic Remote Sensing, for the reports on geologic mapping, mineral resources, and fossil fuels and geothermal resources; and (2) Environmental Remote Sensing, for the reports on environmental geology, geologic hazards, and water resources. The intent of the workshop was to provide an evaluation of demonstrated capabilities, their direct extensions, and possible future applications, and this was the organizational format used for the geologic remote sensing reports. The working groups in environmental remote sensing chose to present their reports in a somewhat modified version of this format. A final section examines future advances and limitations in the field. There is a large, complex, and often bewildering array of remote sensing data available. Early remote sensing studies were based on data collected from airborne platforms. Much of that technology was later extended to satellites. The original 80-m-resolution Landsat Multispectral Scanner System (MSS) has now been largely superseded by the 30-m-resolution Thematic Mapper (TM) system that has additional spectral channels. The French satellite SPOT provides higher spatial resolution for channels equivalent to MSS. Low-resolution (1 km) data are available from the National Oceanographic and Atmospheric Administration's AVHRR system, which acquires reflectance and day and night thermal data daily. 
Several experimental satellites have acquired limited data, and there are extensive plans for future satellites including those of Japan (JERS), Europe (ESA), Canada (Radarsat), and the United States (EOS). There are currently two national airborne remote sensing programs (photography, radar) with data archived at the USGS' EROS Data Center. Airborne broadband multispectral data (comparable to Landsat MSS and TM but involving several more channels) for limited geographic areas also are available for digital processing and analysis. Narrow-band imaging spectrometer data are available for some NASA experiment sites and can be acquired for other locations commercially. Remote sensing data and derivative images, because of the uniform spatial coverage, availability at different resolutions, and digital format, are becoming important data sets for geographic information system (GIS) analyses. Examples range from overlaying digitized geologic maps on remote sensing images and draping these over topography, to maps of mineral distribution and inferred abundance. A large variety of remote sensing data sets are available, with costs ranging from a few dollars per square mile for satellite digital data to a few hundred dollars per square mile for airborne imaging spectrometry. Computer processing and analysis costs routinely surpass these expenses because of the equipment and expertise necessary for information extraction and interpretation. Effective use requires both an understanding of the current methodology and an appreciation of the most cost-effective solution.

  15. Applications of remote sensing in resource management in Nebraska

    NASA Technical Reports Server (NTRS)

    Drew, J. V.

    1975-01-01

    A computer-generated graphic display of land use data was developed. The level II inventory data for Sarpy County, Nebraska, was placed on magnetic tape. This data could then be displayed in a map format for comparative analysis of amount and distribution of the various categories of land use. The presentation scale can be varied and thus utilized as a direct guide for cartographic purposes during preparation for publication. In addition, the inventory and classification system was further refined.

  16. The Snowmelt-Runoff Model (SRM) user's manual

    NASA Technical Reports Server (NTRS)

    Martinec, J.; Rango, A.; Major, E.

    1983-01-01

    A manual to provide a means by which a user may apply the snowmelt runoff model (SRM) unaided is presented. Model structure, conditions of application, and data requirements, including remote sensing, are described. Guidance is given for determining various model variables and parameters. Possible sources of error are discussed, and conversion of SRM from the simulation mode to the operational forecasting mode is explained. A computer program for running SRM is presented and is easily adaptable to most systems used by water resources agencies.
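    SRM's daily recursion combines a snowmelt term and a rainfall term, converts the resulting depth over the basin area to a discharge, and routes it with a recession coefficient. A minimal one-step sketch follows, using the standard SRM variable names; parameter values in the example are illustrative, not from the manual.

    ```python
    def srm_day(q_prev, cs, cr, a, T, dT, S, P, area_km2, k):
        """One daily step of the Snowmelt-Runoff Model.
        cs, cr: runoff coefficients for snow and rain; a: degree-day factor
        (cm/degC/day); T+dT: adjusted degree-days; S: snow-covered fraction;
        P: precipitation (cm); k: recession coefficient.
        The factor 10000/86400 converts cm*km^2 per day to m^3/s."""
        inflow = (cs * a * (T + dT) * S + cr * P) * area_km2 * 10000.0 / 86400.0
        return inflow * (1.0 - k) + q_prev * k

    # Illustrative step: 100 km^2 basin, 60% snow cover, 2 degree-days.
    q = srm_day(q_prev=5.0, cs=0.8, cr=0.8, a=0.5, T=2.0, dT=0.0,
                S=0.6, P=0.0, area_km2=100.0, k=0.9)
    print(round(q, 3))
    ```

    Iterating this step over a season, with snow cover S updated from remote sensing, is what converts the model from simulation to operational forecasting.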

  17. Evaluation of alternative model-data fusion approaches in water balance estimation across Australia

    NASA Astrophysics Data System (ADS)

    van Dijk, A. I. J. M.; Renzullo, L. J.

    2009-04-01

    Australia's national agencies are developing a continental modelling system to provide a range of water information services. It will include rolling water balance estimation to underpin national water accounts, water resources assessments that interpret current water resources availability and trends in a historical context, and water resources predictions coupled to climate and weather forecasting. The nation-wide coverage, currency, accuracy, and consistency required mean that remote sensing will need to play an important role along with in-situ observations. Different approaches to blending models and observations can be considered. Integration of on-ground and remote sensing data into land surface models in atmospheric applications often involves state updating through model-data assimilation techniques. By comparison, retrospective water balance estimation and hydrological scenario modelling to date have mostly relied on static parameter fitting against observations and have made little use of earth observation. The model-data fusion approach most appropriate for a continental water balance estimation system will need to consider the trade-off between computational overhead and the accuracy gains achieved when using more sophisticated synthesis techniques and additional observations. This trade-off was investigated using a landscape hydrological model and satellite-based estimates of soil moisture and vegetation properties for several gauged test catchments in southeast Australia.
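    The state updating that the abstract contrasts with static parameter fitting can be illustrated in its simplest scalar form: a Kalman-style analysis step that blends a model forecast with an observation, weighted by their error variances. The function below is a generic sketch of that idea, not the authors' assimilation scheme.

    ```python
    def assimilate(model_state, model_var, obs, obs_var):
        """Scalar analysis step: blend the model forecast and the observation
        in proportion to their error variances (the Kalman gain), and return
        the updated state and its reduced variance."""
        gain = model_var / (model_var + obs_var)
        updated = model_state + gain * (obs - model_state)
        return updated, (1.0 - gain) * model_var

    # Illustrative update: modelled soil moisture 0.20, satellite estimate 0.30.
    state, var = assimilate(0.20, 0.01, 0.30, 0.01)
    print(state, var)
    ```

    Each such update costs little per state variable; the computational overhead the abstract refers to comes from propagating and updating error covariances across a continental model grid.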

  18. Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories

    ERIC Educational Resources Information Center

    Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher

    2009-01-01

    Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…

  19. An integrated study of earth resources in the state of California using remote sensing techniques. [water and forest management

    NASA Technical Reports Server (NTRS)

    Colwell, R. N.

    1974-01-01

    Progress and results of an integrated study of California's water resources are discussed. The investigation concerns itself primarily with the usefulness of remote sensing in relation to two categories of problems: (1) water supply; and (2) water demand. Also considered are its applicability to forest management and timber inventory. The cost effectiveness and utility of remote sensors such as the Earth Resources Technology Satellite for water and timber management are presented.

  20. Remote sensing based crop type mapping and evapotranspiration estimates at the farm level in arid regions of the globe

    NASA Astrophysics Data System (ADS)

    Ozdogan, M.; Serrat-Capdevila, A.; Anderson, M. C.

    2017-12-01

    Despite increasing scarcity of freshwater resources, there is a dearth of spatially explicit information on irrigation water consumption through evapotranspiration, particularly in semi-arid and arid geographies. Remote sensing, either alone or in combination with ground surveys, is increasingly being used for irrigation water management by quantifying evaporative losses at the farm level. Increased availability of observations, sophisticated algorithms, and access to cloud-based computing is also helping this effort. This presentation will focus on crop-specific evapotranspiration estimates at the farm level derived from remote sensing in a number of water-scarce regions of the world. The work is part of a larger effort to quantify irrigation water use and improve use efficiencies associated with several World Bank projects. Examples will be drawn from India, where groundwater based irrigation withdrawals are monitored with the help of crop type mapping and evapotranspiration estimates from remote sensing. Another example will be provided from a northern irrigation district in Mexico, where remote sensing is used for detailed water accounting at the farm level. These locations exemplify the success stories in irrigation water management with the help of remote sensing, with the hope that spatially disaggregated information on evapotranspiration can be used as input for various water management decisions as well as for better water allocation strategies in many other water-scarce regions.

  1. Communication network for decentralized remote tele-science during the Spacelab mission IML-2

    NASA Technical Reports Server (NTRS)

    Christ, Uwe; Schulz, Klaus-Juergen; Incollingo, Marco

    1994-01-01

    The ESA communication network for decentralized remote telescience during the Spacelab mission IML-2, called Interconnection Ground Subnetwork (IGS), provided data, voice conferencing, video distribution/conferencing and high rate data services to 5 remote user centers in Europe. The combination of services allowed the experimenters to interact with their experiments as they would normally do from the Payload Operations Control Center (POCC) at MSFC. In addition, to enhance their science results, they were able to make use of reference facilities and computing resources in their home laboratory, which typically are not available in the POCC. Characteristics of the IML-2 communications implementation were the adaptation to the different user needs based on modular service capabilities of IGS and the cost optimization for the connectivity. This was achieved by using a combination of traditional leased lines, satellite based VSAT connectivity and N-ISDN according to the simulation and mission schedule for each remote site. The central management system of IGS allows minimization of staffing and the involvement of communications personnel at the remote sites. The successful operation of IGS for IML-2 as a precursor network for the Columbus Orbital Facility (COF) has proven the concept for communications to support the operation of the COF decentralized scenario.

  2. Active Remote Sensing of Natural Resources: Course Notes. Science Series No. 5. Final Technical Report.

    ERIC Educational Resources Information Center

    Maxwell, Eugene L.

    Presented is a portion of a research project which developed materials for teaching remote sensing of natural resources on an interdisciplinary basis at the graduate level. This volume contains notes developed for a course in active remote sensing. It is concerned with those methods or systems which generate the electromagnetic energy…

  3. Research in remote sensing of agriculture, earth resources, and man's environment

    NASA Technical Reports Server (NTRS)

    Landgrebe, D. A.

    1975-01-01

    Progress is reported for several projects involving the utilization of LANDSAT remote sensing capabilities. Areas under study include crop inventory, crop identification, crop yield prediction, forest resources evaluation, land resources evaluation and soil classification. Numerical methods for image processing are discussed, particularly those for image enhancement and analysis.

  4. A national-scale authentication infrastructure.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Butler, R.; Engert, D.; Foster, I.

    2000-12-01

    Today, individuals and institutions in science and industry are increasingly forming virtual organizations to pool resources and tackle a common goal. Participants in virtual organizations commonly need to share resources such as data archives, computer cycles, and networks - resources usually available only with restrictions based on the requested resource's nature and the user's identity. Thus, any sharing mechanism must have the ability to authenticate the user's identity and determine if the user is authorized to request the resource. Virtual organizations tend to be fluid, however, so authentication mechanisms must be flexible and lightweight, allowing administrators to quickly establish and change resource-sharing arrangements. However, because virtual organizations complement rather than replace existing institutions, sharing mechanisms cannot change local policies and must allow individual institutions to maintain control over their own resources. Our group has created and deployed an authentication and authorization infrastructure that meets these requirements: the Grid Security Infrastructure. GSI offers secure single sign-ons and preserves site control over access policies and local security. It provides its own versions of common applications, such as FTP and remote login, and a programming interface for creating secure applications.

  5. Using Technology to Facilitate Collaboration in Community-Based Participatory Research (CBPR)

    PubMed Central

    Jessell, Lauren; Smith, Vivian; Jemal, Alexis; Windsor, Liliane

    2017-01-01

    This study explores the use of Computer-Supported Collaborative Work (CSCW) technologies, by way of a computer-based system called iCohere. This system was used to facilitate collaboration conducting Community-Based Participatory Research (CBPR). Data was gathered from 13 members of a Community Collaborative Board (CCB). Analysis revealed that iCohere served the following functions: facilitating communication, providing a depository for information and resource sharing, and allowing for remote meeting attendance. Results indicated that while iCohere was useful in performing these functions, less expensive technologies had the potential to achieve similar goals if properly implemented. Implications for future research on CSCW systems and CBPR are discussed. PMID:29056871

  6. A Simple XML Producer-Consumer Protocol

    NASA Technical Reports Server (NTRS)

    Smith, Warren; Gunter, Dan; Quesnel, Darcy; Biegel, Bryan (Technical Monitor)

    2001-01-01

    There are many different projects from government, academia, and industry that provide services for delivering events in distributed environments. The problem with these event services is that they are not general enough to support all uses and they speak different protocols so that they cannot interoperate. We require such interoperability when we, for example, wish to analyze the performance of an application in a distributed environment. Such an analysis might require performance information from the application, computer systems, networks, and scientific instruments. In this work we propose and evaluate a standard XML-based protocol for the transmission of events in distributed systems. One recent trend in government and academic research is the development and deployment of computational grids. Computational grids are large-scale distributed systems that typically consist of high-performance compute, storage, and networking resources. Examples of such computational grids are the DOE Science Grid, the NASA Information Power Grid (IPG), and the NSF Partnerships for Advanced Computing Infrastructure (PACIs). The major effort to deploy these grids is in the area of developing the software services to allow users to execute applications on these large and diverse sets of resources. These services include security, execution of remote applications, managing remote data, access to information about resources and services, and so on. There are several toolkits for providing these services such as Globus, Legion, and Condor. As part of these efforts to develop computational grids, the Global Grid Forum is working to standardize the protocols and APIs used by various grid services. This standardization will allow interoperability between the client and server software of the toolkits that are providing the grid services. 
The goal of the Performance Working Group of the Grid Forum is to standardize protocols and representations related to the storage and distribution of performance data. These standard protocols and representations must support tasks such as profiling parallel applications, monitoring the status of computers and networks, and monitoring the performance of services provided by a computational grid. This paper describes a proposed protocol and data representation for the exchange of events in a distributed system. The protocol exchanges messages formatted in XML, and it can be layered atop any low-level communication protocol such as TCP or UDP. Further, we describe Java and C++ implementations of this protocol and discuss their performance. The next section will provide some further background information. Section 3 describes the main communication patterns of our protocol. Section 4 describes how we represent events and related information using XML. Section 5 describes our protocol and Section 6 discusses the performance of two implementations of the protocol. Finally, an appendix provides the XML Schema definition of our protocol and event information.
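    The core idea of an XML-encoded event message can be sketched in a few lines: serialize a named, timestamped event to XML on the producer side and parse it back on the consumer side. The element and attribute names below are illustrative only; they are not the schema proposed in the paper.

    ```python
    import xml.etree.ElementTree as ET

    def encode_event(name, value, timestamp):
        """Producer side: serialize one performance event to an XML string.
        Tag and attribute names are hypothetical, not the paper's schema."""
        ev = ET.Element("event", name=name, timestamp=timestamp)
        ev.text = str(value)
        return ET.tostring(ev, encoding="unicode")

    def decode_event(xml_text):
        """Consumer side: parse the XML string back into (name, value, time)."""
        ev = ET.fromstring(xml_text)
        return ev.get("name"), ev.text, ev.get("timestamp")

    msg = encode_event("cpu_load", 0.75, "2001-06-01T12:00:00Z")
    print(decode_event(msg))
    ```

    Because both ends agree only on the XML representation, the message can be carried over any transport, which is precisely the layering over TCP or UDP the abstract describes.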

  7. Remote sensing: An inventory of earth's resources

    NASA Technical Reports Server (NTRS)

    Gramenopoulos, N.

    1974-01-01

    The remote sensing capabilities of Landsat are reviewed along with the broad areas of application of the Landsat imagery. The importance of Landsat imagery in urban planning and resources management is stressed.

  8. Application of Machine Learning in Urban Greenery Land Cover Extraction

    NASA Astrophysics Data System (ADS)

    Qiao, X.; Li, L. L.; Li, D.; Gan, Y. L.; Hou, A. Y.

    2018-04-01

Urban greenery is a critical part of the modern city, and greenery coverage information is essential for land resource management, environmental monitoring, and urban planning. Extracting urban greenery information from remote sensing imagery is challenging because trees and grassland are mixed with urban built-up areas. In this paper, we propose a new automatic pixel-based greenery extraction method using multispectral remote sensing images. The method includes three main steps. First, a small part of the imagery is manually interpreted to provide prior knowledge. Second, a five-layer neural network is trained and optimised on the manual extraction results, which are divided into training, verification, and testing samples. Lastly, the trained neural network is applied to the unlabelled data to perform the greenery extraction. GF-2 and GJ-1 high-resolution multispectral remote sensing images were used to extract greenery coverage information in the built-up areas of city X. The method shows favourable performance over the 619 square kilometer study area. Compared with the traditional NDVI method, it also gives a more accurate delineation of the greenery regions. Owing to its low computational load and high accuracy, it has great potential for automatic greenery extraction over large areas, saving considerable manpower and resources.
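The NDVI baseline the authors compare against can be sketched in a few lines; the band values and the 0.3 threshold below are illustrative, not taken from the paper:

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

def greenery_mask(nir, red, threshold=0.3):
    """Classify pixels as greenery where NDVI exceeds a tunable threshold."""
    return ndvi(nir, red) > threshold

# Toy 2x2 bands: vegetated pixels reflect strongly in NIR relative to Red.
nir = np.array([[0.8, 0.2], [0.7, 0.1]])
red = np.array([[0.1, 0.3], [0.2, 0.3]])
mask = greenery_mask(nir, red)
```

A fixed global threshold like this is exactly where NDVI struggles in mixed built-up scenes, which is the gap the learned per-pixel classifier in the paper is meant to close.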

  9. An integrated compact airborne multispectral imaging system using embedded computer

    NASA Astrophysics Data System (ADS)

    Zhang, Yuedong; Wang, Li; Zhang, Xuguo

    2015-08-01

An integrated compact airborne multispectral imaging system with an embedded-computer-based control system was developed for multispectral imaging from small aircraft. The system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system), and an embedded computer. The embedded computer offers excellent universality and expansibility, and has volume and weight advantages for an airborne platform, so it can meet the requirements of the integrated system's control functions. It controls camera parameter settings, the operation of the filter wheel and stabilized platform, and image and POS data acquisition, and it stores the images and data. Peripheral devices can be connected through the embedded computer's ports, making system operation and management of the stored image data easy. The system has the advantages of small volume, multiple functions, and good expansibility. Imaging experiment results show that it has potential for multispectral remote sensing applications such as resource investigation and environmental monitoring.

  10. Privacy-preserving public auditing for data integrity in cloud

    NASA Astrophysics Data System (ADS)

    Shaik Saleem, M.; Murali, M.

    2018-04-01

Cloud computing, which has drawn extensive attention from both the research community and industry, provides a large pool of computing resources, such as storage, processing power, applications, and services, through virtualized sharing; cloud users are provided with resources on demand. Because a user's outsourced files are stored in the databases of third-party service providers, they can easily be tampered with: users have no control over their data and hence no inherent guarantee of its integrity, so providing security assurance for user data has become a primary concern for cloud service providers. Cloud servers themselves offer no such assurance and accept no responsibility for data loss. Remote data integrity checking (RDIC) allows a data owner to verify that a storage server is truthfully storing the owner's data. The scheme considered here comprises a security model and an ID-based RDIC protocol that protects the privacy of the cloud user's data against the third-party verifier. In principle, clients can check the integrity of their cloud data themselves by running a two-party RDIC protocol, but in the two-party setting the verification result, coming from either the data holder or the cloud server, may be considered biased. The public verifiability feature of RDIC instead gives every user the ability to verify whether the original data has been modified. To ensure the transparency of publicly verifiable RDIC protocols, a third-party auditor (TPA) with the necessary knowledge and efficiency carries out the verification on the users' behalf.
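The challenge-response idea behind RDIC can be illustrated with a greatly simplified sketch using keyed hashes over randomly sampled blocks; real schemes, including the ID-based protocol discussed here, use homomorphic authenticators so the server can answer without shipping the blocks themselves. All names and sizes below are illustrative:

```python
import hashlib
import hmac
import os
import secrets

BLOCK = 256  # bytes per block (illustrative)

def blocks(data):
    """Split data into fixed-size blocks."""
    return [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]

def precompute_tags(data, key):
    """Owner keeps a keyed tag per block before outsourcing the data."""
    return [hmac.new(key, b, hashlib.sha256).digest() for b in blocks(data)]

def server_respond(stored_data, challenge):
    """Server returns the challenged blocks from its stored copy."""
    stored = blocks(stored_data)
    return [stored[i] for i in challenge]

def verify(tags, key, challenge, response):
    """Verifier checks the returned blocks against the precomputed tags."""
    return all(
        hmac.compare_digest(hmac.new(key, blk, hashlib.sha256).digest(), tags[i])
        for i, blk in zip(challenge, response)
    )

key = secrets.token_bytes(32)
data = os.urandom(4 * BLOCK)          # the owner's file: 4 blocks
tags = precompute_tags(data, key)
challenge = [1, 3]                    # randomly sampled block indices

ok = verify(tags, key, challenge, server_respond(data, challenge))
tampered = data[:BLOCK] + os.urandom(BLOCK) + data[2 * BLOCK:]  # corrupt block 1
bad = verify(tags, key, challenge, server_respond(tampered, challenge))
```

Because the verifier holds the key, this toy version is a two-party check; making it publicly verifiable without leaking the data is precisely what the paper's ID-based construction addresses.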

  11. Automatic Between-Pulse Analysis of DIII-D Experimental Data Performed Remotely on a Supercomputer at Argonne Leadership Computing Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kostuk, M.; Uram, T. D.; Evans, T.

For the first time, an automatically triggered, between-pulse fusion science analysis code was run on demand at a remotely located supercomputer at the Argonne Leadership Computing Facility (ALCF, Lemont, IL) in support of in-process experiments being performed at DIII-D (San Diego, CA). This represents a new paradigm for combining geographically distant experimental and high performance computing (HPC) facilities to provide enhanced data analysis that is quickly available to researchers. Enhanced analysis improves the understanding of the current pulse, translating into a more efficient use of experimental resources and higher-quality resultant science. The analysis code used here, called SURFMN, calculates the magnetic structure of the plasma using a Fourier transform. Increasing the number of Fourier components provides a more accurate determination of the stochastic boundary layer near the plasma edge by better resolving magnetic islands, but requires 26 minutes to complete using local DIII-D resources, putting it well outside the useful time range for between-pulse analysis. These islands relate to confinement and edge localized mode (ELM) suppression, and may be controlled by adjusting coil currents for the next pulse. Argonne has ensured on-demand execution of SURFMN by providing a reserved queue, a specialized service that launches the code after receiving an automatic trigger, and network access from the worker nodes for data transfer. Runs are executed on 252 cores of ALCF's Cooley cluster, and the data is available locally at DIII-D within three minutes of triggering. The original SURFMN design limits additional improvements from using more cores; however, our work shows a path forward where codes that benefit from thousands of processors can run between pulses.

  12. Automatic Between-Pulse Analysis of DIII-D Experimental Data Performed Remotely on a Supercomputer at Argonne Leadership Computing Facility

    DOE PAGES

    Kostuk, M.; Uram, T. D.; Evans, T.; ...

    2018-02-01

For the first time, an automatically triggered, between-pulse fusion science analysis code was run on demand at a remotely located supercomputer at the Argonne Leadership Computing Facility (ALCF, Lemont, IL) in support of in-process experiments being performed at DIII-D (San Diego, CA). This represents a new paradigm for combining geographically distant experimental and high performance computing (HPC) facilities to provide enhanced data analysis that is quickly available to researchers. Enhanced analysis improves the understanding of the current pulse, translating into a more efficient use of experimental resources and higher-quality resultant science. The analysis code used here, called SURFMN, calculates the magnetic structure of the plasma using a Fourier transform. Increasing the number of Fourier components provides a more accurate determination of the stochastic boundary layer near the plasma edge by better resolving magnetic islands, but requires 26 minutes to complete using local DIII-D resources, putting it well outside the useful time range for between-pulse analysis. These islands relate to confinement and edge localized mode (ELM) suppression, and may be controlled by adjusting coil currents for the next pulse. Argonne has ensured on-demand execution of SURFMN by providing a reserved queue, a specialized service that launches the code after receiving an automatic trigger, and network access from the worker nodes for data transfer. Runs are executed on 252 cores of ALCF's Cooley cluster, and the data is available locally at DIII-D within three minutes of triggering. The original SURFMN design limits additional improvements from using more cores; however, our work shows a path forward where codes that benefit from thousands of processors can run between pulses.
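The trade-off the abstract describes, where more Fourier components resolve finer structure at higher cost, can be illustrated with a toy truncated-Fourier reconstruction; the signal below is invented for illustration and has nothing to do with SURFMN's actual field data:

```python
import numpy as np

def fourier_reconstruct(signal, n_modes):
    """Reconstruct a periodic signal keeping only its n_modes lowest Fourier modes."""
    coeffs = np.fft.rfft(signal)
    truncated = np.zeros_like(coeffs)
    truncated[:n_modes] = coeffs[:n_modes]
    return np.fft.irfft(truncated, n=len(signal))

x = np.linspace(0, 2 * np.pi, 512, endpoint=False)
# Toy profile: a smooth background plus a narrow island-like feature.
field = np.sin(x) + 0.3 * np.exp(-((x - np.pi) ** 2) / 0.01)

# Few modes miss the narrow feature; many modes capture it.
err_coarse = np.max(np.abs(field - fourier_reconstruct(field, 8)))
err_fine = np.max(np.abs(field - fourier_reconstruct(field, 64)))
```

The narrow feature only appears in the reconstruction once enough high-order modes are kept, mirroring why SURFMN needs more components, and hence more compute, to resolve small magnetic islands near the edge.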

  13. The availability of conventional forms of remotely sensed data

    USGS Publications Warehouse

    Sturdevant, James A.; Holm, Thomas M.

    1982-01-01

    For decades Federal and State agencies have been collecting aerial photographs of various film types and scales over parts of the United States. More recently, worldwide Earth resources data acquired by orbiting satellites have inundated the remote sensing community. Determining the types of remotely sensed data that are publicly available can be confusing to the land-resource manager, planner, and scientist. This paper is a summary of the more commonly used types of remotely sensed data (aircraft and satellite) and their public availability. Special emphasis is placed on the National High-Altitude Photography (NHAP) program and future remote-sensing satellites.

  14. Remote Assessment of Lunar Resource Potential

    NASA Technical Reports Server (NTRS)

    Taylor, G. Jeffrey

    1992-01-01

    Assessing the resource potential of the lunar surface requires a well-planned program to determine the chemical and mineralogical composition of the Moon's surface at a range of scales. The exploration program must include remote sensing measurements (from both Earth's surface and lunar orbit), robotic in situ analysis of specific places, and eventually, human field work by trained geologists. Remote sensing data is discussed. Resource assessment requires some idea of what resources will be needed. Studies thus far have concentrated on oxygen and hydrogen production for propellant and life support, He-3 for export as fuel for nuclear fusion reactors, and use of bulk regolith for shielding and construction materials. The measurement requirements for assessing these resources are given and discussed briefly.

  15. Computer applications in remote sensing education

    NASA Technical Reports Server (NTRS)

    Danielson, R. L.

    1980-01-01

Computer applications to instruction in any field may be divided into two broad generic classes: computer-managed instruction and computer-assisted instruction. The division is based on how frequently the computer affects the instructional process and how active a role the computer takes in actually providing instruction. There are no inherent characteristics of remote sensing education to preclude the use of one or both of these techniques, depending on the computer facilities available to the instructor. The characteristics of the two classes are summarized, potential applications to remote sensing education are discussed, and the advantages and disadvantages of computer applications to the instructional process are considered.

  16. Coal and Open-pit surface mining impacts on American Lands (COAL)

    NASA Astrophysics Data System (ADS)

    Brown, T. A.; McGibbney, L. J.

    2017-12-01

Mining is known to cause environmental degradation, but software tools to identify its impacts are lacking. However, remote sensing, spectral reflectance, and geographic data are readily available, and high-performance cloud computing resources exist for scientific research. Coal and Open-pit surface mining impacts on American Lands (COAL) provides a suite of algorithms and documentation to leverage these data and resources to identify evidence of mining and correlate it with environmental impacts over time. COAL was originally developed as a 2016-2017 senior capstone collaboration between scientists at the NASA Jet Propulsion Laboratory (JPL) and computer science students at Oregon State University (OSU). The COAL team implemented a free and open-source software library called "pycoal" in the Python programming language, which facilitated a case study of the effects of coal mining on water resources. Evidence of acid mine drainage associated with an open-pit coal mine in New Mexico was derived by correlating imaging spectrometer data from the JPL Airborne Visible/InfraRed Imaging Spectrometer - Next Generation (AVIRIS-NG), spectral reflectance data published by the USGS Spectroscopy Laboratory in the USGS Digital Spectral Library 06, and GIS hydrography data published by the USGS National Geospatial Program in The National Map. This case study indicated that the spectral and geospatial algorithms developed by COAL can be used successfully to analyze the environmental impacts of mining activities. Continued development of COAL has been supported by a Startup allocation award of high-performance computing resources from the Extreme Science and Engineering Discovery Environment (XSEDE). These resources allow the team to undertake further benchmarking, evaluation, and experimentation using multiple XSEDE resources. The opportunity to use computational infrastructure of this caliber will further enable the development of a science gateway to continue foundational COAL research. This work documents the original design and development of COAL and provides insight into continuing research efforts, which have potential applications beyond the project to environmental data science and other fields.
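One common way to correlate imaging-spectrometer pixels with library reflectance spectra is the spectral angle mapper; the sketch below illustrates the idea with invented four-band spectra and is not the actual pycoal implementation:

```python
import numpy as np

def spectral_angle(pixel, reference):
    """Spectral angle (radians) between a pixel spectrum and a library spectrum;
    smaller angles indicate a closer match."""
    cos = np.dot(pixel, reference) / (
        np.linalg.norm(pixel) * np.linalg.norm(reference)
    )
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify(pixel, library):
    """Assign the pixel to the library entry with the smallest spectral angle."""
    return min(library, key=lambda name: spectral_angle(pixel, library[name]))

# Toy 4-band spectra (illustrative values, not USGS library entries).
library = {
    "jarosite": np.array([0.2, 0.5, 0.4, 0.1]),
    "vegetation": np.array([0.1, 0.2, 0.7, 0.8]),
}
pixel = np.array([0.22, 0.48, 0.41, 0.12])
match = classify(pixel, library)
```

Because the angle depends only on spectral shape, not brightness, this kind of matching tolerates illumination differences, which is why it suits classifying minerals such as acid-mine-drainage indicators from airborne imagery.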

  17. The Sophia-Antipolis Conference: General presentation and basic documents. [remote sensing for agriculture, forestry, water resources, and environment management in France

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The procedures and techniques used in NASA's aerospace technology transfer program are reviewed for consideration in establishing priorities and bases for joint action by technicians and users of remotely sensed data in France. Particular emphasis is given to remote sensing in agriculture, forestry, water resources, environment management, and urban research.

  18. User Requirements for the Application of Remote Sensing in the Planning and Management of Water Resource Systems

    NASA Technical Reports Server (NTRS)

    Burgy, R. H.

    1972-01-01

Data relating to hydrologic and water resource systems and subsystems management are reported. Systems models, user applications, and remote sensing technology are covered. Parameters governing water resources include evapotranspiration, vegetation, precipitation, streams and estuaries, reservoirs and lakes, and unsaturated and saturated soil zones.

  19. Field Data Collection: an Essential Element in Remote Sensing Applications

    NASA Technical Reports Server (NTRS)

    Pettinger, L. R.

    1971-01-01

    Field data collected in support of remote sensing projects are generally used for the following purposes: (1) calibration of remote sensing systems, (2) evaluation of experimental applications of remote sensing imagery on small test sites, and (3) designing and evaluating operational regional resource studies and inventories which are conducted using the remote sensing imagery obtained. Field data may be used to help develop a technique for a particular application, or to aid in the application of that technique to a resource evaluation or inventory problem for a large area. Scientists at the Forestry Remote Sensing Laboratory have utilized field data for both purposes. How meaningful field data has been collected in each case is discussed.

  20. The economic value of remote sensing of earth resources from space: An ERTS overview and the value of continuity of service. Volume 7: Nonreplenishable natural resources: Minerals, fossil fuels and geothermal energy sources

    NASA Technical Reports Server (NTRS)

    Lietzke, K. R.

    1974-01-01

    The application of remotely-sensed information to the mineral, fossil fuel, and geothermal energy extraction industry is investigated. Public and private cost savings are documented in geologic mapping activities. Benefits and capabilities accruing to the ERS system are assessed. It is shown that remote sensing aids in resource extraction, as well as the monitoring of several dynamic phenomena, including disturbed lands, reclamation, erosion, glaciation, and volcanic and seismic activity.

  1. Virtual patient simulator for distributed collaborative medical education.

    PubMed

    Caudell, Thomas P; Summers, Kenneth L; Holten, Jim; Hakamata, Takeshi; Mowafi, Moad; Jacobs, Joshua; Lozanoff, Beth K; Lozanoff, Scott; Wilks, David; Keep, Marcus F; Saiki, Stanley; Alverson, Dale

    2003-01-01

    Project TOUCH (Telehealth Outreach for Unified Community Health; http://hsc.unm.edu/touch) investigates the feasibility of using advanced technologies to enhance education in an innovative problem-based learning format currently being used in medical school curricula, applying specific clinical case models, and deploying to remote sites/workstations. The University of New Mexico's School of Medicine and the John A. Burns School of Medicine at the University of Hawai'i face similar health care challenges in providing and delivering services and training to remote and rural areas. Recognizing that health care needs are local and require local solutions, both states are committed to improving health care delivery to their unique populations by sharing information and experiences through emerging telehealth technologies by using high-performance computing and communications resources. The purpose of this study is to describe the deployment of a problem-based learning case distributed over the National Computational Science Alliance's Access Grid. Emphasis is placed on the underlying technical components of the TOUCH project, including the virtual reality development tool Flatland, the artificial intelligence-based simulation engine, the Access Grid, high-performance computing platforms, and the software that connects them all. In addition, educational and technical challenges for Project TOUCH are identified. Copyright 2003 Wiley-Liss, Inc.

  2. NASA High Performance Computing and Communications program

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Smith, Paul; Hunter, Paul

    1994-01-01

The National Aeronautics and Space Administration's HPCC program is part of a new Presidential initiative aimed at producing a 1000-fold increase in supercomputing speed and a 100-fold improvement in available communications capability by 1997. As more advanced technologies are developed under the HPCC program, they will be used to solve NASA's 'Grand Challenge' problems, which include improving the design and simulation of advanced aerospace vehicles, allowing people at remote locations to communicate more effectively and share information, increasing scientists' abilities to model the Earth's climate and forecast global environmental trends, and improving the development of advanced spacecraft. NASA's HPCC program is organized into three projects which are unique to the agency's mission: the Computational Aerosciences (CAS) project, the Earth and Space Sciences (ESS) project, and the Remote Exploration and Experimentation (REE) project. An additional project, the Basic Research and Human Resources (BRHR) project, exists to promote long term research in computer science and engineering and to increase the pool of trained personnel in a variety of scientific disciplines. This document presents an overview of the objectives and organization of these projects, as well as summaries of early accomplishments and the significance, status, and plans for individual research and development programs within each project. Areas of emphasis include benchmarking, testbeds, software and simulation methods.

  3. TERSSE: Definition of the Total Earth Resources System for the Shuttle Era. Volume 3: Mission and System Requirements for the Total Earth Resources System

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Resource management missions to be performed by TERSSE are described. Mission and user requirements are defined along with information flows developed for each major resource management mission. Other topics discussed include: remote sensing platforms, remote sensor requirements, ground system architecture, and such related issues as cloud cover, resolution, orbit mechanics, and aircraft versus satellite.

  4. Hypersonic Research Vehicle (HRV) real-time flight test support feasibility and requirements study. Part 2: Remote computation support for flight systems functions

    NASA Technical Reports Server (NTRS)

    Rediess, Herman A.; Hewett, M. D.

    1991-01-01

The requirements for the use of remote computation to support HRV flight testing are assessed. First, remote computational requirements were developed to support functions that will eventually be performed onboard operational vehicles of this type. These functions either cannot be performed onboard in the time frame of initial HRV flight test programs, because the technology of airborne computers will not be sufficiently advanced to support the required computational loads, or are undesirable to perform onboard in the flight test program for other reasons. Second, remote computational support either required or highly desirable for conducting the flight testing itself was addressed. The use of an Automated Flight Management System, which is described in conceptual detail, is proposed. Third, autonomous operations are discussed, and finally, unmanned operations.

  5. Supporting research sites in resource-limited settings: Challenges in implementing IT infrastructure

    PubMed Central

    Whalen, Christopher; Donnell, Deborah; Tartakovsky, Michael

    2014-01-01

    As Information and Communication Technology infrastructure becomes more reliable, new methods of Electronic Data Capture (EDC), datamarts/Data warehouses, and mobile computing provide platforms for rapid coordination of international research projects and multisite studies. However, despite the increasing availability of internet connectivity and communication systems in remote regions of the world, there are still significant obstacles. Sites with poor infrastructure face serious challenges participating in modern clinical and basic research, particularly that relying on EDC and internet communication technologies. This report discusses our experiences in supporting research in resource-limited settings (RLS). We describe examples of the practical and ethical/regulatory challenges raised by use of these newer technologies for data collection in multisite clinical studies. PMID:24321986

  6. Supporting research sites in resource-limited settings: challenges in implementing information technology infrastructure.

    PubMed

    Whalen, Christopher J; Donnell, Deborah; Tartakovsky, Michael

    2014-01-01

    As information and communication technology infrastructure becomes more reliable, new methods of electronic data capture, data marts/data warehouses, and mobile computing provide platforms for rapid coordination of international research projects and multisite studies. However, despite the increasing availability of Internet connectivity and communication systems in remote regions of the world, there are still significant obstacles. Sites with poor infrastructure face serious challenges participating in modern clinical and basic research, particularly that relying on electronic data capture and Internet communication technologies. This report discusses our experiences in supporting research in resource-limited settings. We describe examples of the practical and ethical/regulatory challenges raised by the use of these newer technologies for data collection in multisite clinical studies.

  7. Quarterly literature review of the remote sensing of natural resources, third quarter 1976. [bibliography

    NASA Technical Reports Server (NTRS)

    1976-01-01

    Abstracts related to remote sensing instrumentation and techniques, and to the remote sensing of natural resources are presented by the Technology Application Center at the University of New Mexico. Areas of interest included theory, general surveys, and miscellaneous studies; geology and hydrology; agriculture and forestry; marine sciences; and urban and land use. An alphabetically arranged Author/Key Word index is provided.

  8. For multidisciplinary research on the application of remote sensing to water resources problems. [including crop yield, watershed soils, and vegetation mapping in Wisconsin

    NASA Technical Reports Server (NTRS)

    Kiefer, R. W. (Principal Investigator)

    1979-01-01

    Research on the application of remote sensing to problems of water resources was concentrated on sediments and associated nonpoint source pollutants in lakes. Further transfer of the technology of remote sensing and the refinement of equipment and programs for thermal scanning and the digital analysis of images were also addressed.

  9. Recovery Act. Direct Confirmation of Commercial Geothermal Resources in Colorado Using Remote Sensing and On-Site Exploration, Testing, and Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foley, Paul; Skeehan, Kirsten; Smith, Jerome

Report on the confirmation of commercial geothermal resources in Colorado, describing the on-site testing and analysis performed to confirm potential resources identified by remote sensing. A series of thermal gradient wells were drilled in the Pagosa Springs region, and the collected data are analyzed within.

  10. Remote sensing in Michigan for land resource management

    NASA Technical Reports Server (NTRS)

    Sattinger, I. J.; Istvan, L. B.; Roller, N. E. G.; Lowe, D. S.

    1977-01-01

    An extensive program was conducted to establish practical uses of NASA earth resource survey technology in meeting resource management problems throughout Michigan. As a result, a broad interest in and understanding of the usefulness of remote sensing methods was developed and a wide variety of applications was undertaken to provide information needed for informed decision making and effective action.

  11. A multipurpose computing center with distributed resources

    NASA Astrophysics Data System (ADS)

    Chudoba, J.; Adam, M.; Adamová, D.; Kouba, T.; Mikula, A.; Říkal, V.; Švec, J.; Uhlířová, J.; Vokáč, P.; Svatoš, M.

    2017-10-01

The Computing Center of the Institute of Physics (CC IoP) of the Czech Academy of Sciences serves a broad spectrum of users with various computing needs. It runs a WLCG Tier-2 center for the ALICE and ATLAS experiments; the same group of services is used by the astroparticle physics projects the Pierre Auger Observatory (PAO) and the Cherenkov Telescope Array (CTA). An OSG stack is installed for the NOvA experiment. Other groups of users use the local batch system directly. Storage capacity is distributed across several locations. The DPM servers used by ATLAS and the PAO are all in the same server room, but several xrootd servers for the ALICE experiment are operated at the Nuclear Physics Institute in Řež, about 10 km away. The storage capacity for ATLAS and the PAO is extended by resources of CESNET, the Czech National Grid Initiative representative. Those resources are in Plzen and Jihlava, more than 100 km away from the CC IoP. Both distant sites use a hierarchical storage solution based on disks and tapes. They installed one common dCache instance, which is published in the CC IoP BDII. ATLAS users can use these resources with the standard ATLAS tools in the same way as the local storage, without noticing the geographical distribution. The computing clusters LUNA and EXMAG, dedicated mostly to users from the Solid State Physics departments, offer resources for parallel computing. They are part of the Czech NGI infrastructure MetaCentrum, with a distributed batch system based on Torque with a custom scheduler. The clusters are installed remotely by the MetaCentrum team, and a local contact helps only when needed. Users from the IoP have exclusive access to only a part of these two clusters and take advantage of higher priorities on the rest (1500 cores in total), which can also be used by any user of MetaCentrum. IoP researchers can also use distant resources located in several towns of the Czech Republic, with a capacity of more than 12000 cores in total.

  12. Information Extraction of Tourist Geological Resources Based on 3d Visualization Remote Sensing Image

    NASA Astrophysics Data System (ADS)

    Wang, X.

    2018-04-01

Tourism geological resources have high aesthetic, scientific, and educational value, and need to be protected and rationally utilized. In the past, most remote sensing investigations of tourism geological resources used two-dimensional interpretation methods, which made some geological heritages difficult to interpret and led to the omission of some information. The aim of this paper is to assess the value of a method that uses three-dimensional visual remote sensing imagery to extract information on geological heritages. The Skyline software system is applied to fuse 0.36 m aerial images and a 5 m interval DEM to establish a digital earth model. Based on three-dimensional shape, color tone, shadow, texture, and other image features, the distribution of tourism geological resources in Shandong Province and the locations of geological heritage sites were obtained, including geological structures, DaiGu landforms, granite landforms, volcanic landforms, sandy landforms, waterscapes, etc. The results show that remote sensing interpretation using this method is highly recognizable, making the interpretation more accurate and comprehensive.

  13. Joint Workshop on New Technologies for Lunar Resource Assessment

    NASA Technical Reports Server (NTRS)

    Elphic, Rick C. (Editor); Mckay, David S. (Editor)

    1992-01-01

The workshop included talks on NASA's and DOE's roles in the Space Exploration Initiative, lunar geology, lunar resources, the strategy for the first lunar outpost, and an industry perspective on lunar resources. The sessions focused on four major aspects of lunar resource assessment: (1) Earth-based remote sensing of the Moon; (2) lunar orbital remote sensing; (3) lunar lander and roving investigations; and (4) geophysical and engineering considerations. The workshop ended with a spirited discussion of a number of issues related to resource assessment.

  14. Proceedings of the National Conference on Energy Resource Management. Volume 1: Techniques, Procedures and Data Bases

    NASA Technical Reports Server (NTRS)

    Brumfield, J. O. (Editor); Schiffman, Y. M. (Editor)

    1982-01-01

    Topics dealing with the integration of remotely sensed data with geographic information systems for application in energy resource management are discussed. Associated remote sensing and image analysis techniques are also addressed.

  15. Flow Ambiguity: A Path Towards Classically Driven Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Mantri, Atul; Demarie, Tommaso F.; Menicucci, Nicolas C.; Fitzsimons, Joseph F.

    2017-07-01

    Blind quantum computation protocols allow a user to delegate a computation to a remote quantum computer in such a way that the privacy of their computation is preserved, even from the device implementing the computation. To date, such protocols are only known for settings involving at least two quantum devices: either a user with some quantum capabilities and a remote quantum server or two or more entangled but noncommunicating servers. In this work, we take the first step towards the construction of a blind quantum computing protocol with a completely classical client and single quantum server. Specifically, we show how a classical client can exploit the ambiguity in the flow of information in measurement-based quantum computing to construct a protocol for hiding critical aspects of a computation delegated to a remote quantum computer. This ambiguity arises due to the fact that, for a fixed graph, there exist multiple choices of the input and output vertex sets that result in deterministic measurement patterns consistent with the same fixed total ordering of vertices. This allows a classical user, computing only measurement angles, to drive a measurement-based computation performed on a remote device while hiding critical aspects of the computation.

  16. Managing the CMS Data and Monte Carlo Processing during LHC Run 2

    NASA Astrophysics Data System (ADS)

    Wissing, C.; CMS Collaboration

    2017-10-01

    In order to cope with the challenges expected during LHC Run 2, CMS introduced a number of enhancements to its main software packages and to the tools used for centrally managed processing. In this presentation we highlight the improvements that allow CMS to deal with the increased trigger output rate, the increased pileup and the evolution in computing technology. The overall system aims at high flexibility, improved operational efficiency and largely automated procedures. The tight coupling of workflow classes to types of sites has been drastically relaxed. Reliable and high-performing networking between most of the computing sites and the successful deployment of a data federation allow the execution of workflows using remote data access. This required the development of a largely automated system to assign workflows and to handle the necessary pre-staging of data. Another step towards flexibility has been the introduction of one large global HTCondor pool for all types of processing workflows and analysis jobs. Besides classical Grid resources, some opportunistic resources as well as Cloud resources have been integrated into that pool, which provides access to more than 200k CPU cores.

  17. Quarterly literature review of the remote sensing of natural resources

    NASA Technical Reports Server (NTRS)

    Fears, C. B. (Editor); Inglis, M. H. (Editor)

    1977-01-01

    The Technology Application Center reviewed abstracted literature sources and selected documents on data and data-gathering techniques performed or obtained remotely from space, aircraft, or ground-based stations. All of the documentation was related to remote sensing sensors or the remote sensing of natural resources. Sensors were primarily those operating within the 10⁻⁸ m to 1 m wavelength band. Included are NASA Tech Briefs, ARAC Industrial Applications Reports, U.S. Navy technical reports, U.S. patent reports, and other technical articles and reports.

  18. Massive Cloud-Based Big Data Processing for Ocean Sensor Networks and Remote Sensing

    NASA Astrophysics Data System (ADS)

    Schwehr, K. D.

    2017-12-01

    Until recently, the work required to integrate and analyze data for global-scale environmental issues was prohibitive both in cost and availability. Traditional desktop processing systems are not able to effectively store and process all the data, and supercomputer solutions are financially out of the reach of most people. The availability of large-scale cloud computing has created tools that are usable by small groups and individuals regardless of financial resources or locally available computational resources. These systems give scientists and policymakers the ability to see how critical resources are being used across the globe with little or no barrier to entry. Google Earth Engine has the Moderate Resolution Imaging Spectroradiometer (MODIS) Terra, MODIS Aqua, and Global Land Data Assimilation Systems (GLDAS) data catalogs available live online. Here we use these data to calculate the correlation between lagged chlorophyll and rainfall to identify areas of eutrophication, matching these events to ocean currents from datasets such as the HYbrid Coordinate Ocean Model (HYCOM) to check whether oceanographic conditions constrain them. The system can provide additional ground truth with observations from sensor networks such as the International Comprehensive Ocean-Atmosphere Data Set / Voluntary Observing Ship (ICOADS/VOS) and Argo floats. This presentation is intended to introduce users to the datasets, programming idioms, and functionality of Earth Engine for large-scale, data-driven oceanography.
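    The lagged chlorophyll-rainfall correlation described above can be sketched outside Earth Engine with plain NumPy. The series and the lag below are synthetic, chosen only to show the mechanics of scanning lags for the strongest response:

    ```python
    import numpy as np

    def lagged_correlation(chlorophyll, rainfall, lag):
        """Pearson correlation between chlorophyll and rainfall shifted by `lag` steps.

        A strong correlation at lag k suggests rainfall precedes a chlorophyll
        response k time steps later (e.g. runoff-driven eutrophication).
        """
        if lag > 0:
            chl, rain = chlorophyll[lag:], rainfall[:-lag]
        else:
            chl, rain = chlorophyll, rainfall
        return np.corrcoef(chl, rain)[0, 1]

    # Synthetic example: chlorophyll echoes rainfall 3 steps later, plus noise.
    rng = np.random.default_rng(0)
    rain = rng.gamma(2.0, 1.0, size=200)
    chl = np.roll(rain, 3) + 0.1 * rng.standard_normal(200)
    best_lag = max(range(10), key=lambda k: lagged_correlation(chl, rain, k))
    print(best_lag)  # the strongest correlation appears at the built-in lag of 3
    ```

    In Earth Engine the same scan would run over per-pixel image-collection composites rather than 1-D arrays, but the lag logic is identical.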

  19. Distributed Processing of Sentinel-2 Products using the BIGEARTH Platform

    NASA Astrophysics Data System (ADS)

    Bacu, Victor; Stefanut, Teodor; Nandra, Constantin; Mihon, Danut; Gorgan, Dorian

    2017-04-01

    The constellation of observational satellites orbiting the Earth is constantly growing, providing ever more data that must be processed in order to extract meaningful information and knowledge. The Sentinel-2 satellites, part of the Copernicus Earth Observation programme, are intended for use in agriculture, forestry and many other land management applications. ESA's SNAP toolbox can be used to process data gathered by the Sentinel-2 satellites, but it is limited to the resources of a stand-alone computer. In this paper we present a cloud-based software platform that uses this toolbox, together with other remote sensing software applications, to process Sentinel-2 products. The BIGEARTH software platform [1] offers an integrated solution for processing Earth Observation data coming from different sources (such as satellites or on-site sensors). The flow of processing is defined as a chain of tasks based on the WorDeL description language [2]. Each task can rely on a different software technology (such as GRASS GIS or ESA's SNAP) to process the input data. One important feature of the BIGEARTH platform is this ability to interconnect and integrate various well-known software technologies within the same processing flow. All of this integration is transparent from the user's perspective. The proposed platform extends the SNAP capabilities by enabling specialists to easily scale the processing over distributed architectures, according to their specific needs and resources. The software platform [3] can be used in multiple configurations. In the basic one, the platform runs as a standalone application inside a virtual machine. The computational resources are obviously limited in this case, but it gives an overview of the platform's functionality and the possibility to define a processing flow and later execute it on a more complex infrastructure.
The most complex and robust configuration is based on cloud computing and allows installation on a private or public cloud infrastructure. In this configuration, processing resources can be allocated dynamically and the execution time can be improved considerably, depending on the available virtual resources and the number of parallelizable sequences in the processing flow. The presentation highlights the benefits and issues of the proposed solution by analyzing some significant experimental use cases. Main references for further information: [1] BigEarth project, http://cgis.utcluj.ro/projects/bigearth. [2] Constantin Nandra, Dorian Gorgan, "Defining Earth data batch processing tasks by means of a flexible workflow description language", ISPRS Ann. Photogramm. Remote Sens. Spatial Inf. Sci., III-4, 59-66, (2016). [3] Victor Bacu, Teodor Stefanut, Dorian Gorgan, "Adaptive Processing of Earth Observation Data on Cloud Infrastructures Based on Workflow Description", Proceedings of the Intelligent Computer Communication and Processing (ICCP), IEEE Press, pp. 444-454, (2015).

  20. The Image Data Resource: A Bioimage Data Integration and Publication Platform.

    PubMed

    Williams, Eleanor; Moore, Josh; Li, Simon W; Rustici, Gabriella; Tarkowska, Aleksandra; Chessel, Anatole; Leo, Simone; Antal, Bálint; Ferguson, Richard K; Sarkans, Ugis; Brazma, Alvis; Salas, Rafael E Carazo; Swedlow, Jason R

    2017-08-01

    Access to primary research data is vital for the advancement of science. To extend the data types supported by community repositories, we built a prototype Image Data Resource (IDR) that collects and integrates imaging data acquired across many different imaging modalities. IDR links data from several imaging modalities, including high-content screening, super-resolution and time-lapse microscopy, digital pathology, public genetic or chemical databases, and cell and tissue phenotypes expressed using controlled ontologies. Using this integration, IDR facilitates the analysis of gene networks and reveals functional interactions that are inaccessible to individual studies. To enable re-analysis, we also established a computational resource based on Jupyter notebooks that allows remote access to the entire IDR. IDR is also an open source platform that others can use to publish their own image data. Thus IDR provides both a novel on-line resource and a software infrastructure that promotes and extends publication and re-analysis of scientific image data.

  1. Running VisIt Software on the Peregrine System | High-Performance Computing

    Science.gov Websites

    VisIt features a robust remote visualization capability. VisIt can be started on a local machine and used to visualize data on a remote compute cluster; the remote machine must be able to send data back to the local client, and the VisIt module must be loaded on the remote machine to enable remote visualization.

  2. Remote sensing change detection tools for natural resource managers: Understanding concepts and tradeoffs in the design of landscape monitoring projects

    Treesearch

    Robert E. Kennedy; Philip A. Townsend; John E. Gross; Warren B. Cohen; Paul Bolstad; Wang Y. Q.; Phyllis Adams

    2009-01-01

    Remote sensing provides a broad view of landscapes and can be consistent through time, making it an important tool for monitoring and managing protected areas. An impediment to broader use of remote sensing science for monitoring has been the need for resource managers to understand the specialized capabilities of an ever-expanding array of image sources and analysis...

  3. Literature review of the remote sensing of natural resources. [bibliography

    NASA Technical Reports Server (NTRS)

    Fears, C. B. (Editor); Inglis, M. H. (Editor)

    1977-01-01

    Abstracts of 596 documents related to remote sensors or the remote sensing of natural resources by satellite, aircraft, or ground-based stations are presented. Topics covered include general theory, geology and hydrology, agriculture and forestry, marine sciences, urban land use, and instrumentation. Recent documents not yet cited in any of the seven information sources used for the compilation are summarized. An author/key word index is provided.

  4. An integrated study of earth resources in the state of California using remote sensing techniques

    NASA Technical Reports Server (NTRS)

    1973-01-01

    University of California investigations to determine the usefulness of modern remote sensing techniques have concentrated on the water resources of the state. The studies consider in detail the supply, demand, and impact relationships.

  5. Natural Resource Monitoring of Rheum tanguticum by Multilevel Remote Sensing

    PubMed Central

    Xie, Caixiang; Song, Jingyuan; Suo, Fengmei; Li, Xiwen; Li, Ying; Yu, Hua; Xu, Xiaolan; Luo, Kun; Li, Qiushi; Xin, Tianyi; Guan, Meng; Xu, Xiuhai; Miki, Eiji; Takeda, Osami; Chen, Shilin

    2014-01-01

    Remote sensing has been extensively applied in agriculture for its objectiveness and promptness. However, few applications are available for monitoring natural medicinal plants. In this paper, a multilevel monitoring system, which includes satellite and aerial remote sensing as well as ground investigation, is proposed to monitor the natural Rheum tanguticum resource in Baihe Pasture, Zoige County, Sichuan Province. The amount of R. tanguticum estimated from images is M = S·ρ, where S is the vegetation coverage obtained from satellite imaging and ρ is the R. tanguticum density obtained from low-altitude imaging. Because of the 0.1 m resolution of the remote sensing image, only R. tanguticum plants whose coverage exceeded 1 m² could be recognized (the currently effective resource), whereas the ground investigation captured the amounts of R. tanguticum of all sizes (the future resource). The data showed that the presently available amount of R. tanguticum accounted for 4% to 5% of the total quantity. The quantity information and the population structure of R. tanguticum in Baihe Pasture were confirmed for the first time by this system. The approach is feasible for monitoring the quantitative distribution of natural medicinal plants with scattered distribution. PMID:25101134
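    The two-level estimate reduces to a single multiplication, M = S·ρ. A minimal sketch with illustrative numbers (the figures are hypothetical, not taken from the survey):

    ```python
    # M = S * rho, with S the vegetation coverage (from satellite imaging) and
    # rho the R. tanguticum density per unit of covered area (from low-altitude
    # aerial imaging).

    def effective_resource(vegetation_coverage_m2: float, density_per_m2: float) -> float:
        """Amount M of recognizable R. tanguticum: M = S * rho."""
        return vegetation_coverage_m2 * density_per_m2

    S = 1.2e6      # hypothetical vegetation coverage in m^2
    rho = 0.004    # hypothetical plants per m^2 of covered area
    M = effective_resource(S, rho)
    print(M)  # about 4800 plants
    ```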

  6. a Framework for Capacity Building in Mapping Coastal Resources Using Remote Sensing in the Philippines

    NASA Astrophysics Data System (ADS)

    Tamondong, A.; Cruz, C.; Ticman, T.; Peralta, R.; Go, G. A.; Vergara, M.; Estabillo, M. S.; Cadalzo, I. E.; Jalbuena, R.; Blanco, A.

    2016-06-01

    Remote sensing has been an effective technology in mapping natural resources by reducing the costs and field data gathering time and bringing in timely information. With the launch of several earth observation satellites, an increase in the availability of satellite imageries provides an immense selection of data for the users. The Philippines has recently embarked in a program which will enable the gathering of LiDAR data in the whole country. The capacity of the Philippines to take advantage of these advancements and opportunities is lacking. There is a need to transfer the knowledge of remote sensing technology to other institutions to better utilize the available data. Being an archipelagic country with approximately 36,000 kilometers of coastline, and most of its people depending on its coastal resources, remote sensing is an optimal choice in mapping such resources. A project involving fifteen (15) state universities and colleges and higher education institutions all over the country headed by the University of the Philippines Training Center for Applied Geodesy and Photogrammetry and funded by the Department of Science and Technology was formed to carry out the task of capacity building in mapping the country's coastal resources using LiDAR and other remotely sensed datasets. This paper discusses the accomplishments and the future activities of the project.

  7. Utilizing remote sensing of Thematic Mapper data to improve our understanding of estuarine processes and their influence on the productivity of estuarine-dependent fisheries

    NASA Technical Reports Server (NTRS)

    Browder, J. A.; May, L. N., Jr.; Rosenthal, A.; Baumann, R. H.; Gosselink, J. G.

    1986-01-01

    LANDSAT thematic mapper (TM) data are being used to refine and validate a stochastic spatial computer model to be applied to coastal resource management problems in Louisiana. Two major aspects of the research are: (1) the measurement of area of land (or emergent vegetation) and water and the length of the interface between land and water in TM imagery of selected coastal wetlands (sample marshes); and (2) the comparison of spatial patterns of land and water in the sample marshes of the imagery to that in marshes simulated by a computer model. In addition to activities in these two areas, the potential use of a published autocorrelation statistic is analyzed.

  8. Computer classification of remotely sensed multispectral image data by extraction and classification of homogeneous objects

    NASA Technical Reports Server (NTRS)

    Kettig, R. L.

    1975-01-01

    A method of classification of digitized multispectral images is developed and experimentally evaluated on actual earth resources data collected by aircraft and satellite. The method is designed to exploit the characteristic dependence between adjacent states of nature that is neglected by the more conventional simple-symmetric decision rule. Thus contextual information is incorporated into the classification scheme. The principal reason for doing this is to improve the accuracy of the classification. For general types of dependence this would require more computation per resolution element than the simple-symmetric classifier. But when the dependence occurs in the form of redundancy, the elements can be classified collectively, in groups, thereby reducing the number of classifications required.
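    The idea of classifying redundant elements collectively can be illustrated with a toy block-wise classifier. This is a simplified stand-in, not the original algorithm: the homogeneity test and class statistics are replaced by fixed blocks and a nearest-centroid rule.

    ```python
    import numpy as np

    def classify_blocks(image, centroids, block=2):
        """Group pixels into small blocks and classify each block collectively
        by the distance of its mean spectrum to the class centroids, instead of
        labelling every pixel independently."""
        h, w, bands = image.shape
        labels = np.zeros((h, w), dtype=int)
        for i in range(0, h, block):
            for j in range(0, w, block):
                mean_spec = image[i:i+block, j:j+block].reshape(-1, bands).mean(axis=0)
                labels[i:i+block, j:j+block] = np.argmin(
                    np.linalg.norm(centroids - mean_spec, axis=1))
        return labels

    # Two spectral classes; a noisy 3-band image whose left half is class 0.
    rng = np.random.default_rng(1)
    centroids = np.array([[0.2, 0.4, 0.1], [0.7, 0.3, 0.6]])
    img = np.empty((4, 8, 3))
    img[:, :4] = centroids[0] + 0.05 * rng.standard_normal((4, 4, 3))
    img[:, 4:] = centroids[1] + 0.05 * rng.standard_normal((4, 4, 3))
    labels = classify_blocks(img, centroids)
    print(labels[0, 0], labels[0, 7])  # left half -> 0, right half -> 1
    ```

    Averaging within a homogeneous block suppresses per-pixel noise before the decision is made, which is the source of both the accuracy gain and the reduced number of classifications.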

  9. Aerospace technology can be applied to exploration 'back on earth'. [offshore petroleum resources

    NASA Technical Reports Server (NTRS)

    Jaffe, L. D.

    1977-01-01

    Applications of aerospace technology to petroleum exploration are described. Attention is given to seismic reflection techniques, sea-floor mapping, remote geochemical sensing, improved drilling methods, and down-hole acoustic concepts such as down-hole seismic tomography. The seismic reflection techniques include monitoring of swept-frequency explosive or solid-propellant seismic sources, as well as aerial seismic surveys. Telemetry and processing of seismic data may also be performed through use of aerospace technology. Sea-floor sonar imaging and a computer-aided system of geologic analogies for petroleum exploration are also considered.

  10. A modular (almost) automatic set-up for elastic multi-tenants cloud (micro)infrastructures

    NASA Astrophysics Data System (ADS)

    Amoroso, A.; Astorino, F.; Bagnasco, S.; Balashov, N. A.; Bianchi, F.; Destefanis, M.; Lusso, S.; Maggiora, M.; Pellegrino, J.; Yan, L.; Yan, T.; Zhang, X.; Zhao, X.

    2017-10-01

    An auto-installing tool on a USB drive allows quick and easy automatic deployment of OpenNebula-based cloud infrastructures remotely managed by a central VMDIRAC instance. A single team, in the main site of an HEP collaboration or elsewhere, can manage and run a relatively large network of federated (micro-)cloud infrastructures, making highly dynamic and elastic use of computing resources. Exploiting such an approach can lead to modular systems of cloud-bursting infrastructures addressing complex real-life scenarios.

  11. Bringing the medical library to the office desktop.

    PubMed

    Brown, S R; Decker, G; Pletzke, C J

    1991-01-01

    This demonstration illustrates LRC Remote Computer Services: a dual operating system, multi-protocol system for delivering medical library services to the medical professional's desktop. A working model draws resources from CD-ROM and magnetic-media file services; Novell and AppleTalk network protocol suites and gateways; LAN and asynchronous (dial-in) access strategies; commercial applications for MS-DOS and Macintosh workstations; and custom user interfaces. The demonstration includes a discussion of issues relevant to the delivery of these services, particularly with respect to maintenance, security, training/support, staffing, software licensing and costs.

  12. Accommodating Student Diversity in Remote Sensing Instruction.

    ERIC Educational Resources Information Center

    Hammen, John L., III.

    1992-01-01

    Discusses the difficulty of teaching computer-based remote sensing to students of varying levels of computer literacy. Suggests an instructional method that accommodates all levels of technical expertise through the use of microcomputers. Presents a curriculum that includes an introduction to remote sensing, digital image processing, and…

  13. Identification of Critical Erosion Prone Areas and Computation of Sediment Yield Using Remote Sensing and GIS: A Case Study on Sarada River Basin

    NASA Astrophysics Data System (ADS)

    Sundara Kumar, P.; Venkata Praveen, T.; Anjanaya Prasad, M.; Santha Rao, P.

    2018-06-01

    Land and water are the two most important resources nature has given to mankind, and these gifts have to be conserved and maintained with unflinching effort for an effective environmental and ecological balance; the energies of water resources engineers and conservationists are directed toward conserving them. The present study is an attempt to develop a suitable methodology to help decision makers conserve these resources. The main focus of this study is to identify areas critically prone to soil erosion and to compute sediment yield in a small basin, using the Universal Soil Loss Equation (USLE) and the Modified Universal Soil Loss Equation (MUSLE) respectively. The developed model has been applied to the Sarada river basin, which has a drainage area of 1252.99 km² and is located in Andhra Pradesh State (AP), India. The basin has been divided into micro basins for effective estimation and for precise identification of the areas prone to soil erosion. Remote Sensing and Geographic Information Systems tools were used to generate and spatially organize the data required for soil erosion modeling. It was found that the micro basins with very severe soil erosion consist of hilly areas with a high topographic factor, and that 38.01% of the study area has an erosion rate of more than 20 t/ha/year, hence requiring immediate attention from the soil conservation point of view. In this study region, although there is one discharge gauging station at Anakapalli, there is no means of gauging sediment yield, so arriving at the suspended-sediment concentration was a challenging task.
In the present study, sediment was measured with a DH-48 sediment sampler, as per IS: 4890-1968. Suspended-sediment samples were collected and the sediment yield at the site was derived from them. The sediment yield was also computed using MUSLE; data for this model study were generated from samples collected from 28 storm events spread over one year at the outlet of the basin at Anakapalli. The sediment yield estimated by the MUSLE model compared well with the sediment yield measured at the basin outlet, with fairly good correlation between them. Hence the developed methodology will be useful for estimating sediment yield in hydrologically similar basins that are not gauged for sediment yield.
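    For a single storm event, MUSLE (Williams, 1975) gives sediment yield from runoff energy rather than rainfall energy. A minimal sketch of the standard metric form; the event values below are illustrative, not from the Sarada basin:

    ```python
    def musle_sediment_yield(runoff_volume_m3, peak_flow_m3s, K, LS, C, P):
        """Storm-event sediment yield (tonnes) by MUSLE:
            Sy = 11.8 * (Q * qp)**0.56 * K * LS * C * P
        where Q is the runoff volume (m^3), qp the peak flow rate (m^3/s), and
        K, LS, C, P are the usual USLE erodibility, topographic, cover, and
        practice factors.
        """
        return 11.8 * (runoff_volume_m3 * peak_flow_m3s) ** 0.56 * K * LS * C * P

    # Hypothetical storm event on a small basin:
    Sy = musle_sediment_yield(runoff_volume_m3=5.0e4, peak_flow_m3s=12.0,
                              K=0.30, LS=1.8, C=0.45, P=1.0)
    print(round(Sy, 1))  # event sediment yield in tonnes
    ```

    Summing such event yields over the 28 monitored storms would give the annual estimate compared against the measured values at the outlet.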

  14. What stresses remote area nurses? Current knowledge and future action.

    PubMed

    Lenthall, Sue; Wakerman, John; Opie, Tess; Dollard, Maureen; Dunn, Sandra; Knight, Sabina; Macleod, Martha; Watson, Colin

    2009-08-01

    Review and synthesise the literature identifying the stresses experienced by remote area nurses (RANs). Identify interventions implemented to address identified stresses. Explore the use of the job demands-resources (JD-R) model. A comprehensive literature review was conducted using the meta-databases Ovid and Informit. Remote Australian primary health care centres. The reported demands experienced by RANs can be grouped into four themes: (i) the remote context; (ii) workload and extended scope of practice; (iii) poor management; and (iv) violence in the workplace and community. In this high-demand, low-resource context, the JD-R model of occupational stress is particularly pertinent to examining occupational stress among RANs. The demands on RANs, such as the isolated geographical context, are immutable. However, there are key areas where resources can be enhanced to better meet the high level of need. These are: (i) adequate and appropriate education, training and orientation; (ii) appropriate funding of remote health services; and (iii) improved management practices and systems. There is a lack of empirical evidence relating to stresses experienced by RANs. The literature identifies some of the stresses experienced by RANs as unique to the remote context, while some are related to high demands coupled with a deficit of appropriate resources. Use of models, such as the JD-R model of occupational stress, might assist in identifying key areas where resources can be enhanced to better meet the high level of need and reduce RANs' levels of stress.

  15. Application of remote sensing, GIS and MCA techniques for delineating groundwater prospect zones in Kashipur block, Purulia district, West Bengal

    NASA Astrophysics Data System (ADS)

    Nag, S. K.; Kundu, Anindita

    2018-03-01

    Demand for groundwater resources has increased manifold with population expansion and the advent of modern civilization, and the assessment, planning and management of groundwater resources have become crucial and extremely urgent in recent times. The study area belongs to Kashipur block, Purulia district, West Bengal, and is characterized by a dry climate and hard-rock terrain. The objective of this study is to delineate groundwater potential zones for the assessment of groundwater availability using remote sensing, GIS and MCA techniques. Thematic layers such as hydrogeomorphology, slope and lineament density maps were converted to raster data in TNTmips Pro 2012. The multi-influencing factor (MIF) technique was used to assign weights and ranks to the input factor maps, and the weights assigned to each factor were computed statistically. Weighted index overlay modeling was then used to develop a groundwater potential zone map from the three weighted and scored parameters. Finally, the study area has been categorized into four distinct groundwater potential zones: excellent, 1.5% (6.45 sq. km); good, 53% (227.9 sq. km); moderate, 45% (193.5 sq. km); and poor, 0.5% (2.15 sq. km). The outcome of the present study will help local authorities, researchers, decision makers and planners formulate proper planning and management of groundwater resources in different hydrogeological situations.
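    Weighted index overlay of the kind described above is a per-cell weighted sum of ranked rasters, binned into potential classes. A toy sketch; the weights and class ranks are hypothetical, not the paper's actual MIF values:

    ```python
    import numpy as np

    # Normalized weights per thematic layer (hypothetical MIF scoring).
    weights = {"hydrogeomorphology": 0.40, "lineament_density": 0.35, "slope": 0.25}

    # Toy 2x3 rasters with class ranks already assigned (1 = poor ... 4 = excellent).
    layers = {
        "hydrogeomorphology": np.array([[4, 3, 2], [2, 1, 3]]),
        "lineament_density":  np.array([[3, 3, 1], [2, 2, 4]]),
        "slope":              np.array([[4, 2, 2], [1, 1, 3]]),
    }

    # Per-cell weighted sum, then binning into potential zones.
    score = sum(weights[name] * layers[name] for name in weights)
    zones = np.digitize(score, bins=[1.5, 2.5, 3.5])  # 0=poor ... 3=excellent
    print(score)
    print(zones)
    ```

    In a GIS the same operation runs over co-registered raster layers; the binning thresholds would be chosen from the score distribution rather than fixed in advance.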

  16. [Thematic Issue: Remote Sensing.

    ERIC Educational Resources Information Center

    Howkins, John, Ed.

    1978-01-01

    Four of the articles in this publication discuss the remote sensing of the Earth and its resources by satellites. Among the topics dealt with are the development and management of remote sensing systems, types of satellites used for remote sensing, the uses of remote sensing, and issues involved in using information obtained through remote…

  17. Remote sensing techniques in cultural resource management archaeology

    NASA Astrophysics Data System (ADS)

    Johnson, Jay K.; Haley, Bryan S.

    2003-04-01

    Cultural resource management archaeology in the United States concerns compliance with legislation set in place to protect archaeological resources from the impact of modern activities. Traditionally, surface collection, shovel testing, test excavation, and mechanical stripping are used in these projects. These methods are expensive, time consuming, and may poorly represent the features within archaeological sites. The use of remote sensing techniques in cultural resource management archaeology may provide an answer to these problems. Near-surface geophysical techniques, including magnetometry, resistivity, electromagnetics, and ground penetrating radar, have proven to be particularly successful at efficiently locating archaeological features. Research has also indicated that airborne and satellite remote sensing may hold some promise in the future for large-scale archaeological survey, although this is difficult in many areas of the world where ground cover reflects archaeological features only indirectly. A cost simulation of a hypothetical data recovery project on a large complex site in Mississippi is presented to illustrate the potential advantages of remote sensing in a cultural resource management setting. The results indicate these techniques can save a substantial amount of time and money for these projects.

  18. An efficient and secure dynamic ID-based authentication scheme for telecare medical information systems.

    PubMed

    Chen, Hung-Ming; Lo, Jung-Wen; Yeh, Chang-Kuo

    2012-12-01

    The rapidly increased availability of always-on broadband telecommunication environments and lower-cost vital signs monitoring devices brings the advantages of telemedicine directly into the patient's home. Hence, the control of access to remote medical servers' resources has become a crucial challenge. A secure authentication scheme between the medical server and remote users is therefore needed to safeguard data integrity and confidentiality and to ensure availability. Recently, many authentication schemes that use low-cost mobile devices have been proposed to meet these requirements. In contrast to previous schemes, Khan et al. proposed a dynamic ID-based remote user authentication scheme that reduces computational complexity and includes features such as a provision for the revocation of lost or stolen smart cards and a time expiry check for the authentication process. However, Khan et al.'s scheme has some security drawbacks. To remedy these, this study proposes an enhanced authentication scheme that overcomes the weaknesses inherent in Khan et al.'s scheme and demonstrates that the proposed scheme is more secure and robust for use in a telecare medical information system.

  19. Remote sensing in Michigan for land resource management

    NASA Technical Reports Server (NTRS)

    Lowe, D. S.; Istvan, L. B.; Roller, N. E. G.; Sellman, A. N.; Wagner, T. W.

    1975-01-01

    The utilization of NASA earth resource survey technology as an important aid in the solution of current problems in resource management and environmental protection in Michigan is discussed. Remote sensing techniques to aid Michigan government agencies were used to achieve the following results: (1) provide data on Great Lakes beach recession rates to establish shoreline zoning ordinances; (2) supply technical justification for public acquisition of land to establish the St. John's Marshland Recreation Area; (3) establish economical and effective methods for performing a statewide wetlands survey; (4) accomplish a variety of regional resource management actions in the Upper Peninsula; and (5) demonstrate improved soil survey methods. The project disseminated information on remote sensing technology and provided advice and assistance to a number of users in Michigan.

  20. EROS: A space program for Earth resources

    USGS Publications Warehouse

    Metz, G.G.; Wiepking, P.J.

    1980-01-01

    Within the technology of the space age lies a key to increased knowledge about the resources and environment of the Earth. This key is remote sensing: detecting the nature of an object without actually touching it. Although the photographic camera is the most familiar remote-sensing device, other instrument systems, such as scanning radiometers and radar, also can produce photographs and images. On the basis of the potential of this technology, and in response to the critical need for greater knowledge of the Earth and its resources, the Department of the Interior established the Earth Resources Observation Systems (EROS) Program to gather and use remotely sensed data on natural and manmade features of the Earth's surface, collected by satellite and aircraft.

  1. Application of Near-Surface Remote Sensing and computer algorithms in evaluating impacts of agroecosystem management on Zea mays (corn) phenological development in the Platte River - High Plains Aquifer Long Term Agroecosystem Research Network field sites.

    NASA Astrophysics Data System (ADS)

    Okalebo, J. A.; Das Choudhury, S.; Awada, T.; Suyker, A.; LeBauer, D.; Newcomb, M.; Ward, R.

    2017-12-01

    The Long-term Agroecosystem Research (LTAR) network is a USDA-ARS effort that focuses on conducting research that addresses current and emerging issues in agriculture related to sustainability and profitability of agroecosystems in the face of climate change and population growth. There are 18 sites across the USA covering key agricultural production regions. In Nebraska, a partnership between the University of Nebraska - Lincoln and ARD/USDA resulted in the establishment of the Platte River - High Plains Aquifer LTAR site in 2014. The site conducts research to sustain multiple ecosystem services, focusing specifically on Nebraska's main agronomic production agroecosystems, which comprise corn, soybean, managed grassland and beef production. As part of the national LTAR network, PR-HPA contributes near-surface remotely sensed imagery of corn, soybean and grassland canopy phenology to the PhenoCam Network through high-resolution digital cameras. This poster highlights the application, advantages and usefulness of near-surface remotely sensed imagery in agroecosystem studies and management. It demonstrates how both infrared and red-green-blue imagery may be applied to monitor phenological events as well as crop abiotic stresses. Computer-based algorithms and analytic techniques proved instrumental in revealing crop phenological changes such as green-up and tasseling in corn. This poster also reports the suitability and applicability of corn-derived computer-based algorithms for evaluating phenological development of sorghum, since both crops have similarities in their phenology, with sorghum panicles being similar to corn tassels. This latter assessment was carried out using a sorghum dataset obtained from the Transportation Energy Resources from Renewable Agriculture Phenotyping Reference Platform project, Maricopa Agricultural Center, Arizona.
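    Green-up signals of this kind are commonly extracted from PhenoCam-style RGB imagery via the green chromatic coordinate, GCC = G / (R + G + B), averaged over a canopy region of interest. A minimal sketch of that index (the region-averaging approach is an assumption, not a description of this site's pipeline):

```python
import numpy as np

def gcc(rgb: np.ndarray) -> float:
    """Green chromatic coordinate of an RGB image region:
    GCC = G / (R + G + B), averaged over all pixels.
    `rgb` is an (H, W, 3) array; zero-sum pixels are guarded."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    denom = r + g + b
    return float(np.mean(g / np.where(denom == 0, 1, denom)))
```

    Spring green-up then appears as a rise in the GCC time series computed from daily imagery, while senescence appears as a decline.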

  2. [Application of image recognition technology in census of national traditional Chinese medicine resources].

    PubMed

    Zhang, Xiao-Bo; Ge, Xiao-Guang; Jin, Yan; Shi, Ting-Ting; Wang, Hui; Li, Meng; Jing, Zhi-Xian; Guo, Lan-Ping; Huang, Lu-Qi

    2017-11-01

    With the development of computer and image-processing technology, image recognition has been applied at every stage of the national census of traditional Chinese medicine (TCM) resources. ① In the preparatory work, text recognition of paper materials assisted the digitization of the various categories of data related to TCM resources, toward a unified TCM resource library; and representative survey areas and plots for each census team were determined with the aid of satellite remote sensing imagery, vegetation maps and other basic data, using remote sensing image classification and related techniques. ② During field investigation, decision tree models, spectral features and object-oriented methods were used to assist regional identification of Chinese medicinal materials and to estimate their planting area accurately. ③ During in-house data processing, image recognition applied to individual plant photographs, specimens and names assisted the statistical summary of the types of TCM resources present in each region. ④ In applying the census results, a TCM resource identification app and a 3D display system for authentic medicinal materials were developed on the basis of the resource data and individual medicinal samples, assisting the identification of TCM resources and of the distinguishing characteristics of medicinal materials.
The introduction of image recognition into the census of TCM resources assists census personnel in their work: it not only reduces manual workload and improves efficiency, but also improves the informatization, sharing and application of census results. As the census of TCM resources deepens, image recognition technology will continue to play its distinctive role in the related work. Copyright© by the Chinese Pharmaceutical Association.
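    The decision-rule classification used for planting-area estimation can be illustrated with a toy spectral threshold; the NDVI rule, the threshold value and the band arrays below are hypothetical stand-ins for the trained decision tree the abstract describes:

```python
import numpy as np

def classify_pixels(red: np.ndarray, nir: np.ndarray,
                    ndvi_threshold: float = 0.4) -> np.ndarray:
    """Toy decision rule: label a pixel as the target crop when its
    NDVI exceeds a (hypothetical) threshold learned for that crop."""
    ndvi = (nir - red) / (nir + red + 1e-9)
    return ndvi > ndvi_threshold

def planted_area_ha(mask: np.ndarray, pixel_size_m: float) -> float:
    """Convert the classified pixel count to an area in hectares."""
    return float(mask.sum()) * (pixel_size_m ** 2) / 10_000.0
```

    With 30 m pixels, each classified pixel contributes 900 m², i.e. 0.09 ha, to the area estimate.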

  3. A Resource Package Training Framework for Producing Quality Graduates to Work in Rural, Regional and Remote Australia: A Global Perspective

    ERIC Educational Resources Information Center

    Lynch, Timothy

    2014-01-01

    The purpose of this paper is to advocate the resource package for producing quality graduates to work in rural, regional and remote Australia (TERRR Network), using a global perspective. This paper argues that the resource package achieves more than the objectives of the original project; "Developing Strategies at the Pre-service Level to…

  4. Integration and Exposure of Large Scale Computational Resources Across the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.

    2015-12-01

    As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate change, the combination of Earth observation data and global climate model projections is crucial not only to scientists but also to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyber infrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data with limited capabilities for server-side processing. Since the amount of data for future AR is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capabilities. This presentation will highlight the WPS API, its capabilities, provide implementation details, and discuss future developments.
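    A WPS Execute call of the kind the CWT API builds on can be sketched as an HTTP GET request; the endpoint URL, operation identifier and `datainputs` encoding below are illustrative assumptions, not the actual ESGF CWT wire format:

```python
import json
from urllib.parse import urlencode

def build_wps_execute(base_url: str, identifier: str,
                      variable: str, dataset: str, axes: str) -> str:
    """Compose a WPS 1.0.0 Execute request as a GET URL.
    The datainputs layout here is a simplified illustration."""
    datainputs = json.dumps({"variable": variable,
                             "dataset": dataset,
                             "axes": axes})
    params = {
        "service": "WPS",
        "version": "1.0.0",
        "request": "Execute",
        "identifier": identifier,   # e.g. a server-side averaging operation
        "datainputs": datainputs,
    }
    return f"{base_url}?{urlencode(params)}"
```

    The point of the pattern is that only the small request travels to the data node; the reduction over the large dataset happens server-side and only the result is returned.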

  5. Computers, Remote Teleprocessing and Mass Communication.

    ERIC Educational Resources Information Center

    Cropley, A. J.

    Recent developments in computer technology are reducing the limitations of computers as mass communication devices. The growth of remote teleprocessing is one important step. Computers can now interact with users via terminals which may be hundreds of miles from the actual mainframe machine. Many terminals can be in operation at once, so that many…

  6. Education in Environmental Remote Sensing: Potentials and Problems.

    ERIC Educational Resources Information Center

    Kiefer, Ralph W.; Lillesand, Thomas M.

    1983-01-01

    Discusses remote sensing principles and applications and the status and needs of remote sensing education in the United States. A summary of the fundamental policy issues that will determine remote sensing's future role in environmental and resource management is included. (Author/BC)

  7. Incorporating Brokers within Collaboration Environments

    NASA Astrophysics Data System (ADS)

    Rajasekar, A.; Moore, R.; de Torcy, A.

    2013-12-01

    A collaboration environment, such as the integrated Rule Oriented Data System (iRODS - http://irods.diceresearch.org), provides interoperability mechanisms for accessing storage systems, authentication systems, messaging systems, information catalogs, networks, and policy engines from a wide variety of clients. The interoperability mechanisms function as brokers, translating actions requested by clients to the protocol required by a specific technology. The iRODS data grid is used to enable collaborative research within hydrology, seismology, earth science, climate, oceanography, plant biology, astronomy, physics, and genomics disciplines. Although each domain has unique resources, data formats, semantics, and protocols, the iRODS system provides a generic framework that is capable of managing collaborative research initiatives that span multiple disciplines. Each interoperability mechanism (broker) is linked to a name space that enables unified access across the heterogeneous systems. The collaboration environment provides not only support for brokers, but also support for virtualization of name spaces for users, files, collections, storage systems, metadata, and policies. The broker enables access to data or information in a remote system using the appropriate protocol, while the collaboration environment provides a uniform naming convention for accessing and manipulating each object. Within the NSF DataNet Federation Consortium project (http://www.datafed.org), three basic types of interoperability mechanisms have been identified and applied: 1) drivers for managing manipulation at the remote resource (such as data subsetting), 2) micro-services that execute the protocol required by the remote resource, and 3) policies for controlling the execution. For example, drivers have been written for manipulating NetCDF and HDF formatted files within THREDDS servers. 
Micro-services have been written that manage interactions with the CUAHSI data repository, the DataONE information catalog, and the GeoBrain broker. Policies have been written that manage transfer of messages between an iRODS message queue and the Advanced Message Queuing Protocol. Examples of these brokering mechanisms will be presented. The DFC collaboration environment serves as the intermediary between community resources and compute grids, enabling reproducible data-driven research. It is possible to create an analysis workflow that retrieves data subsets from a remote server, assemble the required input files, automate the execution of the workflow, automatically track the provenance of the workflow, and share the input files, workflow, and output files. A collaborator can re-execute a shared workflow, compare results, change input files, and re-execute an analysis.

  8. Remote sensing of strippable coal reserves and mine inventory in part of the Warrior Coal Field in Alabama

    NASA Technical Reports Server (NTRS)

    Joiner, T. J.; Copeland, C. W., Jr.; Russell, D. D.; Evans, F. E., Jr.; Sapp, C. D.; Boone, P. A.

    1978-01-01

    Methods by which estimates of the remaining reserves of strippable coal in Alabama could be made were developed. Information acquired from NASA's Earth Resources Office was used to analyze and map existing surface mines in a four-quadrangle area in west central Alabama. Using this information and traditional methods for mapping coal reserves, an estimate of remaining strippable reserves was derived. Techniques for the computer analysis of remotely sensed data and other types of available coal data were developed to produce an estimate of strippable coal reserves for a second four-quadrangle area. Both areas lie in the Warrior coal field, the most prolific and active of Alabama's coal fields. They were chosen because of the amount and type of coal mining in the area, their location relative to urban areas, and the amount and availability of base data necessary for this type of study.

  9. Remote creation of hybrid entanglement between particle-like and wave-like optical qubits

    NASA Astrophysics Data System (ADS)

    Morin, Olivier; Huang, Kun; Liu, Jianli; Le Jeannic, Hanna; Fabre, Claude; Laurat, Julien

    2014-07-01

    The wave-particle duality of light has led to two different encodings for optical quantum information processing. Several approaches have emerged based either on particle-like discrete-variable states (that is, finite-dimensional quantum systems) or on wave-like continuous-variable states (that is, infinite-dimensional systems). Here, we demonstrate the generation of entanglement between optical qubits of these different types, located at distant places and connected by a lossy channel. Such hybrid entanglement, which is a key resource for a variety of recently proposed schemes, including quantum cryptography and computing, enables information to be converted from one Hilbert space to the other via teleportation and therefore the connection of remote quantum processors based upon different encodings. Beyond its fundamental significance for the exploration of entanglement and its possible instantiations, our optical circuit holds promise for implementations of heterogeneous network, where discrete- and continuous-variable operations and techniques can be efficiently combined.

  10. Xi-cam: a versatile interface for data visualization and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
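    The graph-based workflow idea can be sketched generically as evaluation of named processing steps whose outputs feed downstream nodes; this illustrates the pattern only and is not Xi-cam's actual API:

```python
def run_workflow(graph: dict, inputs: dict) -> dict:
    """Execute a processing graph. Each graph entry maps a node name to
    (function, list of upstream node names); `inputs` seeds the leaf
    values. Results are cached so each step runs exactly once, whether
    the executor runs locally or on a remote resource."""
    results = dict(inputs)

    def evaluate(name):
        if name not in results:
            func, deps = graph[name]
            results[name] = func(*(evaluate(d) for d in deps))
        return results[name]

    for name in graph:
        evaluate(name)
    return results
```

    A two-step pipeline (hypothetical dark-frame subtraction followed by normalization) then runs as `run_workflow({"dark_sub": (lambda raw: raw - 1, ["raw"]), "norm": (lambda x: x / 2, ["dark_sub"])}, {"raw": 5})`.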

  11. Xi-cam: a versatile interface for data visualization and analysis

    DOE PAGES

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...

    2018-05-31

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  12. Remote sensing research for agricultural applications. [San Joaquin County, California and Snake River Plain and Twin Falls area, Idaho

    NASA Technical Reports Server (NTRS)

    Colwell, R. N. (Principal Investigator); Wall, S. L.; Beck, L. H.; Degloria, S. D.; Ritter, P. R.; Thomas, R. W.; Travlos, A. J.; Fakhoury, E.

    1984-01-01

    Materials and methods used to characterize selected soil properties and agricultural crops in San Joaquin County, California are described. Results show that: (1) the location and widths of TM bands are suitable for detecting differences in selected soil properties; (2) the number of TM spectral bands allows the quantification of soil spectral curve form and magnitude; and (3) the spatial and geometric quality of TM data allows for the discrimination and quantification of within-field variability of soil properties. The design of the LANDSAT-based multiple crop acreage estimation experiment for the Idaho Department of Water Resources is described, including the use of U.C. Berkeley's Survey Modeling Planning Model. Progress on Peditor software development on MIDAS, on cooperative computing using local and remote systems, and on the development of MIDAS microcomputer systems is also reported.

  13. Study of sensor spectral responses and data processing algorithms and architectures for onboard feature identification

    NASA Technical Reports Server (NTRS)

    Huck, F. O.; Davis, R. E.; Fales, C. L.; Aherron, R. M.

    1982-01-01

    A computational model of the deterministic and stochastic processes involved in remote sensing is used to study spectral feature identification techniques for real-time onboard processing of data acquired with advanced earth-resources sensors. Preliminary results indicate that: Narrow spectral responses are advantageous; signal normalization improves mean-square distance (MSD) classification accuracy but tends to degrade maximum-likelihood (MLH) classification accuracy; and MSD classification of normalized signals performs better than the computationally more complex MLH classification when imaging conditions change appreciably from those conditions during which reference data were acquired. The results also indicate that autonomous categorization of TM signals into vegetation, bare land, water, snow and clouds can be accomplished with adequate reliability for many applications over a reasonably wide range of imaging conditions. However, further analysis is required to develop computationally efficient boundary approximation algorithms for such categorization.
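    The mean-square-distance classification of normalized signals described above can be sketched directly: scale each spectral signal to unit norm (suppressing overall brightness differences between imaging conditions), then assign the class whose normalized reference signature is nearest. The reference signatures below are hypothetical:

```python
import numpy as np

def normalize(sig: np.ndarray) -> np.ndarray:
    """Scale a spectral signal to unit norm, suppressing overall
    brightness changes between imaging conditions."""
    return sig / np.linalg.norm(sig)

def msd_classify(signal: np.ndarray, refs: dict) -> str:
    """Assign the class whose normalized reference signature is
    closest in mean-square distance to the normalized signal."""
    s = normalize(signal)
    return min(refs, key=lambda c: np.mean((s - normalize(refs[c])) ** 2))
```

    Because both the signal and the references are normalized, a pixel twice as bright as the reference still classifies correctly, which is the robustness to changed imaging conditions the abstract reports.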

  14. Benefits of cloud computing for PACS and archiving.

    PubMed

    Koch, Patrick

    2012-01-01

    The goal of cloud-based services is to provide easy, scalable access to computing resources and IT services. The healthcare industry requires a private cloud that adheres to government mandates designed to ensure privacy and security of patient data while enabling access by authorized users. Cloud-based computing in the imaging market has evolved from a service that provided cost effective disaster recovery for archived data to fully featured PACS and vendor neutral archiving services that can address the needs of healthcare providers of all sizes. Healthcare providers worldwide are now using the cloud to distribute images to remote radiologists while supporting advanced reading tools, deliver radiology reports and imaging studies to referring physicians, and provide redundant data storage. Vendor managed cloud services eliminate large capital investments in equipment and maintenance, as well as staffing for the data center--creating a reduction in total cost of ownership for the healthcare provider.

  15. Development of alternative data analysis techniques for improving the accuracy and specificity of natural resource inventories made with digital remote sensing data

    NASA Technical Reports Server (NTRS)

    Lillesand, T. M.; Meisner, D. E. (Principal Investigator)

    1980-01-01

    An investigation was conducted into ways to improve the involvement of state and local user personnel in the digital image analysis process by isolating those elements of the analysis process which require extensive involvement by field personnel and providing means for performing those activities apart from a computer facility. In this way, the analysis procedure can be converted from a centralized activity focused on a computer facility to a distributed activity in which users can interact with the data at the field office level or in the field itself. A general image processing software was developed on the University of Minnesota computer system (Control Data Cyber models 172 and 74). The use of color hardcopy image data as a primary medium in supervised training procedures was investigated and digital display equipment and a coordinate digitizer were procured.

  16. Multi-party Semi-quantum Key Agreement with Delegating Quantum Computation

    NASA Astrophysics Data System (ADS)

    Liu, Wen-Jie; Chen, Zhen-Yu; Ji, Sai; Wang, Hai-Bin; Zhang, Jun

    2017-10-01

    A multi-party semi-quantum key agreement (SQKA) protocol based on delegating quantum computation (DQC) model is proposed by taking Bell states as quantum resources. In the proposed protocol, the participants only need the ability of accessing quantum channel and preparing single photons {|0〉, |1〉, |+〉, |-〉}, while the complicated quantum operations, such as the unitary operations and Bell measurement, will be delegated to the remote quantum center. Compared with previous quantum key agreement protocols, this client-server model is more feasible in the early days of the emergence of quantum computers. In order to prevent the attacks from outside eavesdroppers, inner participants and quantum center, two single photon sequences are randomly inserted into Bell states: the first sequence is used to perform the quantum channel detection, while the second is applied to disorder the positions of message qubits, which guarantees the security of the protocol.
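    The first decoy step (random insertion of single photons for channel detection) can be illustrated with a purely classical simulation: decoy states are inserted at random positions, and the receiver later checks the announced positions for tampering. This sketch models no real quantum behavior; all names and the state encoding are hypothetical:

```python
import random

BASES = {"Z": ("0", "1"), "X": ("+", "-")}

def insert_decoys(message: list, n_decoys: int, rng: random.Random):
    """Scatter decoy states at random positions in the sequence,
    recording (position, basis, state) for the later channel check."""
    total = len(message) + n_decoys
    decoy_positions = set(rng.sample(range(total), n_decoys))
    seq, records, payload = [], [], iter(message)
    for i in range(total):
        if i in decoy_positions:
            basis = rng.choice(list(BASES))
            state = rng.choice(BASES[basis])
            seq.append(state)
            records.append((i, basis, state))
        else:
            seq.append(next(payload))
    return seq, records

def channel_check(seq: list, records: list) -> bool:
    """Check each announced decoy position; in the simulation, any
    mismatch plays the role of detected eavesdropping."""
    return all(seq[pos] == state for pos, basis, state in records)
```

    An undisturbed sequence passes the check, while tampering with any decoy position fails it; the second decoy sequence in the protocol (position disordering of message qubits) is not modeled here.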

  17. Geographic information systems, remote sensing, and spatial analysis activities in Texas, 2008-09

    USGS Publications Warehouse

    ,

    2009-01-01

    Geographic information system (GIS) technology has become an important tool for scientific investigation, resource management, and environmental planning. A GIS is a computer-aided system capable of collecting, storing, analyzing, and displaying spatially referenced digital data. GIS technology is useful for analyzing a wide variety of spatial data. Remote sensing involves collecting remotely sensed data, such as satellite imagery, aerial photography, or radar images, and analyzing the data to gather information or investigate trends about the environment or the Earth's surface. Spatial analysis combines remotely sensed, thematic, statistical, quantitative, and geographical data through overlay, modeling, and other analytical techniques to investigate specific research questions. It is the combination of data formats and analysis techniques that has made GIS an essential tool in scientific investigations. This fact sheet presents information about the technical capabilities and project activities of the U.S. Geological Survey (USGS) Texas Water Science Center (TWSC) GIS Workgroup during 2008 and 2009. After a summary of GIS Workgroup capabilities, brief descriptions of activities by project at the local and national levels are presented. Projects are grouped by the fiscal year (October-September 2008 or 2009) the project ends and include overviews, project images, and Internet links to additional project information and related publications or articles.
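    The overlay analysis described above reduces, for raster data, to cell-wise combination of co-registered grids. A minimal sketch tabulating land-cover classes inside a watershed mask (the grids are hypothetical; real GIS layers would first be reprojected to a common grid):

```python
import numpy as np

def overlay_tabulate(landcover: np.ndarray, watershed: np.ndarray) -> dict:
    """Cell-by-cell overlay: count land-cover classes falling inside
    the watershed mask. Both rasters must be co-registered and share
    the same shape; `watershed` is a boolean mask."""
    classes, counts = np.unique(landcover[watershed], return_counts=True)
    return dict(zip(classes.tolist(), counts.tolist()))
```

    Multiplying each count by the cell area then yields per-class area within the watershed, the basic product of many of the thematic overlays a GIS workgroup produces.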

  18. Review of Remote Sensing Needs and Applications in Africa

    NASA Technical Reports Server (NTRS)

    Brown, Molly E.

    2007-01-01

    Remote sensing data has had an important role in identifying and responding to inter-annual variations in the African environment during the past three decades. As a largely agricultural region with diverse but generally limited government capacity to acquire and distribute ground observations of rainfall, temperature and other parameters, remote sensing is sometimes the only reliable measure of crop growing conditions in Africa. Thus, developing and maintaining the technical and scientific capacity to analyze and utilize satellite remote sensing data in Africa is critical to augmenting the continent's local weather/climate observation networks as well as its agricultural and natural resource development and management. The report 'Review of Remote Sensing Needs and Applications in Africa' has as its central goal to recommend to the US Agency for International Development an appropriate approach to support sustainable remote sensing applications at African regional remote sensing centers. The report uses the term "RS applications" to refer to the acquisition, maintenance and archiving, dissemination, distribution, analysis, and interpretation of remote sensing data, as well as the integration of interpreted data with other spatial data products. The report focuses on three primary remote sensing centers: (1) The AGRHYMET Regional Center in Niamey, Niger, created in 1974, is a specialized institute of the Permanent Interstate Committee for Drought Control in the Sahel (CILSS), with particular specialization in science and techniques applied to agricultural development, rural development, and natural resource management. (2) The Regional Centre for Mapping of Resources for Development (RCMRD) in Nairobi, Kenya, established in 1975 under the auspices of the United Nations Economic Commission for Africa and the Organization of African Unity (now the African Union), is an intergovernmental organization, with 15 member states from eastern and southern Africa. 
(3) The Regional Remote Sensing Unit (RRSU) in Gaborone, Botswana, began work in June 1988 and operates under the Agriculture Information Management System (AIMS), as part of the Food, Agriculture and Natural Resources (FANR) Directorate, based at the Southern Africa Development Community (SADC) Secretariat.

  19. A web-based remote radiation treatment planning system using the remote desktop function of a computer operating system: a preliminary report.

    PubMed

    Suzuki, Keishiro; Hirasawa, Yukinori; Yaegashi, Yuji; Miyamoto, Hideki; Shirato, Hiroki

    2009-01-01

    We developed a web-based, remote radiation treatment planning system which allowed staff at an affiliated hospital to obtain support from a fully staffed central institution. Network security was based on a firewall and a virtual private network (VPN). Client computers were installed at a cancer centre, at a university hospital and at a staff home. We remotely operated the treatment planning computer using the Remote Desktop function built into the Windows operating system. Except for the initial setup of the VPN router, no special knowledge was needed to operate the remote radiation treatment planning system. There was a time lag that seemed to depend on the volume of data traffic on the Internet, but it did not affect smooth operation. The initial cost and running cost of the system were reasonable.

  20. National Aeronautics and Space Administration fundamental research program. Information utilization and evaluation, appendices

    NASA Technical Reports Server (NTRS)

    Estes, J. E.; Eisgruber, L.

    1981-01-01

    Important points presented and recommendations made at an information and decision processes workshop held in Asilomar, California; at a data and information performance workshop held in Houston, Texas; and at a data base use and management workshop held near San Jose, California are summarized. Issues raised at a special session of the Soil Conservation Society of America's remote sensing for resource management conference in Kansas City, Missouri are also highlighted. The goals, status and activities of the NASA program definition study of basic research requirements, the necessity of making the computer science community aware of user needs with respect to information related to renewable resources, performance parameters and criteria for judging federal information systems, and the requirements and characteristics of scientific data bases are among the topics reported.

  1. Aerospace video imaging systems for rangeland management

    NASA Technical Reports Server (NTRS)

    Everitt, J. H.; Escobar, D. E.; Richardson, A. J.; Lulla, K.

    1990-01-01

    This paper presents an overview on the application of airborne video imagery (VI) for assessment of rangeland resources. Multispectral black-and-white video with visible/NIR sensitivity; color-IR, normal color, and black-and-white MIR; and thermal IR video have been used to detect or distinguish among many rangeland and other natural resource variables such as heavy grazing, drought-stressed grass, phytomass levels, burned areas, soil salinity, plant communities and species, and gopher and ant mounds. The digitization and computer processing of VI have also been demonstrated. VI does not have the detailed resolution of film, but these results have shown that it has considerable potential as an applied remote sensing tool for rangeland management. In the future, spaceborne VI may provide additional data for monitoring and management of rangelands.

  2. Techniques for assessing water resource potentials in the developing countries: with emphasis on streamflow, erosion and sediment transport, water movement in unsaturated soils, ground water, and remote sensing in hydrologic applications

    USGS Publications Warehouse

    Taylor, George C.

    1971-01-01

    Hydrologic instrumentation and methodology for assessing water-resource potentials have originated largely in the developed countries of the temperate zone. The developing countries lie largely in the tropic zone, which contains the full gamut of the earth's climatic environments, including most of those of the temperate zone. For this reason, most hydrologic techniques have world-wide applicability. Techniques for assessing water-resource potentials for the high-priority goals of economic growth are well established in the developing countries--but much more so in some than in others. Conventional techniques for measurement and evaluation of basic hydrologic parameters are now well understood in the developing countries and are generally adequate for their current needs and those of the immediate future. Institutional and economic constraints, however, inhibit growth of sustained programs of hydrologic data collection and application of the data to problems in engineering technology. Computer-based technology, including processing of hydrologic data and mathematical modelling of hydrologic parameters, is also well begun in many developing countries and has much wider potential application. In some developing countries, however, there is a tendency to look on the computer as a panacea for deficiencies in basic hydrologic data collection programs. This fallacy must be discouraged, as the computer is a tool and not a "magic box." There is no real substitute for sound programs of basic data collection. Nuclear and isotopic techniques are being used increasingly in the developed countries in the measurement and evaluation of virtually all hydrologic parameters for which conventional techniques have been used traditionally. Even in the developed countries, however, many hydrologists are not using nuclear techniques, simply because they lack knowledge of the principles involved and of the potential benefits. 
Nuclear methodology in hydrologic applications is generally more complex than the conventional and hence requires a high level of technical expertise for effective use. Application of nuclear techniques to hydrologic problems in the developing countries is likely to be marginal for some years to come, owing to the higher costs involved and the expertise required. Nuclear techniques, however, would seem to have particular promise in studies of water movement in unsaturated soils and of erosion and sedimentation, where conventional techniques are inadequate, inefficient and in some cases costly. Remote sensing offers great promise for synoptic evaluations of water resources and hydrologic processes, including the transient phenomena of the hydrologic cycle. Remote sensing is not, however, a panacea for deficiencies in hydrologic data programs in the developing countries. Rather, it is a means for extending and augmenting on-the-ground observations and surveys (ground truth) to evaluate water resources and hydrologic processes on a regional or even continental scale. With respect to economic growth goals in developing countries, there are few identifiable gaps in existing hydrologic instrumentation and methodology insofar as appraisal, development and management of available water resources are concerned. What is needed is acceleration of institutional development and professional motivation toward more effective use of existing and proven methodology. Moreover, much sophisticated methodology can be applied effectively in the developing countries only when adequate levels of indigenous scientific skills have been reached and supportive institutional frameworks have evolved to viability.

  3. Efficiency Evaluation of Handling of Geologic-Geophysical Information by Means of Computer Systems

    NASA Astrophysics Data System (ADS)

    Nuriyahmetova, S. M.; Demyanova, O. V.; Zabirova, L. M.; Gataullin, I. I.; Fathutdinova, O. A.; Kaptelinina, E. A.

    2018-05-01

    Development of oil and gas resources under difficult geological, geographical and economic conditions requires considerable financial outlay; careful justification of planned activities and selection of the most promising directions and modern technologies, from the standpoint of cost efficiency, are therefore necessary. To ensure high precision in regional and local forecasts and in the modeling of hydrocarbon reservoirs, it is necessary to analyze huge arrays of distributed information that are constantly changing spatially. Solving this task requires modern remote methods for investigating prospective oil-and-gas territories, integrated use of remote, nondestructive geologic-geophysical materials and space-based Earth-sounding methods, and the most advanced technologies for handling them. In the article, the authors consider the experience of Russian and foreign companies in handling geologic-geophysical information by means of computer systems. They conclude that multidimensional analysis of the geologic-geophysical information space and effective planning and monitoring of exploration works require broad use of geoinformation technologies, one of the most promising directions for achieving high profitability in the oil and gas industry.

  4. Capacity Building in Using NASA Remote Sensing for Water Resources and Disasters Management

    NASA Astrophysics Data System (ADS)

    Mehta, A. V.; Podest, E.; Prados, A. I.

    2017-12-01

    The NASA Applied Remote Sensing Training Program (ARSET), a part of NASA's Applied Sciences Capacity Building program, empowers the global community through online and in-person training. The program focuses on helping policy makers, environmental managers, and other professionals, both domestic and international, use remote sensing in decision making. Since 2011, ARSET has provided more than 20 trainings in water resource and disaster management, including floods and droughts. This presentation will include an overview of the ARSET program, best practices for approaching trainings, feedback from participants, and examples of case studies from the trainings showing the application of GPM, SMAP, Landsat, Terra and Aqua (MODIS), and Sentinel (SAR) data. This presentation will also outline how ARSET can serve as a liaison between remote sensing applications developers and users in the areas of water resource and disaster management.

  5. The application of remote sensing to resource management and environmental quality programs in Kansas

    NASA Technical Reports Server (NTRS)

    Barr, B. G.; Martinko, E. A. (Principal Investigator)

    1983-01-01

    The activities of the Kansas Applied Remote Sensing (KARS) Program during the period April 1, 1982 through March 31, 1983 are described. The most important work revolved around the Kansas Interagency Task Force on Applied Remote Sensing and its efforts to establish an operational, service-oriented remote sensing program in Kansas state government. Concomitant with this work was the upgrading of KARS capabilities to process data for state agencies through the vehicle of a low-cost digital data processing system. The KARS Program continued to take an active role in irrigation mapping. KARS is now integrating data acquired through analysis of LANDSAT into geographic information systems designed for evaluating groundwater resources. KARS also continues to work at the national level on the national inventory of state natural resources information systems.

  6. Remote Sensing Applied to Geology (Latest Citations from the Aerospace Database)

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The bibliography contains citations concerning the use of remote sensing in geological resource exploration. Technologies discussed include thermal, optical, photographic, and electronic imaging using ground-based, aerial, and satellite-borne devices. Analog and digital techniques to locate, classify, and assess geophysical features, structures, and resources are also covered. Application of remote sensing to petroleum and minerals exploration is treated in a separate bibliography. (Contains 50-250 citations and includes a subject term index and title list.)

  7. Measuring use patterns of online journals and databases

    PubMed Central

    De Groote, Sandra L.; Dorsch, Josephine L.

    2003-01-01

    Purpose: This research sought to determine use of online biomedical journals and databases and to assess current user characteristics associated with the use of online resources in an academic health sciences center. Setting: The Library of the Health Sciences–Peoria is a regional site of the University of Illinois at Chicago (UIC) Library with 350 print journals, more than 4,000 online journals, and multiple online databases. Methodology: A survey was designed to assess online journal use, print journal use, database use, computer literacy levels, and other library user characteristics. A survey was sent through campus mail to all (471) UIC Peoria faculty, residents, and students. Results: Forty-one percent (188) of the surveys were returned. Ninety-eight percent of the students, faculty, and residents reported having convenient access to a computer connected to the Internet. While 53% of the users indicated they searched MEDLINE at least once a week, other databases showed much lower usage. Overall, 71% of respondents indicated a preference for online over print journals when possible. Conclusions: Users prefer online resources to print, and many choose to access these online resources remotely. Convenience and full-text availability appear to play roles in selecting online resources. The findings of this study suggest that databases without links to full text and online journal collections without links from bibliographic databases will have lower use. These findings have implications for collection development, promotion of library resources, and end-user training. PMID:12883574

  8. Telescience Testbed Program: A study of software for SIRTF instrument control

    NASA Technical Reports Server (NTRS)

    Young, Erick T.

    1992-01-01

    As a continued element in the Telescience Testbed Program (TTP), the University of Arizona Steward Observatory and the Electrical and Computer Engineering Department (ECE) jointly developed a testbed to evaluate the Operations and Science Instrument System (OASIS) software package for remote control of an instrument for the Space Infrared Telescope Facility (SIRTF). SIRTF is a cryogenically cooled telescope with three focal plane instruments that will be the infrared element of NASA's Great Observatories series. The anticipated launch date for SIRTF is currently 2001. Because of the complexity of the SIRTF mission, it was not expected that the OASIS package would be suitable for instrument control in the flight situation; however, its possible use as a common interface during the early development and ground test phases of the project was considered. The OASIS package, developed at the University of Colorado for control of the Solar Mesosphere Explorer (SME) satellite, serves as an interface between the operator and the remote instrument, which is connected via a network. OASIS provides a rudimentary windowing system as well as support for standard spacecraft communications protocols. The experiment performed all of the functions required of the MIPS simulation program. Remote control of the instrument was demonstrated but found to be inappropriate for SIRTF at this time for the following reasons: (1) the programming interface is too difficult; (2) significant computer resources were required to run OASIS; (3) the communications interface is too complicated; (4) response time was slow; and (5) quicklook of image data was not possible.

  9. Computer-aided analysis of Skylab scanner data for land use mapping, forestry and water resource applications

    NASA Technical Reports Server (NTRS)

    Hoffer, R. M.

    1975-01-01

    Skylab data were obtained over a mountainous test site containing a complex association of cover types and rugged topography. The application of computer-aided analysis techniques to the multispectral scanner data produced a number of significant results. Techniques were developed to digitally overlay topographic data (elevation, slope, and aspect) onto the S-192 MSS data to provide a method for increasing the effectiveness and accuracy of computer-aided analysis techniques for cover type mapping. The S-192 MSS data were analyzed using computer techniques developed at Laboratory for Applications of Remote Sensing (LARS), Purdue University. Land use maps, forest cover type maps, snow cover maps, and area tabulations were obtained and evaluated. These results compared very well with information obtained by conventional techniques. Analysis of the spectral characteristics of Skylab data has conclusively proven the value of the middle infrared portion of the spectrum (about 1.3-3.0 micrometers), a wavelength region not previously available in multispectral satellite data.

  10. ABrIL - Advanced Brain Imaging Lab : a cloud based computation environment for cooperative neuroimaging projects.

    PubMed

    Neves Tafula, Sérgio M; Moreira da Silva, Nádia; Rozanski, Verena E; Silva Cunha, João Paulo

    2014-01-01

    Neuroscience is an increasingly multidisciplinary and highly cooperative field in which neuroimaging plays an important role. The rapid evolution of neuroimaging demands a growing number of computing resources and skills that need to be put in place at every lab. Typically, each group tries to set up its own servers and workstations to support its neuroimaging needs, having to learn everything from operating system management to the details of specific neuroscience software tools before any results can be obtained from each setup. This setup and learning process is replicated in every lab, even when a strong collaboration among several groups is under way. In this paper we present a new cloud service model - Brain Imaging Application as a Service (BiAaaS) - and one of its implementations - the Advanced Brain Imaging Lab (ABrIL) - in the form of a ubiquitous virtual desktop remote infrastructure that offers a set of neuroimaging computational services through an interactive, neuroscientist-friendly graphical user interface (GUI). This remote desktop has been used for several multi-institution cooperative projects with different neuroscience objectives that have already achieved important results, such as the contribution to a high-impact paper published in the January issue of the NeuroImage journal. The ABrIL system has shown its applicability in several neuroscience projects at relatively low cost, promoting truly collaborative work and speeding up project results and their clinical applicability.

  11. Climate Modeling Computing Needs Assessment

    NASA Astrophysics Data System (ADS)

    Petraska, K. E.; McCabe, J. D.

    2011-12-01

    This paper discusses early findings of an assessment of computing needs for NASA science, engineering and flight communities. The purpose of this assessment is to document a comprehensive set of computing needs that will allow us to better evaluate whether our computing assets are adequately structured to meet evolving demand. The early results are interesting, already pointing out improvements we can make today to get more out of the computing capacity we have, as well as potential game changing innovations for the future in how we apply information technology to science computing. Our objective is to learn how to leverage our resources in the best way possible to do more science for less money. Our approach in this assessment is threefold: Development of use case studies for science workflows; Creating a taxonomy and structure for describing science computing requirements; and characterizing agency computing, analysis, and visualization resources. As projects evolve, science data sets increase in a number of ways: in size, scope, timelines, complexity, and fidelity. Generating, processing, moving, and analyzing these data sets places distinct and discernable requirements on underlying computing, analysis, storage, and visualization systems. The initial focus group for this assessment is the Earth Science modeling community within NASA's Science Mission Directorate (SMD). As the assessment evolves, this focus will expand to other science communities across the agency. We will discuss our use cases, our framework for requirements and our characterizations, as well as our interview process, what we learned and how we plan to improve our materials after using them in the first round of interviews in the Earth Science Modeling community. We will describe our plans for how to expand this assessment, first into the Earth Science data analysis and remote sensing communities, and then throughout the full community of science, engineering and flight at NASA.

  12. Remote Sensing of Earth--A New Perspective

    ERIC Educational Resources Information Center

    Boyer, Robert E.

    1973-01-01

    Photographs of the earth taken from space are used to illustrate the advantages and application of remote sensing. This technique may be used in such areas as the immediate appraisal of disasters, surveillance of the oceans, monitoring of land, food and water resources, detection of natural resources, and identification of pollution. (JR)

  13. Remote monitoring of implantable cardiac devices: current state and future directions.

    PubMed

    Ganeshan, Raj; Enriquez, Alan D; Freeman, James V

    2018-01-01

    Recent evidence has demonstrated substantial benefits associated with remote monitoring of cardiac implantable electronic devices (CIEDs), and treatment guidelines have endorsed the use of remote monitoring. Familiarity with the features of remote monitoring systems and the data supporting their use is vital for physicians caring for patients with CIEDs. Remote monitoring remains underutilized, but its use is expanding into new practice settings, including emergency departments. Patient experience and outcomes are positive, with earlier detection of clinical events such as atrial fibrillation, reductions in inappropriate implantable cardioverter-defibrillator (ICD) shocks, and potentially a decrease in mortality with frequent remote monitoring utilization. Rates of hospitalization are reduced among remote monitoring users, and the replacement of outpatient follow-up visits with remote monitoring transmissions has been shown to be well tolerated. In addition, health resource utilization is lower, and remote monitoring has been associated with considerable cost savings. A dose relationship exists between use of remote monitoring and patient outcomes, and those with early and high transmission rates have superior outcomes. Remote monitoring provides clinicians with the ability to provide comprehensive follow-up care for patients with CIEDs. Patient outcomes are improved, and resource utilization is decreased with appropriate use of remote monitoring. Future efforts must focus on improving the utilization and efficiency of remote monitoring.

  14. Remote Sensing: A Film Review.

    ERIC Educational Resources Information Center

    Carter, David J.

    1986-01-01

    Reviews the content of 19 films on remote sensing published between 1973 and 1980. Concludes that they are overly simplistic, notably outdated, and generally too optimistic about the potential of remote sensing from space for resource exploration and environmental problem-solving. Provides names and addresses of more current remote sensing…

  15. Remote Symbolic Computation of Loci

    ERIC Educational Resources Information Center

    Abanades, Miguel A.; Escribano, Jesus; Botana, Francisco

    2010-01-01

    This article presents a web-based tool designed to compute certified equations and graphs of geometric loci specified using standard Dynamic Geometry Systems (DGS). Complementing the graphing abilities of the considered DGS, the equations of the loci produced by the application are remotely computed using symbolic algebraic techniques from the…

  16. Wetland resources investigation based on 3S technology

    NASA Astrophysics Data System (ADS)

    Lin, Hui; Jing, Haitao; Zhang, Lianpeng

    2008-10-01

    Wetland is a special ecosystem between land and water. It provides abundant food, raw materials, water resources and habitat for human beings, animals and plants; wetlands are so important that their development, management and protection have become a focus of public attention. "3S" integration technology was applied to investigate wetland resources in Shandong Province. The investigation is based on remote sensing (RS) information, combined with wetland-related geographic information system (GIS) data on existing geology, hydrology, land, lakes, rivers, oceans and environmental protection, using the Global Positioning System (GPS) to determine location accurately and conveniently, with multi-source information corroborating one another on the basis of "3S" integration. In addition, the remote sensing (RS) interpretation was refined by combining in-house interpretation with field survey, and by combining interpretation results with known data. By contrasting the various types of wetland resources with TM, ETM and SPOT imagery and combining these with other available information, remote sensing interpretation symbols for each type of wetland resource were established. According to these interpretation symbols, the wetland resources of Shandong Province were systematically interpreted; in accordance with the purposes of different work, the imagery of 1987, 1996 and 2000 was interpreted. Finally, the interpretation results were processed by computer scanning, vectorization, projection transformation and image mosaicking; a wetland resources distribution map was produced and a wetland resources database of Shandong Province was established. Through the investigation, wetland resources in Shandong Province were divided into 4 major categories and 17 sub-categories, and the range, area and present utilization status of each category were ascertained.
By investigation and calculation, the total area of wetland in Shandong Province is 1,712,200 hm2, which accounts for 7.58% of the total land area of the province (not including the wetland in the shallow waters along the coast). Among them, the area of river wetland is 286,746 hm2, lake wetland 143,490 hm2, reservoir and pond wetland 118,693 hm2, offshore and coastal wetland 994,100 hm2, and other wetland 169,171 hm2. On this basis, the trend of dynamic change and its causes can be analyzed: natural wetlands are steadily degenerating while artificial wetlands increase year by year, and a distribution pattern is taking shape in which existing natural wetlands are protected and the increase of new artificial wetlands conforms to social development, so the situation of the wetland resources is developing in a virtuous circle.
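As a quick consistency check, the five category areas quoted in this abstract do sum exactly to the reported provincial total; a minimal sketch, using only the figures stated in the text:

```python
# Wetland category areas for Shandong Province (hm^2), figures quoted above
areas = {
    "river": 286_746,
    "lake": 143_490,
    "reservoir and pond": 118_693,
    "offshore and coastal": 994_100,
    "other": 169_171,
}
total = sum(areas.values())
print(total)  # 1712200 -- agrees with the reported total of 1,712,200 hm^2
```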

  17. Technology development of the Space Transportation System mission and terrestrial applications of satellite technology

    NASA Technical Reports Server (NTRS)

    1981-01-01

    The Space Transportation System (STS) is discussed, including the launch processing system, the thermal protection subsystem, meteorological research, the sound suppression water system, the rotating service structure, improved hypergol removal systems, fiber optics research, precision positioning, remote-controlled solid rocket booster nozzle plugs, ground operations for the Centaur orbital transfer vehicle, parachute drying, STS hazardous waste disposal and recycling, toxic waste technology and control concepts, a fast analytical densitometry study, the shuttle inventory management system, operational intercommunications system improvement, and the protective garment ensemble. Terrestrial applications are also covered, including LANDSAT applications to water resources, the satellite freeze forecast system, application of ground-penetrating radar to soil survey, turtle tracking, evaluating computer-drawn ground cover maps, a sparkless load pulsar, and coupling a microcomputer and computing integrator with a gas chromatograph.

  18. Remote sensing sensors and applications in environmental resources mapping and modeling

    USGS Publications Warehouse

    Melesse, Assefa M.; Weng, Qihao; Thenkabail, Prasad S.; Senay, Gabriel B.

    2007-01-01

    The history of remote sensing and the development of different sensors for environmental and natural resources mapping and data acquisition are reviewed and reported. Application examples in urban studies and in hydrological modeling, such as land-cover and floodplain mapping, fractional vegetation cover and impervious surface area mapping, and surface energy flux and micro-topography correlation studies, are discussed. The review also discusses the use of remotely sensed rainfall and potential evapotranspiration for estimating a crop water requirement satisfaction index, which provides early-warning information for growers. The review is not an exhaustive treatment of remote sensing techniques but rather a summary of some important applications in environmental studies and modeling.

  19. WLCG scale testing during CMS data challenges

    NASA Astrophysics Data System (ADS)

    Gutsche, O.; Hajdu, C.

    2008-07-01

    The CMS computing model to process and analyze LHC collision data follows a data-location driven approach and is using the WLCG infrastructure to provide access to GRID resources. As a preparation for data taking, CMS tests its computing model during dedicated data challenges. An important part of the challenges is the test of the user analysis which poses a special challenge for the infrastructure with its random distributed access patterns. The CMS Remote Analysis Builder (CRAB) handles all interactions with the WLCG infrastructure transparently for the user. During the 2006 challenge, CMS set its goal to test the infrastructure at a scale of 50,000 user jobs per day using CRAB. Both direct submissions by individual users and automated submissions by robots were used to achieve this goal. A report will be given about the outcome of the user analysis part of the challenge using both the EGEE and OSG parts of the WLCG. In particular, the difference in submission between both GRID middlewares (resource broker vs. direct submission) will be discussed. In the end, an outlook for the 2007 data challenge is given.

  20. Educational Computer Utilization and Computer Communications.

    ERIC Educational Resources Information Center

    Singh, Jai P.; Morgan, Robert P.

    As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch process were explored…

  1. The Development and Preliminary Application of a Plant Quarantine Remote Teaching System in China

    NASA Astrophysics Data System (ADS)

    Wu, Zhigang; Li, Zhihong; Yang, Ding; Zhang, Guozhen

    With the development of modern information technology, the traditional teaching mode has become insufficient for the requirements of modern education. Plant Quarantine has been accepted as a common course at agricultural universities in China since the country's entry into the WTO, but the teaching resources for this course are insufficient, especially at universities with a weak foundation in the subject. E-learning is regarded as one way to address this shortage of teaching resources. PQRTS (Plant Quarantine Remote Teaching System) was designed and developed with JSP (JavaServer Pages), MySQL and Tomcat in this study. The system includes many kinds of plant quarantine teaching resources, such as an international glossary, regulations and standards, multimedia information on quarantine processes and pests, teaching slides, and training exercises. The system prototype implements the functions of remote learning, querying, management, examination and remote discussion. It could serve as a tool for teaching, teaching assistance and online learning.

  2. East Africa seminar and workshop of remote sensing of natural resources and environment

    USGS Publications Warehouse

    Deutsch, Morris

    1975-01-01

    Report on the total program covering the East Africa Seminar and Workshop on remote sensing of natural resources and the environment held in Nairobi, Kenya, March 21 - April 3, 1974, attended by participants from 10 English-speaking African nations. Appendices are included for Seminar proceedings, workshop lectures and outlines, field trip reports and critiques by participants, and reports on potential applications of an operational earth resources satellite for the participating countries.

  3. RIP-REMOTE INTERACTIVE PARTICLE-TRACER

    NASA Technical Reports Server (NTRS)

    Rogers, S. E.

    1994-01-01

    Remote Interactive Particle-tracing (RIP) is a distributed-graphics program which computes particle traces for computational fluid dynamics (CFD) solution data sets. A particle trace is a line which shows the path a massless particle in a fluid will take; it is a visual image of where the fluid is going. The program is able to compute and display particle traces at a speed of about one trace per second because it runs on two machines concurrently. The data used by the program is contained in two files. The solution file contains data on density, momentum and energy quantities of a flow field at discrete points in three-dimensional space, while the grid file contains the physical coordinates of each of the discrete points. RIP requires two computers. A local graphics workstation interfaces with the user for program control and graphics manipulation, and a remote machine interfaces with the solution data set and performs time-intensive computations. The program utilizes two machines in a distributed mode for two reasons. First, the data to be used by the program is usually generated on the supercomputer. RIP avoids having to convert and transfer the data, eliminating any memory limitations of the local machine. Second, as computing the particle traces can be computationally expensive, RIP utilizes the power of the supercomputer for this task. Although the remote site code was developed on a CRAY, it is possible to port this to any supercomputer class machine with a UNIX-like operating system. Integration of a velocity field from a starting physical location produces the particle trace. The remote machine computes the particle traces using the particle-tracing subroutines from PLOT3D/AMES, a CFD post-processing graphics program available from COSMIC (ARC-12779). These routines use a second-order predictor-corrector method to integrate the velocity field. Then the remote program sends graphics tokens to the local machine via a remote-graphics library. 
The local machine interprets the graphics tokens and draws the particle traces. The program is menu driven. RIP is implemented on the Silicon Graphics IRIS 3000 (local workstation) with an IRIX operating system and on the CRAY2 (remote station) with a UNICOS 1.0 or 2.0 operating system. The IRIS 4D can be used in place of the IRIS 3000. The program is written in C (67%) and FORTRAN 77 (43%) and has an IRIS memory requirement of 4 MB. The remote and local stations must use the same user ID. PLOT3D/AMES unformatted data sets are required for the remote machine. The program was developed in 1988.
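The second-order predictor-corrector integration described above can be sketched as follows. This is an illustrative reimplementation (Heun's method), not the PLOT3D/AMES code; the `trace_particle` function and the analytic `velocity` field are hypothetical stand-ins for interpolation in a real solution grid:

```python
import numpy as np

def trace_particle(v, p0, dt, n_steps):
    """Integrate one particle trace through velocity field v using a
    second-order predictor-corrector (Heun) scheme: predict with an
    Euler step, then correct by averaging the endpoint slopes."""
    path = [np.asarray(p0, dtype=float)]
    for _ in range(n_steps):
        p = path[-1]
        p_pred = p + dt * v(p)                          # predictor: explicit Euler step
        path.append(p + 0.5 * dt * (v(p) + v(p_pred)))  # corrector: average the slopes
    return np.array(path)

# Example: solid-body rotation about the z-axis; the trace should stay
# near the unit circle while sweeping out n_steps * dt radians.
omega = np.array([0.0, 0.0, 1.0])
velocity = lambda p: np.cross(omega, p)
path = trace_particle(velocity, [1.0, 0.0, 0.0], dt=0.01, n_steps=100)
```

In the real program the velocity at each point would be obtained by interpolating the momentum and density fields of the CFD solution file rather than from a closed-form expression.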

  4. Computer-Aided Remote Driving

    NASA Technical Reports Server (NTRS)

    Wilcox, Brian H.

    1994-01-01

    System for remote control of robotic land vehicle requires only small radio-communication bandwidth. Twin video cameras on vehicle create stereoscopic images. Operator views cross-polarized images on two cathode-ray tubes through correspondingly polarized spectacles. By use of cursor on frozen image, remote operator designates path. Vehicle proceeds to follow path, by use of limited degree of autonomous control to cope with unexpected conditions. System concept, called "computer-aided remote driving" (CARD), potentially useful in exploration of other planets, military surveillance, firefighting, and clean-up of hazardous materials.

  5. A Lightweight Remote Parallel Visualization Platform for Interactive Massive Time-varying Climate Data Analysis

    NASA Astrophysics Data System (ADS)

    Li, J.; Zhang, T.; Huang, Q.; Liu, Q.

    2014-12-01

    Today's climate datasets are characterized by large volume and a high degree of spatiotemporal complexity, and they evolve quickly over time. Because visualizing large, distributed climate datasets is computationally intensive, traditional desktop-based visualization applications fail to handle the load. Recently, scientists have developed remote visualization techniques to address this computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver the results to clients over the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our platform was built on ParaView, one of the most popular open-source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support its deployment. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access are supported: remote datasets provided by OPeNDAP servers, datasets hosted on the web visualization server, and local datasets. Regardless of the access method, all visualization tasks are completed on the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.

  6. Remote sensing in Alaska: Opportunities and policy implications

    NASA Technical Reports Server (NTRS)

    Moor, J. H.

    1981-01-01

    The natural resources of Alaska and their exploitation and further development are discussed. The use of remote sensing techniques for vegetation classification, wetlands identification, and other basic resource management techniques is assessed, and the history of cooperation between state and federal land managers is reviewed. Agencies managing resources in Alaska are encouraged to use existing forums to develop a coordinated program aimed at improving all resource management capabilities. Continuing education, training, demonstrations, and evaluations must be provided to enhance management abilities and promote social and economic development in the state.

  7. NNDC Data Services

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuli, J.K.; Sonzogni, A.

    The National Nuclear Data Center has provided remote access to some of its resources since 1986. The major databases and other resources currently available through the NNDC Web site are summarized. The National Nuclear Data Center (NNDC) has provided remote access to the nuclear physics databases it maintains and to other resources since 1986. With considerable innovation, access is now mostly through the Web. The NNDC Web pages have been modernized to provide a consistent, state-of-the-art style. The improved database services and other resources available from the NNDC site at www.nndc.bnl.gov will be described.

  8. Introduction. [usefulness of modern remote sensing techniques for studying components of California water resources

    NASA Technical Reports Server (NTRS)

    Colwell, R. N.

    1973-01-01

    Since May 1970, personnel on several campuses of the University of California have been conducting investigations which seek to determine the usefulness of modern remote sensing techniques for studying various components of California's earth resources complex. Emphasis has been given to California's water resources as exemplified by the Feather River project and other aspects of the California Water Plan. This study is designed to consider in detail the supply, demand, and impact relationships. The specific geographic areas studied are the Feather River drainage in northern California, the Chino-Riverside Basin and Imperial Valley areas in southern California, and selected portions of the west side of San Joaquin Valley in central California. An analysis is also given on how an effective benefit-cost study of remote sensing in relation to California's water resources might best be made.

  9. Modeling, simulation, and analysis of optical remote sensing systems

    NASA Technical Reports Server (NTRS)

    Kerekes, John Paul; Landgrebe, David A.

    1989-01-01

    Remote sensing of the Earth's resources from space-based sensors has evolved in the past 20 years from a scientific experiment to a commonly used technological tool. The scientific applications and engineering aspects of remote sensing systems have been studied extensively. However, most of these studies have been aimed at understanding individual aspects of the remote sensing process, while relatively few have studied their interrelations. A motivation for studying these interrelationships has arisen with the advent of highly sophisticated configurable sensors as part of the Earth Observing System (EOS) proposed by NASA for the 1990s. Two approaches to investigating remote sensing systems are developed. In one approach, detailed models of the scene, the sensor, and the processing aspects of the system are implemented in a discrete simulation. This approach is useful in creating simulated images with desired characteristics for use in sensor or processing algorithm development. A less complete, but computationally simpler, method based on a parametric model of the system is also developed. In this analytical model the various informational classes are parameterized by their spectral mean vector and covariance matrix. These class statistics are modified by models for the atmosphere, the sensor, and processing algorithms, and an estimate is made of the resulting classification accuracy among the informational classes. Application of these models is made to the study of the proposed High Resolution Imaging Spectrometer (HRIS). The interrelationships among observational conditions, sensor effects, and processing choices are investigated with several interesting results.
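
The accuracy-estimation step of such a parametric model can be illustrated with a standard two-class Gaussian error bound. This is a hedged sketch, not the authors' implementation: it applies the Bhattacharyya upper bound on Bayes error for two equal-prior classes described by their spectral mean vectors and covariance matrices, the same class statistics the parametric model propagates through atmosphere and sensor effects; the example class statistics are invented.

```python
import numpy as np

def bhattacharyya_error_bound(m1, S1, m2, S2):
    """Upper bound on Bayes classification error for two equal-prior
    Gaussian classes with mean vectors m1, m2 and covariances S1, S2:
    error <= 0.5 * exp(-B), where B is the Bhattacharyya distance."""
    S = 0.5 * (S1 + S2)
    d = m2 - m1
    B = (0.125 * d @ np.linalg.solve(S, d)
         + 0.5 * np.log(np.linalg.det(S)
                        / np.sqrt(np.linalg.det(S1) * np.linalg.det(S2))))
    return 0.5 * np.exp(-B)

# Two well-separated hypothetical 2-band classes: the bound is small.
m1, m2 = np.array([0.2, 0.4]), np.array([0.6, 0.9])
S1 = S2 = 0.01 * np.eye(2)
err = bhattacharyya_error_bound(m1, S1, m2, S2)
```

Modifying `m1`, `S1`, `m2`, `S2` with atmosphere or sensor models and re-evaluating the bound mirrors how a parametric model can trade off observational conditions against expected accuracy without full image simulation.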

  10. PanDA for ATLAS distributed computing in the next decade

    NASA Astrophysics Data System (ADS)

    Barreiro Megino, F. H.; De, K.; Klimentov, A.; Maeno, T.; Nilsson, P.; Oleynik, D.; Padolski, S.; Panitkin, S.; Wenaus, T.; ATLAS Collaboration

    2017-10-01

    The Production and Distributed Analysis (PanDA) system has been developed to meet ATLAS production and analysis requirements for a data-driven workload management system capable of operating at the Large Hadron Collider (LHC) data processing scale. Heterogeneous resources used by the ATLAS experiment are distributed worldwide at hundreds of sites, thousands of physicists analyse the data remotely, the volume of processed data is beyond the exabyte scale, dozens of scientific applications are supported, while data processing requires more than a few billion hours of computing usage per year. PanDA performed very well over the last decade including the LHC Run 1 data taking period. However, it was decided to upgrade the whole system concurrently with the LHC’s first long shutdown in order to cope with rapidly changing computing infrastructure. After two years of reengineering efforts, PanDA has embedded capabilities for fully dynamic and flexible workload management. The static batch job paradigm was discarded in favor of a more automated and scalable model. Workloads are dynamically tailored for optimal usage of resources, with the brokerage taking network traffic and forecasts into account. Computing resources are partitioned based on dynamic knowledge of their status and characteristics. The pilot has been re-factored around a plugin structure for easier development and deployment. Bookkeeping is handled with both coarse and fine granularities for efficient utilization of pledged or opportunistic resources. An in-house security mechanism authenticates the pilot and data management services in off-grid environments such as volunteer computing and private local clusters. The PanDA monitor has been extensively optimized for performance and extended with analytics to provide aggregated summaries of the system as well as drill-down to operational details. 
Many other features are planned or have recently been implemented, and the system has been adopted by non-LHC experiments; for example, bioinformatics groups have successfully run the Paleomix (microbial genome and metagenome) payload on supercomputers. In this paper we focus on the new and planned features that are most important to the next decade of distributed computing workload management.
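
The network-aware brokerage idea, choosing where to run a workload with data-transfer costs taken into account, can be sketched abstractly. The toy example below is illustrative only and is not PanDA's actual algorithm; the site fields and the cost function are invented.

```python
# Toy brokerage sketch: rank candidate sites by a cost that mixes
# currently queued work with the network cost of staging the input
# data to that site, then pick the cheapest.
def broker(sites, input_gb):
    def cost(site):
        return site["queued_jobs"] + input_gb / site["net_gbps"]
    return min(sites, key=cost)

sites = [
    {"name": "site_a", "queued_jobs": 120, "net_gbps": 10},
    {"name": "site_b", "queued_jobs": 30,  "net_gbps": 1},
]
# 500 GB of input: site_a's fast link beats site_b's short queue.
best = broker(sites, input_gb=500)
```

A real broker would fold in forecasts, resource partitioning, and pledged-versus-opportunistic status, but the shape of the decision is the same: a cost function over heterogeneous, dynamically observed resources.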

  11. Rich client data exploration and research prototyping for NOAA

    NASA Astrophysics Data System (ADS)

    Grossberg, Michael; Gladkova, Irina; Guch, Ingrid; Alabi, Paul; Shahriar, Fazlul; Bonev, George; Aizenman, Hannah

    2009-08-01

    Data from satellites and model simulations is increasing exponentially as observations and model computing power improve rapidly. Not only is technology producing more data, but it often comes from sources all over the world. Researchers and scientists who must collaborate are also located globally. This work presents a software design and technologies which will make it possible for groups of researchers to explore large data sets visually together without the need to download these data sets locally. The design will also make it possible to exploit high performance computing remotely and transparently to analyze and explore large data sets. Computer power, high quality sensing, and data storage capacity have improved at a rate that outstrips our ability to develop software applications that exploit these resources. It is impractical for NOAA scientists to download all of the satellite and model data that may be relevant to a given problem and the computing environments available to a given researcher range from supercomputers to only a web browser. The size and volume of satellite and model data are increasing exponentially. There are at least 50 multisensor satellite platforms collecting Earth science data. On the ground and in the sea there are sensor networks, as well as networks of ground based radar stations, producing a rich real-time stream of data. This new wealth of data would have limited use were it not for the arrival of large-scale high-performance computation provided by parallel computers, clusters, grids, and clouds. With these computational resources and vast archives available, it is now possible to analyze subtle relationships which are global, multi-modal and cut across many data sources. Researchers, educators, and even the general public, need tools to access, discover, and use vast data center archives and high performance computing through a simple yet flexible interface.

  12. Development of analysis techniques for remote sensing of vegetation resources

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.

    1972-01-01

    Various data handling and analysis techniques are summarized for evaluation of ERTS-A and supporting high flight imagery. These evaluations are concerned with remote sensors applied to wildland and agricultural vegetation resource inventory problems. Monitoring California's annual grassland, automatic texture analysis, agricultural ground data collection techniques, and spectral measurements are included.

  13. Potential for a remote-sensing-aided forest resource survey for the whole globe

    Treesearch

    E. Tomppo; R. L. Czaplewski

    2002-01-01

    The Global Forest Resources Assessment 2000 (FRA 2000) relied primarily on information provided by countries, but FAO also conducted a remote-sensing study of tropical forests to complement country information and to bolster understanding of land-cover change processes in the tropics, especially deforestation, forest degradation, fragmentation and shifting cultivation...

  14. Patterns of computer usage among medical practitioners in rural and remote Queensland.

    PubMed

    White, Col; Sheedy, Vicki; Lawrence, Nicola

    2002-06-01

    As part of a more detailed needs analysis, patterns of computer usage among medical practitioners in rural and remote Queensland were investigated. A questionnaire-based survey obtained a response rate of 23.82% (n = 131). Results suggest that medical practitioners in rural and remote Queensland are relatively sophisticated in their use of computer and information technologies and have embraced computerisation to a substantially higher extent compared with their urban counterparts and previously published estimates. Findings also indicate that a substantial number of rural and remote practitioners are utilising computer and information technologies for clinical purposes such as pathology, patient information sheets, prescribing, education, patient records and patient recalls. Despite barriers such as bandwidth limitations, cost and the sometimes unreliable quality of Internet service providers, a majority of rural and remote respondents rated an Internet site with continuing medical education information and services as being important or very important. Suggestions that "rural doctors are slow to adapt to new technologies" are questioned, with findings indicating that rural and remote medical practitioners in Queensland have adapted to, and utilise, information technology to a far higher extent than has been previously documented.

  15. Utilizing remote sensing of thematic mapper data to improve our understanding of estuarine processes and their influence on the productivity of estuarine-dependent fisheries

    NASA Technical Reports Server (NTRS)

    Browder, Joan A.; May, L. Nelson, Jr.; Rosenthal, Alan; Baumann, Robert H.; Gosselink, James G.

    1987-01-01

    A stochastic spatial computer model addressing coastal resource problems in Louisiana is being refined and validated using thematic mapper (TM) imagery. The TM images of brackish marsh sites were processed, and data were tabulated on spatial parameters from TM images of the salt marsh sites. The Fisheries Image Processing System (FIPS) was used to analyze the TM scene. Activities were concentrated on improving the structure of the model and developing a structure and methodology for calibrating the model with spatial-pattern data from the TM imagery.

  16. NASA Wrangler: Automated Cloud-Based Data Assembly in the RECOVER Wildfire Decision Support System

    NASA Technical Reports Server (NTRS)

    Schnase, John; Carroll, Mark; Gill, Roger; Wooten, Margaret; Weber, Keith; Blair, Kindra; May, Jeffrey; Toombs, William

    2017-01-01

    NASA Wrangler is a loosely-coupled, event-driven, highly parallel data aggregation service designed to take advantage of the elastic resource capabilities of cloud computing. Wrangler automatically collects Earth observational data, climate model outputs, derived remote sensing data products, and historic biophysical data for pre-, active-, and post-wildfire decision making. It is a core service of the RECOVER decision support system, which is providing rapid-response GIS analytic capabilities to state and local government agencies. Wrangler reduces to minutes the time needed to assemble and deliver crucial wildfire-related data.
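
Wrangler's highly parallel aggregation pattern can be sketched with a thread pool that fetches independent sources concurrently and merges the results. The sketch below is illustrative only; the source names and stub fetcher are invented and stand in for Wrangler's real data sources.

```python
from concurrent.futures import ThreadPoolExecutor

# Stub fetcher standing in for real collectors of Earth observations,
# model outputs, derived products, and historic biophysical data.
def fetch(source):
    return {"source": source, "records": len(source)}

def assemble(sources):
    """Fetch every source concurrently and merge the results into one
    package, so total wall time is governed by the slowest source
    rather than the sum of all of them."""
    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(fetch, sources))

package = assemble(["landsat", "modis", "gfs", "fire-history"])
```

In a cloud deployment the worker pool can be scaled elastically with the number of sources, which is what turns an hours-long assembly job into minutes.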

  17. Future use of digital remote sensing data

    NASA Technical Reports Server (NTRS)

    Spann, G. W.; Jones, N. L.

    1978-01-01

    Users of remote sensing data are increasingly turning to digital processing techniques for the extraction of land resource, environmental, and natural resource information. This paper presents the results of recent and ongoing research efforts sponsored, in part, by NASA/Marshall Space Flight Center on the current uses of and future needs for digital remote sensing data. An ongoing investigation involves a comprehensive survey of capabilities for digital Landsat data use in the Southeastern U.S. Another effort consists of an evaluation of future needs for digital remote sensing data by federal, state, and local governments and the private sector. These needs are projected into the 1980-1985 time frame. Furthermore, the accelerating use of digital remote sensing data is not limited to the U.S. or even to the developed countries of the world.

  18. A New Computational Framework for Atmospheric and Surface Remote Sensing

    NASA Technical Reports Server (NTRS)

    Timucin, Dogan A.

    2004-01-01

    A Bayesian data-analysis framework is described for atmospheric and surface retrievals from remotely-sensed hyper-spectral data. Some computational techniques are highlighted for improved accuracy in the forward physics model.

  19. Remote Earth Sciences data collection using ACTS

    NASA Technical Reports Server (NTRS)

    Evans, Robert H.

    1992-01-01

    Given the focus on global change and the attendant scope of such research, we anticipate significant growth of requirements for investigator interaction, processing system capabilities, and availability of data sets. The increased complexity of global processes requires interdisciplinary teams to address them; the investigators will need to interact on a regular basis; however, it is unlikely that a single institution will house sufficient investigators with the required breadth of skills. The complexity of the computations may also require resources beyond those located within a single institution; this lack of sufficient computational resources leads to a distributed system located at geographically dispersed institutions. Finally, the combination of long-term data sets like the Pathfinder datasets and the data to be gathered by new generations of satellites such as SeaWiFS and MODIS-N yields extraordinarily large amounts of data. All of these factors combine to increase demands on the communications facilities available; the demands are generating requirements for highly flexible, high-capacity networks. We have been examining the applicability of the Advanced Communications Technology Satellite (ACTS) to address the scientific, computational, and, primarily, communications questions resulting from global change research. As part of this effort three scenarios for oceanographic use of ACTS have been developed; a full discussion of this is contained in Appendix B.

  20. Remote sensing and extractable biological resources

    NASA Technical Reports Server (NTRS)

    Cronin, L. E.

    1972-01-01

    The nature and quantity of extractable biological resources available in the Chesapeake Bay are discussed. The application of miniaturized radio sensors to track the movement of fish and birds is described. The specific uses of remote sensors for detecting and mapping areas of algae, red tide, thermal pollution, and vegetation beds are presented. The necessity for obtaining information on the physical, chemical, and meteorological features of the entire bay in order to provide improved resources management is emphasized.

  1. A Responsive Client for Distributed Visualization

    NASA Astrophysics Data System (ADS)

    Bollig, E. F.; Jensen, P. A.; Erlebacher, G.; Yuen, D. A.; Momsen, A. R.

    2006-12-01

    As grids, web services and distributed computing continue to gain popularity in the scientific community, demand for virtual laboratories likewise increases. Today organizations such as the Virtual Laboratory for Earth and Planetary Sciences (VLab) are dedicated to developing web-based portals to perform various simulations remotely while abstracting away details of the underlying computation. Two of the biggest challenges in portal-based computing are fast visualization and smooth interrogation without overtaxing client resources. In response to this challenge, we have expanded on our previous data storage strategy and thick-client visualization scheme [1] to develop a client-centric distributed application that utilizes remote visualization of large datasets and makes use of the local graphics processor for improved interactivity. Rather than waste precious client resources on visualization, a combination of 3D graphics and 2D server bitmaps is used to simulate the look and feel of local rendering. Java Web Start and Java Bindings for OpenGL enable install-on-demand functionality as well as low-level access to client graphics for all platforms. Powerful visualization services based on VTK and auto-generated by the WATT compiler [2] are accessible through a standard web API. Data is permanently stored on compute nodes while separate visualization nodes fetch data requested by clients, caching it locally to prevent unnecessary transfers. We will demonstrate application capabilities in the context of simulated charge density visualization within the VLab portal. In addition, we will address generalizations of our application to interact with a wider range of WATT services, as well as performance bottlenecks.
[1] Ananthuni, R., Karki, B.B., Bollig, E.F., da Silva, C.R.S., Erlebacher, G., "A Web-Based Visualization and Reposition Scheme for Scientific Data," In Press, Proceedings of the 2006 International Conference on Modeling Simulation and Visualization Methods (MSV'06) (2006). [2] Jensen, P.A., Yuen, D.A., Erlebacher, G., Bollig, E.F., Kigelman, D.G., Shukh, E.A., Automated Generation of Web Services for Visualization Toolkits, Eos Trans. AGU, 86(52), Fall Meet. Suppl., Abstract IN42A-06, 2005.

  2. Rapid prototyping of soil moisture estimates using the NASA Land Information System

    NASA Astrophysics Data System (ADS)

    Anantharaj, V.; Mostovoy, G.; Li, B.; Peters-Lidard, C.; Houser, P.; Moorhead, R.; Kumar, S.

    2007-12-01

    The Land Information System (LIS), developed at the NASA Goddard Space Flight Center, is a functional Land Data Assimilation System (LDAS) that incorporates a suite of land models in an interoperable computational framework. LIS has been integrated into a computational Rapid Prototyping Capabilities (RPC) infrastructure. LIS consists of a core, a number of community land models, data servers, and visualization systems, integrated in a high-performance computing environment. The land surface models (LSMs) in LIS incorporate surface and atmospheric parameters of temperature, snow/water, vegetation, albedo, soil conditions, topography, and radiation. Many of these parameters are available from in-situ observations, numerical model analysis, and from NASA, NOAA, and other remote sensing satellite platforms at various spatial and temporal resolutions. The computational resources available to LIS via the RPC infrastructure support e-Science experiments involving global modeling of land-atmosphere studies at 1-km spatial resolution as well as regional studies at finer resolutions. The Noah Land Surface Model, available within LIS, is being used to rapidly prototype soil moisture estimates in order to evaluate the viability of other science applications for decision making purposes. For example, LIS has been used to further extend the utility of the USDA Soil Climate Analysis Network of in-situ soil moisture observations. In addition, LIS also supports data assimilation capabilities that are used to assimilate remotely sensed soil moisture retrievals from the AMSR-E instrument onboard the Aqua satellite. The rapid prototyping of soil moisture estimates using LIS and their applications will be illustrated during the presentation.

  3. Remote sensing in Michigan for land resource management

    NASA Technical Reports Server (NTRS)

    Sattinger, I. J.; Sellman, A. N.; Istvan, L. B.; Cook, J. J.

    1973-01-01

    During the period from June 1972 to June 1973, remote sensing techniques were applied to the following tasks: (1) mapping Michigan's land resources, (2) waterfowl habitat management at Point Mouillee, (3) mapping of Lake Erie shoreline flooding, (4) highway impact assessment, (5) applications of the Earth Resources Technology Satellite, ERTS-1, (6) investigation of natural gas eruptions near Williamsburg, and (7) commercial site selection. The goal of the program was the large-scale adoption, by both public agencies and private interests in Michigan, of earth-resource survey technology as an important aid in the solution of current problems in resources management and environmental protection.

  4. Remote Sensing Sensors and Applications in Environmental Resources Mapping and Modelling

    PubMed Central

    Melesse, Assefa M.; Weng, Qihao; Thenkabail, Prasad S.; Senay, Gabriel B.

    2007-01-01

    The history of remote sensing and the development of different sensors for environmental and natural resources mapping and data acquisition are reviewed and reported. Application examples in urban studies and in hydrological modeling, such as land-cover and floodplain mapping, fractional vegetation cover and impervious surface area mapping, and surface energy flux and micro-topography correlation studies, are discussed. The review also discusses the use of remotely sensed rainfall and potential evapotranspiration for estimating a crop water requirement satisfaction index, which provides early warning information for growers. The review is not an exhaustive survey of remote sensing applications but rather a summary of some important applications in environmental studies and modeling. PMID:28903290

  5. Biological knowledge bases using Wikis: combining the flexibility of Wikis with the structure of databases.

    PubMed

    Brohée, Sylvain; Barriot, Roland; Moreau, Yves

    2010-09-01

    In recent years, the number of knowledge bases developed using Wiki technology has exploded. Unfortunately, next to their numerous advantages, classical Wikis present a critical limitation: the invaluable knowledge they gather is represented as free text, which hinders their computational exploitation. This is in sharp contrast with the current practice for biological databases, where the data is made available in a structured way. Here, we present WikiOpener, an extension for the classical MediaWiki engine that augments Wiki pages by allowing on-the-fly querying and formatting of resources external to the Wiki. Those resources may provide data extracted from databases or DAS tracks, or even results returned by local or remote bioinformatics analysis tools. This also implies that structured data can be edited via dedicated forms. Hence, this generic resource combines the structure of biological databases with the flexibility of collaborative Wikis. The source code and its documentation are freely available on the MediaWiki website: http://www.mediawiki.org/wiki/Extension:WikiOpener.
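
The on-the-fly querying-and-formatting idea can be sketched outside of MediaWiki. The Python sketch below is purely illustrative: the `{{external:...}}` marker syntax and the resolver table are invented for this example and are not WikiOpener's actual syntax. A renderer replaces inline markers with formatted data fetched from a named external resource at page-render time.

```python
import re

# Stub resolvers standing in for external databases or analysis tools
# a page might query on the fly (names are hypothetical).
RESOLVERS = {
    "uniprot": lambda acc: f"<b>{acc}</b>: reviewed protein entry",
}

def expand(wikitext):
    """Replace {{external:resource|arg}} markers with formatted data
    from the named resource, blending structured data into free text
    when the page is rendered."""
    def repl(match):
        resource, arg = match.group(1), match.group(2)
        return RESOLVERS[resource](arg)
    return re.sub(r"\{\{external:(\w+)\|([^}]+)\}\}", repl, wikitext)

page = expand("See {{external:uniprot|P12345}} for details.")
# page == "See <b>P12345</b>: reviewed protein entry for details."
```

Because the structured content is fetched rather than stored in the page source, the free-text wiki stays editable while the data stays authoritative in its home database.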

  6. Ranger's Legacy

    NASA Technical Reports Server (NTRS)

    1987-01-01

    With its Landsat satellites, development of sensors, and advancement of processing techniques, NASA provided the initial technology base for another Earth-benefit application of image processing, Earth resources survey by means of remote sensing. Since each object has its own unique "signature," it is possible to distinguish among surface features and to generate computer-processed imagery identifying specific features of importance to resource managers. This capability, commercialized by Perceptive Scientific Instruments, Inc., offers practical application in such areas as agricultural crop forecasting, rangeland and forest management, land use planning, mineral and petroleum exploration, map making, water quality evaluation and disaster assessment. Major users of the technology have been federal, state, and local governments, but it is making its way into commercial operations, for example, resource exploration companies looking for oil, gas and mineral sources, and timber production firms seeking more efficient treeland management. Supporting both government and private users is a small industry composed of companies producing the processing hardware and software. As is the case in the medical application, many of these companies are direct offspring of NASA's work.

  7. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, Allan M.

    1996-01-01

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
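
The "illusion" of a local service described above can be sketched as a proxy pattern. The sketch below is an illustrative reduction, not the patented nine-component implementation: a Remote Object Client forwards method calls across a stand-in network object to the remote service, so calls look local to the user; all class and method names are invented.

```python
class RemoteService:
    """Lives on the remote host and does the real work."""
    def lookup(self, key):
        return {"greeting": "hello"}[key]

class Network:
    """Stands in for the intervening network between the hosts."""
    def __init__(self, service):
        self.service = service
    def call(self, method, *args):
        return getattr(self.service, method)(*args)

class RemoteObjectClient:
    """Lives on the local host and presents the remote service's
    interface as if the service resided locally."""
    def __init__(self, network):
        self.network = network
    def lookup(self, key):
        return self.network.call("lookup", key)

client = RemoteObjectClient(Network(RemoteService()))
result = client.lookup("greeting")  # reads like a local call
```

Keeping the utility service behind the proxy is what eases upgrades: the remote implementation can change without touching the software installed on users' local hosts.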

  8. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, A.M.

    1997-12-09

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.

  9. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, Allan M.

    1999-01-01

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.

  10. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, A.M.

    1996-08-06

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.

  11. Remote information service access system based on a client-server-service model

    DOEpatents

    Konrad, Allan M.

    1997-01-01

    A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.

  12. Methods of training the graduate level and professional geologist in remote sensing technology

    NASA Technical Reports Server (NTRS)

    Kolm, K. E.

    1981-01-01

    Requirements for a basic course in remote sensing to accommodate the needs of the graduate level and professional geologist are described. The course should stress the general topics of basic remote sensing theory, the theory and data types relating to different remote sensing systems, an introduction to the basic concepts of computer image processing and analysis, the characteristics of different data types, the development of methods for geological interpretations, the integration of all scales and data types of remote sensing in a given study, the integration of other data bases (geophysical and geochemical) into a remote sensing study, and geological remote sensing applications. The laboratories should stress hands-on experience to reinforce the concepts and procedures presented in the lecture. The geologist should then be encouraged to pursue a second course in computer image processing and analysis of remotely sensed data.

  13. Remote sensing of vegetation structure using computer vision

    NASA Astrophysics Data System (ADS)

    Dandois, Jonathan P.

    High-spatial resolution measurements of vegetation structure are needed for improving understanding of ecosystem carbon, water and nutrient dynamics, the response of ecosystems to a changing climate, and for biodiversity mapping and conservation, among many research areas. Our ability to make such measurements has been greatly enhanced by continuing developments in remote sensing technology, allowing researchers to measure numerous forest traits at varying spatial and temporal scales and over large spatial extents with minimal to no field work, which is costly for large spatial areas or logistically difficult in some locations. Despite these advances, there remain several research challenges related to the methods by which three-dimensional (3D) and spectral datasets are joined (remote sensing fusion) and the availability and portability of systems for frequent data collections at small-scale sampling locations. Recent advances in the areas of computer vision structure from motion (SFM) and consumer unmanned aerial systems (UAS) offer the potential to address these challenges by enabling repeatable measurements of vegetation structural and spectral traits at the scale of individual trees. However, the potential advances offered by computer vision remote sensing also present unique challenges and questions that need to be addressed before this approach can be used to improve understanding of forest ecosystems. For computer vision remote sensing to be a valuable tool for studying forests, information bounding the characteristics of the data produced by the system will help researchers understand and interpret results in the context of the forest being studied and of other remote sensing techniques. This research advances understanding of how forest canopy and tree 3D structure and color are accurately measured by a relatively low-cost and portable computer vision personal remote sensing system: 'Ecosynth'. 
Recommendations are made for optimal conditions under which forest structure measurements should be obtained with UAS-SFM remote sensing. Ultimately remote sensing of vegetation by computer vision offers the potential to provide an 'ecologist's eye view', capturing not only canopy 3D and spectral properties, but also seeing the trees in the forest and the leaves on the trees.

  14. KNET - DISTRIBUTED COMPUTING AND/OR DATA TRANSFER PROGRAM

    NASA Technical Reports Server (NTRS)

    Hui, J.

    1994-01-01

    KNET facilitates distributed computing between a UNIX compatible local host and a remote host which may or may not be UNIX compatible. It is capable of automatic remote login. That is, it performs on the user's behalf the chore of handling host selection, user name, and password to the designated host. Once the login has been successfully completed, the user may interactively communicate with the remote host. Data output from the remote host may be directed to the local screen, to a local file, and/or to a local process. Conversely, data input from the keyboard, a local file, or a local process may be directed to the remote host. KNET takes advantage of the multitasking and terminal mode control features of the UNIX operating system. A parent process is used as the upper layer for interfacing with the local user. A child process is used for a lower layer for interfacing with the remote host computer, and optionally one or more child processes can be used for the remote data output. Output may be directed to the screen and/or to the local processes under the control of a data pipe switch. In order for KNET to operate, the local and remote hosts must observe a common communications protocol. KNET is written in ANSI standard C-language for computers running UNIX. It has been successfully implemented on several Sun series computers and a DECstation 3100 and used to run programs remotely on VAX VMS and UNIX based computers. It requires 100K of RAM under SunOS and 120K of RAM under DEC RISC ULTRIX. An electronic copy of the documentation is provided on the distribution medium. The standard distribution medium for KNET is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. KNET was developed in 1991 and is a copyrighted work with all copyright vested in NASA. UNIX is a registered trademark of AT&T Bell Laboratories. Sun and SunOS are trademarks of Sun Microsystems, Inc. 
DECstation, VAX, VMS, and ULTRIX are trademarks of Digital Equipment Corporation.
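
The parent/child process layering KNET uses lends itself to a brief sketch. The following Python illustration is an assumption-laden stand-in, not KNET's actual ANSI C implementation: a session routine feeds commands to a child process that plays the role of the remote-host interface layer (here just a local echo loop), and a simple "data pipe switch" routes the child's output to the screen and/or a local file, as the abstract describes.

```python
import subprocess
import sys

def run_session(commands, log_path=None, to_screen=True):
    """Feed commands to the child layer and route its output via a pipe switch."""
    # Placeholder "remote host": a child process that echoes each input line
    # back with a prefix. In KNET this layer would speak the common protocol
    # to the actual remote host.
    child = subprocess.Popen(
        [sys.executable, "-c",
         "import sys\n"
         "for line in sys.stdin:\n"
         "    sys.stdout.write('remote> ' + line)"],
        stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True)
    out, _ = child.communicate("\n".join(commands) + "\n")
    if to_screen:            # pipe switch: direct output to the local screen
        print(out, end="")
    if log_path:             # pipe switch: direct output to a local file
        with open(log_path, "w") as f:
            f.write(out)
    return out

# Each command line comes back prefixed by the stand-in remote layer.
result = run_session(["ls", "pwd"], to_screen=False)
```

The real program multiplexes interactively rather than batching commands, but the division of labor between the user-facing parent and the host-facing child is the same idea.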

  15. A parallel method of atmospheric correction for multispectral high spatial resolution remote sensing images

    NASA Astrophysics Data System (ADS)

    Zhao, Shaoshuai; Ni, Chen; Cao, Jing; Li, Zhengqiang; Chen, Xingfeng; Ma, Yan; Yang, Leiku; Hou, Weizhen; Qie, Lili; Ge, Bangyu; Liu, Li; Xing, Jin

    2018-03-01

    Remote sensing images are usually contaminated by atmospheric components, especially aerosol particles. For quantitative remote sensing applications, atmospheric correction based on a radiative transfer model is used to retrieve reflectance by decoupling the atmosphere and surface, at the cost of long computation times. Parallel computing offers a way to accelerate this step. A parallel strategy in which multiple CPUs work simultaneously is designed to perform atmospheric correction on a multispectral remote sensing image. The flow of the parallel framework and the main parallel body of the atmospheric correction are described. A multispectral image from the Chinese Gaofen-2 satellite is then used to test the acceleration efficiency. As the number of CPUs increases from 1 to 8, computational speed increases accordingly; the maximum speed-up is 6.5. With 8 CPUs, atmospheric correction of the whole image takes 4 minutes.

  16. Simulation of electricity demand in a remote island for optimal planning of a hybrid renewable energy system

    NASA Astrophysics Data System (ADS)

    Koskinas, Aristotelis; Zacharopoulou, Eleni; Pouliasis, George; Engonopoulos, Ioannis; Mavroyeoryos, Konstantinos; Deligiannis, Ilias; Karakatsanis, Georgios; Dimitriadis, Panayiotis; Iliopoulou, Theano; Koutsoyiannis, Demetris; Tyralis, Hristos

    2017-04-01

    We simulate the electrical energy demand in the remote island of Astypalaia. To this end we first obtain information regarding the local socioeconomic conditions and energy demand. Second, the available hourly demand data are analysed at various time scales (hourly, daily, weekly, seasonal). The cross-correlations between the electrical energy demand and the mean daily temperature as well as other climatic variables for the same time period are computed. We also investigate the cross-correlation between those climatic variables and other variables related to renewable energy resources from numerous observations around the globe, in order to assess the impact of each on a hybrid renewable energy system. An exploratory data analysis including all variables is performed with the purpose of uncovering hidden relationships. Finally, the demand is simulated considering all the periodicities found in the analysis. The simulated time series will be used in the development of a framework for planning a hybrid renewable energy system in Astypalaia. Acknowledgement: This research is conducted within the frame of the undergraduate course "Stochastic Methods in Water Resources" of the National Technical University of Athens (NTUA). The School of Civil Engineering of NTUA provided moral support for the participation of the students in the Assembly.
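
The cross-correlation between demand and a climatic variable can be illustrated with a short sketch. The series, the lag convention, and the values below are purely hypothetical, chosen only to show the calculation.

```python
import math

def cross_corr(x, y, lag=0):
    """Pearson correlation between x[t] and y[t + lag]."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

demand = [5, 6, 8, 9, 7, 5, 4, 5]          # hypothetical hourly demand (MWh)
temp = [20, 22, 26, 28, 25, 21, 19, 20]    # hypothetical temperature (deg C)
r = cross_corr(demand, temp)               # strong positive correlation here
```

Repeating the computation over a range of lags, and over aggregations to daily, weekly, and seasonal scales, gives the kind of multi-scale cross-correlation structure the study feeds into its demand simulation.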

  17. Automated Finite State Workflow for Distributed Data Production

    NASA Astrophysics Data System (ADS)

    Hajdu, L.; Didenko, L.; Lauret, J.; Amol, J.; Betts, W.; Jang, H. J.; Noh, S. Y.

    2016-10-01

    In statistically hungry science domains, data deluges can be both a blessing and a curse. They allow the narrowing of statistical errors from known measurements, and open the door to new scientific opportunities as research programs mature. They are also a testament to the efficiency of experimental operations. However, growing data samples may need to be processed with little or no opportunity for huge increases in computing capacity. A standard strategy has thus been to share resources across multiple experiments at a given facility. Another has been to use middleware that “glues” resources across the world so they are able to locally run the experimental software stack (either natively or virtually). We describe a framework STAR has successfully used to reconstruct a ~400 TB dataset consisting of over 100,000 jobs submitted to a remote site in Korea from STAR's Tier 0 facility at the Brookhaven National Laboratory. The framework automates the full workflow, taking raw data files from tape and writing Physics-ready output back to tape without operator or remote site intervention. Through hardening we have demonstrated 97(±2)% efficiency, over a period of 7 months of operation. The high efficiency is attributed to finite state checking with retries to encourage resilience in the system over capricious and fallible infrastructure.
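
The finite-state checking with retries that the efficiency figure is attributed to can be caricatured in a few lines. This toy sketch is an illustration under invented assumptions (state names, retry limit), not STAR's production framework: each job advances through explicit states, and a failed transition is retried a bounded number of times before the job is declared failed.

```python
STATES = ["STAGED", "SUBMITTED", "RUNNING", "DONE"]

def run_job(step, max_retries=3):
    """`step(state)` attempts one transition; returns True on success."""
    state_idx, retries = 0, 0
    while STATES[state_idx] != "DONE":
        if step(STATES[state_idx]):
            state_idx += 1          # transition succeeded: advance
            retries = 0
        else:
            retries += 1            # transition failed: retry the same state
            if retries > max_retries:
                return "FAILED"
    return "DONE"

# A flaky step that fails once per state and then succeeds, standing in for
# capricious infrastructure (tape stalls, network drops, batch-system hiccups).
seen = set()
def flaky(state):
    if state not in seen:
        seen.add(state)
        return False
    return True
```

Bounded retries per transition, rather than per whole job, are what let transient failures be absorbed locally instead of forcing a resubmission of the entire workflow.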

  18. National Fusion Collaboratory: Grid Computing for Simulations and Experiments

    NASA Astrophysics Data System (ADS)

    Greenwald, Martin

    2004-05-01

    The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.

  19. Microwave remote sensing from space for earth resource surveys

    NASA Technical Reports Server (NTRS)

    1977-01-01

    The concepts of radar remote sensing and microwave radiometry are discussed and their utility in earth resource sensing is examined. The direct relationship between the character of the remotely sensed data and the level of decision making for which the data are appropriate is considered. Applications of active and passive microwave sensing covered include hydrology, land use, mapping, vegetation classification, environmental monitoring, coastal features and processes, geology, and ice and snow. Approved and proposed microwave sensors are described and the use of the space shuttle as a development platform is evaluated.

  20. Combining inventories of land cover and forest resources with prediction models and remotely sensed data

    Treesearch

    Raymond L. Czaplewski

    1989-01-01

    It is difficult to design systems for national and global resource inventory and analysis that efficiently satisfy changing, and increasingly complex objectives. It is proposed that individual inventory, monitoring, modeling, and remote sensing systems be specialized to achieve portions of the objectives. These separate systems can be statistically linked to accomplish...

  1. Application of remote sensing and Geographic Information Systems to ecosystem-based urban natural resource management

    Treesearch

    Xiaohui Zhang; George Ball; Eve Halper

    2000-01-01

    This paper presents an integrated system to support urban natural resource management. With the application of remote sensing (RS) and geographic information systems (GIS), the paper emphasizes the methodology of integrating information technology and a scientific basis to support ecosystem-based management. First, a systematic integration framework is developed and...

  2. Ground zero and up; Nebraska's resources and land use. [using LANDSAT and Skylab data

    NASA Technical Reports Server (NTRS)

    Edwards, D. M.; Macklem, R.

    1975-01-01

    A one-semester high school course was developed on the use of remote sensing techniques for earth resources and land use planning and management. The slide-tape-workbook program was field tested with high school students and showed a substantial gain in knowledge and a change in attitudes toward the application of remote sensing techniques.

  3. Resource Management

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Summit Envirosolutions of Minneapolis, Minnesota, used remote sensing images as a source for groundwater resource management. Summit is a full-service environmental consulting service specializing in hydrogeologic, environmental management, engineering and remediation services. CRSP collected, processed and analyzed multispectral/thermal imagery and aerial photography to compare remote sensing and Geographic Information System approaches to more traditional methods of environmental impact assessments and monitoring.

  4. State resource management and role of remote sensing. [California

    NASA Technical Reports Server (NTRS)

    Johnson, H. D.

    1981-01-01

    Remote sensing by satellite can provide valuable information to state officials when making decisions regarding resources management. Portions of California's investment for Prosperity program which seem likely candidates for remote sensing include: (1) surveying vegetation type, age, and density in forests and wildlife habitats; (2) controlling fires through chaparral management; (3) monitoring wetlands and measuring ocean biomass; (4) eliminating ground water overdraft; (5) locating crops in overdraft areas, assessing soil erosion and the areas of poorly drained soils and those affected by salt; (6) monitoring coastal lands and resources; (7) monitoring landscape changes for recreational purposes; (8) inventorying irrigated lands; (9) classifying ground cover; (10) monitoring farmland conversion; and (11) supplying data for a statewide computerized farmlands data base.

  5. Characterizing water resources of the Nile Basin using remotely sensed data

    NASA Astrophysics Data System (ADS)

    Mekonnen, Z. T.; Gebremichael, M.; Demissie, S. S.

    2015-12-01

    The Nile is one of the largest river basins in the world, with a rich biodiversity, and it supports the lives of 450 million people residing within the 11 riparian countries. This vital resource is under growing stress due to population growth, rapid development and climate change. In this work, we explore the use of a variety of recent remote sensing products to capture the water resources of the basin: rainfall from GPM and TRMM, soil moisture from SMAP and SMOS, evapotranspiration from MODIS and EUMETSAT LSA-SAF, and total water storage variations from GRACE. The satellite estimates were supplemented and checked by ground measurements whenever possible. Our results show that spatiotemporal variations of the basin's water resources are captured well by remote sensing products, in contrast to the scarce point measurements that currently exist. Several aspects of our results will be presented and discussed.

  6. Evolving technologies for Space Station Freedom computer-based workstations

    NASA Technical Reports Server (NTRS)

    Jensen, Dean G.; Rudisill, Marianne

    1990-01-01

    Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.

  7. Machine processing of remotely sensed data; Proceedings of the Conference, Purdue University, West Lafayette, Ind., October 16-18, 1973

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Topics discussed include the management and processing of earth resources information, special-purpose processors for the machine processing of remotely sensed data, digital image registration by a mathematical programming technique, the use of remote-sensor data in land classification (in particular, the use of ERTS-1 multispectral scanning data), the use of remote-sensor data in geometrical transformations and mapping, earth resource measurement with the aid of ERTS-1 multispectral scanning data, the use of remote-sensor data in the classification of turbidity levels in coastal zones and in the identification of ecological anomalies, the problem of feature selection and the classification of objects in multispectral images, the estimation of proportions of certain categories of objects, and a number of special systems and techniques. Individual items are announced in this issue.

  8. Remote Operations and Ground Control Centers

    NASA Technical Reports Server (NTRS)

    Bryant, Barry S.; Lankford, Kimberly; Pitts, R. Lee

    2004-01-01

    The Payload Operations Integration Center (POIC) at the Marshall Space Flight Center supports the International Space Station (ISS) through remote interfaces around the world. The POIC was originally designed as a gateway to space for remote facilities, ranging from an individual user to a full-scale multiuser environment. This achievement was accomplished while meeting program requirements and accommodating the injection of modern technology on an ongoing basis to ensure cost-effective operations. This paper will discuss the open POIC architecture developed to support similar and dissimilar remote operations centers. It will include the technologies, protocols, and compromises which support ongoing operations on a day-to-day basis. Additional areas covered include centralized management of shared resources and methods utilized to provide highly available and restricted resources to remote users. Finally, the effort of coordinating the actions of participants will be discussed.

  9. Measuring the Interdisciplinary Impact of Using Geospatial Data with Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Downs, R. R.; Chen, R. S.; Schumacher, J.

    2017-12-01

    Various disciplines offer benefits to society by contributing to the scientific progress that informs the knowledge and decisions that improve the lives, safety, and conditions of people around the globe. In addition to disciplines within the natural sciences, other disciplines, including those in the social, health, and computer sciences, provide benefits to society by collecting, preparing, and analyzing data in the process of conducting research. Preparing geospatial environmental and socioeconomic data together with remote sensing data from satellite-based instruments for wider use by heterogeneous communities of users increases the potential impact of these data by enabling their use in different application areas and sectors of society. Furthermore, enabling wider use of scientific data can bring to bear resources and expertise that will improve reproducibility, quality, methodological transparency, interoperability, and improved understanding by diverse communities of users. In line with its commitment to open data, the NASA Socioeconomic Data and Applications Center (SEDAC), which focuses on human interactions in the environment, curates and disseminates freely and publicly available geospatial data for use across many disciplines and societal benefit areas. We describe efforts to broaden the use of SEDAC data and to publicly document their impact, assess the interdisciplinary impact of the use of SEDAC data with remote sensing data, and characterize these impacts in terms of their influence across disciplines by analyzing citations of geospatial data with remote sensing data within scientific journals.

  10. A Multiscale Random Field Model for Bayesian Image Segmentation

    DTIC Science & Technology

    1994-06-01

    ...satellite-based remotely-sensed data and ground-level data for natural resource inventory and evaluation. Coupling remotely sensed digital data with traditional ecological ground data could help Army land managers inventory and monitor natural resources. This study used LCTA data sets to test image...

  11. NASA's Earth Resources Laboratory - Seventeen years of using remotely sensed satellite data in land applications

    NASA Technical Reports Server (NTRS)

    Cashion, Kenneth D.; Whitehurst, Charles A.

    1987-01-01

    The activities of the Earth Resources Laboratory (ERL) for the past seventeen years are reviewed with particular reference to four typical applications demonstrating the use of remotely sensed data in a geobased information system context. The applications discussed are: a fire control model for the Olympic National Park; wildlife habitat modeling; a resource inventory system including a potential soil erosion model; and a corridor analysis model for locating routes between geographical locations. Some future applications are also discussed.

  12. Water resources by orbital remote sensing: Examples of applications

    NASA Technical Reports Server (NTRS)

    Martini, P. R. (Principal Investigator)

    1984-01-01

    Selected applications of orbital remote sensing to water resources undertaken by INPE are described, along with general specifications of Earth application satellites and technical characteristics of the LANDSAT 1, 2, 3, and 4 subsystems. Spatial, temporal and spectral image attributes of water as well as methods of image analysis for applications to water resources are discussed. Selected examples concern flood monitoring, analysis of suspended sediments in water, spatial distribution of pollutants, inventory of surface water bodies, and mapping of alluvial aquifers.

  13. Effects of remote monitoring on clinical outcomes and use of healthcare resources in heart failure patients with biventricular defibrillators: results of the MORE-CARE multicentre randomized controlled trial.

    PubMed

    Boriani, Giuseppe; Da Costa, Antoine; Quesada, Aurelio; Ricci, Renato Pietro; Favale, Stefano; Boscolo, Gabriele; Clementy, Nicolas; Amori, Valentina; Mangoni di S Stefano, Lorenza; Burri, Haran

    2017-03-01

    The aim of this study was to evaluate the clinical efficacy and safety of remote monitoring in patients with heart failure implanted with a biventricular defibrillator (CRT-D) with advanced diagnostics. The MORE-CARE trial is an international, prospective, multicentre, randomized controlled trial. Within 8 weeks of de novo implant of a CRT-D, patients were randomized to undergo remote checks alternating with in-office follow-ups (Remote arm) or in-office follow-ups alone (Standard arm). The primary endpoint was a composite of death and cardiovascular (CV) and device-related hospitalization. Use of healthcare resources was also evaluated. A total of 865 eligible patients (mean age 66 ± 10 years) were included in the final analysis (437 in the Remote arm and 428 in the Standard arm) and followed for a median of 24 (interquartile range = 15-26) months. No significant difference was found in the primary endpoint between the Remote and Standard arms [hazard ratio 1.02, 95% confidence interval (CI) 0.80-1.30, P = 0.89] or in the individual components of the primary endpoint (P > 0.05). For the composite endpoint of healthcare resource utilization (i.e. 2-year rates of CV hospitalizations, CV emergency department admissions, and CV in-office follow-ups), a significant 38% reduction was found in the Remote vs. Standard arm (incidence rate ratio 0.62, 95% CI 0.58-0.66, P < 0.001) mainly driven by a reduction of in-office visits. In heart failure patients implanted with a CRT-D, remote monitoring did not reduce mortality or risk of CV or device-related hospitalization. Use of healthcare resources was significantly reduced as a result of a marked reduction of in-office visits without compromising patient safety. NCT00885677. © 2016 The Authors. European Journal of Heart Failure published by John Wiley & Sons Ltd on behalf of European Society of Cardiology.

  14. A simple method for estimating basin-scale groundwater discharge by vegetation in the basin and range province of Arizona using remote sensing information and geographic information systems

    USGS Publications Warehouse

    Tillman, F.D.; Callegary, J.B.; Nagler, P.L.; Glenn, E.P.

    2012-01-01

    Groundwater is a vital water resource in the arid to semi-arid southwestern United States. Accurate accounting of inflows to and outflows from the groundwater system is necessary to effectively manage this shared resource, including the important outflow component of groundwater discharge by vegetation. A simple method for estimating basin-scale groundwater discharge by vegetation is presented that uses remote sensing data from satellites, geographic information systems (GIS) land cover and stream location information, and a regression equation developed within the Southern Arizona study area relating the Enhanced Vegetation Index from the MODIS sensors on the Terra satellite to measured evapotranspiration. Results computed for 16-day composited satellite passes over the study area during the 2000 through 2007 time period demonstrate a sinusoidal pattern of annual groundwater discharge by vegetation with median values ranging from around 0.3 mm per day in the cooler winter months to around 1.5 mm per day during summer. Maximum estimated annual volume of groundwater discharge by vegetation was between 1.4 and 1.9 billion m3 per year with an annual average of 1.6 billion m3. A simplified accounting of the contribution of precipitation to vegetation greenness was developed whereby monthly precipitation data were subtracted from computed vegetation discharge values, resulting in estimates of minimum groundwater discharge by vegetation. Basin-scale estimates of minimum and maximum groundwater discharge by vegetation produced by this simple method are useful bounding values for groundwater budgets and groundwater flow models, and the method may be applicable to other areas with similar vegetation types.
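
The accounting this method describes can be sketched numerically. The regression coefficients and input values below are entirely hypothetical (the paper's fitted EVI-evapotranspiration relation is not reproduced here): evapotranspiration is estimated from EVI via a linear relation, and precipitation is subtracted with the result floored at zero to bound minimum groundwater discharge by vegetation.

```python
def et_from_evi(evi, a=3.2, b=-0.15):
    """Hypothetical linear regression: ET (mm/day) = a * EVI + b, floored at 0."""
    return max(0.0, a * evi + b)

def min_groundwater_discharge(evi_series, precip_series):
    """Per-period minimum groundwater discharge by vegetation (mm/day):
    estimated ET minus precipitation, floored at zero."""
    return [max(0.0, et_from_evi(evi) - p)
            for evi, p in zip(evi_series, precip_series)]

evi = [0.15, 0.30, 0.55]   # 16-day composite EVI values (illustrative)
precip = [0.1, 0.4, 0.2]   # precipitation over the same periods (mm/day)
bounds = min_groundwater_discharge(evi, precip)
```

Multiplying such per-area rates by riparian vegetation extents from the GIS land cover layer is what turns them into the basin-scale volumetric bounds the abstract reports.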

  15. A Computer Learning Center for Environmental Sciences

    NASA Technical Reports Server (NTRS)

    Mustard, John F.

    2000-01-01

    In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines who have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draw upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.

  16. A cloud platform for remote diagnosis of breast cancer in mammography by fusion of machine and human intelligence

    NASA Astrophysics Data System (ADS)

    Jiang, Guodong; Fan, Ming; Li, Lihua

    2016-03-01

    Mammography is the gold standard for breast cancer screening, reducing mortality by about 30%. Applying a computer-aided detection (CAD) system to assist a single radiologist is important for further improving mammographic sensitivity for breast cancer detection. In this study, the design and realization of a prototype system for remote diagnosis in mammography based on a cloud platform are proposed. The system draws on medical image information construction, cloud infrastructure, and a human-machine diagnosis model. On one hand, the web platform for remote diagnosis was established with J2EE web technology, and the back end was realized with the open-source Hadoop framework. On the other hand, the storage system was built on the Hadoop Distributed File System (HDFS), which lets users easily develop and run applications on massive data and exploits the advantages of cloud computing: high efficiency, scalability and low cost. In addition, the CAD system was realized through the MapReduce framework. The diagnosis module implements algorithms that fuse machine and human intelligence: results of diagnoses from doctors' experience and from traditional CAD are combined using a man-machine intelligent fusion model based on alpha-integration and a multi-agent algorithm. Finally, applications of this system at different levels of the platform are also discussed. This diagnosis system will be of great importance for balancing health resources, lowering medical expenses, and improving diagnostic accuracy in basic medical institutes.
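
The alpha-integration step can be illustrated with the weighted alpha-mean (after Amari). The scores, weights, and choice of alpha below are hypothetical, and this sketches the general fusion operator rather than the paper's specific model.

```python
import math

def alpha_mean(scores, weights, alpha):
    """Weighted alpha-mean: f_a^{-1}(sum_i w_i * f_a(u_i)), where
    f_a(u) = log(u) for alpha = 1 and f_a(u) = u**((1 - alpha) / 2) otherwise.
    alpha = -1 recovers the arithmetic mean, alpha = 1 the geometric mean,
    and alpha = 3 the harmonic mean."""
    if alpha == 1:
        return math.exp(sum(w * math.log(u) for w, u in zip(weights, scores)))
    p = (1 - alpha) / 2
    return sum(w * u ** p for w, u in zip(weights, scores)) ** (1 / p)

# Fusing a hypothetical radiologist confidence (0.8) with a CAD score (0.6),
# equally weighted; varying alpha tunes how pessimistic the fusion is.
fused = alpha_mean([0.8, 0.6], [0.5, 0.5], alpha=-1)
```

Larger alpha values pull the fused score toward the lower (more cautious) input, which is one way such a system can weight disagreement between the human and machine readers.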

  17. Planning and Implementation of Remote Sensing Experiments.

    DTIC Science & Technology

    Contents: TEKTITE II experiment-upwelling detection (NASA Mx 138); Design of oceanographic experiments (Gulf of Mexico, Mx 159); Design of oceanographic experiments (Gulf of Mexico, Mx 165); Experiments on thermal pollution; Remote sensing newsletter; Symposium on remote sensing in marine biology and fishery resources.

  18. Remote Agent Demonstration

    NASA Technical Reports Server (NTRS)

    Dorais, Gregory A.; Kurien, James; Rajan, Kanna

    1999-01-01

    We describe the computer demonstration of the Remote Agent Experiment (RAX). The Remote Agent is a high-level, model-based, autonomous control agent being validated on the NASA Deep Space 1 spacecraft.

  19. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation.

    PubMed

    Qin, Changbo; Jia, Yangwen; Su, Z; Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-07-29

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distribution within the Haihe basin is different. The remote sensing derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of, and information from, model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems.
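
    The record above describes assimilating SEBS-retrieved evapotranspiration into the WEP-L model with an extended Kalman filter (EKF). The paper's own formulation is not reproduced here; the following is a minimal sketch of a generic EKF measurement update, with an invented linear observation operator (ET proportional to a single soil-moisture state) standing in for the real model physics, and all numeric values hypothetical.

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One extended Kalman filter measurement update.

    x : prior state estimate (n,)
    P : prior state covariance (n, n)
    z : observation vector (m,), e.g. remotely sensed evapotranspiration
    h : nonlinear observation operator, h(x) -> (m,)
    H : Jacobian of h evaluated at x (m, n)
    R : observation error covariance (m, m)
    """
    y = z - h(x)                       # innovation
    S = H @ P @ H.T + R                # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    x_post = x + K @ y                 # updated state
    P_post = (np.eye(len(x)) - K @ H) @ P
    return x_post, P_post

# Toy example: one grid cell with soil-moisture state s, and an assumed
# linear observation operator ET = a * s with a = 0.5.
a = 0.5
x = np.array([0.30])                   # prior soil moisture
P = np.array([[0.04]])                 # prior uncertainty
z = np.array([0.20])                   # hypothetical SEBS-retrieved ET
H = np.array([[a]])
R = np.array([[0.01]])                 # observation error
x_post, P_post = ekf_update(x, P, z, lambda s: a * s, H, R)
```

    A real assimilation system would loop this update over grid cells and time steps, alternating with the model's forecast step, which is where the non-linearity the EKF linearizes actually enters.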

  20. Integrating Remote Sensing Information Into A Distributed Hydrological Model for Improving Water Budget Predictions in Large-scale Basins through Data Assimilation

    PubMed Central

    Qin, Changbo; Jia, Yangwen; Su, Z.(Bob); Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen

    2008-01-01

    This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote-sensing-retrieved basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distribution within the Haihe basin is different. The remote sensing derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of, and information from, model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems. PMID:27879946

  1. ROI-Based On-Board Compression for Hyperspectral Remote Sensing Images on GPU.

    PubMed

    Giordano, Rossella; Guccione, Pietro

    2017-05-19

    In recent years, hyperspectral sensors for Earth remote sensing have become very popular. Such systems are able to provide the user with images having both spectral and spatial information. Current hyperspectral spaceborne sensors are able to capture large areas with increased spatial and spectral resolution. For this reason, the volume of acquired data needs to be reduced on board in order to avoid a low orbital duty cycle due to limited storage space. Recently, the literature has focused attention on efficient ways to compress data on board. This is a challenging task due to the difficult environment (outer space) and the limited time, power and computing resources. Often, the hardware properties of Graphic Processing Units (GPU) have been exploited to reduce the processing time using parallel computing. The current work proposes a framework for on-board operation on a GPU, using NVIDIA's CUDA (Compute Unified Device Architecture) architecture. The algorithm aims at performing on-board compression using a target-related strategy. In detail, the main operations are: the automatic recognition of land cover types or detection of events in near real time in regions of interest (a user-related choice) with an unsupervised classifier; the compression of specific regions with space-variant bit rates, including Principal Component Analysis (PCA), wavelet and arithmetic coding; and management of the data volume delivered to the ground station. Experiments are provided using a real dataset taken from an AVIRIS (Airborne Visible/Infrared Imaging Spectrometer) airborne sensor in a harbor area.
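
    Among the compression tools this abstract names is PCA along the spectral axis. The paper's GPU/CUDA implementation is not reproduced here; as an illustration of the underlying idea only — project each pixel's spectrum onto a few principal components and store the low-dimensional scores — here is a CPU sketch in NumPy, with the cube size, component count, and function names invented for the example.

```python
import numpy as np

def pca_compress(cube, k):
    """Compress a hyperspectral cube (rows, cols, bands) to k principal
    components along the spectral axis; returns scores, basis, mean, shape."""
    rows, cols, bands = cube.shape
    X = cube.reshape(-1, bands).astype(np.float64)
    mu = X.mean(axis=0)
    Xc = X - mu
    # SVD of the centered pixel-by-band matrix; rows of Vt are the PCs
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    basis = Vt[:k]                      # (k, bands)
    scores = Xc @ basis.T               # (pixels, k) -- what gets stored
    return scores, basis, mu, (rows, cols)

def pca_reconstruct(scores, basis, mu, shape):
    """Invert the projection back to a (rows, cols, bands) cube."""
    rows, cols = shape
    X = scores @ basis + mu
    return X.reshape(rows, cols, basis.shape[1])

# Synthetic 8x8 cube of 32-band spectra lying in a 3-D spectral subspace
rng = np.random.default_rng(0)
A = rng.normal(size=(64, 3)) @ rng.normal(size=(3, 32))
cube = A.reshape(8, 8, 32)
scores, basis, mu, shape = pca_compress(cube, k=3)
recon = pca_reconstruct(scores, basis, mu, shape)
```

    On this synthetic cube, whose spectra truly span only three dimensions, keeping three components reconstructs the data essentially exactly; on real hyperspectral data the component count instead sets an accuracy/bit-rate trade-off.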

  2. MED31/437: A Web-based Diabetes Management System: DiabNet

    PubMed Central

    Zhao, N; Roudsari, A; Carson, E

    1999-01-01

    Introduction: A web-based system (DiabNet) was developed to provide instant access to the Electronic Diabetes Records (EDR) for end-users, and real-time information for healthcare professionals to facilitate their decision-making. It integrates a portable glucometer, a handheld computer, a mobile phone and Internet access as a combined telecommunication and mobile computing solution for diabetes management. Methods: Active Server Pages (ASP) embedded with advanced ActiveX controls and VBScript were developed to allow remote data upload, retrieval and interpretation. Advisory and Internet-based learning features, together with a video teleconferencing component, make the DiabNet web site an informative platform for Web-consultation. Results: The evaluation of the system is being carried out among several UK Internet diabetes discussion groups and the Diabetes Day Centre at the Guy's & St. Thomas' Hospital. Much positive feedback has been received from the web site, demonstrating that DiabNet is an advanced web-based diabetes management system which can help patients keep closer control of self-monitored blood glucose remotely, and an integrated diabetes information resource that offers telemedicine knowledge in diabetes management. Discussion: In summary, DiabNet introduces innovative online diabetes management concepts, such as online appointment and consultation, to enable users to access diabetes management information without time or location limitations or security concerns.

  3. Environmental application of remote sensing methods to coastal zone land use and marine resource management, Appendices A to E. [in southeastern Virginia

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Important data were compiled for use with the Richmond-Cape Henry Environmental Laboratory (RICHEL) remote sensing project in coastal zone land use and marine resources management, and include RICHEL climatological data and sources, a land use inventory, topographic and soil maps, and gaging records for RICHEL surface waters.

  4. Remote sensing program

    NASA Technical Reports Server (NTRS)

    Liang, T.

    1973-01-01

    Research projects concerning the development and application of remote sensors are discussed. Some of the research projects conducted are as follows: (1) aerial photographic inventory of natural resources, (2) detection of buried river channels, (3) delineation of interconnected waterways, (4) plant indicators of atmospheric pollution, and (5) techniques for data transfer from photographs to base maps. On-going projects involving earth resources analyses are described.

  5. Regional Assessment of Remote Forests and Black Bear Habitat from Forest Resource Surveys

    Treesearch

    Victor A. Rudis; John B. Tansey

    1995-01-01

    We developed a spatially explicit modeling approach, using a county-scaled remote forest (i.e., forested area reserved from or having no direct human interference) assessment derived from 1984-1990 forest resource inventory data and a 1984 black bear (Ursus americanus) range map for 12 states in the southern United States. We defined minimum suitable and optimal black...

  6. Development of a remote digital augmentation system and application to a remotely piloted research vehicle

    NASA Technical Reports Server (NTRS)

    Edwards, J. W.; Deets, D. A.

    1975-01-01

    A cost-effective approach to flight testing advanced control concepts with remotely piloted vehicles is described. The approach utilizes a ground based digital computer coupled to the remotely piloted vehicle's motion sensors and control surface actuators through telemetry links to provide high bandwidth feedback control. The system was applied to the control of an unmanned 3/8-scale model of the F-15 airplane. The model was remotely augmented; that is, the F-15 mechanical and control augmentation flight control systems were simulated by the ground-based computer, rather than being in the vehicle itself. The results of flight tests of the model at high angles of attack are discussed.

  7. The network queueing system

    NASA Technical Reports Server (NTRS)

    Kingsbury, Brent K.

    1986-01-01

    Described is the implementation of a networked, UNIX based queueing system developed on contract for NASA. The system discussed supports both batch and device requests, and provides the facilities of remote queueing, request routing, remote status, queue access controls, batch request resource quota limits, and remote output return.

  8. Remote gaming on resource-constrained devices

    NASA Astrophysics Data System (ADS)

    Reza, Waazim; Kalva, Hari; Kaufman, Richard

    2010-08-01

    Games have become important applications on mobile devices. A mobile gaming approach known as remote gaming is being developed to support games on low cost mobile devices. In the remote gaming approach, the responsibility of rendering a game and advancing the game play is put on remote servers instead of the resource constrained mobile devices. The games rendered on the servers are encoded as video and streamed to mobile devices. Mobile devices gather user input and stream the commands back to the servers to advance game play. With this solution, mobile devices with video playback and network connectivity can become game consoles. In this paper we present the design and development of such a system and evaluate the performance and design considerations to maximize the end user gaming experience.

  9. A high throughput geocomputing system for remote sensing quantitative retrieval and a case study

    NASA Astrophysics Data System (ADS)

    Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting

    2011-12-01

    The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computation and one of the research issues of high-throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, used to support the modeling, reconstruction and implementation of large-scale complex applications of remote sensing science. In this paper, we study a middleware component of the remote sensing Grid: a dynamic Grid workflow based on the remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed graphical user interface (GUI) tools to compose remote sensing processing Grid workflows, taking aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.

  10. Evaluating IAIMS at Yale: information access.

    PubMed

    Grajek, S E; Calarco, P; Frawley, S J; McKay, J; Miller, P L; Paton, J A; Roderer, N K; Sullivan, J E

    1997-01-01

    To evaluate use of information resources during the first year of IAIMS implementation at the Yale-New Haven Medical Center. The evaluation asked: (1) Which information resources are being used? (2) Who uses information resources? (3) Where are information resources used? (4) Are multiple sources of information being integrated? Measures included monthly usage data for resources delivered network-wide, in the Medical Library, and in the Hospital; online surveys of library workstation users; an annual survey of a random, stratified sample of Medical Center faculty, postdoctoral trainees, students, nurses, residents, and managerial and professional staff; and user comments. Eighty-three percent of the Medical Center community use networked information resources, and use of resources is increasing. Both status (faculty, student, nurse, etc.) and mission (teaching, research, patient care) affect use of individual resources. Eighty-eight percent of people use computers in more than one location, and increases in usage of traditional library resources such as MEDLINE are due to increased access from outside the Library. Both survey and usage data suggest that people are using multiple resources during the same information seeking session. Almost all of the Medical Center community is using networked information resources in more settings. It is necessary to support increased demand for information access from remote locations and to specific populations, such as nurses. People are integrating information from multiple sources, but true integration within information systems is just beginning. Other institutions are advised to incorporate pragmatic evaluation into their IAIMS activities and to share evaluation results with decision-makers.

  11. NASA Remote Sensing Observations for Water Resource and Infrastructure Management

    NASA Astrophysics Data System (ADS)

    Granger, S. L.; Armstrong, L.; Farr, T.; Geller, G.; Heath, E.; Hyon, J.; Lavoie, S.; McDonald, K.; Realmuto, V.; Stough, T.; Szana, K.

    2008-12-01

    Decision support tools employed by water resource and infrastructure managers often utilize data products obtained from local sources or national/regional databases of historic surveys and observations. Incorporation of data from these sources can be laborious and time consuming as new products must be identified, cleaned and archived for each new study site. Adding remote sensing observations to the list of sources holds promise for a timely, consistent, global product to aid decision support at regional and global scales by providing global observations of geophysical parameters including soil moisture, precipitation, atmospheric temperature, derived evapotranspiration, and snow extent needed for hydrologic models and decision support tools. However, issues such as spatial and temporal resolution arise when attempting to integrate remote sensing observations into existing decision support tools. We are working to overcome these and other challenges through partnerships with water resource managers, tool developers and other stakeholders. We are developing a new data processing framework, enabled by a core GIS server, to seamlessly pull together observations from disparate sources for synthesis into information products and visualizations useful to the water resources community. A case study approach is being taken to develop the system by working closely with water infrastructure and resource managers to integrate remote observations into infrastructure, hydrologic and water resource decision tools. We present the results of a case study utilizing observations from the PALS aircraft instrument as a proxy for NASA's upcoming Soil Moisture Active Passive (SMAP) mission and an existing commercial decision support tool.

  12. Our national energy future - The role of remote sensing

    NASA Technical Reports Server (NTRS)

    Schmitt, H. H.

    1975-01-01

    An overview of problems and opportunities in remote sensing of resources. The need for independence from foreign and precarious energy sources, availability of fossil fuel materials for other purposes (petrochemicals, fertilizer), environmental conservation, and new energy sources are singled out as the main topics. Phases of response include: (1) crisis, with reduced use of petroleum and tapping of on-shore and off-shore resources combined; (2) a transition phase involving a shift from petroleum to coal and oil shale; and (3) exploitation of renewable (inexhaustible and clean) energy. Opportunities for remote sensing in fuel production and energy conservation are discussed along with problems in identifying the spectral signatures of productive and unproductive regions. Mapping of water resources, waste heat, byproducts, and wastes is considered in addition to opportunities for international collaboration.

  13. DC grid for home applications

    NASA Astrophysics Data System (ADS)

    Elangovan, D.; Archana, R.; Jayadeep, V. J.; Nithin, M.; Arunkumar, G.

    2017-11-01

    More than fifty percent of the Indian population does not have access to electricity in daily life. The distance between power generating stations and distribution centers is one of the main reasons for the lack of electrification in rural and remote areas. Here lies the importance of decentralized power generation from renewable energy resources. In the present world, electricity is predominantly supplied as alternating current, but most everyday devices, such as LED lamps, computers and electric vehicles, run on DC power. By supplying DC directly to these loads, the number of power conversion stages is reduced and overall system efficiency increases. Replacing the existing AC network with DC is a humongous task, but with power electronic techniques this project intends to implement a DC grid at the household level in remote and rural areas. The proposed system was designed and simulated successfully for loads amounting to 250 W through appropriate power electronic converters. Maximum utilization of renewable sources for domestic and commercial applications was achieved with the proposed DC topology.

  14. Role of remote sensing, geographical information system (GIS) and bioinformatics in kala-azar epidemiology

    PubMed Central

    Bhunia, Gouri Sankar; Dikhit, Manas Ranjan; Kesari, Shreekant; Sahoo, Ganesh Chandra; Das, Pradeep

    2011-01-01

    Visceral leishmaniasis or kala-azar is a potent parasitic infection causing the death of thousands of people each year. Medicinal compounds currently available for the treatment of kala-azar have serious side effects and decreased efficacy owing to the emergence of resistant strains. The type of immune reaction is also to be considered in patients infected with Leishmania donovani (L. donovani). For complete eradication of this disease, high-level modern research is currently being applied both at the molecular level and at the field level. Computational approaches like remote sensing, geographical information system (GIS) and bioinformatics are the key resources for the detection and distribution of vectors, patterns, ecological and environmental factors and genomic and proteomic analysis. Novel approaches like GIS and bioinformatics have been more appropriately utilized in determining the cause of visceral leishmaniasis and in designing strategies for preventing the disease from spreading from one region to another. PMID:23554714

  15. Measurement-induced entanglement for excitation stored in remote atomic ensembles.

    PubMed

    Chou, C W; de Riedmatten, H; Felinto, D; Polyakov, S V; van Enk, S J; Kimble, H J

    2005-12-08

    A critical requirement for diverse applications in quantum information science is the capability to disseminate quantum resources over complex quantum networks. For example, the coherent distribution of entangled quantum states together with quantum memory (for storing the states) can enable scalable architectures for quantum computation, communication and metrology. Here we report observations of entanglement between two atomic ensembles located in distinct, spatially separated set-ups. Quantum interference in the detection of a photon emitted by one of the samples projects the otherwise independent ensembles into an entangled state with one joint excitation stored remotely in 10(5) atoms at each site. After a programmable delay, we confirm entanglement by mapping the state of the atoms to optical fields and measuring mutual coherences and photon statistics for these fields. We thereby determine a quantitative lower bound for the entanglement of the joint state of the ensembles. Our observations represent significant progress in the ability to distribute and store entangled quantum states.

  16. Autoadaptivity and optimization in distributed ECG interpretation.

    PubMed

    Augustyniak, Piotr

    2010-03-01

    This paper addresses principal issues of ECG interpretation adaptivity in a distributed surveillance network. In the age of pervasive access to wireless digital communication, distributed biosignal interpretation networks may not only optimally solve difficult medical cases, but also adapt the data acquisition, interpretation, and transmission to the patient's variable status and the availability of technical resources. The background of such adaptivity is the innovative use of results from automatic ECG analysis for seamless remote modification of the interpreting software. Since the medical relevance of issued diagnostic data depends on the patient's status, interpretation adaptivity implies flexibility of report content and frequency. The proposed solutions are based on research into human experts' behavior, procedure reliability, and usage statistics. Despite the limited scale of our prototype client-server application, the tests yielded very promising results: transmission channel occupation was reduced by 2.6 to 5.6 times compared to the rigid reporting mode, and improvement of the remotely computed diagnostic outcome was achieved in over 80% of software adaptation attempts.

  17. Resource analysis applications in Michigan. [NASA remote sensing

    NASA Technical Reports Server (NTRS)

    Schar, S. W.; Enslin, W. R.; Sattinger, I. J.; Robinson, J. G.; Hosford, K. R.; Fellows, R. S.; Raad, J. H.

    1974-01-01

    During the past two years, available NASA imagery has been applied to a broad spectrum of problems of concern to Michigan-based agencies. These demonstrations include the testing of remote sensing for the purposes of (1) highway corridor planning and impact assessments, (2) game management-area information bases, (3) multi-agency river basin planning, (4) timber resource management information systems, (5) agricultural land reservation policies, and (6) shoreline flooding damage assessment. In addition, cost accounting procedures have been developed for evaluating the relative costs of utilizing remote sensing in land cover and land use analysis data collection procedures.

  18. Considerations and techniques for incorporating remotely sensed imagery into the land resource management process.

    NASA Technical Reports Server (NTRS)

    Brooner, W. G.; Nichols, D. A.

    1972-01-01

    Development of a scheme for utilizing remote sensing technology in an operational program for regional land use planning and land resource management program applications. The scheme utilizes remote sensing imagery as one of several potential inputs to derive desired and necessary data, and considers several alternative approaches to the expansion and/or reduction and analysis of data, using automated data handling techniques. Within this scheme is a five-stage program development which includes: (1) preliminary coordination, (2) interpretation and encoding, (3) creation of data base files, (4) data analysis and generation of desired products, and (5) applications.

  19. Packet spacing : an enabling mechanism for delivering multimedia content in computational grids /

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, A. C.; Feng, W. C.; Belford, Geneva G.

    2001-01-01

    Streaming multimedia with UDP has become increasingly popular over distributed systems like the Internet. Scientific applications that stream multimedia include remote computational steering of visualization data and video-on-demand teleconferencing over the Access Grid. However, UDP does not possess a self-regulating, congestion-control mechanism, and most best-effort traffic is served by congestion-controlled TCP. Consequently, UDP steals bandwidth from TCP such that TCP flows starve for network resources. With the volume of Internet traffic continuing to increase, the perpetuation of UDP-based streaming will cause the Internet to collapse as it did in the mid-1980s due to the use of non-congestion-controlled TCP. To address this problem, we introduce the counterintuitive notion of inter-packet spacing with control feedback to enable UDP-based applications to perform well in the next-generation Internet and computational grids. When compared with traditional UDP-based streaming, we illustrate that our approach can reduce packet loss by over 50% without adversely affecting delivered throughput. Keywords: network protocol, multimedia, packet spacing, streaming, TCP, UDP, rate-adjusting congestion control, computational grid, Access Grid.
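
    The core mechanism of this record, inter-packet spacing, amounts to leaving a gap of packet_size × 8 / rate seconds between packet starts so a UDP stream holds a target bit rate instead of sending in bursts. A minimal sketch (the function names are invented, and the paper's control-feedback loop for adjusting the rate is omitted):

```python
import time

def packet_gap_s(size_bytes, rate_bps):
    """Spacing (seconds between packet starts) that caps a stream of
    fixed-size packets at rate_bps bits per second."""
    return size_bytes * 8 / rate_bps

def paced_send(packets, rate_bps, send):
    """Send packets with uniform spacing instead of back-to-back bursts,
    sleeping off whatever time the send call itself did not consume."""
    for pkt in packets:
        start = time.monotonic()
        send(pkt)
        sleep_for = packet_gap_s(len(pkt), rate_bps) - (time.monotonic() - start)
        if sleep_for > 0:
            time.sleep(sleep_for)

# 1000-byte packets paced at 8 Mbit/s leave 1 ms between packet starts.
gap = packet_gap_s(1000, 8_000_000)
```

    In the paper's feedback-driven version, rate_bps itself would be adjusted from receiver loss reports rather than held fixed as it is here.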

  20. Viewing ISS Data in Real Time via the Internet

    NASA Technical Reports Server (NTRS)

    Myers, Gerry; Chamberlain, Jim

    2004-01-01

    EZStream is a computer program that enables authorized users at diverse terrestrial locations to view, in real time, data generated by scientific payloads aboard the International Space Station (ISS). The only computation/communication resource needed for use of EZStream is a computer equipped with standard Web-browser software and a connection to the Internet. EZStream runs in conjunction with the TReK software, described in a prior NASA Tech Briefs article, that coordinates multiple streams of data for the ground communication system of the ISS. EZStream includes server components that interact with TReK within the ISS ground communication system and client components that reside in the users' remote computers. Once an authorized client has logged in, a server component of EZStream pulls the requested data from a TReK application-program interface and sends the data to the client. Future EZStream enhancements will include (1) extensions that enable the server to receive and process arbitrary data streams on its own and (2) a Web-based graphical-user-interface-building subprogram that enables a client who lacks programming expertise to create customized display Web pages.

  1. Satellites as Shared Resources for Caribbean Climate and Health Studies

    NASA Technical Reports Server (NTRS)

    Maynard, Nancy G.

    2002-01-01

    Remotely-sensed data and observations are providing powerful new tools for addressing climate and environment-related human health problems through increased capabilities for monitoring, risk mapping, and surveillance of parameters useful to such problems as vector-borne and infectious diseases, air and water quality, harmful algal blooms, UV (ultraviolet) radiation, contaminant and pathogen transport in air and water, and thermal stress. Remote sensing, geographic information systems (GIS), global positioning systems (GPS), improved computational capabilities, and interdisciplinary research between the Earth and health science communities are being combined in rich collaborative efforts resulting in more rapid problem-solving, early warning, and prevention in global health issues. Collaborative efforts among scientists from health and Earth sciences together with local decision-makers are enabling increased understanding of the relationships between changes in temperature, rainfall, wind, soil moisture, solar radiation, vegetation, and the patterns of extreme weather events and the occurrence and patterns of diseases (especially, infectious and vector-borne diseases) and other health problems. This increased understanding through improved information and data sharing, in turn, empowers local health and environmental officials to better predict health problems, take preventive measures, and improve response actions. This paper summarizes the remote sensing systems most useful for climate, environment and health studies of the Caribbean region and provides several examples of interdisciplinary research projects in the Caribbean currently using remote sensing technologies. These summaries include the use of remote sensing of algal blooms, pollution transport, coral reef monitoring, vector-borne disease studies, and potential health effects of African dust on Trinidad and Barbados.

  2. A remote sensing method for estimating regional reservoir area and evaporative loss

    DOE PAGES

    Zhang, Hua; Gorelick, Steven M.; Zimba, Paul V.; ...

    2017-10-07

    Evaporation from the water surface of a reservoir can significantly affect its function of ensuring the availability and temporal stability of water supply. Current estimations of reservoir evaporative loss are dependent on water area derived from a reservoir storage-area curve. Such curves are unavailable if the reservoir is located in a data-sparse region or questionable if long-term sedimentation has changed the original elevation-area relationship. In this paper, we propose a remote sensing framework to estimate reservoir evaporative loss at the regional scale. This framework uses a multispectral water index to extract reservoir area from Landsat imagery and estimate monthly evaporation volume based on pan-derived evaporative rates. The optimal index threshold is determined based on local observations and extended to unobserved locations and periods. Built on the cloud computing capacity of the Google Earth Engine, this framework can efficiently analyze satellite images at large spatiotemporal scales, where such analysis is infeasible with a single computer. Our study involves 200 major reservoirs in Texas, captured in 17,811 Landsat images over a 32-year period. The results show that these reservoirs contribute to an annual evaporative loss of 8.0 billion cubic meters, equivalent to 20% of their total active storage or 53% of total annual water use in Texas. At five coastal basins, reservoir evaporative losses exceed the minimum freshwater inflows required to sustain ecosystem health and fishery productivity of the receiving estuaries. Reservoir evaporative loss can be significant enough to counterbalance the positive effects of impounding water and to offset the contribution of water conservation and reuse practices. Our results also reveal the spatially variable performance of the multispectral water index and indicate the limitation of using scene-level cloud cover to screen satellite images. Finally, this study demonstrates the advantage of combining satellite remote sensing and cloud computing to support regional water resources assessment.

  3. A remote sensing method for estimating regional reservoir area and evaporative loss

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Gorelick, Steven M.; Zimba, Paul V.; Zhang, Xiaodong

    2017-12-01

    Evaporation from the water surface of a reservoir can significantly affect its function of ensuring the availability and temporal stability of water supply. Current estimations of reservoir evaporative loss are dependent on water area derived from a reservoir storage-area curve. Such curves are unavailable if the reservoir is located in a data-sparse region or questionable if long-term sedimentation has changed the original elevation-area relationship. We propose a remote sensing framework to estimate reservoir evaporative loss at the regional scale. This framework uses a multispectral water index to extract reservoir area from Landsat imagery and estimate monthly evaporation volume based on pan-derived evaporative rates. The optimal index threshold is determined based on local observations and extended to unobserved locations and periods. Built on the cloud computing capacity of the Google Earth Engine, this framework can efficiently analyze satellite images at large spatiotemporal scales, where such analysis is infeasible with a single computer. Our study involves 200 major reservoirs in Texas, captured in 17,811 Landsat images over a 32-year period. The results show that these reservoirs contribute to an annual evaporative loss of 8.0 billion cubic meters, equivalent to 20% of their total active storage or 53% of total annual water use in Texas. At five coastal basins, reservoir evaporative losses exceed the minimum freshwater inflows required to sustain ecosystem health and fishery productivity of the receiving estuaries. Reservoir evaporative loss can be significant enough to counterbalance the positive effects of impounding water and to offset the contribution of water conservation and reuse practices. Our results also reveal the spatially variable performance of the multispectral water index and indicate the limitation of using scene-level cloud cover to screen satellite images. 
This study demonstrates the advantage of combining satellite remote sensing and cloud computing to support regional water resources assessment.
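    The area-extraction and volume-estimation steps described in this abstract can be sketched in a few lines. The NDWI formulation, the 0.2 threshold, the pan coefficient, and the toy reflectance values below are illustrative assumptions, not the paper's calibrated parameters:

    ```python
    import numpy as np

    def water_area_m2(green, nir, threshold=0.2, pixel_area_m2=30 * 30):
        """Classify water pixels with a normalized-difference water index,
        NDWI = (G - NIR) / (G + NIR), and return the water surface area.
        The 0.2 threshold is a placeholder; the study calibrates it locally."""
        ndwi = (green - nir) / (green + nir + 1e-12)  # avoid divide-by-zero
        return np.count_nonzero(ndwi > threshold) * pixel_area_m2

    def evaporative_loss_m3(area_m2, pan_evap_m, pan_coeff=0.7):
        """Monthly evaporated volume = water area x pan-derived evaporation
        depth. The 0.7 pan coefficient is illustrative, not the paper's."""
        return area_m2 * pan_evap_m * pan_coeff

    # Toy 3x3 scene: high-green/low-NIR pixels read as water (3 of them).
    green = np.array([[0.30, 0.30, 0.05], [0.30, 0.05, 0.05], [0.05, 0.05, 0.05]])
    nir = np.array([[0.05, 0.05, 0.30], [0.05, 0.30, 0.30], [0.30, 0.30, 0.30]])
    area = water_area_m2(green, nir)          # 3 water pixels x 900 m^2 each
    volume = evaporative_loss_m3(area, 0.15)  # 0.15 m of pan evaporation
    ```

    In the study itself this per-scene calculation is distributed across thousands of Landsat scenes by Google Earth Engine rather than run on a single machine.
    
    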

  4. A remote sensing method for estimating regional reservoir area and evaporative loss

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Hua; Gorelick, Steven M.; Zimba, Paul V.

    Evaporation from the water surface of a reservoir can significantly affect its function of ensuring the availability and temporal stability of water supply. Current estimations of reservoir evaporative loss are dependent on water area derived from a reservoir storage-area curve. Such curves are unavailable if the reservoir is located in a data-sparse region or questionable if long-term sedimentation has changed the original elevation-area relationship. In this paper, we propose a remote sensing framework to estimate reservoir evaporative loss at the regional scale. This framework uses a multispectral water index to extract reservoir area from Landsat imagery and estimate monthly evaporation volume based on pan-derived evaporative rates. The optimal index threshold is determined based on local observations and extended to unobserved locations and periods. Built on the cloud computing capacity of the Google Earth Engine, this framework can efficiently analyze satellite images at large spatiotemporal scales, where such analysis is infeasible with a single computer. Our study involves 200 major reservoirs in Texas, captured in 17,811 Landsat images over a 32-year period. The results show that these reservoirs contribute to an annual evaporative loss of 8.0 billion cubic meters, equivalent to 20% of their total active storage or 53% of total annual water use in Texas. At five coastal basins, reservoir evaporative losses exceed the minimum freshwater inflows required to sustain ecosystem health and fishery productivity of the receiving estuaries. Reservoir evaporative loss can be significant enough to counterbalance the positive effects of impounding water and to offset the contribution of water conservation and reuse practices. Our results also reveal the spatially variable performance of the multispectral water index and indicate the limitation of using scene-level cloud cover to screen satellite images. Finally, this study demonstrates the advantage of combining satellite remote sensing and cloud computing to support regional water resources assessment.

  5. Remote Sensing Center

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The applications are reported of new remote sensing techniques for earth resources surveys and environmental monitoring. Applications discussed include: vegetation systems, environmental monitoring, and plant protection. Data processing systems are described.

  6. Investigation into Cloud Computing for More Robust Automated Bulk Image Geoprocessing

    NASA Technical Reports Server (NTRS)

    Brown, Richard B.; Smoot, James C.; Underwood, Lauren; Armstrong, C. Duane

    2012-01-01

    Geospatial resource assessments frequently require timely geospatial data processing that involves large multivariate remote sensing data sets. In particular, for disasters, response requires rapid access to large data volumes, substantial storage space, and high performance processing capability. The processing and distribution of this data into usable information products requires a processing pipeline that can efficiently manage the required storage, computing utilities, and data handling requirements. In recent years, with the availability of cloud computing technology, cloud processing platforms have made available a powerful new computing infrastructure resource that can meet this need. To assess the utility of this resource, this project investigates cloud computing platforms for bulk, automated geoprocessing capabilities with respect to data handling and application development requirements. This presentation describes work conducted by the Applied Sciences Program Office at NASA Stennis Space Center. A prototypical set of image manipulation and transformation processes that incorporate sample Unmanned Airborne System data was developed to create value-added products and tested for implementation on the "cloud". This project outlines the steps involved in creating and testing open source process code on a local prototype platform, and then transitioning this code with associated environment requirements into an analogous, but memory- and processor-enhanced, cloud platform. A data processing cloud was used to store both standard digital camera panchromatic and multi-band image data, which were subsequently subjected to standard image processing functions such as NDVI (Normalized Difference Vegetation Index), NDMI (Normalized Difference Moisture Index), band stacking, reprojection, and other similar data processes. 
    Cloud infrastructure service providers were evaluated by taking these locally tested processing functions and applying them to a given cloud-enabled infrastructure to assess and compare environment setup options and enabled technologies. This project reviews findings that were observed when cloud platforms were evaluated for bulk geoprocessing capabilities based on data handling and application development requirements.
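    The NDVI and NDMI products named in the abstract are simple normalized band ratios, so the per-pixel arithmetic is easy to show. The band values below are made up for illustration; they are not data from the project:

    ```python
    import numpy as np

    def ndvi(nir, red):
        """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
        return (nir - red) / (nir + red + 1e-12)  # guard against zero sums

    def ndmi(nir, swir):
        """Normalized Difference Moisture Index: (NIR - SWIR) / (NIR + SWIR)."""
        return (nir - swir) / (nir + swir + 1e-12)

    # Two sample pixels with invented reflectances:
    nir = np.array([0.5, 0.4])
    red = np.array([0.1, 0.2])
    swir = np.array([0.2, 0.3])
    v = ndvi(nir, red)   # higher value = denser vegetation
    m = ndmi(nir, swir)  # higher value = more canopy moisture
    ```

    In a bulk geoprocessing pipeline of the kind described, these functions would be mapped over whole image arrays, which is why memory- and processor-enhanced cloud instances matter.
    
    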

  7. Help is in your pocket: the potential accuracy of smartphone- and laptop-based remotely guided resuscitative telesonography.

    PubMed

    McBeth, Paul; Crawford, Innes; Tiruta, Corina; Xiao, Zhengwen; Zhu, George Qiaohao; Shuster, Michael; Sewell, Les; Panebianco, Nova; Lautner, David; Nicolaou, Savvas; Ball, Chad G; Blaivas, Michael; Dente, Christopher J; Wyrzykowski, Amy D; Kirkpatrick, Andrew W

    2013-12-01

    Ultrasound (US) examination has many uses in resuscitation, but to use it to its full effectiveness typically requires a trained and proficient user. We sought to use information technology advances to remotely guide US-naive examiners (UNEs) using a portable battery-powered tele-US system mentored using either a smartphone or laptop computer. A cohort of UNEs (5 tactical emergency medicine technicians, 10 ski-patrollers, and 4 nurses) was guided to perform partial or complete Extended Focused Assessment with Sonography of Trauma (EFAST) examinations on both a healthy volunteer and on a US phantom, while being mentored by a remote examiner who viewed the US images over either an iPhone® (Apple, Cupertino, CA) or a laptop computer with an inlaid depiction of the US probe and the "patient," derived from a video camera mounted on the UNE's head. Examinations were recorded as still images and over-read from a Web site by seven expert reviewers (ERs) (three surgeons, two emergentologists, and two radiologists). Examination goals were to identify lung sliding (LS) documented by color power Doppler (CPD) in the human and to identify intraperitoneal (IP) fluid in the phantom. All UNEs were successfully mentored to easily and clearly identify both LS (19 determinations) and IP fluid (14 determinations), as assessed in real time by the remote mentor. ERs confirmed IP fluid in 95 of 98 determinations (97%), with 100% of ERs perceiving clinical utility for the abdominal Focused Assessment with Sonography of Trauma. Based on single still CPD images, 70% of ERs agreed on the presence or absence of LS. In 16 out of 19 cases, over 70% of the ERs felt the EFAST exam was clinically useful. UNEs can confidently be guided to obtain critical findings using simple information technology resources, based on the receiving/transmitting device found in most trauma surgeons' pockets or briefcases. Global US mentoring requires only Internet connectivity and initiative.

  8. Using NetMeeting for remote configuration of the Otto Bock C-Leg: technical considerations.

    PubMed

    Lemaire, E D; Fawcett, J A

    2002-08-01

    Telehealth has the potential to be a valuable tool for technical and clinical support of computer-controlled prosthetic devices. This pilot study examined the use of Internet-based, desktop video conferencing for remote configuration of the Otto Bock C-Leg. Laboratory tests involved connecting two computers running Microsoft NetMeeting over a local area network (IP protocol). Over 56 kb/s, DSL/cable, and 10 Mb/s LAN speeds, a prosthetist remotely configured a user's C-Leg by using Application Sharing, Live Video, and Live Audio. A similar test between sites in Ottawa and Toronto, Canada was limited by the notebook computer's 28 kb/s modem. At the 28 kb/s Internet-connection speed, NetMeeting's application sharing feature was not able to update the remote Sliders window fast enough to display peak toe loads and peak knee angles. These results support the use of NetMeeting as an accessible and cost-effective tool for remote C-Leg configuration, provided that sufficient Internet data transfer speed is available.

  9. Application of remote sensing technology to land evaluation, planning utilization of land resources, and assessment of westland habitat in eastern South Dakota, parts 1 and 2

    NASA Technical Reports Server (NTRS)

    Myers, V. I. (Principal Investigator); Cox, T. L.; Best, R. G.

    1976-01-01

    The author has identified the following significant results. LANDSAT fulfilled the requirements for general soils and land use information. RB-57 imagery was required to provide the information and detail needed for mapping soils for land evaluation. Soils maps for land evaluation were provided on clear mylar at the scale of the county highway map to aid users in locating mapping units. Resulting mapped data were computer processed to provide a series of interpretive maps (land value, limitations to development, etc.) and area summaries for the users.

  10. ASTEP user's guide and software documentation

    NASA Technical Reports Server (NTRS)

    Gliniewicz, A. S.; Lachowski, H. M.; Pace, W. H., Jr.; Salvato, P., Jr.

    1974-01-01

    The Algorithm Simulation Test and Evaluation Program (ASTEP) is a modular computer program developed for the purpose of testing and evaluating methods of processing remotely sensed multispectral scanner earth resources data. ASTEP is written in FORTRAN V on the UNIVAC 1110 under the EXEC 8 operating system and may be operated in either a batch or interactive mode. The program currently contains over one hundred subroutines consisting of data classification and display algorithms, statistical analysis algorithms, utility support routines, and feature selection capability. The current program can accept data in LARSC1, LARSC2, ERTS, and Universal formats, and can output processed image or data tapes in Universal format.

  11. Information sciences experiment system

    NASA Technical Reports Server (NTRS)

    Katzberg, Stephen J.; Murray, Nicholas D.; Benz, Harry F.; Bowker, David E.; Hendricks, Herbert D.

    1990-01-01

    The rapid expansion of remote sensing capability over the last two decades will take another major leap forward with the advent of the Earth Observing System (Eos). An approach is presented that will permit experiments and demonstrations in onboard information extraction. The approach is a non-intrusive, eavesdropping mode in which a small amount of spacecraft real estate is allocated to an onboard computation resource. How such an approach allows the evaluation of advanced technology in the space environment, advanced techniques in information extraction for both Earth science and information science studies, direct to user data products, and real-time response to events, all without affecting other on-board instrumentation is discussed.

  12. Lab4CE: A Remote Laboratory for Computer Education

    ERIC Educational Resources Information Center

    Broisin, Julien; Venant, Rémi; Vidal, Philippe

    2017-01-01

    Remote practical activities have been demonstrated to be efficient when learners come to acquire inquiry skills. In computer science education, virtualization technologies are gaining popularity as this technological advance enables instructors to implement realistic practical learning activities, and learners to engage in authentic and…

  13. Remote Sensing Assessment of Lunar Resources: We Know Where to Go to Find What We Need

    NASA Technical Reports Server (NTRS)

    Gillis, J. J.; Taylor, G. J.; Lucey, P. G.

    2004-01-01

    The utilization of space resources is necessary not only to foster the growth of human activities in space, but is essential to the President's vision of a "sustained and affordable human and robotic program to explore the solar system and beyond." The distribution of resources will shape planning for permanent settlements by affecting decisions about where to locate a settlement. Mapping the location of such resources, however, is not the limiting factor in selecting a site for a lunar base. It is indecision about which resources to use that leaves the location uncertain. A wealth of remotely sensed data exists that can be used to identify targets for future detailed exploration. Thus, the future of space resource utilization predominantly rests upon developing a strategy for resource exploration and efficient methods of extraction.

  14. Eastern Regional Remote Sensing Applications Conference

    NASA Technical Reports Server (NTRS)

    Short, N. M. (Editor)

    1981-01-01

    The roles and activities of NASA and the National Conference of State Legislatures in fostering remote sensing technology utilization by the states and in promoting interstate communication and cooperation are reviewed. The reduction and interpretation of LANDSAT MSS and aerial reconnaissance data for resources management and environment assessment are described as well as resource information systems, and the value of SEASAT synthetic aperture radar and LANDSAT 4 data.

  15. Environmental application of remote sensing methods to coastal zone land use and marine resource management, appendices G to J. [in southeastern Virginia

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Important data were compiled for use with the Richmond-Cape Henry Environmental Laboratory (RICHEL) remote sensing project in coastal zone land use and marine resources management, and include analyses and projections of population characteristics, formulation of soil loss prediction techniques, and sources and quantity analyses of air and water effluents.

  16. Architecture for distributed design and fabrication

    NASA Astrophysics Data System (ADS)

    McIlrath, Michael B.; Boning, Duane S.; Troxel, Donald E.

    1997-01-01

    We describe a flexible, distributed system architecture capable of supporting collaborative design and fabrication of semi-conductor devices and integrated circuits. Such capabilities are of particular importance in the development of new technologies, where both equipment and expertise are limited. Distributed fabrication enables direct, remote, physical experimentation in the development of leading edge technology, where the necessary manufacturing resources are new, expensive, and scarce. Computational resources, software, processing equipment, and people may all be widely distributed; their effective integration is essential in order to achieve the realization of new technologies for specific product requirements. Our architecture leverages current vendor and consortia developments to define software interfaces and infrastructure based on existing and emerging networking, CIM, and CAD standards. Process engineers and product designers access processing and simulation results through a common interface and collaborate across the distributed manufacturing environment.

  17. Application of computer image enhancement techniques to shuttle hand-held photography

    NASA Technical Reports Server (NTRS)

    David, B. E.

    1986-01-01

    With the advent of frequent Space Transportation System Shuttle missions, photography from hyperaltitudes stands to become an accessible and convenient resource for scientists and environmental managers. As satellite products (such as LANDSAT) continue to spiral in costs, all but the most affluent consumer is finding Earth imagery from space to be more and more unavailable. Therefore, the potential for Shuttle photography to serve a wide variety of users is increasing. However, despite the popularity of photos from space as public relations tools and report illustrations, little work has been performed to prove their scientific worth beyond that as basic mapping bases. It is the hypothesis of this project that hand-held Earth photography from the Space Shuttle has potentially high scientific merit and that primary data can be extracted. In effect, Shuttle photography should be considered a major remote sensing information resource.

  18. The ten-ecosystem study investigation plan

    NASA Technical Reports Server (NTRS)

    Kan, E. P.

    1976-01-01

    With the continental United States divided into ten forest and grassland ecosystems, the Ten Ecosystem Study (TES) is designed to investigate the feasibility and applicability of state-of-the-art automatic data processing remote sensing technology to inventory forest, grassland, and water resources by using Land Satellite data. The study will serve as a prelude to a possible future nationwide remote sensing application to inventory forest and rangeland renewable resources. This plan describes project design and phases, the ten ecosystems, data utilization and output, personnel organization, resource requirements, and schedules and milestones.

  19. Thermal infrared remote sensing of surface features for renewable resource applications

    NASA Technical Reports Server (NTRS)

    Welker, J. E.

    1981-01-01

    The subject of thermal infrared remote sensing of surface features for renewable resource applications is reviewed with respect to the basic physical concepts involved at the Earth's surface and up through the atmosphere, as well as the historical development of satellite systems which produce such data at increasingly greater spatial resolution. With this general background in hand, the growth of a variety of specific renewable resource applications using the developing thermal infrared technology is discussed, including data from HCMM investigators. Recommendations are made for continued growth in this field of applications.

  20. Overview of the AgRISTARS research program. I. [AGgriculture and Resources Inventory Surveys Through Aerospace Remote Sensing

    NASA Technical Reports Server (NTRS)

    Caudill, C. E.; Hatch, R. E.

    1985-01-01

    An account is given of the activities and accomplishments to date of the U.S. Department of Agriculture's Agriculture and Resources Inventory Surveys Through Aerospace Remote Sensing (AgRISTARS) program, which is a cooperative venture with NASA and the Departments of the Interior and of Commerce. AgRISTARS research activities encompass early warning and crop condition assessment, inventory technology development for production forecasting, crop yield model development, soil moisture monitoring, domestic crops and land cover sensing, renewable resources inventory, and conservation and pollution assessment.

  1. The Earth Resources Observation Systems data center's training technical assistance, and applications research activities

    USGS Publications Warehouse

    Sturdevant, J.A.

    1981-01-01

    The Earth Resources Observation Systems (EROS) Data Center (EDC), administered by the U.S. Geological Survey, U.S. Department of the Interior, provides remotely sensed data to the user community and offers a variety of professional services to further the understanding and use of remote sensing technology. EDC reproduces and sells photographic and electronic copies of satellite images of areas throughout the world. Other products include aerial photographs collected by 16 organizations, including the U.S. Geological Survey and the National Aeronautics and Space Administration. Primary users of the remotely sensed data are Federal, State, and municipal government agencies, universities, foreign nations, and private industries. The professional services available at EDC are primarily directed at integrating satellite and aircraft remote sensing technology into the programs of the Department of the Interior and its cooperators. This is accomplished through formal training workshops, user assistance, cooperative demonstration projects, and access to equipment and capabilities in an advanced data analysis laboratory. In addition, other Federal agencies, State and local governments, universities, and the general public can get assistance from the EDC staff. Since 1973, EDC has contributed to the accelerating growth in development and operational use of remotely sensed data for land resource problems through its role as educator and by conducting basic and applied remote sensing applications research. As remote sensing technology continues to evolve, EDC will continue to respond to the increasing demand for timely information on remote sensing applications. Questions most often asked about EDC's research and training programs include: Who may attend an EDC remote sensing training course? Specifically, what is taught? Who may cooperate with EDC on remote sensing projects? Are interpretation services provided on a service basis? 
    This report attempts to define the goals and objectives of, and policies on, the following EDC services: the Training Program, User Assistance, the Data Analysis Laboratory, Cooperative Demonstration Projects, and Research Projects.

  2. An introduction to quantitative remote sensing. [data processing

    NASA Technical Reports Server (NTRS)

    Lindenlaub, J. C.; Russell, J.

    1974-01-01

    The quantitative approach to remote sensing is discussed along with the analysis of remote sensing data. Emphasis is placed on the application of pattern recognition in numerically oriented remote sensing systems. A common background and orientation for users of the LARS computer software system is provided.

  3. Telescience Resource Kit Software Capabilities and Future Enhancements

    NASA Technical Reports Server (NTRS)

    Schneider, Michelle

    2004-01-01

    The Telescience Resource Kit (TReK) is a suite of PC-based software applications that can be used to monitor and control a payload on board the International Space Station (ISS). This software provides a way for payload users to operate their payloads from their home sites. It can be used by an individual or a team of people. TReK provides both local ground support system services and an interface to utilize remote services provided by the Payload Operations Integration Center (POIC). For example, TReK can be used to receive payload data distributed by the POIC and to perform local data functions such as processing the data, storing it in local files, and forwarding it to other computer systems. TReK can also be used to build, send, and track payload commands. In addition to these features, work is in progress to add a new command management capability. This capability will provide a way to manage a multi-platform command environment that can include geographically distributed computers. This is intended to help teams that need to manage a shared on-board resource such as a facility-class payload. The environment can be configured such that one individual can manage all the command activities associated with that payload. This paper will provide a summary of existing TReK capabilities and a description of the new command management capability.

  4. Conference of Remote Sensing Educators (CORSE-78)

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Ways of improving the teaching of remote sensing students at colleges and universities are discussed. Formal papers and workshops on various Earth resources disciplines, image interpretation, and data processing concepts are presented. An inventory of existing remote sensing and related subject courses being given in western regional universities is included.

  5. Application of remote sensing to solution of ecological problems

    NASA Technical Reports Server (NTRS)

    Adelman, A.

    1972-01-01

    The application of remote sensing techniques to solving ecological problems is discussed. The three phases of environmental ecological management are examined. The differences between discovery and exploitation of natural resources and their ecological management are described. The specific application of remote sensing to water management is developed.

  6. Artificial groundwater recharge zones mapping using remote sensing and GIS: a case study in Indian Punjab.

    PubMed

    Singh, Amanpreet; Panda, S N; Kumar, K S; Sharma, Chandra Shekhar

    2013-07-01

    Artificial groundwater recharge plays a vital role in sustainable management of groundwater resources. The present study was carried out to identify the artificial groundwater recharge zones in the Bist Doab basin of Indian Punjab using remote sensing and geographical information system (GIS) for augmenting groundwater resources. The study area has been facing severe water scarcity due to intensive agriculture for the past few years. The thematic layers considered in the present study are: geomorphology (2004), geology (2004), land use/land cover (2008), drainage density, slope, soil texture (2000), aquifer transmissivity, and specific yield. Different themes and related features were assigned proper weights based on their relative contribution to groundwater recharge. Normalized weights were computed using Saaty's analytic hierarchy process. Thematic layers were integrated in ArcGIS for delineation of artificial groundwater recharge zones. The recharge map thus obtained was divided into four zones (poor, moderate, good, and very good) based on their influence on groundwater recharge. Results indicate that 15, 18, 37, and 30 % of the study area falls under "poor," "moderate," "good," and "very good" groundwater recharge zones, respectively. The highest recharge potential area is located towards the western and parts of the middle region because of high infiltration rates caused by the distribution of flood plains, alluvial plain, and agricultural land. The least effective recharge potential is in the eastern and middle parts of the study area due to low infiltration rate. The results of the study can be used to formulate an efficient groundwater management plan for sustainable utilization of limited groundwater resources.
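    The Saaty analytic hierarchy process mentioned in this abstract derives normalized weights from a reciprocal pairwise comparison matrix via its principal eigenvector. The 3x3 matrix below (comparing, say, geomorphology, land use, and slope) uses invented comparison values, not those of the study:

    ```python
    import numpy as np

    # Hypothetical reciprocal pairwise comparison matrix on Saaty's 1-9 scale;
    # entry A[i][j] says how much more theme i contributes than theme j.
    A = np.array([
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 3.0],
        [1 / 5, 1 / 3, 1.0],
    ])

    # Principal-eigenvector method: weights = dominant eigenvector, normalized.
    vals, vecs = np.linalg.eig(A)
    k = np.argmax(np.real(vals))
    w = np.real(vecs[:, k])
    weights = w / w.sum()  # sums to 1; sign of eigenvector cancels out

    # Consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI,
    # with random index RI = 0.58 for n = 3.
    lam_max = np.real(vals[k])
    ci = (lam_max - 3) / (3 - 1)
    cr = ci / 0.58  # CR < 0.10 means the judgments are acceptably consistent
    ```

    Weighted thematic layers would then be overlaid (here, in ArcGIS) and the summed score classified into the four recharge zones.
    
    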

  7. GIS, remote sensing and spatial modeling for conservation of stone forest landscape in Lunan, China

    NASA Astrophysics Data System (ADS)

    Zhang, Chuanrong

    The Lunan Stone Forest is the World's premier pinnacle karst landscape, with considerable scientific and cultural importance. Because of its inherent ecological fragility and ongoing human disruption, especially recently burgeoning tourism development, the landscape is stressed and is in danger of being destroyed. Conservation policies have been implemented by the local and national governments, but many problems remain in the national park. For example, there is no accurate detailed map and no computer system to help authorities manage the natural resources. By integrating GIS, remote sensing and spatial modeling this dissertation investigates the issue of landscape conservation and develops some methodologies to assist in management of the natural resources in the national park. Four elements are involved: (1) To help decision-makers and residents understand the scope of resource exploitation and develop appropriate protective strategies, the dissertation documents how the landscape has been changed by human activities over the past 3 decades; (2) To help authorities scientifically designate different levels of protection in the park and to let the public actively participate in conservation decision making, a web-based Spatial Decision Support System for the conservation of the landscape was developed; (3) To make data sharing and integration easy in the future, a GML-based interoperable database for the park was implemented; and (4) To acquire more information and provide the uncertainty information to landscape conservation decision-makers, spatial land use patterns were modeled and the distributional uncertainty of land cover categories was assessed using a triplex Markov chain (TMC) model approach.

  8. Feasibility study of the application of existing techniques to remotely monitor hydrochloric acid in the atmosphere

    NASA Technical Reports Server (NTRS)

    Zwick, H.; Ward, V.; Beaudette, L.

    1973-01-01

    A critical evaluation of existing optical remote sensors for HCl vapor detection in solid propellant rocket plumes is presented. The P branch of the fundamental vibration-rotation band was selected as the most promising spectral feature to sense. A computation of transmittance for HCl vapor, an estimation of interferent spectra, the application of these spectra to computer modelled remote sensors, and a trade-off study for instrument recommendation are also included.
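    A transmittance computation of the kind this report performs for HCl vapor follows the Beer-Lambert law. The absorption coefficient, concentration, and path length below are invented placeholders, not the report's values:

    ```python
    import math

    def transmittance(absorption_coeff, concentration, path_length):
        """Beer-Lambert law: T = exp(-k * c * L).
        Here k is in (ppm*m)^-1, c in ppm, and L in m; the units and
        magnitudes are illustrative only."""
        return math.exp(-absorption_coeff * concentration * path_length)

    # Hypothetical HCl plume: k = 1e-4 (ppm*m)^-1, 50 ppm over a 100 m path.
    t = transmittance(1e-4, 50.0, 100.0)  # = exp(-0.5), about 61% transmitted
    ```

    A real line-by-line model would sum such terms over the P-branch rotational lines and fold in interferent spectra, as the study describes.
    
    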

  9. Design of Remote GPRS-based Gas Data Monitoring System

    NASA Astrophysics Data System (ADS)

    Yan, Xiyue; Yang, Jianhua; Lu, Wei

    2018-01-01

    In order to solve the problem of remote data transmission from gas flowmeters, and to realize unattended operation on site, an unattended remote monitoring system for gas data based on GPRS is designed in this paper. The slave computer of this system adopts an embedded microprocessor to read data from the gas flowmeter over an RS-232 bus and transfers it to the host computer through a DTU. On the host computer, a VB program dynamically binds the Winsock control to receive and parse the data. By using dynamic data exchange, the Kingview configuration software realizes history trend curves, real-time trend curves, alarms, printing, web browsing, and other functions.
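    The host-side receive-and-parse step (done in the paper with VB and the Winsock control) amounts to accepting a TCP connection from the DTU and decoding each frame. The frame format below ('id,flow,total' ASCII lines) and the function names are assumptions for illustration; the paper does not specify its wire protocol:

    ```python
    import socket

    def parse_frame(line: bytes):
        """Decode one hypothetical 'id,flow,total' ASCII frame from the DTU."""
        meter_id, flow, total = line.decode("ascii").strip().split(",")
        return {"id": meter_id, "flow_m3h": float(flow), "total_m3": float(total)}

    def serve(host="0.0.0.0", port=5000):
        """Accept one DTU connection and yield parsed records line by line."""
        with socket.create_server((host, port)) as srv:
            conn, _ = srv.accept()
            with conn, conn.makefile("rb") as f:
                for line in f:
                    yield parse_frame(line)
    ```

    Each parsed record would then feed the trend curves and alarm thresholds in the configuration software.
    
    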

  10. General-Purpose Serial Interface For Remote Control

    NASA Technical Reports Server (NTRS)

    Busquets, Anthony M.; Gupton, Lawrence E.

    1990-01-01

    Computer controls remote television camera. General-purpose controller developed to serve as interface between host computer and pan/tilt/zoom/focus functions on series of automated video cameras. Interface port based on 8251 programmable communications-interface circuit configured for tristated outputs, and connects controller system to any host computer with RS-232 input/output (I/O) port. Accepts byte-coded data from host, compares them with prestored codes in read-only memory (ROM), and closes or opens appropriate switches. Six output ports control opening and closing of as many as 48 switches. Operator controls remote television camera by speaking commands, in system including general-purpose controller.
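The compare-and-switch logic of the controller can be sketched as follows. The command codes and switch assignments are invented for illustration, not taken from the ROM tables of the actual instrument:

```python
# Hypothetical command table mirroring the ROM lookup: each accepted byte
# maps to (switch_index, close?). Codes and assignments are illustrative.
COMMAND_CODES = {
    0x41: (0, True),   # close switch 0 (e.g. pan left)
    0x42: (0, False),  # open switch 0
    0x43: (7, True),   # close switch 7 (e.g. zoom in)
}

def run_controller(byte_stream, num_switches=48):
    """Return the final open/closed state of all switches after the stream.

    Each incoming byte is compared against the prestored codes; a match
    closes or opens the addressed switch, and unknown bytes are ignored,
    as with a failed ROM compare in the hardware controller.
    """
    switches = [False] * num_switches  # False = open, True = closed
    for code in byte_stream:
        action = COMMAND_CODES.get(code)
        if action is None:
            continue
        index, close = action
        switches[index] = close
    return switches
```

The hardware version does the same lookup with an 8251 receiving bytes over RS-232 and ROM holding the code table.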

  11. Estimating costs and performance of systems for machine processing of remotely sensed data

    NASA Technical Reports Server (NTRS)

    Ballard, R. J.; Eastwood, L. F., Jr.

    1977-01-01

    This paper outlines a method for estimating computer processing times and costs incurred in producing information products from digital remotely sensed data. The method accounts for both computation and overhead, and may be applied to any serial computer. The method is applied to estimate the cost and computer time involved in producing Level II Land Use and Vegetative Cover Maps for a five-state midwestern region. The results show that the amount of data to be processed overloads some example computer systems, but that the processing is feasible on others.
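A back-of-the-envelope version of such an estimate, assuming a simple serial machine model; all parameter values are illustrative stand-ins, not the paper's actual figures:

```python
def processing_estimate(pixels, ops_per_pixel, mips,
                        overhead_factor=1.5, cpu_cost_per_hour=100.0):
    """Estimate CPU time (hours) and cost for processing one scene.

    pixels          -- number of pixels in the scene
    ops_per_pixel   -- instructions executed per pixel by the classifier
    mips            -- machine throughput, millions of instructions/second
    overhead_factor -- multiplier lumping I/O and system overhead on top
                       of pure computation (illustrative default)
    """
    seconds = pixels * ops_per_pixel / (mips * 1e6) * overhead_factor
    hours = seconds / 3600.0
    return hours, hours * cpu_cost_per_hour
```

Comparing the estimated hours for a region-sized data set against a system's available time is what reveals whether the load overwhelms a given computer, as the paper found for some example systems.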

  12. Intelligent control system based on ARM for lithography tool

    NASA Astrophysics Data System (ADS)

    Chen, Changlong; Tang, Xiaoping; Hu, Song; Wang, Nan

    2014-08-01

    The control system of a traditional lithography tool is based on a PC and an MCU. The PC handles the complex algorithms and human-computer interaction and communicates with the MCU via a serial port; the MCU controls motors, electromagnetic valves, etc. This mode has shortcomings such as large volume, high power consumption, and wasted PC resources. In this paper, an embedded intelligent control system for a lithography tool, based on ARM, is presented. The control system uses an S5PV210 as processor, taking over the functions of the PC in a traditional lithography tool, and provides good human-computer interaction through an LCD and a capacitive touch screen. Using Android 4.0.3 as the operating system, the equipment provides an attractive and easy UI which makes control more user-friendly, and implements remote control and debugging, pushing video information about the product by network programming. As a result, it is convenient for the equipment vendor to provide technical support for users. Finally, compared with a traditional lithography tool, this design removes the PC, using the hardware resources efficiently and reducing cost and volume. Introducing an embedded OS and Internet-of-Things concepts into the design of lithography tools may be a development trend.

  13. Remote Sensing for Hazard Mitigation and Resource Protection in Pacific Latin America: New NSF sponsored initiative at Michigan Tech.

    NASA Astrophysics Data System (ADS)

    Rose, W. I.; Bluth, G. J.; Gierke, J. S.; Gross, E.

    2005-12-01

    Though much of the developing world has the potential to gain significantly from remote sensing techniques in terms of public health and safety and, eventually, economic development, it lacks the resources required to advance the development and practice of remote sensing. Both developed and developing countries share a mutual interest in furthering remote sensing capabilities for natural hazard mitigation and resource development, and this common commitment creates a solid foundation upon which to build an integrated education and research project. This will prepare students for careers in science and engineering through their efforts to solve a suite of problems needing creative solutions: collaboration with foreign agencies; living abroad immersed in different cultures; and adapting their academic training to contend with potentially difficult field conditions and limited resources. This project makes two important advances: (1) We intend to develop the first formal linkage among geoscience agencies from four Pacific Latin American countries (Guatemala, El Salvador, Nicaragua and Ecuador), focusing on the collaborative development of remote sensing tools for hazard mitigation and water resource development; (2) We will build a new educational system of applied research and engineering, using two existing educational programs at Michigan Tech: a new Peace Corps/Master's International (PC/MI) program in Natural Hazards which features a 2-year field assignment, and an "Enterprise" program for undergraduates, which gives teams of geoengineering students the opportunity to work for three years in a business-like setting to solve real-world problems. This project will involve 1-2 post-doctoral researchers, 3 Ph.D., 9 PC/MI, and roughly 20 undergraduate students each year.

  14. Commodity Cluster Computing for Remote Sensing Applications using Red Hat LINUX

    NASA Technical Reports Server (NTRS)

    Dorband, John

    2003-01-01

    Since 1994, we have been doing research at Goddard Space Flight Center on implementing a wide variety of applications on commodity-based computing clusters. This talk is about these clusters and how they are used in these applications, including ones for remote sensing.

  15. RADIAL COMPUTED TOMOGRAPHY OF AIR CONTAMINANTS USING OPTICAL REMOTE SENSING

    EPA Science Inventory

    The paper describes the application of an optical remote-sensing (ORS) system to map air contaminants and locate fugitive emissions. Many ORS systems may utilize radial non-overlapping beam geometry and a computed tomography (CT) algorithm to map the concentrations in a plane. In...

  16. Operational Remote Sensing Services in North Eastern Region of India for Natural Resources Management, Early Warning for Disaster Risk Reduction and Dissemination of Information and Services

    NASA Astrophysics Data System (ADS)

    Raju, P. L. N.; Sarma, K. K.; Barman, D.; Handique, B. K.; Chutia, D.; Kundu, S. S.; Das, R. Kr.; Chakraborty, K.; Das, R.; Goswami, J.; Das, P.; Devi, H. S.; Nongkynrih, J. M.; Bhusan, K.; Singh, M. S.; Singh, P. S.; Saikhom, V.; Goswami, C.; Pebam, R.; Borgohain, A.; Gogoi, R. B.; Singh, N. R.; Bharali, A.; Sarma, D.; Lyngdoh, R. B.; Mandal, P. P.; Chabukdhara, M.

    2016-06-01

    The North Eastern Region (NER) of India, comprising eight states, is one of the most unique and most challenging regions to govern due to its physiographic conditions, rich biodiversity, proneness to disasters, and diverse socio-economic characteristics. Operational remote sensing services in the region increased manifold with the establishment of the North Eastern Space Applications Centre (NESAC) in 2000. Since its inception, NESAC has been providing remote sensing services for inventory generation, planning and developmental activities, and management of natural resources and disasters, and has been disseminating information and services through geo-web services for NER. The operational remote sensing services provided by NESAC can be broadly divided into three categories: natural resource planning and developmental services; disaster risk reduction and early warning services; and information dissemination through geo-portal services. As part of natural resources planning and developmental services, NESAC supports the state forest departments in preparing forest working plans by providing geospatial inputs covering the entire NER, identifying suitable culturable wastelands for cultivation of silkworm food plants, and mapping natural resources such as land use/land cover, wastelands, and land degradation on a temporal basis. In the area of disaster risk reduction, NESAC has initiated operational services providing early warning and post-disaster assessment inputs: a flood early warning system (FLEWS) using satellite remote sensing, numerical weather prediction, and hydrological modeling; a forest fire alert system with actionable attribute information; a Japanese Encephalitis Early Warning System (JEWS) based on mosquito vector abundance, pig population, and historical disease intensity; and agricultural drought monitoring for the region. 
The large volumes of geo-spatial databases generated as part of the operational services are made available to administrators and local government bodies for better management, prospective planning, and sustainable use of available resources. Knowledge dissemination is done through online web portals where internet access is available, as well as through offline space-based information kiosks where internet access is absent or bandwidth is limited. This paper presents a systematic and comprehensive study of the remote sensing services operational in NER of India for natural resources management, disaster risk reduction, and dissemination of information and services, and outlines future areas and directions for space applications in the region.

  17. The role of NASA's Water Resources applications area in improving access to water quality-related information and water resources management

    NASA Astrophysics Data System (ADS)

    Lee, C. M.

    2016-02-01

    The NASA Applied Sciences Program plays a unique role in facilitating access to remote sensing-based water information derived from US federal assets towards the goal of improving science and evidence-based decision-making in water resources management. The Water Resources Application Area within NASA Applied Sciences works specifically to develop and improve water data products to support improved management of water resources, with partners who are faced with real-world constraints and conditions including cost and regulatory standards. This poster will highlight the efforts and collaborations enabled by this program that have resulted in integration of remote sensing-based information for water quality modeling and monitoring within an operational context.

  18. The role of NASA's Water Resources applications area in improving access to water quality-related information and water resources management

    NASA Astrophysics Data System (ADS)

    Lee, C. M.

    2016-12-01

    The NASA Applied Sciences Program plays a unique role in facilitating access to remote sensing-based water information derived from US federal assets towards the goal of improving science and evidence-based decision-making in water resources management. The Water Resources Application Area within NASA Applied Sciences works specifically to develop and improve water data products to support improved management of water resources, with partners who are faced with real-world constraints and conditions including cost and regulatory standards. This poster will highlight the efforts and collaborations enabled by this program that have resulted in integration of remote sensing-based information for water quality modeling and monitoring within an operational context.

  19. Remote sensing in Michigan for land resource management

    NASA Technical Reports Server (NTRS)

    Lowe, D. S.; Istvan, L. B.; Roller, N. E. G.; Prentice, V. L.

    1976-01-01

    The Environmental Research Institute of Michigan is conducting a program whose goal is the large-scale adoption, by both public agencies and private interests in Michigan, of NASA earth-resource survey technology as an important aid in the solution of current problems in resource management and environmental protection. During the period from June 1975 to June 1976, remote sensing techniques to aid Michigan government agencies were used to achieve the following major results: (1) supply justification for public acquisition of land to establish the St. John's Marshland Recreation Area; (2) recommend economical and effective methods for performing a statewide wetlands survey; (3) assist in the enforcement of state laws relating to sand and gravel mining, soil erosion and sedimentation, and shorelands protection; (4) accomplish a variety of regional resource management actions in the East Central Michigan Planning and Development Region. Other tasks on which remote sensing technology was used include industrial and school site selection, ice detachment in the Soo Harbor, grave detection, and data presentation for wastewater management programs.

  20. Mississippi Sound remote sensing study. [NASA Earth Resources Laboratory seasonal experiments

    NASA Technical Reports Server (NTRS)

    Atwell, B. H.; Thomann, G. C.

    1973-01-01

    A study of the Mississippi Sound was initiated in early 1971 by personnel of NASA Earth Resources Laboratory. Four separate seasonal experiments consisting of quasi-synoptic remote and surface measurements over the entire area were planned. Approximately 80 stations distributed throughout Mississippi Sound were occupied. Surface water temperature and secchi extinction depth were measured at each station and water samples were collected for water quality analyses. The surface distribution of three water parameters of interest from a remote sensing standpoint - temperature, salinity and chlorophyll content - are displayed in map form. Areal variations in these parameters are related to tides and winds. A brief discussion of the general problem of radiative measurements of water temperature is followed by a comparison of remotely measured temperatures (PRT-5) to surface vessel measurements.

  1. Build It: Will They Come?

    NASA Astrophysics Data System (ADS)

    Corrie, Brian; Zimmerman, Todd

    Scientific research is fundamentally collaborative in nature, and many of today's complex scientific problems require domain expertise in a wide range of disciplines. In order to create research groups that can effectively explore such problems, research collaborations are often formed that involve colleagues at many institutions, sometimes spanning a country and often spanning the world. An increasingly common manifestation of such a collaboration is the collaboratory (Bos et al., 2007), a “…center without walls in which the nation's researchers can perform research without regard to geographical location — interacting with colleagues, accessing instrumentation, sharing data and computational resources, and accessing information from digital libraries.” In order to bring groups together on such a scale, a wide range of components need to be available to researchers, including distributed computer systems, remote instrumentation, data storage, collaboration tools, and the financial and human resources to operate and run such a system (National Research Council, 1993). Media Spaces, as both a technology and a social facilitator, have the potential to meet many of these needs. In this chapter, we focus on the use of scientific media spaces (SMS) as a tool for supporting collaboration in scientific research. In particular, we discuss the design, deployment, and use of a set of SMS environments deployed by WestGrid and one of its collaborating organizations, the Centre for Interdisciplinary Research in the Mathematical and Computational Sciences (IRMACS) over a 5-year period.

  2. Lay Theories Regarding Computer-Mediated Communication in Remote Collaboration

    ERIC Educational Resources Information Center

    Parke, Karl; Marsden, Nicola; Connolly, Cornelia

    2017-01-01

    Computer-mediated communication and remote collaboration have become an unexceptional norm as an educational modality for distance and open education; it is therefore necessary to research and analyze students' online learning experience. This paper seeks to examine the assumptions and expectations held by students in regard to…

  3. SIMULATION STUDY FOR GASEOUS FLUXES FROM AN AREA SOURCE USING COMPUTED TOMOGRAPHY AND OPTICAL REMOTE SENSING

    EPA Science Inventory

    The paper presents a new approach to quantifying emissions from fugitive gaseous air pollution sources. Computed tomography (CT) and path-integrated optical remote sensing (PI-ORS) concentration data are combined in a new field beam geometry. Path-integrated concentrations are ...

  4. Automated Content Synthesis for Interactive Remote Instruction.

    ERIC Educational Resources Information Center

    Maly, K.; Overstreet, C. M.; Gonzalez, A.; Denbar, M. L.; Cutaran, R.; Karunaratne, N.

    This paper describes IRI (Interactive Remote Instruction), a computer-based system built at Old Dominion University (Virginia) in order to support distance education. The system is based on the concept of a virtual classroom where students at different locations have the same synchronous class experience, using networked computers to communicate…

  5. Techtalk: Telecommunications for Improving Developmental Education.

    ERIC Educational Resources Information Center

    Caverly, David C.; Broderick, Bill

    1993-01-01

    Explains how to access the Internet, discussing hardware and software considerations, connectivity, and types of access available to users. Describes the uses of electronic mail; TELNET, a method for remotely logging onto another computer; and anonymous File Transfer Protocol (FTP), a method for downloading files from a remote computer. (MAB)

  6. Remote Viewing and Computer Communications--An Experiment.

    ERIC Educational Resources Information Center

    Vallee, Jacques

    1988-01-01

    A series of remote viewing experiments were run with 12 participants who communicated through a computer conferencing network. The correct target sample was identified in 8 out of 33 cases. This represented more than double the pure chance expectation. Appendices present protocol, instructions, and results of the experiments. (Author/YP)

  7. Comparison of approaches for mobile document image analysis using server supported smartphones

    NASA Astrophysics Data System (ADS)

    Ozarslan, Suleyman; Eren, P. Erhan

    2014-03-01

    With the recent advances in mobile technologies, new capabilities are emerging, such as mobile document image analysis. However, mobile phones are still less powerful than servers, and they have some resource limitations. One approach to overcoming these limitations is performing the resource-intensive processes of the application on remote servers. In mobile document image analysis, the most resource-consuming process is Optical Character Recognition (OCR), which is used to extract text from images captured by the mobile phone. In this study, our goal is to compare the in-phone and remote-server processing approaches for mobile document image analysis in order to explore their trade-offs. In the in-phone approach, all processes required for mobile document image analysis run on the mobile phone. In the remote-server approach, the core OCR process runs on the remote server and the other processes run on the mobile phone. Results of the experiments show that the remote-server approach is considerably faster than the in-phone approach in terms of OCR time, but adds extra delays such as network delay. Since compression and downscaling of images significantly reduce file sizes and extra delays, the remote-server approach outperforms the in-phone approach overall on the selected speed and correct-recognition metrics if the gain in OCR time compensates for the extra delays. According to the results of the experiments, using the most preferable settings, the remote-server approach performs better than the in-phone approach in terms of speed and acceptable correct-recognition metrics.
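The trade-off described above reduces to comparing total elapsed times. A minimal sketch, with all timing parameters as stand-ins for measured values rather than the study's actual data:

```python
def choose_approach(in_phone_ocr_s, remote_ocr_s, image_bytes,
                    uplink_bytes_per_s, fixed_delay_s=0.0):
    """Pick the faster approach for one document image.

    Remote processing wins only if the OCR speedup outweighs the extra
    delays: upload time (image size / uplink rate) plus any fixed
    network latency. Compression or downscaling shrinks image_bytes
    and thus tilts the comparison toward the remote server.
    """
    upload_s = image_bytes / uplink_bytes_per_s
    remote_total = upload_s + fixed_delay_s + remote_ocr_s
    return "remote" if remote_total < in_phone_ocr_s else "in-phone"
```

For instance, with a 10 s in-phone OCR, a 1 s server OCR, and a 1 MB image over a 1 MB/s uplink, the remote path totals about 2 s and wins; a 20 MB uncompressed image over the same link loses.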

  8. Remote sensing on Indian and public lands

    NASA Technical Reports Server (NTRS)

    Torbert, G. B.; Woll, A. M.

    1972-01-01

    The use of remote sensing techniques by the Bureaus of Indian Affairs and Land Management in planning resource problems, making decisions, writing environmental impact statements, and monitoring their respective programs is investigated. For Indian affairs, data cover the Papago, Fort Apache, San Carlos, and South Dakota Reservations. For the Land Management Office, data cover cadastral surveys, California desert study, range watersheds, and efforts to establish a natural resources information system.

  9. Decentralized Grid Scheduling with Evolutionary Fuzzy Systems

    NASA Astrophysics Data System (ADS)

    Fölling, Alexander; Grimme, Christian; Lepping, Joachim; Papaspyrou, Alexander

    In this paper, we address the problem of finding workload exchange policies for decentralized Computational Grids using an Evolutionary Fuzzy System. To this end, we establish a non-invasive collaboration model on the Grid layer which requires minimal information about the participating High Performance and High Throughput Computing (HPC/HTC) centers and which leaves the local resource managers completely untouched. In this environment of fully autonomous sites, independent users are assumed to submit their jobs to the Grid middleware layer of their local site, which in turn decides on the delegation and execution either on the local system or on remote sites in a situation-dependent, adaptive way. We find for different scenarios that the exchange policies show good performance characteristics not only with respect to traditional metrics such as average weighted response time and utilization, but also in terms of robustness and stability in changing environments.
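The delegation decision at the Grid middleware layer can be illustrated with a far simpler stand-in than the paper's evolved fuzzy rules. This toy threshold policy only conveys the shape of the situation-dependent decision; thresholds and load figures are invented:

```python
def delegate_job(local_queue_len, local_capacity, remote_loads, threshold=0.8):
    """Decide where a newly submitted job runs: locally or at a remote site.

    remote_loads maps site name -> current utilization in [0, 1]. Delegate
    to the least-loaded remote site only when local utilization exceeds
    the threshold and that site is genuinely less loaded; otherwise keep
    the job local, leaving the local resource manager untouched.
    """
    local_load = local_queue_len / local_capacity
    if local_load <= threshold or not remote_loads:
        return "local"
    best_site = min(remote_loads, key=remote_loads.get)
    return best_site if remote_loads[best_site] < local_load else "local"
```

The Evolutionary Fuzzy System in the paper effectively learns a much richer version of this mapping from site state to exchange decision.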

  10. Analysis of geologic terrain models for determination of optimum SAR sensor configuration and optimum information extraction for exploration of global non-renewable resources. Pilot study: Arkansas Remote Sensing Laboratory, part 1, part 2, and part 3

    NASA Technical Reports Server (NTRS)

    Kaupp, V. H.; Macdonald, H. C.; Waite, W. P.; Stiles, J. A.; Frost, F. S.; Shanmugam, K. S.; Smith, S. A.; Narayanan, V.; Holtzman, J. C. (Principal Investigator)

    1982-01-01

    Computer-generated radar simulations and mathematical geologic terrain models were used to establish the optimum radar sensor operating parameters for geologic research. An initial set of mathematical geologic terrain models was created for three basic landforms and families of simulated radar images were prepared from these models for numerous interacting sensor, platform, and terrain variables. The tradeoffs between the various sensor parameters and the quantity and quality of the extractable geologic data were investigated as well as the development of automated techniques of digital SAR image analysis. Initial work on a texture analysis of SEASAT SAR imagery is reported. Computer-generated radar simulations are shown for combinations of two geologic models and three SAR angles of incidence.

  11. Application of remote sensing to state and regional problems

    NASA Technical Reports Server (NTRS)

    Bouchillon, C. W.; Miller, W. F.; Landphair, H.; Zitta, V. L.

    1974-01-01

    The use of remote sensing techniques to help the state of Mississippi recognize and solve its environmental, resource, and socio-economic problems through inventory, analysis, and monitoring is suggested.

  12. Remote sensing techniques for the detection of soil erosion and the identification of soil conservation practices

    NASA Technical Reports Server (NTRS)

    Pelletier, R. E.; Griffin, R. H.

    1985-01-01

    The following paper is a summary of a number of techniques initiated under the AgRISTARS (Agriculture and Resources Inventory Surveys Through Aerospace Remote Sensing) project for the detection of soil degradation caused by water erosion and the identification of soil conservation practices for resource inventories. Discussed are methods to utilize a geographic information system to determine potential soil erosion through a USLE (Universal Soil Loss Equation) model; application of the Kauth-Thomas Transform to detect present erosional status; and the identification of conservation practices through visual interpretation and a variety of enhancement procedures applied to digital remotely sensed data.
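The USLE model referenced above is the standard product-of-factors formula A = R · K · LS · C · P, which a GIS applies cell by cell; a direct transcription, with example factor values that are purely illustrative:

```python
def usle_soil_loss(R, K, LS, C, P):
    """Universal Soil Loss Equation: predicted average annual soil loss A.

    R  -- rainfall-runoff erosivity factor
    K  -- soil erodibility factor
    LS -- slope length-steepness factor
    C  -- cover-management factor
    P  -- support-practice factor
    In customary US units, A is in tons per acre per year.
    """
    return R * K * LS * C * P
```

In the GIS workflow described in the paper, each factor is a raster layer and the multiplication is performed per grid cell to map potential erosion.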

  13. REMOTE SENSING AND GIS FOR WETLANDS

    EPA Science Inventory

    In identifying and characterizing wetland and adjacent features, the use of remote sensor and Geographic Information Systems (GIS) technologies has been valuable. Remote sensors such as photographs and computer-sensor generated images can illustrate conditions of hydrology, exten...

  14. Remote Visualization and Remote Collaboration On Computational Fluid Dynamics

    NASA Technical Reports Server (NTRS)

    Watson, Val; Lasinski, T. A. (Technical Monitor)

    1995-01-01

    A new technology has been developed for remote visualization that provides remote, 3D, high resolution, dynamic, interactive viewing of scientific data (such as fluid dynamics simulations or measurements). Based on this technology, some World Wide Web sites on the Internet are providing fluid dynamics data for educational or testing purposes. This technology is also being used for remote collaboration in joint university, industry, and NASA projects in computational fluid dynamics and wind tunnel testing. Previously, remote visualization of dynamic data was done using video format (transmitting pixel information) such as video conferencing or MPEG movies on the Internet. The concept for this new technology is to send the raw data (e.g., grids, vectors, and scalars) along with viewing scripts over the Internet and have the pixels generated by a visualization tool running on the viewer's local workstation. The visualization tool that is currently used is FAST (Flow Analysis Software Toolkit).
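The bandwidth argument for sending raw data plus viewing scripts instead of pixels can be made concrete with rough arithmetic; all sizes below are illustrative, not measurements from the FAST system:

```python
def bytes_video(width, height, bytes_per_pixel, fps, seconds):
    """Traffic for a video-style remote view: pixels sent every frame."""
    return width * height * bytes_per_pixel * fps * seconds

def bytes_raw_data(n_grid_points, floats_per_point, script_bytes=10_000):
    """One-time transfer of raw simulation data (4-byte floats) plus a
    viewing script; the viewer's workstation generates all pixels locally."""
    return n_grid_points * floats_per_point * 4 + script_bytes
```

A minute of uncompressed 640x480 video at 30 fps costs on the order of 1.6 GB, whereas a million-point grid with five scalars per point transfers once at about 20 MB, after which interaction is free of pixel traffic.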

  15. System architecture for asynchronous multi-processor robotic control system

    NASA Technical Reports Server (NTRS)

    Steele, Robert D.; Long, Mark; Backes, Paul

    1993-01-01

    The architecture for the Modular Telerobot Task Execution System (MOTES) as implemented in the Supervisory Telerobotics (STELER) Laboratory is described. MOTES is the software component of the remote site of a local-remote telerobotic system which is being developed for NASA for space applications, in particular Space Station Freedom applications. The system is being developed to provide control and supervised autonomous control to support both space based operation and ground-remote control with time delay. The local-remote architecture places task planning responsibilities at the local site and task execution responsibilities at the remote site. This separation allows the remote site to be designed to optimize task execution capability within a limited computational environment such as is expected in flight systems. The local site task planning system could be placed on the ground where few computational limitations are expected. MOTES is written in the Ada programming language for a multiprocessor environment.

  16. Space education in developing countries in the information era, regional reality and new educational material tendencies: example, South America

    NASA Astrophysics Data System (ADS)

    Sausen, Tania Maria

    The initial activities in space education began right after World War II, in the early 1950s, when the USA and the USSR started the Space Race. At that time, space education was available only and exclusively to researchers and technicians working directly in space programs: the new area was restricted to post-graduate programs (basically master's and doctoral degrees) or to very specific training programs for beginners. In South America there was then no activity in space education, simply because there was no activity in space research. In the early 1970s, Brazil, through INPE, created master's and doctoral courses in several space areas such as remote sensing and meteorology. Only in the mid-1980s did Brazil, at a UN request, create its specialisation course on remote sensing dedicated to Latin American professionals. In the same period, the Agustin Codazzi Institute (Bogota, Colombia) began to offer specialisation courses in remote sensing. In South America, educational space programs are currently being created for elementary schools, high schools and universities, but the author personally estimates that 90% of these programs still make use of traditional educational materials such as books, tutorials, maps and graphics. Little educational material uses multimedia resources or advanced computing and communication methods, yet these are the materials best suited to instruction in remote sensing, GIS, meteorology and astronomy.

  17. Adoption of telemedicine in Scottish remote and rural general practices: a qualitative study.

    PubMed

    King, Gerry; Richards, Helen; Godden, David

    2007-01-01

    We conducted a qualitative interview study to explore the factors that have facilitated and prevented the adoption of telemedicine in general practice in remote and rural Scotland. Face-to-face interviews were carried out with general practitioners (GPs) and practice nurses in 26 of Scotland's most remote practices and five of the seven most rural health boards. The interview study found that GPs were more positive about the use of computers and telemedicine than nurses. Although electronic access to simple data, such as laboratory results, had become widely accepted, most respondents had very little experience of more sophisticated telemedicine applications, such as videoconferencing. There was widespread scepticism about the potential usefulness of clinical applications of telemedicine, although it was perceived to have potential benefit in facilitating access to educational resources. A number of barriers to the adoption of telemedicine were reported, including concerns that videoconferencing could diminish the quality of communication in educational and clinical settings, and that telemedicine would not fit easily with the organizational routines of the practices. Policy-makers should prioritize strategies to develop educational programmes, as these are more likely to succeed than clinical initiatives. It may then follow that clinicians will see opportunities for use in their clinical work.

  18. Development of Internet algorithms and some calculations of power plant COP

    NASA Astrophysics Data System (ADS)

    Ustjuzhanin, E. E.; Ochkov, V. F.; Znamensky, V. E.

    2017-11-01

    The authors have analyzed Internet resources containing information on some thermodynamic properties of technically important substances (water, air, etc.). Databases holding such resources are hosted by organizations including the Joint Institute for High Temperatures (Russian Academy of Sciences), Standartinform (Russia), the National Institute of Standards and Technology (USA), and the Institute for Thermal Physics (Siberian Branch of the Russian Academy of Sciences). Currently, a typical resource takes the form of a text file, for example one containing tabulated properties R = (ρ, s, h, …), where ρ is the density, s the entropy, and h the enthalpy of a substance. A small number of Internet resources offer more: they allow a customer to (i) enter input data Y = (p, T), where p is the pressure and T the temperature, (ii) calculate the property R using an executable program, and (iii) copy the result X = (p, T, ρ, h, s, …). Recently, some researchers (including the authors of this report) have called for software designed for R-property calculations in the form of an open interactive (OI) Internet resource (a "client function" or "template"). The computing part of an OI resource is linked (1) with a formula applied to calculate the property R and (2) with a Mathcad program, Code_1(R, Y). The interactive part of an OI resource is based on informatics and Internet technologies. We have proposed methods and tools related to this part that allow us (a) to post an OI resource on a remote server, (b) to link a client PC with the remote server, and (c) to offer a number of options to clients, among them: (i) calculating the property R for given arguments Y, (ii) copying mathematical formulas, and (iii) copying Code_1(R, Y) as a whole. 
We have developed OI resources focused on sharing (a) SW used to design power plants, for example Code-GTP_1(Z,R,Y), and (b) client functions that determine the properties R of the working fluid at fixed points of the thermodynamic cycle. The program calculates energy criteria Z, including the internal coefficient of performance (COP) of a power plant. We discuss several OI resources, among them one that includes Code-GTP_1(Z,R,Y) and is connected with a complex power plant comprising several gas turbines, several compressors, etc.
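The workflow described in this abstract (enter Y = (p, T), compute R = (ρ, h, s), copy the result) can be sketched as a minimal client function. The ideal-gas model for dry air below is a hypothetical stand-in for the substance-specific formulations the actual OI resources use (e.g. IAPWS-type correlations for water):

```python
# Hypothetical sketch of an open interactive "client function" mapping
# input arguments Y = (p, T) to properties R = (rho, h, s).
# An ideal-gas model for dry air is used purely to illustrate the shape
# of the interface; real OI resources use substance-specific formulations.
import math

R_AIR = 287.05            # specific gas constant of dry air, J/(kg K)
CP_AIR = 1005.0           # specific heat at constant pressure, J/(kg K)
P0, T0 = 101325.0, 273.15 # reference state for enthalpy and entropy

def properties(p, T):
    """Return (rho, h, s) for dry air at pressure p [Pa], temperature T [K]."""
    rho = p / (R_AIR * T)                                     # density
    h = CP_AIR * (T - T0)                                     # enthalpy rel. to T0
    s = CP_AIR * math.log(T / T0) - R_AIR * math.log(p / P0)  # entropy rel. to (P0, T0)
    return rho, h, s

rho, h, s = properties(101325.0, 300.0)
```

A user of such a resource would call `properties` with their own Y and copy the returned tuple X, mirroring options (i) and (iii) above.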

  19. Using Avizo Software on the Peregrine System | High-Performance Computing |

    Science.gov Websites

    be run remotely from the Peregrine visualization node. First, launch a TurboVNC remote desktop. Then from a terminal in that remote desktop: % module load avizo % vglrun avizo Running Locally Avizo can

  20. Michigan resource inventories: Characteristics and costs of selected projects using high altitude color infrared imagery. Remote Sensing Project

    NASA Technical Reports Server (NTRS)

    Enslin, W. R.; Hill-Rowley, R.

    1976-01-01

The procedures and costs associated with mapping land cover/use and forest resources from high altitude color infrared (CIR) imagery are documented through an evaluation of several inventory efforts. CIR photos (1:36,000) were used to classify the forests of Mason County, Michigan into six species groups, three stocking levels, and three maturity classes at a cost of $4.58/sq. km. The forest data allow the pinpointing of marketable concentrations of selected timber types, and facilitate the establishment of new forest management cooperatives. Land cover/use maps and area tabulations were prepared from small scale CIR photography at a cost of $4.28/sq. km. and $3.03/sq. km. to support regional planning programs of two Michigan agencies. Procedures were also developed to facilitate analysis of this data with other natural resource information. Eleven thematic maps were generated for Windsor Township, Michigan at a cost of $1,500 by integrating grid-geocoded land cover/use, soils, topographic, and well log data using an analytical computer program.

  1. Documentary of MFENET, a national computer network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shuttleworth, B.O.

    1977-06-01

The national Magnetic Fusion Energy Computer Network (MFENET) is a newly operational star network of geographically separated heterogeneous hosts and a communications subnetwork of PDP-11 processors. Host processors interfaced to the subnetwork currently include a CDC 7600 at the Central Computer Center (CCC) and several DECsystem-10's at User Service Centers (USC's). The network was funded by a U.S. government agency (ERDA) to provide, in an economical manner, the computational resources needed by magnetic confinement fusion researchers. Phase I operation of MFENET distributed the processing power of the CDC 7600 among the USC's through the provision of file transport between any two hosts and remote job entry to the 7600. Extending the capabilities of Phase I, MFENET Phase II provided interactive terminal access to the CDC 7600 from the USC's. A file management system is maintained at the CCC for all network users. The history and development of MFENET are discussed, with emphasis on the protocols used to link the host computers and the USC software. Comparisons are made of MFENET versus ARPANET (Advanced Research Projects Agency Computer Network) and DECNET (Digital Distributed Network Architecture). DECNET and MFENET host-to-host, host-to-CCP, and link protocols are discussed in detail. The USC-CCP interface is described briefly. 43 figures, 2 tables.

  2. Remote Science Operation Center research

    NASA Technical Reports Server (NTRS)

    Banks, P. M.

    1986-01-01

    Progress in the following areas is discussed: the design, planning and operation of a remote science payload operations control center; design and planning of a data link via satellite; and the design and prototyping of an advanced workstation environment for multi-media (3-D computer aided design/computer aided engineering, voice, video, text) communications and operations.

  3. Remote sensing applications to resource problems in South Dakota

    NASA Technical Reports Server (NTRS)

    Myers, V. I. (Principal Investigator)

    1981-01-01

    The procedures used as well as the results obtained and conclusions derived are described for the following applications of remote sensing in South Dakota: (1) sage grouse management; (2) censusing Canada geese; (3) monitoring grasshopper infestation in rangeland; (4) detecting Dutch elm disease in an urban environment; (5) determining water usage from the Belle Fourche River; (6) resource management of the Lower James River; and (7) the National Model Implementation Program: Lake Herman watershed.

  4. CTserver: A Computational Thermodynamics Server for the Geoscience Community

    NASA Astrophysics Data System (ADS)

    Kress, V. C.; Ghiorso, M. S.

    2006-12-01

    The CTserver platform is an Internet-based computational resource that provides on-demand services in Computational Thermodynamics (CT) to a diverse geoscience user base. This NSF-supported resource can be accessed at ctserver.ofm-research.org. The CTserver infrastructure leverages a high-quality and rigorously tested software library of routines for computing equilibrium phase assemblages and for evaluating internally consistent thermodynamic properties of materials, e.g. mineral solid solutions and a variety of geological fluids, including magmas. Thermodynamic models are currently available for 167 phases. Recent additions include Duan, Møller and Weare's model for supercritical C-O-H-S, extended to include SO2 and S2 species, and an entirely new associated solution model for O-S-Fe-Ni sulfide liquids. This software library is accessed via the CORBA Internet protocol for client-server communication. CORBA provides a standardized, object-oriented, language and platform independent, fast, low-bandwidth interface to phase property modules running on the server cluster. Network transport, language translation and resource allocation are handled by the CORBA interface. Users access server functionality in two principal ways. Clients written as browser-based Java applets may be downloaded which provide specific functionality such as retrieval of thermodynamic properties of phases, computation of phase equilibria for systems of specified composition, or modeling the evolution of these systems along some particular reaction path. This level of user interaction requires minimal programming effort and is ideal for classroom use. A more universal and flexible mode of CTserver access involves making remote procedure calls from user programs directly to the server public interface. The CTserver infrastructure relieves the user of the burden of implementing and testing the often complex thermodynamic models of real liquids and solids.
A pilot application of this distributed architecture involves CFD computation of magma convection at Volcan Villarrica with magma properties and phase proportions calculated at each spatial node and at each time step via distributed function calls to MELTS-objects executing on the CTserver. Documentation and programming examples are provided at http://ctserver.ofm-research.org.
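The client-server pattern this abstract describes, where a user program makes remote procedure calls to phase-property routines running on a server, can be shown in miniature. The sketch below substitutes Python's stdlib XML-RPC for the CORBA transport the CTserver actually uses, and the method name and Gibbs-energy formula are hypothetical placeholders, not CTserver's real interface:

```python
# Illustrative remote-procedure-call pattern: a "phase property" routine is
# registered on a server, and a client invokes it over the network.
# CTserver uses CORBA; XML-RPC stands in here because it is in the stdlib.
import threading
from xmlrpc.server import SimpleXMLRPCServer
from xmlrpc.client import ServerProxy

def gibbs_energy(phase, T, P):
    # Placeholder standing in for the server's thermodynamic library;
    # the linear formula is fabricated for illustration only.
    return {"phase": phase, "T": T, "P": P, "G": -50.0 * T - 0.1 * P}

# Start a server on an ephemeral local port, in a background thread.
server = SimpleXMLRPCServer(("127.0.0.1", 0), logRequests=False)
server.register_function(gibbs_energy)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# The client calls the remote routine as if it were local.
client = ServerProxy(f"http://127.0.0.1:{port}")
result = client.gibbs_energy("forsterite", 1273.15, 1.0e5)

server.shutdown()
server.server_close()
```

As in the abstract, the client is relieved of implementing the thermodynamic model itself; it only supplies arguments and consumes the returned structure.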

  5. Airborne Gravity Survey and Ground Gravity in Afghanistan: A Website for Distribution of Data

    USGS Publications Warehouse

    Abraham, Jared D.; Anderson, Eric D.; Drenth, Benjamin J.; Finn, Carol A.; Kucks, Robert P.; Lindsay, Charles R.; Phillips, Jeffrey D.; Sweeney, Ronald E.

    2008-01-01

    Afghanistan's geologic setting suggests significant natural resource potential. Although important mineral deposits and petroleum resources have been identified, much of the country's potential remains unknown. Airborne geophysical surveys are a well-accepted and cost-effective method for remotely obtaining information on the geological setting of an area. A regional airborne geophysical survey was proposed due to the security situation and the large areas of Afghanistan that have not been covered using geophysical exploration methods. Acting upon the request of the Islamic Republic of Afghanistan Ministry of Mines, the U.S. Geological Survey contracted with the U.S. Naval Research Laboratory to jointly conduct an airborne geophysical and remote sensing survey of Afghanistan. Data collected during this survey will provide basic information for mineral and petroleum exploration studies that are important for the economic development of Afghanistan. Additionally, use of these data is broadly applicable in the assessment of water resources and natural hazards, the inventory and planning of civil infrastructure and agricultural resources, and the construction of detailed maps. The U.S. Geological Survey is currently working in cooperation with the U.S. Agency for International Development to conduct resource assessments of the country of Afghanistan for mineral, energy, coal, and water resources, and to assess geologic hazards. These geophysical and remote sensing data will be used directly in the resource and hazard assessments.

  6. Applied Remote Sensing Program (ARSP)

    NASA Technical Reports Server (NTRS)

    Mouat, D. A.; Johnson, J. D.; Foster, K. E.

    1977-01-01

    Descriptions of projects undertaken by the Applied Remote Sensing Program in the state of Arizona are contained in an annual report for fiscal year 1976-1977. Remote sensing techniques included thermal infrared imagery in analog and digital form and the conversion of data into thermograms. Delineation of geologic areas, vegetation surveys, and resource inventories are also presented.

  7. Towards a Good Education in Very Remote Australia: Is it Just a Case of Moving the Desks Around?

    ERIC Educational Resources Information Center

    Guenther, John; Bat, Melodie

    2013-01-01

    The education system, as it relates to very remote Aboriginal and Torres Strait Islander communities in Australia, faces challenges. While considerable resources have been applied to very remote schools, results in terms of enrollments, attendance and learning outcomes have changed little, despite the effort applied. The Cooperative Research…

  8. Offshore Wind Resource Characterization | Wind | NREL

    Science.gov Websites

    identify critical data needed. Remote Sensing and Modeling Photo of the SeaZephIR Prototype at sea. 2009 techniques such as remote sensing and modeling to provide data on design conditions. Research includes comparing the data provided by remote sensing devices and models to data collected by traditional methods

  9. CE-ACCE: The Cloud Enabled Advanced sCience Compute Environment

    NASA Astrophysics Data System (ADS)

    Cinquini, L.; Freeborn, D. J.; Hardman, S. H.; Wong, C.

    2017-12-01

    Traditionally, Earth Science data from NASA remote sensing instruments has been processed by building custom data processing pipelines (often based on a common workflow engine or framework) which are typically deployed and run on an internal cluster of computing resources. This approach has some intrinsic limitations: it requires each mission to develop and deploy a custom software package on top of the adopted framework; it makes use of dedicated hardware, network and storage resources, which must be specifically purchased, maintained and re-purposed at mission completion; and computing services cannot be scaled on demand beyond the capability of the available servers. More recently, the rise of Cloud computing, coupled with other advances in containerization technology (most prominently, Docker) and micro-services architecture, has enabled a new paradigm, whereby space mission data can be processed through standard system architectures, which can be seamlessly deployed and scaled on demand on either on-premise clusters or commercial Cloud providers. In this talk, we will present one such architecture, named CE-ACCE ("Cloud Enabled Advanced sCience Compute Environment"), which we have been developing at the NASA Jet Propulsion Laboratory over the past year. CE-ACCE is based on the Apache OODT ("Object Oriented Data Technology") suite of services for full data lifecycle management, which are turned into a composable array of Docker images and complemented by a plug-in model for mission-specific customization. We have applied this infrastructure to both flying and upcoming NASA missions, such as ECOSTRESS and SMAP, and demonstrated deployment on the Amazon Cloud, either using simple EC2 instances or advanced AWS services such as Amazon Lambda and ECS (EC2 Container Services).

  10. NASA Earth Resources Survey Symposium. Volume 1-C: Land use, marine resources

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Articles are presented on the utilization of remote sensing data from NASA programs involving LANDSAT, the Skylab Earth resources experiment package, and aircraft, as well as from other data acquisition programs. Emphasis is placed on land use and marine resources.

  11. INTEGRATION OF STATISTICS, REMOTE SENSING AND EXISTING DATA TO LOCATE CHANGES IN LAND RESOURCES

    EPA Science Inventory

    Stability of a nation is dependent on the availability of natural resources. When land is degraded and natural resources become limited, socioeconomic status declines and emigration increases in developing countries. Natural resource utilization without proper management may re...

  12. BioModels.net Web Services, a free and integrated toolkit for computational modelling software.

    PubMed

    Li, Chen; Courtot, Mélanie; Le Novère, Nicolas; Laibe, Camille

    2010-05-01

    Exchanging and sharing scientific results are essential for researchers in the field of computational modelling. BioModels.net defines agreed-upon standards for model curation. A fundamental one, MIRIAM (Minimum Information Requested in the Annotation of Models), standardises the annotation and curation process of quantitative models in biology. To support this standard, MIRIAM Resources maintains a set of standard data types for annotating models, and provides services for manipulating these annotations. Furthermore, BioModels.net creates controlled vocabularies, such as SBO (Systems Biology Ontology) which strictly indexes, defines and links terms used in Systems Biology. Finally, BioModels Database provides a free, centralised, publicly accessible database for storing, searching and retrieving curated and annotated computational models. Each resource provides a web interface to submit, search, retrieve and display its data. In addition, the BioModels.net team provides a set of Web Services which allows the community to programmatically access the resources. A user is then able to perform remote queries, such as retrieving a model and resolving all its MIRIAM Annotations, as well as getting the details about the associated SBO terms. These web services use established standards. Communications rely on SOAP (Simple Object Access Protocol) messages and the available queries are described in a WSDL (Web Services Description Language) file. Several libraries are provided in order to simplify the development of client software. BioModels.net Web Services take researchers one step further toward simulating and understanding a biological system in its entirety, by allowing them to retrieve biological models into their own tools, combine queries in workflows, and efficiently analyse models.

  13. Cultural Resource Protection Plan for the Remote-Handled Low-Level Waste Disposal Facility at the Idaho National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pace, Brenda Ringe; Gilbert, Hollie Kae

    2015-05-01

    This plan addresses cultural resource protection procedures to be implemented during construction of the Remote-Handled Low-Level Waste project at the Idaho National Laboratory. The plan proposes pre-construction review of proposed ground-disturbing activities to confirm avoidance of cultural resources. Depending on the final project footprint, cultural resource protection strategies might also include additional survey, protective fencing, cultural resource mapping and relocation of surface artifacts, collection of surface artifacts for permanent curation, confirmation of undisturbed historic canal segments outside the area of potential effects for construction, and/or archaeological test excavations to assess potential subsurface cultural deposits at known cultural resource locations. Additionally, all initial ground-disturbing activities will be monitored for subsurface cultural resource finds, cultural resource sensitivity training will be conducted for all construction field personnel, and a stop-work procedure will be implemented to guide assessment and protection of any unanticipated discoveries after initial monitoring of ground disturbance.

  14. Application of Remote Sensing to the Chesapeake Bay Region. Volume 2: Proceedings

    NASA Technical Reports Server (NTRS)

    Chen, W. T. (Editor); Freas, G. W., Jr. (Editor); Hickman, G. D. (Editor); Pemberton, D. A. (Editor); Wilkerson, T. D. (Editor); Adler, I. (Editor); Laurie, V. J. (Editor)

    1978-01-01

    A conference was held on the application of remote sensing to the Chesapeake Bay region. Copies of the papers, resource contributions, panel discussions, and reports of the working groups are presented.

  15. Remote observing with the Nickel Telescope at Lick Observatory

    NASA Astrophysics Data System (ADS)

    Grigsby, Bryant; Chloros, Konstantinos; Gates, John; Deich, William T. S.; Gates, Elinor; Kibrick, Robert

    2008-07-01

    We describe a project to enable remote observing on the Nickel 1-meter Telescope at Lick Observatory. The purpose was to increase the subscription rate and create a more economical means for graduate and undergraduate students to observe with this telescope. The Nickel Telescope resides in a 125-year-old dome on Mount Hamilton. Remote observers may work from any of the University of California (UC) remote observing facilities that have been created to support remote work at both Keck Observatory and Lick Observatory. The project included hardware and software upgrades to enable computer control of all equipment that must be operated by the astronomer; a remote observing architecture closely modeled on UCO/Lick's work to implement remote observing between UC campuses and Keck Observatory; new policies to ensure the safety of Observatory staff and equipment, while ensuring that the telescope subsystems would be suitably configured for remote use; and new software to enforce the safety-related policies. The result was an increase in the subscription rate from a few nights per month to nearly full subscription, and the project has spurred the installation of remote observing sites at more UC campuses. Thanks to the increased automation and computer control, local observing has also benefited and is more efficient. Remote observing is now being implemented for the Shane 3-meter telescope.

  16. Comparison between remote sensing and a dynamic vegetation model for estimating terrestrial primary production of Africa.

    PubMed

    Ardö, Jonas

    2015-12-01

    Africa is an important part of the global carbon cycle. It is also a continent facing potential problems due to increasing resource demand in combination with climate change-induced changes in resource supply. Quantifying the pools and fluxes constituting the terrestrial African carbon cycle is a challenge, because of uncertainties in meteorological driver data, lack of validation data, and potentially uncertain representation of important processes in major ecosystems. In this paper, terrestrial primary production estimates derived from remote sensing and a dynamic vegetation model are compared and quantified for major African land cover types. Continental gross primary production estimates derived from remote sensing were higher than corresponding estimates derived from a dynamic vegetation model. However, estimates of continental net primary production from remote sensing were lower than corresponding estimates from the dynamic vegetation model. Variation was found among land cover classes, and the largest differences in gross primary production were found in the evergreen broadleaf forest. Average carbon use efficiency (NPP/GPP) was 0.58 for the vegetation model and 0.46 for the remote sensing method. Validation versus in situ data of aboveground net primary production revealed significant positive relationships for both methods. A combination of the remote sensing method with the dynamic vegetation model did not strongly affect this relationship. Observed significant differences in estimated vegetation productivity may have several causes, including model design and temperature sensitivity. Differences in carbon use efficiency reflect underlying model assumptions. Integrating the realistic process representation of dynamic vegetation models with the high resolution observational strength of remote sensing may support realistic estimation of components of the carbon cycle and enhance resource monitoring, provided that suitable validation data are available.

  17. Hiding the Disk and Network Latency of Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Ellsworth, David

    2001-01-01

    This paper describes an algorithm that improves the performance of application-controlled demand paging for out-of-core visualization by hiding the latency of reading data from both local disks and disks on remote servers. The performance improvements come from better overlapping the computation with the page reading process, and from performing multiple page reads in parallel. The paper includes measurements showing that the new multithreaded paging algorithm decreases the time needed to compute visualizations by one third when using one processor and reading data from local disk. The time needed when using one processor and reading data from remote disk decreased by two thirds. Visualization runs using data from remote disk actually ran faster than ones using data from local disk because the remote runs were able to make use of the remote server's high-performance disk array.
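The overlap the abstract describes, computing on one page while reads for upcoming pages proceed in parallel, can be sketched with a thread pool. The page reader, page contents, and visit order below are hypothetical stand-ins for the application's demand-paged data:

```python
# Minimal sketch of latency hiding for out-of-core visualization:
# page reads are issued ahead of time on worker threads, so the simulated
# I/O latency overlaps with the per-page computation on the main thread.
import time
from concurrent.futures import ThreadPoolExecutor

def read_page(page_id):
    time.sleep(0.01)        # simulated disk or network latency
    return [page_id] * 4    # simulated page contents

def visualize(pages, lookahead=4):
    results = []
    with ThreadPoolExecutor(max_workers=lookahead) as pool:
        # Issue all reads up front; up to `lookahead` run in parallel.
        futures = [pool.submit(read_page, p) for p in pages]
        for fut in futures:
            data = fut.result()         # usually ready: read overlapped compute
            results.append(sum(data))   # stand-in for the rendering computation
    return results

totals = visualize(range(8))
```

With serial reads the eight pages would cost eight latencies back to back; with four workers the reads largely overlap one another and the computation, which is the effect the paper measures.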

  18. The economic value of remote sensing of earth resources from space: An ERTS overview and the value of continuity of service. Volume 9: Oceans

    NASA Technical Reports Server (NTRS)

    Lietzke, K. R.

    1974-01-01

    The impact of remote sensing upon marine activities and oceanography is presented. The present capabilities of the Earth Resources Technology Satellite (ERTS-1), as demonstrated by the principal investigators, are discussed. Cost-savings benefits are quantified in the area of nautical and hydrographic mapping and charting. Benefits are found in aiding coastal zone management and in the fields of marine weather prediction, fishery harvesting and management, and potential uses for ocean vegetation. Difficulties in quantification are explained, the primary factor being that remotely sensed information will be of greater benefit as input to forecasting models which have not yet been constructed.

  19. The emerging role of lidar remote sensing in coastal research and resource management

    USGS Publications Warehouse

    Brock, J.C.; Purkis, S.J.

    2009-01-01

    Knowledge of coastal elevation is an essential requirement for resource management and scientific research. Recognizing the vast potential of lidar remote sensing in coastal studies, this Special Issue includes a collection of articles intended to represent the state-of-the-art for lidar investigations of nearshore submerged and emergent ecosystems, coastal morphodynamics, and hazards due to sea-level rise and severe storms. Some current applications for lidar remote sensing described in this Special Issue include bluegreen wavelength lidar used for submarine coastal benthic environments such as coral reef ecosystems, airborne lidar used for shoreline mapping and coastal change detection, and temporal waveform-resolving lidar used for vegetation mapping. © 2009 Coastal Education and Research Foundation.

  20. The emerging role of lidar remote sensing in coastal research and resource management

    USGS Publications Warehouse

    Brock, John C.; Purkis, Samuel J.

    2009-01-01

    Knowledge of coastal elevation is an essential requirement for resource management and scientific research. Recognizing the vast potential of lidar remote sensing in coastal studies, this Special Issue includes a collection of articles intended to represent the state-of-the-art for lidar investigations of nearshore submerged and emergent ecosystems, coastal morphodynamics, and hazards due to sea-level rise and severe storms. Some current applications for lidar remote sensing described in this Special Issue include bluegreen wavelength lidar used for submarine coastal benthic environments such as coral reef ecosystems, airborne lidar used for shoreline mapping and coastal change detection, and temporal waveform-resolving lidar used for vegetation mapping.

  1. An analysis of metropolitan land-use by machine processing of earth resources technology satellite data

    NASA Technical Reports Server (NTRS)

    Mausel, P. W.; Todd, W. J.; Baumgardner, M. F.

    1976-01-01

    A successful application of state-of-the-art remote sensing technology in classifying an urban area into its broad land use classes is reported. This research demonstrates that numerous urban features are amenable to classification using ERTS multispectral data automatically processed by computer. Furthermore, such automatic data processing (ADP) techniques permit areal analysis on an unprecedented scale with a minimum expenditure of time. Also, classification results obtained using ADP procedures are consistent, comparable, and replicable. The results of classification are compared with the proposed U.S.G.S. land use classification system in order to determine the level of classification that is feasible to obtain through ERTS analysis of metropolitan areas.

  2. Applications of ISES for vegetation and land use

    NASA Technical Reports Server (NTRS)

    Wilson, R. Gale

    1990-01-01

    Remote sensing relative to applications involving vegetation cover and land use is reviewed to consider the potential benefits to the Earth Observing System (Eos) of a proposed Information Sciences Experiment System (ISES). The ISES concept has been proposed as an onboard experiment and computational resource to support advanced experiments and demonstrations in the information and earth sciences. Embedded in the concept is potential for relieving the data glut problem, enhancing capabilities to meet real-time needs of data users and in-situ researchers, and introducing emerging technology to Eos as the technology matures. These potential benefits are examined in the context of state-of-the-art research activities in image/data processing and management.

  3. Advanced instrumentation for the collection, retrieval, and processing of urban stormwater data

    USGS Publications Warehouse

    Robinson, Jerald B.; Bales, Jerad D.; Young, Wendi S.; ,

    1995-01-01

    The U.S. Geological Survey, in cooperation with the City of Charlotte and Mecklenburg County, North Carolina, has developed a data-collection network that uses advanced instrumentation to automatically collect, retrieve, and process urban stormwater data. Precipitation measurement and water-quality networks provide data for (1) planned watershed simulation models, (2) early warning of possible flooding, (3) computation of material export, and (4) characterization of water quality in relation to basin conditions. Advantages of advanced instrumentation include remote access to real-time data, reduced demands on and more efficient use of limited human resources, and direct importation of data into a geographical information system for display and graphic analysis.

  4. The composite sequential clustering technique for analysis of multispectral scanner data

    NASA Technical Reports Server (NTRS)

    Su, M. Y.

    1972-01-01

    The clustering technique consists of two parts: (1) a sequential statistical clustering which is essentially a sequential variance analysis, and (2) a generalized K-means clustering. In this composite clustering technique, the output of (1) is a set of initial clusters which are input to (2) for further improvement by an iterative scheme. This unsupervised composite technique was employed for automatic classification of two sets of remote multispectral earth resource observations. The classification accuracy by the unsupervised technique is found to be comparable to that by traditional supervised maximum likelihood classification techniques. The mathematical algorithms for the composite sequential clustering program and a detailed computer program description with job setup are given.
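The two-part composite above can be sketched in a few lines: a sequential pass that opens a new cluster whenever a sample lies beyond a distance threshold from every existing cluster mean, followed by K-means iterations that refine those initial clusters. The 1-D data and threshold below are illustrative only; the paper's technique operates on multispectral vectors:

```python
# Sketch of a composite sequential clustering technique:
# part (1) builds initial clusters in one sequential pass,
# part (2) refines them with ordinary K-means iterations.

def sequential_clusters(samples, threshold):
    """One pass: assign each sample to the nearest cluster mean, or open
    a new cluster if every mean is farther away than `threshold`."""
    means, members = [], []
    for x in samples:
        if means:
            d, i = min((abs(x - m), i) for i, m in enumerate(means))
            if d <= threshold:
                members[i].append(x)
                means[i] = sum(members[i]) / len(members[i])
                continue
        means.append(x)       # too far from all clusters: start a new one
        members.append([x])
    return means

def kmeans_refine(samples, means, iters=10):
    """Part (2): iteratively reassign samples and recompute means."""
    for _ in range(iters):
        groups = [[] for _ in means]
        for x in samples:
            _, i = min((abs(x - m), i) for i, m in enumerate(means))
            groups[i].append(x)
        means = [sum(g) / len(g) if g else m for g, m in zip(groups, means)]
    return means

data = [0.1, 0.2, 0.15, 5.0, 5.2, 4.9, 9.8, 10.1]
initial = sequential_clusters(data, threshold=1.0)
final = sorted(kmeans_refine(data, initial))
```

The sequential pass removes the need to choose K in advance, which is what makes the composite usable as an unsupervised classifier.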

  5. Design of a monitor and simulation terminal (master) for space station telerobotics and telescience

    NASA Technical Reports Server (NTRS)

    Lopez, L.; Konkel, C.; Harmon, P.; King, S.

    1989-01-01

    Based on Space Station and planetary spacecraft communication time delays and bandwidth limitations, it will be necessary to develop an intelligent, general purpose ground monitor terminal capable of sophisticated data display and control of on-orbit facilities and remote spacecraft. The basic elements that make up a Monitor and Simulation Terminal (MASTER) include computer overlay video, data compression, forward simulation, mission resource optimization and high level robotic control. Hardware and software elements of a MASTER are being assembled for testbed use. Applications of Neural Networks (NNs) to some key functions of a MASTER are also discussed. These functions are overlay graphics adjustment, object correlation and kinematic-dynamic characterization of the manipulator.

  6. Remote sensing by ERTS satellite of vegetational resources believed to be under possible threat of environmental stress

    NASA Technical Reports Server (NTRS)

    Poonai, P.; Floyd, W. J.; Hall, R.

    1974-01-01

    The distribution of natural vegetation types on North Merritt Island, Florida, was studied by analysis of ERTS multispectral scanner data on the Image-100 computer system. The boundaries of six distinct plant associations were located on photos made on the image analyzer, with an insignificant mean error of -24.38 meters. The six plant associations are described; each had a characteristic spectral signature. The difference in average reflectance grey level between the lowest of the four spectral scanning bands and the highest spectral scanning band for the six vegetation types was determined. The decreasing trend of the differences is strongly negatively correlated with height of land.

  7. Utilization of LANDSAT data for water quality surveys in the Choptank River

    NASA Technical Reports Server (NTRS)

    Johnson, J. M.; Cressy, P.; Dallam, W. C.

    1975-01-01

    Computer processing of LANDSAT-1 multispectral digital data demonstrated the applicability of remotely sensed data to water quality survey in the Choptank River. Water classes derived by automated analysis correlate to river nuisance levels of chlorophyll a and sediment loading as defined by the Maryland Department of Water Resources and the U.S. Corps of Engineers. Results indicate that an increase in chlorophyll a concentration corresponds, relative to MSS 5, to decreases in 4 and increases in 6 relative to the trends with increasing sediment load. It appears that for the purpose of water quality analysis, under favorable atmospheric conditions, only MSS 4, 5 and 6 are necessary.

  8. Automated Euler and Navier-Stokes Database Generation for a Glide-Back Booster

    NASA Technical Reports Server (NTRS)

    Chaderjian, Neal M.; Rogers, Stuart E.; Aftosmis, Mike J.; Pandya, Shishir A.; Ahmad, Jasim U.; Tejnil, Edward

    2004-01-01

    The past two decades have seen a sustained increase in the use of high fidelity Computational Fluid Dynamics (CFD) in basic research, aircraft design, and the analysis of post-design issues. As the fidelity of a CFD method increases, the number of cases that can be readily and affordably computed greatly diminishes. However, computer speeds now exceed 2 GHz, hundreds of processors are currently available and more affordable, and advances in parallel CFD algorithms scale more readily with large numbers of processors. All of these factors make it feasible to compute thousands of high fidelity cases. However, there still remains the overwhelming task of monitoring the solution process. This paper presents an approach to automate the CFD solution process. A new software tool, AeroDB, is used to compute thousands of Euler and Navier-Stokes solutions for a 2nd generation glide-back booster in one week. The solution process exploits a common job-submission grid environment, the NASA Information Power Grid (IPG), using 13 computers located at 4 different geographical sites. Process automation and web-based access to a MySQL database greatly reduce the user workload, removing much of the tedium and tendency for user input errors. The AeroDB framework is shown. The user submits/deletes jobs, monitors AeroDB's progress, and retrieves data and plots via a web portal. Once a job is in the database, a job launcher uses an IPG resource broker to decide which computers are best suited to run the job. Job/code requirements, the number of CPUs free on a remote system, and queue lengths are some of the parameters the broker takes into account. The Globus software provides secure services for user authentication, remote shell execution, and secure file transfers over an open network. AeroDB automatically decides when a job is completed.
Currently, the Cart3D unstructured flow solver is used for the Euler equations, and the Overflow structured overset flow solver is used for the Navier-Stokes equations. Other codes can be readily included into the AeroDB framework.
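
    The broker decision described above (free CPUs and queue lengths per remote system) can be sketched as a simple scoring heuristic. This is an illustrative toy, not AeroDB's actual broker logic; the host names and the scoring rule are invented for the example.

```python
# Hypothetical resource-broker sketch: rank remote systems by queue
# length and free CPUs, as the abstract describes (not AeroDB's code).
from dataclasses import dataclass

@dataclass
class Host:
    name: str
    free_cpus: int
    queue_length: int

def pick_host(hosts, cpus_needed):
    """Return the host best suited to run a job needing `cpus_needed` CPUs."""
    candidates = [h for h in hosts if h.free_cpus >= cpus_needed]
    if not candidates:
        return None
    # Prefer short queues, then more idle CPUs (a plausible heuristic).
    return min(candidates, key=lambda h: (h.queue_length, -h.free_cpus))

hosts = [Host("site-a", 32, 5), Host("site-b", 64, 2), Host("site-c", 8, 0)]
print(pick_host(hosts, 16).name)  # → site-b (enough CPUs, shortest queue)
```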

  9. Remote sensing inputs to National Model Implementation Program for water resources quality improvement

    NASA Technical Reports Server (NTRS)

    Eidenshink, J. C.; Schmer, F. A.

    1979-01-01

    The Lake Herman watershed in southeastern South Dakota has been selected as one of seven water resources systems in the United States for involvement in the National Model Implementation Program (MIP). MIP is a pilot program initiated to illustrate the effectiveness of existing water resources quality improvement programs. The Remote Sensing Institute (RSI) at South Dakota State University has produced a computerized geographic information system for the Lake Herman watershed. All components necessary for the monitoring and evaluation process were included in the data base. The computerized data were used to produce thematic maps and tabular data for the land cover and soil classes within the watershed. These data are being utilized operationally by SCS resource personnel for planning and management purposes.

  10. A Virtual Mission Operations Center: Collaborative Environment

    NASA Technical Reports Server (NTRS)

    Medina, Barbara; Bussman, Marie; Obenschain, Arthur F. (Technical Monitor)

    2002-01-01

    The intent of the Virtual Mission Operations Center - Collaborative Environment (VMOC-CE) is to provide a central access point for all the resources used in a collaborative mission operations environment, assisting mission operators in communicating on-site and off-site during the investigation and resolution of anomalies. It is a framework that, at a minimum, incorporates online chat, real-time file sharing, and remote application sharing components in one central location. The use of a collaborative environment in mission operations opens up the possibility of a central framework through which other project members can access and interact with mission operations staff remotely. The goal of the Virtual Mission Operations Center (VMOC) Project is to identify, develop, and infuse technology to enable mission control by on-call personnel in geographically dispersed locations. Achieving this goal requires the following capabilities: autonomous mission control systems; automated systems to contact on-call personnel; synthesis and presentation of mission control status and history information; desktop tools for data and situation analysis; a secure mechanism for remote collaborative commanding; and a collaborative environment for remote cooperative work. The VMOC-CE is a collaborative environment that facilitates remote cooperative work. It is an application instance of the Virtual System Design Environment (VSDE), developed by NASA Goddard Space Flight Center's (GSFC) Systems Engineering Services & Advanced Concepts (SESAC) Branch. The VSDE is a web-based portal that includes a knowledge repository and collaborative environment to serve science and engineering teams in product development. It is a "one stop shop" for product design, providing users real-time access to product development data, engineering and management tools, and relevant design specifications and resources through the Internet. 
The initial focus of the VSDE has been to serve teams working in the early portion of the system/product lifecycle - concept development, proposal preparation, and formulation. The VMOC-CE expands the application of the VSDE into the operations portion of the system lifecycle. It will enable meaningful and real-time collaboration regardless of the geographical distribution of project team members. Team members will be able to interact in satellite operations, specifically for resolving anomalies, through access to a desktop computer and the Internet. Mission Operations Management will be able to participate and monitor up to the minute status of anomalies or other mission operations issues. In this paper we present the VMOC-CE project, system capabilities, and technologies.

  11. SnowCloud - a Framework to Predict Streamflow in Snowmelt-dominated Watersheds Using Cloud-based Computing

    NASA Astrophysics Data System (ADS)

    Sproles, E. A.; Crumley, R. L.; Nolin, A. W.; Mar, E.; Lopez-Moreno, J. J.

    2017-12-01

    Streamflow in snowy mountain regions is extraordinarily challenging to forecast, and prediction efforts are hampered by the lack of timely snow data—particularly in data-sparse regions. SnowCloud is a prototype web-based framework that integrates remote sensing, cloud computing, interactive mapping tools, and a hydrologic model to offer a new paradigm for delivering key data to water resource managers. We tested the skill of SnowCloud to forecast monthly streamflow with one-month lead time in three snow-dominated headwaters. These watersheds represent a range of precipitation/runoff schemes: the Río Elqui in northern Chile (200 mm/yr, entirely snowmelt); the John Day River, Oregon, USA (635 mm/yr, primarily snowmelt); and the Río Aragon in northern Spain (850 mm/yr, snowmelt dominated). Model skill corresponded to snowpack contribution, with Nash-Sutcliffe Efficiencies of 0.86, 0.52, and 0.21, respectively. SnowCloud does not require the user to possess advanced programming skills or proprietary software. We access NASA's MOD10A1 snow cover product to calculate the snow metrics globally using Google Earth Engine's geospatial analysis and cloud computing service. The analytics and forecast tools are provided through a web-based portal that requires only internet access and minimal training. To test the efficacy of SnowCloud we provided the tools and a series of tutorials in English and Spanish to water resource managers in Chile, Spain, and the United States. Participants assessed their user experience and provided feedback, and the results of our multi-cultural assessment are also presented. While our results focus on SnowCloud, they outline methods to develop cloud-based tools that function effectively across cultures and languages. Our approach also addresses the primary challenges of science-based computing: human resource limitations, infrastructure costs, and expensive proprietary software. 
These challenges are particularly problematic in developing countries.
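
    The Nash-Sutcliffe Efficiency used above to score the forecasts compares model error against the variance of the observations; a minimal sketch with illustrative flow values (not data from the study):

```python
# Nash-Sutcliffe Efficiency: 1 = perfect fit; <= 0 = no better than
# simply predicting the mean of the observations.
def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

obs = [10.0, 14.0, 22.0, 30.0, 18.0]  # illustrative monthly flows
sim = [11.0, 13.0, 20.0, 31.0, 17.0]
print(round(nse(obs, sim), 3))  # → 0.966
```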

  12. From the desktop to the grid: scalable bioinformatics via workflow conversion.

    PubMed

    de la Garza, Luis; Veit, Johannes; Szolek, Andras; Röttig, Marc; Aiche, Stephan; Gesing, Sandra; Reinert, Knut; Kohlbacher, Oliver

    2016-03-12

    Reproducibility is one of the tenets of the scientific method. Scientific experiments often comprise complex data flows, selection of adequate parameters, and analysis and visualization of intermediate and end results. Breaking down the complexity of such experiments into the joint collaboration of small, repeatable, well-defined tasks, each with well-defined inputs, parameters, and outputs, offers immediate benefits such as identifying bottlenecks and pinpointing sections that could benefit from parallelization. Workflows rest upon the notion of splitting complex work into the joint effort of several manageable tasks. There are several engines that give users the ability to design and execute workflows. Each engine was created to address the problems of a specific community, so each has its advantages and shortcomings. Furthermore, not all features of all workflow engines are royalty-free - an aspect that could potentially drive away members of the scientific community. We have developed a set of tools that enables the scientific community to benefit from workflow interoperability. We developed a platform-free structured representation of the parameters, inputs, and outputs of command-line tools in so-called Common Tool Descriptor documents. We have also overcome the shortcomings and combined the features of two royalty-free workflow engines with substantial user communities: the Konstanz Information Miner, an engine which we see as a formidable workflow editor, and the Grid and User Support Environment, a web-based framework able to interact with several high-performance computing resources. We have thus created a free and highly accessible way to design workflows on a desktop computer and execute them on high-performance computing resources. 
Our work will not only reduce time spent on designing scientific workflows, but also make executing workflows on remote high-performance computing resources more accessible to technically inexperienced users. We strongly believe that our efforts not only decrease the turnaround time to obtain scientific results but also have a positive impact on reproducibility, thus elevating the quality of obtained scientific results.
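
    The Common Tool Descriptor idea, a structured and engine-neutral description of a command-line tool's parameters, inputs, and outputs, can be illustrated with a sketch like the following. Element and attribute names here are indicative only, not the exact CTD schema; the tool and its flags are invented for the example.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Illustrative tool descriptor (hypothetical tool and elements,
     not the actual CTD schema): enough structure for a workflow
     engine to generate a node and build the command line. -->
<tool name="PeptideFilter" version="1.0">
  <description>Filters a peptide list by minimum length.</description>
  <cli>peptide_filter -in %input% -out %output% -min %minlen%</cli>
  <parameters>
    <input name="input" type="file" format="fasta"/>
    <output name="output" type="file" format="fasta"/>
    <parameter name="minlen" type="int" default="7"/>
  </parameters>
</tool>
```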

  13. On the use of satellite data to implement a parsimonious ecohydrological model in the upper Ewaso Ngiro river basin

    NASA Astrophysics Data System (ADS)

    Ruiz-Pérez, G.

    2015-12-01

    Drylands are extensive, covering 30% of the Earth's land surface and 50% of Africa. Projections of the IPCC (Intergovernmental Panel on Climate Change, 2007) indicate that the extent of these regions is likely to increase, with a considerable additional impact on water resources that should be taken into account by water management plans. In these water-controlled areas, vegetation plays a key role in the water cycle. Ecohydrological models provide a tool to investigate the relationships between vegetation and water resources. However, studies in Africa often face the problem that many ecohydrological models have quite extensive parametrical requirements, while available data are scarce. Therefore, there is a need for assessments using models whose requirements match the data availability. In that context, parsimonious models, together with available remote sensing information, can be valuable tools for ecohydrological studies. For this reason, we have focused on the use of a parsimonious model based on the amount of photosynthetically active radiation absorbed by green vegetation (APAR) and the Light Use Efficiency index (the efficiency with which that radiation is converted to plant biomass increment) in order to compute the gross primary production (GPP). This model has been calibrated using only remote sensing data (particularly, NDVI data from MODIS products) in order to explore the potential of satellite information in implementing a simple distributed model. The model has been subsequently validated against streamflow data with the aim of defining a tool able to account for land-use characteristics in describing the water budget. Results are promising for studies aimed at describing the consequences of ongoing land use changes on water resources.
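
    The parsimonious model described above follows the classic light-use-efficiency form, GPP = epsilon * fAPAR * PAR. A minimal sketch follows; the linear NDVI-to-fAPAR scaling and the parameter values are illustrative assumptions, not the paper's calibrated relationships.

```python
# Light-use-efficiency GPP sketch: GPP = epsilon * fAPAR * PAR, with
# fAPAR estimated from NDVI by a simple linear scaling (an assumption
# for illustration, not the study's calibration).
def fapar_from_ndvi(ndvi, ndvi_min=0.1, ndvi_max=0.9):
    frac = (ndvi - ndvi_min) / (ndvi_max - ndvi_min)
    return min(max(frac, 0.0), 1.0)  # clamp to [0, 1]

def gpp(ndvi, par, epsilon=1.2):
    """Gross primary production (g C / m^2 / day).

    par:     incident photosynthetically active radiation (MJ / m^2 / day)
    epsilon: light use efficiency (g C per MJ of absorbed PAR)
    """
    return epsilon * fapar_from_ndvi(ndvi) * par

print(round(gpp(ndvi=0.5, par=10.0), 2))  # → 6.0
```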

  14. ERTS & EROS

    ERIC Educational Resources Information Center

    Geotimes, 1972

    1972-01-01

    Describes the proposed investigations to be conducted with ERTS (Earth Resources Technology Satellite), the first experimental satellite for systematically surveying earth resources by remote sensing. Launching set for June, 1972. (PR)

  15. Encryption for Remote Control via Internet or Intranet

    NASA Technical Reports Server (NTRS)

    Lineberger, Lewis

    2005-01-01

    A data-communication protocol has been devised to enable secure, reliable remote control of processes and equipment via a collision-based network, while using minimal bandwidth and computation. The network could be the Internet or an intranet. Control is made secure by use of both a password and a dynamic key, which is sent transparently to a remote user by the controlled computer (that is, the computer, located at the site of the equipment or process to be controlled, that exerts direct control over the process). The protocol functions in the presence of network latency, overcomes errors caused by missed dynamic keys, and defeats attempts by unauthorized remote users to gain control. The protocol is not suitable for real-time control, but is well suited for applications in which control latencies up to about 0.5 second are acceptable. The encryption scheme involves the use of both a dynamic and a private key, without any additional overhead that would degrade performance. The dynamic key is embedded in the equipment- or process-monitor data packets sent out by the controlled computer: in other words, the dynamic key is a subset of the data in each such data packet. The controlled computer maintains a history of the last 3 to 5 data packets for use in decrypting incoming control commands. In addition, the controlled computer records a private key (password) that is given to the remote computer. The encrypted incoming command is permuted by both the dynamic and private key. A person who records the command data in a given packet for hostile purposes cannot use that packet after the dynamic key expires (typically within 3 seconds). Even a person in possession of an unauthorized copy of the command/remote-display software cannot use that software in the absence of the password. The use of a dynamic key embedded in the outgoing data makes the central-processing-unit overhead very small. 
The use of a National Instruments DataSocket(TradeMark) (or equivalent) protocol or the User Datagram Protocol makes it possible to obtain reasonably short response times: typical response times in event-driven control, using packets sized ≤300 bytes, are <0.2 second for commands issued from locations anywhere on Earth. The protocol requires that control commands represent absolute values of controlled parameters (e.g., a specified temperature), as distinguished from changes in values of controlled parameters (e.g., a specified increment of temperature). Each command is issued three or more times to ensure delivery on crowded networks. The use of absolute-value commands prevents additional (redundant) commands from causing trouble. Because a remote controlling computer receives "talkback" in the form of data packets from the controlled computer, typically within a time interval ≤1 s, the controlling computer can re-issue a command if a network failure has occurred. The controlled computer, the process or equipment that it controls, and any human operator(s) at the site of the controlled equipment or process should be equipped with safety measures to prevent damage to equipment or injury to humans. These safety measures could be a combination of software, external hardware, and intervention by the human operator(s). The protocol is not fail-safe, but by adopting these safety measures as part of the protocol, one makes the protocol a robust means of controlling remote processes and equipment by use of typical office computers via intranets and/or the Internet.
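
    The dynamic-plus-private-key idea can be sketched as follows. This is a deliberately simplified illustration (an XOR keystream derived via SHA-256, with invented key and command values), not the actual permutation used by the protocol, which the abstract does not specify.

```python
# Hypothetical sketch of keying a command on both the dynamic key
# (embedded in the last monitor packet) and a shared private key.
import hashlib

PRIVATE_KEY = b"shared-password"  # given to the remote user out of band

def keystream(dynamic_key: bytes, length: int) -> bytes:
    # Derive a keystream from dynamic key + private key by hash chaining.
    stream = b""
    block = dynamic_key + PRIVATE_KEY
    while len(stream) < length:
        block = hashlib.sha256(block).digest()
        stream += block
    return stream[:length]

def encrypt_command(command: bytes, dynamic_key: bytes) -> bytes:
    ks = keystream(dynamic_key, len(command))
    return bytes(c ^ k for c, k in zip(command, ks))

decrypt_command = encrypt_command  # XOR is its own inverse

# The controlled computer would embed a fresh dynamic key in each
# outgoing monitor packet; here we pick one to round-trip a command.
dyn = b"\x17\x2a\x9c\x01"
cipher = encrypt_command(b"SET TEMP 21.5", dyn)
print(decrypt_command(cipher, dyn))  # → b'SET TEMP 21.5'
```

    A replayed packet fails once the dynamic key rotates: decrypting with any other (expired) dynamic key yields garbage rather than the original command.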

  16. LEMON - LHC Era Monitoring for Large-Scale Infrastructures

    NASA Astrophysics Data System (ADS)

    Marian, Babik; Ivan, Fedorko; Nicholas, Hook; Hector, Lansdale Thomas; Daniel, Lenkes; Miroslav, Siket; Denis, Waldron

    2011-12-01

    At the present time computer centres are facing a massive rise in virtualization and cloud computing, as these solutions bring advantages to service providers and consolidate computer centre resources. As a result, however, monitoring complexity is increasing. Computer centre management requires monitoring not only servers, network equipment and associated software, but also collecting additional environment and facilities data (e.g. temperature, power consumption, cooling efficiency, etc.) to maintain a good overview of infrastructure performance. The LHC Era Monitoring (Lemon) system addresses these requirements for a very large scale infrastructure. The Lemon agent, which collects data on every client and forwards the samples to the central measurement repository, provides a flexible interface that allows rapid development of new sensors. The system can also report on behalf of remote devices such as switches and power supplies. Online and historical data can be visualized via a web-based interface or retrieved via command-line tools. The Lemon Alarm System component can be used to notify the operator about error situations. In this article, an overview of Lemon monitoring is provided together with a description of the CERN LEMON production instance. No direct comparison is made with other monitoring tools.

  17. Recognizing sights, smells, and sounds with gnostic fields.

    PubMed

    Kanan, Christopher

    2013-01-01

    Mammals rely on vision, audition, and olfaction to remotely sense stimuli in their environment. Determining how the mammalian brain uses this sensory information to recognize objects has been one of the major goals of psychology and neuroscience. Likewise, researchers in computer vision, machine audition, and machine olfaction have endeavored to discover good algorithms for stimulus classification. Almost 50 years ago, the neuroscientist Jerzy Konorski proposed a theoretical model in his final monograph in which competing sets of "gnostic" neurons sitting atop sensory processing hierarchies enabled stimuli to be robustly categorized, despite variations in their presentation. Much of what Konorski hypothesized has been remarkably accurate, and neurons with gnostic-like properties have been discovered in visual, aural, and olfactory brain regions. Surprisingly, there have not been any attempts to directly transform his theoretical model into a computational one. Here, I describe the first computational implementation of Konorski's theory. The model is not domain specific, and it surpasses the best machine learning algorithms on challenging image, music, and olfactory classification tasks, while also being simpler. My results suggest that criticisms of exemplar-based models of object recognition as being computationally intractable due to limited neural resources are unfounded.

  18. Recognizing Sights, Smells, and Sounds with Gnostic Fields

    PubMed Central

    Kanan, Christopher

    2013-01-01

    Mammals rely on vision, audition, and olfaction to remotely sense stimuli in their environment. Determining how the mammalian brain uses this sensory information to recognize objects has been one of the major goals of psychology and neuroscience. Likewise, researchers in computer vision, machine audition, and machine olfaction have endeavored to discover good algorithms for stimulus classification. Almost 50 years ago, the neuroscientist Jerzy Konorski proposed a theoretical model in his final monograph in which competing sets of “gnostic” neurons sitting atop sensory processing hierarchies enabled stimuli to be robustly categorized, despite variations in their presentation. Much of what Konorski hypothesized has been remarkably accurate, and neurons with gnostic-like properties have been discovered in visual, aural, and olfactory brain regions. Surprisingly, there have not been any attempts to directly transform his theoretical model into a computational one. Here, I describe the first computational implementation of Konorski's theory. The model is not domain specific, and it surpasses the best machine learning algorithms on challenging image, music, and olfactory classification tasks, while also being simpler. My results suggest that criticisms of exemplar-based models of object recognition as being computationally intractable due to limited neural resources are unfounded. PMID:23365648

  19. The Perfect Neuroimaging-Genetics-Computation Storm: Collision of Petabytes of Data, Millions of Hardware Devices and Thousands of Software Tools

    PubMed Central

    Dinov, Ivo D.; Petrosyan, Petros; Liu, Zhizhong; Eggert, Paul; Zamanyan, Alen; Torri, Federica; Macciardi, Fabio; Hobel, Sam; Moon, Seok Woo; Sung, Young Hee; Jiang, Zhiguo; Labus, Jennifer; Kurth, Florian; Ashe-McNalley, Cody; Mayer, Emeran; Vespa, Paul M.; Van Horn, John D.; Toga, Arthur W.

    2013-01-01

    The volume, diversity and velocity of biomedical data are exponentially increasing, providing petabytes of new neuroimaging and genetics data every year. At the same time, tens of thousands of computational algorithms are developed and reported in the literature, along with thousands of software tools and services. Users demand intuitive, quick and platform-agnostic access to data, software tools, and infrastructure from millions of hardware devices. This explosion of information, scientific techniques, computational models, and technological advances leads to enormous challenges in data analysis, evidence-based biomedical inference and reproducibility of findings. The Pipeline workflow environment provides a crowd-based distributed solution for consistent management of these heterogeneous resources. The Pipeline allows multiple (local) clients and (remote) servers to connect, exchange protocols, control the execution, monitor the states of different tools or hardware, and share complete protocols as portable XML workflows. In this paper, we demonstrate several advanced computational neuroimaging and genetics case studies, and end-to-end pipeline solutions. These are implemented as graphical workflow protocols in the context of analyzing imaging (sMRI, fMRI, DTI), phenotypic (demographic, clinical), and genetic (SNP) data. PMID:23975276

  20. Accelerating Demand Paging for Local and Remote Out-of-Core Visualization

    NASA Technical Reports Server (NTRS)

    Ellsworth, David

    2001-01-01

    This paper describes a new algorithm that improves the performance of application-controlled demand paging for the out-of-core visualization of data sets that are on either local disks or disks on remote servers. The performance improvements come from better overlapping the computation with the page reading process, and by performing multiple page reads in parallel. The new algorithm can be applied to many different visualization algorithms since application-controlled demand paging is not specific to any visualization algorithm. The paper includes measurements that show that the new multi-threaded paging algorithm decreases the time needed to compute visualizations by one third when using one processor and reading data from local disk. The time needed when using one processor and reading data from remote disk decreased by up to 60%. Visualization runs using data from remote disk ran about as fast as ones using data from local disk because the remote runs were able to make use of the remote server's high performance disk array.
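
    The core idea above, overlapping computation with multiple parallel page reads, can be sketched with a thread pool that reads pages ahead while earlier pages are processed. This is an illustrative toy with a fake page reader, not the paper's demand-paging implementation.

```python
# Sketch: issue page reads through a thread pool so later reads proceed
# in parallel while we compute on pages that have already arrived.
from concurrent.futures import ThreadPoolExecutor

def read_page(page_id):      # stands in for a local or remote disk read
    return [page_id] * 4     # fake page contents

def visualize(pages_needed, prefetch_depth=4):
    total = 0
    with ThreadPoolExecutor(max_workers=prefetch_depth) as pool:
        # Submit all reads up front; the pool reads ahead while the
        # loop below computes on pages in the order they are needed.
        futures = [pool.submit(read_page, p) for p in pages_needed]
        for fut in futures:
            page = fut.result()   # already-read pages return immediately
            total += sum(page)    # "computation" on the paged-in data
    return total

print(visualize([1, 2, 3]))  # → 24
```

    The overlap comes from the pool: while `fut.result()` blocks on page 1, workers are already reading pages 2 and 3, which is the same computation/I-O overlap the measurements above attribute to the multi-threaded paging algorithm.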
