System enhancements of Mesoscale Analysis and Space Sensor (MASS) computer system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.
1985-01-01
The interactive information processing system for the Mesoscale Analysis and Space Sensor (MASS) program is reported. The development and implementation of new spaceborne remote sensing technology to observe and measure atmospheric processes is described. The space measurements and conventional observational data are processed together to gain an improved understanding of the mesoscale structure and dynamical evolution of the atmosphere relative to cloud development and precipitation processes. A research computer system consisting of three primary computers (HP-1000F, Perkin-Elmer 3250, and Harris/6) was developed, providing a wide range of capabilities for interactively processing and displaying large volumes of remote sensing data. The development of a MASS data base management and analysis system on the HP-1000F computer, and the extension of these capabilities through integration with the Perkin-Elmer and Harris/6 computers using MSFC's Apple III microcomputer workstations, are described. The objectives are to design hardware enhancements for computer integration and to provide data conversion and transfer between machines.
Practical applications of remote sensing technology
NASA Technical Reports Server (NTRS)
Whitmore, Roy A., Jr.
1990-01-01
Land managers are becoming increasingly dependent upon remote sensing and automated analysis techniques for information gathering and synthesis. Remote sensing and geographic information system (GIS) techniques provide quick and economical information gathering for large areas. The outputs of remote sensing classification and analysis are most effective when combined with a total natural resources data base within the capabilities of a computerized GIS. Some examples are presented of the successes, as well as the problems, in integrating remote sensing and geographic information systems. The need to exploit remotely sensed data and the potential that geographic information systems offer for managing and analyzing such data continue to grow. New microcomputers with vastly enlarged memory and multi-fold increases in operating speed and storage capacity, previously available only on mainframe computers, are a reality. Improved raster GIS software systems have been developed for these high-performance microcomputers, and vector GIS systems previously reserved for mini and mainframe systems are now available on these enhanced microcomputers. One of the more exciting areas beginning to emerge is the integration of both raster and vector formats on a single computer screen. This technology will allow satellite imagery or digital aerial photography to be presented as a background to a vector display.
Integrated instrumentation & computation environment for GRACE
NASA Astrophysics Data System (ADS)
Dhekne, P. S.
2002-03-01
The project GRACE (Gamma Ray Astrophysics with Coordinated Experiments) aims at setting up a state-of-the-art gamma ray observatory at Mt. Abu, Rajasthan for undertaking comprehensive scientific exploration over a wide spectral window (10s of keV to 100s of TeV) from a single location through four coordinated experiments. The cumulative data collection rate of all the telescopes is expected to be about 1 GB/hr, necessitating innovations in the data management environment. As the real-time data acquisition and control as well as the off-line data processing, analysis, and visualization environments of these systems are based on the use of cutting-edge and affordable technologies in the fields of computers, communications, and the Internet, we propose to provide a single, unified environment by seamless integration of instrumentation and computation, taking advantage of recent advancements in Web-based technologies. This new environment will allow researchers better access to facilities, improve resource utilization, and enhance collaboration by providing identical environments for online as well as offline use of this facility from any location. We present here a proposed implementation strategy for a platform-independent web-based system that supplements automated functions with video-guided interactive and collaborative remote viewing, remote control through a virtual instrumentation console, remote acquisition of telescope data, data analysis, data visualization, and an active imaging system. This end-to-end web-based solution will enhance collaboration among researchers at the national and international level for undertaking scientific studies using the telescope systems of the GRACE project.
Remote Neural Pendants In A Welding-Control System
NASA Technical Reports Server (NTRS)
Venable, Richard A.; Bucher, Joseph H.
1995-01-01
Neural network integrated circuits enhance functionalities of both remote terminals (called "pendants") and communication links, without necessitating installation of additional wires in links. Makes it possible to incorporate many features into pendant, including real-time display of critical welding parameters and other process information, capability for communication between technician at pendant and host computer or technician elsewhere in system, and switches and potentiometers through which technician at pendant exerts remote control over such critical aspects of welding process as current, voltage, rate of travel, flow of gas, starting, and stopping. Other potential manufacturing applications include control of spray coating and of curing of composite materials. Potential nonmanufacturing uses include remote control of heating, air conditioning, and lighting in electrically noisy and otherwise hostile environments.
The paper presents a new approach to quantifying emissions from fugitive gaseous air pollution sources. Computed tomography (CT) and path-integrated optical remote sensing (PI-ORS) concentration data are combined in a new field beam geometry. Path-integrated concentrations are ...
New space sensor and mesoscale data analysis
NASA Technical Reports Server (NTRS)
Hickey, John S.
1987-01-01
The developed Earth Science and Application Division (ESAD) system/software provides the research scientist with the following capabilities: an extensive data base management capability to convert various experiment data types into a standard format; an interactive analysis and display package (AVE80); an interactive imaging/color graphics capability utilizing the Apple III and IBM PC workstations integrated into the ESAD computer system; and a local and remote smart-terminal capability which provides color video, graphics, and Laserjet output. Recommendations for updating and enhancing the performance of the ESAD computer system are listed.
Mesoscale and severe storms (MASS) data management and analysis system
NASA Technical Reports Server (NTRS)
Hickey, J. S.; Karitani, S.; Dickerson, M.
1984-01-01
Progress on the Mesoscale and Severe Storms (MASS) data management and analysis system is described. An interactive atmospheric data base management software package to convert four types of data (Sounding, Single Level, Grid, Image) into standard random access formats is implemented and integrated with the MASS AVE80 Series general-purpose plotting and graphics display data analysis software package. An interactive analysis and display graphics software package (AVE80) for analyzing large volumes of conventional and satellite-derived meteorological data is enhanced to provide imaging/color graphics display utilizing color video hardware integrated into the MASS computer system. Local and remote smart-terminal capability is provided by installing Apple III computer systems within individual scientists' offices and integrating them with the MASS system, thus providing color video display, graphics, and character display of the four data types.
Methods of training the graduate level and professional geologist in remote sensing technology
NASA Technical Reports Server (NTRS)
Kolm, K. E.
1981-01-01
Requirements for a basic course in remote sensing to accommodate the needs of the graduate-level and professional geologist are described. The course should stress the general topics of basic remote sensing theory, the theory and data types relating to different remote sensing systems, an introduction to the basic concepts of computer image processing and analysis, the characteristics of different data types, the development of methods for geological interpretations, the integration of all scales and data types of remote sensing in a given study, the integration of other data bases (geophysical and geochemical) into a remote sensing study, and geological remote sensing applications. The laboratories should stress hands-on experience to reinforce the concepts and procedures presented in the lecture. The geologist should then be encouraged to pursue a second course in computer image processing and analysis of remotely sensed data.
Wong, Kit Fai
2011-01-01
A virtual blood bank is a computer-controlled, electronically linked information management system that allows online ordering and real-time, remote delivery of blood for transfusion. It connects the site of testing to the point of care at a remote site in real time with networked computers, thus maintaining the integrity of immunohematology test results. It takes advantage of information and communication technologies to ensure the accuracy of patient, specimen, and blood component identification and to enhance personnel traceability and system security. The built-in logic and process constraints in the design of the virtual blood bank can guide the selection of appropriate blood and minimize transfusion risk. The quality of the blood inventory is ascertained and monitored, and an audit trail for critical procedures in the transfusion process is provided by the paperless system. Thus, the virtual blood bank can help ensure that the right patient receives the right amount of the right blood component at the right time. PMID:21383930
NASA Technical Reports Server (NTRS)
Millwater, Harry; Riha, David
1996-01-01
The NESSUS and NASTRAN computer codes were successfully integrated. The enhanced NESSUS code will use NASTRAN for the structural analysis and NESSUS for the probabilistic analysis. Any quantities in the NASTRAN bulk data input can be random variables. Any NASTRAN result that is written to the OUTPUT2 file can be returned to NESSUS as the finite element result. The interfacing between NESSUS and NASTRAN is handled automatically by NESSUS. NESSUS and NASTRAN can be run on different machines using the remote host option.
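The coupling pattern described above, in which a deterministic finite element run is repeated with sampled values of bulk-data quantities, can be illustrated with a minimal conceptual sketch. The sketch below is not the NESSUS interface; `run_fem` is a hypothetical stand-in (here a toy closed-form response) for writing a perturbed input deck, running the structural solver, and reading back one result.

```python
import random
import statistics

def run_fem(thickness_mm):
    """Hypothetical stand-in for the deterministic structural analysis:
    write an input deck with this thickness, run the solver, and read one
    response quantity back from its output. A toy formula is used here."""
    load_n, width_mm = 1000.0, 50.0
    return load_n / (width_mm * thickness_mm)  # placeholder "stress" response

def monte_carlo(n_samples=1000, mean_t=2.0, sd_t=0.1):
    # Treat one input quantity (a thickness) as a random variable, run the
    # deterministic analysis once per sample, and summarize the response.
    responses = [run_fem(random.gauss(mean_t, sd_t)) for _ in range(n_samples)]
    return statistics.mean(responses), statistics.stdev(responses)

print(monte_carlo())
```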
A Hadoop-Based Distributed Framework for Efficient Managing and Processing Big Remote Sensing Images
NASA Astrophysics Data System (ADS)
Wang, C.; Hu, F.; Hu, X.; Zhao, S.; Wen, W.; Yang, C.
2015-07-01
Various sensors on airborne and satellite platforms are producing large volumes of remote sensing images for mapping, environmental monitoring, disaster management, military intelligence, and other applications. However, it is challenging to efficiently store, query, and process such big data due to data- and computing-intensive issues. In this paper, a Hadoop-based framework is proposed to manage and process big remote sensing data in a distributed and parallel manner. In particular, remote sensing data can be directly fetched from other data platforms into the Hadoop Distributed File System (HDFS). The Orfeo Toolbox, a ready-to-use tool for large image processing, is integrated into MapReduce to provide rich image processing operations. With the integration of HDFS, the Orfeo Toolbox, and MapReduce, these remote sensing images can be processed directly and in parallel in a scalable computing environment. The experimental results show that the proposed framework can efficiently manage and process such big remote sensing data.
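As a rough sketch of the processing pattern described above (image-level parallelism via MapReduce, with the Orfeo Toolbox doing the per-image work), a Hadoop Streaming mapper could look like the following. The `otbcli_Smoothing` call and the assumption that one HDFS image path arrives per line on stdin are illustrative choices, not details taken from the paper.

```python
#!/usr/bin/env python
"""Hadoop Streaming mapper sketch: one remote sensing image per input line."""
import os
import subprocess
import sys

for line in sys.stdin:
    image_path = line.strip()
    if not image_path:
        continue
    out_path = os.path.splitext(os.path.basename(image_path))[0] + "_smooth.tif"
    # Delegate the per-image work to an Orfeo Toolbox command-line application.
    result = subprocess.run(
        ["otbcli_Smoothing", "-in", image_path, "-out", out_path, "-type", "mean"],
        capture_output=True, text=True)
    # Emit a key/value pair for the reducer (image id -> exit status).
    print(f"{image_path}\t{result.returncode}")
```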
Navigation of military and space unmanned ground vehicles in unstructured terrains
NASA Technical Reports Server (NTRS)
Lescoe, Paul; Lavery, David; Bedard, Roger
1991-01-01
Development of unmanned vehicles for local navigation in terrains unstructured by humans is reviewed. Modes of navigation include teleoperation or remote control, computer-assisted remote driving (CARD), and semiautonomous navigation (SAN). A first implementation of a CARD system was successfully tested using the Robotic Technology Test Vehicle developed by Jet Propulsion Laboratory. Stereo pictures were transmitted to a remotely located human operator, who performed the sensing, perception, and planning functions of navigation. A computer provided range and angle measurements, and the path plan was transmitted to the vehicle, which autonomously executed the path. This implementation is to be enhanced by providing passive stereo vision and a reflex control system for autonomously stopping the vehicle if blocked by an obstacle. SAN achievements include implementation of a navigation testbed on a six-wheel, three-body articulated rover vehicle, development of SAN algorithms and code, integration of SAN software onto the vehicle, and a successful feasibility demonstration that represents a step toward the technology required for long-range exploration of the lunar or Martian surface. The vehicle includes a passive stereo vision system with real-time area-based stereo image correlation, a terrain matcher, a path planner, and a path execution planner.
Autonomous control systems: applications to remote sensing and image processing
NASA Astrophysics Data System (ADS)
Jamshidi, Mohammad
2001-11-01
One of the main challenges of any control (or image processing) paradigm is being able to handle complex systems under unforeseen uncertainties. A system may be called complex here if its dimension (order) is too high, its model (if available) is nonlinear and interconnected, and information on the system is so uncertain that classical techniques cannot easily handle the problem. Examples of complex systems are power networks, space robotic colonies, the national air traffic control system, an integrated manufacturing plant, the Hubble Telescope, and the International Space Station. Soft computing, a consortium of methodologies such as fuzzy logic, neuro-computing, genetic algorithms, and genetic programming, has proven to be a powerful tool for adding autonomy and semi-autonomy to many complex systems. For such systems the size of the soft computing control architecture will be nearly infinite. In this paper new paradigms using soft computing approaches are utilized to design autonomous controllers and image enhancers for a number of application areas. These applications are satellite array formations for synthetic aperture radar interferometry (InSAR) and enhancement of analog and digital images.
Remote Sensing: A valuable tool in the Forest Service decision making process. [in Utah
NASA Technical Reports Server (NTRS)
Stanton, F. L.
1975-01-01
Forest Service studies for integrating remotely sensed data into existing information systems highlight a need to: (1) re-examine present methods of collecting and organizing data, (2) develop an integrated information system for rapidly processing and interpreting data, (3) apply existing technological tools in new ways, and (4) provide accurate and timely information for making the right management decisions. The Forest Service developed an integrated information system using remote sensors, microdensitometers, computer hardware and software, and interactive accessories. Their efforts substantially reduce the time required to collect and process data.
Adopting Cloud Computing in the Pakistan Navy
2015-06-01
...administrative aspect is required to operate optimally, provide synchronized delivery of cloud services, and integrate a multi-provider cloud environment... also adopted cloud computing as an integral component of military operations conducted either locally or remotely, with the use of cloud services...
NASA Technical Reports Server (NTRS)
Pascucci, R. F.; Smith, A.
1982-01-01
To assist the U.S. Geological Survey in carrying out a Congressional mandate to investigate the use of side-looking airborne radar (SLAR) for resources exploration, a research program was conducted to define the contribution of SLAR imagery to structural geologic mapping and to compare this with contributions from other remote sensing systems. Imagery from two SLAR systems and from three other remote sensing systems was interpreted, and the resulting information was digitized, quantified and intercompared using a computer-assisted geographic information system (GIS). The study area covers approximately 10,000 square miles within the Naval Petroleum Reserve, Alaska, and is situated between the foothills of the Brooks Range and the North Slope. The principal objectives were: (1) to establish quantitatively, the total information contribution of each of the five remote sensing systems to the mapping of structural geology; (2) to determine the amount of information detected in common when the sensors are used in combination; and (3) to determine the amount of unique, incremental information detected by each sensor when used in combination with others. The remote sensor imagery that was investigated included real-aperture and synthetic-aperture radar imagery, standard and digitally enhanced LANDSAT MSS imagery, and aerial photos.
Calculations of atmospheric refraction for spacecraft remote-sensing applications
NASA Technical Reports Server (NTRS)
Chu, W. P.
1983-01-01
Analytical solutions to the refraction integrals appropriate for ray trajectories along slant paths through the atmosphere are derived in this paper. This type of geometry is commonly encountered in remote-sensing applications utilizing an occultation technique. The solutions are obtained by evaluating higher-order terms from expansion of the refraction integral and are dependent on the vertical temperature distributions. Refraction parameters such as total refraction angles, air masses, and path lengths can be accurately computed. It is also shown that the method can be used for computing refraction parameters in astronomical refraction geometry for large zenith angles.
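For orientation, the refraction integral referred to above is commonly written (in one standard form for a spherically symmetric atmosphere) in terms of the refractive index profile $n(r)$ and the impact parameter $a = n(r_0)\,r_0$ at the tangent radius $r_0$; the paper's analytical expansions of integrals of this type are not reproduced here:

$$\alpha(a) \;=\; -\,2a \int_{r_0}^{\infty} \frac{1}{\sqrt{n^{2} r^{2} - a^{2}}}\,\frac{d\ln n}{dr}\, dr .$$

The total refraction angle, air mass, and path length follow from integrals of this same family evaluated along the slant path.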
1994-06-28
developing Unmanned Aerial Vehicles, not for military use, but for civilian use, such as remote news coverage and remote tourism by broadcasting live... Interoperability, and Integration of Command, Control, Communications, Computers, and Intelligence Systems. CJCS Instruction no. 6212.01, Washington, D.C.: U.S.
Remote information service access system based on a client-server-service model
Konrad, Allan M.
1996-01-01
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
Remote information service access system based on a client-server-service model
Konrad, A.M.
1997-12-09
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.
Remote information service access system based on a client-server-service model
Konrad, Allan M.
1999-01-01
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
Remote information service access system based on a client-server-service model
Konrad, A.M.
1996-08-06
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service. 16 figs.
Remote information service access system based on a client-server-service model
Konrad, Allan M.
1997-01-01
A local host computing system, a remote host computing system as connected by a network, and service functionalities: a human interface service functionality, a starter service functionality, and a desired utility service functionality, and a Client-Server-Service (CSS) model is imposed on each service functionality. In one embodiment, this results in nine logical components and three physical components (a local host, a remote host, and an intervening network), where two of the logical components are integrated into one Remote Object Client component, and that Remote Object Client component and the other seven logical components are deployed among the local host and remote host in a manner which eases compatibility and upgrade problems, and provides an illusion to a user that a desired utility service supported on a remote host resides locally on the user's local host, thereby providing ease of use and minimal software maintenance for users of that remote service.
NASA Technical Reports Server (NTRS)
Parada, N. D. J. (Principal Investigator); Paradella, W. R.; Vitorello, I.
1982-01-01
Several aspects of computer-assisted analysis techniques for image enhancement and thematic classification, by which LANDSAT MSS imagery may be treated quantitatively, are explained. In geological applications, computer processing of digital data arguably allows the fullest use of LANDSAT data, by displaying enhanced and corrected data for visual analysis and by evaluating and assigning each spectral pixel's information to a given class.
RIP-REMOTE INTERACTIVE PARTICLE-TRACER
NASA Technical Reports Server (NTRS)
Rogers, S. E.
1994-01-01
Remote Interactive Particle-tracing (RIP) is a distributed-graphics program which computes particle traces for computational fluid dynamics (CFD) solution data sets. A particle trace is a line which shows the path a massless particle in a fluid will take; it is a visual image of where the fluid is going. The program is able to compute and display particle traces at a speed of about one trace per second because it runs on two machines concurrently. The data used by the program is contained in two files. The solution file contains data on density, momentum and energy quantities of a flow field at discrete points in three-dimensional space, while the grid file contains the physical coordinates of each of the discrete points. RIP requires two computers. A local graphics workstation interfaces with the user for program control and graphics manipulation, and a remote machine interfaces with the solution data set and performs time-intensive computations. The program utilizes two machines in a distributed mode for two reasons. First, the data to be used by the program is usually generated on the supercomputer. RIP avoids having to convert and transfer the data, eliminating any memory limitations of the local machine. Second, as computing the particle traces can be computationally expensive, RIP utilizes the power of the supercomputer for this task. Although the remote site code was developed on a CRAY, it is possible to port this to any supercomputer class machine with a UNIX-like operating system. Integration of a velocity field from a starting physical location produces the particle trace. The remote machine computes the particle traces using the particle-tracing subroutines from PLOT3D/AMES, a CFD post-processing graphics program available from COSMIC (ARC-12779). These routines use a second-order predictor-corrector method to integrate the velocity field. Then the remote program sends graphics tokens to the local machine via a remote-graphics library. The local machine interprets the graphics tokens and draws the particle traces. The program is menu-driven. RIP is implemented on the Silicon Graphics IRIS 3000 (local workstation) with an IRIX operating system and on the CRAY2 (remote station) with a UNICOS 1.0 or 2.0 operating system. The IRIS 4D can be used in place of the IRIS 3000. The program is written in C (67%) and FORTRAN 77 (43%) and has an IRIS memory requirement of 4 MB. The remote and local stations must use the same user ID. PLOT3D/AMES unformatted data sets are required for the remote machine. The program was developed in 1988.
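A minimal sketch of the second-order predictor-corrector integration described above, written in Python rather than the program's C/FORTRAN; `velocity_at` is a hypothetical function that interpolates the velocity field from the grid and solution files and is not the PLOT3D/AMES routine itself.

```python
import numpy as np

def trace_particle(velocity_at, x0, dt=0.01, n_steps=500):
    """Integrate a particle path through a steady velocity field using a
    second-order predictor-corrector (Heun) scheme."""
    path = [np.asarray(x0, dtype=float)]
    for _ in range(n_steps):
        x = path[-1]
        v_pred = velocity_at(x)           # predictor: Euler slope at x
        x_pred = x + dt * v_pred
        v_corr = velocity_at(x_pred)      # corrector: slope at predicted point
        path.append(x + 0.5 * dt * (v_pred + v_corr))
    return np.array(path)

# Example with a simple analytic swirl field standing in for interpolated CFD data.
swirl = lambda x: np.array([-x[1], x[0], 0.1])
trace = trace_particle(swirl, x0=(1.0, 0.0, 0.0))
```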
Fincke, E Michael; Padalka, Gennady; Lee, Doohi; van Holsbeeck, Marnix; Sargsyan, Ashot E; Hamilton, Douglas R; Martin, David; Melton, Shannon L; McFarlin, Kellie; Dulchavsky, Scott A
2005-02-01
Investigative procedures were approved by Henry Ford Human Investigation Committee and NASA Johnson Space Center Committee for Protection of Human Subjects. Informed consent was obtained. Authors evaluated ability of nonphysician crewmember to obtain diagnostic-quality musculoskeletal ultrasonographic (US) data of the shoulder by following a just-in-time training algorithm and using real-time remote guidance aboard the International Space Station (ISS). ISS Expedition-9 crewmembers attended a 2.5-hour didactic and hands-on US training session 4 months before launch. Aboard the ISS, they completed a 1-hour computer-based Onboard Proficiency Enhancement program 7 days before examination. Crewmembers did not receive specific training in shoulder anatomy or shoulder US techniques. Evaluation of astronaut shoulder integrity was done by using a Human Research Facility US system. Crew used special positioning techniques for subject and operator to facilitate US in microgravity environment. Common anatomic reference points aided initial probe placement. Real-time US video of shoulder was transmitted to remote experienced sonologists in Telescience Center at Johnson Space Center. Probe manipulation and equipment adjustments were guided with verbal commands from remote sonologists to astronaut operators to complete rotator cuff evaluation. Comprehensive US of crewmember's shoulder included transverse and longitudinal images of biceps and supraspinatus tendons and articular cartilage surface. Total examination time required to guide astronaut operator to acquire necessary images was approximately 15 minutes. Multiple arm and probe positions were used to acquire dynamic video images that were of excellent quality to allow evaluation of shoulder integrity. Postsession download and analysis of high-fidelity US images collected onboard demonstrated additional anatomic detail that could be used to exclude subtle injury. Musculoskeletal US can be performed in space by minimally trained operators by using remote guidance. This technique can be used to evaluate shoulder integrity in symptomatic crewmembers after strenuous extravehicular activities or to monitor microgravity-associated changes in musculoskeletal anatomy. Just-in-time training, combined with remote experienced physician guidance, may provide a useful approach to complex medical tasks performed by nonexperienced personnel in a variety of remote settings, including current and future space programs. (c) RSNA, 2004.
NASA Technical Reports Server (NTRS)
Fincke, E. Michael; Padalka, Gennady; Lee, Doohi; van Holsbeeck, Marnix; Sargsyan, Ashot E.; Hamilton, Douglas R.; Martin, David; Melton, Shannon L.; McFarlin, Kellie; Dulchavsky, Scott A.
2005-01-01
Investigative procedures were approved by Henry Ford Human Investigation Committee and NASA Johnson Space Center Committee for Protection of Human Subjects. Informed consent was obtained. Authors evaluated ability of nonphysician crewmember to obtain diagnostic-quality musculoskeletal ultrasonographic (US) data of the shoulder by following a just-in-time training algorithm and using real-time remote guidance aboard the International Space Station (ISS). ISS Expedition-9 crewmembers attended a 2.5-hour didactic and hands-on US training session 4 months before launch. Aboard the ISS, they completed a 1-hour computer-based Onboard Proficiency Enhancement program 7 days before examination. Crewmembers did not receive specific training in shoulder anatomy or shoulder US techniques. Evaluation of astronaut shoulder integrity was done by using a Human Research Facility US system. Crew used special positioning techniques for subject and operator to facilitate US in microgravity environment. Common anatomic reference points aided initial probe placement. Real-time US video of shoulder was transmitted to remote experienced sonologists in Telescience Center at Johnson Space Center. Probe manipulation and equipment adjustments were guided with verbal commands from remote sonologists to astronaut operators to complete rotator cuff evaluation. Comprehensive US of crewmember's shoulder included transverse and longitudinal images of biceps and supraspinatus tendons and articular cartilage surface. Total examination time required to guide astronaut operator to acquire necessary images was approximately 15 minutes. Multiple arm and probe positions were used to acquire dynamic video images that were of excellent quality to allow evaluation of shoulder integrity. Postsession download and analysis of high-fidelity US images collected onboard demonstrated additional anatomic detail that could be used to exclude subtle injury. Musculoskeletal US can be performed in space by minimally trained operators by using remote guidance. This technique can be used to evaluate shoulder integrity in symptomatic crewmembers after strenuous extravehicular activities or to monitor microgravity-associated changes in musculoskeletal anatomy. Just-in-time training, combined with remote experienced physician guidance, may provide a useful approach to complex medical tasks performed by nonexperienced personnel in a variety of remote settings, including current and future space programs. (c) RSNA, 2004.
Integration of remote sensing and surface geophysics in the detection of faults
NASA Technical Reports Server (NTRS)
Jackson, P. L.; Shuchman, R. A.; Wagner, H.; Ruskey, F.
1977-01-01
Remote sensing was included in a comprehensive investigation of the use of geophysical techniques to aid in underground mine placement. The primary objective was to detect faults and slumping, features which, due to structural weakness and excess water, cause construction difficulties and safety hazards in mine construction. Preliminary geologic reconnaissance was performed on a potential site for an underground oil shale mine in the Piceance Creek Basin of Colorado. LANDSAT data, black and white aerial photography and 3 cm radar imagery were obtained. LANDSAT data were primarily used in optical imagery and digital tape forms, both of which were analyzed and enhanced by computer techniques. The aerial photography and radar data offered supplemental information. Surface linears in the test area were located and mapped principally from LANDSAT data. A specific, relatively wide, linear pointed directly toward the test site, but did not extend into it. Density slicing, ratioing, and edge enhancement of the LANDSAT data all indicated the existence of this linear. Radar imagery marginally confirmed the linear, while aerial photography did not confirm it.
A comparison of operational remote sensing-based models for estimating crop evapotranspiration
USDA-ARS?s Scientific Manuscript database
The integration of remotely sensed data into models of actual evapotranspiration has allowed for the estimation of water consumption across agricultural regions. Two modeling approaches have been successfully applied. The first approach computes a surface energy balance using the radiometric surface...
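The first modeling approach mentioned above rests on the surface energy balance. In its usual residual form (written generically here, not as the specific formulation of the models compared in the manuscript), the latent heat flux associated with evapotranspiration is

$$\lambda E \;=\; R_n - G - H,$$

where $R_n$ is net radiation, $G$ the soil heat flux, $H$ the sensible heat flux, and $\lambda E$ the latent heat flux.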
PI2GIS: processing image to geographical information systems, a learning tool for QGIS
NASA Astrophysics Data System (ADS)
Correia, R.; Teodoro, A.; Duarte, L.
2017-10-01
To perform an accurate interpretation of remote sensing images, it is necessary to extract information using different image processing techniques. Nowadays, it has become usual to use image processing plugins to add new capabilities/functionalities integrated in Geographical Information System (GIS) software. The aim of this work was to develop an open source application to automatically process and classify remote sensing images from a set of satellite input data. The application was integrated in a GIS software package (QGIS), automating several image processing steps. The use of QGIS for this purpose is justified since it is easy and quick to develop new plugins using the Python language. This plugin is inspired by the Semi-Automatic Classification Plugin (SCP) developed by Luca Congedo. SCP allows the supervised classification of remote sensing images, the calculation of vegetation indices such as NDVI (Normalized Difference Vegetation Index) and EVI (Enhanced Vegetation Index), and other image processing operations. When analysing SCP, it was realized that a set of operations that are very useful in teaching remote sensing and image processing classes was lacking, such as the visualization of histograms, the application of filters, different image corrections, unsupervised classification, and the computation of several environmental indices. The new set of operations included in the PI2GIS plugin can be divided into three groups: pre-processing, processing, and classification procedures. The application was tested using a Landsat 8 OLI image of a northern area of Portugal.
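To illustrate the index computations such a plugin exposes, here is a minimal NumPy sketch using the standard NDVI and EVI definitions; the band assignments assume Landsat 8 OLI surface reflectance (band 4 red, band 5 NIR, band 2 blue), and the placeholder arrays are not data from the study.

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index
    return (nir - red) / (nir + red + 1e-10)

def evi(nir, red, blue, G=2.5, C1=6.0, C2=7.5, L=1.0):
    # Enhanced Vegetation Index with the commonly used MODIS-style coefficients
    return G * (nir - red) / (nir + C1 * red - C2 * blue + L)

# Placeholder reflectance arrays standing in for Landsat 8 OLI bands 4, 5, and 2.
red = np.random.rand(100, 100)
nir = np.random.rand(100, 100)
blue = np.random.rand(100, 100)
print(ndvi(nir, red).mean(), evi(nir, red, blue).mean())
```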
Integrated analysis of remote sensing products from basic geological surveys. [Brazil
NASA Technical Reports Server (NTRS)
Dasilvafagundesfilho, E. (Principal Investigator)
1984-01-01
Recent advances in remote sensing have led to the development of several techniques for obtaining image information. These techniques are analyzed as effective tools in geological mapping, and a strategy for optimizing the use of images in basic geological surveying is presented. It embraces an integrated analysis of spatial, spectral, and temporal data through photoptic (color additive viewer) and computer processing at different scales, allowing large areas to be surveyed in a fast, precise, and low-cost manner.
Diagnostics in the Extendable Integrated Support Environment (EISE)
NASA Technical Reports Server (NTRS)
Brink, James R.; Storey, Paul
1988-01-01
Extendable Integrated Support Environment (EISE) is a real-time computer network consisting of commercially available hardware and software components to support systems level integration, modifications, and enhancement to weapons systems. The EISE approach offers substantial potential savings by eliminating unique support environments in favor of sharing common modules for the support of operational weapon systems. An expert system is being developed that will help support diagnosing faults in this network. This is a multi-level, multi-expert diagnostic system that uses experiential knowledge relating symptoms to faults and also reasons from structural and functional models of the underlying physical model when experiential reasoning is inadequate. The individual expert systems are orchestrated by a supervisory reasoning controller, a meta-level reasoner which plans the sequence of reasoning steps to solve the given specific problem. The overall system, termed the Diagnostic Executive, accesses systems level performance checks and error reports, and issues remote test procedures to formulate and confirm fault hypotheses.
Chen, Hung-Ming; Lo, Jung-Wen; Yeh, Chang-Kuo
2012-12-01
The rapidly increasing availability of always-on broadband telecommunication environments and lower-cost vital signs monitoring devices brings the advantages of telemedicine directly into the patient's home. Hence, the control of access to remote medical servers' resources has become a crucial challenge. A secure authentication scheme between the medical server and remote users is therefore needed to safeguard data integrity and confidentiality and to ensure availability. Recently, many authentication schemes that use low-cost mobile devices have been proposed to meet these requirements. In contrast to previous schemes, Khan et al. proposed a dynamic ID-based remote user authentication scheme that reduces computational complexity and includes features such as a provision for the revocation of lost or stolen smart cards and a time expiry check for the authentication process. However, Khan et al.'s scheme has some security drawbacks. To remedy these, this study proposes an enhanced authentication scheme that overcomes the weaknesses inherent in Khan et al.'s scheme and demonstrates that this scheme is more secure and robust for use in a telecare medical information system.
NASA Technical Reports Server (NTRS)
Tilmann, S. E.; Enslin, W. R.; Hill-Rowley, R.
1977-01-01
A computer-based information system designed to assist in the integration of commonly available spatial data for regional planning and resource analysis is described. The Resource Analysis Program (RAP) provides a variety of analytical and mapping phases for single-factor or multi-factor analyses. The unique analytical and graphic capabilities of RAP are demonstrated with a study conducted in Windsor Township, Eaton County, Michigan. Soil, land cover/use, topographic, and geological maps were used as a data base to develop an eleven-map portfolio. The major themes of the portfolio are land cover/use, non-point water pollution, waste disposal, and ground water recharge.
NASA Technical Reports Server (NTRS)
Seinfeld, J. H. (Principal Investigator)
1982-01-01
The problem of the assimilation of remote sensing data into mathematical models of atmospheric pollutant species was investigated. The data assimilation problem is posed in terms of the matching of spatially integrated species burden measurements to the predicted three-dimensional concentration fields from atmospheric diffusion models. General conditions were derived for the reconstructability of atmospheric concentration distributions from data typical of remote sensing applications, and a computational algorithm (filter) for the processing of remote sensing data was developed.
NASA Technical Reports Server (NTRS)
Seinfeld, J. H. (Principal Investigator)
1982-01-01
The problem of the assimilation of remote sensing data into mathematical models of atmospheric pollutant species was investigated. The problem is posed in terms of the matching of spatially integrated species burden measurements to the predicted three-dimensional concentration fields from atmospheric diffusion models. General conditions are derived for the "reconstructability" of atmospheric concentration distributions from data typical of remote sensing applications, and a computational algorithm (filter) for the processing of remote sensing data is developed.
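The measurement model implied by "spatially integrated species burden" in the two abstracts above can be written, in its simplest hedged form, as a vertical integral of the concentration field predicted by the diffusion model,

$$b(x, y, t) \;=\; \int_{0}^{\infty} c(x, y, z, t)\, dz ,$$

and the reconstructability conditions concern when the three-dimensional field $c$ can be recovered from observations of the column quantity $b$; the report's specific filter equations are not reproduced here.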
2007-09-01
example, an application developed in Sun's Netbeans [2007] integrated development environment (IDE) uses Swing class objects for graphical user... Netbeans Version 5.5.1 [Computer Software]. Santa Clara, CA: Sun Microsystems. Process Modeler Version 7.0 [Computer Software]. Santa Clara, CA.
Extending IPsec for Efficient Remote Attestation
NASA Astrophysics Data System (ADS)
Sadeghi, Ahmad-Reza; Schulz, Steffen
When establishing a VPN to connect different sites of a network, the integrity of the involved VPN endpoints is often a major security concern. Based on the Trusted Platform Module (TPM), available in many computing platforms today, remote attestation mechanisms can be used to evaluate the internal state of remote endpoints automatically. However, existing protocols and extensions are either unsuited for use with IPsec or impose considerable additional implementation complexity and protocol overhead.
The Radio Frequency Health Node Wireless Sensor System
NASA Technical Reports Server (NTRS)
Valencia, J. Emilio; Stanley, Priscilla C.; Mackey, Paul J.
2009-01-01
The Radio Frequency Health Node (RFHN) wireless sensor system differs from other wireless sensor systems in ways originally intended to enhance its utility as an instrumentation system for a spacecraft. The RFHN can also be adapted for use in terrestrial applications in which there are requirements for operational flexibility and integrability into higher-level instrumentation and data acquisition systems. The heart of the system is the RFHN, which is a unit that passes commands and data between (1) one or more commercially available wireless sensor units (optionally, also including wired sensor units) and (2) command and data interfaces with a local control computer that may be part of the spacecraft or other engineering system in which the wireless sensor system is installed. In turn, the local control computer can be in radio or wire communication with a remote control computer that may be part of a higher-level system. The remote control computer, acting via the local control computer and the RFHN, can not only monitor readout data from the sensor units but also remotely configure (program or reprogram) the RFHN and the sensor units during operation. In a spacecraft application, the RFHN and the sensor units can also be configured more directly, prior to launch, via a serial interface that includes an umbilical cable between the spacecraft and ground support equipment. In either case, the RFHN wireless sensor system has the flexibility to be configured, as required, with different numbers and types of sensors for different applications. The RFHN can be used to effect real-time transfer of data from, and commands to, the wireless sensor units. It can also store data for later retrieval by an external computer. The RFHN communicates with the wireless sensor units via a radio transceiver module. The modular design of the RFHN makes it possible to add radio transceiver modules as needed to accommodate additional sets of wireless sensor units. The RFHN includes a core module that performs generic computer functions, including management of power and input, output, processing, and storage of data. In a typical application, the processing capabilities in the RFHN are utilized to perform preprocessing, trending, and fusion of sensor data. The core module also serves as the unit through which the remote control computer configures the sensor units and the rest of the RFHN.
On the Integration of Remote Experimentation into Undergraduate Laboratories--Pedagogical Approach
ERIC Educational Resources Information Center
Esche, Sven K.
2005-01-01
This paper presents an Internet-based open approach to laboratory instruction. In this article, the author talks about an open laboratory approach using a multi-user multi-device remote facility. This approach involves both the direct contact with the computer-controlled laboratory setup of interest with the students present in the laboratory…
The integrated design and archive of space-borne signal processing and compression coding
NASA Astrophysics Data System (ADS)
He, Qiang-min; Su, Hao-hang; Wu, Wen-bo
2017-10-01
With the increasing demand from users for the extraction of remote sensing image information, it is very urgent to significantly enhance the whole system's imaging quality and imaging ability by using an integrated design to achieve a compact structure, light weight, and higher attitude maneuverability. At the present stage, the remote sensing camera's video signal processing unit and its image compression and coding unit are distributed in different devices. The volume, weight, and power consumption of these two units are relatively large, which makes them unable to meet the requirements of a high-mobility remote sensing camera. In accordance with the technical requirements of the high-mobility remote sensing camera, this paper designs a space-borne integrated signal processing and compression circuit drawing on a variety of technologies, such as high-speed and high-density analog-digital mixed PCB design, embedded DSP technology, and image compression technology based on special-purpose chips. This circuit lays a solid foundation for the research of the high-mobility remote sensing camera.
NASA Astrophysics Data System (ADS)
Thakur, Jay Krishna; Singh, Sudhir Kumar; Ekanthalu, Vicky Shettigondahalli
2017-07-01
The integration of remote sensing (RS), geographic information systems (GIS), and the global positioning system (GPS) is an emerging research area in the fields of groundwater hydrology, resource management, environmental monitoring, and emergency response. Recent advancements in RS, GIS, GPS, and higher levels of computation help in providing and handling a range of data simultaneously in a time- and cost-efficient manner. This review paper deals with hydrological modeling, the uses of remote sensing and GIS in hydrological modeling, models of integration and the need for them, and finally the conclusions. After dealing with these issues conceptually and technically, we can develop better methods and novel approaches to handle large data sets and to better communicate information related to rapidly decreasing societal resources, i.e., groundwater.
NASA Astrophysics Data System (ADS)
Li, J.; Zhang, T.; Huang, Q.; Liu, Q.
2014-12-01
Today's climate datasets are characterized by large volume, a high degree of spatiotemporal complexity, and fast evolution over time. As visualizing large-volume distributed climate datasets is computationally intensive, traditional desktop-based visualization applications fail to handle the computational intensity. Recently, scientists have developed remote visualization techniques to address this computational issue. Remote visualization techniques usually leverage server-side parallel computing capabilities to perform visualization tasks and deliver visualization results to clients through the network. In this research, we aim to build a remote parallel visualization platform for visualizing and analyzing massive climate data. Our visualization platform was built on ParaView, which is one of the most popular open source remote visualization and analysis applications. To further enhance the scalability and stability of the platform, we have employed cloud computing techniques to support its deployment. In this platform, all climate datasets are regular grid data stored in NetCDF format. Three types of data access methods are supported: accessing remote datasets provided by OPeNDAP servers, accessing datasets hosted on the web visualization server, and accessing local datasets. Regardless of the data access method, all visualization tasks are completed on the server side to reduce the workload of clients. As a proof of concept, we have implemented a set of scientific visualization methods to show the feasibility of the platform. Preliminary results indicate that the framework can address the computational limitations of desktop-based visualization applications.
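A minimal sketch of the first data access path mentioned above (remote NetCDF datasets served over OPeNDAP), using the netCDF4-python bindings; the URL and variable name are placeholders, and the platform's actual server-side ParaView pipeline is not shown.

```python
from netCDF4 import Dataset  # requires a netCDF library built with OPeNDAP (DAP) support

url = "http://example.org/opendap/reanalysis/air_temperature.nc"  # hypothetical endpoint
ds = Dataset(url)               # opens the remote dataset without downloading it whole
air = ds.variables["air"]       # variable name is an assumption about the dataset
subset = air[0, :50, :50]       # only this slice is transferred by the DAP server
print(subset.shape, float(subset.mean()))
ds.close()
```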
Information Power Grid Posters
NASA Technical Reports Server (NTRS)
Vaziri, Arsi
2003-01-01
This document is a summary of the accomplishments of the Information Power Grid (IPG). Grids are an emerging technology that provides seamless and uniform access to the geographically dispersed computational, data storage, networking, instrument, and software resources needed for solving large-scale scientific and engineering problems. The goal of the NASA IPG is to use NASA's remotely located computing and data system resources to build distributed systems that can address problems that are too large or complex for a single site. The accomplishments outlined in this poster presentation are: access to distributed data, IPG heterogeneous computing, integration of a large-scale computing node into a distributed environment, remote access to high data rate instruments, and an exploratory grid environment.
NASA Astrophysics Data System (ADS)
Dorigo, W. A.; Zurita-Milla, R.; de Wit, A. J. W.; Brazile, J.; Singh, R.; Schaepman, M. E.
2007-05-01
During the last 50 years, the management of agroecosystems has been undergoing major changes to meet the growing demand for food, timber, fibre and fuel. As a result of this intensified use, the ecological status of many agroecosystems has severely deteriorated. Modeling the behavior of agroecosystems is, therefore, of great help since it allows the definition of management strategies that maximize (crop) production while minimizing the environmental impacts. Remote sensing can support such modeling by offering information on the spatial and temporal variation of important canopy state variables which would be very difficult to obtain otherwise. In this paper, we present an overview of different methods that can be used to derive biophysical and biochemical canopy state variables from optical remote sensing data in the VNIR-SWIR regions. The overview is based on an extensive literature review where both statistical-empirical and physically based methods are discussed. Subsequently, the prevailing techniques of assimilating remote sensing data into agroecosystem models are outlined. The increasing complexity of data assimilation methods and of models describing agroecosystem functioning has significantly increased computational demands. For this reason, we include a short section on the potential of parallel processing to deal with the complex and computationally intensive algorithms described in the preceding sections. The studied literature reveals that many valuable techniques have been developed both for the retrieval of canopy state variables from reflective remote sensing data and for assimilating the retrieved variables into agroecosystem models. However, for agroecosystem modeling and remote sensing data assimilation to be commonly employed on a global operational basis, emphasis will have to be put on bridging the mismatch between data availability and accuracy on one hand, and model and user requirements on the other. This could be achieved by integrating imagery with different spatial, temporal, spectral, and angular resolutions, and the fusion of optical data with data of different origin, such as LIDAR and radar/microwave.
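Many of the assimilation techniques surveyed in reviews of this kind minimize a variational cost function of the general form below (one common formulation, not tied to any specific study cited above), balancing departure from the model background state $x_b$ against misfit to the remotely sensed observations $y$ through the observation operator $H$:

$$J(x) \;=\; \tfrac{1}{2}\,(x - x_b)^{\mathsf T} B^{-1} (x - x_b) \;+\; \tfrac{1}{2}\,\big(y - H(x)\big)^{\mathsf T} R^{-1} \big(y - H(x)\big),$$

with $B$ and $R$ the background and observation error covariance matrices.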
Naver: a PC-cluster-based VR system
NASA Astrophysics Data System (ADS)
Park, ChangHoon; Ko, HeeDong; Kim, TaiYun
2003-04-01
In this paper, we present a new framework, NAVER, for virtual reality applications. NAVER is based on a cluster of low-cost personal computers. The goal of NAVER is to provide a flexible, extensible, scalable and re-configurable framework for virtual environments, defined as the integration of 3D virtual space and external modules. External modules are various input or output devices and applications on remote hosts. From the system's point of view, the personal computers are divided into three servers according to their specific functions: Render Server, Device Server and Control Server. While the Device Server contains external modules requiring event-based communication for the integration, the Control Server contains external modules requiring synchronous communication every frame. The Render Server consists of five managers: Scenario Manager, Event Manager, Command Manager, Interaction Manager and Sync Manager. These managers support the declaration and operation of the virtual environment and the integration with external modules on remote servers.
Multiplexing electro-optic architectures for advanced aircraft integrated flight control systems
NASA Technical Reports Server (NTRS)
Seal, D. W.
1989-01-01
This report describes the results of a 10-month program sponsored by NASA. The objective of this program was to evaluate various optical sensor modulation technologies and to design an optimal Electro-Optic Architecture (EOA) for servicing remote clusters of sensors and actuators in advanced aircraft flight control systems. The EOAs supply optical power to remote sensors and actuators, process the modulated optical signals returned from the sensors, and produce conditioned electrical signals acceptable for use by a digital flight control computer or Vehicle Management System (VMS) computer. This study was part of a multi-year initiative under the Fiber Optic Control System Integration (FOCSI) program to design, develop, and test a totally integrated fiber optic flight/propulsion control system for application to advanced aircraft. Unlike earlier FOCSI studies, this program concentrated on the design of the EOA interface rather than the optical transducer technology itself.
Buried waste integrated demonstration human engineered control station. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1994-09-01
This document describes the Human Engineered Control Station (HECS) project activities, including the conceptual designs. The purpose of the HECS is to enhance the effectiveness and efficiency of remote retrieval by providing an integrated remote control station. The HECS integrates human capabilities, limitations, and expectations into the design to reduce the potential for human error, provides a system that is easy to learn and operate, provides increased productivity, and reduces the ultimate investment in training. The overall HECS consists of the technology interface stations, supporting engineering aids, platform (trailer), communications network (broadband system), and collision avoidance system.
NASA Astrophysics Data System (ADS)
di, L.; Deng, M.
2010-12-01
Remote sensing (RS) is an essential method of collecting data for Earth science research. Huge amounts of remote sensing data, most of them in image form, have been acquired. Almost all geography departments in the world offer courses in the digital processing of remote sensing images. Such courses place emphasis on how to digitally process large amounts of multi-source images for solving real-world problems. However, due to the diversity and complexity of RS images and the shortcomings of current data and processing infrastructure, obstacles to effectively teaching such courses remain. The major obstacles include 1) difficulties in finding, accessing, integrating and using massive RS images by students and educators, and 2) inadequate processing functions and computing facilities for students to freely explore the massive data. Recent developments in geospatial Web processing service systems, which make massive data, computing power, and processing capabilities available to average Internet users anywhere in the world, promise the removal of these obstacles. The GeoBrain system developed by CSISS is an example of such systems. All functions available in the GRASS open source GIS have been implemented as Web services in GeoBrain. Petabytes of remote sensing images in NASA data centers, the USGS Landsat data archive, and NOAA CLASS are accessible transparently and processable through GeoBrain. The GeoBrain system is operated on a high performance cluster server with large disk storage and a fast Internet connection. All GeoBrain capabilities can be accessed by any Internet-connected Web browser. Dozens of universities have used GeoBrain as an ideal platform to support data-intensive remote sensing education. This presentation gives a specific example of using GeoBrain geoprocessing services to enhance the teaching of GGS 588, Digital Remote Sensing, taught at the Department of Geography and Geoinformation Science, George Mason University. The course uses the textbook "Introductory Digital Image Processing: A Remote Sensing Perspective" authored by John Jensen. The textbook is widely adopted in geography departments around the world for training students in the digital processing of remote sensing images. In the traditional teaching setting for the course, the instructor prepares a set of sample remote sensing images to be used for the course. Commercial desktop remote sensing software, such as ERDAS, is used for students to do the lab exercises. The students have to do the exercises in the lab and can only use the sample images. For this specific course at GMU, we developed GeoBrain-based lab exercises. With GeoBrain, students can now explore petabytes of remote sensing images in the NASA, NOAA, and USGS data archives instead of dealing only with sample images. Students have a much more powerful computing facility available for their lab exercises. They can explore the data and do the exercises at any time and any place, as long as they can access the Internet through a Web browser. The feedback from students has been very positive about the learning experience in digital image processing with the help of GeoBrain web processing services. The teaching/lab materials and GeoBrain services are freely available to anyone at http://www.laits.gmu.edu.
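Geospatial Web processing services of the kind described above are typically invoked over HTTP. As a hedged illustration only (the endpoint URL below is hypothetical, and GeoBrain's actual interface details are not given in the abstract), a standard OGC WPS capabilities request from Python looks like this:

```python
import requests

# Hypothetical endpoint; the abstract points to http://www.laits.gmu.edu for the real services.
endpoint = "http://example.gmu.edu/wps"
params = {"service": "WPS", "request": "GetCapabilities", "version": "1.0.0"}
response = requests.get(endpoint, params=params, timeout=30)
print(response.status_code)
print(response.text[:500])  # XML listing the geoprocessing operations offered
```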
An Approach to Integrate a Space-Time GIS Data Model with High Performance Computers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Dali; Zhao, Ziliang; Shaw, Shih-Lung
2011-01-01
In this paper, we describe an approach to integrating a Space-Time GIS data model on a high-performance computing platform. The Space-Time GIS data model was developed in a desktop computing environment. We use the Space-Time GIS data model to generate a GIS module, which organizes a series of remote sensing data. We are in the process of porting the GIS module into an HPC environment, in which the GIS modules handle large datasets directly via a parallel file system. Although this is an ongoing project, the authors hope this effort can inspire further discussion on the integration of GIS on high-performance computing platforms.
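A minimal sketch of the data-access pattern this implies, with each process reading its own band of a large grid from a parallel file system, is shown below; the file name, data type, and grid dimensions are assumptions, not details from the paper.

    # Minimal sketch (not the authors' code) of rank-parallel access to a
    # large gridded dataset stored on a parallel file system. File name,
    # dtype, and grid shape are illustrative assumptions.
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    NROWS, NCOLS = 120_000, 80_000         # assumed grid dimensions
    grid = np.memmap("remote_sensing_stack.dat", dtype=np.float32,
                     mode="r", shape=(NROWS, NCOLS))

    # Each rank works on its own horizontal band of rows.
    rows_per_rank = NROWS // size
    start = rank * rows_per_rank
    stop = NROWS if rank == size - 1 else start + rows_per_rank
    band_mean = float(grid[start:stop].mean())

    # Gather the per-band statistics on rank 0.
    means = comm.gather(band_mean, root=0)
    if rank == 0:
        print("per-band means:", means)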
Energy and remote sensing. [satellite exploration, monitoring, siting
NASA Technical Reports Server (NTRS)
Summers, R. A.; Smith, W. L.; Short, N. M.
1977-01-01
Exploration for uranium, thorium, oil, gas and geothermal activity through remote sensing techniques is considered; satellite monitoring of coal-derived CO2 in the atmosphere, and the remote assessment of strip mining and land restoration are also mentioned. Reference is made to color ratio composites based on Landsat data, which may aid in the detection of uranium deposits, and to computer-enhanced black and white airborne scanning imagery, which may locate geothermal anomalies. Other applications of remote sensing to energy resources management, including mapping of transportation networks and power plant siting, are discussed.
Mobile Tablet Use among Academic Physicians and Trainees
Sclafani, Joseph; Tirrell, Timothy F.
2014-01-01
The rapid adoption rate and integration of mobile technology (tablet computing devices and smartphones) by physicians is reshaping the current clinical landscape. These devices have sparked an evolution in a variety of arenas, including educational media dissemination, remote patient data access and point of care applications. Quantifying usage patterns of clinical applications of mobile technology is of interest to understand how these technologies are shaping current clinical care. A digital survey examining mobile tablet and associated application usage was administered via email to all ACGME training programs. Data regarding respondent specialty, level of training, and habits of tablet usage were collected and analyzed. 40 % of respondents used a tablet, of which the iPad was the most popular. Nearly half of the tablet owners reported using the tablet in clinical settings; the most commonly used application types were point of care and electronic medical record access. Increased level of training was associated with decreased support for mobile computing improving physician capabilities and patient interactions. There was strong and consistent desire for institutional support of mobile computing and integration of mobile computing technology into medical education. While many physicians are currently purchasing mobile devices, often without institutional support, successful integration of these devices into the clinical setting is still developing. Potential reasons behind the low adoption rate may include interference of technology in doctor-patient interactions or the lack of appropriate applications available for download. However, the results convincingly demonstrate that physicians recognize a potential utility in mobile computing, indicated by their desire for institutional support and integration of mobile technology into medical education. It is likely that the use of tablet computers in clinical practice will expand in the future. Thus, we believe medical institutions, providers, educators, and developers should collaborate in ways that enhance the efficacy, reliability, and safety of integrating these devices into daily medical practice. PMID:23321961
Remote voice training: A case study on space shuttle applications, appendix C
NASA Technical Reports Server (NTRS)
Mollakarimi, Cindy; Hamid, Tamin
1990-01-01
The Tile Automation System includes applications of automation and robotics technology to all aspects of the Shuttle tile processing and inspection system. An integrated set of rapid prototyping testbeds was developed which include speech recognition and synthesis, laser imaging systems, distributed Ada programming environments, distributed relational data base architectures, distributed computer network architectures, multi-media workbenches, and human factors considerations. Remote voice training in the Tile Automation System is discussed. The user is prompted over a headset by synthesized speech for the training sequences. The voice recognition units and the voice output units are remote from the user and are connected by Ethernet to the main computer system. A supervisory channel is used to monitor the training sequences. Discussions include the training approaches as well as the human factors problems and solutions for this system utilizing remote training techniques.
Autonomous Systems, Robotics, and Computing Systems Capability Roadmap: NRC Dialogue
NASA Technical Reports Server (NTRS)
Zornetzer, Steve; Gage, Douglas
2005-01-01
Contents include the following: Introduction. Process, Mission Drivers, Deliverables, and Interfaces. Autonomy. Crew-Centered and Remote Operations. Integrated Systems Health Management. Autonomous Vehicle Control. Autonomous Process Control. Robotics. Robotics for Solar System Exploration. Robotics for Lunar and Planetary Habitation. Robotics for In-Space Operations. Computing Systems. Conclusion.
Multisource Data Integration in Remote Sensing
NASA Technical Reports Server (NTRS)
Tilton, James C. (Editor)
1991-01-01
Papers presented at the workshop on Multisource Data Integration in Remote Sensing are compiled. The full text of these papers is included. New instruments and new sensors are discussed that can provide us with a large variety of new views of the real world. This huge amount of data has to be combined and integrated in a (computer-) model of this world. Multiple sources may give complementary views of the world - consistent observations from different (and independent) data sources support each other and increase their credibility, while contradictions may be caused by noise, errors during processing, or misinterpretations, and can be identified as such. As a consequence, integration results are very reliable and represent a valid source of information for any geographical information system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1997-05-01
The Interactive Computer-Enhanced Remote Viewing System (ICERVS) is a software tool for complex three-dimensional (3-D) visualization and modeling. Its primary purpose is to facilitate the use of robotic and telerobotic systems in remote and/or hazardous environments, where spatial information is provided by 3-D mapping sensors. ICERVS provides a robust, interactive system for viewing sensor data in 3-D and combines this with interactive geometric modeling capabilities that allow an operator to construct CAD models to match the remote environment. Part I of this report traces the development of ICERVS through three evolutionary phases: (1) development of first-generation software to render orthogonal view displays and wireframe models; (2) expansion of this software to include interactive viewpoint control, surface-shaded graphics, material (scalar and nonscalar) property data, cut/slice planes, color and visibility mapping, and generalized object models; (3) demonstration of ICERVS as a tool for the remediation of underground storage tanks (USTs) and the dismantlement of contaminated processing facilities. Part II of this report details the software design of ICERVS, with particular emphasis on its object-oriented architecture and user interface.
Test-bed for the remote health monitoring system for bridge structures using FBG sensors
NASA Astrophysics Data System (ADS)
Lee, Chin-Hyung; Park, Ki-Tae; Joo, Bong-Chul; Hwang, Yoon-Koog
2009-05-01
This paper reports on a test-bed for a long-term health monitoring system for bridge structures employing fiber Bragg grating (FBG) sensors, which is remotely accessible via the Web, to provide real-time quantitative information on a bridge's response to live loading and environmental changes, and fast prediction of the structure's integrity. The sensors are attached at several locations on the structure and connected to a data acquisition system permanently installed on site. The system can be accessed through remote communication over an optical cable network, through which the bridge behavior under live loading can be evaluated far from the field. Live structural data are transmitted continuously to the server computer at the central office. The server computer is securely connected to the Internet, where data can be retrieved, processed, and stored for remote Web-based health monitoring. The test-bed showed that the remote health monitoring technology will enable practical, cost-effective, and reliable condition assessment and maintenance of bridge structures.
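For background (the relation is not given in the abstract), the strain and temperature sensitivity of an FBG sensor is usually written to first order as

\[
\frac{\Delta\lambda_B}{\lambda_B} \;=\; (1 - p_e)\,\varepsilon \;+\; (\alpha + \xi)\,\Delta T ,
\]

where \(\lambda_B\) is the Bragg wavelength, \(p_e\) the effective photo-elastic coefficient (about 0.22 for silica fiber), \(\varepsilon\) the axial strain, and \(\alpha\) and \(\xi\) the thermal expansion and thermo-optic coefficients; the temperature term has to be compensated when the measured wavelength shift is interpreted as live-load strain.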
NASA Astrophysics Data System (ADS)
Liu, Jia; Liu, Longli; Xue, Yong; Dong, Jing; Hu, Yingcui; Hill, Richard; Guang, Jie; Li, Chi
2017-01-01
Workflow for remote sensing quantitative retrieval is the "bridge" between Grid services and Grid-enabled applications of remote sensing quantitative retrieval. Workflow hides the low-level implementation details of the Grid and hence enables users to focus on higher levels of the application. The workflow for remote sensing quantitative retrieval plays an important role in remote sensing Grid and Cloud computing services, which can support the modelling, construction and implementation of large-scale, complicated applications of remote sensing science. Validation of workflows is important in order to support large-scale, sophisticated scientific computation processes with enhanced performance and to minimize potential waste of time and resources. To study the semantic correctness of user-defined workflows, in this paper we propose a workflow validation method based on tacit knowledge research in the remote sensing domain. We first discuss the remote sensing model and metadata. Through detailed analysis, we then discuss the method of extracting domain tacit knowledge and expressing the knowledge with an ontology. Additionally, we construct the domain ontology with Protégé. Through our experimental study, we verify the validity of this method in two ways, namely data source consistency error validation and parameter matching error validation.
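The second of those checks can be illustrated with a very small sketch: a toy ontology (standing in for the Protégé-built one, with invented step and type names) is used to verify that every step of a retrieval workflow receives inputs that some earlier step actually produces.

    # Minimal sketch of "parameter matching error validation" for a retrieval
    # workflow. The tiny ontology dictionary stands in for the Protege-built
    # domain ontology; all step and type names are illustrative assumptions.
    ONTOLOGY = {
        # process name -> (required input types, produced output type)
        "radiometric_calibration": (["raw_dn_image"], "radiance_image"),
        "atmospheric_correction":  (["radiance_image"], "reflectance_image"),
        "aerosol_retrieval":       (["reflectance_image"], "aot_map"),
    }

    def validate_workflow(steps):
        """Check that each step's inputs are produced by an earlier step."""
        available = {"raw_dn_image"}          # assumed initial data source type
        errors = []
        for step in steps:
            needed, produced = ONTOLOGY[step]
            missing = [t for t in needed if t not in available]
            if missing:
                errors.append(f"{step}: missing inputs {missing}")
            available.add(produced)
        return errors

    workflow = ["radiometric_calibration", "aerosol_retrieval",
                "atmospheric_correction"]     # deliberately mis-ordered
    print(validate_workflow(workflow) or "workflow is consistent")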
Utility of remotely sensed data for identification of soil conservation practices
NASA Technical Reports Server (NTRS)
Pelletier, R. E.; Griffin, R. H.
1986-01-01
Discussed are a variety of remotely sensed data sources that may have utility in the identification of conservation practices and related linear features. Test sites were evaluated in Alabama, Kansas, Mississippi, and Oklahoma using one or more of a variety of remotely sensed data sources, including color infrared photography (CIR), LANDSAT Thematic Mapper (TM) data, and aircraft-acquired Thermal Infrared Multispectral Scanner (TIMS) data. Both visual examination and computer-implemented enhancement procedures were used to identify conservation practices and other linear features. For the Kansas, Mississippi, and Oklahoma test sites, photo interpretations of CIR identified up to 24 of the 109 conservation practices from a matrix derived from the SCS National Handbook of Conservation Practices. The conservation practice matrix was modified to predict the possibility of identifying the 109 practices at various photographic scales based on the observed results as well as photo interpreter experience. Some practices were successfully identified in TM data through visual identification, but a number of existing practices were of such size and shape that the resolution of the TM could not detect them accurately. A series of computer-automated decorrelation and filtering procedures served to enhance the conservation practices in TM data with only fair success. However, features such as field boundaries, roads, water bodies, and the Urban/Ag interface were easily differentiated. Similar enhancement techniques applied to 5 and 10 meter TIMS data proved much more useful in delineating terraces, grass waterways, and drainage ditches as well as the features mentioned above, due partly to improved resolution and partly to thermally influenced moisture conditions. Spatially oriented data such as those derived from remotely sensed data offer some promise in the inventory and monitoring of conservation practices as well as in supplying parameter data for a variety of computer-implemented agricultural models.
JPL Earth Science Center Visualization Multitouch Table
NASA Astrophysics Data System (ADS)
Kim, R.; Dodge, K.; Malhotra, S.; Chang, G.
2014-12-01
The JPL Earth Science Center Visualization Table combines specialized software and hardware that allow multitouch, multiuser, and remote display control to create seamlessly integrated experiences for visualizing JPL missions and their remote sensing data. The software is fully GIS capable through time-aware OGC WMTS, using the Lunar Mapping and Modeling Portal as the GIS backend to continuously ingest and retrieve real-time remote sensing data and satellite location data. The 55-inch and 82-inch unlimited-finger-count multitouch displays allow multiple users to explore JPL Earth missions and visualize remote sensing data through a very intuitive and interactive touch graphical user interface. To improve the integrated experience, the Earth Science Center Visualization Table team developed network streaming, which allows the table software to stream data visualizations to nearby remote displays through a computer network. This visualization/presentation tool not only supports Earth science operations but is specifically designed for education and public outreach, and will contribute significantly to STEM. Our presentation will include an overview of our software and hardware and a showcase of our system.
The paper describes a new approach to quantify emissions from area air pollution sources. The approach combines path-integrated concentration data acquired with any path-integrated optical remote sensing (PI-ORS) technique and computed tomography (CT) technique. In this study, an...
An integrated compact airborne multispectral imaging system using embedded computer
NASA Astrophysics Data System (ADS)
Zhang, Yuedong; Wang, Li; Zhang, Xuguo
2015-08-01
An integrated compact airborne multispectral imaging system with an embedded-computer-based control system was developed for multispectral imaging from small aircraft. The system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system), and an embedded computer. The embedded computer has excellent universality and expansibility and offers advantages in volume and weight for an airborne platform, so it can meet the requirements of the control system of the integrated airborne multispectral imaging system. The embedded computer controls camera parameter settings, filter wheel and stabilized platform operation, and image and POS data acquisition, and it stores the images and data. Peripheral devices can be connected through the ports of the embedded computer, so system operation and management of the stored image data are easy. The system has the advantages of small volume, multiple functions, and good expansibility. Imaging experiment results show that the system has potential for multispectral remote sensing in applications such as resource investigation and environmental monitoring.
Remote sensing of vegetation structure using computer vision
NASA Astrophysics Data System (ADS)
Dandois, Jonathan P.
High-spatial resolution measurements of vegetation structure are needed for improving understanding of ecosystem carbon, water and nutrient dynamics, the response of ecosystems to a changing climate, and for biodiversity mapping and conservation, among many research areas. Our ability to make such measurements has been greatly enhanced by continuing developments in remote sensing technology---allowing researchers the ability to measure numerous forest traits at varying spatial and temporal scales and over large spatial extents with minimal to no field work, which is costly for large spatial areas or logistically difficult in some locations. Despite these advances, there remain several research challenges related to the methods by which three-dimensional (3D) and spectral datasets are joined (remote sensing fusion) and the availability and portability of systems for frequent data collections at small scale sampling locations. Recent advances in the areas of computer vision structure from motion (SFM) and consumer unmanned aerial systems (UAS) offer the potential to address these challenges by enabling repeatable measurements of vegetation structural and spectral traits at the scale of individual trees. However, the potential advances offered by computer vision remote sensing also present unique challenges and questions that need to be addressed before this approach can be used to improve understanding of forest ecosystems. For computer vision remote sensing to be a valuable tool for studying forests, bounding information about the characteristics of the data produced by the system will help researchers understand and interpret results in the context of the forest being studied and of other remote sensing techniques. This research advances understanding of how forest canopy and tree 3D structure and color are accurately measured by a relatively low-cost and portable computer vision personal remote sensing system: 'Ecosynth'. Recommendations are made for optimal conditions under which forest structure measurements should be obtained with UAS-SFM remote sensing. Ultimately remote sensing of vegetation by computer vision offers the potential to provide an 'ecologist's eye view', capturing not only canopy 3D and spectral properties, but also seeing the trees in the forest and the leaves on the trees.
Remote sensing and geographically based information systems
NASA Technical Reports Server (NTRS)
Cicone, R. C.
1977-01-01
A structure is proposed for a geographically-oriented computer-based information system applicable to the analysis of remote sensing digital data. The structure, intended to answer a wide variety of user needs, would permit multiple views of the data, provide independent management of data security, quality and integrity, and rely on automatic data filing. Problems in geographically-oriented data systems, including those related to line encoding and cell encoding, are considered.
Technological requirements of teleneuropathological systems.
Szymaś, J
2000-01-01
Teleneuropathology is the practice of conducting remote neuropathological examinations with the use of telecommunication links. Because of the limited number of expert neuropathologists, some departments, especially smaller ones, have the equipment to conduct the examination but do not have a specialist able to evaluate material from the central nervous system. In teleneuropathology, a neuropathologist examines tissue fragments taken during an operation by means of a telemicroscope connected to a computer through a telecommunications network, which enables the neuropathologist to operate the microscope and camera remotely. Two basic systems exist for performing remote neuropathological examination: static and dynamic. Each has different needs in medical, computing, and telecommunication respects. Depending on the type of service, the public telephone network, the integrated services digital network (ISDN), or optical fibre should be used. The Internet can conditionally be used as a link for a teleneuropathological system. However, for the newest developments in teleneuropathology, such as teleconferencing and remote operation of a robotized microscope, only transmission over the integrated services digital network, which guarantees high transmission speed, makes communication possible. Because images are the basic information element in teleneuropathological systems, high-capacity equipment for acquisition, processing, storage, transmission, and visualization is necessary. Further development of telecommunications, as well as standardization of procedures for recording and transmitting pictorial data, is necessary.
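To make the bandwidth argument concrete, the rough arithmetic below compares per-frame transfer times over the link types mentioned; the image size, compression ratio, and link rates are assumptions for illustration, not figures from the paper.

    # Back-of-the-envelope transfer times for a single microscope field of view.
    # Image size, compression ratio, and link rates are assumptions.
    image_bits = 1024 * 1024 * 24          # 1 Mpixel, 24-bit colour
    compressed = image_bits / 20           # assume ~20:1 JPEG-style compression

    links_bps = {
        "PSTN modem (56 kbit/s)": 56_000,
        "ISDN 2B (128 kbit/s)":   128_000,
        "Broadband (10 Mbit/s)":  10_000_000,
    }

    for name, rate in links_bps.items():
        print(f"{name}: {compressed / rate:6.1f} s per frame")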
Incorporating Laptop Technologies into an Animal Sciences Curriculum
ERIC Educational Resources Information Center
Birrenkott, Glenn; Bertrand, Jean A.; Bolt, Brian
2005-01-01
Teaching animal sciences, like most agricultural disciplines, requires giving students hands-on learning opportunities in remote and often computer-unfriendly sites such as animal farms. How do faculty integrate laptop use into such an environment?
Remote Adaptive Motor Resistance Training Exercise Apparatus and Method of Use Thereof
NASA Technical Reports Server (NTRS)
Reich, Alton (Inventor); Shaw, James (Inventor)
2017-01-01
The invention comprises a method and/or an apparatus using a computer configured exercise system equipped with an electric motor to provide physical resistance to user motion in conjunction with means for sharing exercise system related data and/or user performance data with a secondary user, such as a medical professional, a physical therapist, a trainer, a computer generated competitor, and/or a human competitor. For example, the exercise system is used with a remote trainer to enhance exercise performance, with a remote medical professional for rehabilitation, and/or with a competitor in a competition, such as in a power/weightlifting competition or in a video game. The exercise system is optionally configured with an intelligent software assistant and knowledge navigator functioning as a personal assistant application.
Remote Adaptive Motor Resistance Training Exercise Apparatus and Method of Use Thereof
NASA Technical Reports Server (NTRS)
Shaw, James (Inventor); Reich, Alton (Inventor)
2016-01-01
The invention comprises a method and/or an apparatus using a computer configured exercise system equipped with an electric motor to provide physical resistance to user motion in conjunction with means for sharing exercise system related data and/or user performance data with a secondary user, such as a medical professional, a physical therapist, a trainer, a computer generated competitor, and/or a human competitor. For example, the exercise system is used with a remote trainer to enhance exercise performance, with a remote medical professional for rehabilitation, and/or with a competitor in a competition, such as in a power/weightlifting competition or in a video game. The exercise system is optionally configured with an intelligent software assistant and knowledge navigator functioning as a personal assistant application.
Microcomputer software development facilities
NASA Technical Reports Server (NTRS)
Gorman, J. S.; Mathiasen, C.
1980-01-01
A more efficient and cost effective method for developing microcomputer software is to utilize a host computer with high-speed peripheral support. Application programs such as cross assemblers, loaders, and simulators are implemented in the host computer for each of the microcomputers for which software development is a requirement. The host computer is configured to operate in a time share mode for multiusers. The remote terminals, printers, and down loading capabilities provided are based on user requirements. With this configuration a user, either local or remote, can use the host computer for microcomputer software development. Once the software is developed (through the code and modular debug stage) it can be downloaded to the development system or emulator in a test area where hardware/software integration functions can proceed. The microcomputer software program sources reside in the host computer and can be edited, assembled, loaded, and then downloaded as required until the software development project has been completed.
The HEPiX Virtualisation Working Group: Towards a Grid of Clouds
NASA Astrophysics Data System (ADS)
Cass, Tony
2012-12-01
The use of virtual machine images, as for example with Cloud services such as Amazon's Elastic Compute Cloud, is attractive for users as they have a guaranteed execution environment, something that cannot today be provided across sites participating in computing grids such as the Worldwide LHC Computing Grid. However, Grid sites often operate within computer security frameworks which preclude the use of remotely generated images. The HEPiX Virtualisation Working Group was set up with the objective of enabling the use of remotely generated virtual machine images at Grid sites and, to this end, has introduced the idea of trusted virtual machine images which are guaranteed to be secure and configurable by sites such that security policy commitments can be met. This paper describes the requirements and details of these trusted virtual machine images and presents a model for their use to facilitate the integration of Grid- and Cloud-based computing environments for High Energy Physics.
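The sketch below illustrates the kind of site-side policy check the trusted-image model implies: instantiate an image only if its digest appears on an endorsed list. The file names, list format, and choice of SHA-256 are assumptions for illustration, not details from the paper.

    # Sketch of a site-side policy check in the spirit of the trusted virtual
    # machine image model: only instantiate images whose digest appears on an
    # endorsed-image list. File names and list format are assumptions.
    import hashlib
    import json

    def sha256_of(path, chunk=1 << 20):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            while block := f.read(chunk):
                h.update(block)
        return h.hexdigest()

    def is_endorsed(image_path, endorsed_list_path):
        with open(endorsed_list_path) as f:
            endorsed = {entry["sha256"] for entry in json.load(f)}
        return sha256_of(image_path) in endorsed

    if is_endorsed("worker-node.qcow2", "endorsed_images.json"):
        print("image is on the endorsed list; safe to instantiate")
    else:
        print("image rejected by site policy")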
Schoo, Adrian; Lawn, Sharon; Carson, Dean
2016-04-02
Access to rural health services is compromised in many countries including Australia due to workforce shortages. The issues that consequently impact on equity of access and sustainability of rural and remote health services are complex. The purpose of this paper is to describe a number of approaches from the literature that could form the basis of a more integrated approach to health workforce and rural health service enhancement that can be supported by policy. A case study is used to demonstrate how such an approach could work. Disjointed health services are common in rural areas due to the 'tyranny of distance.' Recruitment and retention of health professionals in rural areas and access to and sustainability of rural health services is therefore compromised. Strategies to address these issues tend to have a narrow focus. An integrated approach is needed to enhance rural workforce and health services; one that develops, acknowledges and accounts for social capital and social relations within the rural community.
NASA Astrophysics Data System (ADS)
Linn, Marcia C.
1995-06-01
Designing effective curricula for complex topics and incorporating technological tools is an evolving process. One important way to foster effective design is to synthesize successful practices. This paper describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering. One course enhancement, the LISP Knowledge Integration Environment, improved learning and resulted in more gender-equitable outcomes. The second course enhancement, the spatial reasoning environment, addressed spatial reasoning in an introductory engineering course. This enhancement minimized the importance of prior knowledge of spatial reasoning and helped students develop a more comprehensive repertoire of spatial reasoning strategies. Taken together, the instructional research programs reinforce the value of the scaffolded knowledge integration framework and suggest directions for future curriculum reformers.
NASA Technical Reports Server (NTRS)
Chatfield, Robert B.; Vastano, John A.; Guild, Liane; Hlavka, Christine; Brass, James A.; Russell, Philip B. (Technical Monitor)
1994-01-01
Burning to clear land for crops and to destroy pests is an integral and largely unavoidable part of tropical agriculture. It is easy to note but difficult to quantify using remote sensing. This report describes our efforts to integrate remotely sensed data into our computer model of tropical chemical trace-gas emissions, weather, and reaction chemistry (using the MM5 mesoscale model and our own Global-Regional Atmospheric Chemistry Simulator). The effects of burning over the continents of Africa and South America have been noticed in observations from several satellites. Smoke plumes hundreds of kilometers long may be seen individually, or may merge into a large smoke pall over thousands of kilometers of these continents. These features are related to intense pollution in the much more confined regions with heavy burning. These emissions also translocate nitrogen thousands of kilometers in the tropical ecosystems, with large fixed-nitrogen losses balanced partially by locally intense fertilization downwind, where nitric acid is rained out. At a much larger scale, various satellite measurements have indicated the escape of carbon monoxide and ozone into large filaments which extend across the Tropical and Southern Atlantic Ocean. Our work relates the source emissions, estimated in part from remote sensing, in part from conventional surface reports, to the concentrations of these gases over these intercontinental regions. We will mention work in progress to use meteorological satellite data (AVHRR, GOES, and Meteosat) to estimate the surface temperature and extent and height of clouds, and explain why these uses are so important in our computer simulations of global biogeochemistry. We will compare our simulations and interpretation of remote observations with the international cooperative efforts involving Brazil, South Africa, and the USA in the TRACE-A (Transport and Atmospheric Chemistry near the Equator - Atlantic) and SAFARI (Southern Africa Fire Atmosphere Research Initiative) remote-sensing/aircraft/ecosystem observational campaigns.
Remote probing of the optical strength of atmospheric turbulence and of wind velocity
NASA Technical Reports Server (NTRS)
Fried, D. L.
1969-01-01
A procedure for determining the optical strength of turbulence of the atmosphere and the wind velocity at various altitudes by measuring the spatial and temporal covariance of scintillation is developed. Emphasis is placed on the development of the formal relationships that have to be inverted to obtain the desired results. For determination of optical strength of turbulence, it is a linear integral equation that is developed. However, for determination of remote wind velocity, a nonlinear integral equation is obtained. A computer approach for solving each of the equations is suggested. The configuration and performance requirements of the measurement apparatus are discussed.
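The abstract does not reproduce the equations, but the linear relation for the turbulence strength typically takes the form of a Fredholm integral equation of the first kind; schematically (generic symbols, not necessarily the author's notation),

\[
C_I(\rho) \;=\; \int_0^{L} C_n^2(z)\, W(\rho, z)\, dz ,
\]

where \(C_I(\rho)\) is the measured spatial covariance of scintillation at separation \(\rho\), \(C_n^2(z)\) the refractive-index structure parameter at altitude \(z\), and \(W\) a diffraction-determined weighting kernel; discretizing in \(z\) turns the inversion into a linear system, while the wind-velocity problem couples \(\rho\) with a time lag and is therefore nonlinear.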
Distributed computing testbed for a remote experimental environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Butner, D.N.; Casper, T.A.; Howard, B.C.
1995-09-18
Collaboration is increasing as physics research becomes concentrated on a few large, expensive facilities, particularly in magnetic fusion energy research, with national and international participation. These facilities are designed for steady state operation and interactive, real-time experimentation. We are developing tools to provide for the establishment of geographically distant centers for interactive operations; such centers would allow scientists to participate in experiments from their home institutions. A testbed is being developed for a Remote Experimental Environment (REE), a "Collaboratory." The testbed will be used to evaluate the ability of a remotely located group of scientists to conduct research on the DIII-D Tokamak at General Atomics. The REE will serve as a testing environment for advanced control and collaboration concepts applicable to future experiments. Process-to-process communications over high speed wide area networks provide real-time synchronization and exchange of data among multiple computer networks, while the ability to conduct research is enhanced by adding audio/video communication capabilities. The Open Software Foundation's Distributed Computing Environment is being used to test concepts in distributed control, security, naming, remote procedure calls and distributed file access using the Distributed File Services. We are exploring the technology and sociology of remotely participating in the operation of a large scale experimental facility.
Broering, N C
1983-01-01
Georgetown University's Library Information System (LIS), an integrated library system designed and implemented at the Dahlgren Memorial Library, is broadly described from an administrative point of view. LIS' functional components consist of eight "user-friendly" modules: catalog, circulation, serials, bibliographic management (including Mini-MEDLINE), acquisitions, accounting, networking, and computer-assisted instruction. This article touches on emerging library services, user education, and computer information services, which are also changing the role of staff librarians. The computer's networking capability brings the library directly to users through personal or institutional computers at remote sites. The proposed Integrated Medical Center Information System at Georgetown University will include interface with LIS through a network mechanism. LIS is being replicated at other libraries, and a microcomputer version is being tested for use in a hospital setting. PMID:6688749
Heterogeneous concurrent computing with exportable services
NASA Technical Reports Server (NTRS)
Sunderam, Vaidy
1995-01-01
Heterogeneous concurrent computing, based on the traditional process-oriented model, is approaching its functionality and performance limits. An alternative paradigm, based on the concept of services, supporting data-driven computation, and built on a lightweight process infrastructure, is proposed to enhance the functional capabilities and the operational efficiency of heterogeneous network-based concurrent computing. TPVM is an experimental prototype system supporting exportable services, thread-based computation, and remote memory operations that is built as an extension of and an enhancement to the PVM concurrent computing system. TPVM offers a significantly different computing paradigm for network-based computing, while maintaining a close resemblance to the conventional PVM model in the interest of compatibility and ease of transition. Preliminary experiences have demonstrated that the TPVM framework presents a natural yet powerful concurrent programming interface, while being capable of delivering performance improvements of up to thirty percent.
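The TPVM API itself is not reproduced here; purely to illustrate the data-driven, thread-based service style the abstract describes, the following sketch uses standard Python threading primitives, with all names invented.

    # Generic illustration (not the TPVM API) of a data-driven, thread-based
    # service: lightweight workers consume work items as data become available.
    from concurrent.futures import ThreadPoolExecutor
    import queue

    work = queue.Queue()
    for item in range(20):                # data items arriving for processing
        work.put(item)

    def service(name):
        """A lightweight 'exported service' driven by available data."""
        results = []
        while True:
            try:
                item = work.get_nowait()
            except queue.Empty:
                return name, results
            results.append(item * item)   # stand-in for real computation

    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = [pool.submit(service, f"svc-{i}") for i in range(4)]
        for fut in futures:
            name, results = fut.result()
            print(name, "processed", len(results), "items")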
Handels, H; Busch, C; Encarnação, J; Hahn, C; Kühn, V; Miehe, J; Pöppl, S I; Rinast, E; Rossmanith, C; Seibert, F; Will, A
1997-03-01
The software system KAMEDIN (Kooperatives Arbeiten und MEdizinische Diagnostik auf Innovativen Netzen) is a multimedia telemedicine system for exchange, cooperative diagnostics, and remote analysis of digital medical image data. It provides components for visualisation, processing, and synchronised audio-visual discussion of medical images. Techniques of computer supported cooperative work (CSCW) synchronise user interactions during a teleconference. Visibility of both the local and remote cursors on the conference workstations facilitates telepointing and reinforces the conference partner's telepresence. Audio communication during teleconferences is supported by an integrated audio component. Furthermore, brain tissue segmentation with artificial neural networks can be performed on an external supercomputer as a remote image analysis procedure. KAMEDIN is designed as a low-cost CSCW tool for ISDN-based telecommunication; however, it can be used on any TCP/IP-supporting network. In a field test, KAMEDIN was installed in 15 clinics and medical departments to validate the system's usability. The telemedicine system KAMEDIN has been developed, tested, and evaluated within a research project sponsored by German Telekom.
Gao, Peng; Liu, Peng; Su, Hongsen; Qiao, Liang
2015-04-01
Integrating a visualization toolkit with the interaction, bidirectional communication, and graphics rendering capabilities provided by HTML5, we explored and experimented on the feasibility of remote medical image reconstruction and interaction purely in the Web. We proposed a server-centric method that does not require downloading large medical datasets to local clients and avoids concerns about network transmission load and the three-dimensional (3D) rendering capability of client hardware. The method integrates remote medical image reconstruction and interaction into the Web seamlessly and is applicable to lower-end computers and mobile devices. Finally, we tested this method over the Internet and achieved real-time performance. This Web-based 3D reconstruction and interaction method, which works across Internet terminals and performance-limited devices, may be useful for remote medical assistance.
Three-dimensional laser microvision.
Shimotahira, H; Iizuka, K; Chu, S C; Wah, C; Costen, F; Yoshikuni, Y
2001-04-10
A three-dimensional (3-D) optical imaging system offering high resolution in all three dimensions, requiring minimum manipulation and capable of real-time operation, is presented. The system derives its capabilities from use of the superstructure grating laser source in the implementation of a laser step frequency radar for depth information acquisition. A synthetic aperture radar technique was also used to further enhance its lateral resolution as well as extend the depth of focus. High-speed operation was made possible by a dual computer system consisting of a host and a remote microcomputer supported by a dual-channel Small Computer System Interface parallel data transfer system. The system is capable of operating near real time. The 3-D display of a tunneling diode, a microwave integrated circuit, and a see-through image taken by the system operating near real time are included. The depth resolution is 40 μm; lateral resolution with a synthetic aperture approach is a fraction of a micrometer and that without it is approximately 10 μm.
Mpeg2 codec HD improvements with medical and robotic imaging benefits
NASA Astrophysics Data System (ADS)
Picard, Wayne F. J.
2010-02-01
In this report, we propose an efficient scheme to use High Definition Television (HDTV) in a console or notebook format as a computer terminal in addition to its role as a TV display unit. In the proposed scheme, we assume that the main computer is situated at a remote location. The computer raster in the remote server is compressed using an HD E->Mpeg2 encoder and transmitted to the terminal at home. The built-in E->Mpeg2 decoder in the terminal decompresses the compressed bit stream, and displays the raster. The terminal will be fitted with a mouse and keyboard, through which the interaction with the remote computer server can be performed via a communications back channel. The terminal in a notebook format can thus be used as a high resolution computer and multimedia device. We will consider developments such as the required HD enhanced Mpeg2 resolution (E->Mpeg2) and its medical ramifications due to improvements in compressed image quality with 2D to 3D conversion (Mpeg3) and using the compressed Discrete Cosine Transform coefficients in the reality compression of vision and control of medical robotic surgeons.
Remote sensor support requirements for planetary missions
NASA Technical Reports Server (NTRS)
Weddell, J. B.; Wheeler, A. E.
1971-01-01
The study approach, methods, results, and conclusions of remote sensor support requirements for planetary missions are summarized. Major efforts were made to (1) establish the scientific and engineering knowledge and observation requirements for planetary exploration in the 1975 to 1985 period; (2) define the state of the art and expected development of instrument systems appropriate for sensing planetary environments; (3) establish scaling laws relating performance and support requirements of candidate remote sensor systems; (4) establish fundamental remote sensor system capabilities, limitations, and support requirements during encounter and other dynamical conditions for specific missions; and (5) construct families of candidate remote sensors compatible with selected missions. It was recommended that these data be integrated with earlier results to enhance utility, and that more restrictions be placed on the system.
Multiple node remote messaging
Blumrich, Matthias A.; Chen, Dong; Gara, Alan G.; Giampapa, Mark E.; Heidelberger, Philip; Ohmacht, Martin; Salapura, Valentina; Steinmacher-Burow, Burkhard; Vranas, Pavlos
2010-08-31
A method for passing remote messages in a parallel computer system formed as a network of interconnected compute nodes includes that a first compute node (A) sends a single remote message to a remote second compute node (B) in order to control the remote second compute node (B) to send at least one remote message. The method includes various steps including controlling a DMA engine at first compute node (A) to prepare the single remote message to include a first message descriptor and at least one remote message descriptor for controlling the remote second compute node (B) to send at least one remote message, including putting the first message descriptor into an injection FIFO at the first compute node (A) and sending the single remote message and the at least one remote message descriptor to the second compute node (B).
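The patent text describes hardware DMA structures; the sketch below is only a schematic data-structure illustration of the idea, with all field names assumed rather than taken from the patented format.

    # Schematic illustration (field names assumed, not the patented hardware
    # format) of the remote-messaging idea: node A injects one message that
    # carries descriptors telling node B which further messages to send.
    from dataclasses import dataclass, field

    @dataclass
    class MessageDescriptor:
        destination: int        # target compute node
        payload_addr: int       # memory offset of the data to send
        payload_len: int

    @dataclass
    class RemoteMessage:
        first_descriptor: MessageDescriptor                      # send from A
        remote_descriptors: list = field(default_factory=list)   # executed by B

    # Node A prepares a single remote message for node B (node id 7):
    msg = RemoteMessage(
        first_descriptor=MessageDescriptor(destination=7,
                                           payload_addr=0x1000, payload_len=256),
        remote_descriptors=[
            MessageDescriptor(destination=3, payload_addr=0x2000, payload_len=512),
            MessageDescriptor(destination=9, payload_addr=0x3000, payload_len=128),
        ],
    )

    injection_fifo_A = [msg.first_descriptor]   # queued for A's DMA engine
    # ...on receipt, node B's DMA engine would enqueue msg.remote_descriptors
    # into its own injection FIFO and send the requested messages onward.
    print(len(msg.remote_descriptors), "remote sends will be triggered at node B")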
Cyber-workstation for computational neuroscience.
Digiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C; Fortes, Jose; Sanchez, Justin C
2010-01-01
A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface.
Cyber-Workstation for Computational Neuroscience
DiGiovanna, Jack; Rattanatamrong, Prapaporn; Zhao, Ming; Mahmoudi, Babak; Hermer, Linda; Figueiredo, Renato; Principe, Jose C.; Fortes, Jose; Sanchez, Justin C.
2009-01-01
A Cyber-Workstation (CW) to study in vivo, real-time interactions between computational models and large-scale brain subsystems during behavioral experiments has been designed and implemented. The design philosophy seeks to directly link the in vivo neurophysiology laboratory with scalable computing resources to enable more sophisticated computational neuroscience investigation. The architecture designed here allows scientists to develop new models and integrate them with existing models (e.g. recursive least-squares regressor) by specifying appropriate connections in a block-diagram. Then, adaptive middleware transparently implements these user specifications using the full power of remote grid-computing hardware. In effect, the middleware deploys an on-demand and flexible neuroscience research test-bed to provide the neurophysiology laboratory extensive computational power from an outside source. The CW consolidates distributed software and hardware resources to support time-critical and/or resource-demanding computing during data collection from behaving animals. This power and flexibility is important as experimental and theoretical neuroscience evolves based on insights gained from data-intensive experiments, new technologies and engineering methodologies. This paper describes briefly the computational infrastructure and its most relevant components. Each component is discussed within a systematic process of setting up an in vivo, neuroscience experiment. Furthermore, a co-adaptive brain machine interface is implemented on the CW to illustrate how this integrated computational and experimental platform can be used to study systems neurophysiology and learning in a behavior task. We believe this implementation is also the first remote execution and adaptation of a brain-machine interface. PMID:20126436
ERIC Educational Resources Information Center
Linn, Marcia C.
1995-01-01
Describes a framework called scaffolded knowledge integration and illustrates how it guided the design of two successful course enhancements in the field of computer science and engineering: the LISP Knowledge Integration Environment and the spatial reasoning environment. (101 references) (Author/MKR)
Remote sensing of land-based voids using computer enhanced infrared thermography
NASA Astrophysics Data System (ADS)
Weil, Gary J.
1989-10-01
Experiments are described in which computer-enhanced infrared thermography techniques are used to detect and describe subsurface land-based voids, such as voids surrounding buried utility pipes, voids in concrete structures such as airport taxiways, abandoned buried utility storage tanks, and caves and underground shelters. Infrared thermography also helps to evaluate bridge deck systems, highway pavements, and garage concrete. The IR thermography techniques make it possible to survey large areas quickly and efficiently. The paper also surveys the advantages and limitations of thermographic testing in comparison with other forms of NDT.
Application of remote sensing to state and regional problems
NASA Technical Reports Server (NTRS)
Miller, W. F. (Principal Investigator); Tingle, J.; Wright, L. H.; Tebbs, B.
1984-01-01
Progress was made in the hydroclimatology, habitat modeling and inventory, computer analysis, wildlife management, and data comparison programs that utilize LANDSAT and SEASAT data provided to Mississippi researchers through the remote sensing applications program. Specific topics include water runoff in central Mississippi; habitat models for the endangered gopher tortoise, the coyote, and the turkey; Geographic Information Systems (GIS) development; forest inventory along the Mississippi River; and the merging of LANDSAT and SEASAT data for enhanced forest type discrimination.
Distributive, Non-destructive Real-time System and Method for Snowpack Monitoring
NASA Technical Reports Server (NTRS)
Frolik, Jeff (Inventor); Skalka, Christian (Inventor)
2013-01-01
A ground-based system that provides quasi real-time measurement and collection of snow-water equivalent (SWE) data in remote settings is provided. The disclosed invention is significantly less expensive and easier to deploy than current methods and less susceptible to terrain and snow bridging effects. Embodiments of the invention include remote data recovery solutions. Compared to current infrastructure using existing SWE technology, the disclosed invention allows more SWE sites to be installed for similar cost and effort, in a greater variety of terrain; thus, enabling data collection at improved spatial resolutions. The invention integrates a novel computational architecture with new sensor technologies. The invention's computational architecture is based on wireless sensor networks, comprised of programmable, low-cost, low-powered nodes capable of sophisticated sensor control and remote data communication. The invention also includes measuring attenuation of electromagnetic radiation, an approach that is immune to snow bridging and significantly reduces sensor footprints.
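The patent text does not give a retrieval equation; assuming a simple exponential (Beer-Lambert-type) attenuation model with an assumed attenuation coefficient, a sketch of the conversion from measured signal to snow-water equivalent might look like this.

    # Illustrative SWE estimate from attenuation of radiation passing through
    # the snowpack, assuming a simple exponential (Beer-Lambert) model.
    # The attenuation coefficient and counts are assumptions, not values from
    # the patent.
    import math

    MU_WATER = 0.08    # assumed mass attenuation coefficient, cm^2/g

    def swe_from_counts(count_snow, count_reference):
        """Return snow-water equivalent in g/cm^2 (numerically, cm of water)."""
        return math.log(count_reference / count_snow) / MU_WATER

    # Example: sensor under the pack reads 40% of the snow-free reference signal.
    print(f"SWE ~ {swe_from_counts(4000, 10000):.1f} cm of water")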
Design and implementation of a UNIX based distributed computing system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Love, J.S.; Michael, M.W.
1994-12-31
We have designed, implemented, and are running a corporate-wide distributed processing batch queue on a large number of networked workstations using the UNIX® operating system. Atlas Wireline researchers and scientists have used the system for over a year. The large increase in available computer power has greatly reduced the time required for nuclear and electromagnetic tool modeling. Use of remote distributed computing has simultaneously reduced computation costs and increased usable computer time. The system integrates equipment from different manufacturers, using various CPU architectures, distinct operating system revisions, and even multiple processors per machine. Various differences between the machines have to be accounted for in the master scheduler. These differences include shells, command sets, swap spaces, memory sizes, CPU sizes, and OS revision levels. Remote processing across a network must be performed in a manner that is seamless from the users' perspective. The system currently uses IBM RISC System/6000®, SPARCstation™, HP9000s700, HP9000s800, and DEC Alpha AXP™ machines. Each CPU in the network has its own speed rating, allowed working hours, and workload parameters. The system is designed so that all of the computers in the network can be optimally scheduled without adversely impacting the primary users of the machines. The increase in the total usable computational capacity by means of distributed batch computing can change corporate computing strategy. The integration of disparate computer platforms eliminates the need to buy one type of computer for computations, another for graphics, and yet another for day-to-day operations. It might be possible, for example, to meet all research and engineering computing needs with existing networked computers.
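As a sketch of the kind of eligibility test such a master scheduler might apply, consider the following; the host table, speed ratings, and hour windows are invented for illustration and are not the paper's configuration.

    # Sketch of a master-scheduler eligibility check of the kind the paper
    # describes: each host has a speed rating and allowed working hours.
    from datetime import datetime

    HOSTS = [
        {"name": "rs6000-01",   "speed": 1.8, "hours": (19, 7)},  # nights only
        {"name": "sparc-12",    "speed": 1.0, "hours": (0, 24)},  # any time
        {"name": "hp9000-s700", "speed": 1.4, "hours": (18, 8)},
    ]

    def available(host, now):
        start, end = host["hours"]
        hour = now.hour
        return start <= hour < end if start < end else (hour >= start or hour < end)

    def pick_host(now=None):
        now = now or datetime.now()
        candidates = [h for h in HOSTS if available(h, now)]
        return max(candidates, key=lambda h: h["speed"], default=None)

    chosen = pick_host(datetime(1994, 6, 1, 22, 30))
    print("dispatch job to:", chosen["name"] if chosen else "no host available")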
Videotex and Education: A Review of British Developments.
ERIC Educational Resources Information Center
Real, Michael R.
Defining videotex, viewdata, teletext, and their cognates as systems that transmit computerized pages of information for remote display (on a television screen, variously integrating computers, and video, broadcasting, telephone, typewriter, and related technologies), this report explores educational and related applications of videotex…
EVALUATING LANDSCAPE CHANGE AND HYDROLOGICAL CONSEQUENCES IN A SEMI-ARID ENVIRONMENT
During the past two decades, important advances in the integration of remote imagery, computer processing, and spatial analysis technologies have been used to better understand the distribution of natural communities and ecosystems, and the ecological processes that affect these ...
NASA Astrophysics Data System (ADS)
Hodam, H.; Goetzke, R.; Rinow, A.; Voß, K.
2012-04-01
The project FIS - Fernerkundung in Schulen (German for "Remote Sensing in Schools") - aims at a better integration of remote sensing into school lessons. Specifically, the overall objective is to teach pupils, from primary school up to high-school graduation, the basics and fields of application of remote sensing. Working with remote sensing data opens up new and modern ways of teaching. Many teachers therefore have great interest in the subject "remote sensing" and are motivated to integrate the topic into their teaching, provided that the curriculum is considered. In many cases, this enthusiasm fails because of confusing information, which ruins all good intentions. For this reason, a comprehensive and well-structured learning portal on the subject of remote sensing has been developed, which allows teachers and pupils to gain a structured initial understanding of the topic. Recognizing that in-depth use of satellite imagery can only be achieved by means of computer-aided learning methods, a sizeable number of e-learning contents have been created over the 5 years since the project's kickoff and are now integrated into the learning portal. Three main sections form the backbone of the learning portal. 1. The "Teaching Materials" section provides registered teachers with interactive lessons to convey curriculum-relevant topics through remote sensing. They can use the implemented management system to create classes and enroll pupils, keep track of their progress, and review the results of the conducted lessons. Without the functionalities of the management system, the lessons are also available to non-registered users. 2. Pupils and teachers can investigate remote sensing further in the "Research" section, where a knowledge base alongside a satellite image gallery offers general background information on remote sensing and the provided lessons in a semi-interactive manner. 3. The "Analysis Tools" section offers means to experiment further with satellite images by working with predefined sets of images and tools. All three sections of the platform are presented with examples explaining the underlying didactical and technical concepts of the project, showing how they are realized and what their potential is when put to use in school lessons.
High-Speed Recording of Test Data on Hard Disks
NASA Technical Reports Server (NTRS)
Lagarde, Paul M., Jr.; Newnan, Bruce
2003-01-01
Disk Recording System (DRS) is a systems-integration computer program for a direct-to-disk (DTD) high-speed data acquisition system (HDAS) that records rocket-engine test data. The HDAS consists partly of equipment originally designed for recording the data on tapes. The tape recorders were replaced with hard-disk drives, necessitating the development of DRS to provide an operating environment that ties two computers, a set of five DTD recorders, and signal-processing circuits from the original tape-recording version of the HDAS into one working system. DRS includes three subsystems: (1) one that generates a graphical user interface (GUI), on one of the computers, that serves as a main control panel; (2) one that generates a GUI, on the other computer, that serves as a remote control panel; and (3) a data-processing subsystem that performs tasks on the DTD recorders according to instructions sent from the main control panel. The software affords capabilities for dynamic configuration to record single or multiple channels from a remote source, remote starting and stopping of the recorders, indexing to prevent overwriting of data, and production of filtered frequency data from an original time-series data file.
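A schematic of the kind of start/stop/index command exchange between the control-panel GUIs and the recorder-control subsystem is sketched below; the command vocabulary, port, and wire format are assumptions for illustration, not details of DRS.

    # Schematic of a remote-control message of the kind exchanged between a
    # control-panel GUI and a recorder-control subsystem. The command names,
    # port, and JSON wire format are assumptions, not the DRS protocol.
    import json
    import socket

    def send_command(host, command, recorder_id, port=5050):
        msg = json.dumps({"cmd": command, "recorder": recorder_id}).encode()
        with socket.create_connection((host, port), timeout=5) as sock:
            sock.sendall(msg + b"\n")
            reply = sock.makefile().readline()
        return json.loads(reply)

    # Example (hypothetical host): start recorder 3, then index it so an
    # existing data file cannot be overwritten.
    # print(send_command("drs-main", "START", 3))
    # print(send_command("drs-main", "INDEX", 3))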
Using computational modeling of river flow with remotely sensed data to infer channel bathymetry
Nelson, Jonathan M.; McDonald, Richard R.; Kinzel, Paul J.; Shimizu, Y.
2012-01-01
As part of an ongoing investigation into the use of computational river flow and morphodynamic models for the purpose of correcting and extending remotely sensed river datasets, a simple method for inferring channel bathymetry is developed and discussed. The method is based on an inversion of the equations expressing conservation of mass and momentum to develop equations that can be solved for depth given known values of vertically-averaged velocity and water-surface elevation. The ultimate goal of this work is to combine imperfect remotely sensed data on river planform, water-surface elevation and water-surface velocity in order to estimate depth and other physical parameters of river channels. In this paper, the technique is examined using synthetic data sets that are developed directly from the application of forward two-and three-dimensional flow models. These data sets are constrained to satisfy conservation of mass and momentum, unlike typical remotely sensed field data sets. This provides a better understanding of the process and also allows assessment of how simple inaccuracies in remotely sensed estimates might propagate into depth estimates. The technique is applied to three simple cases: First, depth is extracted from a synthetic dataset of vertically averaged velocity and water-surface elevation; second, depth is extracted from the same data set but with a normally-distributed random error added to the water-surface elevation; third, depth is extracted from a synthetic data set for the same river reach using computed water-surface velocities (in place of depth-integrated values) and water-surface elevations. In each case, the extracted depths are compared to the actual measured depths used to construct the synthetic data sets (with two- and three-dimensional flow models). Errors in water-surface elevation and velocity that are very small degrade depth estimates and cannot be recovered. Errors in depth estimates associated with assuming water-surface velocities equal to depth-integrated velocities are substantial, but can be reduced with simple corrections.
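The abstract does not reproduce the inverted equations; one plausible closed form (a sketch, not necessarily the authors' formulation) follows from the steady depth-averaged x-momentum equation when bed friction is represented by a quadratic drag law with coefficient \(c_f\):

\[
h \;=\; \frac{c_f\, U \sqrt{U^2 + V^2}}
{-\left( g\,\partial\eta/\partial x \;+\; U\,\partial U/\partial x \;+\; V\,\partial U/\partial y \right)} ,
\]

where \((U, V)\) are the vertically averaged velocity components and \(\eta\) is the water-surface elevation; an analogous expression follows from the y-momentum equation, and depth-averaged continuity, \(\partial(hU)/\partial x + \partial(hV)/\partial y = 0\), provides an additional constraint. The sensitivity to small errors noted in the abstract is evident in such a form: the denominator is a difference of measured gradients and can approach zero.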
Environments for online maritime simulators with cloud computing capabilities
NASA Astrophysics Data System (ADS)
Raicu, Gabriel; Raicu, Alexandra
2016-12-01
This paper presents the cloud computing environments, network principles, and methods for graphical development in realistic naval simulation, naval robotics, and virtual interactions. The aim of this approach is to achieve good simulation quality in large networked environments using open-source solutions designed for educational purposes. Realistic rendering of maritime environments requires near real-time frameworks with enhanced computing capabilities during distance interactions. E-Navigation concepts coupled with the latest achievements in virtual and augmented reality will enhance the overall experience, leading to new developments and innovations. We must deal with a multiprocessing situation, using advanced technologies and distributed applications based on remote ship scenarios and the automation of ship operations.
3D Medical Collaboration Technology to Enhance Emergency Healthcare
Welch, Greg; Sonnenwald, Diane H; Fuchs, Henry; Cairns, Bruce; Mayer-Patel, Ketan; Söderholm, Hanna M.; Yang, Ruigang; State, Andrei; Towles, Herman; Ilie, Adrian; Ampalam, Manoj; Krishnan, Srinivas; Noel, Vincent; Noland, Michael; Manning, James E.
2009-01-01
Two-dimensional (2D) videoconferencing has been explored widely in the past 15–20 years to support collaboration in healthcare. Two issues that arise in most evaluations of 2D videoconferencing in telemedicine are the difficulty obtaining optimal camera views and poor depth perception. To address these problems, we are exploring the use of a small array of cameras to reconstruct dynamic three-dimensional (3D) views of a remote environment and of events taking place within. The 3D views could be sent across wired or wireless networks to remote healthcare professionals equipped with fixed displays or with mobile devices such as personal digital assistants (PDAs). The remote professionals’ viewpoints could be specified manually or automatically (continuously) via user head or PDA tracking, giving the remote viewers head-slaved or hand-slaved virtual cameras for monoscopic or stereoscopic viewing of the dynamic reconstructions. We call this idea remote 3D medical collaboration. In this article we motivate and explain the vision for 3D medical collaboration technology; we describe the relevant computer vision, computer graphics, display, and networking research; we present a proof-of-concept prototype system; and we present evaluation results supporting the general hypothesis that 3D remote medical collaboration technology could offer benefits over conventional 2D videoconferencing in emergency healthcare. PMID:19521951
The Real-Time Monitoring Service Platform for Land Supervision Based on Cloud Integration
NASA Astrophysics Data System (ADS)
Sun, J.; Mao, M.; Xiang, H.; Wang, G.; Liang, Y.
2018-04-01
Remote sensing monitoring has become an important means for land and resources departments to strengthen supervision. Aiming at the problems of low monitoring frequency and poor data currency in current remote sensing monitoring, this paper presents the research and development of a cloud-integrated real-time monitoring service platform for land supervision, which increases the monitoring frequency by comprehensively acquiring domestic satellite image data and accelerates remote sensing image processing by exploiting intelligent dynamic processing technology for multi-source images. Through the pilot application in the Jinan Bureau of State Land Supervision, the real-time monitoring technical method for land supervision has been proved feasible. In addition, real-time monitoring and early-warning functions are provided for illegal land use, permanent basic farmland protection, and boundary breakthroughs in urban development. The application has achieved remarkable results.
Remote Data Retrieval for Bioinformatics Applications: An Agent Migration Approach
Gao, Lei; Dai, Hua; Zhang, Tong-Liang; Chou, Kuo-Chen
2011-01-01
Several approaches have been developed to retrieve data automatically from one or multiple remote biological data sources. However, most of them require researchers to remain online and wait for returned results. Waiting online not only requires a highly available network connection but may also overload the network. Moreover, so far none of the existing approaches has been designed to address the following problems when retrieving remote data in a mobile network environment: (1) the resources of mobile devices are limited; (2) the network connection is of relatively low quality; and (3) mobile users are not always online. To address these problems, we integrate an agent migration approach with a multi-agent system to overcome the high-latency and limited-bandwidth problems by moving computations to the required resources or services. More importantly, the approach is well suited to mobile computing environments. Also presented in this paper are the system architecture, the migration strategy, and the security authentication of agent migration. As a demonstration, remote data retrieval from GenBank is used to illustrate the feasibility of the proposed approach. PMID:21701677
NASA Technical Reports Server (NTRS)
Martinko, Edward A.; Merchant, James W.
1988-01-01
During 1986 to 1987, the Kansas Applied Remote Sensing (KARS) Program continued to build upon long-term research efforts oriented towards enhancement and development of technologies for using remote sensing in the inventory and evaluation of land use and renewable resources (both natural and agricultural). These research efforts directly addressed needs and objectives of NASA's Land-Related Global Habitability Program as well as the needs and interests of public agencies and private firms. The KARS Program placed particular emphasis on two major areas: development of intelligent algorithms to improve automated classification of digital multispectral data; and integrating and merging digital multispectral data with ancillary data in spatial modes.
NASA/BLM Applications Pilot Test (APT), phase 2. Volume 3: Technology transfer
NASA Technical Reports Server (NTRS)
1981-01-01
Techniques used and materials presented at a planning session and two workshops held to provide hands-on training in the integration of quantitatively based remote sensing data are described as well as methods used to enhance understanding of approaches to inventories that integrate multiple data sources given various resource information objectives. Significant results from each of the technology transfer sessions are examined.
Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian
2015-03-01
The telecare medical information systems (TMISs) enable patients to conveniently enjoy telecare services at home. The protection of patients' privacy is a key issue due to the openness of the communication environment. Authentication is typically adopted to guarantee confidential and authorized interaction between the patient and the remote server. To achieve these goals, numerous remote authentication schemes based on cryptography have been presented. Recently, Arshad et al. (J Med Syst 38(12): 2014) presented a secure and efficient three-factor authenticated key exchange scheme to remedy the weaknesses of Tan et al.'s scheme (J Med Syst 38(3): 2014). In this paper, we found that a successful off-line password attack allows an adversary to impersonate any user of the system in Arshad et al.'s scheme. In order to thwart these security attacks, an enhanced biometric and smart-card-based remote authentication scheme for TMISs is proposed. In addition, BAN logic is applied to demonstrate the completeness of the enhanced scheme. Security and performance analyses show that our enhanced scheme satisfies more security properties at less computational cost compared with previously proposed schemes.
An Interactive, Web-based High Performance Modeling Environment for Computational Epidemiology.
Deodhar, Suruchi; Bisset, Keith R; Chen, Jiangzhuo; Ma, Yifei; Marathe, Madhav V
2014-07-01
We present an integrated interactive modeling environment to support public health epidemiology. The environment combines a high resolution individual-based model with a user-friendly web-based interface that allows analysts to access the models and the analytics back-end remotely from a desktop or a mobile device. The environment is based on a loosely-coupled service-oriented-architecture that allows analysts to explore various counterfactual scenarios. As the modeling tools for public health epidemiology are getting more sophisticated, it is becoming increasingly hard for non-computational scientists to effectively use the systems that incorporate such models. Thus an important design consideration for an integrated modeling environment is to improve ease of use such that experimental simulations can be driven by the users. This is achieved by designing intuitive and user-friendly interfaces that allow users to design and analyze a computational experiment and steer the experiment based on the state of the system. A key feature of a system that supports this design goal is the ability to start, stop, pause and roll-back the disease propagation and intervention application process interactively. An analyst can access the state of the system at any point in time and formulate dynamic interventions based on additional information obtained through state assessment. In addition, the environment provides automated services for experiment set-up and management, thus reducing the overall time for conducting end-to-end experimental studies. We illustrate the applicability of the system by describing computational experiments based on realistic pandemic planning scenarios. The experiments are designed to demonstrate the system's capability and enhanced user productivity.
NASA Astrophysics Data System (ADS)
Jones, A. S.; Andales, A.; McGovern, C.; Smith, G. E. B.; David, O.; Fletcher, S. J.
2017-12-01
US agricultural and Govt. lands have a unique co-dependent relationship, particularly in the Western US. More than 30% of all irrigated US agricultural output comes from lands sustained by the Ogallala Aquifer in the western Great Plains. Six US Forest Service National Grasslands reside within the aquifer region, consisting of over 375,000 ha (3,759 km2) of USFS managed lands. Likewise, National Forest lands are the headwaters to many intensive agricultural regions. Our Ogallala Aquifer team is enhancing crop irrigation decision tools with predictive weather and remote sensing data to better manage water for irrigated crops within these regions. An integrated multi-model software framework is used to link irrigation decision tools, resulting in positive management benefits on natural water resources. Teams and teams-of-teams can build upon these multi-disciplinary, multi-faceted modeling capabilities. For example, the CSU Catalyst for Innovative Partnerships program has formed a new multidisciplinary team that will address "Rural Wealth Creation", focusing on the many integrated links between economic, agricultural production and management, natural resource availability, and key social aspects of govt. policy recommendations. By enhancing tools like these with predictive weather and other related data (such as in situ measurements, hydrologic models, remotely sensed data sets, and, in the near future, links to agro-economic and life cycle assessment models), this work demonstrates an integrated, data-driven future vision of inter-meshed dynamic systems that can address challenging multi-system problems. We will present the current state of the work and opportunities for future involvement.
Use of World Wide Web server and browser software to support a first-year medical physiology course.
Davis, M J; Wythe, J; Rozum, J S; Gore, R W
1997-06-01
We describe the use of a World Wide Web (Web) server to support a team-taught physiology course for first-year medical students. Our objectives were to reduce the number of formal lecture hours and enhance student enthusiasm by using more multimedia materials and creating opportunities for interactive learning. On-line course materials, consisting of administrative documents, lecture notes, animations, digital movies, practice tests, and grade reports, were placed on a departmental computer with an Internet connection. Students used Web browsers to access on-line materials from a variety of computing platforms on campus, at home, and at remote sites. To assess use of the materials and their effectiveness, we analyzed 1) log files from the server, and 2) the results of a written course evaluation completed by all students. Lecture notes and practice tests were the most-used documents. The students' evaluations indicated that computer use in class made the lecture material more interesting, while the on-line documents helped reinforce lecture materials and the textbook. We conclude that the effectiveness of on-line materials depends on several different factors, including 1) the number of instructors that provide materials; 2) the quantity of other materials handed out; 3) the degree to which computer use is demonstrated in class and integrated into lectures; and 4) the ease with which students can access the materials. Finally, we propose that additional implementation of Internet-based resources beyond what we have described would further enhance a physiology course for first-year medical students.
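The server-log analysis mentioned above can be done with a short script. The sketch below is an illustrative assumption of how request counts per document might be tallied from a Web server access log in Common Log Format; the file name and log format are hypothetical, not those of the original course server.

    # Minimal sketch: tally document requests from a Web-server access log in
    # Common Log Format. The file name and format are illustrative assumptions.
    import re
    from collections import Counter

    LOG_LINE = re.compile(r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+"')

    def count_requests(log_path):
        counts = Counter()
        with open(log_path) as f:
            for line in f:
                m = LOG_LINE.search(line)
                if m:
                    counts[m.group("path")] += 1
        return counts

    if __name__ == "__main__":
        hits = count_requests("access.log")  # hypothetical file name
        for path, n in hits.most_common(10):
            print(f"{n:6d}  {path}")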
Schutte, Jamie L; McCue, Michael P; Parmanto, Bambang; McGonigle, John; Handen, Benjamin; Lewis, Allen; Pulantara, I Wayan; Saptono, Andi
2015-03-01
The Autism Diagnostic Observation Schedule (ADOS) Module 4 is an autism assessment designed for verbally fluent adolescents and adults. Because of a shortage of available clinical expertise, it can be difficult for adults to receive a proper autism spectrum disorder (ASD) diagnostic assessment. A potential option to address this shortage is remote assessment. The objective of this study was to examine the feasibility, usability, and reliability of administering the ADOS Module 4 remotely using the Versatile and Integrated System for Telerehabilitation (VISYTER). VISYTER consists of computer stations at the client site and clinician site for video communication and a Web portal for managing and coordinating the assessment process. Twenty-three adults with an ASD diagnosis participated in a within-subject crossover design study in which both a remote ADOS and a face-to-face ADOS were administered. After completing the remote ADOS, participants completed a satisfaction survey. Participant satisfaction with the remote ADOS delivery system was high. The kappa value was greater than 0.61 on 21 of 31 ADOS items. There was substantial agreement on ADOS classification (i.e., diagnosis) between assessments delivered face-to-face versus assessments delivered remotely (interclass coefficient=0.92). Non-agreement may have been due to outside factors or practice effect despite a washout period. The results of this study demonstrate that an autism assessment designed to be delivered face to face can be administered remotely using an integrated Web-based system with high levels of usability and reliability.
Techniques for digital enhancement of Landsat MSS data using an Apple II+ microcomputer
NASA Technical Reports Server (NTRS)
Harrington, J. A., Jr.; Cartin, K. F.
1984-01-01
The information provided by remotely sensed data collected from orbiting platforms has been useful in many research fields. Digital data stored on computer compatible tapes (CCT's) are particularly convenient for evaluation. The major advantages of CCT's are the quality of the data and their accessibility to computer manipulation. Minicomputer systems are widely used for the required computer processing operations. However, microprocessor-related technological advances now make it possible to process CCT data with computing systems that can be obtained at a much lower price than minicomputer systems. A detailed description is provided of the design considerations of a microcomputer-based Digital Image Analysis System (DIAS). Particular attention is given to the algorithms incorporated for either edge enhancement or smoothing of Landsat multispectral scanner data.
JIP: Java image processing on the Internet
NASA Astrophysics Data System (ADS)
Wang, Dongyan; Lin, Bo; Zhang, Jun
1998-12-01
In this paper, we present JIP - Java Image Processing on the Internet, a new Internet-based application for remote education and software presentation. JIP offers an integrated learning environment on the Internet where remote users not only can share static HTML documents and lecture notes, but also can run and reuse dynamic distributed software components, without having the source code or any extra work of software compilation, installation, and configuration. By implementing a platform-independent distributed computational model, local computational resources are consumed instead of the resources on a central server. As an extended Java applet, JIP allows users to select local image files on their computers or specify any image on the Internet using a URL as input. Multimedia lectures such as streaming video/audio and digital images are integrated into JIP and intelligently associated with specific image processing functions. Watching demonstrations and practicing the functions with user-selected input data dramatically encourages learning interest, while promoting the understanding of image processing theory. The JIP framework can be easily applied to other subjects in education or software presentation, such as digital signal processing, business, mathematics, physics, or other areas such as employee training and charged software consumption.
Virtual fixtures as tools to enhance operator performance in telepresence environments
NASA Astrophysics Data System (ADS)
Rosenberg, Louis B.
1993-12-01
This paper introduces the notion of virtual fixtures for use in telepresence systems and presents an empirical study which demonstrates that such virtual fixtures can greatly enhance operator performance within remote environments. Just as tools and fixtures in the real world can enhance human performance by guiding manual operations, providing localizing references, and reducing the mental processing required to perform a task, virtual fixtures are computer generated percepts overlaid on top of the reflection of a remote workspace which can provide similar benefits. Like a ruler guiding a pencil in a real manipulation task, a virtual fixture overlaid on top of a remote workspace can act to reduce the mental processing required to perform a task, limit the workload of certain sensory modalities, and most of all allow precision and performance to exceed natural human abilities. Because such perceptual overlays are virtual constructions they can be diverse in modality, abstract in form, and custom tailored to individual task or user needs. This study investigates the potential of virtual fixtures by implementing simple combinations of haptic and auditory sensations as perceptual overlays during a standardized telemanipulation task.
Chen, J M; Thomas, S C; Yin, Y; Maclaren, V; Liu, J; Pan, J; Liu, G; Tian, Q; Zhu, Q; Pan, J-J; Shi, X; Xue, J; Kang, E
2007-11-01
This article serves as an introduction to this special issue, "China's Forest Carbon Sequestration", representing major results of a project sponsored by the Canadian International Development Agency and the Chinese Academy of Sciences. China occupies a pivotal position globally as a principal emitter of carbon dioxide, as host to some of the world's largest reforestation efforts, and as a key player in international negotiations aimed at reducing global greenhouse gas emissions. The goals of this project are to develop remote sensing approaches for quantifying forest carbon balance in China in a transparent manner, and information and tools to support land-use decisions for enhanced carbon sequestration (CS) that are science based and economically and socially viable. The project consists of three components: (i) remote sensing and carbon modeling, (ii) forest and soil assessment, and (iii) integrated assessment of the socio-economic implications of CS via forest management. Articles included in this special issue are highlights of the results of each of these components.
Passive microwave remote sensing of an anisotropic random-medium layer
NASA Technical Reports Server (NTRS)
Lee, J. K.; Kong, J. A.
1985-01-01
The principle of reciprocity is invoked to calculate the brightness temperatures for passive microwave remote sensing of a two-layer anisotropic random medium. The bistatic scattering coefficients are first computed with the Born approximation and then integrated over the upper hemisphere to be subtracted from unity, in order to obtain the emissivity for the random-medium layer. The theoretical results are illustrated by plotting the emissivities as functions of viewing angles and polarizations. They are used to interpret remote sensing data obtained from vegetation canopies where the anisotropic random-medium model applies. Field measurements with corn stalks arranged in various configurations with preferred azimuthal directions are successfully interpreted with this model.
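The reciprocity argument summarized above corresponds, in the usual notation, to an emissivity obtained by subtracting the hemispherically integrated bistatic scattering coefficients from unity. The expression below is the standard form of that relation, written here for illustration (with \(\gamma\) the bistatic scattering coefficients, \(e_\nu\) the emissivity at polarization \(\nu\), and \(T\) the physical temperature); it is not a reproduction of the paper's full layered formulation.

\[
e_\nu(\theta_0,\phi_0) \;=\; 1 \;-\; \frac{1}{4\pi}\int_0^{2\pi}\!\!\int_0^{\pi/2}
\bigl[\gamma_{h\nu}(\theta,\phi;\theta_0,\phi_0) + \gamma_{v\nu}(\theta,\phi;\theta_0,\phi_0)\bigr]
\sin\theta \, d\theta \, d\phi,
\qquad
T_{B,\nu}(\theta_0,\phi_0) \;=\; e_\nu(\theta_0,\phi_0)\, T .
\]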
Remote Video Monitor of Vehicles in Cooperative Information Platform
NASA Astrophysics Data System (ADS)
Qin, Guofeng; Wang, Xiaoguo; Wang, Li; Li, Yang; Li, Qiyan
Detection of vehicles plays an important role in modern intelligent traffic management, and pattern recognition is a hot issue in computer vision. An auto-recognition system in a cooperative information platform is studied. In the cooperative platform, 3G wireless networks, including GPS, GPRS (CDMA), Internet (Intranet), remote video monitor, and M-DMB networks, are integrated. The remote video information can be taken from the terminals and sent to the cooperative platform, then detected by the auto-recognition system. The images are pretreated and segmented, including feature extraction, template matching, and pattern recognition. The system identifies different models and gets vehicular traffic statistics. Finally, the implementation of the system is introduced.
Huszar, Gabor; Celik-Ozenci, Ciler; Cayli, Sevil; Kovacs, Tamas; Vigue, Lynne; Kovanci, Ertug
2004-01-01
We tested several approaches that can be used to preserve sperm attributes and the objective biochemical markers of sperm maturity and function for assessment in a remote centralized laboratory after overnight shipping of semen samples. Addition of phenyl-methyl-sulfonyl-fluoride (PMSF) to a final concentration of 20 microg/mL semen at 4 degrees C has preserved sperm concentrations and HspA2 isoform ratios, even at room temperature, simulating a shipping delay in moderate ambient temperatures. Regarding the attributes of individual spermatozoa, the patterns of CK-immunocytochemistry (demonstrates cytoplasmic retention in diminished-maturity spermatozoa); aniline blue staining pattern (tests chromatin maturity); sperm shape assessed by both Kruger strict morphology and computer assisted morphometry; and sperm DNA integrity, as tested by DNA nick translation, all remained unchanged. Thus, the PMSF-4 degrees C conditions preserved sperm concentrations and the cytoplasmic and nuclear biomarkers of sperm cellular maturity and function for next-day analysis. This shipping method will facilitate the early detection of subtle changes in semen quality that can affect sperm function, even when there has been no decline in sperm concentrations to signal possible toxic effects. Furthermore, sample preservation will enable investigators to evaluate semen for toxicology studies and for diagnosis of male infertility from remote locations. Home collection of semen should enhance study participation, and semen assessment in centralized laboratories will address concerns regarding interlaboratory variations and quality control.
Parallel Rendering of Large Time-Varying Volume Data
NASA Technical Reports Server (NTRS)
Garbutt, Alexander E.
2005-01-01
Interactive visualization of large time-varying 3D volume datasets has been and still is a great challenge to the modern computational world. It stretches the limits of the memory capacity, the disk space, the network bandwidth, and the CPU speed of a conventional computer. In this SURF project, we propose to develop a parallel volume rendering program on SGI's Prism, a cluster computer equipped with state-of-the-art graphics hardware. The proposed program combines both parallel computing and hardware rendering in order to achieve an interactive rendering rate. We use 3D texture mapping and a hardware shader to implement 3D volume rendering on each workstation. We use SGI's VisServer to enable remote rendering using Prism's graphics hardware. And last, we will integrate this new program with ParVox, a parallel distributed visualization system developed at JPL. At the end of the project, we will demonstrate remote interactive visualization using this new hardware volume renderer on JPL's Prism system using a time-varying dataset from selected JPL applications.
NASA Astrophysics Data System (ADS)
Davis, A. B.; Xu, F.; Diner, D. J.
2017-12-01
Two perennial problems in applied theoretical and computational radiative transfer (RT) are: (1) the impact of unresolved spatial variability on large-scale fluxes (in climate models) or radiances (in remote sensing); and (2) efficient-yet-accurate estimation of broadband spectral integrals in radiant energy budget estimation as well as in remote sensing, in particular, of trace gases. Generalized RT (GRT) is a modification of classic RT in an optical medium with uniform extinction where Beer's exponential law for direct transmission is replaced by a monotonically decreasing function with a slower power-law decay. In a convenient parameterized version of GRT, mean extinction replaces the uniform value and just one new property is introduced. As a non-dimensional metric for the unresolved variability, we use the square of the mean extinction coefficient divided by its variance. This parameter is also the exponent of the power-law tail of the modified transmission law. This specific form of sub-exponential transmission has been explored for almost two decades in application to spatial variability in the presence of long-range correlations, much like in turbulent media such as clouds, with a focus on multiple scattering. It has also been proposed by Conley and Collins (JQSRT, 112, 1525-, 2011) to improve on the standard (weak-line) implementation of the correlated-k technique for efficient spectral integration. We have merged these two applications within a rigorous formulation of the combined problem, and solve the new integral RT equations in the single-scattering limit. The result is illustrated by addressing practical problems in multi-angle remote sensing of aerosols using the O2 A-band, an emerging methodology for passive profiling of coarse aerosols and clouds.
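One common parameterization consistent with the description above, given here as an illustrative sketch rather than necessarily the exact form used by the authors, replaces Beer's law by a power-law transmission whose single new parameter \(a\) is the squared mean extinction divided by its variance:

\[
T_a(\tau) \;=\; \left(1 + \frac{\tau}{a}\right)^{-a},
\qquad
a \;=\; \frac{\langle \sigma \rangle^{2}}{\mathrm{Var}(\sigma)},
\qquad
\lim_{a \to \infty} T_a(\tau) \;=\; e^{-\tau},
\]

so the tail decays as \(\tau^{-a}\), matching the statement that \(a\) is both the variability metric and the exponent of the power-law tail, and Beer's law is recovered in the limit of vanishing variability.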
Future remote-sensing programs
NASA Technical Reports Server (NTRS)
Schweickart, R. L.
1975-01-01
User requirements and methods developed to fulfill them are discussed. Quick-look data, data storage on computer-compatible tape, and an integrated capability for production of images from the whole class of earth-viewing satellites are among the new developments briefly described. The increased capability of LANDSAT-C and Nimbus G and the needs of specialized applications such as urban land use planning, cartography, accurate measurement of small agricultural fields, thermal mapping, and coastal zone management are examined. The effect of the space shuttle on remote sensing technology through increased capability is considered.
ERIC Educational Resources Information Center
Laing, Gregory Kenneth; Perrin, Ronald William
2012-01-01
This paper presents the findings of a field study conducted to ascertain the perceptions of first year accounting students concerning the integration of computer applications in the accounting curriculum. The results indicate that both student cohorts perceived the computer as a valuable educational tool. The use of computers to enhance the…
Computer Technology-Infused Learning Enhancement
ERIC Educational Resources Information Center
Keengwe, Jared; Anyanwu, Longy O.
2007-01-01
The purpose of the study was to determine students' perception of instructional integration of computer technology to improve learning. Two key questions were investigated in this study: (a) What is the students' perception of faculty integration of computer technology into classroom instruction? (b) To what extent does the students' perception of…
Combining hyperspectral imaging and Raman spectroscopy for remote chemical sensing
NASA Astrophysics Data System (ADS)
Ingram, John M.; Lo, Edsanter
2008-04-01
The Photonics Research Center at the United States Military Academy is conducting research to demonstrate the feasibility of combining hyperspectral imaging and Raman spectroscopy for remote chemical detection over a broad area of interest. One limitation of future trace detection systems is their ability to analyze large areas of view. Hyperspectral imaging provides a balance between fast spectral analysis and scanning area. Integration of a hyperspectral system capable of remote chemical detection will greatly enhance our soldiers' ability to see the battlefield and make threat-related decisions. It can also cue the trace detection systems onto the correct interrogation area, saving time and reconnaissance/surveillance resources. This research develops both the sensor design and the detection/discrimination algorithms. One-meter remote detection without background radiation provides a simple proof of concept.
ERIC Educational Resources Information Center
Ilyes, Mark A.; Ortman-Link, Whitney
2009-01-01
Our school recently acquired Vernier's Wireless Dynamics Sensor System (WDSS). The WDSS consists of a three-axis accelerometer, altimeter, and force sensor that has the ability to remotely collect data for later transfer to a computer. While our primary purpose for acquiring the WDSS was to enhance our amusement park physics experiments, we…
Enhancing the Undergraduate Computing Experience in Chemical Engineering CACHE Corporation
ERIC Educational Resources Information Center
Edgar, Thomas F.
2006-01-01
This white paper focuses on the integration and enhancement of the computing experience for undergraduates throughout the chemical engineering curriculum. The computing experience for undergraduates in chemical engineering should have continuity and be coordinated from course to course, because a single software solution is difficult to achieve in…
The ORSER System for the Analysis of Remotely Sensed Digital Data
NASA Technical Reports Server (NTRS)
Myers, W. L.; Turner, B. J.
1981-01-01
The main effort of the Pennsylvania State University's Office for Remote Sensing of Earth Resources (ORSER) is the processing, analysis, and interpretation of multispectral data, most often supplied by NASA in the form of imagery and digital data. The facilities used for data reduction and image enhancement are described, as well as the development of algorithms for producing a computer map showing various environmental and land use characteristics of data points in the analyzed scenes. The application of an ORSER capability for statewide monitoring of gypsy moth defoliation is discussed.
Cloud CPFP: a shotgun proteomics data analysis pipeline using cloud and high performance computing.
Trudgian, David C; Mirzaei, Hamid
2012-12-07
We have extended the functionality of the Central Proteomics Facilities Pipeline (CPFP) to allow use of remote cloud and high performance computing (HPC) resources for shotgun proteomics data processing. CPFP has been modified to include modular local and remote scheduling for data processing jobs. The pipeline can now be run on a single PC or server, a local cluster, a remote HPC cluster, and/or the Amazon Web Services (AWS) cloud. We provide public images that allow easy deployment of CPFP in its entirety in the AWS cloud. This significantly reduces the effort necessary to use the software, and allows proteomics laboratories to pay for compute time ad hoc, rather than obtaining and maintaining expensive local server clusters. Alternatively the Amazon cloud can be used to increase the throughput of a local installation of CPFP as necessary. We demonstrate that cloud CPFP allows users to process data at higher speed than local installations but with similar cost and lower staff requirements. In addition to the computational improvements, the web interface to CPFP is simplified, and other functionalities are enhanced. The software is under active development at two leading institutions and continues to be released under an open-source license at http://cpfp.sourceforge.net.
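The "modular local and remote scheduling" described above can be illustrated with a small dispatcher that sends a job either to a local process or to a remote backend. This is an illustrative sketch only (CPFP itself is not written in Python, and the backend names and commands here are hypothetical), not the pipeline's actual code.

    # Illustrative sketch of a modular local/remote job scheduler interface,
    # in the spirit of the local/remote split described in the abstract.
    from abc import ABC, abstractmethod
    from typing import Optional
    import subprocess

    class Scheduler(ABC):
        @abstractmethod
        def submit(self, job_script: str) -> str:
            """Submit a job and return an identifier."""

    class LocalScheduler(Scheduler):
        def submit(self, job_script: str) -> str:
            # Run the job script as a local child process.
            proc = subprocess.Popen(["sh", job_script])
            return f"local-{proc.pid}"

    class RemoteScheduler(Scheduler):
        def __init__(self, endpoint: str):
            self.endpoint = endpoint  # e.g. an HPC head node or cloud API URL (hypothetical)
        def submit(self, job_script: str) -> str:
            # Placeholder: a real implementation would copy the script and
            # invoke the remote queueing system (e.g. over SSH or a cloud API).
            return f"remote-{hash((self.endpoint, job_script)) & 0xffff:04x}"

    def pick_scheduler(prefer_remote: bool, endpoint: Optional[str]) -> Scheduler:
        # Choose the backend per job, so a single pipeline can mix local,
        # cluster, and cloud execution as capacity and cost dictate.
        if prefer_remote and endpoint:
            return RemoteScheduler(endpoint)
        return LocalScheduler()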
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mengel, S.K.; Morrison, D.B.
1985-01-01
Consideration is given to global biogeochemical issues, image processing, remote sensing of tropical environments, global processes, geology, landcover hydrology, and ecosystems modeling. Topics discussed include multisensor remote sensing strategies, geographic information systems, radars, and agricultural remote sensing. Papers are presented on fast feature extraction; a computational approach for adjusting TM imagery terrain distortions; the segmentation of a textured image by a maximum likelihood classifier; analysis of MSS Landsat data; sun angle and background effects on spectral response of simulated forest canopies; an integrated approach for vegetation/landcover mapping with digital Landsat images; geological and geomorphological studies using an image processing technique; and wavelength intensity indices in relation to tree conditions and leaf-nutrient content.
Improved Interactive Medical-Imaging System
NASA Technical Reports Server (NTRS)
Ross, Muriel D.; Twombly, Ian A.; Senger, Steven
2003-01-01
An improved computational-simulation system for interactive medical imaging has been invented. The system displays high-resolution, three-dimensional-appearing images of anatomical objects based on data acquired by such techniques as computed tomography (CT) and magnetic-resonance imaging (MRI). The system enables users to manipulate the data to obtain a variety of views; for example, to display cross sections in specified planes or to rotate images about specified axes. Relative to prior such systems, this system offers enhanced capabilities for synthesizing images of surgical cuts and for collaboration by users at multiple, remote computing sites.
Singh, Tarkeshwar; Perry, Christopher M; Herter, Troy M
2016-01-26
Robotic and virtual-reality systems offer tremendous potential for improving assessment and rehabilitation of neurological disorders affecting the upper extremity. A key feature of these systems is that visual stimuli are often presented within the same workspace as the hands (i.e., peripersonal space). Integrating video-based remote eye tracking with robotic and virtual-reality systems can provide an additional tool for investigating how cognitive processes influence visuomotor learning and rehabilitation of the upper extremity. However, remote eye tracking systems typically compute ocular kinematics by assuming eye movements are made in a plane with constant depth (e.g. frontal plane). When visual stimuli are presented at variable depths (e.g. transverse plane), eye movements have a vergence component that may influence reliable detection of gaze events (fixations, smooth pursuits and saccades). To our knowledge, there are no available methods to classify gaze events in the transverse plane for monocular remote eye tracking systems. Here we present a geometrical method to compute ocular kinematics from a monocular remote eye tracking system when visual stimuli are presented in the transverse plane. We then use the obtained kinematics to compute velocity-based thresholds that allow us to accurately identify onsets and offsets of fixations, saccades and smooth pursuits. Finally, we validate our algorithm by comparing the gaze events computed by the algorithm with those obtained from the eye-tracking software and manual digitization. Within the transverse plane, our algorithm reliably differentiates saccades from fixations (static visual stimuli) and smooth pursuits from saccades and fixations when visual stimuli are dynamic. The proposed methods provide advancements for examining eye movements in robotic and virtual-reality systems. Our methods can also be used with other video-based or tablet-based systems in which eye movements are performed in a peripersonal plane with variable depth.
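A simplified version of the geometry and velocity-threshold classification described above is sketched below. It is not the authors' algorithm: the eye position, sampling rate, and thresholds are illustrative assumptions, and the classification is deliberately crude. The key point is that gaze points of regard in a tabletop (transverse) plane are converted to 3-D gaze vectors at the eye, so that varying depth and vergence are reflected in the angular velocity.

    # Minimal sketch: visual angles and velocity-threshold gaze classification
    # for points of regard in a transverse plane. All constants are assumed.
    import numpy as np

    EYE = np.array([0.0, -0.30, 0.45])   # assumed eye location (m) above/behind the plane origin
    FS = 500.0                            # assumed sampling rate (Hz)
    SACCADE_THRESH = 100.0                # deg/s, assumed
    PURSUIT_THRESH = 5.0                  # deg/s, assumed

    def gaze_vectors(points_xy):
        """Unit gaze vectors from the eye to points (x, y, 0) on the plane."""
        pts = np.column_stack([points_xy, np.zeros(len(points_xy))])
        vecs = pts - EYE
        return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)

    def angular_velocity(points_xy):
        g = gaze_vectors(points_xy)
        dots = np.clip(np.sum(g[:-1] * g[1:], axis=1), -1.0, 1.0)
        dtheta = np.degrees(np.arccos(dots))   # angle between successive samples
        return dtheta * FS                     # deg/s

    def classify(points_xy):
        # Crude labeling: fixation below the pursuit threshold, smooth pursuit
        # between thresholds (meaningful only for moving stimuli), saccade above.
        v = angular_velocity(points_xy)
        labels = np.full(v.shape, "fixation", dtype=object)
        labels[v > PURSUIT_THRESH] = "smooth_pursuit"
        labels[v > SACCADE_THRESH] = "saccade"
        return labels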
[Surgical robotics, short state of the art and prospects].
Gravez, P
2003-11-01
State-of-the-art robotized systems developed for surgery are either remotely controlled manipulators that duplicate gestures made by the surgeon (endoscopic surgery applications), or automated robots that execute trajectories defined relative to pre-operative medical imaging (neurosurgery and orthopaedic surgery). This generation of systems primarily applies existing robotics technologies (remote handling systems and so-called "industrial robots") to current surgical practices. It has contributed to validating the huge potential of surgical robotics, but it suffers from several drawbacks, mainly high costs, excessive dimensions, and some lack of user-friendliness. Nevertheless, technological progress lets us anticipate the appearance in the near future of miniaturised surgical robots able to assist the gesture of the surgeon and to enhance his perception of the operation at hand. Thanks to many in-the-body articulated links, these systems will be able to perform complex minimally invasive gestures without obstructing the operating theatre. They will also combine the facility of manual piloting with the accuracy and increased safety of computer control, guiding the gestures of the surgeon without encroaching on his freedom of action. Lastly, they will allow the surgeon to feel the mechanical properties of the tissues he is operating on through a genuine "remote palpation" function. Most probably, such technological evolutions will lead the way to redesigned surgical procedures taking place inside new operating rooms featuring better integration of all equipment and favouring cooperative work by multidisciplinary and sometimes geographically distributed medical staff.
Robotic long-distance telementoring in neurosurgery.
Mendez, Ivar; Hill, Ron; Clarke, David; Kolyvas, George; Walling, Simon
2005-03-01
To test the feasibility of long-distance telementoring in neurosurgery by providing subspecialized expertise in real time to another neurosurgeon performing a surgical procedure in a remote location. A robotic telecollaboration system (Socrates; Computer Motion, Inc., Santa Barbara, CA) capable of controlling the movements of a robotic arm, of handling two-way video, and of audio communication as well as transmission of neuronavigational data from the remote operating room was used for the telementoring procedures. Four integrated services digital network lines with a total speed of transmission of 512 kilobytes per second provided telecommunications between a large academic center (Halifax, Nova Scotia) and a community-based center (Saint John, New Brunswick) located 400 km away. Long-distance telementoring was used in three craniotomies for brain tumors, a craniotomy for an arteriovenous malformation, a carotid endarterectomy, and a lumbar laminectomy. There were no surgical complications during the procedures, and all patients had uneventful outcomes. The neurosurgeons in the remote location believed that the input from the mentors was useful in all of the cases and was crucial in the removal of a mesial temporal lobe glioma and resection of an occipital arteriovenous malformation. Our initial experience with long-distance robotic-assisted telementoring in six cases indicates that telementoring is feasible, reliable, and safe. Although still in its infancy, telementoring has the potential to improve surgical care, to enhance neurosurgical training, and to have a major impact on the delivery of neurosurgical services throughout the world.
Providing Inservice Teachers with a "TICKET" to Digital Delivery Systems.
ERIC Educational Resources Information Center
Kochery, Timothy S.
2000-01-01
In July 1998, Pennsylvania State University, Mont Alto, began its version of the "TICKET" (Technology Integration Certification for K-12 Educators and Teachers) program that provides classroom teachers in remote regional school districts with access to educational opportunities to develop technology competencies that can enhance their instruction.…
Qin, Changbo; Jia, Yangwen; Su, Z; Zhou, Zuhao; Qiu, Yaqin; Suhui, Shen
2008-07-29
This paper investigates whether remote sensing evapotranspiration estimates can be integrated by means of data assimilation into a distributed hydrological model for improving the predictions of spatial water distribution over a large river basin with an area of 317,800 km2. A series of available MODIS satellite images over the Haihe River basin in China are used for the year 2005. Evapotranspiration is retrieved from these 1×1 km resolution images using the SEBS (Surface Energy Balance System) algorithm. The physically-based distributed model WEP-L (Water and Energy transfer Process in Large river basins) is used to compute the water balance of the Haihe River basin in the same year. Comparison between model-derived and remote sensing retrieval basin-averaged evapotranspiration estimates shows a good piecewise linear relationship, but their spatial distribution within the Haihe basin is different. The remote sensing derived evapotranspiration shows variability at finer scales. An extended Kalman filter (EKF) data assimilation algorithm, suitable for non-linear problems, is used. Assimilation results indicate that remote sensing observations have a potentially important role in providing spatial information to the assimilation system for the spatially optimal hydrological parameterization of the model. This is especially important for large basins, such as the Haihe River basin in this study. Combining and integrating the capabilities of and information from model simulation and remote sensing techniques may provide the best spatial and temporal characteristics for hydrological states/fluxes, and would be both appealing and necessary for improving our knowledge of fundamental hydrological processes and for addressing important water resource management problems.
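For reference, the extended Kalman filter update mentioned above takes the standard textbook form below, with \(x\) the model state propagated by the hydrological model, \(y\) the remotely sensed evapotranspiration, \(\mathcal{H}\) the observation operator mapping model states to evapotranspiration, and \(H\) its Jacobian; the specific state vector and error covariances used in the study are not reproduced here.

\[
K_k = P_k^{f} H_k^{T}\bigl(H_k P_k^{f} H_k^{T} + R_k\bigr)^{-1},
\qquad
x_k^{a} = x_k^{f} + K_k\bigl(y_k - \mathcal{H}(x_k^{f})\bigr),
\qquad
P_k^{a} = (I - K_k H_k)\,P_k^{f},
\]

where \(x_k^{f}\) and \(P_k^{f}\) are the forecast state and its error covariance, \(R_k\) is the observation error covariance of the retrieved evapotranspiration, and the analysis \(x_k^{a}\) is the model state nudged toward the observations in proportion to their relative uncertainties.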
2015-01-01
a spatial resolution of 250-m. The Gumley et al. computation for MODIS sharpening is given as a ratio of high to low resolution top of the atmosphere...NIR) correction (Stumpf, Arnone, Gould, Martinolich, & Ransibrahamanakul, 2003). Standard flags were used to mask interference from land, clouds, sun...technique. This new approach expands on the methodology described by Gumley et al. (2010), with some modifications. We will compute a similar spatial
Remote Control and Monitoring of VLBI Experiments by Smartphones
NASA Astrophysics Data System (ADS)
Ruztort, C. H.; Hase, H.; Zapata, O.; Pedreros, F.
2012-12-01
For the remote control and monitoring of VLBI operations, we developed software optimized for smartphones. This is a new tool based on a client-server architecture with a Web interface optimized for smartphone screens and cellphone networks. The server uses variables of the Field System and its station-specific parameters stored in shared memory. The client, running on the smartphone through a Web interface, analyzes and visualizes the current status of the radio telescope, receiver, schedule, and recorder. In addition, it allows commands to be sent remotely to the Field System computer and displays the log entries. The user has full access to the entire operation process, which is important in emergency cases. The software also integrates a webcam interface.
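A client of the kind described above can be as simple as a loop polling a status endpoint served by the station computer. The sketch below is purely illustrative: the URL and the JSON field names are assumptions, and the real system reads the Field System shared memory on the server side rather than exposing these particular fields.

    # Minimal sketch of a polling monitor client; endpoint and fields are hypothetical.
    import json
    import time
    import urllib.request

    STATUS_URL = "http://station.example.org/vlbi/status.json"  # hypothetical

    def fetch_status(url=STATUS_URL):
        with urllib.request.urlopen(url, timeout=5) as resp:
            return json.load(resp)

    def monitor(interval_s=10):
        while True:
            s = fetch_status()
            print(f"source={s.get('source')} az={s.get('azimuth')} "
                  f"el={s.get('elevation')} recorder={s.get('recorder_state')}")
            time.sleep(interval_s)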
Integration of advanced technologies to enhance problem-based learning over distance: Project TOUCH.
Jacobs, Joshua; Caudell, Thomas; Wilks, David; Keep, Marcus F; Mitchell, Steven; Buchanan, Holly; Saland, Linda; Rosenheimer, Julie; Lozanoff, Beth K; Lozanoff, Scott; Saiki, Stanley; Alverson, Dale
2003-01-01
Distance education delivery has increased dramatically in recent years as a result of the rapid advancement of communication technology. The National Computational Science Alliance's Access Grid represents a significant advancement in communication technology with potential for distance medical education. The purpose of this study is to provide an overview of the TOUCH project (Telehealth Outreach for Unified Community Health; http://hsc.unm.edu/touch) with special emphasis on the process of problem-based learning case development for distribution over the Access Grid. The objective of the TOUCH project is to use emerging Internet-based technology to overcome geographic barriers for delivery of tutorial sessions to medical students pursuing rotations at remote sites. The TOUCH project also is aimed at developing a patient simulation engine and an immersive virtual reality environment to achieve a realistic health care scenario enhancing the learning experience. A traumatic head injury case is developed and distributed over the Access Grid as a demonstration of the TOUCH system. Project TOUCH serves as an example of a computer-based learning system for developing and implementing problem-based learning cases within the medical curriculum, but this system should be easily applied to other educational environments and disciplines involving functional and clinical anatomy. Future phases will explore PC versions of the TOUCH cases for increased distribution. Copyright 2003 Wiley-Liss, Inc.
Remote sensing with intense filaments enhanced by adaptive optics
NASA Astrophysics Data System (ADS)
Daigle, J.-F.; Kamali, Y.; Châteauneuf, M.; Tremblay, G.; Théberge, F.; Dubois, J.; Roy, G.; Chin, S. L.
2009-11-01
A method involving a closed-loop adaptive optics system is investigated as a tool to significantly enhance the collected optical emissions for remote sensing applications involving ultrafast laser filamentation. The technique combines beam expansion and geometrical focusing, assisted by an adaptive optics system to correct the wavefront aberrations. Targets, such as a gaseous mixture of air and hydrocarbons, solid lead, and airborne clouds of contaminated aqueous aerosols, were remotely probed with filaments generated at distances up to 118 m after the focusing beam expander. The integrated backscattered signals collected by the detection system (15-28 m from the filaments) were increased by up to a factor of 7, for atmospheric N2 and solid lead, when the wavefronts were corrected by the adaptive optics system. Moreover, an extrapolation based on a simplified version of the LIDAR equation showed that the adaptive optics system improved the detection distance for N2 molecular fluorescence from 45 m for uncorrected wavefronts to 125 m for corrected wavefronts.
Custom Sky-Image Mosaics from NASA's Information Power Grid
NASA Technical Reports Server (NTRS)
Jacob, Joseph; Collier, James; Craymer, Loring; Curkendall, David
2005-01-01
yourSkyG is the second generation of the software described in yourSky: Custom Sky-Image Mosaics via the Internet (NPO-30556), NASA Tech Briefs, Vol. 27, No. 6 (June 2003), page 45. Like its predecessor, yourSkyG supplies custom astronomical image mosaics of sky regions specified by requesters using client computers connected to the Internet. Whereas yourSky constructs mosaics on a local multiprocessor system, yourSkyG performs the computations on NASA's Information Power Grid (IPG), which is capable of performing much larger mosaicking tasks. (The IPG is a high-performance computation and data grid that integrates geographically distributed computers, databases, and instruments.) A user of yourSkyG can specify parameters describing a mosaic to be constructed. yourSkyG then constructs the mosaic on the IPG and makes it available for downloading by the user. The complexities of determining which input images are required to construct a mosaic, retrieving the required input images from remote sky-survey archives, uploading the images to the computers on the IPG, performing the computations remotely on the Grid, and downloading the resulting mosaic from the Grid are all transparent to the user.
NASA Technical Reports Server (NTRS)
Park, Steve
1990-01-01
A large and diverse number of computational techniques are routinely used to process and analyze remotely sensed data. These techniques include: univariate statistics; multivariate statistics; principal component analysis; pattern recognition and classification; other multivariate techniques; geometric correction; registration and resampling; radiometric correction; enhancement; restoration; Fourier analysis; and filtering. Each of these techniques will be considered, in order.
Kahnert, Michael; Nousiainen, Timo; Lindqvist, Hannakaisa
2013-04-08
Optical properties of light absorbing carbon (LAC) aggregates encapsulated in a shell of sulfate are computed for realistic model geometries based on field measurements. Computations are performed for wavelengths from the UV-C to the mid-IR. Both climate- and remote sensing-relevant optical properties are considered. The results are compared to commonly used simplified model geometries, none of which gives a realistic representation of the distribution of the LAC mass within the host material and, as a consequence, fail to predict the optical properties accurately. A new core-gray shell model is introduced, which accurately reproduces the size- and wavelength dependence of the integrated and differential optical properties.
Note: computer controlled rotation mount for large diameter optics.
Rakonjac, Ana; Roberts, Kris O; Deb, Amita B; Kjærgaard, Niels
2013-02-01
We describe the construction of a motorized optical rotation mount with a 40 mm clear aperture. The device is used to remotely control the power of large diameter laser beams for a magneto-optical trap. A piezo-electric ultrasonic motor on a printed circuit board provides rotation with a precision better than 0.03° and allows for a very compact design. The rotation unit is controlled from a computer via serial communication, making integration into most software control platforms straightforward.
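Computer control of a motorized mount over a serial link, as described above, typically reduces to sending short ASCII commands to the motor controller. The sketch below is a hedged illustration in that spirit: the port, baud rate, and command syntax ("MOVE", "POS?") are hypothetical placeholders, not the actual firmware protocol of the device; it assumes the pyserial package.

    # Hedged sketch: serial control of a motorized rotation mount.
    # Port, baud rate, and command strings are illustrative assumptions.
    import serial  # pyserial

    class RotationMount:
        def __init__(self, port="/dev/ttyUSB0", baudrate=115200):
            self.conn = serial.Serial(port, baudrate, timeout=1)

        def move_to(self, angle_deg: float):
            # Hypothetical command format "MOVE <angle>\n"
            self.conn.write(f"MOVE {angle_deg:.3f}\n".encode("ascii"))

        def read_angle(self) -> float:
            self.conn.write(b"POS?\n")  # hypothetical query command
            return float(self.conn.readline().decode().strip())

        def close(self):
            self.conn.close()

    if __name__ == "__main__":
        mount = RotationMount()
        mount.move_to(45.0)   # e.g. rotate a waveplate to set beam power
        print(mount.read_angle())
        mount.close()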
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berg, D.E.
1981-02-01
The Control Data Corporation Type 200 User Terminal utilizes a unique communications protocol to provide users with batch mode remote terminal access to Control Data computers. CDC/1000 is a software subsystem that implements this protocol on Hewlett-Packard minicomputers running the Real Time Executive III, IV, or IVB operating systems. This report provides brief descriptions of the various software modules comprising CDC/1000, and contains detailed instructions for integrating CDC/1000 into the Hewlett Packard operating system and for operating UTERM, the user interface program for CDC/1000. 6 figures.
ORBIT: an integrated environment for user-customized bioinformatics tools.
Bellgard, M I; Hiew, H L; Hunter, A; Wiebrands, M
1999-10-01
There are a large number of computational programs freely available to bioinformaticians via a client/server, web-based environment. However, the client interface to these tools (typically an HTML form page) cannot be customized from the client side, as it is created by the service provider. The form page is usually generic enough to cater for a wide range of users. However, this implies that a user cannot set advanced program parameters as defaults on the form, or customize the interface to his/her specific requirements or preferences. Currently, there is a lack of end-user interface environments that can be modified by the user when accessing computer programs available on a remote server running on an intranet or over the Internet. We have implemented a client/server system called ORBIT (Online Researcher's Bioinformatics Interface Tools) where individual clients can have interfaces created and customized to command-line-driven, server-side programs. Thus, Internet-based interfaces can be tailored to a user's specific bioinformatic needs. As interfaces are created on the client machine independent of the server, there can be different interfaces to the same server-side program to cater for different parameter settings. The interface customization is relatively quick (between 10 and 60 min) and all client interfaces are integrated into a single modular environment which will run on any computer platform supporting Java. The system has been developed to allow for a number of future enhancements and features. ORBIT represents an important advance in the way researchers gain access to bioinformatics tools on the Internet.
Miga, Michael I
2016-01-01
With the recent advances in computing, the opportunities to translate computational models to more integrated roles in patient treatment are expanding at an exciting rate. One area of considerable development has been directed towards correcting soft tissue deformation within image guided neurosurgery applications. This review captures the efforts that have been undertaken towards enhancing neuronavigation by the integration of soft tissue biomechanical models, imaging and sensing technologies, and algorithmic developments. In addition, the review speaks to the evolving role of modeling frameworks within surgery and concludes with some future directions beyond neurosurgical applications.
NASA Technical Reports Server (NTRS)
Myers, V. I.; Frazee, C. J.; Rusche, A. E.; Moore, D. G.; Nelson, G. D.; Westin, F. C.
1974-01-01
The basic procedures for interpreting remote sensing imagery to rapidly develop general soils and land use inventories were developed and utilized in Pennington County, South Dakota. These procedures and remote sensing data products were illustrated and explained to many user groups, some of whom are interested in obtaining similar data. The general soils data were integrated with land soils data supplied by the county director of equalization to prepare a land value map. A computer print-out of this map indicating a land value for each quarter section is being used in tax reappraisal of Pennington County. The land use data provided the land use planners with the present use of land in Pennington County. Additional uses of remote sensing applications are also discussed including tornado damage assessment, hail damage evaluation, and presentation of soil and land value information on base maps assembled from ERTS-1 imagery.
Satellite remote sensing for hydrology and water management
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barrett, E.C.; Power, C.H.; Micallef, A.
Interest in satellite remote sensing is fast moving away from pure science and individual case studies towards truly operational applications. At the same time the micro-computer revolution is ensuring that data reception and processing facilities need no longer be the preserve of a small number of global centers, but can be common-place installations in smaller countries and even local regional agency offices or laboratories. As remote sensing matures, and its applications proliferate, a new type of treatment is required to ensure both that decision makers, managers and engineers with problems to solve are informed of today's opportunities and that scientists are provided with integrated overviews of the ever-growing need for their services. This book addresses these needs, focusing uniquely on the area bounded by satellite remote sensing, pure and applied hydrological sciences, and a specific world region, namely the Mediterranean basin.
Using component technologies for web based wavelet enhanced mammographic image visualization.
Sakellaropoulos, P; Costaridou, L; Panayiotakis, G
2000-01-01
The poor contrast detectability of mammography can be dealt with by domain specific software visualization tools. Remote desktop client access and time performance limitations of a previously reported visualization tool are addressed, aiming at more efficient visualization of mammographic image resources existing in web or PACS image servers. This effort is also motivated by the fact that at present, web browsers do not support domain-specific medical image visualization. To deal with desktop client access the tool was redesigned by exploring component technologies, enabling the integration of stand alone domain specific mammographic image functionality in a web browsing environment (web adaptation). The integration method is based on ActiveX Document Server technology. ActiveX Document is a part of Object Linking and Embedding (OLE) extensible systems object technology, offering new services in existing applications. The standard DICOM 3.0 part 10 compatible image-format specification Papyrus 3.0 is supported, in addition to standard digitization formats such as TIFF. The visualization functionality of the tool has been enhanced by including a fast wavelet transform implementation, which allows for real time wavelet based contrast enhancement and denoising operations. Initial use of the tool with mammograms of various breast structures demonstrated its potential in improving visualization of diagnostic mammographic features. Web adaptation and real time wavelet processing enhance the potential of the previously reported tool in remote diagnosis and education in mammography.
Privacy-preserving public auditing for data integrity in cloud
NASA Astrophysics Data System (ADS)
Shaik Saleem, M.; Murali, M.
2018-04-01
Cloud computing, which has attracted considerable attention from both the research community and industry, provides a large pool of computing resources through virtualized sharing of storage, processing power, applications and services. Cloud users are provided with on-demand resources as they need them. An outsourced file can easily be tampered with, because it is stored in a third-party service provider's databases; the cloud user has no control over the data and therefore no inherent guarantee of its integrity, so providing security assurance for user data has become one of the primary concerns for cloud service providers. Cloud servers are not responsible for data loss and do not by themselves provide this assurance to the cloud user. Remote data integrity checking (RDIC) allows a data owner to verify that a storage server is really storing the owner's data truthfully. RDIC comprises a security model and an ID-based RDIC scheme, which is responsible for the security of every server and ensures the data privacy of the cloud user against the third-party verifier. In general, by running a two-party RDIC protocol, clients can themselves check the integrity of their data in the cloud. In the two-party scenario, however, the verification result, coming from either the data owner or the cloud server, may be considered one-sided. The public verifiability feature of RDIC gives all users the privilege to verify whether the original data have been modified. To ensure the transparency of publicly verifiable RDIC protocols, we assume there exists a third-party auditor (TPA) with the knowledge and efficiency to perform the verification work and to report the condition clearly through publicly verifiable RDIC protocols.
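To make the challenge-response idea behind RDIC concrete, the sketch below shows a deliberately naive block-level integrity check: the owner tags each block with an HMAC before outsourcing, the server returns the challenged blocks, and the verifier recomputes the tags. The function names, block size and file name are illustrative only; this is not the ID-based, publicly verifiable scheme discussed above, and real RDIC protocols use homomorphic authenticators so that proofs stay constant-size and the auditor never sees the data.

```python
import hmac, hashlib, os, random

BLOCK_SIZE = 4096  # bytes per block (illustrative choice)

def tag_blocks(path, key):
    """Owner-side setup: compute an HMAC tag per block before outsourcing."""
    tags = []
    with open(path, "rb") as f:
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            tags.append(hmac.new(key, block, hashlib.sha256).digest())
    return tags

def prove(path, indices):
    """Server-side: return the requested blocks as proof material."""
    blocks = []
    with open(path, "rb") as f:
        for i in indices:
            f.seek(i * BLOCK_SIZE)
            blocks.append(f.read(BLOCK_SIZE))
    return blocks

def verify(blocks, indices, tags, key):
    """Verifier-side: recompute HMACs for the challenged blocks and compare."""
    return all(
        hmac.compare_digest(hmac.new(key, b, hashlib.sha256).digest(), tags[i])
        for b, i in zip(blocks, indices)
    )

if __name__ == "__main__":
    key = os.urandom(32)
    tags = tag_blocks("outsourced.dat", key)              # before upload
    challenge = random.sample(range(len(tags)), k=min(5, len(tags)))
    proof = prove("outsourced.dat", challenge)            # done by the server
    print("integrity holds:", verify(proof, challenge, tags, key))
```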
Improving Remote Voting Security with CodeVoting
NASA Astrophysics Data System (ADS)
Joaquim, Rui; Ribeiro, Carlos; Ferreira, Paulo
One of the major problems that prevents the spread of elections with the possibility of remote voting over electronic networks, also called Internet Voting, is the use of unreliable client platforms, such as the voter's computer and the Internet infrastructure connecting it to the election server. A computer connected to the Internet is exposed to viruses, worms, Trojans, spyware, malware and other threats that can compromise the election's integrity. For instance, it is possible to write a virus that changes the voter's vote to a predetermined vote on election day. Another possible attack is the creation of a fake election web site where a malicious voting program manipulates the voter's vote (phishing/pharming attack). Such attacks may not disturb the election protocol and can therefore remain undetected in the eyes of the election auditors.
DOT National Transportation Integrated Search
2012-03-01
This report describes the Phase Two enhancement of terrestrial LiDAR scanning for bridge damage evaluation that was initially developed in Phase One. Considering the spatial and reflectivity information contained in LiDAR scans, two detection algorit...
Context-aware access control for pervasive access to process-based healthcare systems.
Koufi, Vassiliki; Vassilacopoulos, George
2008-01-01
Healthcare is an increasingly collaborative enterprise involving a broad range of healthcare services provided by many individuals and organizations. Grid technology has been widely recognized as a means for integrating disparate computing resources in the healthcare field. Moreover, Grid portal applications can be developed on a wireless and mobile infrastructure to execute healthcare processes which, in turn, can provide remote access to Grid database services. Such an environment provides ubiquitous and pervasive access to integrated healthcare services at the point of care, thus improving healthcare quality. In such environments, the ability to provide an effective access control mechanism that meets the requirement of the least privilege principle is essential. Adherence to the least privilege principle requires continuous adjustments of user permissions in order to adapt to the current situation. This paper presents a context-aware access control mechanism for HDGPortal, a Grid portal application which provides access to workflow-based healthcare processes using wireless Personal Digital Assistants. The proposed mechanism builds upon and enhances security mechanisms provided by the Grid Security Infrastructure. It provides tight, just-in-time permissions so that authorized users get access to specific objects according to the current context. These permissions are subject to continuous adjustments triggered by the changing context. Thus, the risk of compromising information integrity during task executions is reduced.
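As a rough illustration of just-in-time, context-dependent permissions, the sketch below evaluates a tiny policy table against the caller's current context on every request, so access adapts as the context changes. The roles, objects and rules are hypothetical and are not taken from HDGPortal or the Grid Security Infrastructure.

```python
from dataclasses import dataclass
from datetime import datetime, time

@dataclass
class Context:
    role: str          # e.g. "nurse", "physician"
    ward: str          # location reported for the mobile device
    now: datetime

# Hypothetical policy table: (role, object) -> predicate over the current context
POLICY = {
    ("nurse", "patient_record"): lambda ctx: ctx.ward == "ICU"
        and time(7, 0) <= ctx.now.time() <= time(19, 0),
    ("physician", "patient_record"): lambda ctx: True,
}

def permitted(role, obj, ctx):
    """Grant access only if a rule exists and the current context satisfies it."""
    rule = POLICY.get((role, obj))
    return bool(rule and rule(ctx))

ctx = Context(role="nurse", ward="ICU", now=datetime.now())
print(permitted(ctx.role, "patient_record", ctx))  # re-evaluated on every request
```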
NASA Astrophysics Data System (ADS)
Keleshis, C.; Ioannou, S.; Vrekoussis, M.; Levin, Z.; Lange, M. A.
2014-08-01
Continuous advances in unmanned aerial vehicles (UAVs) and the increased complexity of their applications raise the demand for improved data acquisition systems (DAQ). These improvements may comprise low power consumption, low volume and weight, robustness, modularity and the capability to interface with various sensors and peripherals while maintaining high sampling rates and processing speeds. Such a system has been designed and developed and is currently integrated on the Autonomous Flying Platforms for Atmospheric and Earth Surface Observations (APAESO/NEA-YΠOΔOMH/NEKΠ/0308/09); however, it can be easily adapted to any UAV or other mobile vehicle. The system consists of a single-board computer with a dual-core processor, rugged surface-mount memory and storage, analog and digital input-output ports, and many other peripherals that enhance its connectivity with various sensors, imagers and on-board devices. The system is powered by a high-efficiency power supply board. Additional boards, such as frame-grabbers, differential global positioning system (DGPS) satellite receivers, and general packet radio service (3G-4G-GPRS) modems for communication redundancy, have been interfaced to the core system and are used whenever there is a mission need. The onboard DAQ system can be preprogrammed for automatic data acquisition, or it can be remotely operated during the flight from the ground control station (GCS) using a graphical user interface (GUI), which has also been developed and is presented in this paper. The unique design of the GUI and the DAQ system enables the synchronized acquisition of a variety of scientific and UAV flight data in a single core location. The new DAQ system and the GUI have been successfully utilized in several scientific UAV missions. In conclusion, the novel DAQ system provides the UAV and remote-sensing communities with a new tool capable of reliably acquiring, processing, storing and transmitting data from any sensor integrated on a UAV.
REMOTE: Modem Communicator Program for the IBM personal computer
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGirt, F.
1984-06-01
REMOTE, a Modem Communicator Program, was developed to provide full duplex serial communication with arbitrary remote computers via either dial-up telephone modems or direct lines. The latest version of REMOTE (documented in this report) was developed for the IBM Personal Computer.
A pen-based system to support pre-operative data collection within an anaesthesia department.
Sanz, M. F.; Gómez, E. J.; Trueba, I.; Cano, P.; Arredondo, M. T.; del Pozo, F.
1993-01-01
This paper describes the design and implementation of a pen-based computer system for remote pre-operative data collection. The system is envisaged to be used by anaesthesia staff in the different hospital settings where pre-operative data are generated. Pen-based technology offers important advantages in terms of portability and human-computer interaction, such as direct-manipulation interfaces based on direct pointing and "notebook user interface" metaphors. Because human factors analysis and user interface design are vital to achieving appropriate user acceptability, a methodology that integrates usability evaluation from the earliest development stages was used. Additionally, the selection of a pen-based computer as the portable device to be used by healthcare personnel allows the appropriateness of this new technology for remote data collection within the hospital environment to be evaluated. The work presented is currently being carried out under the research project "TANIT: Telematics in Anaesthesia and Intensive Care", within the "A.I.M.--Telematics in Health Care" European Research Program. PMID:8130488
ERIC Educational Resources Information Center
Alexiadis, D. S.; Mitianoudis, N.
2013-01-01
Digital signal processing (DSP) has been an integral part of most electrical, electronic, and computer engineering curricula. The applications of DSP in multimedia (audio, image, video) storage, transmission, and analysis are also widely taught at both the undergraduate and post-graduate levels, as digital multimedia can be encountered in most…
Frozen blood products: clinically effective and potentially ideal for remote Australia.
Holley, A; Marks, D C; Johnson, L; Reade, M C; Badloe, J F; Noorman, F
2013-01-01
The development of effective cryopreservation techniques for both red blood cells and platelets, which maintain ex vivo biological activity, in combination with frozen plasma, provides for a unique blood banking strategy. This technology greatly enhances the storage life of these products. The rationale and potential advantages of using cryopreservation techniques for the provision of blood products to remote and military environments have been effectively demonstrated in several conflicts over the last decade. Current haemostatic resuscitation doctrine for the exsanguinating patient supports the use of red blood cells, platelets and frozen plasma early in the resuscitation. We believe an integrated fresh-frozen blood bank inventory could facilitate provision of blood products, not only in the military setting but also in regional Australia, by overcoming many logistic and geographical challenges. The processes involved in production and point of care thawing are sufficiently well developed and achievable to make this technology a viable option. The potential limitations of cryopreservation and subsequent product thawing need to be considered if such a strategy is to be developed. A substantial body of international experience using cryopreserved products in remote settings has already been accrued. This experience provides a template for the possible creation of an Australian integrated fresh-frozen blood bank inventory that could conceivably enhance the care of patients in both regional Australia and in the military setting.
Use of IRI to Model the Effect of Ionosphere Emission on Earth Remote Sensing at L-Band
NASA Technical Reports Server (NTRS)
Abraham, Saji; LeVine, David M.
2004-01-01
Microwave remote sensing in the window at 1.413 GHz (L-band) set aside for passive use only is important for monitoring sea surface salinity and soil moisture. These parameters are important for understanding ocean dynamics and energy exchange between the surface and atmosphere, and both NASA and ESA plan to launch satellite sensors to monitor these parameters at L-band (Aquarius, Hydros and SMOS). The ionosphere is an important source of error for passive remote sensing at this frequency. In addition to Faraday rotation, emission from the ionosphere is also a potential source of error at L-band. As an aid for correcting for emission, a regression model is presented that relates ionosphere emission to the integrated electron density (TEC). The goal is to use TEC from sources such as TOPEX, JASON or GPS to obtain estimates of emission over the oceans where the electron density profiles needed to compute emission are not available. In addition, data will also be presented to evaluate the use of the IRI for computing emission over the ocean.
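For illustration only, the sketch below fits the kind of linear regression the abstract describes, relating ionospheric brightness temperature at L-band to TEC. The numbers are made-up placeholders, not the paper's coefficients; real values would come from electron-density profiles and a radiative-transfer calculation.

```python
import numpy as np

# Hypothetical (TEC, brightness-temperature) pairs standing in for model output.
tec = np.array([5.0, 10.0, 20.0, 40.0, 60.0, 80.0])   # TECU (1 TECU = 1e16 electrons/m^2)
tb  = np.array([0.05, 0.10, 0.21, 0.43, 0.66, 0.89])  # K, ionospheric emission at 1.413 GHz

slope, intercept = np.polyfit(tec, tb, 1)              # least-squares linear fit
print(f"T_B ~ {intercept:.3f} + {slope:.4f} * TEC  [K]")

# Applying the regression to a TEC value taken from TOPEX, JASON or GPS:
tec_from_gps = 35.0
print("estimated emission:", intercept + slope * tec_from_gps, "K")
```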
The Hico Image Processing System: A Web-Accessible Hyperspectral Remote Sensing Toolbox
NASA Astrophysics Data System (ADS)
Harris, A. T., III; Goodman, J.; Justice, B.
2014-12-01
As the quantity of Earth-observation data increases, the use-case for hosting analytical tools in geospatial data centers becomes increasingly attractive. To address this need, HySpeed Computing and Exelis VIS have developed the HICO Image Processing System, a prototype cloud computing system that provides online, on-demand, scalable remote sensing image processing capabilities. The system provides a mechanism for delivering sophisticated image processing analytics and data visualization tools into the hands of a global user community, who will only need a browser and internet connection to perform analysis. Functionality of the HICO Image Processing System is demonstrated using imagery from the Hyperspectral Imager for the Coastal Ocean (HICO), an imaging spectrometer located on the International Space Station (ISS) that is optimized for acquisition of aquatic targets. Example applications include a collection of coastal remote sensing algorithms that are directed at deriving critical information on water and habitat characteristics of our vulnerable coastal environment. The project leverages the ENVI Services Engine as the framework for all image processing tasks, and can readily accommodate the rapid integration of new algorithms, datasets and processing tools.
An automated digital imaging system for environmental monitoring applications
Bogle, Rian; Velasco, Miguel; Vogel, John
2013-01-01
Recent improvements in the affordability and availability of high-resolution digital cameras, data loggers, embedded computers, and radio/cellular modems have advanced the development of sophisticated automated systems for remote imaging. Researchers have successfully placed and operated automated digital cameras in remote locations and in extremes of temperature and humidity, ranging from the islands of the South Pacific to the Mojave Desert and the Grand Canyon. With the integration of environmental sensors, these automated systems are able to respond to local conditions and modify their imaging regimes as needed. In this report we describe in detail the design of one type of automated imaging system developed by our group. It is easily replicated, low-cost, highly robust, and is a stand-alone automated camera designed to be placed in remote locations, without wireless connectivity.
Dynamic provisioning of local and remote compute resources with OpenStack
NASA Astrophysics Data System (ADS)
Giffels, M.; Hauth, T.; Polgart, F.; Quast, G.
2015-12-01
Modern high-energy physics experiments rely on the extensive usage of computing resources, both for the reconstruction of measured events as well as for Monte-Carlo simulation. The Institut für Experimentelle Kernphysik (EKP) at KIT is participating in both the CMS and Belle experiments with computing and storage resources. In the upcoming years, these requirements are expected to increase due to the growing amount of recorded data and the rise in complexity of the simulated events. It is therefore essential to increase the available computing capabilities by tapping into all resource pools. At the EKP institute, powerful desktop machines are available to users. Due to the multi-core nature of modern CPUs, vast amounts of CPU time are not utilized by common desktop usage patterns. Other important providers of compute capabilities are classical HPC data centers at universities or national research centers. Due to the shared nature of these installations, the standardized software stack required by HEP applications cannot be installed. A viable way to overcome this constraint and offer a standardized software environment in a transparent manner is the usage of virtualization technologies. The OpenStack project has become a widely adopted solution to virtualize hardware and offer additional services like storage and virtual machine management. This contribution will report on the incorporation of the institute's desktop machines into a private OpenStack Cloud. The additional compute resources provisioned via the virtual machines have been used for Monte-Carlo simulation and data analysis. Furthermore, a concept to integrate shared, remote HPC centers into regular HEP job workflows will be presented. In this approach, local and remote resources are merged to form a uniform, virtual compute cluster with a single point-of-entry for the user. Evaluations of the performance and stability of this setup and operational experiences will be discussed.
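As a rough sketch of how such dynamic provisioning can look from the Python side, the snippet below boots a virtual worker node with the openstacksdk client. The cloud name, image, flavor and network names are placeholders, and this is not the EKP production tooling.

```python
import openstack

# Connect using credentials from clouds.yaml (cloud name is a placeholder).
conn = openstack.connect(cloud="ekp-private-cloud")

image  = conn.compute.find_image("worker-node-sl7")   # placeholder image name
flavor = conn.compute.find_flavor("m1.large")         # placeholder flavor
net    = conn.network.find_network("batch-net")       # placeholder network

# Boot a virtual worker node that will join the batch system on start-up.
server = conn.compute.create_server(
    name="vm-worker-001",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": net.id}],
)
server = conn.compute.wait_for_server(server)
print(server.name, "is", server.status)

# When demand drops, the instance can be reclaimed for desktop use:
# conn.compute.delete_server(server)
```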
An integrated solution for remote data access
NASA Astrophysics Data System (ADS)
Sapunenko, Vladimir; D'Urso, Domenico; dell'Agnello, Luca; Vagnoni, Vincenzo; Duranti, Matteo
2015-12-01
Data management constitutes one of the major challenges that a geographically- distributed e-Infrastructure has to face, especially when remote data access is involved. We discuss an integrated solution which enables transparent and efficient access to on-line and near-line data through high latency networks. The solution is based on the joint use of the General Parallel File System (GPFS) and of the Tivoli Storage Manager (TSM). Both products, developed by IBM, are well known and extensively used in the HEP computing community. Owing to a new feature introduced in GPFS 3.5, so-called Active File Management (AFM), the definition of a single, geographically-distributed namespace, characterised by automated data flow management between different locations, becomes possible. As a practical example, we present the implementation of AFM-based remote data access between two data centres located in Bologna and Rome, demonstrating the validity of the solution for the use case of the AMS experiment, an astro-particle experiment supported by the INFN CNAF data centre with the large disk space requirements (more than 1.5 PB).
Aircraft integrated design and analysis: A classroom experience
NASA Technical Reports Server (NTRS)
1988-01-01
AAE 451 is the capstone course required of all senior undergraduates in the School of Aeronautics and Astronautics at Purdue University. During the past year the first steps of a long evolutionary process were taken to change the content and expectations of this course. These changes are the result of the availability of advanced computational capabilities and sophisticated electronic media availability at Purdue. This presentation will describe both the long range objectives and this year's experience using the High Speed Commercial Transport (HSCT) design, the AIAA Long Duration Aircraft design and a Remotely Piloted Vehicle (RPV) design proposal as project objectives. The central goal of these efforts was to provide a user-friendly, computer-software-based, environment to supplement traditional design course methodology. The Purdue University Computer Center (PUCC), the Engineering Computer Network (ECN), and stand-alone PC's were used for this development. This year's accomplishments centered primarily on aerodynamics software obtained from the NASA Langley Research Center and its integration into the classroom. Word processor capability for oral and written work and computer graphics were also blended into the course. A total of 10 HSCT designs were generated, ranging from twin-fuselage and forward-swept wing aircraft, to the more traditional delta and double-delta wing aircraft. Four Long Duration Aircraft designs were submitted, together with one RPV design tailored for photographic surveillance. Supporting these activities were three video satellite lectures beamed from NASA/Langley to Purdue. These lectures covered diverse areas such as an overview of HSCT design, supersonic-aircraft stability and control, and optimization of aircraft performance. Plans for next year's effort will be reviewed, including dedicated computer workstation utilization, remote satellite lectures, and university/industrial cooperative efforts.
2016-07-15
AFRL-AFOSR-JP-TR-2016-0068: Multi-scale Computational Electromagnetics for Phenomenology and Saliency Characterization in Remote Sensing (Hean-Teik...). The report covers the extension of multi-scale computational electromagnetics to the application in microwave remote sensing, as well as extension of modelling capability with computational flexibility to study...
NASA Astrophysics Data System (ADS)
Lawhead, Pamela B.; Aten, Michelle L.
2003-04-01
The Center for GeoSpatial Workforce Development is embarking on a new era in education by developing a repository of dynamic online courseware authored by the foremost industry experts within the remote sensing and GIS industries. Virtual classrooms equipped with the most advanced instructional, computational, communication, course-evaluation, and management facilities amplify these courses to enhance the learning environment and provide rapid feedback between instructors and students. The launch of this program included the objective development of the Model Curriculum by an independent consortium of remote sensing industry leaders. The Center's research and development focus is on recruiting additional industry experts to develop the technical content of the courseware and then utilizing state-of-the-art technology to enhance their material with visually stimulating animations, compelling audio clips and entertaining, interactive exercises intended to reach the broadest audience possible by targeting various learning styles. The courseware will be delivered via various media: Internet, CD-ROM, DVD, and compressed video, which translates into anywhere, anytime delivery of GeoSpatial Information Technology education.
Remote video assessment for missile launch facilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wagner, G.G.; Stewart, W.A.
1995-07-01
The widely dispersed, unmanned launch facilities (LFs) for land-based ICBMs (intercontinental ballistic missiles) currently do not have visual assessment capability for existing intrusion alarms. The security response force currently must assess each alarm on-site. Remote assessment will enhance manpower, safety, and security efforts. Sandia National Laboratories was tasked by the USAF Electronic Systems Center to research, recommend, and demonstrate a cost-effective remote video assessment capability at missile LFs. The project's charter was to provide: system concepts; market survey analysis; technology search recommendations; and operational hardware demonstrations for remote video assessment from a missile LF to a remote security center via a cost-effective transmission medium and without using visible, on-site lighting. The technical challenges of this project were to: analyze various video transmission media, with emphasis on using the existing missile system copper line, which can be as long as 30 miles; emphasize an extremely low-cost system because of the many sites requiring installation; integrate the video assessment system with the current LF alarm system; and provide video assessment at the remote sites with non-visible lighting.
Managing the CMS Data and Monte Carlo Processing during LHC Run 2
NASA Astrophysics Data System (ADS)
Wissing, C.
2017-10-01
In order to cope with the challenges expected during LHC Run 2, CMS put in place a number of enhancements to its main software packages and to the tools used for centrally managed processing. In the presentation we will highlight these improvements, which allow CMS to deal with the increased trigger output rate, the increased pileup and the evolution in computing technology. The overall system aims at high operational flexibility and largely automated procedures. The tight coupling of workflow classes to types of sites has been drastically relaxed. Reliable and high-performing networking between most of the computing sites and the successful deployment of a data federation allow the execution of workflows using remote data access. That required the development of a largely automated system to assign workflows and to handle the necessary pre-staging of data. Another step towards flexibility has been the introduction of one large global HTCondor pool for all types of processing workflows and analysis jobs. Besides classical Grid resources, some opportunistic resources as well as Cloud resources have been integrated into that pool, which gives access to more than 200k CPU cores.
Final Scientific Report: A Scalable Development Environment for Peta-Scale Computing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karbach, Carsten; Frings, Wolfgang
2013-02-22
This document is the final scientific report of the project DE-SC000120 (A Scalable Development Environment for Peta-Scale Computing). The objective of this project is the extension of the Parallel Tools Platform (PTP) for applying it to peta-scale systems. PTP is an integrated development environment for parallel applications. It comprises code analysis, performance tuning, parallel debugging and system monitoring. The contribution of the Juelich Supercomputing Centre (JSC) aims to provide a scalable solution for system monitoring of supercomputers. This includes the development of a new communication protocol for exchanging status data between the target remote system and the client running PTP. The communication has to work over high-latency links. PTP needs to be implemented robustly and should hide the complexity of the supercomputer's architecture in order to provide transparent access to various remote systems via a uniform user interface. This simplifies the porting of applications to different systems, because PTP functions as an abstraction layer between the parallel application developer and the compute resources. The common requirement for all PTP components is that they have to interact with the remote supercomputer. For example, applications are built remotely, performance tools are attached to job submissions, and their output data resides on the remote system. Status data has to be collected by evaluating outputs of the remote job scheduler, and the parallel debugger needs to control an application executed on the supercomputer. The challenge is to provide this functionality for peta-scale systems in real time. The client-server architecture of the established monitoring application LLview, developed by the JSC, can be applied to PTP's system monitoring. LLview provides a well-arranged overview of the supercomputer's current status. A set of statistics, a list of running and queued jobs, as well as a node display mapping running jobs to their compute resources, form the user display of LLview. These monitoring features have to be integrated into the development environment. Besides showing the current status, PTP's monitoring also needs to allow for submitting and canceling user jobs. Monitoring peta-scale systems especially deals with presenting the large amount of status data in a useful manner. Users require the ability to select arbitrary levels of detail. The monitoring views have to provide a quick overview of the system state, but also need to allow for zooming into specific parts of the system in which the user is interested. At present, the major batch systems running on supercomputers are PBS, TORQUE, ALPS and LoadLeveler, which have to be supported by both the monitoring and the job controlling component. Finally, PTP needs to be designed as generically as possible, so that it can be extended for future batch systems.
Integrating Blended Teaching and Learning to Enhance Graduate Attributes
ERIC Educational Resources Information Center
Hermens, Antoine; Clarke, Elizabeth
2009-01-01
Purpose: The purpose of this paper is to explore the role of computer based business simulations in higher education as innovative tools of teaching and learning to enhance students' practical understanding of real business problems. Whether the integration of business simulation technologies will enable significant innovation in teaching and…
NASA Technical Reports Server (NTRS)
Brown, Robert L.; Doyle, Dee; Haines, Richard F.; Slocum, Michael
1989-01-01
As part of the Telescience Testbed Pilot Program, the Universities Space Research Association/Research Institute for Advanced Computer Science (USRA/RIACS) proposed to support remote communication by providing a network of human/machine interfaces, computer resources, and experimental equipment which allows: remote science, collaboration, technical exchange, and multimedia communication. The telescience workstation is intended to provide a local computing environment for telescience. The purposes of the program are as follows: (1) to provide a suitable environment to integrate existing and new software for a telescience workstation; (2) to provide a suitable environment to develop new software in support of telescience activities; (3) to provide an interoperable environment so that a wide variety of workstations may be used in the telescience program; (4) to provide a supportive infrastructure and a common software base; and (5) to advance, apply, and evaluate the telescience technology base. A prototype telescience computing environment, designed to bring practicing scientists in domains other than computer science into a modern style of doing their computing, was created and deployed. This environment, the Telescience Windowing Environment, Phase 1 (TeleWEn-1), met some, but not all, of the goals stated above. The TeleWEn-1 provided a window-based workstation environment and a set of tools for text editing, document preparation, electronic mail, multimedia mail, raster manipulation, and system management.
GI-13 – A brief review of the GEO Work Plan: description; global map examples of PM2.5 satellite measures; US maps showing examples of fused in-situ and satellite data; a new AQ monitoring approach with social value – the Village Green example; Computing and Systems Applied in Energ...
Multispectral image enhancement processing for microsat-borne imager
NASA Astrophysics Data System (ADS)
Sun, Jianying; Tan, Zheng; Lv, Qunbo; Pei, Linlin
2017-10-01
With the rapid development of remote sensing imaging technology, the micro satellite, a kind of tiny spacecraft, has appeared during the past few years. Many studies have been devoted to miniaturizing satellites for imaging purposes. Generally speaking, micro satellites weigh less than 100 kilograms, sometimes even less than 50 kilograms, making them comparable in size to a common miniature refrigerator. However, the optical system design can hardly be perfect due to the limits on satellite volume and weight. In most cases, the unprocessed data captured by the imager on the microsatellite cannot meet application needs. Spatial resolution is the key problem: for remote sensing applications, the higher the spatial resolution of the images, the wider the range of fields in which they can be applied. Consequently, how to utilize super resolution (SR) and image fusion to enhance the quality of the imagery deserves study. Our team, the Key Laboratory of Computational Optical Imaging Technology, Academy of Opto-Electronics, is devoted to designing high-performance microsat-borne imagers and high-efficiency image processing algorithms. This paper addresses a multispectral image enhancement framework for space-borne imagery, combining pan-sharpening and super resolution techniques to deal with the limited spatial resolution of microsatellites. We test the framework on remote sensing images acquired by the CX6-02 satellite and report the SR performance. The experiments illustrate that the proposed approach provides high-quality images.
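For context, pan-sharpening fuses a high-resolution panchromatic band with lower-resolution multispectral bands. The sketch below implements the classic Brovey transform with NumPy as a generic illustration; it is not the algorithm applied to the CX6-02 data, and the array shapes are placeholders.

```python
import numpy as np

def brovey_pansharpen(ms, pan, eps=1e-6):
    """Brovey-transform pan-sharpening.

    ms  : float array (H, W, B) - multispectral bands upsampled to the pan grid
    pan : float array (H, W)    - high-resolution panchromatic band
    Each band is rescaled so the per-pixel band sum matches the pan intensity.
    """
    intensity = ms.sum(axis=2) + eps
    return ms * (pan / intensity)[..., np.newaxis]

# Toy example with random data standing in for co-registered imagery.
rng = np.random.default_rng(0)
ms = rng.random((256, 256, 4))      # 4 upsampled multispectral bands
pan = rng.random((256, 256))        # panchromatic band at full resolution
sharpened = brovey_pansharpen(ms, pan)
print(sharpened.shape)              # (256, 256, 4)
```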
NASA Astrophysics Data System (ADS)
Fauzi, Ahmad
2017-11-01
Numerical computation has many pedagogical advantages: it develops analytical and problem-solving skills, helps students learn through visualization, and enhances physics education. Unfortunately, numerical computation is not taught to undergraduate physics education students in Indonesia. Incorporating numerical computation into the undergraduate physics education curriculum presents many challenges; the main ones are a dense curriculum that makes it difficult to add a new numerical computation course, and the fact that most students have no programming experience. In this research, we used a case study to review how to integrate numerical computation into the undergraduate physics education curriculum. The participants of this research were 54 fourth-semester students of the physics education department. We concluded that numerical computation can be integrated into the undergraduate physics education curriculum using the Excel spreadsheet combined with another course. The results of this research complement studies on how to integrate numerical computation into physics learning using Excel spreadsheets.
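To illustrate the kind of step-by-step numerical computation that transfers directly between a spreadsheet and code, here is a small Euler-method integration of projectile motion with air drag. The parameter values are arbitrary teaching examples and are not taken from the study.

```python
# Euler integration of projectile motion with air resistance - the kind of
# row-by-row computation students typically build in a spreadsheet.
import math

g, k, m = 9.81, 0.05, 0.2          # gravity, drag coefficient, mass (illustrative values)
dt = 0.01                          # time step, like one spreadsheet row
x, y = 0.0, 0.0
vx = 20.0 * math.cos(math.radians(45))
vy = 20.0 * math.sin(math.radians(45))

t = 0.0
while y >= 0.0:
    speed = math.hypot(vx, vy)
    ax = -(k / m) * speed * vx               # drag opposes motion
    ay = -g - (k / m) * speed * vy
    x, y = x + vx * dt, y + vy * dt          # next row = previous row + slope * dt
    vx, vy = vx + ax * dt, vy + ay * dt
    t += dt

print(f"range = {x:.1f} m after {t:.2f} s")
```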
A secure EHR system based on hybrid clouds.
Chen, Yu-Yi; Lu, Jun-Chao; Jan, Jinn-Ke
2012-10-01
Application services rendering remote medical services and electronic health records (EHRs) have become a hot topic and have stimulated increased interest in recent years. Information and communication technologies have been applied to medical services and healthcare for a number of years to resolve problems in medical management. Sharing EHR information can provide professional medical programs with consultancy, evaluation, and tracing services, and can certainly improve accessibility for the public receiving medical services or medical information at remote sites. With the widespread use of EHRs, building a secure EHR sharing environment has attracted a lot of attention in both the healthcare industry and the academic community. The cloud computing paradigm is one of the popular health-IT infrastructures for facilitating EHR sharing and integration. In this paper, we propose an EHR sharing and integration system in healthcare clouds and analyze the arising security and privacy issues in access to and management of EHRs.
ROBOTICS IN HAZARDOUS ENVIRONMENTS - REAL DEPLOYMENTS BY THE SAVANNAH RIVER NATIONAL LABORATORY
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kriikku, E.; Tibrea, S.; Nance, T.
The Research & Development Engineering (R&DE) section in the Savannah River National Laboratory (SRNL) engineers, integrates, tests, and supports deployment of custom robotics, systems, and tools for use in radioactive, hazardous, or inaccessible environments. Mechanical and electrical engineers, computer control professionals, specialists, machinists, welders, electricians, and mechanics adapt and integrate commercially available technology with in-house designs, to meet the needs of Savannah River Site (SRS), Department of Energy (DOE), and other governmental agency customers. This paper discusses five R&DE robotic and remote system projects.
Instructional image processing on a university mainframe: The Kansas system
NASA Technical Reports Server (NTRS)
Williams, T. H. L.; Siebert, J.; Gunn, C.
1981-01-01
An interactive digital image processing program package was developed that runs on the University of Kansas central computer, a Honeywell Level 66 multi-processor system. The modular form of the package allows easy and rapid upgrades and extensions of the system, and the package is used in remote sensing courses in the Department of Geography, in regional five-day short courses for academics and professionals, and also in remote sensing projects and research. The package comprises three self-contained modules of processing functions: subimage extraction and rectification; image enhancement, preprocessing and data reduction; and classification. Its use in a typical course setting is described. Availability and costs are considered.
Leveraging Computer-Mediated Communication Technologies to Enhance Interactions in Online Learning
ERIC Educational Resources Information Center
Wright, Linda J.
2011-01-01
Computer-mediated communication (CMC) technologies have been an integral part of distance education for many years. They are found in both synchronous and asynchronous platforms and are intended to enhance the learning experience for students. CMC technologies add an interactive element to the online learning environment. The findings from this…
Characterization of Vegetation using the UC Davis Remote Sensing Testbed
NASA Astrophysics Data System (ADS)
Falk, M.; Hart, Q. J.; Bowen, K. S.; Ustin, S. L.
2006-12-01
Remote sensing provides information about the dynamics of the terrestrial biosphere with continuous spatial and temporal coverage on many different scales. We present the design and construction of a suite of instrument modules and network infrastructure with size, weight and power constraints suitable for small-scale vehicles, anticipating vigorous growth in unmanned aerial vehicles (UAVs) and other mobile platforms. Our approach provides rapid deployment and low-cost acquisition of aerial imagery for applications requiring high spatial resolution and frequent revisits. The testbed supports a wide range of applications, encourages remote sensing solutions in new disciplines and demonstrates the complete range of engineering knowledge required for the successful deployment of remote sensing instruments. The initial testbed is deployed on a Sig Kadet Senior remote-controlled plane. It includes an onboard computer with wireless radio, GPS, an inertial measurement unit, a 3-axis electronic compass and digital cameras. The onboard camera is either an RGB digital camera or a modified digital camera with red and NIR channels. Cameras were calibrated using selective light sources, an integrating sphere and a spectrometer, allowing for the computation of vegetation indices such as the NDVI. Field tests to date have investigated technical challenges in wireless communication bandwidth limits, automated image geolocation, and user interfaces, as well as image applications such as environmental landscape mapping focusing on Sudden Oak Death and invasive species detection, studies on the impact of bird colonies on tree canopies, and precision agriculture.
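Since the red/NIR camera is used to compute vegetation indices, here is the standard NDVI calculation as a minimal NumPy sketch. The reflectance values below are placeholders; the actual calibration pipeline is the one described above.

```python
import numpy as np

def ndvi(red, nir, eps=1e-6):
    """Normalized Difference Vegetation Index from calibrated red and NIR bands."""
    red = red.astype(np.float64)
    nir = nir.astype(np.float64)
    return (nir - red) / (nir + red + eps)   # values near +1 indicate dense vegetation

# Toy reflectance arrays standing in for the two camera channels.
red = np.array([[0.10, 0.30], [0.05, 0.40]])
nir = np.array([[0.60, 0.35], [0.55, 0.42]])
print(ndvi(red, nir))
```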
The remote sensing image segmentation mean shift algorithm parallel processing based on MapReduce
NASA Astrophysics Data System (ADS)
Chen, Xi; Zhou, Liqing
2015-12-01
With the development of satellite remote sensing technology and the growth of remote sensing image data, traditional remote sensing image segmentation techniques cannot meet the processing and storage requirements of massive remote sensing imagery. This article applies cloud computing and parallel computing technology to the remote sensing image segmentation process and builds a cheap and efficient computer cluster that uses parallel processing to implement the mean shift algorithm for remote sensing image segmentation based on the MapReduce model. This not only ensures the quality of the segmentation, but also improves segmentation speed and better meets real-time requirements. The MapReduce-based parallel mean shift segmentation algorithm is therefore both significant and of practical value.
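For reference, the sketch below shows a plain, serial mean-shift iteration over feature vectors with NumPy; in a MapReduce formulation like the one described above, the per-point shift would be the map step and the aggregation of shifted points the reduce step. The data and bandwidth are toy values, and this is not the paper's cluster implementation.

```python
import numpy as np

def mean_shift(points, bandwidth=1.0, iters=20):
    """Serial mean shift: move every point to the weighted mean of its
    neighbourhood under a Gaussian kernel until the modes stabilise."""
    shifted = points.copy()
    for _ in range(iters):
        for i, p in enumerate(shifted):
            d2 = np.sum((points - p) ** 2, axis=1)
            w = np.exp(-d2 / (2 * bandwidth ** 2))        # Gaussian kernel weights
            shifted[i] = (w[:, None] * points).sum(axis=0) / w.sum()
    return shifted

# Toy "feature vectors" (e.g. pixel colour values) with two obvious clusters.
rng = np.random.default_rng(1)
pts = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
modes = mean_shift(pts, bandwidth=0.8)
print(np.unique(np.round(modes, 1), axis=0))              # approximate cluster modes
```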
NASA Astrophysics Data System (ADS)
Butz, Andre; Solvejg Dinger, Anna; Bobrowski, Nicole; Kostinek, Julian; Fieber, Lukas; Fischerkeller, Constanze; Giuffrida, Giovanni Bruno; Hase, Frank; Klappenbach, Friedrich; Kuhn, Jonas; Lübcke, Peter; Tirpitz, Lukas; Tu, Qiansi
2017-04-01
Remote sensing of CO2 enhancements in volcanic plumes can be a tool to estimate volcanic CO2 emissions and thereby, to gain insight into the geological carbon cycle and into volcano interior processes. However, remote sensing of the volcanic CO2 is challenged by the large atmospheric background concentrations masking the minute volcanic signal. Here, we report on a demonstrator study conducted in September 2015 at Mt. Etna on Sicily, where we deployed an EM27/SUN Fourier Transform Spectrometer together with a UV spectrometer on a mobile remote sensing platform. The spectrometers were operated in direct-sun viewing geometry collecting cross-sectional scans of solar absorption spectra through the volcanic plume by operating the platform in stop-and-go patterns in 5 to 10 kilometers distance from the crater region. We successfully detected correlated intra-plume enhancements of CO2 and volcanic SO2, HF, HCl, and BrO. The path-integrated volcanic CO2 enhancements amounted to about 0.5 ppm (on top of the ˜400 ppm background). Key to successful detection of volcanic CO2 was A) the simultaneous observation of the O2 total column which allowed for correcting changes in the CO2 column caused by changes in observer altitude and B) the simultaneous measurement of volcanic species co-emitted with CO2 which allowed for discriminating intra-plume and extra-plume observations. The latter were used for subtracting the atmospheric CO2 background. The field study suggests that our remote sensing observatory is a candidate technique for volcano monitoring in safe distance from the crater region.
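As a worked illustration of point (A), a column-averaged CO2 mole fraction can be formed by ratioing the retrieved CO2 column against the simultaneously retrieved O2 column (dry-air mole fraction 0.2095), which cancels changes caused by observer altitude and airmass. The column values below are invented round numbers chosen only to reproduce an enhancement of roughly 0.5 ppm; they are not taken from the Etna data set.

```python
# Column-averaged dry-air mole fraction of CO2 from retrieved total columns,
# using the simultaneously measured O2 column to remove airmass/altitude effects.
O2_DRY_AIR_FRACTION = 0.2095

def xco2(co2_column, o2_column):
    """co2_column, o2_column: total columns in molecules per cm^2."""
    return O2_DRY_AIR_FRACTION * co2_column / o2_column   # dimensionless mole fraction

# Illustrative columns: an extra-plume (background) and an intra-plume observation.
background = xco2(8.50e21, 4.46e24) * 1e6   # ppm
in_plume   = xco2(8.51e21, 4.46e24) * 1e6
print(f"background {background:.1f} ppm, plume enhancement {in_plume - background:.2f} ppm")
```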
An integrated system for land resources supervision based on the IoT and cloud computing
NASA Astrophysics Data System (ADS)
Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie
2017-01-01
Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.
NASA Astrophysics Data System (ADS)
Takahashi, N.; Agata, H.; Maeda, K.; Okyudo, M.; Yamazaki, Y.
A total solar eclipse was observed on 2001 June 21 in Angola, Zambia, and Zimbabwe in Africa. To promote science education using the solar eclipse as an educational project, a whole-disk image and an enlarged image of the Sun, showing the progress of the eclipse and conditions in the observation area, were broadcast to the world through the Internet (Live Eclipse). These images were distributed to four primary schools in Hiroshima and the Science and Technology Museum in Tokyo to give a remote lecture through computers. To assess the effectiveness of the lecture, the learning effect on the participating children was examined twice, before and after the remote lecture on the solar eclipse.
Integration of Wireless Technologies in Smart University Campus Environment: Framework Architecture
ERIC Educational Resources Information Center
Khamayseh, Yaser; Mardini, Wail; Aljawarneh, Shadi; Yassein, Muneer Bani
2015-01-01
In this paper, the authors are particularly interested in enhancing the education process by integrating new tools to the teaching environments. This enhancement is part of an emerging concept, called smart campus. Smart University Campus will come up with a new ubiquitous computing and communication field and change people's lives radically by…
Topics in Computer Literacy as Elements of Two Introductory College Mathematics Courses.
ERIC Educational Resources Information Center
Spresser, Diane M.
1986-01-01
Explains the integrated approach implemented by James Madison University, Virginia, in enhancing computer literacy. Reviews the changes in the mathematics courses and provides topical listings and outlines of the courses that emphasize computer applications. (ML)
Integrating electronic conferencing to enhance problem solving in nursing.
Witucki, J M; Hodson, K E; Malm, L D
1996-01-01
The authors describe how a computer-mediated conference was integrated into a baccalaureate nursing program clinical course. They discuss methods used in implementing the conference, including a technical review of the software and hardware, and methods of implementing and monitoring the conference with students. Examples of discussion items, student and faculty responses to posted items, and responses to use of the computer-mediated conference are included. Results and recommendations from this experience will be useful to other schools integrating computer-mediated conference technology into the nursing school curriculum.
Location-assured, multifactor authentication on smartphones via LTE communication
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham
2013-05-01
With the added security provided by LTE, geographical location has become an important factor for authentication to enhance the security of remote client authentication during mCommerce applications using Smartphones. Tight combination of geographical location with classic authentication factors like PINs/biometrics in a real-time, remote verification scheme over the LTE layer connection assures the authenticator about the client itself (via PIN/biometric) as well as the client's current location, and thus establishes the important aspects of "who", "when", and "where" of the authentication attempt without eavesdropping or man-in-the-middle attacks. To securely integrate location as an authentication factor into the remote authentication scheme, the client's location must be verified independently, i.e. the authenticator should not solely rely on the location determined on and reported by the client's Smartphone. The latest wireless data communication technology for mobile phones (4G LTE, Long-Term Evolution), recently being rolled out in various networks, can be employed to meet this requirement for independent location verification. LTE's Control Plane LBS provisions, when integrated with user-based authentication and an independent source of localisation factors, ensure secure, efficient, continuous location tracking of the Smartphone. This can be performed during normal operation of the LTE-based communication between client and network operator, resulting in the authenticator being able to verify the client's claimed location more securely and accurately. Trials and experiments show that such an implementation is viable for today's Smartphone-based banking via LTE communication.
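To make the "who/when/where" binding concrete, the sketch below MACs a PIN-derived secret over the device identity, a location fix and a timestamp; the server accepts only if the MAC checks, the token is fresh, and the claimed location agrees with an operator-provided (control-plane) fix. All names, thresholds and the token format are hypothetical and are not the scheme proposed in the paper.

```python
import hmac, hashlib, json, time

def make_auth_token(pin_secret: bytes, location: dict, device_id: str) -> dict:
    """Bind 'who' (PIN-derived secret), 'where' (location fix) and 'when'
    (timestamp) into one MAC the server can re-check."""
    payload = {
        "device": device_id,
        "lat": round(location["lat"], 4),
        "lon": round(location["lon"], 4),
        "ts": int(time.time()),
    }
    msg = json.dumps(payload, sort_keys=True).encode()
    payload["mac"] = hmac.new(pin_secret, msg, hashlib.sha256).hexdigest()
    return payload

def verify(pin_secret: bytes, token: dict, network_location: dict,
           max_age=60, max_drift_deg=0.01) -> bool:
    """Server side: check MAC, freshness, and that the claimed location agrees
    with the independently obtained network location."""
    mac = token.pop("mac")
    msg = json.dumps(token, sort_keys=True).encode()
    ok_mac = hmac.compare_digest(mac, hmac.new(pin_secret, msg, hashlib.sha256).hexdigest())
    fresh = time.time() - token["ts"] <= max_age
    close = (abs(token["lat"] - network_location["lat"]) <= max_drift_deg and
             abs(token["lon"] - network_location["lon"]) <= max_drift_deg)
    return ok_mac and fresh and close
```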
Scalable Algorithms for Clustering Large Geospatiotemporal Data Sets on Manycore Architectures
NASA Astrophysics Data System (ADS)
Mills, R. T.; Hoffman, F. M.; Kumar, J.; Sreepathi, S.; Sripathi, V.
2016-12-01
The increasing availability of high-resolution geospatiotemporal data sets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery using data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe a massively parallel implementation of accelerated k-means clustering and some optimizations to boost computational intensity and utilization of wide SIMD lanes on state-of-the-art multi- and manycore processors, including the second-generation Intel Xeon Phi ("Knights Landing") processor based on the Intel Many Integrated Core (MIC) architecture, which includes several new features, including an on-package high-bandwidth memory. We also analyze the code in the context of a few practical applications to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
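For orientation, the sketch below is a plain, serial Lloyd's k-means with vectorized distance computation in NumPy; it illustrates only the clustering step and is not the accelerated, distributed, SIMD-tuned implementation described above. The toy data stand in for per-gridcell feature vectors.

```python
import numpy as np

def kmeans(x, k, iters=50, seed=0):
    """Lloyd's k-means: assign points to nearest centers, then recompute centers."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), k, replace=False)]
    for _ in range(iters):
        # squared Euclidean distances, shape (n_points, k)
        d2 = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        new = np.array([x[labels == j].mean(axis=0) if np.any(labels == j)
                        else centers[j] for j in range(k)])
        if np.allclose(new, centers):
            break
        centers = new
    return centers, labels

# Toy "geospatiotemporal" feature vectors (e.g. per-pixel phenology metrics).
rng = np.random.default_rng(42)
data = np.vstack([rng.normal(m, 0.5, (200, 3)) for m in (0.0, 3.0, 6.0)])
centers, labels = kmeans(data, k=3)
print(np.round(centers, 2))
```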
Technology Integration Barriers: Urban School Mathematics Teachers Perspectives
NASA Astrophysics Data System (ADS)
Wachira, Patrick; Keengwe, Jared
2011-02-01
Despite the promise of technology in education, many practicing teachers face several challenges when trying to effectively integrate technology into their classroom instruction. Additionally, while national statistics cite a remarkable improvement in access to computer technology tools in schools, teacher surveys show consistent declines in the use and integration of computer technology to enhance student learning. This article reports on primary technology integration barriers that mathematics teachers identified when using technology in their classrooms. Suggestions to overcome some of these barriers are also provided.
Mission leverage education: NSU/NASA innovative undergraduate model
NASA Technical Reports Server (NTRS)
Chaudhury, S. Raj; Shaw, Paula R. D.
2005-01-01
The BEST Lab (Center for Excellence in Science Education), the Center for Materials Research (CMR), and the Chemistry, Mathematics, Physics, and Computer Science (CS) Departments at Norfolk State University (NSU) joined forces to implement MiLEN(2) IUM - an innovative approach to integrate current and emerging research into the undergraduate curricula and train students in NASA-related fields. An Earth Observing System (EOS) mission was simulated where students are educated and trained in many aspects of Remote Sensing: detector physics and spectroscopy; signal processing; data conditioning, analysis, visualization; and atmospheric science. This model and its continued impact are expected to significantly enhance the quality of the Mathematics, Science, Engineering and Technology (MSET or SMET) educational experience and to inspire students from historically underrepresented groups to pursue careers in NASA-related fields. MiLEN(2) IUM will be applicable to other higher education institutions that are willing to make the commitment to this endeavor in terms of faculty interest and space.
Jiang, Jiefeng; Beck, Jeffrey; Heller, Katherine; Egner, Tobias
2015-01-01
The anterior cingulate and lateral prefrontal cortices have been implicated in implementing context-appropriate attentional control, but the learning mechanisms underlying our ability to flexibly adapt the control settings to changing environments remain poorly understood. Here we show that human adjustments to varying control demands are captured by a reinforcement learner with a flexible, volatility-driven learning rate. Using model-based functional magnetic resonance imaging, we demonstrate that volatility of control demand is estimated by the anterior insula, which in turn optimizes the prediction of forthcoming demand in the caudate nucleus. The caudate's prediction of control demand subsequently guides the implementation of proactive and reactive attentional control in dorsal anterior cingulate and dorsolateral prefrontal cortices. These data enhance our understanding of the neuro-computational mechanisms of adaptive behaviour by connecting the classic cingulate-prefrontal cognitive control network to a subcortical control-learning mechanism that infers future demands by flexibly integrating remote and recent past experiences. PMID:26391305
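A toy sketch of the modelling idea (a delta-rule learner whose learning rate grows with an estimate of volatility) is given below; the parameter names and specific update forms are illustrative assumptions, not the fitted model reported in the paper.

    def estimate_volatility(volatility, error, decay=0.9):
        # track volatility as a running average of unsigned prediction errors
        return decay * volatility + (1.0 - decay) * abs(error)

    def update_demand_prediction(prediction, outcome, volatility, k=0.5):
        # higher estimated volatility -> larger learning rate -> faster adaptation
        error = outcome - prediction
        lr = volatility / (volatility + k)
        return prediction + lr * error, error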
NASA Technical Reports Server (NTRS)
Demeo, Martha E.
1990-01-01
The feasibility of an experiment which will provide an on-orbit validation of Controls-Structures Interaction (CSI) technology, was investigated. The experiment will demonstrate the on-orbit characterization and flexible-body control of large flexible structure dynamics using the shuttle Remote Manipulator System (RMS) with an attached payload as a test article. By utilizing existing hardware as well as establishing integration, operation and safety algorithms, techniques and procedures, the experiment will minimize the costs and risks of implementing a flight experiment. The experiment will also offer spin-off enhancement to both the Shuttle RMS (SRMS) and the Space Station RMS (SSRMS).
NASA Technical Reports Server (NTRS)
Quattrochi, Dale A.; Estes, Sue
2011-01-01
The NASA Applied Sciences Program's public health initiative began in 2004 to illustrate the potential benefits for using remote sensing in public health applications. Objectives/Purpose: The CDC initiated a study with NASA through the National Center for Environmental Health (NCEH) to establish a pilot effort to use remote sensing data as part of its Environmental Public Health Tracking Network (EPHTN). As a consequence, the NCEH and NASA developed a project called HELIX-Atlanta (Health and Environment Linkage for Information Exchange) to demonstrate a process for developing a local environmental public health tracking and surveillance network that integrates non-infectious health and environment systems for the Atlanta metropolitan area. Methods: As an ongoing, systematic integration, analysis and interpretation of data, an EPHTN focuses on: 1 -- environmental hazards; 2 -- human exposure to environmental hazards; and 3 -- health effects potentially related to exposure to environmental hazards. To satisfy the definition of a surveillance system, the data must be disseminated to plan, implement, and evaluate environmental public health action. Results: A close working relationship developed with NCEH where information was exchanged to assist in the development of an EPHTN that incorporated NASA remote sensing data into a surveillance network for disseminating public health tracking information to users. This project's success provided NASA with the opportunity to work with other public health entities such as the University of Mississippi Medical Center, the University of New Mexico and the University of Arizona. Conclusions: HELIX-Atlanta became a functioning part of the national EPHTN for tracking environmental hazards and exposure, particularly as related to air quality over Atlanta. Learning Objectives: 1 -- remote sensing data can be integral to an EPHTN; 2 -- public tracking objectives can be enhanced through remote sensing data; 3 -- NASA's involvement in public health applications can have wider benefits in the future.
Integrated Simulation Design Challenges to Support TPS Repair Operations
NASA Technical Reports Server (NTRS)
Quiocho, Leslie J.; Crues, Edwin Z.; Huynh, An; Nguyen, Hung T.; MacLean, John
2005-01-01
During the Orbiter Repair Maneuver (ORM) operations planned for Return to Flight (RTF), the Shuttle Remote Manipulator System (SRMS) must grapple the International Space Station (ISS), undock the Orbiter, maneuver it through a long duration trajectory, and orient it to an EVA crewman poised at the end of the Space Station Remote Manipulator System (SSRMS) to facilitate the repair of the Thermal Protection System (TPS). Once the repair has been completed and confirmed, the SRMS proceeds back through the trajectory to dock the Orbiter to the Orbiter Docking System. In order to support analysis of the complex dynamic interactions of the integrated system formed by the Orbiter, ISS, SRMS, and SSRMS during the ORM, simulation tools used for previous 'nominal' mission support required substantial enhancements. These upgrades were necessary to provide analysts with the capabilities needed to study integrated system performance. This paper discusses the simulation design challenges encountered while developing simulation capabilities to mirror the ORM operations. The paper also describes the incremental build approach that was utilized, starting with the subsystem simulation elements and integrating them into increasingly complex simulations until the resulting ORM worksite dynamics simulation had been assembled. Furthermore, the paper presents an overall integrated simulation V&V methodology based upon subsystem-level testing, integrated comparisons, and phased checkout.
Provenance based data integrity checking and verification in cloud environments
Imran, Muhammad; Hlavacs, Helmut; Haq, Inam Ul; Jan, Bilal; Khan, Fakhri Alam; Ahmad, Awais
2017-01-01
Cloud computing is a recent trend in IT that moves computing and data away from desktop and hand-held devices into large-scale processing hubs and data centers, respectively. It has been proposed as an effective solution for data outsourcing and on-demand computing to control the rising cost of IT setups and management in enterprises. However, with Cloud platforms users' data is moved into remotely located storage, so that users lose control over their data. This characteristic of the Cloud raises many security and privacy challenges which need to be clearly understood and resolved. One of the important concerns that needs to be addressed is providing proof of data integrity, i.e., correctness of the user's data stored in the Cloud storage. The data in Clouds is physically not accessible to the users. Therefore, a mechanism is required by which users can check whether the integrity of their valuable data is maintained or compromised. For this purpose, methods such as mirroring, checksumming and the use of third-party auditors have been proposed, among others. However, these methods either use extra storage space by maintaining multiple copies of the data or require the presence of a third-party verifier. In this paper, we address the problem of proving data integrity in Cloud computing by proposing a scheme through which users are able to check the integrity of their data stored in Clouds. In addition, users can track any violation of data integrity if it occurs. For this purpose, we utilize a relatively new concept in Cloud computing called "Data Provenance". Our scheme reduces the need for third-party services, additional hardware support and client-side replication of data items for integrity checking. PMID:28545151
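The general flavour of provenance-based integrity checking can be sketched as a chained log of data-state digests, so that later tampering with either the data or the log becomes detectable; the record fields and helper names below are illustrative assumptions, not the scheme defined in the paper.

    import hashlib, json, time

    def digest(data: bytes) -> str:
        return hashlib.sha256(data).hexdigest()

    def append_provenance(log, actor, operation, data: bytes):
        # each record stores the data-state digest and chains to the previous record
        prev = log[-1]['entry_hash'] if log else ''
        entry = {'actor': actor, 'op': operation, 'time': time.time(),
                 'data_hash': digest(data), 'prev': prev}
        entry['entry_hash'] = digest(json.dumps(entry, sort_keys=True).encode())
        log.append(entry)

    def verify(log, data: bytes) -> bool:
        # intact only if the chain is unbroken and the last recorded digest matches the data
        chain_ok = all(log[i]['prev'] == log[i - 1]['entry_hash'] for i in range(1, len(log)))
        return bool(log) and chain_ok and log[-1]['data_hash'] == digest(data)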
Rapid prototyping of soil moisture estimates using the NASA Land Information System
NASA Astrophysics Data System (ADS)
Anantharaj, V.; Mostovoy, G.; Li, B.; Peters-Lidard, C.; Houser, P.; Moorhead, R.; Kumar, S.
2007-12-01
The Land Information System (LIS), developed at the NASA Goddard Space Flight Center, is a functional Land Data Assimilation System (LDAS) that incorporates a suite of land models in an interoperable computational framework. LIS has been integrated into a computational Rapid Prototyping Capabilities (RPC) infrastructure. LIS consists of a core, a number of community land models, data servers, and visualization systems, integrated in a high-performance computing environment. The land surface models (LSM) in LIS incorporate surface and atmospheric parameters of temperature, snow/water, vegetation, albedo, soil conditions, topography, and radiation. Many of these parameters are available from in-situ observations, numerical model analysis, and from NASA, NOAA, and other remote sensing satellite platforms at various spatial and temporal resolutions. The computational resources available to LIS via the RPC infrastructure support e-Science experiments involving global modeling of land-atmosphere studies at 1-km spatial resolution as well as regional studies at finer resolutions. The Noah Land Surface Model, available within LIS, is being used to rapidly prototype soil moisture estimates in order to evaluate the viability of other science applications for decision-making purposes. For example, LIS has been used to further extend the utility of the USDA Soil Climate Analysis Network of in-situ soil moisture observations. In addition, LIS also supports data assimilation capabilities that are used to assimilate remotely sensed soil moisture retrievals from the AMSR-E instrument onboard the Aqua satellite. The rapid prototyping of soil moisture estimates using LIS and their applications will be illustrated during the presentation.
A GPS-based Real-time Road Traffic Monitoring System
NASA Astrophysics Data System (ADS)
Tanti, Kamal Kumar
In recent years, monitoring systems have moved steadily towards automatic, reliably interconnected, distributed and autonomous operation. Specifically, the measurement, logging, data processing and interpretation activities may be carried out by separate units at different locations in near real-time. The recent evolution of mobile communication devices and communication technologies has fostered a growing interest in GIS- and GPS-based location-aware systems and services. This paper describes a real-time road traffic monitoring system based on integrated mobile field devices (GPS/GSM/IOs) working in tandem with advanced GIS-based application software, providing on-the-fly authentication for real-time monitoring and security enhancement. The system is developed as a fully automated, continuous, real-time monitoring system that employs GPS sensors; Ethernet and/or serial-port communication techniques are used to transfer data between GPS receivers at target points and a central processing computer. The data can be processed locally or remotely, depending on client requirements. Due to the modular architecture of the system, other sensor types may be supported with minimal effort. Data on the distributed network and measurements are transmitted via cellular SIM cards to a Control Unit, which provides post-processing and network management. The Control Unit may be remotely accessed via an Internet connection. The new system will not only provide more consistent data about road traffic conditions but will also provide methods for integrating with other Intelligent Transportation Systems (ITS). GSM technology is used for communication between the mobile device and the central monitoring service. The resulting system is characterized by autonomy, reliability and a high degree of automation.
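A simplified sketch of the field-device side of such a system is shown below: NMEA sentences are read from a GPS receiver over a serial port and valid fixes are forwarded to a central monitoring server. The port name, baud rate, server URL and message format are assumptions for illustration, not the configuration used in the described system.

    import serial          # pyserial
    import requests

    def parse_gprmc(sentence):
        # $GPRMC,hhmmss,A,llll.ll,N,yyyyy.yy,E,speed,course,date,...
        f = sentence.split(',')
        if not sentence.startswith('$GPRMC') or len(f) < 8 or f[2] != 'A':
            return None
        lat = float(f[3][:2]) + float(f[3][2:]) / 60.0
        lon = float(f[5][:3]) + float(f[5][3:]) / 60.0
        if f[4] == 'S':
            lat = -lat
        if f[6] == 'W':
            lon = -lon
        return {'lat': lat, 'lon': lon, 'speed_knots': float(f[7])}

    with serial.Serial('/dev/ttyUSB0', 4800, timeout=1) as gps:   # assumed port and baud rate
        while True:                                               # runs continuously on the field device
            fix = parse_gprmc(gps.readline().decode('ascii', 'ignore').strip())
            if fix:
                requests.post('http://monitor.example/api/fix', json=fix, timeout=5)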
SALVEREMO, an automatic system for the search and rescue in the wilderness and mountain areas
NASA Astrophysics Data System (ADS)
Penna, Roberto; Allasia, Walter; Bianchi, Luca; Licata, Enrico; Duranti, Pierluigi; Molino, Andrea; Bagalini, Enea; Sagliocco, Sergio; Scarafia, Simone; Prinetto, Paolo; Airofarulla, Giuseppe; Carelli, Alberto
2016-04-01
SALVEREMO project aims at designing and prototyping an innovative system for searching and rescuing individuals (especially hikers and mountaineers) who are lost or in peril in wilderness or mountain areas. It makes use of a Remotely Piloted Aircraft System (RPAS) equipped with a sensor suite specifically selected according to requirements identified together with alpine rescuers and government officials. The peculiarity of the proposed solution is the exploitation and integration of special skills and expertise from different fields of competence. It will dramatically decrease the search time in wilderness and remote areas off the beaten track, providing rescuers and operators with a decision support system that increases successful outcomes and reduces rescue mission costs. The system benefits from the adoption of a scaled-down Base Transceiver Station (BTS) carried in the payload sensor suite of a small RPAS that can fit in a rescuer's backpack. A Software Defined Radio (SDR) board implementing the BTS protocol stack has been integrated in a complex sensor suite made up of open processing boards and camera devices. Moreover, computer vision (CV) algorithms for real-time pattern detection and image enhancement have been investigated for assisting the rescuers during search operations. An easy-to-use ground station application has been developed for speeding up overall mission accomplishment. Acknowledgement: SALVEREMO is a research project co-funded by Regione Piemonte according to the call for proposals POR F.E.S.R. 2007/2013, "Linea di attività I.1.3-Innovazione e PMI - Polo della Meccatronica e dei Sistemi Avanzati di Produzione". The authors want to thank "Il Soccorso Alpino Italiano" for the invaluable support in establishing the operative requirements.
de Dumast, Priscille; Mirabel, Clément; Cevidanes, Lucia; Ruellas, Antonio; Yatabe, Marilia; Ioshida, Marcos; Ribera, Nina Tubau; Michoud, Loic; Gomes, Liliane; Huang, Chao; Zhu, Hongtu; Muniz, Luciana; Shoukri, Brandon; Paniagua, Beatriz; Styner, Martin; Pieper, Steve; Budin, Francois; Vimort, Jean-Baptiste; Pascal, Laura; Prieto, Juan Carlos
2018-07-01
The purpose of this study is to describe the methodological innovations of a web-based system for storage, integration and computation of biomedical data, using a training imaging dataset to remotely compute a deep neural network classifier of temporomandibular joint osteoarthritis (TMJ OA). The imaging dataset for this study consisted of three-dimensional (3D) surface meshes of mandibular condyles constructed from cone beam computed tomography (CBCT) scans. The training dataset consisted of 259 condyles, 105 from control subjects and 154 from patients with a diagnosis of TMJ OA. For the image analysis classification, 34 right and left condyles from 17 patients (39.9 ± 11.7 years), who experienced signs and symptoms of the disease for less than 5 years, were included as the testing dataset. For the integrative statistical model of clinical, biological and imaging markers, the sample consisted of the same 17 test OA subjects and 17 age- and sex-matched control subjects (39.4 ± 15.4 years), who did not show any sign or symptom of OA. For these 34 subjects, a standardized clinical questionnaire and blood and saliva samples were also collected. The technological methodologies in this study include a deep neural network classifier of 3D condylar morphology (ShapeVariationAnalyzer, SVA), and a flexible web-based system for data storage, computation and integration (DSCI) of high-dimensional imaging, clinical, and biological data. The DSCI system trained and tested the neural network, indicating 5 stages of structural degenerative changes in condylar morphology in the TMJ, with 91% close agreement between the clinician consensus and the SVA classifier. The DSCI also remotely ran a novel statistical analysis, the Multivariate Functional Shape Data Analysis, which computed high-dimensional correlations between 3D shape coordinates, clinical pain levels and levels of biological markers, and then graphically displayed the computation results. The findings of this study demonstrate a comprehensive phenotypic characterization of TMJ health and disease at clinical, imaging and biological levels, using novel flexible and versatile open-source tools for a web-based system that provides advanced shape statistical analysis and a neural network based classification of temporomandibular joint osteoarthritis. Published by Elsevier Ltd.
Accounting for ecosystem assets using remote sensing in the Colombian Orinoco River basin lowlands
NASA Astrophysics Data System (ADS)
Vargas, Leonardo; Hein, Lars; Remme, Roy P.
2016-10-01
In many parts of the world, ecosystem change compromises the supply of ecosystem services (ES). Better ecosystem management requires detailed and structured information. Ecosystem accounting has been developed as an information system for ecosystems, using concepts and valuation approaches that are aligned with the System of National Accounts (SNA). The SNA is used to store and analyse economic data, and the alignment of ecosystem accounts with the SNA facilitates the integrated analysis of economic and ecological aspects of ecosystem use. Ecosystem accounting requires detailed spatial information at aggregated scales. The objective of this paper is to explore how remote sensing images can be used to analyse ecosystems using an accounting approach in the Orinoco river basin. We assessed ecosystem assets in terms of extent, condition and capacity to supply ES. We focus on four specific ES: grasslands grazed by cattle, timber and oil palm harvest, and carbon sequestration. We link ES with six ecosystem assets: savannahs, woody grasslands, mixed agro-ecosystems, very dense forests, dense forest and oil palm plantations. We used remote sensing vegetation, surface temperature and productivity indices to measure ecosystem assets. We found that remote sensing is a powerful tool to estimate ecosystem extent. The enhanced vegetation index can be used to assess ecosystem condition, and net primary productivity can be used for the assessment of ecosystem assets' capacity to supply ES. Integrating remote sensing and ecological information facilitates efficient monitoring of ecosystem assets, in particular in data-poor contexts.
Using an architectural approach to integrate heterogeneous, distributed software components
NASA Technical Reports Server (NTRS)
Callahan, John R.; Purtilo, James M.
1995-01-01
Many computer programs cannot be easily integrated because their components are distributed and heterogeneous, i.e., they are implemented in diverse programming languages, use different data representation formats, or their runtime environments are incompatible. In many cases, programs are integrated by modifying their components or interposing mechanisms that handle communication and conversion tasks. For example, remote procedure call (RPC) helps integrate heterogeneous, distributed programs. When configuring such programs, however, mechanisms like RPC must be used explicitly by software developers in order to integrate collections of diverse components. Each collection may require a unique integration solution. This paper describes improvements to the concepts of software packaging and some of our experiences in constructing complex software systems from a wide variety of components in different execution environments. Software packaging is a process that automatically determines how to integrate a diverse collection of computer programs based on the types of components involved and the capabilities of available translators and adapters in an environment. Software packaging provides a context that relates such mechanisms to software integration processes and reduces the cost of configuring applications whose components are distributed or implemented in different programming languages. Our software packaging tool subsumes traditional integration tools like UNIX make by providing a rule-based approach to software integration that is independent of execution environments.
Climbing the Slope of Enlightenment during NASA's Arctic Boreal Vulnerability Experiment
NASA Astrophysics Data System (ADS)
Griffith, P. C.; Hoy, E.; Duffy, D.; McInerney, M.
2015-12-01
The Arctic Boreal Vulnerability Experiment (ABoVE) is a new field campaign sponsored by NASA's Terrestrial Ecology Program and designed to improve understanding of the vulnerability and resilience of Arctic and boreal social-ecological systems to environmental change (http://above.nasa.gov). ABoVE is integrating field-based studies, modeling, and data from airborne and satellite remote sensing. The NASA Center for Climate Simulation (NCCS) has partnered with the NASA Carbon Cycle and Ecosystems Office (CCEO) to create a high performance science cloud for this field campaign. The ABoVE Science Cloud combines high performance computing with emerging technologies and data management with tools for analyzing and processing geographic information to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage for "big data" with integrated data management, and integration of core variables from in-situ networks. The ABoVE Science Cloud is a collaboration that is accelerating the pace of new Arctic science for researchers participating in the field campaign. Specific examples of the utilization of the ABoVE Science Cloud by several funded projects will be presented.
Using Learning Analytics for Preserving Academic Integrity
ERIC Educational Resources Information Center
Amigud, Alexander; Arnedo-Moreno, Joan; Daradoumis, Thanasis; Guerrero-Roldan, Ana-Elena
2017-01-01
This paper presents the results of integrating learning analytics into the assessment process to enhance academic integrity in the e-learning environment. The goal of this research is to evaluate the computational-based approach to academic integrity. The machine-learning based framework learns students' patterns of language use from data,…
A Computer Learning Center for Environmental Sciences
NASA Technical Reports Server (NTRS)
Mustard, John F.
2000-01-01
In the fall of 1998, MacMillan Hall opened at Brown University to students. In MacMillan Hall was the new Computer Learning Center, since named the EarthLab, which was outfitted with high-end workstations and peripherals primarily focused on the use of remotely sensed and other spatial data in the environmental sciences. The NASA grant we received as part of the "Centers of Excellence in Applications of Remote Sensing to Regional and Global Integrated Environmental Assessments" was the primary source of funds to outfit this learning and research center. Since opening, we have expanded the range of learning and research opportunities and integrated a cross-campus network of disciplines who have come together to learn and use spatial data of all kinds. The EarthLab also forms a core of undergraduate, graduate, and faculty research on environmental problems that draw upon the unique perspective of remotely sensed data. Over the last two years, the EarthLab has been a center for research on the environmental impact of water resource use in arid regions, the impact of the green revolution on forest cover in India, the design of forest preserves in Vietnam, and detailed assessments of the utility of thermal and hyperspectral data for water quality analysis. It has also been used extensively for local environmental activities, in particular studies on the impact of lead on the health of urban children in Rhode Island. Finally, the EarthLab has also served as a key educational and analysis center for activities related to the Brown University Affiliated Research Center that is devoted to transferring university research to the private sector.
Airborne Remote Sensing (ARS) for Agricultural Research and Commercialization Applications
NASA Technical Reports Server (NTRS)
Narayanan, Ram; Bowen, Brent D.; Nickerson, Jocelyn S.
2002-01-01
Tremendous advances in remote sensing technology and computing power over the last few decades are now providing scientists with the opportunity to investigate, measure, and model environmental patterns and processes with increasing confidence. Such advances are being pursued by the Nebraska Remote Sensing Facility, which consists of approximately 30 faculty members and is very competitive with other institutions in the depth of the work that is accomplished. The development of this facility, targeted at applications, commercialization, and education programs in the area of precision agriculture, provides a unique opportunity. This critical area is within the scope of the goals and objectives of NASA's Applications, Technology Transfer, Commercialization, and Education Division and the Earth Science Enterprise. This innovative integration of Aerospace (Aeronautics) Technology Enterprise applications with other NASA enterprises serves as a model of cross-enterprise transfer of science with specific commercial applications.
Keane, Robert E.; Burgan, Robert E.; Van Wagtendonk, Jan W.
2001-01-01
Fuel maps are essential for computing spatial fire hazard and risk and simulating fire growth and intensity across a landscape. However, fuel mapping is an extremely difficult and complex process requiring expertise in remotely sensed image classification, fire behavior, fuels modeling, ecology, and geographical information systems (GIS). This paper first presents the challenges of mapping fuels: canopy concealment, fuelbed complexity, fuel type diversity, fuel variability, and fuel model generalization. Then, four approaches to mapping fuels are discussed with examples provided from the literature: (1) field reconnaissance; (2) direct mapping methods; (3) indirect mapping methods; and (4) gradient modeling. A fuel mapping method is proposed that uses current remote sensing and image processing technology. Future fuel mapping needs are also discussed which include better field data and fuel models, accurate GIS reference layers, improved satellite imagery, and comprehensive ecosystem models.
NASA Technical Reports Server (NTRS)
Herrick, W. D.; Penegor, G. T.; Cotton, D. M.; Kaplan, G. C.; Chakrabarti, S.
1990-01-01
In September 1988 the Earth and Planetary Atmospheres Group of the Space Sciences Laboratory of the University of California at Berkeley flew an experiment on a high-altitude sounding rocket launched from the NASA Wallops Flight Facility in Virginia. The experiment, BEARS (Berkeley EUV Airglow Rocket Spectrometer), was designed to obtain spectroscopic data on the composition and structure of the earth's upper atmosphere. Consideration is given to the objectives of the BEARS experiment; the computer interface and software; the use of remote data transmission; and calibration, integration, and flight operations.
Resolution enhancement in integral microscopy by physical interpolation.
Llavador, Anabel; Sánchez-Ortiga, Emilio; Barreiro, Juan Carlos; Saavedra, Genaro; Martínez-Corral, Manuel
2015-08-01
Integral-imaging technology has demonstrated its capability for computing depth images from the microimages recorded after a single shot. This capability has been shown in macroscopic imaging and also in microscopy. Although the possibility of refocusing different planes from one snapshot is crucial for the study of some biological processes, the main drawback of integral imaging is the substantial reduction of spatial resolution. In this contribution we report a technique that permits increasing the two-dimensional spatial resolution of the computed depth images in integral microscopy by a factor of √2. This is achieved by a double-shot approach, carried out by means of a rotating glass plate, which shifts the microimages in the sensor plane. We experimentally validate the resolution enhancement and show the benefit of applying the technique to biological specimens. PMID:26309749
NASA Astrophysics Data System (ADS)
Washington-Allen, R. A.; Fatoyinbo, T. E.; Ribeiro, N. S.; Shugart, H. H.; Therrell, M. D.; Vaz, K. T.; von Schill, L.
2006-12-01
A workshop titled: Environmental Remote Sensing for Natural Resources Management was held from June 12-23, 2006 at Eduardo Mondlane University in Maputo, Mozambique. The workshop was initiated through an invitation and pre-course evaluation form sent to interested NGOs, universities, and government organizations. The purpose of the workshop was to provide training to interested professionals, graduate students, faculty and researchers at Mozambican institutions on the research and practical uses of remote sensing for natural resource management. The course had 24 participants who were predominantly professionals in remote sensing and GIS from various NGOs, governmental and academic institutions in Mozambique. The course taught remote sensing from an ecological perspective; specifically, it focused on the application of new remote sensing technology [the Shuttle Radar Topography Mission (SRTM) C-band radar data] to carbon accounting research in Miombo woodlands and Mangrove forests. The 2-week course was free to participants and consisted of lectures, laboratories, and a field trip to the mangrove forests of Inhaca Island, Maputo. The field trip consisted of training in the use of forest inventory techniques in support of remote sensing studies. Specifically, the field workshop centered on use of Global Positioning Systems (GPS) and collection of forest inventory data on tree height, structure [leaf area index (LAI)], and productivity. Productivity studies were enhanced with the teaching of introductory dendrochronology, including sample collection of tree rings from four different mangrove species. Students were provided with all course materials, including a DVD that contained satellite data (e.g., Landsat and SRTM imagery), ancillary data, lectures, exercises, and remote sensing publications used in the course, including a CD from the Environmental Protection Agency's Environmental Photographic Interpretation Center's (EPA-EPIC) program to teach remote sensing and data CDs from NASA's SAFARI 2000 field campaign. Nineteen participants evaluated the effectiveness of the course in regards to the course lectures, instructors, and the field trip. Future workshops should focus more on the individual projects that participants are engaged with in their jobs, replace the laboratory computers with workstations geared towards computation-intensive image processing software, and purchase field remote sensing instrumentation for practical exercises.
Image analysis by integration of disparate information
NASA Technical Reports Server (NTRS)
Lemoigne, Jacqueline
1993-01-01
Image analysis often starts with some preliminary segmentation which provides a representation of the scene needed for further interpretation. Segmentation can be performed in several ways, which are categorized as pixel-based, edge-based, and region-based. Each of these approaches is affected differently by various factors, and the final result may be improved by integrating several or all of these methods, thus taking advantage of their complementary nature. In this paper, we propose an approach that integrates pixel-based and edge-based results by utilizing an iterative relaxation technique. This approach has been implemented on a massively parallel computer and tested on some remotely sensed imagery from the Landsat-Thematic Mapper (TM) sensor.
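One plausible reading of the integration step is a relaxation update in which per-pixel class probabilities are repeatedly reinforced by their neighbours, with the reinforcement damped across strong edges; the sketch below illustrates that general idea and is not the exact update rule used in the paper.

    import numpy as np

    def relax(prob, edges, iters=10, beta=1.0):
        """prob: (H, W, K) per-class probabilities from a pixel-based classifier;
        edges: (H, W) edge strength in [0, 1] from an edge-based detector."""
        for _ in range(iters):
            support = np.zeros_like(prob)
            weight = np.zeros(prob.shape[:2])
            for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
                neighbour = np.roll(prob, (dy, dx), axis=(0, 1))
                w = 1.0 - np.roll(edges, (dy, dx), axis=(0, 1))   # little support across strong edges
                support += w[..., None] * neighbour
                weight += w
            prob = prob * (1.0 + beta * support / np.maximum(weight, 1e-6)[..., None])
            prob /= prob.sum(-1, keepdims=True)                   # renormalise to probabilities
        return prob.argmax(-1)                                    # integrated segmentation labels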
Interfacing with in-Situ Data Networks during the Arctic Boreal Vulnerability Experiment (ABoVE)
NASA Astrophysics Data System (ADS)
McInerney, M.; Griffith, P. C.; Duffy, D.; Hoy, E.; Schnase, J. L.; Sinno, S.; Thompson, J. H.
2014-12-01
The Arctic Boreal Vulnerability Experiment (ABoVE) is designed to improve understanding of the causes and impacts of ecological changes in Arctic/boreal regions, and will integrate field-based studies, modeling, and data from airborne and satellite remote sensing. ABoVE will result in a fuller understanding of ecosystem vulnerability and resilience to environmental change in the Arctic and boreal regions of western North America, and provide scientific information required to develop options for societal responses to the impacts of these changes. The studies sponsored by NASA during ABoVE will be coordinated with research and in-situ monitoring activities being sponsored by a number of national and international partners. The NASA Center for Climate Simulation at the Goddard Space Flight Center has partnered with the NASA Carbon Cycle & Ecosystems Office to create a science cloud designed for this field campaign - the ABoVE Science Cloud (ASC). The ASC combines high performance computing with emerging technologies to create an environment specifically designed for large-scale modeling, analysis of remote sensing data, copious disk storage with integrated data management, and integration of core variables from in-situ networks identified by the ABoVE Science Definition Team. In this talk, we will present the scientific requirements driving the development of the ABoVE Science Cloud, discuss the necessary interfaces, both computational and human, with in-situ monitoring networks, and show examples of how the ASC is being used to meet the needs of the ABoVE campaign.
MED31/437: A Web-based Diabetes Management System: DiabNet
Zhao, N; Roudsari, A; Carson, E
1999-01-01
Introduction: A web-based system (DiabNet) was developed to provide instant access to the Electronic Diabetes Records (EDR) for end-users, and real-time information for healthcare professionals to facilitate their decision-making. It integrates a portable glucometer, a handheld computer, a mobile phone and Internet access as a combined telecommunication and mobile computing solution for diabetes management. Methods: Active Server Pages (ASP) embedded with advanced ActiveX controls and VBScript were developed to allow remote data upload, retrieval and interpretation. Advisory and Internet-based learning features, together with a video teleconferencing component, make the DiabNet web site an informative platform for Web consultation. Results: The evaluation of the system is being carried out among several UK Internet diabetes discussion groups and the Diabetes Day Centre at the Guy's & St. Thomas' Hospital. Much positive feedback has been received from the web site, demonstrating that DiabNet is an advanced web-based diabetes management system which can help patients keep closer control of self-monitored blood glucose remotely, and an integrated diabetes information resource that offers telemedicine knowledge in diabetes management. Discussion: In summary, DiabNet introduces an innovative online diabetes management concept, such as online appointment and consultation, to enable users to access diabetes management information without time or location limitations and without security concerns.
Recursive Newton-Euler formulation of manipulator dynamics
NASA Technical Reports Server (NTRS)
Nasser, M. G.
1989-01-01
A recursive Newton-Euler procedure is presented for the formulation and solution of manipulator dynamical equations. The procedure includes rotational and translational joints and a topological tree. This model was verified analytically using a planar two-link manipulator. Also, the model was tested numerically against the Walker-Orin model using the Shuttle Remote Manipulator System data. The hinge accelerations obtained from both models were identical. The computational requirements of the model vary linearly with the number of joints. The computational efficiency of this method exceeds that of Walker-Orin methods. This procedure may be viewed as a considerable generalization of Armstrong's method. A six-by-six formulation is adopted which enhances both the computational efficiency and simplicity of the model.
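For readers unfamiliar with the method, a planar (revolute-joint, world-frame) recursive Newton-Euler sketch is given below; it is an illustrative simplification of the six-by-six spatial formulation in the paper, and it can be checked analytically against a planar two-link manipulator, as the author did.

    import numpy as np

    def cross2(a, b):
        # z-component of the cross product of two planar vectors
        return a[0] * b[1] - a[1] * b[0]

    def rne_planar(q, qd, qdd, lengths, r_com, masses, inertias, g=(0.0, -9.81)):
        """Joint torques for a planar serial chain of revolute joints (world frame)."""
        n = len(q)
        g = np.asarray(g)
        theta, omega, alpha = np.cumsum(q), np.cumsum(qd), np.cumsum(qdd)
        p = np.zeros((n + 1, 2))      # joint positions, p[0] is the base
        a = np.zeros((n + 1, 2))      # joint linear accelerations
        p_c = np.zeros((n, 2))        # link centre-of-mass positions
        a_c = np.zeros((n, 2))        # link centre-of-mass accelerations
        perp = lambda v: np.array([-v[1], v[0]])
        # forward recursion: propagate kinematics from base to tip
        for i in range(n):
            c, s = np.cos(theta[i]), np.sin(theta[i])
            R = np.array([[c, -s], [s, c]])
            r_link = R @ np.array([lengths[i], 0.0])   # joint i -> joint i+1
            r_ci = R @ np.array([r_com[i], 0.0])       # joint i -> COM of link i
            p[i + 1] = p[i] + r_link
            p_c[i] = p[i] + r_ci
            a[i + 1] = a[i] + alpha[i] * perp(r_link) - omega[i]**2 * r_link
            a_c[i] = a[i] + alpha[i] * perp(r_ci) - omega[i]**2 * r_ci
        # backward recursion: propagate forces and moments from tip to base
        f = np.zeros((n + 1, 2))      # force on link i from link i-1
        nz = np.zeros(n + 1)          # moment on link i from link i-1
        tau = np.zeros(n)
        for i in reversed(range(n)):
            F = masses[i] * (a_c[i] - g)               # inertial minus gravity force
            f[i] = f[i + 1] + F
            nz[i] = (nz[i + 1] + inertias[i] * alpha[i]
                     + cross2(p_c[i] - p[i], f[i])
                     - cross2(p_c[i] - p[i + 1], f[i + 1]))
            tau[i] = nz[i]                             # joint torque about z
        return tau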
NASA Astrophysics Data System (ADS)
Coburn, C. A.; Qin, Y.; Zhang, J.; Staenz, K.
2015-12-01
Food security is one of the most pressing issues facing humankind. Recent estimates predict that over one billion people do not have enough food to meet their basic nutritional needs. The ability of remote sensing tools to monitor and model crop production and predict crop yield is essential for providing governments and farmers with vital information to ensure food security. Google Earth Engine (GEE) is a cloud computing platform which integrates storage and processing algorithms for massive remotely sensed imagery and vector data sets. By providing the capabilities of storing and analyzing the data sets, it provides an ideal platform for the development of advanced analytic tools for extracting key variables used in regional and national food security systems. With the high-performance computing and storage capabilities of GEE, a cloud-computing-based system for near real-time cropland monitoring was developed using multi-source remotely sensed data over large areas. The system is able to process and visualize the MODIS time series NDVI profile in conjunction with Landsat 8 image segmentation for crop monitoring. With multi-temporal Landsat 8 imagery, the crop fields are extracted using the image segmentation algorithm developed by Baatz et al. [1]. The MODIS time series NDVI data are modeled by TIMESAT [2], a software package developed for analyzing time series of satellite data. Seasonality metrics of the MODIS time series data, for example the start date of the growing season, the length of the growing season, and the field-level NDVI peak, are obtained for evaluating crop-growth conditions. The system fuses MODIS time series NDVI data and Landsat 8 imagery to provide information on near real-time crop-growth conditions through the visualization of MODIS NDVI time series and comparison of multi-year NDVI profiles. Stakeholders, i.e., farmers and government officers, are able to obtain crop-growth information at crop-field level online. This unique utilization of GEE in combination with advanced analytic and extraction techniques provides a vital remote sensing tool for decision makers and scientists with a high degree of flexibility to adapt to different uses.
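The kind of Earth Engine query underlying the field-level NDVI-profile component can be sketched as below; the MODIS collection ID, date range and field polygon are illustrative assumptions, not the system's actual configuration.

    import ee
    ee.Initialize()

    # crop-field polygon, e.g. from the Landsat 8 segmentation step (coordinates are illustrative)
    field = ee.Geometry.Polygon([[[-112.60, 50.00], [-112.55, 50.00],
                                  [-112.55, 50.03], [-112.60, 50.03]]])

    ndvi = (ee.ImageCollection('MODIS/006/MOD13Q1')   # 16-day 250 m NDVI composites (assumed product)
            .filterDate('2015-04-01', '2015-10-31')
            .select('NDVI'))

    def with_field_mean(img):
        # reduce each composite to its mean NDVI over the field polygon
        stat = img.reduceRegion(ee.Reducer.mean(), field, scale=250)
        return img.set('ndvi_mean', stat.get('NDVI'))

    profile = ndvi.map(with_field_mean)
    # field-level seasonal NDVI profile, ready for TIMESAT-style seasonality analysis
    print(profile.aggregate_array('ndvi_mean').getInfo())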
Implementation of Multispectral Image Classification on a Remote Adaptive Computer
NASA Technical Reports Server (NTRS)
Figueiredo, Marco A.; Gloster, Clay S.; Stephens, Mark; Graves, Corey A.; Nakkar, Mouna
1999-01-01
As the demand for higher-performance computers for the processing of remote sensing science algorithms increases, the need to investigate new computing paradigms is justified. Field Programmable Gate Arrays enable the implementation of algorithms at the hardware gate level, leading to orders-of-magnitude performance increases over microprocessor-based systems. The automatic classification of spaceborne multispectral images is an example of a computation-intensive application that can benefit from implementation on an FPGA-based custom computing machine (adaptive or reconfigurable computer). A probabilistic neural network is used here to classify pixels of a multispectral LANDSAT-2 image. The implementation described utilizes Java client/server application programs to access the adaptive computer from a remote site. Results verify that a remote hardware version of the algorithm (implemented on an adaptive computer) is significantly faster than a local software version of the same algorithm implemented on a typical general-purpose computer.
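A compact NumPy sketch of a probabilistic neural network classifier for multispectral pixels is shown below (a Parzen-window, class-conditional density estimate); it illustrates the algorithm being accelerated, not the FPGA implementation itself, and the smoothing parameter is an illustrative choice.

    import numpy as np

    def pnn_classify(train_X, train_y, pixels, sigma=0.1):
        """train_X: (n_train, n_bands) labelled training pixels; train_y: class labels;
        pixels: (n_pix, n_bands) pixels to classify; sigma: Parzen-window width."""
        classes = np.unique(train_y)
        scores = np.zeros((pixels.shape[0], classes.size))
        for k, c in enumerate(classes):
            Xc = train_X[train_y == c]                             # pattern units of class c
            d2 = ((pixels[:, None, :] - Xc[None, :, :])**2).sum(-1)
            scores[:, k] = np.exp(-d2 / (2.0 * sigma**2)).mean(1)  # summation unit
        return classes[scores.argmax(1)]                           # decision unit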
Computer vision camera with embedded FPGA processing
NASA Astrophysics Data System (ADS)
Lecerf, Antoine; Ouellet, Denis; Arias-Estrada, Miguel
2000-03-01
Traditional computer vision is based on a camera-computer system in which the image understanding algorithms are embedded in the computer. To circumvent the computational load of vision algorithms, low-level processing and imaging hardware can be integrated in a single compact module where a dedicated architecture is implemented. This paper presents a Computer Vision Camera based on an open architecture implemented in an FPGA. The system is targeted to real-time computer vision tasks where low level processing and feature extraction tasks can be implemented in the FPGA device. The camera integrates a CMOS image sensor, an FPGA device, two memory banks, and an embedded PC for communication and control tasks. The FPGA device is a medium size one equivalent to 25,000 logic gates. The device is connected to two high speed memory banks, an IS interface, and an imager interface. The camera can be accessed for architecture programming, data transfer, and control through an Ethernet link from a remote computer. A hardware architecture can be defined in a Hardware Description Language (like VHDL), simulated and synthesized into digital structures that can be programmed into the FPGA and tested on the camera. The architecture of a classical multi-scale edge detection algorithm based on a Laplacian of Gaussian convolution has been developed to show the capabilities of the system.
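A software sketch of the multi-scale Laplacian-of-Gaussian edge detection that the paper maps onto the FPGA is given below, written with SciPy for clarity; it is illustrative only and says nothing about the actual VHDL architecture or the scales used.

    import numpy as np
    from scipy import ndimage

    def log_edges(image, sigmas=(1.0, 2.0, 4.0)):
        """Return one boolean edge map per scale."""
        maps = []
        for s in sigmas:
            response = ndimage.gaussian_laplace(image.astype(float), sigma=s)
            # zero crossings of the LoG response mark edges at this scale
            sign = np.sign(response)
            zc = np.zeros_like(response, dtype=bool)
            zc[:, 1:] |= sign[:, 1:] * sign[:, :-1] < 0   # horizontal zero crossings
            zc[1:, :] |= sign[1:, :] * sign[:-1, :] < 0   # vertical zero crossings
            maps.append(zc)
        return maps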
Remote photonic metrology in the conservation of cultural heritage
NASA Astrophysics Data System (ADS)
Tornari, Vivi; Pedrini, G.; Osten, W.
2013-05-01
Photonic technologies play a leading innovative role in research in the fields of Cultural Heritage (CH) conservation, preservation and digitisation. In particular, photonic technologies have introduced a new and indispensable era of research in the conservation of cultural artefacts, extending from decorative objects, paintings, sculptures and monuments to archaeological sites, and including fields of application as diverse as materials characterisation, restoration practices, defect topography and 3D artwork reconstruction. Thus, over the last two decades photonic technologies have emerged as the unique answer, or the most competitive alternative, in many long-standing disputes in the conservation and restoration of Cultural Heritage. Despite impressive advances in the state of the art, ranging from custom-made system development to new methods and practices, photonic research and technological developments remain incoherently scattered and fragmented, with a significant amount of duplicated work and misused resources. In this context, further progress should aim to capitalise on the milestones achieved so far in the diverse applications that have flourished in the field of CH. Embedding experimental facilities and conclusions seems the only way to secure progress beyond the existing state of the art and to avoid its misuse. This embedding appears possible through new computing environments. A cloud computing environment and remote laboratory access hold the missing research objective of bringing the leading research together and integrating the achievements. The cloud environment would allow experts from museums, galleries and historical sites, art historians, conservators, scientists and technologists, conservation and technical laboratories and SMEs to interconnect their research, communicate their achievements and share data and resources. The main instrument of this integration is the creation of a common research platform, termed here the Virtual Laboratory, allowing not only remote research, inspection and evaluation, but also providing members and the public with instant and simultaneous access to the necessary information, knowledge and technologies. This paper presents the concept and first results confirming the potential of implementing metrology techniques as remote digital laboratory facilities in artwork structural assessment. The method paves the way towards the general objective of introducing remote photonic technologies into the sensitive field of Cultural Heritage.
Virtual Computing Laboratories: A Case Study with Comparisons to Physical Computing Laboratories
ERIC Educational Resources Information Center
Burd, Stephen D.; Seazzu, Alessandro F.; Conway, Christopher
2009-01-01
Current technology enables schools to provide remote or virtual computing labs that can be implemented in multiple ways ranging from remote access to banks of dedicated workstations to sophisticated access to large-scale servers hosting virtualized workstations. This paper reports on the implementation of a specific lab using remote access to…
NIF ICCS network design and loading analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tietbohl, G; Bryant, R
The National Ignition Facility (NIF) is housed within a large facility about the size of two football fields. The Integrated Computer Control System (ICCS) is distributed throughout this facility and requires the integration of about 40,000 control points and over 500 video sources. This integration is provided by approximately 700 control computers distributed throughout the NIF facility and a network that provides the communication infrastructure. A main control room houses a set of seven computer consoles providing operator access and control of the various distributed front-end processors (FEPs). There are also remote workstations distributed within the facility that provide operator console functions while personnel are testing and troubleshooting throughout the facility. The operator workstations communicate with the FEPs, which implement the localized control and monitoring functions. There are different types of FEPs for the various subsystems being controlled. This report describes the design of the NIF ICCS network and how it meets the expected traffic loads and the requirements of the Sub-System Design Requirements (SSDRs). This document supersedes the earlier reports entitled Analysis of the National Ignition Facility Network, dated November 6, 1996, and The National Ignition Facility Digital Video and Control Network, dated July 9, 1996. For an overview of the ICCS, refer to the document NIF Integrated Computer Controls System Description (NIF-3738).
Fusing human and machine skills for remote robotic operations
NASA Technical Reports Server (NTRS)
Schenker, Paul S.; Kim, Won S.; Venema, Steven C.; Bejczy, Antal K.
1991-01-01
The question of how computer assists can improve teleoperator trajectory tracking during both free and force-constrained motions is addressed. Computer graphics techniques which enable the human operator to both visualize and predict detailed 3D trajectories in real-time are reported. Man-machine interactive control procedures for better management of manipulator contact forces and positioning are also described. It is found that collectively, these novel advanced teleoperations techniques both enhance system performance and significantly reduce control problems long associated with teleoperations under time delay. Ongoing robotic simulations of the 1984 space shuttle Solar Maximum EVA Repair Mission are briefly described.
Multiple operating system rotation environment moving target defense
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Nathaniel; Thompson, Michael
Systems and methods for providing a multiple operating system rotation environment ("MORE") moving target defense ("MTD") computing system are described. The MORE-MTD system provides enhanced computer system security through a rotation of multiple operating systems. The MORE-MTD system increases attacker uncertainty, increases the cost of attacking the system, reduces the likelihood of an attacker locating a vulnerability, and reduces the exposure time of any located vulnerability. The MORE-MTD environment is effectuated by rotation of the operating systems at a given interval. The rotating operating systems create a consistently changing attack surface for remote attackers.
NASA Technical Reports Server (NTRS)
Coker, A. E.; Marshall, R.; Thomson, N. S.
1977-01-01
Data were collected near Bartow, Florida, for the purpose of studying land collapse phenomena using remote sensing techniques. Data obtained using the multispectral scanner system consisted of various combinations of 18 spectral bands ranging from 0.4-14.0 microns and several types of photography. The multispectral data were processed on a special-purpose analog computer in order to detect moisture-stressed vegetation and to enhance terrain surface temperatures. The processed results were printed on film to show the patterns of distribution of the proposed hydrogeologic indicators.
ERIC Educational Resources Information Center
Reynolds, Karen
1996-01-01
Outlines benefits of integrating optical instruments in computer-based instructional systems in a science classroom including budget, immediacy, pictorial records, and graphic enhancement. Presents examples of investigative activities involving optical instruments and images digitized for computer-based manipulation. (JRH)
The Development of GIS Educational Resources Sharing among Central Taiwan Universities
NASA Astrophysics Data System (ADS)
Chou, T.-Y.; Yeh, M.-L.; Lai, Y.-C.
2011-09-01
Using GIS in the classroom enhances students' computer skills and broadens their range of knowledge. The paper highlights GIS integration on an e-learning platform and introduces a variety of abundant educational resources. This research project demonstrates tools for an e-learning environment and delivers case studies of learning interaction from Central Taiwan universities. Feng Chia University (FCU) obtained a remarkable academic project subsidized by the Ministry of Education and developed an e-learning platform for excellence in teaching/learning programs among Central Taiwan's universities. The aim of the project is to integrate the educational resources of 13 universities in central Taiwan, with FCU serving as the hub university. To overcome the problem of distance, e-platforms have been established to create experiences with collaboration-enhanced learning. The e-platforms provide coordination of web service access among the educational community and deliver GIS educational resources. Most GIS-related courses cover the development of GIS, principles of cartography, spatial data analysis and overlaying, terrain analysis, buffer analysis, 3D GIS applications, Remote Sensing, GPS technology, as well as WebGIS, MobileGIS and ArcGIS manipulation. In each GIS case study, students have been taught to understand geographic meaning, collect spatial data and then use ArcGIS software to analyze the spatial data. One of the e-learning platforms provides lesson plans and presentation slides, so students can learn ArcGIS online. As they analyze spatial data, they can connect to the GIS hub to get the data they need, including satellite images, aerial photos, and vector data. Moreover, the e-learning platforms provide solutions and resources, and different levels of image scale have been integrated into the systems. Multi-scale spatial development and analyses in Central Taiwan integrate academic research resources among CTTLRC partners, thus establishing a decision-making support mechanism in teaching and learning and accelerating communication, cooperation and sharing among academic units.
Exploring Pacific Seamounts through Telepresence Mapping on the NOAA Ship Okeanos Explorer
NASA Astrophysics Data System (ADS)
Lobecker, E.; Malik, M.; Sowers, D.; Kennedy, B. R.
2016-12-01
Telepresence utilizes modern computer networks and a high-bandwidth satellite connection to enable remote users to participate virtually in ocean research and exploration cruises. NOAA's Office of Ocean Exploration and Research (OER) has been leveraging telepresence capabilities since the early 2000s. Through telepresence, remote users have provided support for operations planning and execution, troubleshooting hardware and software, and data interpretation during exploratory ocean mapping and remotely operated vehicle missions conducted by OER. The potential for this technology's application to immersive data acquisition and processing during mapping missions, however, has not yet been fully realized. We report the results of the application of telepresence to an 18-day, 24-hour-per-day seafloor mapping expedition with the NOAA Ship Okeanos Explorer. The mapping team was split between shipboard and shore-based mission team members based at the Exploration Command Center at the University of New Hampshire. This cruise represented the third dedicated mapping cruise in a multi-year NOAA Campaign to Address the Pacific monument Science, Technology, and Ocean Needs (CAPSTONE). Cruise objectives included mapping several previously unmapped seamounts in the Wake Atoll Unit of the recently expanded Pacific Remote Islands Marine National Monument, and mapping of prominent seamount, ridge, and fracture zone features during transits. We discuss (1) expanded shore-based data processing of multiple sonar data streams leading to enhanced, rapid, initial site characterization, (2) remote access control of shipboard sonar data acquisition and processing computers, and (3) potential for broadening multidisciplinary applications of ocean mapping cruises, including outreach, education, and communications efforts focused on expanding societal cognition of, and benefits from, ocean exploration.
Utilisation of Wearable Computing for Space Programmes Test Activities Optimisation
NASA Astrophysics Data System (ADS)
Basso, V.; Lazzari, D.; Alemanni, M.
2004-08-01
New technologies are assuming a relevant importance in the space business domain, including Assembly, Integration and Test (AIT) activities, allowing process optimizations and capabilities that were unthinkable only a few years ago. This paper describes the experience gained by Alenia Spazio (ALS) with remote interaction techniques as a result of collaborations established within European Communities (EC) initiatives and with Alenia Aeronautica (ALA) and Politecnico di Torino (POLITO). The performance increase and cost reduction of hardware and software components, driven by the massive uptake of home computing (especially demanded by the games business), together with network technologies (the web as well as high-speed links and wireless communications), today allow a rethinking of the traditional AIT process activities in the light of multimedia data exchange: graphics, voice, video and surely more in the future. The aerospace business confirms its vocation for innovation: in the 1980s it was the cradle of CAD systems, and today it is oriented towards 3D data visualization/interaction technologies and remote visualization/interaction in a collaborative way on a much more user-friendly basis (i.e., not only for specialists). Fig. 1 collects the extended AIT scenario studied and adopted by ALS in recent years. ALS experimented with two possibilities of remote visualization/interaction: portable devices [e.g., Fig. 2, Personal Digital Assistant (PDA), wearable computers] and wall screens (e.g., VR-Lab), used as both 2D/3D visualization and interaction devices. These could support many types of traditional company-internal AIT applications (mainly based on EGSE and PDM/CAD utilisation/reports): (1) design review support; (2) facility management; (3) storage management; (4) personnel training; (5) integration sequence definition; (6) assembly and test operations follow-up; (7) documentation review; as well as external access to AIT activities for remote operations (e.g., tele-testing). (Fig. 1 elements: EGSE, portable devices, clean room, wall screens, PDM/CAD, tele-operations, product, control room, external world.)
Web-Based Learning in the Computer-Aided Design Curriculum.
ERIC Educational Resources Information Center
Sung, Wen-Tsai; Ou, S. C.
2002-01-01
Applies principles of constructivism and virtual reality (VR) to computer-aided design (CAD) curriculum, particularly engineering, by integrating network, VR and CAD technologies into a Web-based learning environment that expands traditional two-dimensional computer graphics into a three-dimensional real-time simulation that enhances user…
Vita-Finzi, Claudio
2012-05-13
During the last half century, advances in geomorphology-abetted by conceptual and technical developments in geophysics, geochemistry, remote sensing, geodesy, computing and ecology-have enhanced the potential value of fluvial history for reconstructing erosional and depositional sequences on the Earth and on Mars and for evaluating climatic and tectonic changes, the impact of fluvial processes on human settlement and health, and the problems faced in managing unstable fluvial systems. This journal is © 2012 The Royal Society
Remote Sensing Systems Optimization for Geobase Enhancement
2003-03-01
...through feedback from base users, as well as the researcher's observations. ... The U.S. Air Force is in the process of implementing GeoBase, a geographic information system (GIS), throughout its worldwide installations. ... A GIS is a computer database that contains geo-spatial information; it is the principal tool used to input, view ...
Aydın, Eda Akman; Bay, Ömer Faruk; Güler, İnan
2016-01-01
Brain Computer Interface (BCI) based environment control systems could facilitate the lives of people with neuromuscular diseases, reduce dependence on their caregivers, and improve their quality of life. Along with ease of use, low cost, and robust system performance, mobility is an important functionality expected from a practical BCI system in real life. In this study, in order to enhance users' mobility, we propose internet-based wireless communication between the BCI system and the home environment. We designed and implemented a prototype of an embedded, low-cost, low-power, easy-to-use web server which is employed for internet-based wireless control of a BCI-based home environment. The embedded web server provides remote access to the environmental control module through BCI and web interfaces. While the proposed system offers BCI users enhanced mobility, it also provides remote control of the home environment by caregivers as well as by individuals in the initial stages of neuromuscular disease. The input of the BCI system is P300 potentials. We used the Region Based Paradigm (RBP) as the stimulus interface. Performance of the BCI system was evaluated on data recorded from 8 non-disabled subjects. The experimental results indicate that the proposed web server enables internet-based wireless control of electrical home appliances through BCIs.
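The abstract does not give implementation details of the embedded web server; as a rough illustration of the kind of web endpoint such a control module might expose, the minimal Python sketch below maps hypothetical URL paths (e.g. /lamp/on) to appliance commands. The device names, path layout, and send_command stub are assumptions, not the authors' API.

```python
# Minimal sketch of an embedded appliance-control web server (hypothetical endpoints).
from http.server import BaseHTTPRequestHandler, HTTPServer

APPLIANCES = {"lamp", "tv", "fan"}          # assumed device names

def send_command(device: str, action: str) -> None:
    # Placeholder: a real system would drive relays or an IR/RF module here.
    print(f"command -> {device}: {action}")

class ControlHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        parts = self.path.strip("/").split("/")          # e.g. "/lamp/on"
        if len(parts) == 2 and parts[0] in APPLIANCES and parts[1] in ("on", "off"):
            send_command(parts[0], parts[1])
            self.send_response(200)
            self.end_headers()
            self.wfile.write(b"OK")
        else:
            self.send_response(404)
            self.end_headers()

if __name__ == "__main__":
    HTTPServer(("0.0.0.0", 8080), ControlHandler).serve_forever()
```

In a BCI setting, the P300 classifier output (or a caregiver's browser) would simply issue such HTTP requests over the wireless link.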
Vision Based Autonomous Robotic Control for Advanced Inspection and Repair
NASA Technical Reports Server (NTRS)
Wehner, Walter S.
2014-01-01
The advanced inspection system is an autonomous control and analysis system that improves the inspection and remediation operations for ground and surface systems. It uses optical imaging technology with intelligent computer vision algorithms to analyze physical features of the real-world environment to make decisions and learn from experience. The advanced inspection system plans to control a robotic manipulator arm, an unmanned ground vehicle and cameras remotely, automatically and autonomously. There are many computer vision, image processing and machine learning techniques available as open source for using vision as a sensory feedback in decision-making and autonomous robotic movement. My responsibilities for the advanced inspection system are to create a software architecture that integrates and provides a framework for all the different subsystem components; identify open-source algorithms and techniques; and integrate robot hardware.
Database Integrity Monitoring for Synthetic Vision Systems Using Machine Vision and SHADE
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; Young, Steven D.
2005-01-01
In an effort to increase situational awareness, the aviation industry is investigating technologies that allow pilots to visualize what is outside of the aircraft during periods of low-visibility. One of these technologies, referred to as Synthetic Vision Systems (SVS), provides the pilot with real-time computer-generated images of obstacles, terrain features, runways, and other aircraft regardless of weather conditions. To help ensure the integrity of such systems, methods of verifying the accuracy of synthetically-derived display elements using onboard remote sensing technologies are under investigation. One such method is based on a shadow detection and extraction (SHADE) algorithm that transforms computer-generated digital elevation data into a reference domain that enables direct comparison with radar measurements. This paper describes machine vision techniques for making this comparison and discusses preliminary results from application to actual flight data.
Integrating Social Networks and Remote Patient Monitoring Systems to Disseminate Notifications.
Ribeiro, Hugo A; Germano, Eliseu; Carvalho, Sergio T; Albuquerque, Eduardo S
2017-01-01
Healthcare workforce shortage can be compensated by using information and communication technologies. Remote patient monitoring systems allow us to identify and communicate complications and anomalies. Integrating social networking services into remote patient monitoring systems enables users to manage their relationships. User defined relationships may be used to disseminate healthcare related notifications. Hence this integration leads to quicker interventions and may reduce hospital readmission rate. As a proof of concept, a module was integrated to a remote patient monitoring platform. A mobile application to manage relationships and receive notifications was also developed.
NASA Astrophysics Data System (ADS)
Weber, S. A.; Engel-Cox, J. A.; Hoff, R. M.; Prados, A.; Zhang, H.
2008-12-01
Integrating satellite- and ground-based aerosol optical depth (AOD) observations with surface total fine particulate (PM2.5) and sulfate concentrations allows for a more comprehensive understanding of local- and urban-scale air quality. This study evaluates the utility of integrated databases being developed for NOAA and EPA through the 3D-AQS project by examining the relationship between remotely-sensed AOD and PM2.5 concentrations for each platform for the summer of 2004 and the entire year of 2005. We compare results for the Baltimore, MD/Washington, DC metropolitan air shed, incorporating AOD products from the Terra and GOES-12 satellites, AERONET sunphotometer, and ground-based lidar, and PM2.5 concentrations from five surface monitoring sites. The satellite-derived products include AOD from the Moderate Resolution Imaging Spectroradiometer (MODIS) and Multi-angle Imaging Spectroradiometer (MISR), as well as the GOES Aerosol/Smoke Product (GASP). The vertical profile of lidar backscatter is used to retrieve the planetary boundary layer (PBL) height in an attempt to capture only that fraction of the AOD arising from near surface aerosols. Adjusting the AOD data using platform- and season-specific ratios, calculated using the parameters of the regression equations, for two case studies resulted in a more accurate representation of surface PM2.5 concentrations when compared to a constant ratio that is currently being used in the NOAA IDEA product. This work demonstrates that quantitative relationships between remotely-sensed and in-situ aerosol observations in an integrated database can be computed and applied to improve the use of remotely-sensed observations for estimating surface concentrations.
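The study's adjustment of AOD with platform- and season-specific regression parameters can be sketched generically as a per-subset linear fit followed by application of the fitted slope and intercept to new retrievals. The sketch below uses synthetic collocated values; the numbers and grouping are illustrative, not the 3D-AQS data.

```python
# Sketch: platform/season-specific linear fit between AOD and surface PM2.5,
# then use the fitted parameters to estimate PM2.5 from new AOD retrievals.
# The data values are synthetic and only illustrate the workflow.
import numpy as np

def fit_aod_pm25(aod: np.ndarray, pm25: np.ndarray):
    """Least-squares fit pm25 = a * aod + b for one platform/season subset."""
    a, b = np.polyfit(aod, pm25, deg=1)
    return a, b

def estimate_pm25(aod: np.ndarray, a: float, b: float) -> np.ndarray:
    return a * aod + b

# Synthetic collocated observations (PM2.5 in ug/m^3, AOD unitless).
rng = np.random.default_rng(0)
aod = rng.uniform(0.05, 0.8, 200)
pm25 = 40.0 * aod + 5.0 + rng.normal(0.0, 3.0, 200)
a, b = fit_aod_pm25(aod, pm25)
print(f"slope={a:.1f}, intercept={b:.1f}, "
      f"PM2.5 at AOD 0.3 ~ {estimate_pm25(np.array([0.3]), a, b)[0]:.1f} ug/m^3")
```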
Integration of communications and tracking data processing simulation for space station
NASA Technical Reports Server (NTRS)
Lacovara, Robert C.
1987-01-01
A simplified model of the communications network for the Communications and Tracking Data Processing System (CTDP) was developed. It was simulated by use of programs running on several on-site computers. These programs communicate with one another by means of both local area networks and direct serial connections. The domain of the model and its simulation is from Orbital Replaceable Unit (ORU) interface to Data Management Systems (DMS). The simulation was designed to allow status queries from remote entities across the DMS networks to be propagated through the model to several simulated ORU's. The ORU response is then propagated back to the remote entity which originated the request. Response times at the various levels were investigated in a multi-tasking, multi-user operating system environment. Results indicate that the effective bandwidth of the system may be too low to support expected data volume requirements under conventional operating systems. Instead, some form of embedded process control program may be required on the node computers.
Environmental Monitoring Using Sensor Networks
NASA Astrophysics Data System (ADS)
Yang, J.; Zhang, C.; Li, X.; Huang, Y.; Fu, S.; Acevedo, M. F.
2008-12-01
Environmental observatories, consisting of a variety of sensor systems, computational resources and informatics, are important for us to observe, model, predict, and ultimately help preserve the health of nature. The commoditization and proliferation of coin-to-palm sized wireless sensors will allow environmental monitoring with unprecedented fine spatial and temporal resolution. Once scattered around, these sensors can identify themselves, locate their positions, describe their functions, and self-organize into a network. They communicate through wireless channels with nearby sensors and transmit data through multi-hop protocols to a gateway, which can forward information to a remote data server. In this project, we describe an environmental observatory called the Texas Environmental Observatory (TEO) that incorporates a sensor network system with intertwined wired and wireless sensors. We are enhancing and expanding the existing wired weather stations to include wireless sensor networks (WSNs) and telemetry using solar-powered cellular modems. The new WSNs will monitor soil moisture and support long-term hydrologic modeling. Hydrologic models are helpful in predicting how changes in land cover translate into changes in the stream flow regime. These models require inputs that are difficult to measure over large areas, especially variables related to storm events, such as antecedent soil moisture conditions and rainfall amount and intensity. This will also contribute to improving rainfall estimates from meteorological radar data and enhancing hydrological forecasts. Sensor data are transmitted from the monitoring site to a Central Data Collection (CDC) Server. We incorporate a GPRS modem for wireless telemetry, a single-board computer (SBC) as the Remote Field Gateway (RFG) Server, and a WSN for distributed soil moisture monitoring. The RFG provides effective control, management, and coordination of two independent sensor systems, i.e., a traditional datalogger-based wired sensor system and the WSN-based wireless sensor system. The RFG also supports remote manipulation of the devices in the field, such as the SBC, datalogger, and WSN. Sensor data collected from the distributed monitoring stations are stored in a database (DB) Server. The CDC Server acts as an intermediate component to hide the heterogeneity of the different devices and to support the data validation required by the DB Server. Daemon programs running on the CDC Server pre-process the data before it is inserted into the database, and periodically perform synchronization tasks. A SWE-compliant data repository is installed to enable data exchange, accepting data from both the internal DB Server and external sources through OGC web services. The web portal, TEO Online, serves as a user-friendly interface for data visualization, analysis, synthesis, modeling, and K-12 educational outreach activities. It also provides useful capabilities for system developers and operators to remotely monitor system status and remotely update software and system configuration, which greatly simplifies system debugging and maintenance tasks. We also implement Sensor Observation Services (SOS) at this layer, conforming to the SWE standard to facilitate data exchange. The standard SensorML/O&M data representation makes it easy to integrate our sensor data into existing Geographic Information Systems (GIS) web services and to exchange the data with other organizations.
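As a rough illustration of the kind of pre-processing daemon described for the CDC Server (validate incoming records, then insert them into the database), the sketch below uses SQLite as a stand-in for the DB Server; the record schema, plausibility range, and table layout are assumptions for illustration only.

```python
# Sketch of a data-collection daemon: validate incoming soil-moisture records,
# then insert the valid ones into a database (SQLite stands in for the DB Server).
import sqlite3
import time

DB = sqlite3.connect("teo_demo.db")
DB.execute("""CREATE TABLE IF NOT EXISTS soil_moisture
              (station TEXT, ts REAL, vwc REAL)""")

def valid(record: dict) -> bool:
    # Volumetric water content must be a physically plausible fraction (assumed range).
    return 0.0 <= record.get("vwc", -1.0) <= 1.0

def ingest(records):
    for r in records:
        if valid(r):
            DB.execute("INSERT INTO soil_moisture VALUES (?, ?, ?)",
                       (r["station"], r["ts"], r["vwc"]))
    DB.commit()

ingest([{"station": "GBMF1", "ts": time.time(), "vwc": 0.23},
        {"station": "GBMF1", "ts": time.time(), "vwc": 1.7}])   # second record is rejected
print(DB.execute("SELECT COUNT(*) FROM soil_moisture").fetchone()[0], "valid rows stored")
```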
NASA Technical Reports Server (NTRS)
Cardullo, Frank M.; Lewis, Harold W., III; Panfilov, Peter B.
2007-01-01
An extremely innovative approach has been presented, which is to have the surgeon operate through a simulator running in real-time enhanced with an intelligent controller component to enhance the safety and efficiency of a remotely conducted operation. The use of a simulator enables the surgeon to operate in a virtual environment free from the impediments of telecommunication delay. The simulator functions as a predictor and periodically the simulator state is corrected with truth data. Three major research areas must be explored in order to ensure achieving the objectives. They are: simulator as predictor, image processing, and intelligent control. Each is equally necessary for success of the project and each of these involves a significant intelligent component in it. These are diverse, interdisciplinary areas of investigation, thereby requiring a highly coordinated effort by all the members of our team, to ensure an integrated system. The following is a brief discussion of those areas. Simulator as a predictor: The delays encountered in remote robotic surgery will be greater than any encountered in human-machine systems analysis, with the possible exception of remote operations in space. Therefore, novel compensation techniques will be developed. Included will be the development of the real-time simulator, which is at the heart of our approach. The simulator will present real-time, stereoscopic images and artificial haptic stimuli to the surgeon. Image processing: Because of the delay and the possibility of insufficient bandwidth a high level of novel image processing is necessary. This image processing will include several innovative aspects, including image interpretation, video to graphical conversion, texture extraction, geometric processing, image compression and image generation at the surgeon station. Intelligent control: Since the approach we propose is in a sense predictor based, albeit a very sophisticated predictor, a controller, which not only optimizes end effector trajectory but also avoids error, is essential. We propose to investigate two different approaches to the controller design. One approach employs an optimal controller based on modern control theory; the other one involves soft computing techniques, i.e. fuzzy logic, neural networks, genetic algorithms and hybrids of these.
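The "simulator as predictor" idea above (propagate a local state estimate in real time, correct it when delayed truth data arrive) can be caricatured with a one-dimensional toy: a constant-velocity predictor whose estimate is periodically blended with delayed measurements. This is only a sketch of the concept, not the authors' compensation technique; the model, delay, and gain are arbitrary assumptions.

```python
# Toy 1-D illustration of a real-time predictor corrected by delayed truth data.
# Constant-velocity motion and a fixed blending gain are simplifying assumptions.
from collections import deque

DT, DELAY_STEPS = 0.01, 200            # 10 ms steps, ~2 s telecommunication delay

def run(steps: int = 600):
    truth_pos, truth_vel = 0.0, 1.0
    est_pos, est_vel = 0.0, 1.0
    link = deque()                      # delayed channel carrying truth samples
    for k in range(steps):
        truth_pos += truth_vel * DT     # remote (real) system evolves
        link.append((k, truth_pos))
        est_pos += est_vel * DT         # local simulator predicts ahead in real time
        if len(link) > DELAY_STEPS:     # delayed truth arrives: extrapolate it and blend
            _, delayed_pos = link.popleft()
            projected_truth = delayed_pos + truth_vel * DELAY_STEPS * DT
            est_pos += 0.1 * (projected_truth - est_pos)
    return truth_pos, est_pos

print(run())   # truth and estimate stay close despite the delay
```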
Albon, Simon P.; Cancilla, Devon A.; Hubball, Harry
2006-01-01
Objectives To pilot test and evaluate a gas chromatography-mass spectrometry (GCMS) case study as a teaching and learning tool. Design A case study incorporating remote access to a GCMS instrument through the Integrated Laboratory Network (ILN) at Western Washington University was developed and implemented. Student surveys, faculty interviews, and examination score data were used to evaluate learning. Assessment While the case study did not impact final examination scores, approximately 70% of students and all faculty members felt the ILN-supported case study improved student learning about GCMS. Faculty members felt the “live” instrument access facilitated more authentic teaching. Students and faculty members felt the ILN should continue to be developed as a teaching tool. Conclusion Remote access to scientific instrumentation can be used to modify case studies to enhance student learning and teaching practice in pharmaceutical analysis. PMID:17149450
Integrated Geo Hazard Management System in Cloud Computing Technology
NASA Astrophysics Data System (ADS)
Hanifah, M. I. M.; Omar, R. C.; Khalid, N. H. N.; Ismail, A.; Mustapha, I. S.; Baharuddin, I. N. Z.; Roslan, R.; Zalam, W. M. Z.
2016-11-01
Geo-hazards can result in reduced environmental health and huge economic losses, especially in mountainous areas. In order to mitigate geo-hazards effectively, cloud computing technology is introduced for managing the geo-hazard database. Cloud computing technology and its services are capable of providing stakeholders with geo-hazard information in near real time for effective environmental management and decision-making. The UNITEN Integrated Geo Hazard Management System consists of the network management and operation needed to monitor geo-hazard disasters, especially landslides, in our study area at the Kelantan River Basin and the boundary between Hulu Kelantan and Hulu Terengganu. The system will provide an easily managed, flexible measuring system whose data management operates autonomously and can be controlled remotely by commands to collect data, using "cloud" computing. This paper aims to document the above relationship by identifying the special features and needs associated with effective geo-hazard database management using a "cloud system". The system will later be used as part of the development activities and should help minimize the frequency of geo-hazards and the risk in the research area.
An SNMP-based solution to enable remote ISO/IEEE 11073 technical management.
Lasierra, Nelia; Alesanco, Alvaro; García, José
2012-07-01
This paper presents the design and implementation of an architecture based on the integration of simple network management protocol version 3 (SNMPv3) and the standard ISO/IEEE 11073 (X73) to manage technical information in home-based telemonitoring scenarios. This architecture includes the development of an SNMPv3-proxyX73 agent which comprises a management information base (MIB) module adapted to X73. In the proposed scenario, medical devices (MDs) send information to a concentrator device [designated as compute engine (CE)] using the X73 standard. This information together with extra information collected in the CE is stored in the developed MIB. Finally, the information collected is available for remote access via SNMP connection. Moreover, alarms and events can be configured by an external manager in order to provide warnings of irregularities in the MDs' technical performance evaluation. This proposed SNMPv3 agent provides a solution to integrate and unify technical device management in home-based telemonitoring scenarios fully adapted to X73.
Modeling a Wireless Network for International Space Station
NASA Technical Reports Server (NTRS)
Alena, Richard; Yaprak, Ece; Lamouri, Saad
2000-01-01
This paper describes the application of wireless local area network (LAN) simulation modeling methods to the hybrid LAN architecture designed for supporting crew-computing tools aboard the International Space Station (ISS). These crew-computing tools, such as wearable computers and portable advisory systems, will provide crew members with real-time vehicle and payload status information and access to digital technical and scientific libraries, significantly enhancing human capabilities in space. A wireless network, therefore, will provide wearable computer and remote instruments with the high performance computational power needed by next-generation 'intelligent' software applications. Wireless network performance in such simulated environments is characterized by the sustainable throughput of data under different traffic conditions. This data will be used to help plan the addition of more access points supporting new modules and more nodes for increased network capacity as the ISS grows.
Attitude Towards Computers and Classroom Management of Language School Teachers
ERIC Educational Resources Information Center
Jalali, Sara; Panahzade, Vahid; Firouzmand, Ali
2014-01-01
Computer-assisted language learning (CALL) is the realization of computers in schools and universities which has potentially enhanced the language learning experience inside the classrooms. The integration of the technologies into the classroom demands that the teachers adopt a number of classroom management procedures to maintain a more…
Change Detection of Mobile LIDAR Data Using Cloud Computing
NASA Astrophysics Data System (ADS)
Liu, Kun; Boehm, Jan; Alis, Christian
2016-06-01
Change detection has long been a challenging problem, although a lot of research has been conducted in different fields such as remote sensing and photogrammetry, computer vision, and robotics. In this paper, we blend a voxel grid and Apache Spark together to propose an efficient method that addresses the problem in the context of big data. A voxel grid is a regular geometric representation consisting of voxels of the same size, which suits parallel computation well. Apache Spark is a popular distributed parallel computing platform which provides fault tolerance and in-memory caching. These features can significantly enhance the performance of Apache Spark and result in an efficient and robust implementation. In our experiments, both synthetic and real point cloud data are employed to demonstrate the quality of our method.
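The general idea of voxel-grid change detection on Spark can be sketched as: bin each point into a voxel key, count occupancy per voxel for each epoch, and flag voxels whose counts differ. The PySpark sketch below follows that pattern with toy points; the voxel size, the occupancy-difference criterion, and the example coordinates are assumptions, not the paper's exact algorithm.

```python
# Sketch of voxel-grid change detection on two point-cloud epochs with PySpark.
from pyspark import SparkContext

VOXEL = 0.5   # voxel edge length in metres (assumed)

def voxel_key(p):
    x, y, z = p
    return (int(x // VOXEL), int(y // VOXEL), int(z // VOXEL))

sc = SparkContext("local[*]", "voxel-change-detection")
epoch_a = sc.parallelize([(0.1, 0.1, 0.1), (0.2, 0.1, 0.1), (1.6, 0.2, 0.0)])
epoch_b = sc.parallelize([(0.1, 0.1, 0.1), (2.7, 0.3, 0.0)])

counts_a = epoch_a.map(lambda p: (voxel_key(p), 1)).reduceByKey(lambda a, b: a + b)
counts_b = epoch_b.map(lambda p: (voxel_key(p), 1)).reduceByKey(lambda a, b: a + b)

# Voxels present in only one epoch, or with different occupancy, are flagged as changed.
changed = (counts_a.fullOuterJoin(counts_b)
                   .filter(lambda kv: (kv[1][0] or 0) != (kv[1][1] or 0))
                   .keys()
                   .collect())
print("changed voxels:", changed)
sc.stop()
```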
Cobb, Joshua N; Declerck, Genevieve; Greenberg, Anthony; Clark, Randy; McCouch, Susan
2013-04-01
More accurate and precise phenotyping strategies are necessary to empower high-resolution linkage mapping and genome-wide association studies and for training genomic selection models in plant improvement. Within this framework, the objective of modern phenotyping is to increase the accuracy, precision and throughput of phenotypic estimation at all levels of biological organization while reducing costs and minimizing labor through automation, remote sensing, improved data integration and experimental design. Much like the efforts to optimize genotyping during the 1980s and 1990s, designing effective phenotyping initiatives today requires multi-faceted collaborations between biologists, computer scientists, statisticians and engineers. Robust phenotyping systems are needed to characterize the full suite of genetic factors that contribute to quantitative phenotypic variation across cells, organs and tissues, developmental stages, years, environments, species and research programs. Next-generation phenotyping generates significantly more data than previously and requires novel data management, access and storage systems, increased use of ontologies to facilitate data integration, and new statistical tools for enhancing experimental design and extracting biologically meaningful signal from environmental and experimental noise. To ensure relevance, the implementation of efficient and informative phenotyping experiments also requires familiarity with diverse germplasm resources, population structures, and target populations of environments. Today, phenotyping is quickly emerging as the major operational bottleneck limiting the power of genetic analysis and genomic prediction. The challenge for the next generation of quantitative geneticists and plant breeders is not only to understand the genetic basis of complex trait variation, but also to use that knowledge to efficiently synthesize twenty-first century crop varieties.
Computer applications in remote sensing education
NASA Technical Reports Server (NTRS)
Danielson, R. L.
1980-01-01
Computer applications to instruction in any field may be divided into two broad generic classes: computer-managed instruction and computer-assisted instruction. The division is based on how frequently the computer affects the instructional process and how active a role the computer takes in actually providing instruction. There are no inherent characteristics of remote sensing education to preclude the use of one or both of these techniques, depending on the computer facilities available to the instructor. The characteristics of the two classes are summarized, potential applications to remote sensing education are discussed, and the advantages and disadvantages of computer applications to the instructional process are considered.
Nakata, Norio; Suzuki, Naoki; Hattori, Asaki; Hirai, Naoya; Miyamoto, Yukio; Fukuda, Kunihiko
2012-01-01
Although widely used as a pointing device on personal computers (PCs), the mouse was originally designed for control of two-dimensional (2D) cursor movement and is not suited to complex three-dimensional (3D) image manipulation. Augmented reality (AR) is a field of computer science that involves combining the physical world and an interactive 3D virtual world; it represents a new 3D user interface (UI) paradigm. A system for 3D and four-dimensional (4D) image manipulation has been developed that uses optical tracking AR integrated with a smartphone remote control. The smartphone is placed in a hard case (jacket) with a 2D printed fiducial marker for AR on the back. It is connected to a conventional PC with an embedded Web camera by means of WiFi. The touch screen UI of the smartphone is then used as a remote control for 3D and 4D image manipulation. Using this system, the radiologist can easily manipulate 3D and 4D images from computed tomography and magnetic resonance imaging in an AR environment with high-quality image resolution. Pilot assessment of this system suggests that radiologists will be able to manipulate 3D and 4D images in the reading room in the near future. Supplemental material available at http://radiographics.rsna.org/lookup/suppl/doi:10.1148/rg.324115086/-/DC1.
Autonomous mobile robot for radiologic surveys
Dudar, A.M.; Wagner, D.G.; Teese, G.D.
1994-06-28
An apparatus is described for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm. 5 figures.
Autonomous mobile robot for radiologic surveys
Dudar, Aed M.; Wagner, David G.; Teese, Gregory D.
1994-01-01
An apparatus for conducting radiologic surveys. The apparatus comprises in the main a robot capable of following a preprogrammed path through an area, a radiation monitor adapted to receive input from a radiation detector assembly, ultrasonic transducers for navigation and collision avoidance, and an on-board computer system including an integrator for interfacing the radiation monitor and the robot. Front and rear bumpers are attached to the robot by bumper mounts. The robot may be equipped with memory boards for the collection and storage of radiation survey information. The on-board computer system is connected to a remote host computer via a UHF radio link. The apparatus is powered by a rechargeable 24-volt DC battery, and is stored at a docking station when not in use and/or for recharging. A remote host computer contains a stored database defining paths between points in the area where the robot is to operate, including but not limited to the locations of walls, doors, stationary furniture and equipment, and sonic markers if used. When a program consisting of a series of paths is downloaded to the on-board computer system, the robot conducts a floor survey autonomously at any preselected rate. When the radiation monitor detects contamination, the robot resurveys the area at reduced speed and resumes its preprogrammed path if the contamination is not confirmed. If the contamination is confirmed, the robot stops and sounds an alarm.
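The survey / resurvey / alarm behaviour described in the two records above maps naturally onto a small state machine: survey at normal speed, drop to reduced speed when contamination is detected, resume the preprogrammed path if it is not confirmed, and stop with an alarm if it is. The sketch below is a hypothetical rendering of that logic; the speeds, threshold, and function names are assumptions, not the patented control software.

```python
# Hypothetical state machine for the survey / resurvey / alarm behaviour.
from enum import Enum, auto

class Mode(Enum):
    SURVEY = auto()
    RESURVEY = auto()
    ALARM = auto()

NORMAL_SPEED, REDUCED_SPEED = 0.5, 0.1       # m/s, assumed values
THRESHOLD = 100.0                            # detector counts per second, assumed

def step(mode: Mode, reading: float) -> tuple[Mode, float]:
    """Return the next mode and commanded speed given one detector reading."""
    if mode is Mode.SURVEY:
        return (Mode.RESURVEY, REDUCED_SPEED) if reading > THRESHOLD else (Mode.SURVEY, NORMAL_SPEED)
    if mode is Mode.RESURVEY:
        if reading > THRESHOLD:              # contamination confirmed: stop and alarm
            return Mode.ALARM, 0.0
        return Mode.SURVEY, NORMAL_SPEED     # not confirmed: resume preprogrammed path
    return Mode.ALARM, 0.0                   # remain stopped until an operator intervenes

mode = Mode.SURVEY
for r in [20.0, 150.0, 40.0, 180.0, 200.0]:
    mode, speed = step(mode, r)
    print(f"reading={r:6.1f} -> {mode.name:8s} speed={speed} m/s")
```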
Forbes, Thomas P.; Staymates, Matthew
2017-01-01
Venturi-assisted ENTrainment and Ionization (VENTI) was developed, demonstrating efficient entrainment, collection, and transport of remotely sampled vapors, aerosols, and dust particulate for real-time mass spectrometry (MS) detection. Integrating the Venturi and Coandă effects at multiple locations generated flow and analyte transport from non-proximate locations and more importantly enhanced the aerodynamic reach at the point of collection. Transport through remote sampling probes up to 2.5 m in length was achieved with residence times on the order of 10⁻² s to 10⁻¹ s and Reynolds numbers on the order of 10³ to 10⁴. The Venturi-assisted entrainment successfully enhanced vapor collection and detection by greater than an order of magnitude at 20 cm stand-off (limit of simple suction). This enhancement is imperative, as simple suction restricts sampling to the immediate vicinity, requiring close proximity to the vapor source. In addition, the overall aerodynamic reach distance was increased by approximately 3-fold over simple suction under the investigated conditions. Enhanced aerodynamic reach was corroborated and observed with laser-light sheet flow visualization and schlieren imaging. Coupled with atmospheric pressure chemical ionization (APCI), the detection of a range of volatile chemical vapors; explosive vapors; explosive, narcotic, and mustard gas surrogate (methyl salicylate) aerosols; and explosive dust particulate was demonstrated. Continuous real-time Venturi-assisted monitoring of a large room (approximately 90 m² area, 570 m³ volume) was demonstrated for a 60-minute period without the remote sampling probe, exhibiting detection of chemical vapors and methyl salicylate at approximately 3 m stand-off distances within 2 minutes of exposure. PMID:28107830
Forbes, Thomas P; Staymates, Matthew
2017-03-08
Venturi-assisted ENTrainment and Ionization (VENTI) was developed, demonstrating efficient entrainment, collection, and transport of remotely sampled vapors, aerosols, and dust particulate for real-time mass spectrometry (MS) detection. Integrating the Venturi and Coandă effects at multiple locations generated flow and analyte transport from non-proximate locations and more importantly enhanced the aerodynamic reach at the point of collection. Transport through remote sampling probes up to 2.5 m in length was achieved with residence times on the order of 10⁻² s to 10⁻¹ s and Reynolds numbers on the order of 10³ to 10⁴. The Venturi-assisted entrainment successfully enhanced vapor collection and detection by greater than an order of magnitude at 20 cm stand-off (limit of simple suction). This enhancement is imperative, as simple suction restricts sampling to the immediate vicinity, requiring close proximity to the vapor source. In addition, the overall aerodynamic reach distance was increased by approximately 3-fold over simple suction under the investigated conditions. Enhanced aerodynamic reach was corroborated and observed with laser-light sheet flow visualization and schlieren imaging. Coupled with atmospheric pressure chemical ionization (APCI), the detection of a range of volatile chemical vapors; explosive vapors; explosive, narcotic, and mustard gas surrogate (methyl salicylate) aerosols; and explosive dust particulate was demonstrated. Continuous real-time Venturi-assisted monitoring of a large room (approximately 90 m² area, 570 m³ volume) was demonstrated for a 60-min period without the remote sampling probe, exhibiting detection of chemical vapors and methyl salicylate at approximately 3 m stand-off distances within 2 min of exposure. Published by Elsevier B.V.
Optical sampling of the flux tower footprint
NASA Astrophysics Data System (ADS)
Gamon, J. A.
2015-03-01
The purpose of this review is to address the reasons and methods for conducting optical remote sensing within the flux tower footprint. Fundamental principles and conclusions gleaned from over two decades of proximal remote sensing at flux tower sites are reviewed. An organizing framework is the light-use efficiency (LUE) model, both because it is widely used, and because it provides a useful theoretical construct for integrating optical remote sensing with flux measurements. Multiple ways of driving this model, ranging from meteorological measurements to remote sensing, have emerged in recent years, making it a convenient conceptual framework for comparative experimental studies. New interpretations of established optical sampling methods, including the Photochemical Reflectance Index (PRI) and Solar-Induced Fluorescence (SIF), are discussed within the context of the LUE model. Multi-scale analysis across temporal and spatial axes is a central theme, because such scaling can provide links between ecophysiological mechanisms detectable at the level of individual organisms and broad patterns emerging at larger scales, enabling evaluation of emergent properties and extrapolation to the flux footprint and beyond. Proper analysis of sampling scale requires an awareness of sampling context that is often essential to the proper interpretation of optical signals. Additionally, the concept of optical types, vegetation exhibiting contrasting optical behavior in time and space, is explored as a way to frame our understanding of the controls on surface-atmosphere fluxes. Complementary NDVI and PRI patterns across ecosystems are offered as an example of this hypothesis, with the LUE model and light-response curve providing an integrating framework. We conclude that experimental approaches allowing systematic exploration of plant optical behavior in the context of the flux tower network provide a unique way to improve our understanding of environmental constraints and ecophysiological function. In addition to an enhanced mechanistic understanding of ecosystem processes, this integration of remote sensing with flux measurements offers many rich opportunities for upscaling, satellite validation, and informing practical management objectives ranging from assessing ecosystem health and productivity to quantifying biospheric carbon sequestration.
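For readers unfamiliar with the LUE formulation referenced above, a common generic form is GPP = PAR x fAPAR x epsilon, where the realized light-use efficiency epsilon is down-regulated under stress (one of the quantities PRI is used to track). The sketch below only illustrates that structure; the NDVI-to-fAPAR proxy, the PRI-to-stress mapping, and all numeric values are assumptions, not a parameterization used at any flux site.

```python
# Generic light-use efficiency (LUE) model sketch: GPP = PAR * fAPAR * epsilon.
def gpp_lue(par: float, fapar: float, eps_max: float, stress_scalar: float) -> float:
    """Gross primary productivity in units consistent with PAR and eps_max."""
    epsilon = eps_max * stress_scalar          # realized light-use efficiency
    return par * fapar * epsilon

# Toy drivers: fAPAR from NDVI and a stress scalar loosely tied to PRI (both assumed).
ndvi, pri = 0.75, -0.02
fapar = max(0.0, min(1.0, 1.2 * ndvi - 0.1))       # simple linear NDVI-fAPAR proxy (assumed)
stress = max(0.0, min(1.0, 0.5 + 10.0 * pri))      # toy mapping of PRI to a 0..1 scalar
print(gpp_lue(par=10.0, fapar=fapar, eps_max=1.8, stress_scalar=stress))
```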
NASA Technical Reports Server (NTRS)
Foale, C. Michael; Kaleri, Alexander Y.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Melton, Shannon; Martin, David; Dulchavsky, Scott A.
2004-01-01
The performance of complex tasks on the International Space Station (ISS) requires significant preflight crew training commitments and frequent skill and knowledge refreshment. This report documents a recently developed just-in-time training methodology, which integrates preflight hardware familiarization and procedure training with an on-orbit CD-ROM-based skill enhancement. This just-in-time concept was used to support real-time remote expert guidance to complete medical examinations using the ISS Human Research Facility (HRF). An American and Russian ISS crewmember received 2-hours of hands on ultrasound training 8 months prior to the on-orbit ultrasound exam. A CD-ROM-based Onboard Proficiency Enhancement (OPE) interactive multimedia program consisting of memory enhancing tutorials, and skill testing exercises, was completed by the crewmember six days prior to the on-orbit ultrasound exam. The crewmember was then remotely guided through a thoracic, vascular, and echocardiographic examination by ultrasound imaging experts. Results of the CD ROM based OPE session were used to modify the instructions during a complete 35 minute real-time thoracic, cardiac, and carotid/jugular ultrasound study. Following commands from the ground-based expert, the crewmember acquired all target views and images without difficulty. The anatomical content and fidelity of ultrasound video were excellent and adequate for clinical decision-making. Complex ultrasound experiments with expert guidance were performed with high accuracy following limited pre-flight training and CD-ROM-based in-flight review, despite a 2-second communication latency. In-flight application of multimedia proficiency enhancement software, coupled with real-time remote expert guidance, can facilitate the performance of complex demanding tasks.
A remote monitor of bed patient cardiac vibration, respiration and movement.
Mukai, Koji; Yonezawa, Yoshiharu; Ogawa, Hidekuni; Maki, Hiromichi; Caldwell, W Morton
2009-01-01
We have developed a remote system for monitoring heart rate, respiration rate and movement behavior of at-home elderly people who are living alone. The system consists of a 40 kHz ultrasonic transmitter and receiver, linear integrated circuits, a low-power 8-bit single chip microcomputer and an Internet server computer. The 40 kHz ultrasonic transmitter and receiver are installed into a bed mattress. The transmitted signal diffuses into the bed mattress, and the amplitude of the received ultrasonic wave is modulated by the shape of the mattress and parameters such as respiration, cardiac vibration and movement. The modulated ultrasonic signal is received and demodulated by an envelope detection circuit. Low, high and band pass filters separate the respiration, cardiac vibration and movement signals, which are fed into the microcontroller and digitized at a sampling rate of 50 Hz by 8-bit A/D converters. The digitized data are sent to the server computer as a serial signal. This computer stores the data and also creates a graphic chart of the latest hour. The person's family or caregiver can download this chart via the Internet at any time.
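The separation of respiration, cardiac vibration, and movement by low-, band-, and high-pass filtering, followed by 50 Hz sampling, can be mimicked digitally; the sketch below applies Butterworth filters to a synthetic envelope signal. The cut-off frequencies are typical assumptions for respiration and heart-rate bands, not the analog filter values used in the paper.

```python
# Sketch: digitally separating respiration and cardiac components from one
# demodulated envelope signal sampled at 50 Hz (rate stated in the abstract).
import numpy as np
from scipy.signal import butter, filtfilt

FS = 50.0  # Hz

def lowpass(x, hi, order=3):
    b, a = butter(order, hi, btype="lowpass", fs=FS)
    return filtfilt(b, a, x)

def bandpass(x, lo, hi, order=3):
    b, a = butter(order, [lo, hi], btype="bandpass", fs=FS)
    return filtfilt(b, a, x)

t = np.arange(0, 30, 1 / FS)
envelope = (np.sin(2 * np.pi * 0.25 * t)          # ~15 breaths/min component
            + 0.2 * np.sin(2 * np.pi * 1.2 * t)   # ~72 beats/min component
            + 0.05 * np.random.randn(t.size))     # broadband movement/noise

respiration = lowpass(envelope, 0.5)              # below 0.5 Hz (assumed cut-off)
cardiac = bandpass(envelope, 0.8, 3.0)            # 0.8-3 Hz band (assumed)
print(respiration.shape, cardiac.shape)
```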
[Modeling continuous scaling of NDVI based on fractal theory].
Luan, Hai-Jun; Tian, Qing-Jiu; Yu, Tao; Hu, Xin-Li; Huang, Yan; Du, Ling-Tong; Zhao, Li-Min; Wei, Xi; Han, Jie; Zhang, Zhou-Wei; Li, Shao-Peng
2013-07-01
Scale effect is one of the most important scientific problems in remote sensing. The scale effect in quantitative remote sensing can be used to study the relationship between retrievals from images of different resolutions, and its research has become an effective way to confront challenges such as the validation of quantitative remote sensing products. Traditional up-scaling methods cannot describe the scale-changing features of retrievals over an entire series of scales; meanwhile, they face serious parameter-correction issues because of the variation of imaging parameters among different sensors, such as geometric correction, spectral correction, etc. Utilizing a single-sensor image, a fractal methodology was employed to solve these problems. Taking NDVI (computed from land surface radiance) as an example and based on an Enhanced Thematic Mapper Plus (ETM+) image, a scheme was proposed to model continuous scaling of retrievals. The experimental results indicated that: (1) for NDVI, a scale effect exists, and it can be described by a fractal model of continuous scaling; (2) the fractal method is suitable for validation of NDVI. All of this proved that fractal analysis is an effective methodology for studying scaling in quantitative remote sensing.
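NDVI itself is the standard normalized difference of near-infrared and red measurements, NDVI = (NIR - Red) / (NIR + Red); the sketch below computes it on synthetic grids and includes a simple block-averaging step as one naive way of producing coarser-scale imagery (the aggregation factor and data are illustrative, not the study's up-scaling scheme).

```python
# NDVI from red and near-infrared grids, plus a naive block-average up-scaling step.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    return (nir - red) / (nir + red + 1e-9)      # small epsilon avoids divide-by-zero

def aggregate(img: np.ndarray, factor: int) -> np.ndarray:
    """Block-average an image by an integer factor (simple spatial degradation)."""
    h, w = (img.shape[0] // factor) * factor, (img.shape[1] // factor) * factor
    return img[:h, :w].reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(1)
red = rng.uniform(0.02, 0.2, (120, 120))
nir = rng.uniform(0.2, 0.5, (120, 120))
# Fine-scale mean NDVI vs. NDVI computed from aggregated bands differ: the scale effect.
print(ndvi(nir, red).mean(), ndvi(aggregate(nir, 4), aggregate(red, 4)).mean())
```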
NASA Astrophysics Data System (ADS)
Strotov, Valery V.; Taganov, Alexander I.; Konkin, Yuriy V.; Kolesenkov, Aleksandr N.
2017-10-01
The task of processing and analyzing Earth remote sensing data on board an ultra-small spacecraft is relevant, taking into consideration the significant energy expenditure of data transfer and the low productivity of onboard computers. Thereby, there is an issue of effective and reliable storage of the general information flow obtained from onboard data collection systems, including Earth remote sensing data, in a specialized database. The paper considers peculiarities of database management system operation with a multilevel memory structure. For data storage, a format has been developed that describes the physical structure of the database and contains the parameters required for information loading. Such a structure allows reducing the memory size occupied by the database because it is not necessary to store key values separately. The paper shows the architecture of a relational database management system designed to be embedded in the onboard software of an ultra-small spacecraft. A database for storage of different information, including Earth remote sensing data, can be developed by means of such a database management system for subsequent processing. The suggested database management system architecture places low requirements on the computing power and memory resources available on board an ultra-small spacecraft. Data integrity is ensured during input and modification of the structured information.
Dual-contrast agent photon-counting computed tomography of the heart: initial experience.
Symons, Rolf; Cork, Tyler E; Lakshmanan, Manu N; Evers, Robert; Davies-Venn, Cynthia; Rice, Kelly A; Thomas, Marvin L; Liu, Chia-Ying; Kappler, Steffen; Ulzheimer, Stefan; Sandfort, Veit; Bluemke, David A; Pourmorteza, Amir
2017-08-01
To determine the feasibility of dual-contrast agent imaging of the heart using photon-counting detector (PCD) computed tomography (CT) to simultaneously assess both first-pass and late enhancement of the myocardium. An occlusion-reperfusion canine model of myocardial infarction was used. Gadolinium-based contrast was injected 10 min prior to PCD CT. Iodinated contrast was infused immediately prior to PCD CT, thus capturing late gadolinium enhancement as well as first-pass iodine enhancement. Gadolinium and iodine maps were calculated using a linear material decomposition technique and compared to single-energy (conventional) images. PCD images were compared to in vivo and ex vivo magnetic resonance imaging (MRI) and histology. For infarct versus remote myocardium, contrast-to-noise ratio (CNR) was maximal on late enhancement gadolinium maps (CNR 9.0 ± 0.8, 6.6 ± 0.7, and 0.4 ± 0.4, p < 0.001 for gadolinium maps, single-energy images, and iodine maps, respectively). For infarct versus blood pool, CNR was maximum for iodine maps (CNR 11.8 ± 1.3, 3.8 ± 1.0, and 1.3 ± 0.4, p < 0.001 for iodine maps, gadolinium maps, and single-energy images, respectively). Combined first-pass iodine and late gadolinium maps allowed quantitative separation of blood pool, scar, and remote myocardium. MRI and histology analysis confirmed accurate PCD CT delineation of scar. Simultaneous multi-contrast agent cardiac imaging is feasible with photon-counting detector CT. These initial proof-of-concept results may provide incentives to develop new k-edge contrast agents, to investigate possible interactions between multiple simultaneously administered contrast agents, and to ultimately bring them to clinical practice.
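The contrast-to-noise ratios quoted above are conventionally computed from region-of-interest statistics; one common generic definition divides the absolute difference of ROI means by the standard deviation of a reference region. The sketch below uses that definition with synthetic values and is not necessarily the exact noise definition used in the study.

```python
# Generic contrast-to-noise ratio (CNR) between two regions of interest,
# e.g. infarct vs. remote myocardium on a gadolinium map. Noise here is taken
# as the reference ROI's standard deviation, one common (but not unique) choice.
import numpy as np

def cnr(roi_a: np.ndarray, roi_ref: np.ndarray) -> float:
    return abs(roi_a.mean() - roi_ref.mean()) / roi_ref.std(ddof=1)

rng = np.random.default_rng(2)
infarct = rng.normal(310.0, 18.0, 500)   # illustrative pixel values only
remote = rng.normal(150.0, 18.0, 500)
print(f"CNR = {cnr(infarct, remote):.1f}")
```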
2011-01-01
Background Based on barriers to the use of computerized clinical decision support (CDS) learned in an earlier field study, we prototyped design enhancements to the Veterans Health Administration's (VHA's) colorectal cancer (CRC) screening clinical reminder to compare against the VHA's current CRC reminder. Methods In a controlled simulation experiment, 12 primary care providers (PCPs) used prototypes of the current and redesigned CRC screening reminder in a within-subject comparison. Quantitative measurements were based on a usability survey, workload assessment instrument, and workflow integration survey. We also collected qualitative data on both designs. Results Design enhancements to the VHA's existing CRC screening clinical reminder positively impacted aspects of usability and workflow integration but not workload. The qualitative analysis revealed broad support across participants for the design enhancements with specific suggestions for improving the reminder further. Conclusions This study demonstrates the value of a human-computer interaction evaluation in informing the redesign of information tools to foster uptake, integration into workflow, and use in clinical practice. PMID:22126324
NASA Technical Reports Server (NTRS)
Rediess, Herman A.; Hewett, M. D.
1991-01-01
The requirements are assessed for the use of remote computation to support HRV flight testing. First, remote computational requirements were developed to support functions that will eventually be performed onboard operational vehicles of this type. These are functions that either cannot be performed onboard in the time frame of initial HRV flight test programs, because the technology of airborne computers will not be sufficiently advanced to support the required computational loads, or that are not desirable to perform onboard in the flight test program for other reasons. Second, remote computational support either required or highly desirable for conducting the flight testing itself was addressed. The use of an Automated Flight Management System is proposed and described in conceptual detail. Third, autonomous operations are discussed and, finally, unmanned operations.
NASA Technical Reports Server (NTRS)
Rickman, Douglas
2008-01-01
Remote sensing is measuring something without touching it. Most methods measure a portion of the electro-magnetic spectrum using energy reflected from or emitted by a material. Moving the instrument away makes it easier to see more at one time. Airplanes are good but satellites are much better. Many things can not be easily measured on the scale of an individual person. Example - measuring all the vegetation growing at one time in even the smallest country. A satellite can see things over large areas repeatedly and in a consistent way. Data from the detector is reported as digital values for a grid that covers some portion of the Earth. Because it is digital and consistent a computer can extract information or enhance the data for a specific purpose.
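As a concrete, deliberately simple example of "enhancing the data for a specific purpose", the sketch below applies a 2nd-98th percentile linear contrast stretch to a digital grid of the kind described above; the scene data are synthetic and the percentile choice is an assumption.

```python
# Simple linear contrast stretch of a digital image grid (2nd-98th percentile),
# a routine enhancement applied to remote sensing scenes; data are synthetic.
import numpy as np

def contrast_stretch(grid: np.ndarray, lo_pct: float = 2, hi_pct: float = 98) -> np.ndarray:
    lo, hi = np.percentile(grid, [lo_pct, hi_pct])
    stretched = np.clip((grid - lo) / (hi - lo), 0.0, 1.0)
    return (stretched * 255).astype(np.uint8)   # rescale to 8-bit display values

scene = np.random.default_rng(3).gamma(2.0, 20.0, (256, 256))  # dull, low-contrast grid
enhanced = contrast_stretch(scene)
print(scene.min(), scene.max(), "->", enhanced.min(), enhanced.max())
```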
Performance analysis of routing protocols for IoT
NASA Astrophysics Data System (ADS)
Manda, Sridhar; Nalini, N.
2018-04-01
The Internet of Things (IoT) is an interdisciplinary arrangement of technologies used to achieve an effective combination of physical and digital things. With IoT, physical things can have their own virtual identities and participate in distributed computing. Realization of IoT requires sensors appropriate to the sector in which IoT is deployed; for instance, in the healthcare domain, IoT needs to integrate with wearable sensors used by patients. As sensor devices produce huge amounts of data, often called big data, efficient routing protocols should be in place. As far as wireless networks are concerned, there are existing protocols such as OLSR, DSR and AODV. The paper also sheds light on the trust-based Routing Protocol for Low-power and Lossy networks (TRPL) for IoT. These are widely used wireless routing protocols. As IoT adoption is around the corner, it is essential to investigate routing protocols and evaluate their performance in terms of throughput, end-to-end delay, and routing overhead. These performance insights can help in making well-informed decisions when integrating wireless networks with IoT. In this paper, we analyzed different routing protocols and compared their performance. It was found that AODV showed better performance than the other routing protocols mentioned.
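The three metrics named above are typically derived from simulation trace files; the sketch below shows one generic way to compute them from a list of packet records. The record schema (kind, timestamps, size) is an assumption for illustration, not any particular simulator's trace format.

```python
# Sketch: throughput, average end-to-end delay, and routing overhead from a
# hypothetical packet trace (schema is assumed for illustration).
from dataclasses import dataclass
from typing import Optional

@dataclass
class Packet:
    kind: str                  # "data" or "control"
    sent_at: float             # seconds
    recv_at: Optional[float]   # None if the packet was dropped
    size: int                  # bytes

def metrics(trace: list, duration: float):
    delivered = [p for p in trace if p.kind == "data" and p.recv_at is not None]
    control = [p for p in trace if p.kind == "control"]
    throughput_bps = 8 * sum(p.size for p in delivered) / duration
    avg_delay = sum(p.recv_at - p.sent_at for p in delivered) / max(len(delivered), 1)
    overhead = len(control) / max(len(delivered), 1)   # control packets per delivered data packet
    return throughput_bps, avg_delay, overhead

trace = [Packet("data", 0.0, 0.04, 512), Packet("data", 0.1, None, 512),
         Packet("control", 0.0, 0.01, 64), Packet("control", 0.2, 0.21, 64)]
print(metrics(trace, duration=1.0))
```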
He, Longjun; Ming, Xing; Liu, Qian
2014-04-01
With computing capability and display size growing, the mobile device has been used as a tool to help clinicians view patient information and medical images anywhere and anytime. However, for direct interactive 3D visualization, which plays an important role in radiological diagnosis, the mobile device cannot provide a satisfactory quality of experience for radiologists. This paper presents a medical system that can deliver medical images from the picture archiving and communication system (PACS) to the mobile device over a wireless network. In the proposed application, the mobile device obtains patient information and medical images through a proxy server connected to the PACS server. Meanwhile, the proxy server integrates a range of 3D visualization techniques, including maximum intensity projection, multi-planar reconstruction and direct volume rendering, to provide shape, brightness, depth and location information generated from the original sectional images for radiologists. Furthermore, an algorithm that changes remote render parameters automatically to adapt to the network status was employed to improve the quality of experience. Finally, performance issues regarding the remote 3D visualization of the medical images over the wireless network in the proposed application were also discussed. The results demonstrated that the proposed medical application can provide a smooth interactive experience over WLAN and 3G networks.
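Of the server-side techniques listed above, maximum intensity projection (MIP) is the simplest to illustrate: project a volume along one axis by taking the maximum voxel value. The sketch below does this on a synthetic array standing in for real DICOM data.

```python
# Maximum intensity projection (MIP) of a 3D volume along one axis.
import numpy as np

def mip(volume: np.ndarray, axis: int = 0) -> np.ndarray:
    """Return a 2D maximum intensity projection of a 3D volume."""
    return volume.max(axis=axis)

volume = np.random.default_rng(4).normal(0.0, 1.0, (64, 256, 256))
volume[:, 100:110, 120:130] += 6.0         # bright synthetic "vessel" for contrast
projection = mip(volume, axis=0)           # axial MIP, shape (256, 256)
print(projection.shape, projection.max())
```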
Facilitating preemptive hardware system design using partial reconfiguration techniques.
Dondo Gazzano, Julio; Rincon, Fernando; Vaderrama, Carlos; Villanueva, Felix; Caba, Julian; Lopez, Juan Carlos
2014-01-01
In FPGA-based control system design, partial reconfiguration is especially well suited to implement preemptive systems. In real-time systems, the deadline of a critical task can compel the preemption of a noncritical one. Besides, an asynchronous event can demand immediate attention and then force the launch of a reconfiguration process for high-priority task implementation. If the asynchronous event is previously scheduled, an explicit activation of the reconfiguration process is performed. If the event cannot be previously programmed, such as in dynamically scheduled systems, an implicit activation of the reconfiguration process is demanded. This paper provides a hardware-based approach to explicit and implicit activation of the partial reconfiguration process in dynamically reconfigurable SoCs and includes all the necessary tasks to cope with this issue. Furthermore, the reconfiguration service introduced in this work allows remote invocation of the reconfiguration process and thus the remote integration of off-chip components. A model that offers component location transparency is also presented to enhance and facilitate system integration.
Facilitating Preemptive Hardware System Design Using Partial Reconfiguration Techniques
Rincon, Fernando; Vaderrama, Carlos; Villanueva, Felix; Caba, Julian; Lopez, Juan Carlos
2014-01-01
In FPGA-based control system design, partial reconfiguration is especially well suited to implement preemptive systems. In real-time systems, the deadline of a critical task can compel the preemption of a noncritical one. Besides, an asynchronous event can demand immediate attention and then force the launch of a reconfiguration process for high-priority task implementation. If the asynchronous event is previously scheduled, an explicit activation of the reconfiguration process is performed. If the event cannot be previously programmed, such as in dynamically scheduled systems, an implicit activation of the reconfiguration process is demanded. This paper provides a hardware-based approach to explicit and implicit activation of the partial reconfiguration process in dynamically reconfigurable SoCs and includes all the necessary tasks to cope with this issue. Furthermore, the reconfiguration service introduced in this work allows remote invocation of the reconfiguration process and thus the remote integration of off-chip components. A model that offers component location transparency is also presented to enhance and facilitate system integration. PMID:24672292
Web-based interactive visualization in a Grid-enabled neuroimaging application using HTML5.
Siewert, René; Specovius, Svenja; Wu, Jie; Krefting, Dagmar
2012-01-01
Interactive visualization and correction of intermediate results are required in many medical image analysis pipelines. To allow certain interaction in the remote execution of compute- and data-intensive applications, new features of HTML5 are used. They allow for transparent integration of user interaction into Grid- or Cloud-enabled scientific workflows. Both 2D and 3D visualization and data manipulation can be performed through a scientific gateway without the need to install specific software or web browser plugins. The possibilities of web-based visualization are presented along the FreeSurfer-pipeline, a popular compute- and data-intensive software tool for quantitative neuroimaging.
A model of the temporal dynamics of multisensory enhancement
Rowland, Benjamin A.; Stein, Barry E.
2014-01-01
The senses transduce different forms of environmental energy, and the brain synthesizes information across them to enhance responses to salient biological events. We hypothesize that the potency of multisensory integration is attributable to the convergence of independent and temporally aligned signals derived from cross-modal stimulus configurations onto multisensory neurons. The temporal profile of multisensory integration in neurons of the deep superior colliculus (SC) is consistent with this hypothesis. The responses of these neurons to visual, auditory, and combinations of visual–auditory stimuli reveal that multisensory integration takes place in real-time; that is, the input signals are integrated as soon as they arrive at the target neuron. Interactions between cross-modal signals may appear to reflect linear or nonlinear computations on a moment-by-moment basis, the aggregate of which determines the net product of multisensory integration. Modeling observations presented here suggest that the early nonlinear components of the temporal profile of multisensory integration can be explained with a simple spiking neuron model, and do not require more sophisticated assumptions about the underlying biology. A transition from nonlinear “super-additive” computation to linear, additive computation can be accomplished via scaled inhibition. The findings provide a set of design constraints for artificial implementations seeking to exploit the basic principles and potency of biological multisensory integration in contexts of sensory substitution or augmentation. PMID:24374382
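To make the "simple spiking neuron model" point concrete, the toy leaky integrate-and-fire sketch below shows how two individually subthreshold, temporally aligned inputs can elicit spikes only when combined, i.e. a super-additive response (combined count exceeds the sum of the unisensory counts). The parameters are arbitrary and not fitted to superior colliculus data.

```python
# Toy leaky integrate-and-fire (LIF) neuron illustrating super-additive
# multisensory enhancement: each unisensory input alone stays below threshold,
# but their aligned sum crosses it. All parameters are arbitrary assumptions.
import numpy as np

def lif_spike_count(current: np.ndarray, dt=1e-3, tau=0.02, r=1.0, v_th=1.0) -> int:
    v, spikes = 0.0, 0
    for i_t in current:
        v += dt / tau * (-v + r * i_t)   # leaky integration of the input current
        if v >= v_th:
            spikes += 1
            v = 0.0                      # reset after a spike
    return spikes

t = np.arange(0, 0.5, 1e-3)
visual = 0.7 * ((t > 0.1) & (t < 0.4))   # subthreshold alone (steady state 0.7 < v_th)
auditory = 0.7 * ((t > 0.1) & (t < 0.4))
v_only, a_only = lif_spike_count(visual), lif_spike_count(auditory)
combined = lif_spike_count(visual + auditory)
print(v_only, a_only, combined, "-> super-additive:", combined > v_only + a_only)
```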
DOE Office of Scientific and Technical Information (OSTI.GOV)
Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.
This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurements and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and performing of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
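The data-centric idea can be illustrated with a toy in-memory bus in which subscribers express interest in named topics and late joiners receive the last published sample. This is a conceptual sketch only; it does not use the DDS API, and the topic names are hypothetical.

```python
# Toy illustration of a data-centric bus: subscribers register interest in
# topics (data), publishers update topic values, and late joiners can read
# the last published sample. This is NOT the DDS API; names are hypothetical.
from collections import defaultdict

class DataBus:
    def __init__(self):
        self._last_value = {}                  # last-value cache per topic
        self._subscribers = defaultdict(list)  # topic -> callbacks

    def subscribe(self, topic, callback):
        self._subscribers[topic].append(callback)
        if topic in self._last_value:          # late joiner sees current state
            callback(self._last_value[topic])

    def publish(self, topic, sample):
        self._last_value[topic] = sample
        for cb in self._subscribers[topic]:
            cb(sample)

bus = DataBus()
bus.publish("feeder1/voltage", {"v_rms": 119.8})
# A controller that joins later still sees the latest measurement.
bus.subscribe("feeder1/voltage", lambda s: print("controller got", s))
bus.publish("feeder1/voltage", {"v_rms": 120.3})
```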
TCR backscattering characterization for microwave remote sensing
NASA Astrophysics Data System (ADS)
Riccio, Giovanni; Gennarelli, Claudio
2014-05-01
A Trihedral Corner Reflector (TCR) is formed by three mutually orthogonal metal plates of various shapes and is a very important scattering structure since it exhibits a high monostatic Radar Cross Section (RCS) over a wide angular range. Moreover, it is a handy passive device with low manufacturing cost and robust geometric construction; maintaining its efficiency is neither difficult nor expensive, and it can be used in all weather conditions (i.e., fog, rain, smoke, and dusty environments). These characteristics make it suitable as a reference target and radar enhancement device for satellite- and ground-based microwave remote sensing techniques. For instance, TCRs have recently been employed to improve the signal-to-noise ratio of the backscattered signal in urban ground deformation monitoring [1] and in the dynamic survey of civil infrastructures without natural corners, such as the Musmeci bridge in Basilicata, Italy [2]. The region of interest for the calculation of the TCR's monostatic RCS is here confined to the first quadrant containing the boresight direction. The backscattering term is presented in closed form by evaluating the far-field scattering integral involving the contributions related to the direct illumination and the internal bouncing mechanisms. The Geometrical Optics (GO) laws allow one to determine the field incident on each TCR plate and the patch (integration domain) illuminated by it, thus enabling the use of a Physical Optics (PO) approximation for the corresponding surface current densities to be integrated on each patch. Accordingly, five contributions are associated with each TCR plate: one contribution is due to the direct illumination of the whole internal surface; two contributions originate from the impinging rays that are simply reflected by the other two internal surfaces; and two contributions are related to the impinging rays that undergo two internal reflections. It is useful to note that the six contributions (two per plate) due to the doubly reflected rays define the leading term in the angular region around the boresight direction. The validity of the approach is well assessed by comparisons with experimental results, and its formulation is computationally inexpensive since it is in closed form. Moreover, it is preferable to a model using near-field PO integrations to describe the interactions between the internal TCR faces, since the latter requires the evaluation of multi-dimensional integrals, i.e., the expression of the final incident field contains a two-dimensional integral for each previous interaction. [1] Y. Qin, D. Perissin, and L. Lei, "The Design and Experiments on Corner Reflectors for Urban Ground Deformation Monitoring in Hong Kong," Int. J. Antennas Propagat., vol. 2013, pp. 1-8. [2] T. A. Stabile, A. Perrone, M. R. Gallipoli, R. Ditommaso, and F. C. Ponzo, "Dynamic Survey of the Musmeci Bridge by Joint Application of Ground-Based Microwave Radar Interferometry and Ambient Noise Standard Spectral Ratio Techniques," IEEE Geosci. Remote Sens. Lett., vol. 10, no. 4, pp. 870-874, 2013.
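For orientation on the magnitudes involved, the standard textbook estimate of the peak monostatic RCS of a triangular trihedral corner reflector with inner edge length a is sigma_max = 4*pi*a^4 / (3*lambda^2); this is the usual reference-target figure, not the closed-form solution described above.

```python
# Peak monostatic RCS of a triangular trihedral corner reflector
# (standard textbook expression, not the closed-form solution of this paper):
#     sigma_max = 4 * pi * a**4 / (3 * lambda**2)
# where a is the inner edge length and lambda the radar wavelength.
import math

def trihedral_peak_rcs(a_m, freq_hz, c=3.0e8):
    lam = c / freq_hz
    return 4.0 * math.pi * a_m**4 / (3.0 * lam**2)

sigma = trihedral_peak_rcs(a_m=1.0, freq_hz=5.4e9)   # C-band radar, 1 m edge
print(f"peak RCS ~ {sigma:.0f} m^2 ({10 * math.log10(sigma):.1f} dBsm)")
```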
Virtual collaborative environments: programming and controlling robotic devices remotely
NASA Astrophysics Data System (ADS)
Davies, Brady R.; McDonald, Michael J., Jr.; Harrigan, Raymond W.
1995-12-01
This paper describes a technology for remote sharing of intelligent electro-mechanical devices. An architecture and actual system have been developed and tested, based on the proposed National Information Infrastructure (NII) or Information Highway, to facilitate programming and control of intelligent programmable machines (such as robots, machine tools, etc.). Using appropriate geometric models, integrated sensors, video systems, and computing hardware, computer-controlled resources owned and operated by different entities (in a geographic as well as a legal sense) can be individually or simultaneously programmed and controlled from one or more remote locations. Remote programming and control of intelligent machines will create significant opportunities for sharing of expensive capital equipment. Using the technology described in this paper, university researchers, manufacturing entities, automation consultants, design entities, and others can directly access robotic and machining facilities located across the country. Disparate electro-mechanical resources will be shared in a manner similar to the way supercomputers are accessed by multiple users. Using this technology, it will be possible for researchers developing new robot control algorithms to validate models and algorithms right from their university labs without ever owning a robot. Manufacturers will be able to model, simulate, and measure the performance of prospective robots before selecting robot hardware optimally suited for their intended application. Designers will be able to access CNC machining centers across the country to fabricate prototype parts during product design validation. A prototype architecture and system have been developed and proven. Programming and control of a large gantry robot located at Sandia National Laboratories in Albuquerque, New Mexico, were demonstrated from such remote locations as Washington, D.C., Washington State, and Southern California.
Computers and Individualized Instruction: Moving to Alternative Learning Environments.
ERIC Educational Resources Information Center
Robbat, Richard J.
The overall focus of this booklet is on planning for change that allows for integration of computers into articulated learning environments that will enhance the learning goal of students. The first chapter presents four major themes to increase the likelihood of combining computers and individualized instruction in schools: (1) a revitalized form…
Integration of Computer Technology Into an Introductory-Level Neuroscience Laboratory
ERIC Educational Resources Information Center
Evert, Denise L.; Goodwin, Gregory; Stavnezer, Amy Jo
2005-01-01
We describe 3 computer-based neuroscience laboratories. In the first 2 labs, we used commercially available interactive software to enhance the study of functional and comparative neuroanatomy and neurophysiology. In the remaining lab, we used customized software and hardware in 2 psychophysiological experiments. With the use of the computer-based…
Integrated exhaust gas analysis system for aircraft turbine engine component testing
NASA Technical Reports Server (NTRS)
Summers, R. L.; Anderson, R. C.
1985-01-01
An integrated exhaust gas analysis system was designed and installed in the hot-section facility at the Lewis Research Center. The system is designed to operate either manually or automatically and also to be operated from a remote station. The system measures oxygen, water vapor, total hydrocarbons, carbon monoxide, carbon dioxide, and oxides of nitrogen. Two microprocessors control the system and the analyzers, collect data and process them into engineering units, and present the data to the facility computers and the system operator. Within the design of this system there are innovative concepts and procedures that are of general interest and application to other gas analysis tasks.
Information recovery through image sequence fusion under wavelet transformation
NASA Astrophysics Data System (ADS)
He, Qiang
2010-04-01
Remote sensing is widely applied to provide information about areas with limited ground access, with applications such as assessing the destruction from natural disasters and planning relief and recovery operations. However, the collection of aerial digital images is constrained by bad weather, atmospheric conditions, and unstable cameras or camcorders. Therefore, recovering information from low-quality remote sensing images and enhancing image quality become very important for many visual understanding tasks, such as feature detection, object segmentation, and object recognition. The quality of remote sensing imagery can be improved through information fusion, the meaningful combination of images captured by different sensors or under different conditions. Here we particularly address information fusion of remote sensing image sequences under multi-resolution analysis. Image fusion recovers complete information by integrating multiple images captured of the same scene. Through image fusion, a new image with higher resolution, and more useful to human and machine perception, is created from a time series of low-quality images based on image registration between different video frames.
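A minimal sketch of one common wavelet-domain fusion rule (average the approximation band, keep the larger-magnitude detail coefficients) is shown below, assuming PyWavelets; it is not the specific multi-resolution method of this work.

```python
# Minimal wavelet-domain fusion of two co-registered images: average the
# approximation band, keep the larger-magnitude detail coefficients.
# Illustrative only (single-level Haar); not the paper's specific method.
import numpy as np
import pywt

def fuse_pair(img_a, img_b, wavelet="haar"):
    cA_a, (cH_a, cV_a, cD_a) = pywt.dwt2(img_a, wavelet)
    cA_b, (cH_b, cV_b, cD_b) = pywt.dwt2(img_b, wavelet)
    fused_cA = 0.5 * (cA_a + cA_b)
    pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
    fused_details = (pick(cH_a, cH_b), pick(cV_a, cV_b), pick(cD_a, cD_b))
    return pywt.idwt2((fused_cA, fused_details), wavelet)

# Two noisy views of the same 64x64 scene (synthetic stand-ins for video frames).
rng = np.random.default_rng(0)
scene = np.outer(np.hanning(64), np.hanning(64))
frame1 = scene + 0.05 * rng.standard_normal(scene.shape)
frame2 = scene + 0.05 * rng.standard_normal(scene.shape)
print(fuse_pair(frame1, frame2).shape)   # (64, 64)
```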
Flow Ambiguity: A Path Towards Classically Driven Blind Quantum Computation
NASA Astrophysics Data System (ADS)
Mantri, Atul; Demarie, Tommaso F.; Menicucci, Nicolas C.; Fitzsimons, Joseph F.
2017-07-01
Blind quantum computation protocols allow a user to delegate a computation to a remote quantum computer in such a way that the privacy of their computation is preserved, even from the device implementing the computation. To date, such protocols are only known for settings involving at least two quantum devices: either a user with some quantum capabilities and a remote quantum server or two or more entangled but noncommunicating servers. In this work, we take the first step towards the construction of a blind quantum computing protocol with a completely classical client and single quantum server. Specifically, we show how a classical client can exploit the ambiguity in the flow of information in measurement-based quantum computing to construct a protocol for hiding critical aspects of a computation delegated to a remote quantum computer. This ambiguity arises due to the fact that, for a fixed graph, there exist multiple choices of the input and output vertex sets that result in deterministic measurement patterns consistent with the same fixed total ordering of vertices. This allows a classical user, computing only measurement angles, to drive a measurement-based computation performed on a remote device while hiding critical aspects of the computation.
Achievements and Challenges in Computational Protein Design.
Samish, Ilan
2017-01-01
Computational protein design (CPD), a yet evolving field, includes computer-aided engineering for partial or full de novo designs of proteins of interest. Designs are defined by a requested structure, function, or working environment. This chapter describes the birth and maturation of the field by presenting 101 CPD examples in a chronological order emphasizing achievements and pending challenges. Integrating these aspects presents the plethora of CPD approaches with the hope of providing a "CPD 101". These reflect on the broader structural bioinformatics and computational biophysics field and include: (1) integration of knowledge-based and energy-based methods, (2) hierarchical designated approach towards local, regional, and global motifs and the integration of high- and low-resolution design schemes that fit each such region, (3) systematic differential approaches towards different protein regions, (4) identification of key hot-spot residues and the relative effect of remote regions, (5) assessment of shape-complementarity, electrostatics and solvation effects, (6) integration of thermal plasticity and functional dynamics, (7) negative design, (8) systematic integration of experimental approaches, (9) objective cross-assessment of methods, and (10) successful ranking of potential designs. Future challenges also include dissemination of CPD software to the general use of life-sciences researchers and the emphasis of success within an in vivo milieu. CPD increases our understanding of protein structure and function and the relationships between the two along with the application of such know-how for the benefit of mankind. Applied aspects range from biological drugs, via healthier and tastier food products to nanotechnology and environmentally friendly enzymes replacing toxic chemicals utilized in the industry.
The Integrated Sounding System: Description and Preliminary Observations from TOGA COARE.
NASA Astrophysics Data System (ADS)
Parsons, David; Dabberdt, Walter; Cole, Harold; Hock, Terrence; Martin, Charles; Barrett, Anne-Leslie; Miller, Erik; Spowart, Michael; Howard, Michael; Ecklund, Warner; Carter, David; Gage, Kenneth; Wilson, John
1994-04-01
An Integrated Sounding System (ISS) that combines state-of-the-art remote and in situ sensors into a single transportable facility has been developed jointly by the National Center for Atmospheric Research (NCAR) and the Aeronomy Laboratory of the National Oceanic and Atmospheric Administration (NOAA/AL). The instrumentation for each ISS includes a 915-MHz wind profiler, a Radio Acoustic Sounding System (RASS), an Omega-based NAVAID sounding system, and an enhanced surface meteorological station. The general philosophy behind the ISS is that the integration of various measurement systems overcomes each system's respective limitations while taking advantage of its positive attributes. The individual observing systems within the ISS provide high-level data products to a central workstation that manages and integrates these measurements. The ISS software package performs a wide range of functions: real-time data acquisition, database support, and graphical displays; data archival and communications; and operational and post-time analysis. The first deployment of the ISS consists of six sites in the western tropical Pacific: four land-based deployments and two ship-based deployments. The sites serve the Coupled Ocean-Atmosphere Response Experiment (COARE) of the Tropical Ocean and Global Atmosphere (TOGA) program and TOGA's enhanced atmospheric monitoring effort. Examples of ISS data taken during this deployment are shown in order to demonstrate the capabilities of this new sounding system and to demonstrate the performance of these in situ and remote sensing instruments in a moist tropical environment. In particular, a strong convective outflow with a pronounced impact on the atmospheric boundary layer and on heat fluxes from the ocean surface was examined with a shipboard ISS. If these strong outflows commonly occur, they may prove to be an important component of the surface energy budget of the western tropical Pacific.
The Vendors' Corner: Biblio-Techniques' Library and Information System (BLIS).
ERIC Educational Resources Information Center
Library Software Review, 1984
1984-01-01
Describes online catalog and integrated library computer system designed to enhance Washington Library Network's software. Highlights include system components; implementation options; system features (integrated library functions, database design, system management facilities); support services (installation and training, software maintenance and…
NASA Astrophysics Data System (ADS)
Wang, Xiao; Zhang, Tian-Bao; Yang, Wen; Zhu, Hao; Chen, Lin; Sun, Qing-Qing; Zhang, David Wei
2017-01-01
The effective and high-quality integration of high-k dielectrics on two-dimensional (2D) crystals is essential to the device structure engineering and performance improvement of field-effect transistors (FETs) based on 2D semiconductors. We report a 2D MoS2 transistor with an ultra-thin Al2O3 top-gate dielectric (6.1 nm) and extremely low leakage current. Remote forming gas plasma pretreatment was carried out prior to the atomic layer deposition, providing nucleation sites via physically adsorbed ions on the MoS2 surface. The top-gate MoS2 FET exhibited excellent electrical performance, including a high on/off current ratio over 10⁹, a subthreshold swing of 85 mV/decade, and a field-effect mobility of 45.03 cm²/(V·s). A top-gate leakage current of less than 0.08 pA/μm² at 4 MV/cm has been obtained, which is the smallest among reported top-gated MoS2 transistors. Such an optimized integration of a high-k dielectric in a 2D semiconductor FET with enhanced performance is very attractive, and it paves the way towards the realization of more advanced 2D nanoelectronic devices and integrated circuits.
Nelson, Kurtis J.; Long, Donald G.; Connot, Joel A.
2016-02-29
The Landscape Fire and Resource Management Planning Tools (LANDFIRE) 2010 data release provides updated and enhanced vegetation, fuel, and fire regime layers consistently across the United States. The data represent landscape conditions from approximately 2010 and are the latest release in a series of planned updates to maintain currency of LANDFIRE data products. Enhancements to the data products included refinement of urban areas by incorporating the National Land Cover Database 2006 land cover product, refinement of agricultural lands by integrating the National Agriculture Statistics Service 2011 cropland data layer, and improved wetlands delineations using the National Land Cover Database 2006 land cover and the U.S. Fish and Wildlife Service National Wetlands Inventory data. Disturbance layers were generated for years 2008 through 2010 using remotely sensed imagery, polygons representing disturbance events submitted by local organizations, and fire mapping program data such as the Monitoring Trends in Burn Severity perimeters produced by the U.S. Geological Survey and the U.S. Forest Service. Existing vegetation data were updated to account for transitions in disturbed areas and to account for vegetation growth and succession in undisturbed areas. Surface and canopy fuel data were computed from the updated vegetation type, cover, and height and occasionally from potential vegetation. Historical fire frequency and succession classes were also updated. Revised topographic layers were created based on updated elevation data from the National Elevation Dataset. The LANDFIRE program also released a new Web site offering updated content, enhanced usability, and more efficient navigation.
Study of turbine bypass remote augmentor lift system for V/STOL aircraft
NASA Technical Reports Server (NTRS)
Sheridan, A. E.
1985-01-01
The airframe design and engine/aircraft integration were emphasized in a NASA comparative study of turbofan and turbine bypass engines (TBE) with remote augmentor lift systems (RALS) for supersonic V/STOL aircraft. Functional features of the TBE are reviewed, noting the enhanced cycle efficiency and reduced afterbody drag compared to turbojets. The present studies examined performance levels for aircraft with fleet defense and secondary anti-surface warfare roles, carrying AMRAAM and AIM missiles. TBE engine cycles were configured for hover and up-and-away flight from deck launch, and all studies were conducted from a conceptual design viewpoint. The results indicate that the TBE-RALS aircraft is superior to the turbofan-RALS aircraft in both gross take-off weight and life cycle cost.
Running VisIt Software on the Peregrine System | High-Performance Computing
VisIt features a robust remote visualization capability: it can be started on a local machine and used to visualize data on a remote compute cluster. The VisIt module must be loaded on the remote machine (via 'module load') as part of this process to enable remote visualization.
Is This Real Life? Is This Just Fantasy?: Realism and Representations in Learning with Technology
NASA Astrophysics Data System (ADS)
Sauter, Megan Patrice
Students often engage in hands-on activities during science learning; however, financial and practical constraints often limit the availability of these activities. Recent advances in technology have led to increases in the use of simulations and remote labs, which attempt to recreate hands-on science learning via computer. Remote labs and simulations are interesting from a cognitive perspective because they allow for different relations between representations and their referents. Remote labs are unique in that they provide a yoked representation, meaning that the representation of the lab on the computer screen is actually linked to that which it represents: a real scientific device. Simulations merely represent the lab and are not connected to any real scientific devices. However, the type of visual representations used in the lab may modify the effects of the lab technology. The purpose of this dissertation is to examine the relation between representation and technology and its effects on students' psychological experiences when using online science labs. Undergraduates participated in two studies that investigated the relation between technology and representation. In the first study, participants performed either a remote lab or a simulation incorporating one of two visual representations, either a static image or a video of the equipment. Although participants in both lab conditions learned, participants in the remote lab condition had more authentic experiences. However, effects were moderated by the realism of the visual representation. Participants who saw a video were more invested and felt the experience was more authentic. In a second study, participants performed a remote lab and either saw the same video as in the first study, an animation, or the video and an animation. Most participants had an authentic experience because both representations evoked strong feelings of presence. However, participants who saw the video were more likely to believe the remote technology was real. Overall, the findings suggest that participants' experiences with technology were shaped by representation. Students had more authentic experiences using the remote lab than the simulation. However, incorporating visual representations that enhance presence made these experiences even more authentic and meaningful than those afforded by the technology alone.
The flight robotics laboratory
NASA Technical Reports Server (NTRS)
Tobbe, Patrick A.; Williamson, Marlin J.; Glaese, John R.
1988-01-01
The Flight Robotics Laboratory of the Marshall Space Flight Center is described in detail. This facility, containing an eight degree of freedom manipulator, precision air bearing floor, teleoperated motion base, reconfigurable operator's console, and VAX 11/750 computer system, provides simulation capability to study human/system interactions of remote systems. The facility hardware, software and subsequent integration of these components into a real time man-in-the-loop simulation for the evaluation of spacecraft contact proximity and dynamics are described.
Integration of land-use data and soil survey data
NASA Technical Reports Server (NTRS)
Cox, T. L.
1977-01-01
Approaches are discussed for increasing the utility of remotely sensed interpretations through the use of a computer-assisted process which provides capabilities for merging several types of data of varying formats. The resulting maps and summary data are used for planning and zoning in a rapidly developing area (34,000 ha) adjacent to the Black Hills in South Dakota. Attention is given to the data source, data digitization, and aspects of data handling and analysis.
Development of Improved Surface Integral Methods for Jet Aeroacoustic Predictions
NASA Technical Reports Server (NTRS)
Pilon, Anthony R.; Lyrintzis, Anastasios S.
1997-01-01
The accurate prediction of aerodynamically generated noise has become an important goal over the past decade. Aeroacoustics must now be an integral part of the aircraft design process. The direct calculation of aerodynamically generated noise with CFD-like algorithms is plausible. However, large computer time and memory requirements often make these predictions impractical. It is therefore necessary to separate the aeroacoustics problem into two parts, one in which aerodynamic sound sources are determined, and another in which the propagating sound is calculated. This idea is applied in acoustic analogy methods. However, in the acoustic analogy, the determination of far-field sound requires the solution of a volume integral. This volume integration again leads to impractical computer requirements. An alternative to the volume integrations can be found in the Kirchhoff method. In this method, Green's theorem for the linear wave equation is used to determine sound propagation based on quantities on a surface surrounding the source region. The change from volume to surface integrals represents a tremendous savings in the computer resources required for an accurate prediction. This work is concerned with the development of enhancements of the Kirchhoff method for use in a wide variety of aeroacoustics problems. This enhanced method, the modified Kirchhoff method, is shown to be a Green's function solution of Lighthill's equation. It is also shown rigorously to be identical to the methods of Ffowcs Williams and Hawkings. This allows for development of versatile computer codes which can easily alternate between the different Kirchhoff and Ffowcs Williams-Hawkings formulations, using the most appropriate method for the problem at hand. The modified Kirchhoff method is developed primarily for use in jet aeroacoustics predictions. Applications of the method are shown for two dimensional and three dimensional jet flows. Additionally, the enhancements are generalized so that they may be used in any aeroacoustics problem.
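For reference, one standard statement of the classical Kirchhoff formula for a stationary control surface S is given below; it is quoted in a common aeroacoustics convention (sign and normal conventions vary between references) and is not reproduced from this report.

```latex
p(\mathbf{x},t) \;=\; \frac{1}{4\pi}\oint_{S}
\left[\,
  \frac{p\,\cos\theta}{r^{2}}
  \;-\; \frac{1}{r}\,\frac{\partial p}{\partial n}
  \;+\; \frac{\cos\theta}{c\,r}\,\frac{\partial p}{\partial \tau}
\,\right]_{\tau \,=\, t - r/c} \mathrm{d}S
```

Here r is the distance from the surface element to the observer, n the outward surface normal, θ the angle between that normal and the direction to the observer, c the speed of sound, and the bracket is evaluated at the retarded time τ = t - r/c. Replacing the acoustic analogy's volume integral with a surface integral of this type is the source of the computational savings noted above.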
Secure data aggregation in heterogeneous and disparate networks using stand off server architecture
NASA Astrophysics Data System (ADS)
Vimalathithan, S.; Sudarsan, S. D.; Seker, R.; Lenin, R. B.; Ramaswamy, S.
2009-04-01
The emerging global reach of technology presents myriad challenges and intricacies as Information Technology teams aim to provide anywhere, anytime, anyone access for service providers and customers alike. The world is fraught with stifling inequalities, from both an economic and a socio-political perspective. The net result has been large capability gaps between various organizational locations that need to work together, which has raised new challenges for information security teams. Similar issues arise when mergers and acquisitions among and between organizations take place. When integrating remote business locations with mainstream operations, one or more issues, including lack of application-level support, limited computational capability, communication limitations, and legal requirements, seriously impede integration that does not violate the organizations' security requirements. Commonly used techniques such as IPSec, tunneling, secure socket layer, etc., may not always be techno-economically feasible. This paper addresses such security issues by introducing an intermediate server, called a stand-off server, between the corporate central server and remote sites. We present techniques such as break-before-make connection, breaking the connection after transfer, and multiple virtual machine instances with different operating systems, all built on the concept of a stand-off server. Our experiments show that the proposed solution provides sufficient isolation of the central server/site from attacks arising out of weak communication and/or computing links and is simple to implement.
Research on the man in the loop control system of the robot arm based on gesture control
NASA Astrophysics Data System (ADS)
Xiao, Lifeng; Peng, Jinbao
2017-03-01
This research takes as its background complex real-world environments that require the operator to continuously control and adjust a remote manipulator, and takes as its research object the entire human-in-the-loop system that completes a specific mission. The paper puts forward a man-in-the-loop control system for a robot arm based on gesture control: gesture-based control of the robot arm is combined with virtual-reality scene feedback to enhance the operator's immersion and integration, so that the operator truly becomes a part of the whole control loop. The paper expounds how to construct such a man-in-the-loop, gesture-controlled robot arm system. The system is a complex human-computer cooperative control system and belongs to the problem area of human-in-the-loop control. The new system overcomes the shortcomings of traditional methods: the lack of a sense of immersion, the unnatural operation of a control lever, long adjustment times, and data gloves that are uncomfortable to wear and expensive.
Accommodating Student Diversity in Remote Sensing Instruction.
ERIC Educational Resources Information Center
Hammen, John L., III.
1992-01-01
Discusses the difficulty of teaching computer-based remote sensing to students of varying levels of computer literacy. Suggests an instructional method that accommodates all levels of technical expertise through the use of microcomputers. Presents a curriculum that includes an introduction to remote sensing, digital image processing, and…
Jeong, Kyeong-Min; Kim, Hee-Seung; Hong, Sung-In; Lee, Sung-Keun; Jo, Na-Young; Kim, Yong-Soo; Lim, Hong-Gi; Park, Jae-Hyeung
2012-10-08
Speed enhancement of integral-imaging-based incoherent Fourier hologram capture using a graphics processing unit is reported. The integral-imaging-based method enables exact hologram capture of real-existing three-dimensional objects under regular incoherent illumination. In our implementation, we apply a parallel computation scheme using the graphics processing unit, accelerating the processing speed. Using the enhanced speed of hologram capture, we also implement a pseudo real-time hologram capture and optical reconstruction system. The overall operation speed is measured to be 1 frame per second.
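The kind of parallelism being exploited can be sketched generically as batched 2-D FFTs over the stack of elemental images; this is not the authors' hologram-synthesis pipeline, and CuPy is mentioned only as a commonly used GPU drop-in for the NumPy FFT calls.

```python
# Batched 2-D FFTs over a stack of elemental images: the per-image transforms
# are independent, which is the kind of work a GPU parallelizes well.
# Generic illustration only, not the paper's hologram-synthesis pipeline.
import numpy as np
# import cupy as cp   # setting xp = cp moves the batch to the GPU (same calls)

xp = np

elemental = xp.random.random((100, 64, 64))    # e.g. 10x10 lens array, 64x64 px each
spectra = xp.fft.fftshift(xp.fft.fft2(elemental, axes=(-2, -1)), axes=(-2, -1))
print(spectra.shape)                            # (100, 64, 64)
```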
Integrating structure-based and ligand-based approaches for computational drug design.
Wilson, Gregory L; Lill, Markus A
2011-04-01
Methods utilized in computer-aided drug design can be classified into two major categories: structure based and ligand based, using information on the structure of the protein or on the biological and physicochemical properties of bound ligands, respectively. In recent years there has been a trend towards integrating these two methods in order to enhance the reliability and efficiency of computer-aided drug-design approaches by combining information from both the ligand and the protein. This trend resulted in a variety of methods that include: pseudoreceptor methods, pharmacophore methods, fingerprint methods and approaches integrating docking with similarity-based methods. In this article, we will describe the concepts behind each method and selected applications.
Remote visual analysis of large turbulence databases at multiple scales
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pulido, Jesus; Livescu, Daniel; Kanov, Kalin; ...
2018-06-15
The remote analysis and visualization of raw large turbulence datasets is challenging. Current accurate direct numerical simulations (DNS) of turbulent flows generate datasets with billions of points per time-step and several thousand time-steps per simulation. Until recently, the analysis and visualization of such datasets was restricted to scientists with access to large supercomputers. The public Johns Hopkins Turbulence database simplifies access to multi-terabyte turbulence datasets and facilitates the computation of statistics and extraction of features through the use of commodity hardware. In this paper, we present a framework designed around wavelet-based compression for high-speed visualization of large datasets and methods supporting multi-resolution analysis of turbulence. By integrating common technologies, this framework enables remote access to tools available on supercomputers and over 230 terabytes of DNS data over the Web. Finally, the database toolset is expanded by providing access to exploratory data analysis tools, such as wavelet decomposition capabilities and coherent feature extraction.
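The compression idea can be sketched as wavelet decomposition followed by coefficient thresholding; the snippet below assumes PyWavelets and illustrative parameters and is not the framework's actual codec.

```python
# Minimal illustration of wavelet-based compression: decompose a field,
# zero the smallest coefficients, reconstruct. Not the database's actual
# codec; PyWavelets, the wavelet choice, and all parameters are assumptions.
import numpy as np
import pywt

def compress(field, wavelet="haar", level=3, keep=0.05):
    coeffs = pywt.wavedec2(field, wavelet, level=level)
    arr, slices = pywt.coeffs_to_array(coeffs)
    thresh = np.quantile(np.abs(arr), 1.0 - keep)   # keep ~5% largest coefficients
    arr = pywt.threshold(arr, thresh, mode="hard")
    return pywt.waverec2(pywt.array_to_coeffs(arr, slices, output_format="wavedec2"),
                         wavelet)

rng = np.random.default_rng(1)
field = rng.standard_normal((128, 128)).cumsum(axis=0).cumsum(axis=1)  # smooth-ish field
recon = compress(field)
print("relative L2 error:", np.linalg.norm(recon - field) / np.linalg.norm(field))
```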
Miniature Wireless BioSensor for Remote Endoscopic Monitoring
NASA Astrophysics Data System (ADS)
Nemiroski, Alex; Brown, Keith; Issadore, David; Westervelt, Robert; Thompson, Chris; Obstein, Keith; Laine, Michael
2009-03-01
We have built a miniature wireless biosensor with fluorescence detection capability that explores the miniaturization limit for a self-powered sensor device assembled from the latest off-the-shelf technology. The device is intended as a remote medical sensor to be inserted endoscopically and remain in a patient's gastrointestinal tract for a period of weeks, recording and transmitting data as necessary. A sensing network may be formed by using multiple such devices within the patient, routing information to an external receiver that communicates through existing mobile-phone networks to relay data remotely. By using a monolithic IC chip with integrated processor, memory, and 2.4 GHz radio, combined with a photonic sensor and miniature battery, we have developed a fully functional computing device in a form factor compliant with insertion through the narrowest endoscopic channels (less than 3 mm x 3 mm x 20 mm). We envision similar devices with various types of sensors to be used in many different areas of the human body.
NASA Astrophysics Data System (ADS)
Aoki, K.; Ohuchi, N.; Zong, Z.; Arimoto, Y.; Wang, X.; Yamaoka, H.; Kawai, M.; Kondou, Y.; Makida, Y.; Hirose, M.; Endou, T.; Iwasaki, M.; Nakamura, T.
2017-12-01
A remote monitoring system was developed based on the software infrastructure of the Experimental Physics and Industrial Control System (EPICS) for the cryogenic system of superconducting magnets in the interaction region of the SuperKEKB accelerator. The SuperKEKB has been constructed to conduct high-energy physics experiments at KEK. These superconducting magnets consist of three apparatuses, the Belle II detector solenoid, and QCSL and QCSR accelerator magnets. They are each contained in three cryostats cooled by dedicated helium cryogenic systems. The monitoring system was developed to read data of the EX-8000, which is an integrated instrumentation system to control all cryogenic components. The monitoring system uses the I/O control tools of EPICS software for TCP/IP, archiving techniques using a relational database, and easy human-computer interface. Using this monitoring system, it is possible to remotely monitor all real-time data of the superconducting magnets and cryogenic systems. It is also convenient to share data among multiple groups.
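On the client side, EPICS exposes each monitored quantity as a named process variable (PV) readable over Channel Access; a minimal sketch using the pyepics bindings is shown below, with hypothetical PV names rather than the actual SuperKEKB channels.

```python
# Minimal Channel Access client sketch using pyepics.
# The PV names below are hypothetical placeholders, not the actual SuperKEKB PVs.
import time
import epics

PVS = ["CRYO:QCSL:He:Temp", "CRYO:QCSR:He:Temp", "CRYO:BelleII:Solenoid:Current"]

def log_once():
    for name in PVS:
        value = epics.caget(name, timeout=2.0)   # returns None if the PV is unreachable
        print(time.strftime("%H:%M:%S"), name, value)

# Or register a callback so updates arrive as the IOC publishes them:
pv = epics.PV(PVS[0], callback=lambda pvname=None, value=None, **kw: print(pvname, value))
```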
Eye-movements and Voice as Interface Modalities to Computer Systems
NASA Astrophysics Data System (ADS)
Farid, Mohsen M.; Murtagh, Fionn D.
2003-03-01
We investigate the visual and vocal modalities of interaction with computer systems. We focus our attention on the integration of visual and vocal interface as possible replacement and/or additional modalities to enhance human-computer interaction. We present a new framework for employing eye gaze as a modality of interface. While voice commands, as means of interaction with computers, have been around for a number of years, integration of both the vocal interface and the visual interface, in terms of detecting user's eye movements through an eye-tracking device, is novel and promises to open the horizons for new applications where a hand-mouse interface provides little or no apparent support to the task to be accomplished. We present an array of applications to illustrate the new framework and eye-voice integration.
An infrastructure for the integration of geoscience instruments and sensors on the Grid
NASA Astrophysics Data System (ADS)
Pugliese, R.; Prica, M.; Kourousias, G.; Del Linz, A.; Curri, A.
2009-04-01
The Grid, as a computing paradigm, has long been in the attention of both academia and industry[1]. The distributed and expandable nature of its general architecture results in scalability and more efficient utilisation of the computing infrastructures. The scientific community, including that of geosciences, often handles problems with very high requirements in data processing, transferring, and storing[2,3]. This has raised interest in Grid technologies, but these are often viewed solely as an access gateway to HPC. Suitable Grid infrastructures could provide the geoscience community with additional benefits like those of sharing, remote access and control of scientific systems. These systems can be scientific instruments, sensors, robots, cameras and any other device used in geosciences. The solution for practical, general, and feasible Grid-enabling of such devices requires non-intrusive extensions to core parts of the current Grid architecture. We propose an extended version of an architecture[4] that can serve as the solution to the problem. The solution we propose is called the Grid Instrument Element (IE) [5]. It is an addition to the existing core Grid parts, the Computing Element (CE) and the Storage Element (SE), which serve the purposes that their names suggest. The IE that we will be referring to, and the related technologies, have been developed in the EU project on the Deployment of Remote Instrumentation Infrastructure (DORII1). In DORII, partners from various scientific communities, including those of earthquake, environmental, and experimental science, have adopted the technology of the Instrument Element in order to integrate their devices into the Grid. The Oceanographic and coastal observation and modelling Mediterranean Ocean Observing Network (OGS2), a DORII partner, is in the process of deploying the above mentioned Grid technologies on two types of observational modules: Argo profiling floats and a novel Autonomous Underwater Vehicle (AUV). In this paper i) we define the need for integration of instrumentation in the Grid, ii) we introduce the solution of the Instrument Element, iii) we demonstrate a suitable end-user web portal for accessing Grid resources, iv) we describe from the Grid-technological point of view the process of integrating two advanced environmental monitoring devices into the Grid. References [1] M. Surridge, S. Taylor, D. De Roure, and E. Zaluska, "Experiences with GRIA—Industrial Applications on a Web Services Grid," e-Science and Grid Computing, First International Conference on e-Science and Grid Computing, 2005, pp. 98-105. [2] A. Chervenak, I. Foster, C. Kesselman, C. Salisbury, and S. Tuecke, "The data grid: Towards an architecture for the distributed management and analysis of large scientific datasets," Journal of Network and Computer Applications, vol. 23, 2000, pp. 187-200. [3] B. Allcock, J. Bester, J. Bresnahan, A.L. Chervenak, I. Foster, C. Kesselman, S. Meder, V. Nefedova, D. Quesnel, and S. Tuecke, "Data management and transfer in high-performance computational grid environments," Parallel Computing, vol. 28, 2002, pp. 749-771. [4] E. Frizziero, M. Gulmini, F. Lelli, G. Maron, A. Oh, S. Orlando, A. Petrucci, S. Squizzato, and S. Traldi, "Instrument Element: A New Grid component that Enables the Control of Remote Instrumentation," Proceedings of the Sixth IEEE International Symposium on Cluster Computing and the Grid (CCGRID'06)-Volume 00, IEEE Computer Society Washington, DC, USA, 2006. [5] R. Ranon, L. De Marco, A.
Senerchia, S. Gabrielli, L. Chittaro, R. Pugliese, L. Del Cano, F. Asnicar, and M. Prica, "A Web-based Tool for Collaborative Access to Scientific Instruments in Cyberinfrastructures." 1 The DORII project is supported by the European Commission within the 7th Framework Programme (FP7/2007-2013) under grant agreement no. RI-213110. URL: http://www.dorii.eu 2 Istituto Nazionale di Oceanografia e di Geofisica Sperimentale. URL: http://www.ogs.trieste.it
In-database processing of a large collection of remote sensing data: applications and implementation
NASA Astrophysics Data System (ADS)
Kikhtenko, Vladimir; Mamash, Elena; Chubarov, Dmitri; Voronina, Polina
2016-04-01
Large archives of remote sensing data are now available to scientists, yet the need to work with individual satellite scenes or product files constrains studies that span a wide temporal range or spatial extent. The resources (storage capacity, computing power and network bandwidth) required for such studies are often beyond the capabilities of individual geoscientists. This problem has been tackled before in remote sensing research and inspired several information systems. Some of them, such as NASA Giovanni [1] and Google Earth Engine, have already proved their utility for science. Analysis tasks involving large volumes of numerical data are not unique to Earth Sciences. Recent advances in data science are enabled by the development of in-database processing engines that bring processing closer to storage, use declarative query languages to facilitate parallel scalability and provide high-level abstraction of the whole dataset. We build on the idea of bridging the gap between file archives containing remote sensing data and databases by integrating files into a relational database as foreign data sources and performing analytical processing inside the database engine. Thereby a higher-level query language can efficiently address problems of arbitrary size: from accessing the data associated with a specific pixel or a grid cell to complex aggregation over spatial or temporal extents over a large number of individual data files. This approach was implemented using PostgreSQL for a Siberian regional archive of satellite data products holding hundreds of terabytes of measurements from multiple sensors and missions taken over a decade-long span. While preserving the original storage layout and therefore compatibility with existing applications, the in-database processing engine provides a toolkit for provisioning remote sensing data in scientific workflows and applications. The use of SQL - a widely used higher level declarative query language - simplifies interoperability between desktop GIS, web applications and geographic web services and interactive scientific applications (MATLAB, IPython). The system also automatically ingests direct readout data from meteorological and research satellites in near-real time with distributed acquisition workflows managed by the Taverna workflow engine [2]. The system has demonstrated its utility in performing non-trivial analytic processing such as the computation of the Robust Satellite Technique (RST) indices [3]. It has been useful in tasks such as studying urban heat islands, analyzing patterns in the distribution of wildfire occurrences, and detecting phenomena related to seismic and earthquake activity. Initial experience has highlighted several limitations of the proposed approach, yet it has demonstrated the ability to facilitate the use of large archives of remote sensing data by geoscientists. 1. J.G. Acker, G. Leptoukh, Online analysis enhances use of NASA Earth science data. EOS Trans. AGU, 2007, 88(2), P. 14-17. 2. D. Hull, K. Wolsfencroft, R. Stevens, C. Goble, M.R. Pocock, P. Li and T. Oinn, Taverna: a tool for building and running workflows of services. Nucleic Acids Research. 2006. V. 34. P. W729-W732. 3. V. Tramutoli, G. Di Bello, N. Pergola, S. Piscitelli, Robust satellite techniques for remote sensing of seismically active areas // Annals of Geophysics. 2001. no. 44(2). P. 295-312.
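To make the access pattern concrete, a hedged client-side sketch is shown below, assuming psycopg2 and an entirely invented table layout; it is not the system's actual schema or query interface.

```python
# Hypothetical client-side query against an in-database remote sensing archive:
# aggregate a band value over a time range for one grid cell. The table and
# column names are invented for illustration; this is not the system's schema.
import psycopg2

query = """
    SELECT date_trunc('month', acquired_at) AS month,
           avg(brightness_temp)             AS mean_bt
    FROM   modis_samples
    WHERE  cell_id = %s
      AND  acquired_at BETWEEN %s AND %s
    GROUP  BY month
    ORDER  BY month;
"""

with psycopg2.connect("dbname=rs_archive host=localhost") as conn:
    with conn.cursor() as cur:
        cur.execute(query, (421337, "2010-01-01", "2015-12-31"))
        for month, mean_bt in cur.fetchall():
            print(month, mean_bt)
```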
Li, Chun-Ta; Weng, Chi-Yao; Lee, Cheng-Chi; Wang, Chun-Cheng
2015-11-01
To protect patient privacy and ensure authorized access to remote medical services, many remote user authentication schemes for the integrated electronic patient record (EPR) information system have been proposed in the literature. In a recent paper, Das proposed a hash based remote user authentication scheme using passwords and smart cards for the integrated EPR information system, and claimed that the proposed scheme could resist various passive and active attacks. However, in this paper, we found that Das's authentication scheme is still vulnerable to modification and user duplication attacks. Thereafter we propose a secure and efficient authentication scheme for the integrated EPR information system based on lightweight hash function and bitwise exclusive-or (XOR) operations. The security proof and performance analysis show our new scheme is well-suited to adoption in remote medical healthcare services.
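The flavor of such lightweight hash-and-XOR constructions can be illustrated with a generic challenge-response sketch; it is deliberately simplified and is neither Das's scheme nor the authors' improved protocol.

```python
# Generic hash-and-XOR challenge-response sketch. This is a simplified
# illustration of the style of primitive involved; it is NOT Das's scheme,
# NOT the authors' protocol, and not suitable for production use.
import hashlib
import secrets

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"|".join(parts)).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# Registration: the server derives a per-user secret from its master key.
master_key = secrets.token_bytes(32)
user_id = b"patient-0042"
user_secret = h(master_key, user_id)          # stored on the smart card

# Login: the card masks the secret with a nonce-derived pad (XOR) and sends a
# hash-based proof; the server recomputes the secret and checks both.
nonce = secrets.token_bytes(32)
masked = xor(user_secret, h(nonce, user_id))  # transmitted: user_id, nonce, masked, proof
proof = h(user_secret, nonce)

server_secret = h(master_key, user_id)
assert xor(masked, h(nonce, user_id)) == server_secret
assert h(server_secret, nonce) == proof
print("login accepted")
```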
2013-01-01
Background and purpose Stroke rehabilitation does not often integrate both sensory and motor recovery. While subthreshold noise was shown to enhance sensory signal detection at the site of noise application, having a noise-generating device at the fingertip to enhance fingertip sensation and potentially enhance dexterity for stroke survivors is impractical, since the device would interfere with object manipulation. This study determined if remote application of subthreshold vibrotactile noise (away from the fingertips) improves fingertip tactile sensation with potential to enhance dexterity for stroke survivors. Methods Index finger and thumb pad sensation was measured for ten stroke survivors with fingertip sensory deficit using the Semmes-Weinstein Monofilament and Two-Point Discrimination Tests. Sensation scores were measured with noise applied at one of three intensities (40%, 60%, 80% of the sensory threshold) to one of four locations of the paretic upper extremity (dorsal hand proximal to the index finger knuckle, dorsal hand proximal to the thumb knuckle, dorsal wrist, volar wrist) in a random order, as well as without noise at beginning (Pre) and end (Post) of the testing session. Results Vibrotactile noise of all intensities and locations instantaneously and significantly improved Monofilament scores of the index fingertip and thumb tip (p < .01). No significant effect of the noise was seen for the Two-Point Discrimination Test scores. Conclusions Remote application of subthreshold (imperceptible) vibrotactile noise at the wrist and dorsal hand instantaneously improved stroke survivors’ light touch sensation, independent of noise location and intensity. Vibrotactile noise at the wrist and dorsal hand may have enhanced the fingertips’ light touch sensation via stochastic resonance and interneuronal connections. While long-term benefits of noise in stroke patients warrants further investigation, this result demonstrates potential that a wearable device applying vibrotactile noise at the wrist could enhance sensation and grip ability without interfering with object manipulation in everyday tasks. PMID:24112371
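The stochastic resonance effect invoked here can be demonstrated numerically: a subthreshold periodic signal is picked up by a hard threshold detector only when a moderate amount of noise is added, while too much noise washes the signal out again. A toy sketch with illustrative parameters (not a model of the study):

```python
# Toy stochastic resonance demo: a subthreshold periodic "touch" signal is
# detected by a hard threshold only when moderate noise is added.
# All parameters are illustrative; this is not a model of the study.
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 2000)
signal = 0.8 * np.sin(2 * np.pi * 5 * t)      # peak 0.8, below the threshold of 1.0
threshold = 1.0

def detection_correlation(noise_sd):
    crossings = (signal + rng.normal(0.0, noise_sd, t.size)) > threshold
    if crossings.std() == 0:                   # never crosses: no information recovered
        return 0.0
    return np.corrcoef(crossings.astype(float), signal)[0, 1]

for sd in (0.0, 0.1, 0.3, 1.0, 3.0):
    print(f"noise sd {sd:>3}: correlation with signal = {detection_correlation(sd):+.2f}")
```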
Device for inspecting vessel surfaces
Appel, D. Keith
1995-01-01
A portable, remotely-controlled inspection crawler for use along the walls of tanks, vessels, piping and the like. The crawler can be configured to use a vacuum chamber for supporting itself on the inspected surface by suction or a plurality of magnetic wheels for moving the crawler along the inspected surface. The crawler is adapted to be equipped with an ultrasonic probe for mapping the structural integrity or other characteristics of the surface being inspected. Navigation of the crawler is achieved by triangulation techniques between a signal transmitter on the crawler and a pair of microphones attached to a fixed, remote location, such as the crawler's deployment unit. The necessary communications are established between the crawler and computers external to the inspection environment for position control and storage and/or monitoring of data acquisition.
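A position fix from two acoustic ranges can be sketched as the intersection of two circles centred on the microphones; the geometry below is a generic 2-D illustration with made-up numbers, not the device's actual navigation algorithm, and the side-of-baseline ambiguity is resolved by convention.

```python
# 2-D position fix from two ranges (circle intersection). Generic geometry
# sketch, not the crawler's actual navigation algorithm; the microphone
# positions and ranges are made-up numbers.
import math

def fix_position(m1, m2, r1, r2):
    """Intersect circles of radius r1, r2 centred at microphones m1, m2.
    Returns the solution on one chosen side of the baseline."""
    dx, dy = m2[0] - m1[0], m2[1] - m1[1]
    d = math.hypot(dx, dy)
    a = (r1**2 - r2**2 + d**2) / (2 * d)         # along-baseline distance from m1
    h = math.sqrt(max(r1**2 - a**2, 0.0))        # offset from the baseline
    xm, ym = m1[0] + a * dx / d, m1[1] + a * dy / d
    return (xm - h * dy / d, ym + h * dx / d)    # pick one side of the baseline

mic1, mic2 = (0.0, 0.0), (2.0, 0.0)              # microphones on the deployment unit (m)
print(fix_position(mic1, mic2, r1=math.hypot(1.2, 0.9), r2=math.hypot(0.8, 0.9)))
# -> approximately (1.2, 0.9)
```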
The role of assisted self-help in services for alcohol-related disorders.
Kavanagh, David J; Proctor, Dawn M
2011-06-01
Potentially harmful substance use is common, but many affected people do not receive treatment. Brief face-to-face treatments show impact, as do strategies to assist self-help remotely, by using bibliotherapies, computers or mobile phones. Remotely delivered treatments offer more sustained and multifaceted support than brief interventions, and they show a substantial cost advantage as users increase in number. They may also build skills, confidence and treatment fidelity in providers who use them in sessions. Engagement and retention remain challenges, but electronic treatments show promise in engaging younger populations. Recruitment may be assisted by integration with community campaigns or brief opportunistic interventions. However, routine use of assisted self-help by standard services faces significant challenges. Strategies to optimize adoption are discussed. Copyright © 2011. Published by Elsevier Ltd.
NASA Technical Reports Server (NTRS)
Chan, William M.
1993-01-01
An enhanced grid system for the Space Shuttle Orbiter was built by integrating CAD definitions from several sources and then generating the surface and volume grids. The new grid system contains geometric components not previously modeled, plus significant enhancements to geometry that had been modeled in the old grid system. The new orbiter grids were then integrated with new grids for the rest of the launch vehicle. Enhancements were made to the hyperbolic grid generator HYPGEN, and new tools were developed for grid projection, manipulation, and modification; Cartesian box grid and far-field grid generation; and post-processing of flow solver data.
Computerized Manufacturing Automation. Employment, Education, and the Workplace. Summary.
ERIC Educational Resources Information Center
Congress of the U.S., Washington, DC. Office of Technology Assessment.
The application of programmable automation (PA) offers new opportunities to enhance and streamline manufacturing processes. Five PA technologies are examined in this report: computer-aided design, robots, numerically controlled machine tools, flexible manufacturing systems, and computer-integrated manufacturing. Each technology is in a relatively…
NASA Astrophysics Data System (ADS)
Aktas, Mehmet; Aydin, Galip; Donnellan, Andrea; Fox, Geoffrey; Granat, Robert; Grant, Lisa; Lyzenga, Greg; McLeod, Dennis; Pallickara, Shrideep; Parker, Jay; Pierce, Marlon; Rundle, John; Sayar, Ahmet; Tullis, Terry
2006-12-01
We describe the goals and initial implementation of the International Solid Earth Virtual Observatory (iSERVO). This system is built using a Web Services approach to Grid computing infrastructure and is accessed via a component-based Web portal user interface. We describe our implementations of services used by this system, including Geographical Information System (GIS)-based data grid services for accessing remote data repositories and job management services for controlling multiple execution steps. iSERVO is an example of a larger trend to build globally scalable scientific computing infrastructures using the Service Oriented Architecture approach. Adoption of this approach raises a number of research challenges in millisecond-latency message systems suitable for internet-enabled scientific applications. We review our research in these areas.
Integrated Test Facility (ITF)
NASA Technical Reports Server (NTRS)
1992-01-01
The NASA-Dryden Integrated Test Facility (ITF), also known as the Walter C. Williams Research Aircraft Integration Facility (RAIF), provides an environment for conducting efficient and thorough testing of advanced, highly integrated research aircraft. Flight test confidence is greatly enhanced by the ability to qualify interactive aircraft systems in a controlled environment. In the ITF, each element of a flight vehicle can be regulated and monitored in real time as it interacts with the rest of the aircraft systems. Testing in the ITF is accomplished through automated techniques in which the research aircraft is interfaced to a high-fidelity real-time simulation. Electric and hydraulic power are also supplied, allowing all systems except the engines to function as if in flight. The testing process is controlled by an engineering workstation that sets up initial conditions for a test, initiates the test run, monitors its progress, and archives the data generated. The workstation is also capable of analyzing results of individual tests, comparing results of multiple tests, and producing reports. The computers used in the automated aircraft testing process are also capable of operating in a stand-alone mode with a simulation cockpit, complete with its own instruments and controls. Control law development and modification, aerodynamic, propulsion, guidance model qualification, and flight planning -- functions traditionally associated with real-time simulation -- can all be performed in this manner. The Remotely Augmented Vehicles (RAV) function, now located in the ITF, is a mainstay in the research techniques employed at Dryden. This function is used for tests that are too dangerous for direct human involvement or for which computational capacity does not exist onboard a research aircraft. RAV provides the researcher with a ground-based computer that is radio linked to the test aircraft during actual flight. The Ground Vibration Testing (GVT) system, formerly housed in the Thermostructural Laboratory, now also resides in the ITF. In preparing a research aircraft for flight testing, it is vital to measure its structural frequencies and mode shapes and compare results to the models used in design analysis. The final function performed in the ITF is routine aircraft maintenance. This includes preflight and post-flight instrumentation checks and the servicing of hydraulics, avionics, and engines necessary on any research aircraft. Aircraft are not merely moved to the ITF for automated testing purposes but are housed there throughout their flight test programs.
Walter C. Williams Research Aircraft Integration Facility (RAIF)
NASA Technical Reports Server (NTRS)
1996-01-01
The NASA-Dryden Integrated Test Facility (ITF), also known as the Walter C. Williams Research Aircraft Integration Facility (RAIF), provides an environment for conducting efficient and thorough testing of advanced, highly integrated research aircraft. Flight test confidence is greatly enhanced by the ability to qualify interactive aircraft systems in a controlled environment. In the ITF, each element of a flight vehicle can be regulated and monitored in real time as it interacts with the rest of the aircraft systems. Testing in the ITF is accomplished through automated techniques in which the research aircraft is interfaced to a high-fidelity real-time simulation. Electric and hydraulic power are also supplied, allowing all systems except the engines to function as if in flight. The testing process is controlled by an engineering workstation that sets up initial conditions for a test, initiates the test run, monitors its progress, and archives the data generated. The workstation is also capable of analyzing results of individual tests, comparing results of multiple tests, and producing reports. The computers used in the automated aircraft testing process are also capable of operating in a stand-alone mode with a simulation cockpit, complete with its own instruments and controls. Control law development and modification, aerodynamic, propulsion, guidance model qualification, and flight planning -- functions traditionally associated with real-time simulation -- can all be performed in this manner. The Remotely Augmented Vehicles (RAV) function, now located in the ITF, is a mainstay in the research techniques employed at Dryden. This function is used for tests that are too dangerous for direct human involvement or for which computational capacity does not exist onboard a research aircraft. RAV provides the researcher with a ground-based computer that is radio linked to the test aircraft during actual flight. The Ground Vibration Testing (GVT) system, formerly housed in the Thermostructural Laboratory, now also resides in the ITF. In preparing a research aircraft for flight testing, it is vital to measure its structural frequencies and mode shapes and compare results to the models used in design analysis. The final function performed in the ITF is routine aircraft maintenance. This includes preflight and post-flight instrumentation checks and the servicing of hydraulics, avionics, and engines necessary on any research aircraft. Aircraft are not merely moved to the ITF for automated testing purposes but are housed there throughout their flight test programs.
Deterministic delivery of remote entanglement on a quantum network.
Humphreys, Peter C; Kalb, Norbert; Morits, Jaco P J; Schouten, Raymond N; Vermeulen, Raymond F L; Twitchen, Daniel J; Markham, Matthew; Hanson, Ronald
2018-06-01
Large-scale quantum networks promise to enable secure communication, distributed quantum computing, enhanced sensing and fundamental tests of quantum mechanics through the distribution of entanglement across nodes [1-7]. Moving beyond current two-node networks [8-13] requires the rate of entanglement generation between nodes to exceed the decoherence (loss) rate of the entanglement. If this criterion is met, intrinsically probabilistic entangling protocols can be used to provide deterministic remote entanglement at pre-specified times. Here we demonstrate this using diamond spin qubit nodes separated by two metres. We realize a fully heralded single-photon entanglement protocol that achieves entangling rates of up to 39 hertz, three orders of magnitude higher than previously demonstrated two-photon protocols on this platform [14]. At the same time, we suppress the decoherence rate of remote-entangled states to five hertz through dynamical decoupling. By combining these results with efficient charge-state control and mitigation of spectral diffusion, we deterministically deliver a fresh remote state with an average entanglement fidelity of more than 0.5 at every clock cycle of about 100 milliseconds without any pre- or post-selection. These results demonstrate a key building block for extended quantum networks and open the door to entanglement distribution across multiple remote nodes.
Communication network for decentralized remote tele-science during the Spacelab mission IML-2
NASA Technical Reports Server (NTRS)
Christ, Uwe; Schulz, Klaus-Juergen; Incollingo, Marco
1994-01-01
The ESA communication network for decentralized remote telescience during the Spacelab mission IML-2, called Interconnection Ground Subnetwork (IGS), provided data, voice conferencing, video distribution/conferencing and high rate data services to 5 remote user centers in Europe. The combination of services allowed the experimenters to interact with their experiments as they would normally do from the Payload Operations Control Center (POCC) at MSFC. In addition, to enhance their science results, they were able to make use of reference facilities and computing resources in their home laboratory, which typically are not available in the POCC. Characteristics of the IML-2 communications implementation were the adaptation to the different user needs based on modular service capabilities of IGS and the cost optimization for the connectivity. This was achieved by using a combination of traditional leased lines, satellite based VSAT connectivity and N-ISDN according to the simulation and mission schedule for each remote site. The central management system of IGS allows minimization of staffing and the involvement of communications personnel at the remote sites. The successful operation of IGS for IML-2 as a precursor network for the Columbus Orbital Facility (COF) has proven the concept for communications to support the operation of the COF decentralized scenario.
Computers, Remote Teleprocessing and Mass Communication.
ERIC Educational Resources Information Center
Cropley, A. J.
Recent developments in computer technology are reducing the limitations of computers as mass communication devices. The growth of remote teleprocessing is one important step. Computers can now interact with users via terminals which may be hundreds of miles from the actual mainframe machine. Many terminals can be in operation at once, so that many…
Diagnostic instrumentation aboard ISS: just-in-time training for non-physician crewmembers.
Foale, C Michael; Kaleri, Alexander Y; Sargsyan, Ashot E; Hamilton, Douglas R; Melton, Shannon; Martin, David; Dulchavsky, Scott A
2005-06-01
The performance of complex tasks on the International Space Station (ISS) requires significant preflight crew training commitments and frequent skill and knowledge refreshment. This report documents a recently developed "just-in-time" training methodology, which integrates preflight hardware familiarization and procedure training with an on-orbit CD-ROM-based skill enhancement. This "just-in-time" concept was used to support real-time remote expert guidance to complete ultrasound examinations using the ISS Human Research Facility (HRF). An American and Russian ISS crewmember received 2 h of "hands on" ultrasound training 8 mo prior to the on-orbit ultrasound exam. A CD-ROM-based Onboard Proficiency Enhancement (OPE) interactive multimedia program consisting of memory enhancing tutorials, and skill testing exercises, was completed by the crewmember 6 d prior to the on-orbit ultrasound exam. The crewmember was then remotely guided through a thoracic, vascular, and echocardiographic examination by ultrasound imaging experts. Results of the CD-ROM-based OPE session were used to modify the instructions during a complete 35-min real-time thoracic, cardiac, and carotid/jugular ultrasound study. Following commands from the ground-based expert, the crewmember acquired all target views and images without difficulty. The anatomical content and fidelity of ultrasound video were adequate for clinical decision making. Complex ultrasound experiments with expert guidance were performed with high accuracy following limited preflight training and multimedia based in-flight review, despite a 2-s communication latency. In-flight application of multimedia proficiency enhancement software, coupled with real-time remote expert guidance, facilitates the successful performance of ultrasound examinations on orbit and may have additional terrestrial and space applications.
Diagnostic instrumentation aboard ISS: just-in-time training for non-physician crewmembers
NASA Technical Reports Server (NTRS)
Foale, C. Michael; Kaleri, Alexander Y.; Sargsyan, Ashot E.; Hamilton, Douglas R.; Melton, Shannon; Martin, David; Dulchavsky, Scott A.
2005-01-01
INTRODUCTION: The performance of complex tasks on the International Space Station (ISS) requires significant preflight crew training commitments and frequent skill and knowledge refreshment. This report documents a recently developed "just-in-time" training methodology, which integrates preflight hardware familiarization and procedure training with an on-orbit CD-ROM-based skill enhancement. This "just-in-time" concept was used to support real-time remote expert guidance to complete ultrasound examinations using the ISS Human Research Facility (HRF). METHODS: An American and Russian ISS crewmember received 2 h of "hands on" ultrasound training 8 mo prior to the on-orbit ultrasound exam. A CD-ROM-based Onboard Proficiency Enhancement (OPE) interactive multimedia program consisting of memory enhancing tutorials, and skill testing exercises, was completed by the crewmember 6 d prior to the on-orbit ultrasound exam. The crewmember was then remotely guided through a thoracic, vascular, and echocardiographic examination by ultrasound imaging experts. RESULTS: Results of the CD-ROM-based OPE session were used to modify the instructions during a complete 35-min real-time thoracic, cardiac, and carotid/jugular ultrasound study. Following commands from the ground-based expert, the crewmember acquired all target views and images without difficulty. The anatomical content and fidelity of ultrasound video were adequate for clinical decision making. CONCLUSIONS: Complex ultrasound experiments with expert guidance were performed with high accuracy following limited preflight training and multimedia based in-flight review, despite a 2-s communication latency. In-flight application of multimedia proficiency enhancement software, coupled with real-time remote expert guidance, facilitates the successful performance of ultrasound examinations on orbit and may have additional terrestrial and space applications.
NASA Astrophysics Data System (ADS)
Istvan Etesi, Laszlo; Tolbert, K.; Schwartz, R.; Zarro, D.; Dennis, B.; Csillaghy, A.
2010-05-01
In our project "Extending the Virtual Solar Observatory (VSO)" we have combined some of the features available in Solar Software (SSW) to produce an integrated environment for data analysis, supporting the complete workflow from data location, retrieval, preparation, and analysis to creating publication-quality figures. Our goal is an integrated analysis experience in IDL, easy-to-use but flexible enough to allow more sophisticated procedures such as multi-instrument analysis. To that end, we have made the transition from a locally oriented setting where all the analysis is done on the user's computer, to an extended analysis environment where IDL has access to services available on the Internet. We have implemented a form of Cloud Computing that uses the VSO search and a new data retrieval and pre-processing server (PrepServer) that provides remote execution of instrument-specific data preparation. We have incorporated the interfaces to the VSO search and the PrepServer into an IDL widget (SHOW_SYNOP) that provides user-friendly searching and downloading of raw solar data and optionally sends search results for pre-processing to the PrepServer prior to downloading the data. The raw and pre-processed data can be displayed with our plotting suite, PLOTMAN, which can handle different data types (light curves, images, and spectra) and perform basic data operations such as zooming, image overlays, solar rotation, etc. PLOTMAN is highly configurable and suited for visual data analysis and for creating publishable figures. PLOTMAN and SHOW_SYNOP work hand-in-hand for a convenient working environment. Our environment supports a growing number of solar instruments that currently includes RHESSI, SOHO/EIT, TRACE, SECCHI/EUVI, HINODE/XRT, and HINODE/EIS.
Application of computer-generated models using low-bandwidth vehicle data
NASA Astrophysics Data System (ADS)
Heyes, Neil J.
2002-05-01
One of the main issues with remote teleoperation of vehicles is that during visual operation, one relies on fixed camera positions that ultimately constrain the operator's view of the real world. The paper describes a solution developed at QinetiQ in which the operator is given a unique virtual perspective of the vehicle and the surrounding terrain as the vehicle operates. This system helps to solve problems that are generic to remote systems, such as reducing high data transmission rates and providing 360-degree, three-dimensional operator view positions in near real time, regardless of terrain features and light levels. A summary of technologies is listed that could be applied to different types of vehicles and placed in many different situations in order to enhance operator spatial awareness.
Probabilistic data integration and computational complexity
NASA Astrophysics Data System (ADS)
Hansen, T. M.; Cordua, K. S.; Mosegaard, K.
2016-12-01
Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the 'forward' problem). This problem can be formulated more generally as a problem of 'integration of information'. A probabilistic formulation of data integration is in principle simple: if all available information (from e.g. geology, geophysics, remote sensing, chemistry…) can be quantified probabilistically, then algorithms exist that solve the data integration problem either through an analytical description of the combined probability function or by sampling that probability function. In practice, however, probabilistically based data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. But another source of computational complexity is related to how the individual types of information are quantified. In one case, a data integration problem is demonstrated in which the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. Because one type of information is too informative (and hence conflicting), this leads to a difficult sampling problem with unrealistic uncertainty. Resolving this conflict prior to data integration leads to an easy data integration problem with no biases. In another case, it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave equation) can lead to a difficult data integration problem with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency of solving a data integration problem and can lead to biased results and under-estimation of uncertainty. However, in both examples, one can also analyze the performance of the sampling methods used to solve the data integration problem to indicate the existence of biased information. This can be used actively to avoid biases in the available information and subsequently in the final uncertainty evaluation.
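The probabilistic integration described above reduces, in the simplest setting, to multiplying independently quantified probability densities and sampling the product. The sketch below illustrates that idea for a single scalar parameter with a plain Metropolis sampler; the two information sources and their numbers are hypothetical and are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independently quantified pieces of information about one parameter m
# (hypothetical example values; the study works with spatial fields).
def log_p_geology(m):            # prior from geological knowledge
    return -0.5 * ((m - 10.0) / 4.0) ** 2

def log_p_geophysics(m):         # likelihood from a geophysical observation
    return -0.5 * ((m - 14.0) / 2.0) ** 2

def log_posterior(m):            # probabilistic integration = sum of log densities
    return log_p_geology(m) + log_p_geophysics(m)

# Plain Metropolis sampling of the combined density.
samples, m = [], 12.0
for _ in range(20000):
    prop = m + rng.normal(0.0, 1.0)
    if np.log(rng.uniform()) < log_posterior(prop) - log_posterior(m):
        m = prop
    samples.append(m)

print("posterior mean ~", np.mean(samples[5000:]))
```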
Suzuki, Keishiro; Hirasawa, Yukinori; Yaegashi, Yuji; Miyamoto, Hideki; Shirato, Hiroki
2009-01-01
We developed a web-based, remote radiation treatment planning system which allowed staff at an affiliated hospital to obtain support from a fully staffed central institution. Network security was based on a firewall and a virtual private network (VPN). Client computers were installed at a cancer centre, at a university hospital and at a staff home. We remotely operated the treatment planning computer using the Remote Desktop function built in to the Windows operating system. Except for the initial setup of the VPN router, no special knowledge was needed to operate the remote radiation treatment planning system. There was a time lag that seemed to depend on the volume of data traffic on the Internet, but it did not affect smooth operation. The initial cost and running cost of the system were reasonable.
The Fabric for Frontier Experiments Project at Fermilab
NASA Astrophysics Data System (ADS)
Kirby, Michael
2014-06-01
The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy to use job submission services for processing physics tasks on the Open Science Grid and elsewhere; 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data; 3) custom and generic database applications for calibrations, beam information, and other purposes; 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.
Improving the Capture and Re-Use of Data with Wearable Computers
NASA Technical Reports Server (NTRS)
Pfarr, Barbara; Fating, Curtis C.; Green, Daniel; Powers, Edward I. (Technical Monitor)
2001-01-01
At the Goddard Space Flight Center, members of the Real-Time Software Engineering Branch are developing a wearable, wireless, voice-activated computer for use in a wide range of crosscutting space applications that would benefit from having instant Internet, network, and computer access with complete mobility and hands-free operations. These applications can be applied across many fields and disciplines including spacecraft fabrication, integration and testing (including environmental testing), and astronaut on-orbit control and monitoring of experiments with ground based experimenters. To satisfy the needs of NASA customers, this wearable computer needs to be connected to a wireless network, to transmit and receive real-time video over the network, and to receive updated documents via the Internet or NASA servers. The voice-activated computer, with a unique vocabulary, will allow the users to access documentation in a hands free environment and interact in real-time with remote users. We will discuss wearable computer development, hardware and software issues, wireless network limitations, video/audio solutions and difficulties in language development.
NASA Technical Reports Server (NTRS)
Tanelli, Simone; Tao, Wei-Kuo; Hostetler, Chris; Kuo, Kwo-Sen; Matsui, Toshihisa; Jacob, Joseph C.; Niamsuwam, Noppasin; Johnson, Michael P.; Hair, John; Butler, Carolyn;
2011-01-01
Forward simulation is an indispensable tool for evaluation of precipitation retrieval algorithms as well as for studying snow/ice microphysics and their radiative properties. The main challenge of the implementation arises from the size of the problem domain. To overcome this hurdle, assumptions need to be made to simplify complex cloud microphysics. It is important that these assumptions are applied consistently throughout the simulation process. ISSARS addresses this issue by providing a computationally efficient and modular framework that can integrate currently existing models and is also capable of expanding for future development. ISSARS is designed to accommodate the simulation needs of the Aerosol/Clouds/Ecosystems (ACE) mission and the Global Precipitation Measurement (GPM) mission: radars, microwave radiometers, and optical instruments such as lidars and polarimeters. ISSARS's computation is performed in three stages: input reconditioning (IRM), electromagnetic properties (scattering/emission/absorption) calculation (SEAM), and instrument simulation (ISM). The computation is implemented as a web service, and its configuration can be accessed through a web-based interface.
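A minimal sketch of the three-stage structure named above (IRM, then SEAM, then ISM) is given below. All class, function, and field names, and the toy physics inside them, are invented for illustration; they only show how modular stages can be chained, not how ISSARS itself is implemented.

```python
from dataclasses import dataclass
import numpy as np

# Hypothetical stand-ins for the three stages described in the abstract:
# input reconditioning (IRM), scattering/emission/absorption (SEAM),
# and instrument simulation (ISM). Field names are illustrative only.

@dataclass
class AtmosphericColumn:
    temperature_k: np.ndarray     # per-layer temperature
    hydrometeor_gm3: np.ndarray   # per-layer condensed water content

def irm(raw_model_output: dict) -> AtmosphericColumn:
    """Recondition raw model output into a common, clean column structure."""
    return AtmosphericColumn(
        temperature_k=np.asarray(raw_model_output["t"], dtype=float),
        hydrometeor_gm3=np.clip(np.asarray(raw_model_output["q"], dtype=float), 0, None),
    )

def seam(col: AtmosphericColumn, frequency_ghz: float) -> np.ndarray:
    """Toy per-layer extinction coefficient (not a real scattering model)."""
    return 1e-3 * frequency_ghz * col.hydrometeor_gm3

def ism(extinction_per_m: np.ndarray, layer_depth_m: float = 500.0) -> float:
    """Toy instrument measurable: two-way path-integrated attenuation in dB."""
    optical_depth = float(np.sum(extinction_per_m) * layer_depth_m)
    return 2.0 * 10.0 * np.log10(np.e) * optical_depth

raw = {"t": [280, 270, 260], "q": [0.1, 0.3, 0.0]}
print(ism(seam(irm(raw), frequency_ghz=13.6)))
```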
Hengel, Belinda; Bell, Stephen; Garton, Linda; Ward, James; Rumbold, Alice; Taylor-Thomson, Debbie; Silver, Bronwyn; McGregor, Skye; Dyda, Amalie; Knox, Janet; Guy, Rebecca; Maher, Lisa; Kaldor, John Martin
2018-04-02
Young people living in remote Australian Aboriginal communities experience high rates of sexually transmissible infections (STIs). STRIVE (STIs in Remote communities, ImproVed and Enhanced primary care) was a cluster randomised control trial of a sexual health continuous quality improvement (CQI) program. As part of the trial, qualitative research was conducted to explore staff perceptions of the CQI components, their normalisation and integration into routine practice, and the factors which influenced these processes. In-depth semi-structured interviews were conducted with 41 clinical staff at 22 remote community clinics during 2011-2013. Normalisation process theory was used to frame the analysis of interview data and to provide insights into enablers and barriers to the integration and normalisation of the CQI program and its six specific components. Of the CQI components, participants reported that the clinical data reports had the highest degree of integration and normalisation. Action plan setting, the Systems Assessment Tool, and the STRIVE coordinator role, were perceived as adding value to the program, but were less readily integrated or normalised. The remaining two components (dedicated funding for health promotion and service incentive payments) were seen as least relevant. Our analysis also highlighted factors which enabled greater integration of the CQI components. These included familiarity with CQI tools, increased accountability of health centre staff and the translation of the CQI program into guideline-driven care. The analysis also identified barriers, including high staff turnover, limited time involved in the program and competing clinical demands and programs. Across all of the CQI components, the clinical data reports had the highest degree of integration and normalisation. The action plans, systems assessment tool and the STRIVE coordinator role all complemented the data reports and allowed these components to be translated directly into clinical activity. To ensure their uptake, CQI programs must acknowledge local clinical guidelines, be compatible with translation into clinical activity and have managerial support. Sexual health CQI needs to align with other CQI activities, engage staff and promote accountability through the provision of clinic specific data and regular face-to-face meetings. Australian and New Zealand Clinical Trials Registry ACTRN12610000358044 . Registered 6/05/2010. Prospectively Registered.
Space Construction Experiment Definition Study (SCEDS), part 3. Volume 2: Study results
NASA Technical Reports Server (NTRS)
1983-01-01
The essential needs of the controls and dynamics community for large space structures are addressed by the basic Space Construction Experiment (SCE)/MAST configuration and by enhanced configurations for follow-on flights. The SCE/MAST can be integrated on a single structures technology experiments platform (STEP). The experiment objectives can be accomplished without the need for EVA, and it is anticipated that further design refinements will eliminate the requirement to use the remote manipulator system.
Registration and rectification needs of geology
NASA Technical Reports Server (NTRS)
Chavez, P. S., Jr.
1982-01-01
Geologic applications of remotely sensed imaging encompass five areas of interest. The five areas include: (1) enhancement and analysis of individual images; (2) work with small area mosaics of imagery which have been map projection rectified to individual quadrangles; (3) development of large area mosaics of multiple images for several counties or states; (4) registration of multitemporal images; and (5) data integration from several sensors and map sources. Examples for each of these types of applications are summarized.
2014-01-01
Background: We recently demonstrated that the quality of spirometry in primary care could markedly improve with remote offline support from specialized professionals. It is hypothesized that implementation of automatic online assessment of spirometry quality using information and communication technologies may significantly enhance the potential for extensive deployment of a high-quality spirometry program in integrated care settings. Objective: The objective of the study was to elaborate and validate a Clinical Decision Support System (CDSS) for automatic online quality assessment of spirometry. Methods: The CDSS was developed through a three-step process: (1) identification of the optimal sampling frequency; (2) iterations to build an initial version using the 24 standard spirometry curves recommended by the American Thoracic Society; and (3) iterations to refine the CDSS using 270 curves from 90 patients. In each of these steps the results were checked against one expert. Finally, 778 spirometry curves from 291 patients were analyzed for validation purposes. Results: The CDSS generated appropriate online classification and certification in 685/778 (88.1%) of spirometry tests, with 96% sensitivity and 95% specificity. Conclusions: Consequently, only 93/778 (11.9%) of spirometry tests required offline remote classification by an expert, indicating a potential positive role of the CDSS in the deployment of a high-quality spirometry program in an integrated care setting. PMID:25600957
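As a rough illustration of what an automatic online quality check can look like, the sketch below applies two rule-based tests to a single expiratory curve. The thresholds are illustrative stand-ins loosely patterned on ATS/ERS-style acceptability criteria; this is not the validated CDSS described in the abstract.

```python
import numpy as np

def spirometry_quality(time_s, volume_l, flow_ls):
    """Toy rule-based acceptability check for one expiratory curve.

    Thresholds are illustrative stand-ins for ATS/ERS-style criteria,
    not the validated CDSS described in the abstract.
    """
    time_s, volume_l, flow_ls = map(np.asarray, (time_s, volume_l, flow_ls))
    fvc = float(volume_l.max())
    problems = []

    # Satisfactory start: back-extrapolate the steepest part of the curve to
    # zero volume; the volume already exhaled by then should be small.
    i = int(np.argmax(flow_ls))
    t_zero = time_s[i] - volume_l[i] / max(float(flow_ls[i]), 1e-6)
    bev = float(np.interp(t_zero, time_s, volume_l))
    if bev > max(0.05 * fvc, 0.15):
        problems.append("hesitant start (large back-extrapolated volume)")

    # Satisfactory end: a volume plateau (< 25 mL change over the last second)
    # or a sufficiently long expiration.
    last_second = float(volume_l[-1] - np.interp(time_s[-1] - 1.0, time_s, volume_l))
    if time_s[-1] < 6.0 and last_second > 0.025:
        problems.append("early termination (no plateau, expiration < 6 s)")

    return ("acceptable", problems) if not problems else ("needs review", problems)
```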
Imaging and Non-Imaging Polarimetric Methods for Remote Sensing
2016-02-09
[Only fragmentary indexed text is available for this report: a citation to T. Wakayama, K. Komaki, I. Vaughn, J. S. Tyo, Y. Otani, and T. Yoshizawa on evaluation of the Mueller matrix of an achromatic axially symmetric element; a note that the integral over all time in Eq. 4.6 can be computed by evaluating the Fourier transform of the integrand at f = 0; and a remark that the equally weighted variance (EWV) is an appropriate metric for evaluating Stokes polarimeters, with Twietmeyer later adopting a similar metric for Mueller polarimeters.]
Telescience Resource Kit Software Capabilities and Future Enhancements
NASA Technical Reports Server (NTRS)
Schneider, Michelle
2004-01-01
The Telescience Resource Kit (TReK) is a suite of PC-based software applications that can be used to monitor and control a payload on board the International Space Station (ISS). This software provides a way for payload users to operate their payloads from their home sites. It can be used by an individual or a team of people. TReK provides both local ground support system services and an interface to utilize remote services provided by the Payload Operations Integration Center (POIC). For example, TReK can be used to receive payload data distributed by the POIC and to perform local data functions such as processing the data, storing it in local files, and forwarding it to other computer systems. TReK can also be used to build, send, and track payload commands. In addition to these features, work is in progress to add a new command management capability. This capability will provide a way to manage a multi-platform command environment that can include geographically distributed computers. This is intended to help those teams that need to manage a shared on-board resource such as a facility-class payload. The environment can be configured such that one individual can manage all the command activities associated with that payload. This paper will provide a summary of existing TReK capabilities and a description of the new command management capability.
Groundwater resource exploration in Salem district, Tamil Nadu using GIS and remote sensing
NASA Astrophysics Data System (ADS)
Maheswaran, G.; Selvarani, A. Geetha; Elangovan, K.
2016-03-01
Over the last decade, the value of a barrel of potable groundwater has outpaced the value of a barrel of oil in many areas of the world. Hence, proper assessment of groundwater potential and sound management practices are the needs of the day. Establishing a relationship between remote sensing data and hydrologic phenomena can maximize the efficiency of water resources development projects. The present study focuses on groundwater potential assessment in Salem district, Tamil Nadu, to investigate the groundwater resource potential. All thematic layers important from the point of view of groundwater occurrence and movement were digitized and integrated in the GIS environment. The weights of the different parameters/themes were computed using weighted index overlay analysis (WIOA), the analytic hierarchy process (AHP), and fuzzy logic techniques. Through this integrated GIS analysis, a groundwater prospect map of the study area was prepared qualitatively. Field verification at observation wells, including depths of water measured at the wells, was used to verify the identified potential zones. The map generated from the weighted overlay using AHP performed very well in predicting the groundwater surface, and hence this methodology proves to be a promising tool for the future.
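The two computational steps named above, deriving AHP weights from pairwise comparisons and combining reclassified thematic layers in a weighted overlay, can be sketched as follows. The pairwise judgements and the layer rasters are hypothetical placeholders, not the values used in the Salem district study.

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix
    (principal eigenvector, normalised to sum to 1)."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

# Hypothetical Saaty-scale judgements for three thematic layers:
# geology vs. lineament density vs. drainage density.
pairwise = [[1,   3,   5],
            [1/3, 1,   3],
            [1/5, 1/3, 1]]
w = ahp_weights(pairwise)

# Toy rasters already reclassified to a common 1-5 suitability scale.
rng = np.random.default_rng(1)
layers = rng.integers(1, 6, size=(3, 4, 4))

# Weighted overlay: the per-pixel weighted sum gives the groundwater-prospect score.
prospect = np.tensordot(w, layers, axes=1)
print(np.round(prospect, 2))
```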
SCORPION persistent surveillance system with universal gateway
NASA Astrophysics Data System (ADS)
Coster, Michael; Chambers, Jon; Winters, Michael; Belesi, Joe
2008-04-01
This paper addresses benefits derived from the universal gateway utilized in Northrop Grumman Systems Corporation's (NGSC) SCORPION, a persistent surveillance and target recognition system produced by the Xetron campus in Cincinnati, Ohio. SCORPION is currently deployed in Operations Iraqi Freedom (OIF) and Enduring Freedom (OEF). The SCORPION universal gateway is a flexible, field programmable system that provides integration of over forty Unattended Ground Sensor (UGS) types from a variety of manufacturers, multiple visible and thermal electro-optical (EO) imagers, and numerous long haul satellite and terrestrial communications links, including the Army Research Lab (ARL) Blue Radio. Xetron has been integrating best in class sensors with this universal gateway to provide encrypted data exfiltration and remote sensor command and control since 1998. SCORPION data can be distributed point to point, or to multiple Common Operational Picture (COP) systems, including Command and Control Personal Computer (C2PC), Common Data Interchange Format for the Situational Awareness Display (CDIF/SAD), Force XXI Battle Command Brigade and Below (FBCB2), Defense Common Ground Systems (DCGS), and Remote Automated Position Identification System (RAPIDS).
Resolution Enhancement of Spaceborne Radiometer Images
NASA Technical Reports Server (NTRS)
Krim, Hamid
2001-01-01
Our progress over the last year has been along several dimensions: (1) exploration and understanding of the Earth Observing System (EOS) mission with available data from NASA; (2) a comprehensive review of state-of-the-art techniques and identification of limitations to be investigated (e.g., computational, algorithmic); and (3) preliminary development of resolution enhancement algorithms. With the advent of well-calibrated satellite microwave radiometers, it is now possible to obtain long time series of geophysical parameters that are important for studying the global hydrologic cycle and the earth radiation budget. Over the world's oceans, these radiometers simultaneously measure profiles of air temperature and the three phases of atmospheric water (vapor, liquid, and ice). In addition, surface parameters such as the near-surface wind speed, the sea surface temperature, and the sea ice type and concentration can be retrieved. The Special Sensor Microwave/Imager (SSM/I) has wide application in atmospheric remote sensing over the ocean and provides essential inputs to numerical weather-prediction models. SSM/I data have also been used for land and ice studies, including snow cover classification, measurements of soil and plant moisture contents, atmospheric moisture over land, land surface temperature, and mapping of polar ice. The brightness temperature observed by SSM/I is a function of the effective brightness temperature of the earth's surface and the emission, scattering, and attenuation of the atmosphere. The Advanced Microwave Scanning Radiometer (AMSR) is a new instrument that will measure the earth's radiation over the spectral range from 7 to 90 GHz. Over the world's oceans, it will be possible to retrieve four important geophysical parameters: SST, wind speed, vertically integrated water vapor, and vertically integrated cloud liquid water.
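As a hedged illustration of what a resolution enhancement algorithm for radiometer imagery can look like, the sketch below runs a generic Landweber-type deconvolution against a Gaussian stand-in for the antenna footprint. The footprint model, step size, and test scene are assumptions; this is not the algorithm developed in the work described above.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def footprint(x, sigma=2.0):
    """Stand-in for the radiometer antenna pattern (Gaussian, hence symmetric:
    the adjoint operator equals the forward operator)."""
    return gaussian_filter(x, sigma)

def enhance(tb_observed, n_iter=50, step=1.0):
    """Generic Landweber iteration: repeatedly push the blurred residual back."""
    estimate = tb_observed.copy()
    for _ in range(n_iter):
        residual = tb_observed - footprint(estimate)
        estimate = estimate + step * footprint(residual)
    return estimate

# Synthetic test: a sharp brightness-temperature feature seen through the footprint.
truth = np.full((64, 64), 250.0)
truth[28:36, 28:36] = 280.0
observed = footprint(truth)
print(np.abs(truth - observed).max(), np.abs(truth - enhance(observed)).max())
```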
An Interactive Web-Based Analysis Framework for Remote Sensing Cloud Computing
NASA Astrophysics Data System (ADS)
Wang, X. Z.; Zhang, H. M.; Zhao, J. H.; Lin, Q. H.; Zhou, Y. C.; Li, J. H.
2015-07-01
Spatiotemporal data, especially remote sensing data, are widely used in ecological, geographical, agricultural, and military research and applications. With the development of remote sensing technology, more and more remote sensing data are accumulated and stored in the cloud. An effective way for cloud users to access and analyse these massive spatiotemporal data in web clients has become an urgent issue. In this paper, we propose a new scalable, interactive and web-based cloud computing solution for massive remote sensing data analysis. We build a spatiotemporal analysis platform to provide the end-user with a safe and convenient way to access massive remote sensing data stored in the cloud. The lightweight cloud storage system used to store public data and users' private data is constructed based on an open-source distributed file system. In it, massive remote sensing data are stored as public data, while the intermediate and input data are stored as private data. The elastic, scalable, and flexible cloud computing environment is built using Docker, an open-source lightweight container technology for the Linux operating system. In the Docker container, open-source software such as IPython, NumPy, GDAL, and GRASS GIS is deployed. Users can write scripts in the IPython Notebook web page through the web browser to process data, and the scripts are submitted to the IPython kernel to be executed. By comparing the performance of remote sensing data analysis tasks executed in Docker containers, KVM virtual machines, and physical machines, we conclude that the cloud computing environment built with Docker makes the greatest use of the host system resources and can handle more concurrent spatio-temporal computing tasks. Docker provides resource isolation for I/O, CPU, and memory, which offers a security guarantee when processing remote sensing data in the IPython Notebook. Users can write complex data processing code on the web directly, so they can design their own data processing algorithms.
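The kind of notebook-cell script a user might submit to such a container is sketched below: GDAL reads two bands and NumPy computes an NDVI. The dataset path and band ordering are hypothetical.

```python
# Example of the kind of script a user might run from the IPython Notebook
# described above; the dataset path and band ordering are hypothetical.
import numpy as np
from osgeo import gdal

ds = gdal.Open("/data/public/scene_example.tif")    # hypothetical shared dataset
red = ds.GetRasterBand(3).ReadAsArray().astype("float64")
nir = ds.GetRasterBand(4).ReadAsArray().astype("float64")

ndvi = (nir - red) / np.maximum(nir + red, 1e-9)     # avoid division by zero
print("mean NDVI:", float(np.nanmean(ndvi)))
```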
NASA Astrophysics Data System (ADS)
Teng, W. L.; de Jeu, R. A.; Doraiswamy, P. C.; Kempler, S. J.; Shannon, H. D.
2009-12-01
A primary goal of the U.S. Department of Agriculture (USDA) is to expand markets for U.S. agricultural products and support global economic development. The USDA World Agricultural Outlook Board (WAOB) supports this goal by developing monthly World Agricultural Supply and Demand Estimates (WASDE) for the U.S. and major foreign producing countries. Because weather has a significant impact on crop progress, conditions, and production, WAOB prepares frequent agricultural weather assessments, in a GIS-based, Global Agricultural Decision Support Environment (GLADSE). The main objective of this project, thus, is to improve WAOB's estimates by integrating NASA remote sensing soil moisture observations and research results into GLADSE. Soil moisture is a primary data gap at WAOB. Soil moisture data, generated by the Land Parameter Retrieval Model (LPRM, developed by NASA GSFC and Vrije Universiteit Amsterdam) and customized to WAOB's requirements, will be directly integrated into GLADSE, as well as indirectly by first being integrated into USDA Agricultural Research Service (ARS)'s Environmental Policy Integrated Climate (EPIC) crop model. The LPRM-enhanced EPIC will be validated using three major agricultural regions important to WAOB and then integrated into GLADSE. Project benchmarking will be based on retrospective analyses of WAOB's analog year comparisons. The latter are between a given year and historical years with similar weather patterns. WAOB is the focal point for economic intelligence within the USDA. Thus, improving WAOB's agricultural estimates by integrating NASA satellite observations and model outputs will visibly demonstrate the value of NASA resources and maximize the societal benefits of NASA investments.
Accounting for ecosystem assets using remote sensing in the Colombian Orinoco River Basin lowlands
NASA Astrophysics Data System (ADS)
Vargas, Leonardo; Hein, Lars; Remme, Roy P.
2017-04-01
Worldwide, ecosystem change compromises the supply of ecosystem services (ES). Better managing ecosystems requires detailed information on these changes and their implications for ES supply. Ecosystem accounting has been developed as an environmental-economic accounting system using concepts aligned with the System of National Accounts. Ecosystem accounting requires spatial information from a local to national scale. The objective of this paper is to explore how remote sensing can be used to analyze ecosystems using an accounting approach in the Orinoco River Basin. We assessed ecosystem assets in terms of extent, condition, and capacity to supply ES. We focus on four specific ES: grasslands grazed by cattle, timber harvesting, oil palm fresh fruit bunches harvesting, and carbon sequestration. We link ES with six ecosystem assets: savannahs, woody grasslands, mixed agroecosystems, very dense forests, dense forest, and oil palm plantations. We used remote sensing vegetation and productivity indexes to measure ecosystem assets. We found that remote sensing is a powerful tool to estimate ecosystem extent. The enhanced vegetation index can be used to assess ecosystems condition, and net primary productivity can be used for the assessment of ecosystem assets capacity to supply ES. Integrating remote sensing and ecological information facilitates efficient monitoring of ecosystem assets.
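Since the enhanced vegetation index (EVI) is central to the condition assessment described above, a minimal sketch of the standard EVI computation is given below, using the commonly published MODIS-style coefficients; the reflectance values are hypothetical.

```python
import numpy as np

def evi(nir, red, blue, g=2.5, c1=6.0, c2=7.5, l=1.0):
    """Enhanced vegetation index with the commonly published MODIS coefficients."""
    nir, red, blue = (np.asarray(b, dtype=float) for b in (nir, red, blue))
    return g * (nir - red) / (nir + c1 * red - c2 * blue + l)

# Hypothetical surface-reflectance values (0-1) for a savannah pixel and a dense-forest pixel.
print(evi(nir=[0.30, 0.45], red=[0.12, 0.04], blue=[0.06, 0.02]))
```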
Data management and analysis for the Earth System Grid
NASA Astrophysics Data System (ADS)
Williams, D. N.; Ananthakrishnan, R.; Bernholdt, D. E.; Bharathi, S.; Brown, D.; Chen, M.; Chervenak, A. L.; Cinquini, L.; Drach, R.; Foster, I. T.; Fox, P.; Hankin, S.; Henson, V. E.; Jones, P.; Middleton, D. E.; Schwidder, J.; Schweitzer, R.; Schuler, R.; Shoshani, A.; Siebenlist, F.; Sim, A.; Strand, W. G.; Wilhelmi, N.; Su, M.
2008-07-01
The international climate community is expected to generate hundreds of petabytes of simulation data within the next five to seven years. This data must be accessed and analyzed by thousands of analysts worldwide in order to provide accurate and timely estimates of the likely impact of climate change on physical, biological, and human systems. Climate change is thus not only a scientific challenge of the first order but also a major technological challenge. In order to address this technological challenge, the Earth System Grid Center for Enabling Technologies (ESG-CET) has been established within the U.S. Department of Energy's Scientific Discovery through Advanced Computing (SciDAC)-2 program, with support from the offices of Advanced Scientific Computing Research and Biological and Environmental Research. ESG-CET's mission is to provide climate researchers worldwide with access to the data, information, models, analysis tools, and computational capabilities required to make sense of enormous climate simulation datasets. Its specific goals are to (1) make data more useful to climate researchers by developing Grid technology that enhances data usability; (2) meet specific distributed database, data access, and data movement needs of national and international climate projects; (3) provide a universal and secure web-based data access portal for broad multi-model data collections; and (4) provide a wide-range of Grid-enabled climate data analysis tools and diagnostic methods to international climate centers and U.S. government agencies. Building on the successes of the previous Earth System Grid (ESG) project, which has enabled thousands of researchers to access tens of terabytes of data from a small number of ESG sites, ESG-CET is working to integrate a far larger number of distributed data providers, high-bandwidth wide-area networks, and remote computers in a highly collaborative problem-solving environment.
National Fusion Collaboratory: Grid Computing for Simulations and Experiments
NASA Astrophysics Data System (ADS)
Greenwald, Martin
2004-05-01
The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.
Tasseled cap transformation for HJ multispectral remote sensing data
NASA Astrophysics Data System (ADS)
Han, Ling; Han, Xiaoyong
2015-12-01
The tasseled cap transformation of remote sensing data has been widely used in environment, agriculture, forest, and ecology applications. A tasseled cap transformation coefficient matrix for HJ multispectral data was established by using Givens rotation matrices to rotate the principal component transform vectors toward the whiteness, greenness, and blueness directions of ground objects, based on 24 scenes of year-round HJ multispectral remote sensing data. The whiteness component enhances brightness differences among ground objects; the greenness component enhances vegetation characteristics while preserving detailed information on vegetation change; and the blueness component significantly enhances factories with blue plastic roofs around towns and can also enhance the brightness of water. The HJ tasseled cap transformation coefficient matrix should enhance the application of HJ multispectral remote sensing data in these fields.
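Once a coefficient matrix has been derived, applying the tasseled cap transformation is a per-pixel matrix multiplication, as sketched below. The 3 x 4 coefficients shown are placeholders only, not the HJ coefficients established in the paper.

```python
import numpy as np

# Placeholder 3 x 4 coefficient matrix (whiteness, greenness, blueness rows
# by four HJ CCD bands); NOT the coefficients derived in the paper.
coeffs = np.array([
    [0.45, 0.50, 0.55, 0.49],
    [-0.30, -0.35, 0.60, 0.65],
    [0.20, 0.25, -0.55, 0.60],
])

def tasseled_cap(image_bands):
    """Apply the transformation to an image stacked as (bands, rows, cols)."""
    bands, rows, cols = image_bands.shape
    flat = image_bands.reshape(bands, rows * cols)
    return (coeffs @ flat).reshape(3, rows, cols)

rng = np.random.default_rng(0)
components = tasseled_cap(rng.random((4, 100, 100)))
print(components.shape)   # (3, 100, 100): whiteness, greenness, blueness
```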
Remote sensing image ship target detection method based on visual attention model
NASA Astrophysics Data System (ADS)
Sun, Yuejiao; Lei, Wuhu; Ren, Xiaodong
2017-11-01
Traditional methods of detecting ship targets in remote sensing images mostly use a sliding window to search the whole image exhaustively. However, the target usually occupies only a small fraction of the image, so this approach has high computational complexity for large-format visible image data. The bottom-up selective attention mechanism can selectively allocate computing resources according to visual stimuli, thus improving computational efficiency and reducing the difficulty of analysis. Considering this, a method of ship target detection in remote sensing images based on a visual attention model is proposed in this paper. The experimental results show that the proposed method reduces the computational complexity while improving the detection accuracy, thereby improving the detection efficiency of ship targets in remote sensing images.
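A hedged sketch of the bottom-up attention idea is shown below: a centre-surround saliency map (in the spirit of Itti-style models) selects a few candidate regions so that only a small fraction of the image needs detailed analysis. The cue, scales, and threshold are assumptions, not the specific model proposed in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, label

def saliency_map(gray):
    """Bottom-up centre-surround saliency: difference of fine and coarse
    Gaussian-smoothed versions of the intensity image (an Itti-style cue)."""
    center = gaussian_filter(gray, sigma=2)
    surround = gaussian_filter(gray, sigma=16)
    s = np.abs(center - surround)
    return (s - s.min()) / (np.ptp(s) + 1e-9)

def candidate_regions(gray, threshold=0.6):
    """Connected components of the most salient pixels; only these small
    regions would then be passed to a detailed ship classifier."""
    mask = saliency_map(gray) > threshold
    labels, n = label(mask)
    return labels, n

# Synthetic sea scene with two bright ship-like blobs.
rng = np.random.default_rng(2)
sea = 0.1 * rng.random((256, 256))
sea[100:106, 40:70] += 0.9
sea[200:204, 150:175] += 0.9
_, n = candidate_regions(sea)
print("candidate regions:", n)
```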
Atoche, Alejandro Castillo; Castillo, Javier Vázquez
2012-01-01
A high-speed dual super-systolic core for reconstructive signal processing (SP) operations consists of a double parallel systolic array (SA) machine in which each processing element of the array is also conceptualized as another SA in a bit-level fashion. In this study, we addressed the design of a high-speed dual super-systolic array (SSA) core for the enhancement/reconstruction of remote sensing (RS) imaging of radar/synthetic aperture radar (SAR) sensor systems. The selected reconstructive SP algorithms are efficiently transformed in their parallel representation and then, they are mapped into an efficient high performance embedded computing (HPEC) architecture in reconfigurable Xilinx field programmable gate array (FPGA) platforms. As an implementation test case, the proposed approach was aggregated in a HW/SW co-design scheme in order to solve the nonlinear ill-posed inverse problem of nonparametric estimation of the power spatial spectrum pattern (SSP) from a remotely sensed scene. We show how such dual SSA core, drastically reduces the computational load of complex RS regularization techniques achieving the required real-time operational mode. PMID:22736964
Impact of remote sensing upon the planning, management, and development of water resources
NASA Technical Reports Server (NTRS)
Loats, H. L.; Fowler, T. R.; Frech, S. L.
1974-01-01
A survey of the principal water resource users was conducted to determine the impact of new remote data streams on hydrologic computer models. The analysis of the responses and direct contact demonstrated that: (1) the majority of water resource effort of the type suitable to remote sensing inputs is conducted by major federal water resources agencies or through federally stimulated research, (2) the federal government develops most of the hydrologic models used in this effort; and (3) federal computer power is extensive. The computers, computer power, and hydrologic models in current use were determined.
NASA Astrophysics Data System (ADS)
Meertens, C. M.; Murray, D.; McWhirter, J.
2004-12-01
Over the last five years, UNIDATA has developed an extensible and flexible software framework for analyzing and visualizing geoscience data and models. The Integrated Data Viewer (IDV), initially developed for visualization and analysis of atmospheric data, has broad interdisciplinary application across the geosciences including atmospheric, ocean, and most recently, earth sciences. As part of the NSF-funded GEON Information Technology Research project, UNAVCO has enhanced the IDV to display earthquakes, GPS velocity vectors, and plate boundary strain rates. These and other geophysical parameters can be viewed simultaneously with three-dimensional seismic tomography and mantle geodynamic model results. Disparate data sets of different formats, variables, geographical projections and scales can automatically be displayed in a common projection. The IDV is efficient and fully interactive allowing the user to create and vary 2D and 3D displays with contour plots, vertical and horizontal cross-sections, plan views, 3D isosurfaces, vector plots and streamlines, as well as point data symbols or numeric values. Data probes (values and graphs) can be used to explore the details of the data and models. The IDV is a freely available Java application using Java3D and VisAD and runs on most computers. UNIDATA provides easy-to-follow instructions for download, installation and operation of the IDV. The IDV primarily uses netCDF, a self-describing binary file format, to store multi-dimensional data, related metadata, and source information. The IDV is designed to work with OPeNDAP-equipped data servers that provide real-time observations and numerical models from distributed locations. Users can capture and share screens and animations, or exchange XML "bundles" that contain the state of the visualization and embedded links to remote data files. A real-time collaborative feature allows groups of users to remotely link IDV sessions via the Internet and simultaneously view and control the visualization. A Jython-based formulation facility allows computations on disparate data sets using simple formulas. Although the IDV is an advanced tool for research, its flexible architecture has also been exploited for educational purposes with the Virtual Geophysical Exploration Environment (VGEE) development. The VGEE demonstration added physical concept models to the IDV and curricula for atmospheric science education intended for the high school to graduate student levels.
NASA Technical Reports Server (NTRS)
Anderson, R. C.; Summers, R. L.
1981-01-01
An integrated gas analysis system designed to operate in automatic, semiautomatic, and manual modes from a remote control panel is described. The system measures carbon monoxide, oxygen, water vapor, total hydrocarbons, carbon dioxide, and oxides of nitrogen. A pull-through design provides increased reliability and eliminates the need for manual flow rate adjustment and pressure correction. The system contains two microprocessors to range the analyzers, calibrate the system, process the raw data into units of concentration, and provide information to the facility research computer and to the operator through a terminal and the control panels. After initial setup, the system operates for several hours without significant operator attention.
Web-based network analysis and visualization using CellMaps
Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín
2016-01-01
Summary: CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. Availability and Implementation: The application is available at: http://cellmaps.babelomics.org/ and the code can be found in: https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27296979
Web-based network analysis and visualization using CellMaps.
Salavert, Francisco; García-Alonso, Luz; Sánchez, Rubén; Alonso, Roberto; Bleda, Marta; Medina, Ignacio; Dopazo, Joaquín
2016-10-01
CellMaps is an HTML5 open-source web tool that allows displaying, editing, exploring and analyzing biological networks as well as integrating metadata into them. Computations and analyses are remotely executed in high-end servers, and all the functionalities are available through RESTful web services. CellMaps can easily be integrated in any web page by using an available JavaScript API. The application is available at: http://cellmaps.babelomics.org/ and the code can be found at: https://github.com/opencb/cell-maps. The client is implemented in JavaScript and the server in C and Java. Contact: jdopazo@cipf.es. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.
Computer integration of engineering design and production: A national opportunity
NASA Astrophysics Data System (ADS)
1984-10-01
The National Aeronautics and Space Administration (NASA), as a purchaser of a variety of manufactured products, including complex space vehicles and systems, clearly has a stake in the advantages of computer-integrated manufacturing (CIM). Two major NASA objectives are to launch a Manned Space Station by 1992 with a budget of $8 billion, and to be a leader in the development and application of productivity-enhancing technology. At the request of NASA, a National Research Council committee visited five companies that have been leaders in using CIM. Based on these case studies, technical, organizational, and financial issues that influence computer integration are described, guidelines for its implementation in industry are offered, and the use of CIM to manage the space station program is recommended.
Computer integration of engineering design and production: A national opportunity
NASA Technical Reports Server (NTRS)
1984-01-01
The National Aeronautics and Space Administration (NASA), as a purchaser of a variety of manufactured products, including complex space vehicles and systems, clearly has a stake in the advantages of computer-integrated manufacturing (CIM). Two major NASA objectives are to launch a Manned Space Station by 1992 with a budget of $8 billion, and to be a leader in the development and application of productivity-enhancing technology. At the request of NASA, a National Research Council committee visited five companies that have been leaders in using CIM. Based on these case studies, technical, organizational, and financial issues that influence computer integration are described, guidelines for its implementation in industry are offered, and the use of CIM to manage the space station program is recommended.
Development of a PC-based ground support system for a small satellite instrument
NASA Astrophysics Data System (ADS)
Deschambault, Robert L.; Gregory, Philip R.; Spenler, Stephen; Whalen, Brian A.
1993-11-01
The importance of effective ground support for the remote control and data retrieval of a satellite instrument cannot be overstated. Problems with ground support may include the need to base personnel at a ground tracking station for extended periods, and the delay between the instrument observation and the processing of the data by the science team. Flexible solutions to such problems in the case of small satellite systems are provided by using low-cost, powerful personal computers and off-the-shelf software for data acquisition and processing, and by using the Internet as a communication pathway to enable scientists to view and manipulate satellite data in real time at any ground location. The personal-computer-based ground support system is illustrated for the case of the cold plasma analyzer flown on the Freja satellite. Commercial software was used as building blocks for writing the ground support equipment software. Several levels of hardware support, including unit tests and development, functional tests, and integration, were provided by portable and desktop personal computers. Satellite stations in Saskatchewan and Sweden were linked to the science team via phone lines and the Internet, which provided remote control through a central point. These successful strategies will be used on future small satellite space programs.
More About Software for No-Loss Computing
NASA Technical Reports Server (NTRS)
Edmonds, Iarina
2007-01-01
A document presents some additional information on the subject matter of "Integrated Hardware and Software for No- Loss Computing" (NPO-42554), which appears elsewhere in this issue of NASA Tech Briefs. To recapitulate: The hardware and software designs of a developmental parallel computing system are integrated to effectuate a concept of no-loss computing (NLC). The system is designed to reconfigure an application program such that it can be monitored in real time and further reconfigured to continue a computation in the event of failure of one of the computers. The design provides for (1) a distributed class of NLC computation agents, denoted introspection agents, that effects hierarchical detection of anomalies; (2) enhancement of the compiler of the parallel computing system to cause generation of state vectors that can be used to continue a computation in the event of a failure; and (3) activation of a recovery component when an anomaly is detected.
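The state-vector idea, capturing enough state to continue a computation elsewhere after a failure, can be illustrated with the toy checkpointing pattern below. This is only a sketch of the general pattern; it does not represent the NLC system's compiler-generated state vectors or its introspection agents.

```python
import pickle

class CheckpointedLoop:
    """Toy illustration of the 'state vector' idea: periodically capture enough
    state to resume a long computation on another node after a failure.
    (A sketch of the general pattern only, not the NLC system's design.)"""

    def __init__(self, n_steps, checkpoint_every=100):
        self.state = {"i": 0, "acc": 0.0}   # everything needed to continue
        self.n_steps = n_steps
        self.checkpoint_every = checkpoint_every

    def run(self, checkpoint_path="state_vector.pkl"):
        while self.state["i"] < self.n_steps:
            self.state["acc"] += self.state["i"] ** 0.5   # the actual work
            self.state["i"] += 1
            if self.state["i"] % self.checkpoint_every == 0:
                with open(checkpoint_path, "wb") as f:    # persist the state vector
                    pickle.dump(self.state, f)
        return self.state["acc"]

    @classmethod
    def resume(cls, n_steps, checkpoint_path="state_vector.pkl"):
        """Reconstruct the computation from the last state vector (possibly on
        a different machine) and continue where it left off."""
        loop = cls(n_steps)
        with open(checkpoint_path, "rb") as f:
            loop.state = pickle.load(f)
        return loop

print(CheckpointedLoop(1000).run())
```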
Modeling Global Urbanization Supported by Nighttime Light Remote Sensing
NASA Astrophysics Data System (ADS)
Zhou, Y.
2015-12-01
Urbanization, a major driver of global change, profoundly impacts our physical and social world, for example, altering carbon cycling and climate. Understanding these consequences for better scientific insights and effective decision-making unarguably requires accurate information on urban extent and its spatial distributions. In this study, we developed a cluster-based method to estimate the optimal thresholds and map urban extents from the nighttime light remote sensing data, extended this method to the global domain by developing a computational method (parameterization) to estimate the key parameters in the cluster-based method, and built a consistent 20-year global urban map series to evaluate the time-reactive nature of global urbanization (e.g. 2000 in Fig. 1). Supported by urban maps derived from nightlights remote sensing data and socio-economic drivers, we developed an integrated modeling framework to project future urban expansion by integrating a top-down macro-scale statistical model with a bottom-up urban growth model. With the models calibrated and validated using historical data, we explored urban growth at the grid level (1-km) over the next two decades under a number of socio-economic scenarios. The derived spatiotemporal information of historical and potential future urbanization will be of great value with practical implications for developing adaptation and risk management measures for urban infrastructure, transportation, energy, and water systems when considered together with other factors such as climate variability and change, and high impact weather events.
Structurally Integrated Antenna Concepts for HALE UAVs
NASA Technical Reports Server (NTRS)
Cravey, Robin L.; Vedeler, Erik; Goins, Larry; Young, W. Robert; Lawrence, Roland W.
2006-01-01
This technical memorandum describes work done in support of the Multifunctional Structures and Materials Team under the Vehicle Systems Program's ITAS (Integrated Tailored Aero Structures) Project during FY 2005. The Electromagnetics and Sensors Branch (ESB) developed three ultra lightweight antenna concepts compatible with HALE UAVs (High Altitude Long Endurance Unmanned Aerial Vehicles). ESB also developed antenna elements that minimize the interaction between elements and the vehicle to minimize the impact of wing flexure on the EM (electromagnetic) performance of the integrated array. In addition, computer models were developed to perform phase correction for antenna arrays whose elements are moving relative to each other due to wing deformations expected in HALE vehicle concepts. Development of lightweight, conformal or structurally integrated antenna elements and compensating for the impact of a lightweight, flexible structure on a large antenna array are important steps in the realization of HALE UAVs for microwave applications such as passive remote sensing and communications.
NASA Astrophysics Data System (ADS)
Prodanovic, M.; Esteva, M.; Ketcham, R. A.
2017-12-01
Nanometer to centimeter-scale imaging such as (focused ion beam) scattered electron microscopy, magnetic resonance imaging and X-ray (micro)tomography has since the 1990s introduced 2D and 3D datasets of rock microstructure that allow investigation of nonlinear flow and mechanical phenomena on length scales that are otherwise inaccessible to laboratory measurements. The numerical approaches that use such images produce various upscaled parameters required by subsurface flow and deformation simulators. All of this has revolutionized our knowledge about grain scale phenomena. However, a lack of data-sharing infrastructure among research groups makes it difficult to integrate different length scales. We have developed a sustainable, open and easy-to-use repository called the Digital Rocks Portal (https://www.digitalrocksportal.org) that (1) organizes images and related experimental measurements of different porous materials, and (2) improves access to them for a wider community of engineering and geosciences researchers not necessarily trained in computer science or data analysis. The Digital Rocks Portal (NSF EarthCube Grant 1541008) is the first repository for imaged porous microstructure data. It is implemented within the reliable, 24/7 maintained High Performance Computing Infrastructure supported by the Texas Advanced Computing Center (University of Texas at Austin). Long-term storage is provided through the University of Texas System Research Cyber-infrastructure initiative. We show how the data can be documented, referenced in publications via digital object identifiers, visualized, searched for and linked to other repositories. We show the recently implemented integration of remote parallel visualization, bulk upload for large datasets, as well as a preliminary flow simulation workflow with the pore structures currently stored in the repository. We discuss the issues of collecting correct metadata, data discoverability and repository sustainability.
Remote Symbolic Computation of Loci
ERIC Educational Resources Information Center
Abanades, Miguel A.; Escribano, Jesus; Botana, Francisco
2010-01-01
This article presents a web-based tool designed to compute certified equations and graphs of geometric loci specified using standard Dynamic Geometry Systems (DGS). Complementing the graphing abilities of the considered DGS, the equations of the loci produced by the application are remotely computed using symbolic algebraic techniques from the…
NASA Astrophysics Data System (ADS)
Payton, Jamie; Barnes, Tiffany; Buch, Kim; Rorrer, Audrey; Zuo, Huifang
2015-07-01
This study is a follow-up to one published in computer science education in 2010 that reported preliminary results showing a positive impact of service learning on student attitudes associated with success and retention in computer science. That paper described how service learning was incorporated into a computer science course in the context of the Students & Technology in Academia, Research, and Service (STARS) Alliance, an NSF-supported broadening participation in computing initiative that aims to diversify the computer science pipeline through innovative pedagogy and inter-institutional partnerships. The current paper describes how the STARS Alliance has expanded to diverse institutions, all using service learning as a vehicle for broadening participation in computing and enhancing attitudes and behaviors associated with student success. Results supported the STARS model of service learning for enhancing computing efficacy and computing commitment and for providing diverse students with many personal and professional development benefits.
A study on haptic collaborative game in shared virtual environment
NASA Astrophysics Data System (ADS)
Lu, Keke; Liu, Guanyang; Liu, Lingzhi
2013-03-01
A study of a collaborative game in a shared virtual environment with haptic feedback over computer networks is introduced in this paper. A collaborative task was used in which players located at remote sites played the game together. Unlike traditional networked multiplayer games, the players receive both visual and haptic feedback in the virtual environment. The experiment was designed with two conditions: visual feedback only and visual-haptic feedback. The goal of the experiment was to assess the impact of force feedback on collaborative task performance. Results indicate that haptic feedback is beneficial for performance enhancement in a collaborative game in a shared virtual environment. The outcomes of this research can have a powerful impact on networked computer games.
Neural networks: Application to medical imaging
NASA Technical Reports Server (NTRS)
Clarke, Laurence P.
1994-01-01
The research mission is the development of computer assisted diagnostic (CAD) methods for improved diagnosis of medical images including digital x-ray sensors and tomographic imaging modalities. The CAD algorithms include advanced methods for adaptive nonlinear filters for image noise suppression, hybrid wavelet methods for feature segmentation and enhancement, and high convergence neural networks for feature detection and VLSI implementation of neural networks for real time analysis. Other missions include (1) implementation of CAD methods on hospital based picture archiving computer systems (PACS) and information networks for central and remote diagnosis and (2) collaboration with defense and medical industry, NASA, and federal laboratories in the area of dual use technology conversion from defense or aerospace to medicine.
Supervising simulations with the Prodiguer Messaging Platform
NASA Astrophysics Data System (ADS)
Greenslade, Mark; Carenton, Nicolas; Denvil, Sebastien
2015-04-01
At any one moment in time, researchers affiliated with the Institut Pierre Simon Laplace (IPSL) climate modeling group are running hundreds of global climate simulations. These simulations execute upon a heterogeneous set of High Performance Computing (HPC) environments spread throughout France. The IPSL's simulation execution runtime is called libIGCM (library for IPSL Global Climate Modeling group). libIGCM has recently been enhanced so as to support realtime operational use cases. Such use cases include simulation monitoring, data publication, environment metrics collection, and automated simulation control. At the core of this enhancement is the Prodiguer messaging platform. libIGCM now emits information, in the form of messages, for remote processing at IPSL servers in Paris. The remote message processing takes several forms, for example: 1. Persisting message content to database(s); 2. Notifying an operator of changes in a simulation's execution status; 3. Launching rollback jobs upon simulation failure; 4. Dynamically updating controlled vocabularies; 5. Notifying downstream applications such as the Prodiguer web portal. We will describe how the messaging platform has been implemented from a technical perspective and demonstrate the Prodiguer web portal receiving realtime notifications.
Parmanto, Bambang; Saptono, Andi; Murthi, Raymond; Safos, Charlotte; Lathan, Corinna E
2008-11-01
A secure telemonitoring system was developed to transform the CosmoBot system, a stand-alone speech-language therapy software package, into a telerehabilitation system. The CosmoBot system is a motivating, computer-based play character designed to enhance children's communication skills and stimulate verbal interaction during the remediation of speech and language disorders. The CosmoBot system consists of the Mission Control human interface device and Cosmo's Play and Learn software featuring a robot character named Cosmo that targets educational goals for children aged 3-5 years. The secure telemonitoring infrastructure links a distant speech-language therapist and the child/parents in home or school settings. The result is a telerehabilitation system that allows a speech-language therapist to monitor children's activities at home while providing feedback and therapy materials remotely. We have developed the means for telerehabilitation of communication skills that can be implemented in children's home settings. The architecture allows the therapist to remotely monitor the children after completion of the therapy session and to provide feedback for the following session.
Off the Shelf Cloud Robotics for the Smart Home: Empowering a Wireless Robot through Cloud Computing
Ramírez De La Pinta, Javier; Maestre Torreblanca, José María; Jurado, Isabel; Reyes De Cozar, Sergio
2017-01-01
In this paper, we explore the possibilities offered by the integration of home automation systems and service robots. In particular, we examine how advanced computationally expensive services can be provided by using a cloud computing approach to overcome the limitations of the hardware available at the user’s home. To this end, we integrate two wireless low-cost, off-the-shelf systems in this work, namely, the service robot Rovio and the home automation system Z-wave. Cloud computing is used to enhance the capabilities of these systems so that advanced sensing and interaction services based on image processing and voice recognition can be offered.
Burbank works on the EPIC in the Node 2
2012-02-28
ISS030-E-114433 (29 Feb. 2012) --- In the International Space Station's Destiny laboratory, NASA astronaut Dan Burbank, Expedition 30 commander, upgrades Multiplexer/Demultiplexer (MDM) computers and Portable Computer System (PCS) laptops and installs the Enhanced Processor & Integrated Communications (EPIC) hardware in the Payload 1 (PL-1) MDM.
Enhancing Army S&T Vol. 2: The Future
2012-03-01
Numerical Integrator and Computer (ENIAC), was commissioned by the Army's Ballistic Research Laboratory in 1943 and operated for several years at the Army's Aberdeen Proving Ground? The ENIAC is considered to be the genesis of modern digital computing. It is often the case that the Army's laboratories have
An Integrated Testbed for Cooperative Perception with Heterogeneous Mobile and Static Sensors
Jiménez-González, Adrián; Martínez-De Dios, José Ramiro; Ollero, Aníbal
2011-01-01
Cooperation among devices with different sensing, computing and communication capabilities provides interesting possibilities in a growing number of problems and applications including domotics (domestic robotics), environmental monitoring or intelligent cities, among others. Despite the increasing interest in academic and industrial communities, experimental tools for evaluation and comparison of cooperative algorithms for such heterogeneous technologies are still very scarce. This paper presents a remote testbed with mobile robots and Wireless Sensor Networks (WSN) equipped with a set of low-cost off-the-shelf sensors, commonly used in cooperative perception research and applications, that present a high degree of heterogeneity in their technology, sensed magnitudes, features, output bandwidth, interfaces and power consumption, among others. Its open and modular architecture allows tight integration and interoperability between mobile robots and WSN through a bidirectional protocol that enables full interaction. Moreover, the integration of standard tools and interfaces increases usability, allowing an easy extension to new hardware and software components and the reuse of code. Different levels of decentralization are considered, supporting from totally distributed to centralized approaches. Developed for the EU-funded Cooperating Objects Network of Excellence (CONET) and currently available at the School of Engineering of Seville (Spain), the testbed provides full remote control through the Internet. Numerous experiments have been performed, some of which are described in the paper.
HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation
NASA Astrophysics Data System (ADS)
Jin, Jin; Zhang, Jianguo; Chen, Xiaomeng; Sun, Jianyong; Yang, Yuanyuan; Liang, Chenwen; Feng, Jie; Sheng, Liwei; Huang, H. K.
2006-03-01
As a governmental regulation, the Health Insurance Portability and Accountability Act (HIPAA) was issued to protect the privacy of health information that identifies individuals who are living or deceased. HIPAA requires security services supporting implementation features: access control, audit controls, authorization control, data authentication, and entity authentication. These controls, proposed in the HIPAA Security Standards, are addressed here through audit trails. Audit trails can be used for surveillance purposes, to detect when interesting events might be happening that warrant further investigation, or they can be used forensically, after the detection of a security breach, to determine what went wrong and who or what was at fault. In order to provide security control services and to achieve high and continuous availability, we designed a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation. The system consists of two parts: monitoring agents running on each PACS component computer and a Monitor Server running on a remote computer. Monitoring agents are deployed on all computer nodes in the RIS-integrated PACS system to collect the audit trail messages defined by Supplement 95 of the DICOM standard: Audit Trail Messages. The Monitor Server then gathers all audit messages and processes them to provide security information at three levels: system resources, PACS/RIS applications, and user/patient data access. RIS-integrated PACS managers can now monitor and control the entire RIS-integrated PACS operation through a web service provided by the Monitor Server. This paper presents the design of a HIPAA-compliant automatic monitoring system for RIS-integrated PACS operation and gives preliminary results obtained by running this monitoring system on a clinical RIS-integrated PACS.
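To make the agent/server split above concrete, the following is a minimal sketch of a monitoring agent assembling an audit-trail record and posting it to a central Monitor Server. The record fields, endpoint URL and JSON-over-HTTP transport are assumptions for illustration and do not reproduce the actual DICOM Supplement 95 message format.

```python
# Minimal sketch of a monitoring agent posting an audit-trail record to a
# central Monitor Server. The record fields loosely follow the spirit of
# DICOM Supplement 95 audit messages; the exact schema, endpoint URL and
# HTTP transport are assumptions for illustration only.
import json
import socket
import urllib.request
from datetime import datetime, timezone

MONITOR_URL = "http://monitor.example.org/audit"  # hypothetical endpoint

def build_audit_record(event_id: str, user_id: str, patient_id: str) -> dict:
    """Assemble a simple audit-trail record for one data-access event."""
    return {
        "event_id": event_id,                      # e.g. "DICOM-Instances-Accessed"
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "node": socket.gethostname(),              # which PACS/RIS component reported it
        "user_id": user_id,
        "patient_id": patient_id,
    }

def send_audit_record(record: dict) -> int:
    """POST the record as JSON to the Monitor Server; return the HTTP status code."""
    req = urllib.request.Request(
        MONITOR_URL,
        data=json.dumps(record).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=5) as resp:
        return resp.status

if __name__ == "__main__":
    rec = build_audit_record("DICOM-Instances-Accessed", "radiologist01", "PAT-0042")
    print(json.dumps(rec, indent=2))  # a real agent would call send_audit_record(rec)
```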
NASA Astrophysics Data System (ADS)
Guardo, R.; De Siena, L.
2017-11-01
The timely estimation of short- and long-term volcanic hazard relies on the availability of detailed 3D geophysical images of volcanic structures. High-resolution seismic models of the absorbing uppermost conduit systems and highly-heterogeneous shallowest volcanic layers, while particularly challenging to obtain, provide important data to locate feasible eruptive centres and forecast flank collapses and lava ascending paths. Here, we model the volcanic structures of Mt. Etna (Sicily, Italy) and its outskirts using the Horizontal to Vertical Spectral Ratio method, generally applied in industrial and engineering settings. The integration of this technique with a Web-based Geographic Information System improves precision during the acquisition phase. It also integrates geological and geophysical visualization of 3D surface and subsurface structures in a queryable environment representing their exact three-dimensional geographic position, enhancing interpretation. The results show high-resolution 3D images of the shallowest volcanic and feeding systems, which complement (1) deeper seismic tomography imaging and (2) the results of recent remote sensing imaging. The study recovers a vertical structure that divides the pre-existing volcanic complexes of Ellittico and Cuvigghiuni. This could be interpreted as a transitional phase between the two systems. A comparison with recent remote sensing and geological results, however, shows that the anomalies are generally related to volcano-tectonic structures active during the last 17 years. We infer that seismic noise measurements from miniaturized instruments, when combined with remote sensing techniques, represent an important resource for monitoring volcanoes in unrest, reducing the risk of loss of human lives and instrumentation.
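As an illustration of the core measurement, the sketch below computes a Horizontal to Vertical Spectral Ratio from synthetic three-component ambient-noise records; the windowing and the quadratic-mean combination of the horizontal components are common conventions assumed here, not necessarily the processing choices of the study.

```python
# Illustrative H/V spectral ratio from synthetic three-component ambient noise.
# The Hann window and the quadratic-mean combination of the two horizontal
# components are common choices, assumed here for illustration.
import numpy as np

def hv_ratio(north, east, vertical, fs):
    """Return frequencies and the horizontal-to-vertical amplitude spectral ratio."""
    n = len(vertical)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    amp = lambda x: np.abs(np.fft.rfft(x * np.hanning(n)))
    horizontal = np.sqrt((amp(north) ** 2 + amp(east) ** 2) / 2.0)  # quadratic mean
    return freqs, horizontal / np.maximum(amp(vertical), 1e-12)

if __name__ == "__main__":
    fs = 100.0                                   # 100 Hz sampling
    t = np.arange(0, 60.0, 1.0 / fs)             # 60 s of synthetic data
    rng = np.random.default_rng(0)
    # Synthetic noise with an artificial 2 Hz resonance on the horizontals
    north = rng.normal(size=t.size) + 3.0 * np.sin(2 * np.pi * 2.0 * t)
    east = rng.normal(size=t.size) + 3.0 * np.cos(2 * np.pi * 2.0 * t)
    vert = rng.normal(size=t.size)
    f, hv = hv_ratio(north, east, vert, fs)
    print("peak H/V at %.2f Hz" % f[np.argmax(hv[1:]) + 1])   # skip the DC bin
```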
Virtual patient simulator for distributed collaborative medical education.
Caudell, Thomas P; Summers, Kenneth L; Holten, Jim; Hakamata, Takeshi; Mowafi, Moad; Jacobs, Joshua; Lozanoff, Beth K; Lozanoff, Scott; Wilks, David; Keep, Marcus F; Saiki, Stanley; Alverson, Dale
2003-01-01
Project TOUCH (Telehealth Outreach for Unified Community Health; http://hsc.unm.edu/touch) investigates the feasibility of using advanced technologies to enhance education in an innovative problem-based learning format currently being used in medical school curricula, applying specific clinical case models, and deploying to remote sites/workstations. The University of New Mexico's School of Medicine and the John A. Burns School of Medicine at the University of Hawai'i face similar health care challenges in providing and delivering services and training to remote and rural areas. Recognizing that health care needs are local and require local solutions, both states are committed to improving health care delivery to their unique populations by sharing information and experiences through emerging telehealth technologies by using high-performance computing and communications resources. The purpose of this study is to describe the deployment of a problem-based learning case distributed over the National Computational Science Alliance's Access Grid. Emphasis is placed on the underlying technical components of the TOUCH project, including the virtual reality development tool Flatland, the artificial intelligence-based simulation engine, the Access Grid, high-performance computing platforms, and the software that connects them all. In addition, educational and technical challenges for Project TOUCH are identified. Copyright 2003 Wiley-Liss, Inc.
Unmanned Aerial Mass Spectrometer Systems for In-Situ Volcanic Plume Analysis
NASA Astrophysics Data System (ADS)
Diaz, Jorge Andres; Pieri, David; Wright, Kenneth; Sorensen, Paul; Kline-Shoder, Robert; Arkin, C. Richard; Fladeland, Matthew; Bland, Geoff; Buongiorno, Maria Fabrizia; Ramirez, Carlos; Corrales, Ernesto; Alan, Alfredo; Alegria, Oscar; Diaz, David; Linick, Justin
2015-02-01
Technology advances in the field of small, unmanned aerial vehicles and their integration with a variety of sensor packages and instruments, such as miniature mass spectrometers, have enhanced the possibilities and applications of what are now called unmanned aerial systems (UAS). With such technology, in situ and proximal remote sensing measurements of volcanic plumes are now possible without risking the lives of scientists and personnel in charge of close monitoring of volcanic activity. These methods provide unprecedented, and otherwise unobtainable, data very close in space and time to eruptions, to better understand the role of gas volatiles in magma and subsequent eruption products. Small mass spectrometers, together with the world's smallest turbo molecular pump, have been integrated into NASA and University of Costa Rica UAS platforms to be field-tested for in situ volcanic plume analysis, and in support of the calibration and validation of satellite-based remote sensing data. These new UAS-MS systems are combined with existing UAS flight-tested payloads and assets, such as temperature, pressure, relative humidity, SO2, H2S, CO2, and GPS sensors, on-board data storage, and telemetry. Such payloads are capable of generating real-time 3D concentration maps of the Turrialba volcano active plume in Costa Rica, while remote sensing data are simultaneously collected from the ASTER and OMI space-borne instruments for comparison. The primary goal is to improve the understanding of the chemical and physical properties of emissions for mitigation of local volcanic hazards, for the validation of species detection and abundance retrievals based on remote sensing, and to validate transport models.
DOT National Transportation Integrated Search
2009-12-01
This volume introduces several applications of remote bridge inspection technologies studied in : this Integrated Remote Sensing and Visualization (IRSV) study using ground-based LiDAR : systems. In particular, the application of terrestrial LiDAR fo...
Understanding Greenland ice sheet hydrology using an integrated multi-scale approach
NASA Astrophysics Data System (ADS)
Rennermalm, A. K.; Moustafa, S. E.; Mioduszewski, J.; Chu, V. W.; Forster, R. R.; Hagedorn, B.; Harper, J. T.; Mote, T. L.; Robinson, D. A.; Shuman, C. A.; Smith, L. C.; Tedesco, M.
2013-03-01
Improved understanding of Greenland ice sheet hydrology is critically important for assessing its impact on current and future ice sheet dynamics and global sea level rise. This has motivated the collection and integration of in situ observations, model development, and remote sensing efforts to quantify meltwater production, as well as its phase changes, transport, and export. Particularly urgent is a better understanding of albedo feedbacks leading to enhanced surface melt, potential positive feedbacks between ice sheet hydrology and dynamics, and meltwater retention in firn. These processes are not isolated, but must be understood as part of a continuum of processes within an integrated system. This letter describes a systems approach to the study of Greenland ice sheet hydrology, emphasizing component interconnections and feedbacks, and highlighting research and observational needs.
Computer aided design environment for the analysis and design of multi-body flexible structures
NASA Technical Reports Server (NTRS)
Ramakrishnan, Jayant V.; Singh, Ramen P.
1989-01-01
A computer aided design environment consisting of the programs NASTRAN, TREETOPS and MATLAB is presented in this paper. With links for data transfer between these programs, the integrated design of multi-body flexible structures is significantly enhanced. The CAD environment is used to model the Space Shuttle/Pinhole Occulter Facility. A controller is then designed and evaluated using nonlinear time-history analysis. Recent enhancements and ongoing research to add more capabilities are also described.
System design and implementation of digital-image processing using computational grids
NASA Astrophysics Data System (ADS)
Shen, Zhanfeng; Luo, Jiancheng; Zhou, Chenghu; Huang, Guangyu; Ma, Weifeng; Ming, Dongping
2005-06-01
As a special type of digital image, remotely sensed images are playing increasingly important roles in our daily lives. Because of the enormous amounts of data involved, and the difficulties of data processing and transfer, an important issue for current computer and geo-science experts is developing Internet technology to implement rapid remotely sensed image processing. Computational grids are able to solve this problem effectively. These networks of computer workstations enable the sharing of data and resources, and are used by computer experts to solve imbalances of network resources and lopsided usage. In China, computational grids combined with spatial-information-processing technology have formed a new technology: namely, spatial-information grids. In the field of remotely sensed images, spatial-information grids work more effectively for network computing, data processing, resource sharing, task cooperation and so on. This paper focuses mainly on the application of computational grids to digital-image processing. Firstly, we describe the architecture of digital-image processing on the basis of computational grids; its implementation is then discussed in detail with respect to the technology of middleware. The whole network-based intelligent image-processing system is evaluated on the basis of experimental analysis of remotely sensed image-processing tasks; the results confirm the feasibility of applying computational grids to digital-image processing.
Educational Computer Utilization and Computer Communications.
ERIC Educational Resources Information Center
Singh, Jai P.; Morgan, Robert P.
As part of an analysis of educational needs and telecommunications requirements for future educational satellite systems, three studies were carried out. 1) The role of the computer in education was examined and both current status and future requirements were analyzed. Trade-offs between remote time sharing and remote batch process were explored…
NASA Technical Reports Server (NTRS)
Moran, M. S.; Goodrich, D. C.; Kustas, W. P.
1994-01-01
A research and modeling strategy is presented for the development of distributed hydrologic models based on a combination of remotely sensed and ground-based data. In support of this strategy, two experiments, Monsoon '90 and Walnut Gulch '92, were conducted in a semiarid rangeland southeast of Tucson, Arizona (U.S.), and a third experiment, SALSA-MEX (Semi-Arid Land Surface Atmospheric Mountain Experiment), was proposed. Results from the Monsoon '90 experiment substantially advanced the understanding of the hydrologic and atmospheric fluxes in an arid environment and provided insight into the use of remote sensing data for hydrologic modeling. The Walnut Gulch '92 experiment addressed the seasonal hydrologic dynamics of the region and the potential of combined optical-microwave remote sensing for hydrologic applications. SALSA-MEX will combine measurements and modeling to study hydrologic processes influenced by surrounding mountains, such as enhanced precipitation, snowmelt and recharge to groundwater aquifers. The results from these experiments, along with the extensive experimental databases, should aid the research community in large-scale modeling of mass and energy exchanges across the soil-plant-atmosphere interface.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kostuk, M.; Uram, T. D.; Evans, T.
For the first time, an automatically triggered, between-pulse fusion science analysis code was run on-demand at a remotely located supercomputer at the Argonne Leadership Computing Facility (ALCF, Lemont, IL) in support of in-process experiments being performed at DIII-D (San Diego, CA). This represents a new paradigm for combining geographically distant experimental and high performance computing (HPC) facilities to provide enhanced data analysis that is quickly available to researchers. Enhanced analysis improves the understanding of the current pulse, translating into more efficient use of experimental resources and higher quality of the resultant science. The analysis code used here, called SURFMN, calculates the magnetic structure of the plasma using a Fourier transform. Increasing the number of Fourier components provides a more accurate determination of the stochastic boundary layer near the plasma edge by better resolving magnetic islands, but requires 26 minutes to complete using local DIII-D resources, putting it well outside the useful time range for between-pulse analysis. These islands relate to confinement and edge localized mode (ELM) suppression, and may be controlled by adjusting coil currents for the next pulse. Argonne has ensured on-demand execution of SURFMN by providing a reserved queue, a specialized service that launches the code after receiving an automatic trigger, and network access from the worker nodes for data transfer. Runs are executed on 252 cores of ALCF's Cooley cluster and the data is available locally at DIII-D within three minutes of triggering. The original SURFMN design limits additional improvements with more cores; however, our work shows a path forward where codes that benefit from thousands of processors can run between pulses.
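The Fourier decomposition at the heart of such an analysis can be sketched generically; the example below, which is not the SURFMN code, extracts poloidal/toroidal harmonics of a perturbation field sampled on a flux surface, with an invented test field and grid.

```python
# Generic sketch (not the SURFMN code) of decomposing a radial magnetic
# perturbation sampled on a flux surface into poloidal (m) and toroidal (n)
# Fourier harmonics with a 2-D FFT. The grid sizes and the test field are made up.
import numpy as np

def fourier_harmonics(b_r):
    """b_r: 2-D array sampled on (theta, phi); returns normalized complex amplitudes."""
    n_theta, n_phi = b_r.shape
    return np.fft.fft2(b_r) / (n_theta * n_phi)

if __name__ == "__main__":
    n_theta, n_phi = 128, 64
    theta = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    phi = np.linspace(0, 2 * np.pi, n_phi, endpoint=False)
    TH, PH = np.meshgrid(theta, phi, indexing="ij")
    # Test field dominated by an m=3, n=1 island-producing harmonic
    b_r = 1e-3 * np.cos(3 * TH - 1 * PH)
    b_mn = fourier_harmonics(b_r)
    # Amplitude of the (m=3, n=1) harmonic sits in the [3, -1] FFT bin here
    print("|b_{3,1}| ~", 2 * np.abs(b_mn[3, -1]))   # ~1e-3 for the test field
```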
Reviews and Syntheses: optical sampling of the flux tower footprint
NASA Astrophysics Data System (ADS)
Gamon, J. A.
2015-07-01
The purpose of this review is to address the reasons and methods for conducting optical remote sensing within the flux tower footprint. Fundamental principles and conclusions gleaned from over 2 decades of proximal remote sensing at flux tower sites are reviewed. The organizing framework used here is the light-use efficiency (LUE) model, both because it is widely used, and because it provides a useful theoretical construct for integrating optical remote sensing with flux measurements. Multiple ways of driving this model, ranging from meteorological measurements to remote sensing, have emerged in recent years, making it a convenient conceptual framework for comparative experimental studies. New interpretations of established optical sampling methods, including the photochemical reflectance index (PRI) and solar-induced chlorophyll fluorescence (SIF), are discussed within the context of the LUE model. Multi-scale analysis across temporal and spatial axes is a central theme because such scaling can provide links between ecophysiological mechanisms detectable at the level of individual organisms and broad patterns emerging at larger scales, enabling evaluation of emergent properties and extrapolation to the flux footprint and beyond. Proper analysis of the sampling scale requires an awareness of sampling context that is often essential to the proper interpretation of optical signals. Additionally, the concept of optical types, vegetation exhibiting contrasting optical behavior in time and space, is explored as a way to frame our understanding of the controls on surface-atmosphere fluxes. Complementary normalized difference vegetation index (NDVI) and PRI patterns across ecosystems are offered as an example of this hypothesis, with the LUE model and light-response curve providing an integrating framework. I conclude that experimental approaches allowing systematic exploration of plant optical behavior in the context of the flux tower network provides a unique way to improve our understanding of environmental constraints and ecophysiological function. In addition to an enhanced mechanistic understanding of ecosystem processes, this integration of remote sensing with flux measurements offers many rich opportunities for upscaling, satellite validation, and informing practical management objectives ranging from assessing ecosystem health and productivity to quantifying biospheric carbon sequestration.
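The light-use efficiency model around which the review is organized reduces to a single product, GPP = epsilon x fAPAR x PAR. The sketch below illustrates it with an optional PRI-based down-regulation of epsilon; all coefficients are illustrative assumptions rather than calibrated values from the review.

```python
# Minimal light-use-efficiency (LUE) sketch: GPP = epsilon * fAPAR * PAR.
# The PRI-based down-regulation of epsilon and all coefficients are
# illustrative assumptions, not calibrated values.
def gpp_lue(par, fapar, eps_max=1.8, pri=None, k_pri=5.0):
    """
    par     : incident photosynthetically active radiation (mol photons m-2 d-1)
    fapar   : fraction of PAR absorbed by the canopy (0-1), e.g. derived from NDVI
    eps_max : maximum light-use efficiency (g C per mol photons), assumed
    pri     : optional photochemical reflectance index used to scale epsilon down
    returns : gross primary productivity (g C m-2 d-1)
    """
    epsilon = eps_max
    if pri is not None:
        # Hypothetical linear down-regulation: more negative PRI -> lower efficiency
        epsilon = eps_max * max(0.0, 1.0 + k_pri * pri)
    return epsilon * fapar * par

if __name__ == "__main__":
    print(gpp_lue(par=40.0, fapar=0.7))              # unstressed canopy
    print(gpp_lue(par=40.0, fapar=0.7, pri=-0.05))   # PRI signals down-regulation
```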
NASA Technical Reports Server (NTRS)
Wilcox, Brian H.
1994-01-01
System for remote control of robotic land vehicle requires only small radio-communication bandwidth. Twin video cameras on vehicle create stereoscopic images. Operator views cross-polarized images on two cathode-ray tubes through correspondingly polarized spectacles. By use of cursor on frozen image, remote operator designates path. Vehicle proceeds to follow path, by use of limited degree of autonomous control to cope with unexpected conditions. System concept, called "computer-aided remote driving" (CARD), potentially useful in exploration of other planets, military surveillance, firefighting, and clean-up of hazardous materials.
Integrated Remote Sensing Modalities for Classification at a Legacy Test Site
NASA Astrophysics Data System (ADS)
Lee, D. J.; Anderson, D.; Craven, J.
2016-12-01
Detecting, locating, and characterizing suspected underground nuclear test sites is of interest to the worldwide nonproliferation monitoring community. Remote sensing provides both cultural and surface geological information over a large search area in a non-intrusive manner. We have characterized a legacy nuclear test site at the Nevada National Security Site (NNSS) using an aerial system based on RGB imagery, light detection and ranging, and hyperspectral imaging. We integrate these different remote sensing modalities to perform pattern recognition and classification tasks on the test site. These tasks include detecting cultural artifacts and exotic materials. We evaluate if the integration of different remote sensing modalities improves classification performance.
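One plausible form of the modality-integration step is to stack co-registered per-pixel features from the RGB, lidar and hyperspectral products and train a standard classifier; the sketch below uses synthetic features and labels, since the study's actual feature set and classifier are not stated in the abstract.

```python
# Sketch of a modality-integration step: per-pixel features from RGB imagery,
# lidar-derived height and a few hyperspectral bands are stacked into a single
# feature vector and fed to a standard classifier. Features and labels are
# synthetic placeholders for illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_pixels = 2000
rgb = rng.random((n_pixels, 3))            # normalized RGB reflectance
height = rng.random((n_pixels, 1))         # lidar height above ground
hyper = rng.random((n_pixels, 5))          # a handful of hyperspectral bands
X = np.hstack([rgb, height, hyper])        # fused per-pixel feature vector

# Synthetic two-class labels (target class vs background) loosely tied to height
y = (height[:, 0] + 0.2 * rng.standard_normal(n_pixels) > 0.6).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```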
NASA Technical Reports Server (NTRS)
Czaja, Wojciech; Le Moigne-Stewart, Jacqueline
2014-01-01
In recent years, sophisticated mathematical techniques have been successfully applied to the field of remote sensing to produce significant advances in applications such as registration, integration and fusion of remotely sensed data. Registration, integration and fusion of multiple source imagery are the most important issues when dealing with Earth Science remote sensing data where information from multiple sensors, exhibiting various resolutions, must be integrated. Issues ranging from different sensor geometries, different spectral responses, differing illumination conditions, different seasons, and various amounts of noise need to be dealt with when designing an image registration, integration or fusion method. This tutorial will first define the problems and challenges associated with these applications and then will review some mathematical techniques that have been successfully utilized to solve them. In particular, we will cover topics on geometric multiscale representations, redundant representations and fusion frames, graph operators, diffusion wavelets, as well as spatial-spectral and operator-based data fusion. All the algorithms will be illustrated using remotely sensed data, with an emphasis on current and operational instruments.
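As one elementary instance of the registration problem discussed in the tutorial, the sketch below recovers an integer translation between two co-sensed images with phase correlation; it does not reproduce the geometric multiscale or fusion-frame methods named above.

```python
# Phase-correlation sketch for recovering an integer (row, col) shift between
# two co-sensed images -- an elementary registration example, not a
# reproduction of the tutorial's multiscale or fusion-frame methods.
import numpy as np

def phase_correlation_shift(ref, moving):
    """Return the (dy, dx) shift that aligns `moving` back onto `ref`."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moving)
    cross_power = F1 * np.conj(F2)
    cross_power /= np.maximum(np.abs(cross_power), 1e-12)
    corr = np.fft.ifft2(cross_power).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Wrap shifts larger than half the image size into negative offsets
    if dy > ref.shape[0] // 2:
        dy -= ref.shape[0]
    if dx > ref.shape[1] // 2:
        dx -= ref.shape[1]
    return dy, dx

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    ref = rng.random((128, 128))
    moving = np.roll(np.roll(ref, 5, axis=0), -7, axis=1)  # ref shifted by (5, -7)
    print(phase_correlation_shift(ref, moving))            # expect (-5, 7), undoing the shift
```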
NASA Astrophysics Data System (ADS)
Zhang, X.; Wu, B.; Zhang, M.; Zeng, H.
2017-12-01
Rice is one of the main staple foods in East Asia and Southeast Asia, feeding more than half of the world's population on 11% of the cultivated land. Studies of rice can provide direct or indirect information on food security and water resource management. Remote sensing has proven to be the most effective method for monitoring cropland at large scales by using temporal and spectral information. Two main kinds of satellites have been used for mapping rice: microwave and optical. For rice, the main crop of paddy fields, the key feature distinguishing it from other crops is the flooding phenomenon at the planting stage (Figure 1). Microwave satellites can penetrate clouds and are efficient at monitoring the flooding phenomenon, while vegetation indices based on optical satellites can distinguish rice from other vegetation. Google Earth Engine is a cloud-based platform that makes it easy to access high-performance computing resources for processing very large geospatial datasets. Google has collected a large number of remote sensing satellite datasets around the world, enabling researchers to build applications that use multi-source remote sensing data over large areas. In this work, we map the rice planting area in south China through the integration of Landsat-8 OLI, Sentinel-2, and Sentinel-1 Synthetic Aperture Radar (SAR) images. The workflow is shown in Figure 2. First, a threshold method applied to the VH-polarized backscatter from the SAR sensor and to vegetation indices from the optical sensors, including the normalized difference vegetation index (NDVI) and enhanced vegetation index (EVI), was used to classify the rice extent. The forest and water surface extent maps provided by Earth Engine were used to mask forest and water. To overcome the "salt and pepper" effect of pixel-based classification at increased spatial resolution, we segment the optical imagery and merge the pixel-based classification results with the object-oriented segmentation to obtain the final rice extent map. Finally, time series analysis was used to obtain the peak count for each rice area and thus determine the cropping intensity. Rice ground points from the GVG crowdsourcing smartphone application and rice area statistics from the National Bureau of Statistics were used to validate and evaluate our result.
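The decision rule described above (flooding seen in the SAR VH backscatter at planting time combined with vegetation-index thresholds from the optical sensors) can be sketched as simple raster masks. The thresholds below are placeholders rather than the study's calibrated values, and plain NumPy arrays stand in for the Earth Engine image collections.

```python
# Pixel-wise sketch of the rice-mapping rule: a pixel is a rice candidate if SAR
# VH backscatter indicates flooding at planting time and the optical vegetation
# indices later indicate vigorous crop growth. All thresholds are placeholders,
# not the calibrated values from the study.
import numpy as np

def rice_candidate_mask(vh_planting_db, ndvi_peak, evi_peak, water_mask, forest_mask,
                        vh_thresh=-20.0, ndvi_thresh=0.6, evi_thresh=0.4):
    """Boolean mask of likely rice pixels from co-registered rasters."""
    flooded = vh_planting_db < vh_thresh            # low VH backscatter -> standing water
    vegetated = (ndvi_peak > ndvi_thresh) & (evi_peak > evi_thresh)
    return flooded & vegetated & ~water_mask & ~forest_mask

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    shape = (100, 100)
    vh = rng.uniform(-25, -10, shape)       # dB
    ndvi = rng.uniform(0.0, 0.9, shape)
    evi = rng.uniform(0.0, 0.7, shape)
    water = np.zeros(shape, dtype=bool)     # permanent water mask (empty here)
    forest = np.zeros(shape, dtype=bool)    # forest mask (empty here)
    mask = rice_candidate_mask(vh, ndvi, evi, water, forest)
    print("rice candidate fraction:", mask.mean())
```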
NASA Technical Reports Server (NTRS)
Aucoin, P. J.; Stewart, J.; Mckay, M. F. (Principal Investigator)
1980-01-01
This document presents instructions for analysts who use EOD-LARSYS as programmed on the Purdue University IBM 370/148 (recently replaced by the IBM 3031) computer. It presents sample applications, control cards, and error messages for all processors in the system and gives detailed descriptions of the mathematical procedures and information needed to execute the system and obtain the desired output. EOD-LARSYS is the JSC version of an integrated batch system for analysis of multispectral scanner imagery data. The information included is designed for use with the as-built documentation (Volume 3) and the program listings (Volume 4). The system is operational from remote terminals at Johnson Space Center under the virtual machine/conversational monitor system environment.
Device for inspecting vessel surfaces
DOE Office of Scientific and Technical Information (OSTI.GOV)
Appel, D.K.
1995-12-12
A portable, remotely-controlled inspection crawler is described for use along the walls of tanks, vessels, piping and the like. The crawler can be configured to use a vacuum chamber for supporting itself on the inspected surface by suction or a plurality of magnetic wheels for moving the crawler along the inspected surface. The crawler is adapted to be equipped with an ultrasonic probe for mapping the structural integrity or other characteristics of the surface being inspected. Navigation of the crawler is achieved by triangulation techniques between a signal transmitter on the crawler and a pair of microphones attached to a fixed, remote location, such as the crawler's deployment unit. The necessary communications are established between the crawler and computers external to the inspection environment for position control and storage and/or monitoring of data acquisition. 5 figs.
Data distribution service-based interoperability framework for smart grid testbed infrastructure
Youssef, Tarek A.; Elsayed, Ahmed T.; Mohammed, Osama A.
2016-03-02
This study presents the design and implementation of a communication and control infrastructure for smart grid operation. The proposed infrastructure enhances the reliability of the measurement and control network. The advantages of utilizing the data-centric over the message-centric communication approach are discussed in the context of smart grid applications. The data distribution service (DDS) is used to implement a data-centric common data bus for the smart grid. This common data bus improves communication reliability, enabling distributed control and smart load management. These enhancements are achieved by avoiding a single point of failure while enabling peer-to-peer communication and an automatic discovery feature for dynamically participating nodes. The infrastructure and ideas presented in this paper were implemented and tested on the smart grid testbed. A toolbox and application programming interface for the testbed infrastructure are developed in order to facilitate interoperability and remote access to the testbed. This interface allows control, monitoring, and execution of experiments remotely. Furthermore, it could be used to integrate multidisciplinary testbeds to study complex cyber-physical systems (CPS).
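The data-centric pattern described above, in which publishers and subscribers share named topics instead of addressing each other directly, can be illustrated with a toy bus. This is plain Python for illustration only; it is neither the DDS API nor the authors' toolbox, and the topic name and sample fields are invented.

```python
# Conceptual stand-in for a data-centric publish/subscribe bus: publishers and
# subscribers share named topics rather than addressing each other directly.
# Plain Python for illustration only -- not the DDS API; topic and fields are invented.
from collections import defaultdict
from typing import Callable, Dict, List

class DataBus:
    """Toy topic-based bus: the last sample is cached so late joiners can discover it."""
    def __init__(self):
        self._subscribers: Dict[str, List[Callable[[dict], None]]] = defaultdict(list)
        self._last_sample: Dict[str, dict] = {}

    def subscribe(self, topic: str, callback: Callable[[dict], None]) -> None:
        self._subscribers[topic].append(callback)
        if topic in self._last_sample:          # mimic "discovery" of existing data
            callback(self._last_sample[topic])

    def publish(self, topic: str, sample: dict) -> None:
        self._last_sample[topic] = sample
        for callback in self._subscribers[topic]:
            callback(sample)

if __name__ == "__main__":
    bus = DataBus()
    bus.publish("grid/bus1/voltage", {"v_rms": 120.1, "t": 0.0})
    # A controller that joins later still sees the cached measurement.
    bus.subscribe("grid/bus1/voltage", lambda s: print("controller received", s))
    bus.publish("grid/bus1/voltage", {"v_rms": 119.7, "t": 1.0})
```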
Patterns of computer usage among medical practitioners in rural and remote Queensland.
White, Col; Sheedy, Vicki; Lawrence, Nicola
2002-06-01
As part of a more detailed needs analysis, patterns of computer usage among medical practitioners in rural and remote Queensland were investigated. Utilising a questionnaire approach, a response rate of 23.82% (n = 131) was obtained. Results suggest that medical practitioners in rural and remote Queensland are relatively sophisticated in their use of computer and information technologies and have embraced computerisation to a substantially higher extent compared with their urban counterparts and previously published estimates. Findings also indicate that a substantial number of rural and remote practitioners are utilising computer and information technologies for clinical purposes such as pathology, patient information sheets, prescribing, education, patient records and patient recalls. Despite barriers such as bandwidth limitations, cost and the sometimes unreliable quality of Internet service providers, a majority of rural and remote respondents rated an Internet site with continuing medical education information and services as being important or very important. Suggestions that "rural doctors are slow to adapt to new technologies" are questioned, with findings indicating that rural and remote medical practitioners in Queensland have adapted to, and utilise, information technology to a far higher extent than has been previously documented.
Remote monitoring of patients with implanted devices: data exchange and integration.
Van der Velde, Enno T; Atsma, Douwe E; Foeken, Hylke; Witteman, Tom A; Hoekstra, Wybo H G J
2013-06-01
Remote follow-up of implantable cardioverter defibrillators (ICDs) may offer a solution to the problem of overcrowded outpatient clinics, and may also be effective in detecting clinical events early. Data obtained from remote follow-up systems, as developed by all major device companies, are stored in a central database system, operated and owned by the device company. A problem now arises in that the patient's clinical information is stored partly in the local electronic health record (EHR) system in the hospital and partly in the remote monitoring database, which may potentially result in patient safety issues. To address the requirement of integrating remote monitoring data into the local EHR, the Integrating the Healthcare Enterprise (IHE) Implantable Device Cardiac Observation (IDCO) profile has been developed. The IHE IDCO profile has been adopted by all major device companies. In our hospital, we have implemented the IHE IDCO profile to import data from the remote databases of two device vendors into the departmental Cardiology Information System (EPD-Vision). Data are exchanged via an HL7/XML communication protocol, as defined in the IHE IDCO profile. By implementing the IHE IDCO profile, we have been able to integrate the data from the remote monitoring databases into our local EHR. It can be expected that remote monitoring systems will develop into dedicated monitoring and therapy platforms. Data retrieved from these systems should form an integral part of the electronic patient record as more and more out-patient clinic care shifts to personalized care provided at a distance, in other words at the patient's home.
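A minimal sketch of the import step is shown below: device observations are pulled out of an XML payload and made available for the local EHR. The element and attribute names are hypothetical placeholders and do not reproduce the actual IHE IDCO schema or HL7 message structure.

```python
# Minimal sketch of extracting device observations from an XML payload such as
# might be exchanged under an IDCO-style profile. The element and attribute
# names are hypothetical placeholders, not the actual IHE IDCO schema.
import xml.etree.ElementTree as ET

SAMPLE_XML = """
<deviceReport>
  <patient id="PAT-0042"/>
  <observation code="battery_voltage" value="2.79" unit="V"/>
  <observation code="lead_impedance_rv" value="510" unit="Ohm"/>
</deviceReport>
"""

def parse_observations(xml_text: str) -> dict:
    """Return {observation_code: (value, unit)} from the device report."""
    root = ET.fromstring(xml_text)
    return {
        obs.attrib["code"]: (float(obs.attrib["value"]), obs.attrib["unit"])
        for obs in root.findall("observation")
    }

if __name__ == "__main__":
    for code, (value, unit) in parse_observations(SAMPLE_XML).items():
        print(f"{code}: {value} {unit}")   # values would then be written to the local EHR
```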
2011-12-29
ISS030-E-017776 (29 Dec. 2011) --- Working in chorus with the International Space Station team in Houston's Mission Control Center, this astronaut and his Expedition 30 crewmates on the station install a set of Enhanced Processor and Integrated Communications (EPIC) computer cards in one of seven primary computers onboard. The upgrade will allow more experiments to operate simultaneously, and prepare for the arrival of commercial cargo ships later this year.
Telepresence in neurosurgery: the integrated remote neurosurgical system.
Kassell, N F; Downs, J H; Graves, B S
1997-01-01
This paper describes the Integrated Remote Neurosurgical System (IRNS), a remotely-operated neurosurgical microscope with high-speed communications and a surgeon-accessible user interface. The IRNS will allow high quality bidirectional mentoring in the neurosurgical suite. The research goals of this effort are twofold: to develop a clinical system allowing a remote neurosurgeon to lend expertise to the OR-based neurosurgical team and to provide an integrated training environment. The IRNS incorporates a generic microscope/transport model, called SuMIT (Surgical Manipulator Interface Translator). Our system is currently under test using the Zeiss MKM surgical transport. A SuMIT interface is also being constructed for the Robotics Research 1607. The IRNS Remote Planning and Navigation Workstation incorporates surgical planning capabilities and real-time, 30 fps video from the microscope and an overhead video camera. The remote workstation includes a force-reflecting handcontroller which gives the remote surgeon an intuitive way to position the microscope head. Bidirectional audio, video whiteboarding, and image archiving are also supported by the remote workstation. A simulation mode permits pre-surgical simulation, post-surgical critique, and training for surgeons without access to an actual microscope transport system. The components of the IRNS are integrated using ATM switching to provide low-latency data transfer. The research, along with the more sophisticated systems that will follow, will serve as a foundation and test-bed for extending the surgeon's skills without regard to time zone or geographic boundaries.
Enhanced control & sensing for the REMOTEC ANDROS Mk VI robot. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Spelt, P.F.; Harvey, H.W.
1997-08-01
This Cooperative Research and Development Agreement (CRADA) between Lockheed Martin Energy Systems, Inc., and REMOTEC, Inc., explored methods of providing operator feedback for various work actions of the ANDROS Mk VI teleoperated robot. In a hazardous environment, an extremely heavy workload seriously degrades the productivity of teleoperated robot operators. This CRADA involved the addition of computer power to the robot along with a variety of sensors and encoders to provide information about the robot's performance in and relationship to its environment. Software was developed to integrate the sensor and encoder information and provide control input to the robot. ANDROS Mk VI robots are presently used by numerous electric utilities to perform tasks in reactors where substantial exposure to radiation exists, as well as in a variety of other hazardous environments. Further, this platform has potential for use in a number of environmental restoration tasks, such as site survey and detection of hazardous waste materials. The addition of sensors and encoders serves to make the robot easier to manage and permits tasks to be done more safely and inexpensively (due to time saved in the completion of complex remote tasks). Prior research on the automation of mobile platforms with manipulators at Oak Ridge National Laboratory's Center for Engineering Systems Advanced Research (CESAR, B&R code KC0401030) Laboratory, a BES-supported facility, indicated that this type of enhancement is effective. This CRADA provided such enhancements to a successful working teleoperated robot for the first time. Performance of this CRADA used the CESAR laboratory facilities and expertise developed under BES funding.
A New Computational Framework for Atmospheric and Surface Remote Sensing
NASA Technical Reports Server (NTRS)
Timucin, Dogan A.
2004-01-01
A Bayesian data-analysis framework is described for atmospheric and surface retrievals from remotely-sensed hyper-spectral data. Some computational techniques are highlighted for improved accuracy in the forward physics model.
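The Bayesian retrieval idea, a posterior proportional to a likelihood evaluated through a forward model times a prior, can be shown on a toy one-parameter problem; the linear forward model, prior and noise level below are invented and are not the framework's forward physics model.

```python
# Toy one-parameter Bayesian retrieval: posterior(x) ~ likelihood(y | F(x)) * prior(x),
# evaluated on a grid. The linear forward model, prior and noise level are invented
# for illustration and stand in for a real radiative-transfer forward model.
import numpy as np

def grid_posterior(y_obs, forward, x_grid, prior, sigma):
    """Gaussian likelihood times prior, normalized over the grid."""
    likelihood = np.exp(-0.5 * ((y_obs - forward(x_grid)) / sigma) ** 2)
    post = likelihood * prior
    dx = x_grid[1] - x_grid[0]
    return post / (post.sum() * dx)

if __name__ == "__main__":
    forward = lambda x: 2.0 * x + 1.0                     # stand-in forward model
    x_grid = np.linspace(0.0, 5.0, 1001)
    prior = np.exp(-0.5 * ((x_grid - 2.0) / 1.0) ** 2)    # Gaussian prior around x = 2
    y_obs = forward(2.6) + 0.1                            # one noisy "measurement"
    post = grid_posterior(y_obs, forward, x_grid, prior, sigma=0.2)
    dx = x_grid[1] - x_grid[0]
    x_map = x_grid[np.argmax(post)]
    x_mean = float(np.sum(x_grid * post) * dx)
    print(f"MAP estimate: {x_map:.2f}, posterior mean: {x_mean:.2f}")
```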
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to technologies for data storage, computing and analysis. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark - a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is constructed on the latest virtualized computing infrastructures and distributed computing architecture. OpenStack and Docker are used to build a multi-user hosting cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph, and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides a spatiotemporal computational model and advanced geospatial visualization tools that apply to other domains with spatial properties. We tested the performance of the platform with a taxi trajectory analysis. Results suggest that GISpark achieves excellent run-time performance in spatiotemporal big data applications.
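The kind of trajectory aggregation mentioned in the performance test can be sketched with vanilla PySpark, snapping GPS points to a grid and counting points per cell; the input file name, its column names and the cell size are assumptions, and the sketch does not use GISpark's own facilities.

```python
# Sketch of a GISpark-style spatial aggregation with vanilla PySpark: snap taxi
# GPS points to a 0.01-degree grid and count points per cell. The input file
# name and its (lon, lat) column names are assumptions for illustration.
from pyspark.sql import SparkSession, functions as F

CELL = 0.01  # grid cell size in degrees (assumed)

def grid_counts(points_df):
    """Aggregate point records into per-cell counts."""
    return (points_df
            .withColumn("cell_x", F.floor(F.col("lon") / CELL))
            .withColumn("cell_y", F.floor(F.col("lat") / CELL))
            .groupBy("cell_x", "cell_y")
            .count())

if __name__ == "__main__":
    spark = SparkSession.builder.appName("grid-aggregation-sketch").getOrCreate()
    # Hypothetical trajectory file with header columns including lon and lat
    points = spark.read.csv("taxi_points.csv", header=True, inferSchema=True)
    grid_counts(points).orderBy(F.desc("count")).show(10)   # densest cells first
    spark.stop()
```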
A coactive interdisciplinary research program with NASA
NASA Technical Reports Server (NTRS)
Rouse, J. W., Jr.
1972-01-01
The applications area of the Texas A&M University remote sensing program consists of a series of coactive projects with NASA/MSC personnel. In each case, the Remote Sensing Center has served to complement and enhance the research capability within the Manned Spacecraft Center. In addition to the applications study area, the Texas A&M University program includes coordinated projects in sensors and data analysis. Under the sensors area, an extensive experimental study of microwave radiometry for soil moisture determination established the effect of soil moisture on the measured brightness temperature for several different soil types. The data analysis area included a project in which ERTS-A and Skylab data were simulated using aircraft multispectral scanner measurements at two altitudes. This effort resulted in development of a library of computer programs which provides an operational capability in classification analysis of multispectral data.
NASA Astrophysics Data System (ADS)
Chiu, L.; Hao, X.; Kinter, J. L.; Stearn, G.; Aliani, M.
2017-12-01
The launch of the GOES-16 series provides an opportunity to advance near real-time applications in natural hazard detection, monitoring and warning. This study demonstrates the capability and value of receiving real-time satellite-based Earth observations over fast terrestrial networks and processing high-resolution remote sensing data in a university environment. The demonstration system includes 4 components: 1) near real-time data receiving and processing; 2) data analysis and visualization; 3) event detection and monitoring; and 4) information dissemination. Various tools are developed and integrated to receive and process GRB data in near real time, produce images and value-added data products, and detect and monitor extreme weather events such as hurricanes, fires, flooding, fog, and lightning. A web-based application system is developed to disseminate near real-time satellite images and data products. The images are generated in a GIS-compatible format (GeoTIFF) to enable convenient use and integration in various GIS platforms. This study enhances the capacity for undergraduate and graduate education in Earth system and climate sciences and related applications, helping students understand the basic principles and technology of real-time applications of remote sensing measurements. It also provides an integrated platform for near real-time monitoring of extreme weather events, which is helpful for various user communities.
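To illustrate the GeoTIFF dissemination step in a hedged way, the sketch below writes a gridded array to a GIS-compatible GeoTIFF with rasterio; the array, grid origin, pixel size, and file name are placeholders, not the project's actual product parameters.

```python
# Hedged sketch: writing a gridded product to a GeoTIFF for use in GIS tools.
# The array and grid parameters are made-up stand-ins for a GOES-derived field.
import numpy as np
import rasterio
from rasterio.transform import from_origin

data = np.random.rand(500, 800).astype("float32")   # placeholder product grid
transform = from_origin(-110.0, 45.0, 0.02, 0.02)    # west, north, x-size, y-size (deg)

with rasterio.open(
        "product.tif", "w", driver="GTiff",
        height=data.shape[0], width=data.shape[1],
        count=1, dtype="float32",
        crs="EPSG:4326", transform=transform) as dst:
    dst.write(data, 1)
```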
Review of integrated digital systems: evolution and adoption
NASA Astrophysics Data System (ADS)
Fritz, Lawrence W.
The factors that are influencing the evolution of photogrammetric and remote sensing technology to transition into fully integrated digital systems are reviewed. These factors include societal pressures for new, more timely digital products from the Spatial Information Sciences and the adoption of rapid technological advancements in digital processing hardware and software. Current major developments in leading government mapping agencies of the USA, such as the Digital Production System (DPS) modernization programme at the Defense Mapping Agency, and the Automated Nautical Charting System II (ANCS-II) programme and Integrated Digital Photogrammetric Facility (IDPF) at NOAA/National Ocean Service, illustrate the significant benefits to be realized. These programmes are examples of different levels of integrated systems that have been designed to produce digital products. They provide insights to the management complexities to be considered for very large integrated digital systems. In recognition of computer industry trends, a knowledge-based architecture for managing the complexity of the very large spatial information systems of the future is proposed.
ERIC Educational Resources Information Center
Blue, Elfreda; Tirotta, Rose
2011-01-01
Twenty-first century technology has changed the way tools are used to support and enhance learning and instruction. Cloud computing and interactive whiteboards make it possible for learners to interact, simulate, collaborate, and document learning experiences and real-world problem-solving. This article discusses how various technologies (blogs,…
Enhancing Student Success through the Use of Interactive Videodisc Technology.
ERIC Educational Resources Information Center
Pokrass, Richard J.; And Others
Burlington County College in New Jersey has integrated Interactive Videodisc Technology (IVT) into several of its programs, beginning with the college's nursing program. IVT, at its highest level, is a merging of a laser disc player, a personal computer, computer software, and a qualified instructor, designed to bring to students a new dimension…
Academic Achievement Enhanced by Personal Digital Assistant Use
ERIC Educational Resources Information Center
Bick, Alexander
2005-01-01
Research during the past decade suggests that integrating computing technology in general, and mobile computers in particular, into the educational environment has positive effects. This is the first long-term study of high school Personal Digital Assistant use. It involved three parts and 146 students over four years. Part one found that PDA use…
NASA Astrophysics Data System (ADS)
Drescher, Anushka C.; Yost, Michael G.; Park, Doo Y.; Levine, Steven P.; Gadgil, Ashok J.; Fischer, Marc L.; Nazaroff, William W.
1995-05-01
Optical remote sensing and iterative computed tomography (CT) can be combined to measure the spatial distribution of gaseous pollutant concentrations in a plane. We have conducted chamber experiments to test this combination of techniques using an Open Path Fourier Transform Infrared Spectrometer (OP-FTIR) and a standard algebraic reconstruction technique (ART). ART was found to converge to solutions that showed excellent agreement with the ray integral concentrations measured by the FTIR but were inconsistent with simultaneously gathered point-sample concentration measurements. A new CT method was developed based on (a) the superposition of bivariate Gaussians to model the concentration distribution and (b) a simulated annealing minimization routine to find the parameters of the Gaussians that resulted in the best fit to the ray integral concentration data. This new method, named smooth basis function minimization (SBFM), generated reconstructions that agreed well, both qualitatively and quantitatively, with the concentration profiles generated from point sampling. We present one set of illustrative experimental data to compare the performance of ART and SBFM.
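A simplified sketch of the SBFM idea follows: a single bivariate Gaussian (rather than the full superposition used by the authors) is fitted to synthetic ray-integral data with SciPy's simulated-annealing optimizer. The rays, parameter bounds, and "measurements" are invented for illustration.

```python
# Simplified SBFM-style sketch: fit one bivariate Gaussian to ray-integral data
# by simulated annealing. The real method uses a superposition of Gaussians;
# the rays and "measurements" here are synthetic placeholders.
import numpy as np
from scipy.optimize import dual_annealing

def gaussian(x, y, p):
    amp, x0, y0, sx, sy = p
    return amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2) + (y - y0) ** 2 / (2 * sy ** 2)))

# Optical paths across a 10 m x 10 m plane, each defined by start and end points.
rays = [((0, 2), (10, 2)), ((0, 5), (10, 5)), ((0, 8), (10, 8)),
        ((2, 0), (2, 10)), ((5, 0), (5, 10)), ((8, 0), (8, 10))]

def ray_integral(p, ray, n=200):
    (x0, y0), (x1, y1) = ray
    t = np.linspace(0.0, 1.0, n)
    x, y = x0 + t * (x1 - x0), y0 + t * (y1 - y0)
    length = np.hypot(x1 - x0, y1 - y0)
    return np.trapz(gaussian(x, y, p), dx=length / (n - 1))

true_p = (1.0, 6.0, 4.0, 1.5, 2.0)                       # "unknown" plume
measured = np.array([ray_integral(true_p, r) for r in rays])

def misfit(p):
    model = np.array([ray_integral(p, r) for r in rays])
    return np.sum((model - measured) ** 2)

bounds = [(0.1, 5.0), (0.0, 10.0), (0.0, 10.0), (0.5, 5.0), (0.5, 5.0)]
result = dual_annealing(misfit, bounds, maxiter=200, seed=0)
print("recovered parameters:", result.x)
```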
Enhancing the Remote Variable Operations in NPSS/CCDK
NASA Technical Reports Server (NTRS)
Sang, Janche; Follen, Gregory; Kim, Chan; Lopez, Isaac; Townsend, Scott
2001-01-01
Many scientific applications in aerodynamics and solid mechanics are written in Fortran. Refitting these legacy Fortran codes with distributed objects can increase the code reusability. The remote variable scheme provided in NPSS/CCDK helps programmers easily migrate the Fortran codes towards a client-server platform. This scheme gives the client the capability of accessing the variables at the server site. In this paper, we review and enhance the remote variable scheme by using the operator overloading features in C++. The enhancement enables NPSS programmers to use remote variables in much the same way as traditional variables. The remote variable scheme adopts the lazy update approach and the prefetch method. The design strategies and implementation techniques are described in detail. Preliminary performance evaluation shows that communication overhead can be greatly reduced.
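As a language-independent analogy (in Python, not the NPSS/CCDK C++ API), the sketch below shows how operator overloading plus a lazily updated cache lets client code treat a remote value like an ordinary variable; the "server" dictionary, variable name, and values are hypothetical.

```python
# Analogy sketch of a "remote variable" with operator overloading, prefetch and
# lazy update. This is NOT the NPSS/CCDK interface; the dict SERVER stands in
# for the remote server-side variable store.
SERVER = {"rpm": 5000}            # hypothetical server-side store

class RemoteVariable:
    def __init__(self, name):
        self.name = name
        self._cache = SERVER[name]   # prefetch the current server value
        self._dirty = False

    def value(self):
        return self._cache

    def __iadd__(self, other):       # operator overloading: rv += x
        self._cache += other
        self._dirty = True           # defer the remote write (lazy update)
        return self

    def flush(self):
        if self._dirty:
            SERVER[self.name] = self._cache   # one round trip instead of many
            self._dirty = False

rpm = RemoteVariable("rpm")
rpm += 250
rpm += 250
rpm.flush()
print(SERVER["rpm"])   # 5500 after a single deferred update
```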
A rapid local singularity analysis algorithm with applications
NASA Astrophysics Data System (ADS)
Chen, Zhijun; Cheng, Qiuming; Agterberg, Frits
2015-04-01
The local singularity model developed by Cheng is fast gaining popularity for characterizing mineralization and detecting anomalies in geochemical, geophysical and remote sensing data. However, the conventional algorithm, which computes moving-average values at multiple scales, is time-consuming when analyzing a large dataset. The summed area table (SAT), also called an integral image, is a fast algorithm used within the Viola-Jones object detection framework in computer vision. Historically, the principle of the SAT is well known in the study of multi-dimensional probability distribution functions, namely in computing 2D (or ND) probabilities (area under the probability distribution) from the respective cumulative distribution functions. In this study we introduce the SAT and its variant, the rotated summed area table, into isotropic, anisotropic and directional local singularity mapping. Once the SAT has been computed, any rectangular sum can be evaluated at any scale or location in constant time. The sum over any rectangular region in the image can be computed with only 4 array accesses, independently of the size of the region, effectively reducing the time complexity from O(n) to O(1). New programs in Python, Julia, MATLAB and C++ were implemented to suit different applications, especially big data analysis. Several large geochemical and remote sensing datasets were tested. A wide variety of scale changes (linear or logarithmic spacing) for non-iterative and iterative approaches were adopted to calculate the singularity index values and compare the results. The results indicate that local singularity analysis with the SAT is more robust and superior to the traditional approach in identifying anomalies.
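The constant-time rectangle sum is easy to demonstrate; the NumPy sketch below builds a zero-padded summed area table and evaluates a block sum with four array accesses. The input image is random data standing in for a geochemical grid.

```python
# Summed-area-table sketch: build the table once, then any rectangular sum
# needs only four array accesses, independent of the rectangle's size.
import numpy as np

img = np.random.rand(1000, 1000)          # stand-in for a geochemical grid

# Zero-padded SAT so the 4-access formula needs no edge cases.
sat = np.zeros((img.shape[0] + 1, img.shape[1] + 1))
sat[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)

def rect_sum(r0, c0, r1, c1):
    """Sum of img[r0:r1, c0:c1] in O(1) time."""
    return sat[r1, c1] - sat[r0, c1] - sat[r1, c0] + sat[r0, c0]

print(rect_sum(100, 100, 400, 700))
print(img[100:400, 100:700].sum())        # same value, computed the slow way
```

Because each window sum is O(1), the multi-scale moving averages needed for singularity mapping no longer dominate the run time as window sizes grow.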
KNET - DISTRIBUTED COMPUTING AND/OR DATA TRANSFER PROGRAM
NASA Technical Reports Server (NTRS)
Hui, J.
1994-01-01
KNET facilitates distributed computing between a UNIX compatible local host and a remote host which may or may not be UNIX compatible. It is capable of automatic remote login. That is, it performs on the user's behalf the chore of handling host selection, user name, and password to the designated host. Once the login has been successfully completed, the user may interactively communicate with the remote host. Data output from the remote host may be directed to the local screen, to a local file, and/or to a local process. Conversely, data input from the keyboard, a local file, or a local process may be directed to the remote host. KNET takes advantage of the multitasking and terminal mode control features of the UNIX operating system. A parent process is used as the upper layer for interfacing with the local user. A child process is used for a lower layer for interfacing with the remote host computer, and optionally one or more child processes can be used for the remote data output. Output may be directed to the screen and/or to the local processes under the control of a data pipe switch. In order for KNET to operate, the local and remote hosts must observe a common communications protocol. KNET is written in ANSI standard C-language for computers running UNIX. It has been successfully implemented on several Sun series computers and a DECstation 3100 and used to run programs remotely on VAX VMS and UNIX based computers. It requires 100K of RAM under SunOS and 120K of RAM under DEC RISC ULTRIX. An electronic copy of the documentation is provided on the distribution medium. The standard distribution medium for KNET is a .25 inch streaming magnetic tape cartridge in UNIX tar format. It is also available on a 3.5 inch diskette in UNIX tar format. KNET was developed in 1991 and is a copyrighted work with all copyright vested in NASA. UNIX is a registered trademark of AT&T Bell Laboratories. Sun and SunOS are trademarks of Sun Microsystems, Inc. DECstation, VAX, VMS, and ULTRIX are trademarks of Digital Equipment Corporation.
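The parent/child layering and the data pipe switch can be sketched in Python as a structural analogy only; this is not KNET itself, no real remote host is contacted, and the "remote" output lines are placeholders.

```python
# Structural analogy to the KNET layering (not KNET): a child process stands in
# for the remote-host interface and streams "remote" output, while the parent
# acts as a pipe switch, directing each line to the screen and/or a local file.
import multiprocessing as mp

def remote_layer(pipe):
    for line in ["login ok", "job started", "job finished"]:
        pipe.send(line)          # in KNET this data would come from the remote host
    pipe.close()

if __name__ == "__main__":
    parent_end, child_end = mp.Pipe()
    child = mp.Process(target=remote_layer, args=(child_end,))
    child.start()
    child_end.close()            # parent drops its copy so EOF is detected

    to_screen, to_file = True, True      # the "data pipe switch"
    with open("remote.log", "w") as log:
        while True:
            try:
                line = parent_end.recv()
            except EOFError:
                break
            if to_screen:
                print(line)
            if to_file:
                log.write(line + "\n")
    child.join()
```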
NASA Astrophysics Data System (ADS)
Zhao, Shaoshuai; Ni, Chen; Cao, Jing; Li, Zhengqiang; Chen, Xingfeng; Ma, Yan; Yang, Leiku; Hou, Weizhen; Qie, Lili; Ge, Bangyu; Liu, Li; Xing, Jin
2018-03-01
Remote sensing images are usually affected by atmospheric components, especially aerosol particles. For quantitative remote sensing applications, radiative-transfer-model-based atmospheric correction is used to retrieve surface reflectance by decoupling the atmosphere and the surface, at the cost of long computation times. Parallel computing is one way to accelerate this step. A parallel strategy in which multiple CPUs work simultaneously is designed to perform atmospheric correction of a multispectral remote sensing image. The flow of the parallel framework and the main parallel body of the atmospheric correction are described. A multispectral image from the Chinese Gaofen-2 satellite is then used to test the acceleration efficiency. As the number of CPUs increases from 1 to 8, the computational speed also increases; the maximum speedup is 6.5. With 8 CPUs, atmospheric correction of the whole image takes 4 minutes.
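A hedged sketch of the multi-CPU strategy is shown below: the image is split into row tiles and a per-tile correction runs in a process pool. The gain/offset "correction" is a toy stand-in for the radiative-transfer-based step, and the image is synthetic.

```python
# Hedged sketch of tile-parallel atmospheric correction with a process pool.
# The per-tile "correction" is a toy gain/offset, not a radiative-transfer code.
import numpy as np
from multiprocessing import Pool

def correct_tile(tile):
    gain, offset = 1.8, 0.02          # placeholder atmospheric parameters
    return tile * gain - offset

if __name__ == "__main__":
    image = np.random.rand(4, 4096, 4096).astype("float32")   # 4 synthetic bands
    tiles = np.array_split(image, 8, axis=1)                   # 8 row blocks

    with Pool(processes=8) as pool:
        corrected = np.concatenate(pool.map(correct_tile, tiles), axis=1)

    print(corrected.shape)
```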
Remote sensing and eLearning 2.0 for school education
NASA Astrophysics Data System (ADS)
Voss, Kerstin; Goetzke, Roland; Hodam, Henryk
2010-10-01
The "Remote Sensing in Schools" project aims at improving the integration of "Satellite remote sensing" into school teaching. Therefore, it is the project's overall objective to teach students in primary and secondary schools the basics and fields of application of remote sensing. Existing results show that many teachers are interested in remote sensing and at same time motivated to integrate it into their teaching. Despite the good intention, in the end, the implementation often fails due to the complexity and poor set-up of the information provided. Therefore, a comprehensive and well-structured learning platform on the topic of remote sensing is developed. The platform shall allow a structured introduction to the topic.
Durisko, Corrine; McCue, Michael; Doyle, Patrick J.; Dickey, Michael Walsh
2016-01-01
Abstract Background: Neuropsychological testing is a central aspect of stroke research because it provides critical information about the cognitive-behavioral status of stroke survivors, as well as the diagnosis and treatment of stroke-related disorders. Standard neuropsychological methods rely upon face-to-face interactions between a patient and researcher, which creates geographic and logistical barriers that impede research progress and treatment advances. Introduction: To overcome these barriers, we created a flexible and integrated system for the remote acquisition of neuropsychological data (RAND). The system we developed has a secure architecture that permits collaborative videoconferencing. The system supports shared audiovisual feeds that can provide continuous virtual interaction between a participant and researcher throughout a testing session. Shared presentation and computing controls can be used to deliver auditory and visual test items adapted from standard face-to-face materials or execute computer-based assessments. Spoken and manual responses can be acquired, and the components of the session can be recorded for offline data analysis. Materials and Methods: To evaluate its feasibility, our RAND system was used to administer a speech-language test battery to 16 stroke survivors with a variety of communication, sensory, and motor impairments. The sessions were initiated virtually without prior face-to-face instruction in the RAND technology or test battery. Results: Neuropsychological data were successfully acquired from all participants, including those with limited technology experience, and those with a communication, sensory, or motor impairment. Furthermore, participants indicated a high level of satisfaction with the RAND system and the remote assessment that it permits. Conclusions: The results indicate the feasibility of using the RAND system for virtual home-based neuropsychological assessment without prior face-to-face contact between a participant and researcher. Because our RAND system architecture uses off-the-shelf technology and software, it can be duplicated without specialized expertise or equipment. In sum, our RAND system offers a readily available and promising alternative to face-to-face neuropsychological assessment in stroke research. PMID:27214198
Multi-scale remote sensing of coral reefs
Andréfouët, Serge; Hochberg, E.J.; Chevillon, Christophe; Muller-Karger, Frank E.; Brock, John C.; Hu, Chuanmin
2005-01-01
In this chapter we present how both direct and indirect remote sensing can be integrated to address two major coral reef applications - coral bleaching and assessment of biodiversity. This approach reflects the current non-linear integration of remote sensing for environmental assessment of coral reefs, resulting from a rapid increase in available sensors, processing methods and interdisciplinary collaborations (Andréfouët and Riegl, 2004). Moreover, this approach has greatly benefited from recent collaborations of once independent investigations (e.g., benthic ecology, remote sensing, and numerical modeling).
ERIC Educational Resources Information Center
Yankelevich, Eleonora
2017-01-01
A variety of computing devices are available in today's classrooms, but they have not guaranteed the effective integration of technology. Nationally, teachers have ample devices, applications, productivity software, and digital audio and video tools. Despite all this, the literature suggests these tools are not employed to enhance student learning…
Impact of remote sensing upon the planning, management, and development of water resources
NASA Technical Reports Server (NTRS)
Castruccio, P. A.; Loats, H. L.; Fowler, T. R.; Frech, S. L.
1975-01-01
Principal water resources users were surveyed to determine the impact of remote data streams on hydrologic computer models. Analysis of responses demonstrated that: most water resources effort suitable to remote sensing inputs is conducted through federal agencies or through federally stimulated research; and, most hydrologic models suitable to remote sensing data are federally developed. Computer usage by major water resources users was analyzed to determine the trends of usage and costs for the principal hydrologic users/models. The laws and empirical relationships governing the growth of the data processing loads were described and applied to project the future data loads. Data loads for ERTS CCT image processing were computed and projected through the 1985 era.
Ezra, Elishai; Maor, Idan; Bavli, Danny; Shalom, Itai; Levy, Gahl; Prill, Sebastian; Jaeger, Magnus S; Nahmias, Yaakov
2015-08-01
Microfluidic applications range from combinatorial synthesis to high throughput screening, with platforms integrating analog perfusion components, digitally controlled micro-valves and a range of sensors that demand a variety of communication protocols. Currently, discrete control units are used to regulate and monitor each component, resulting in scattered control interfaces that limit data integration and synchronization. Here, we present a microprocessor-based control unit, utilizing the MS Gadgeteer open framework that integrates all aspects of microfluidics through a high-current electronic circuit that supports and synchronizes digital and analog signals for perfusion components, pressure elements, and arbitrary sensor communication protocols using a plug-and-play interface. The control unit supports an integrated touch screen and TCP/IP interface that provides local and remote control of flow and data acquisition. To establish the ability of our control unit to integrate and synchronize complex microfluidic circuits we developed an equi-pressure combinatorial mixer. We demonstrate the generation of complex perfusion sequences, allowing the automated sampling, washing, and calibrating of an electrochemical lactate sensor continuously monitoring hepatocyte viability following exposure to the pesticide rotenone. Importantly, integration of an optical sensor allowed us to implement automated optimization protocols that require different computational challenges including: prioritized data structures in a genetic algorithm, distributed computational efforts in multiple-hill climbing searches and real-time realization of probabilistic models in simulated annealing. Our system offers a comprehensive solution for establishing optimization protocols and perfusion sequences in complex microfluidic circuits.
Computer vision and augmented reality in gastrointestinal endoscopy
Mahmud, Nadim; Cohen, Jonah; Tsourides, Kleovoulos; Berzin, Tyler M.
2015-01-01
Augmented reality (AR) is an environment-enhancing technology, widely applied in the computer sciences, which has only recently begun to permeate the medical field. Gastrointestinal endoscopy—which relies on the integration of high-definition video data with pathologic correlates—requires endoscopists to assimilate and process a tremendous amount of data in real time. We believe that AR is well positioned to provide computer-guided assistance with a wide variety of endoscopic applications, beginning with polyp detection. In this article, we review the principles of AR, describe its potential integration into an endoscopy set-up, and envisage a series of novel uses. With close collaboration between physicians and computer scientists, AR promises to contribute significant improvements to the field of endoscopy. PMID:26133175
Optical vs. electronic enhancement of remote sensing imagery
NASA Technical Reports Server (NTRS)
Colwell, R. N.; Katibah, E. F.
1976-01-01
Basic aspects of remote sensing are considered and a description is provided of the methods which are employed in connection with the optical or electronic enhancement of remote sensing imagery. The advantages and limitations of various image enhancement methods and techniques are evaluated. It is pointed out that optical enhancement methods and techniques are currently superior to electronic ones with respect to spatial resolution and equipment cost considerations. Advantages of electronic procedures, on the other hand, are related to a greater flexibility regarding the presentation of the information as an aid for the interpretation by the image analyst.
Evolving technologies for Space Station Freedom computer-based workstations
NASA Technical Reports Server (NTRS)
Jensen, Dean G.; Rudisill, Marianne
1990-01-01
Viewgraphs on evolving technologies for Space Station Freedom computer-based workstations are presented. The human-computer and computer software environment modules are described. The following topics are addressed: command and control workstation concept; cupola workstation concept; Japanese experiment module RMS workstation concept; remote devices controlled from workstations; orbital maneuvering vehicle free flyer; remote manipulator system; Japanese experiment module exposed facility; Japanese experiment module small fine arm; flight telerobotic servicer; human-computer interaction; and workstation/robotics related activities.
NASA Technical Reports Server (NTRS)
Vorosmarty, C.; Grace, A.; Moore, B.; Choudhury, B.; Willmott, C. J.
1990-01-01
A strategy is presented for integrating scanning multichannel microwave radiometer data from the Nimbus-7 satellite with meteorological station records and computer simulations of land surface hydrology, terrestrial nutrient cycling, and trace gas emission. Analysis of the observations together with radiative transfer analysis shows that in the tropics the temporal and spatial variations of the polarization difference are determined primarily by the structure and phenology of vegetation and seasonal inundations of major rivers and wetlands. It is concluded that the proposed surface hydrology model, along with climatological records, and, potentially, 37-GHz data for phenology, will provide inputs to a terrestrial ecosystem model that predicts regional net primary production and CO2 gas exchange.
NASA Astrophysics Data System (ADS)
Lasaponara, R.
2012-04-01
The great amount of multispectral VHR satellite imagery, now available even free of charge in Google Earth, has opened new strategic challenges in the field of remote sensing for archaeological studies. These challenges substantially deal with: (i) exploiting satellite data as fully as possible, (ii) setting up effective and reliable automatic and/or semiautomatic data processing strategies, and (iii) integrating other data sources, from documentary resources to traditional ground survey, historical documentation, geophysical prospection, etc. VHR satellites provide high-resolution data which can improve knowledge of past human activities, providing precious qualitative and quantitative information; these data have developed to such an extent that they currently share many of the physical characteristics of aerial imagery. This makes them ideal for investigations ranging from a local to a regional scale (see, for example, Lasaponara and Masini 2006a,b, 2007a, 2011; Masini and Lasaponara 2006, 2007; Sparavigna, 2010). Moreover, satellite data are still the only data source for research performed in areas where aerial photography is restricted for military or political reasons. Among the main advantages of using satellite remote sensing compared to traditional field archaeology, we focus here briefly on the use of wavelet data processing for enhancing Google Earth satellite data, with particular reference to multitemporal datasets. Study areas selected from Southern Italy, the Middle East and South America are presented and discussed. The results obtained point out that automatic image enhancement can be successfully applied as a first step of supervised classification and intelligent data analysis for the semiautomatic identification of features of archaeological interest. References: Lasaponara R, Masini N (2006a) On the potential of panchromatic and multispectral Quickbird data for archaeological prospection. Int J Remote Sens 27: 3607-3614. Lasaponara R, Masini N (2006b) Identification of archaeological buried remains based on Normalized Difference Vegetation Index (NDVI) from Quickbird satellite data. IEEE Geosci Remote S 3(3): 325-328. Lasaponara R, Masini N (2007a) Detection of archaeological crop marks by using satellite QuickBird multispectral imagery. J Archaeol Sci 34: 214-221. Lasaponara R, Masini N (2007b) Improving satellite Quickbird-based identification of landscape archaeological features through tasselled cap transformation and PCA. 21st CIPA Symposium, Athens, 1-6 June 2007. Lasaponara R, Masini N (2010) Facing the archaeological looting in Peru by local spatial autocorrelation statistics of very high resolution satellite imagery. In: Taniar D et al (Eds), Proceedings of ICSSA, The 2010 International Conference on Computational Science and its Application (Fukuoka, Japan, March 23-26, 2010), Springer, Berlin, 261-269. Lasaponara R, Masini N (2011) Satellite remote sensing in archaeology: past, present and future. J Archaeol Sci 38: 1995-2002. Lasaponara R, Masini N, Rizzo E, Orefici G (2011) New discoveries in the Piramide Naranjada in Cahuachi (Peru) using satellite, Ground Probing Radar and magnetic investigations. J Archaeol Sci 38: 2031-2039. Lasaponara R, Masini N, Scardozzi G (2008) Satellite-based archaeological research in the ancient territory of Hierapolis. 1st International EARSeL Workshop "Advances in Remote Sensing for Archaeology and Cultural Heritage Management", CNR, Rome, September 30-October 4, Aracne, Rome, pp. 11-16.
Lillesand T M, Kiefer R W (2000) Remote Sensing and Image Interpretation. John Wiley and Sons, New York. Masini N, Lasaponara R (2006) Satellite-based recognition of landscape archaeological features related to ancient human transformation. J Geophys Eng 3: 230-235, doi:10.1088/1742-2132/3/3/004. Masini N, Lasaponara R (2007) Investigating the spectral capability of QuickBird data to detect archaeological remains buried under vegetated and not vegetated areas. J Cult Herit 8(1): 53-60. Sparavigna A C (2010) Enhancing the Google imagery using a wavelet filter. http://arxiv.org/abs/1009.1590
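As a hedged illustration of wavelet-based enhancement of an image patch (not the authors' exact filter), the sketch below amplifies the detail coefficients of a 2-D discrete wavelet transform and reconstructs the patch; the patch, wavelet choice, and gain are invented.

```python
# Hedged sketch of wavelet-based image enhancement: decompose, amplify the
# detail (high-frequency) coefficients, reconstruct. Not the authors' filter.
import numpy as np
import pywt

patch = np.random.rand(256, 256)      # stand-in for a satellite image patch
gain = 1.6                            # detail-amplification factor (assumed)

coeffs = pywt.wavedec2(patch, "db2", level=2)
enhanced_coeffs = [coeffs[0]] + [
    tuple(gain * d for d in detail) for detail in coeffs[1:]
]
enhanced = pywt.waverec2(enhanced_coeffs, "db2")

print(enhanced.shape)
```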
Combining remotely sensed and other measurements for hydrologic areal averages
NASA Technical Reports Server (NTRS)
Johnson, E. R.; Peck, E. L.; Keefer, T. N.
1982-01-01
A method is described for combining measurements of hydrologic variables of various sampling geometries and measurement accuracies to produce an estimated mean areal value over a watershed and a measure of the accuracy of the mean areal value. The method provides a means to integrate measurements from conventional hydrological networks and remote sensing. The resulting areal averages can be used to enhance a wide variety of hydrological applications including basin modeling. The correlation area method assigns weights to each available measurement (point, line, or areal) based on the area of the basin most accurately represented by the measurement. The statistical characteristics of the accuracy of the various measurement technologies and of the random fields of the hydrologic variables used in the study (water equivalent of the snow cover and soil moisture) required to implement the method are discussed.
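The weighting idea reduces to a weighted average once each measurement has been assigned the basin area it best represents; a minimal sketch, with invented measurements and representative areas, is shown below.

```python
# Minimal sketch of the correlation-area weighting idea: each measurement
# (point, line, or areal) gets a weight proportional to the basin area it best
# represents; the mean areal value is the weighted average. Values are invented.
import numpy as np

values = np.array([12.0, 15.5, 10.2, 14.1])        # e.g., snow water equivalent (cm)
rep_area = np.array([120.0, 40.0, 300.0, 90.0])    # km^2 each measurement represents

weights = rep_area / rep_area.sum()
mean_areal_value = np.sum(weights * values)
print(round(mean_areal_value, 2))
```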
Progress in remote sensing (1972-1976)
Fischer, W. A.; Hemphill, W.R.; Kover, Allan
1976-01-01
This report concerns progress in remote sensing during the period 1972–1976. Remote sensing has been variously defined but is basically the art or science of telling something about an object without touching it. During the past four years, the major research thrusts have been in three areas: (1) computer-assisted enhancement and interpretation systems; (2) earth science applications of Landsat data; and (3) investigations of the usefulness of observations of luminescence, thermal infrared, and microwave energies. Based on data sales at the EROS Data Center, the largest users of Landsat data are industrial companies, followed by government agencies (both national and foreign) and academic institutions. Thermal surveys from aircraft have become largely operational; however, significant research is being undertaken in the field of thermal modeling and analysis of high-altitude images. Microwave research is increasing rapidly and programs are being developed for satellite observations. Microwave research is concentrating on oil spill detection, soil moisture measurement, and observations of ice distributions. Luminescence investigations offer promise for becoming a quantitative method of assessing vegetation stress and pollutant concentrations.
American Telemedicine Association: Telestroke Guidelines
Berg, Jill; Chong, Brian W.; Gross, Hartmut; Nystrom, Karin; Adeoye, Opeolu; Schwamm, Lee; Wechsler, Lawrence; Whitchurch, Sallie
2017-01-01
Abstract The following telestroke guidelines were developed to assist practitioners in providing assessment, diagnosis, management, and/or remote consultative support to patients exhibiting symptoms and signs consistent with an acute stroke syndrome, using telemedicine communication technologies. Although telestroke practices may include the more broad utilization of telemedicine across the entire continuum of stroke care, with some even consulting on all neurologic emergencies, this document focuses on the acute phase of stroke, including both pre- and in-hospital encounters for cerebrovascular neurological emergencies. These guidelines describe a network of audiovisual communication and computer systems for delivery of telestroke clinical services and include operations, management, administration, and economic recommendations. These interactive encounters link patients with acute ischemic and hemorrhagic stroke syndromes with acute care facilities with remote and on-site healthcare practitioners providing access to expertise, enhancing clinical practice, and improving quality outcomes and metrics. These guidelines apply specifically to telestroke services and they do not prescribe or recommend overall clinical protocols for stroke patient care. Rather, the focus is on the unique aspects of delivering collaborative bedside and remote care through the telestroke model. PMID:28384077
ERIC Educational Resources Information Center
Psycharis, Sarantos; Botsari, Evanthia; Chatzarakis, George
2014-01-01
Learning styles are increasingly being integrated into computationally enhanced learning environments and a great deal of recent research work is taking place in this area. The purpose of this study was to examine the impact of the computational experiment approach, learning styles, epistemic beliefs, and engagement with the inquiry process on the…
California coast nearshore processes study
NASA Technical Reports Server (NTRS)
Pirie, D. M. (Principal Investigator); Steller, D. D.
1973-01-01
The author has identified the following significant results. Remote sensor aircraft flights took place simultaneously with ERTS-1 overpasses at the San Francisco, Monterey Bay, and Santa Barbara test cells. The cameras and scanners used were configured for detecting suspended sediment and for maximum water penetration. The Ektachrome/Wratten 12 photographs which were intentionally overexposed 1-1/2 stops were found to show the most extensive sediment transport detail. Minus blue/K 2 photographs illustrate nearshore underwater bottom detail including the head of the Mugu submarine canyon. The EMSIDE 9 channel scanner was employed to classify and differentiate suspended sediment, oil, kelp, and other materials found in the nearshore area. Processing of bulk ERTS-1 computer compatible tapes was utilized to enhance and analyze nearshore sediments. This technique was most successful in enhancing subtle nearshore features found to be faint or invisible on prints made from the supplied negatives. In addition to this continuing computer process, an effort was initiated to interface density values from the bulk tapes into contouring and mapping software.
Flexible distributed architecture for semiconductor process control and experimentation
NASA Astrophysics Data System (ADS)
Gower, Aaron E.; Boning, Duane S.; McIlrath, Michael B.
1997-01-01
Semiconductor fabrication requires an increasingly expensive and integrated set of tightly controlled processes, driving the need for a fabrication facility with fully computerized, networked processing equipment. We describe an integrated, open system architecture enabling distributed experimentation and process control for plasma etching. The system was developed at MIT's Microsystems Technology Laboratories and employs in-situ CCD interferometry based analysis in the sensor-feedback control of an Applied Materials Precision 5000 Plasma Etcher (AME5000). Our system supports accelerated, advanced research involving feedback control algorithms, and includes a distributed interface that utilizes the internet to make these fabrication capabilities available to remote users. The system architecture is both distributed and modular: specific implementation of any one task does not restrict the implementation of another. The low level architectural components include a host controller that communicates with the AME5000 equipment via SECS-II, and a host controller for the acquisition and analysis of the CCD sensor images. A cell controller (CC) manages communications between these equipment and sensor controllers. The CC is also responsible for process control decisions; algorithmic controllers may be integrated locally or via remote communications. Finally, a system server manages connections from internet/intranet (web) based clients and uses a direct link with the CC to access the system. Each component communicates via a predefined set of TCP/IP socket based messages. This flexible architecture makes integration easier and more robust, and enables separate software components to run on the same or different computers independent of hardware or software platform.
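A toy version of such predefined TCP/IP socket messages (not MIT's actual message set) can be sketched as a client/server exchange on localhost; the message strings, port, and "cell controller" behavior are invented.

```python
# Hedged sketch of a predefined socket-message exchange between components:
# a toy "cell controller" answers a GET_STATUS request on localhost.
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 50007

def cell_controller():
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            if conn.recv(1024).decode() == "GET_STATUS":
                conn.sendall(b"ETCH_STEP=3;ENDPOINT=FALSE")   # invented reply

threading.Thread(target=cell_controller, daemon=True).start()
time.sleep(0.2)   # give the server thread time to bind

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect((HOST, PORT))
    cli.sendall(b"GET_STATUS")
    print(cli.recv(1024).decode())
```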
NASA Technical Reports Server (NTRS)
Butera, M. K. (Principal Investigator)
1978-01-01
The author has identified the following significant results. Major vegetative classes identified by the remote sensing technique were cypress swamp, pine, wetland grasses, salt grass, mixed mangrove, black mangrove, and Brazilian pepper. Australian pine and melaleuca were not satisfactorily classified from LANDSAT. Aircraft scanners provided better resolution, resulting in a classification of finer surface detail. An edge effect, created by the integration of diverse spectral responses within boundary elements of digital data, affected the wetlands classification. Classification accuracy was 68% for the aircraft data and 74% for LANDSAT.
NASA Technical Reports Server (NTRS)
Ishov, Alexander G.
1994-01-01
An asymptotic approach to the solution of inverse problems in remote sensing is presented. It consists in replacing the integral operators characteristic of outgoing radiation with their asymptotic analogues. Such an approach does not add new principal uncertainties to the problem and significantly reduces computation time, which allows real-time (or near real-time) algorithms for the interpretation of satellite measurements to be developed. The asymptotic approach has been applied to estimating the vertical ozone distribution from satellite measurements of backscattered solar UV radiation in the Earth's atmosphere.
Ubiquitous Computing for Remote Cardiac Patient Monitoring: A Survey
Kumar, Sunil; Kambhatla, Kashyap; Hu, Fei; Lifson, Mark; Xiao, Yang
2008-01-01
New wireless technologies, such as wireless LAN and sensor networks, for telecardiology purposes give new possibilities for monitoring vital parameters with wearable biomedical sensors, and give patients the freedom to be mobile and still be under continuous monitoring and thereby better quality of patient care. This paper will detail the architecture and quality-of-service (QoS) characteristics in integrated wireless telecardiology platforms. It will also discuss the current promising hardware/software platforms for wireless cardiac monitoring. The design methodology and challenges are provided for realistic implementation. PMID:18604301
Present status and trends of image fusion
NASA Astrophysics Data System (ADS)
Xiang, Dachao; Fu, Sheng; Cai, Yiheng
2009-10-01
Image fusion extracts information from multiple images that is more accurate and reliable than information obtained from a single image. Since different images capture different aspects of the measured scene, comprehensive information can be obtained by integrating them. Image fusion is a main branch of the application of data fusion technology. It is widely used in computer vision, remote sensing, robot vision, medical image processing, and military applications. This paper presents the scope and research methods of image fusion, reviews its current status at home and abroad, and analyzes development trends.
Computer aided design and manufacturing: analysis and development of research issues
NASA Astrophysics Data System (ADS)
Taylor, K.; Jadeja, J. C.
2005-11-01
The paper focuses on current issues in the areas of computer-aided manufacturing and design. The importance of integrating CAD and CAM is analyzed, and the issues associated with this integration and recent advancements in the field are documented. The development of methods for enhancing productivity is explored. A research experiment was conducted in the laboratories of West Virginia University with the objective of characterizing the effects of various machining parameters on production. Graphical results and their interpretations are supplied to illustrate the main purpose of the experimentation.
Rahman, Md Rejaur; Shi, Z H; Chongfa, Cai
2014-11-01
This study analyses regional environmental quality with the application of remote sensing, geographical information systems (GIS), and spatial multiple criteria decision analysis, and proposes a quantitative method for identifying the status of the regional environment of the study area. Using a spatial multi-criteria evaluation (SMCE) approach with expert knowledge, an integrated regional environmental quality index (REQI) was computed and classified into five levels of regional environmental quality: worse, poor, moderate, good, and very good. During the process, a set of spatial criteria was selected (here, 15 criteria) together with the degree of importance of each criterion for the sustainability of the regional environment. Integrated remote sensing and GIS techniques and models were applied to generate the necessary factor (criterion) maps for the SMCE approach. Ranking along with the expected-value method was used to standardize the factors, and an analytical hierarchy process (AHP) was applied to calculate the factor weights. The entire process was executed in the integrated land and water information system (ILWIS) software tool that supports SMCE. The analysis showed that the overall regional environmental quality of the area was at a moderate level and was partly determined by elevation. Areas with worse and poor environmental quality indicated that the regional environmental status has declined in these parts of the county. The study also revealed that human activities, vegetation condition, soil erosion, topography, climate, and soil conditions have a serious influence on the regional environmental condition of the area. Considering the regional characteristics of environmental quality, priorities, and practical needs for environmental restoration, the study area was further regionalized into four priority areas which may serve as base areas of decision making for the recovery, rebuilding, and protection of the environment.
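A hedged sketch of the SMCE aggregation follows: AHP weights are taken from the principal eigenvector of a pairwise comparison matrix and used in a weighted linear combination of standardized criterion layers. Three criteria stand in for the study's fifteen, and all numbers are illustrative, not the paper's values.

```python
# Hedged sketch of AHP-weighted multi-criteria aggregation into a quality index.
# The comparison matrix and criterion layers are invented placeholders.
import numpy as np

# Pairwise comparison matrix for 3 criteria (Saaty-style scale), invented values.
A = np.array([[1.0, 3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
w = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
weights = w / w.sum()                      # AHP weights, sum to 1

# Standardized (0-1) criterion layers, e.g., vegetation, erosion, slope.
layers = np.random.rand(3, 200, 200)

reqi = np.tensordot(weights, layers, axes=1)   # weighted linear combination
print(weights.round(3), reqi.shape)
```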
Utilization of Internet Protocol-Based Voice Systems in Remote Payload Operations
NASA Technical Reports Server (NTRS)
Chamberlain, jim; Bradford, Bob; Best, Susan; Nichols, Kelvin
2002-01-01
Due to limited crew availability to support science and the large number of experiments to be operated simultaneously, telescience is key to a successful International Space Station (ISS) science program. Crew, operations personnel at NASA centers, and researchers at universities and companies around the world must work closely together to perform scientific experiments on board ISS. The deployment of reliable high-speed Internet Protocol (IP)-based networks promises to greatly enhance telescience capabilities. These networks are now being used to cost-effectively extend the reach of remote mission support systems. They reduce the need for dedicated leased lines and travel while improving distributed workgroup collaboration capabilities. NASA has initiated use of Voice over Internet Protocol (VoIP) to supplement the existing mission voice communications system used by researchers at their remote sites. The Internet Voice Distribution System (IVoDS) connects remote researchers to mission support "loops" or conferences via NASA networks and Internet 2. Researchers use IVoDS software on personal computers to talk with operations personnel at NASA centers. IVoDS also has the capability, if authorized, to allow researchers to communicate with the ISS crew during experiment operations. IVoDS was developed by Marshall Space Flight Center with contractors & Technology, First Virtual Communications, Lockheed-Martin, and VoIP Group. IVoDS is currently undergoing field testing, with full deployment for up to 50 simultaneous users expected in 2002. Research is being performed in parallel with IVoDS deployment for a next-generation system to qualitatively enhance communications among ISS operations personnel. In addition to the current voice capability, video and data/application-sharing capabilities are being investigated. IVoDS technology is also being considered for mission support systems for programs such as the Space Launch Initiative and Homeland Defense.
Khashan, S. A.; Alazzam, A.; Furlani, E. P.
2014-01-01
A microfluidic design is proposed for realizing greatly enhanced separation of magnetically-labeled bioparticles using integrated soft-magnetic elements. The elements are fixed and intersect the carrier fluid (flow-invasive) with their length transverse to the flow. They are magnetized using a bias field to produce a particle capture force. Multiple stair-step elements are used to provide efficient capture throughout the entire flow channel. This is in contrast to conventional systems wherein the elements are integrated into the walls of the channel, which restricts efficient capture to limited regions of the channel due to the short range nature of the magnetic force. This severely limits the channel size and hence throughput. Flow-invasive elements overcome this limitation and enable microfluidic bioseparation systems with superior scalability. This enhanced functionality is quantified for the first time using a computational model that accounts for the dominant mechanisms of particle transport including fully-coupled particle-fluid momentum transfer. PMID:24931437
NASA Technical Reports Server (NTRS)
Dorais, Gregory A.; Kurien, James; Rajan, Kanna
1999-01-01
We describe the computer demonstration of the Remote Agent Experiment (RAX). The Remote Agent is a high-level, model-based, autonomous control agent being validated on the NASA Deep Space 1 spacecraft.
The Fabric for Frontier Experiments Project at Fermilab
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirby, Michael
2014-01-01
The FabrIc for Frontier Experiments (FIFE) project is a new, far-reaching initiative within the Fermilab Scientific Computing Division to drive the future of computing services for experiments at FNAL and elsewhere. It is a collaborative effort between computing professionals and experiment scientists to produce an end-to-end, fully integrated set of services for computing on the grid and clouds, managing data, accessing databases, and collaborating within experiments. FIFE includes 1) easy to use job submission services for processing physics tasks on the Open Science Grid and elsewhere, 2) an extensive data management system for managing local and remote caches, cataloging, querying, moving, and tracking the use of data, 3) custom and generic database applications for calibrations, beam information, and other purposes, 4) collaboration tools including an electronic log book, speakers bureau database, and experiment membership database. All of these aspects will be discussed in detail. FIFE sets the direction of computing at Fermilab experiments now and in the future, and therefore is a major driver in the design of computing services worldwide.
CAD/CAE Integration Enhanced by New CAD Services Standard
NASA Technical Reports Server (NTRS)
Claus, Russell W.
2002-01-01
A Government-industry team led by the NASA Glenn Research Center has developed a computer interface standard for accessing data from computer-aided design (CAD) systems. The Object Management Group, an international computer standards organization, has adopted this CAD services standard. The new standard allows software (e.g., computer-aided engineering (CAE) and computer-aided manufacturing software) to access multiple CAD systems through one programming interface. The interface is built on top of a distributed computing system called the Common Object Request Broker Architecture (CORBA). CORBA allows the CAD services software to operate in a distributed, heterogeneous computing environment.
NASA Technical Reports Server (NTRS)
Wharton, S. W.
1980-01-01
An Interactive Cluster Analysis Procedure (ICAP) was developed to derive classifier training statistics from remotely sensed data. The algorithm interfaces the rapid numerical processing capacity of a computer with the human ability to integrate qualitative information. Control of the clustering process alternates between the algorithm, which creates new centroids and forms clusters, and the analyst, who evaluates the result and may elect to modify the cluster structure. Clusters can be deleted or lumped pairwise, or new centroids can be added. A summary of the cluster statistics can be requested to facilitate cluster manipulation. ICAP was implemented in APL (A Programming Language), an interactive computer language. The flexibility of the algorithm was evaluated using data from different LANDSAT scenes to simulate two situations: one in which the analyst is assumed to have no prior knowledge about the data and wishes to have the clusters formed more or less automatically, and the other in which the analyst is assumed to have some knowledge about the data structure and wishes to use that information to closely supervise the clustering process. For comparison, an existing clustering method was also applied to the two data sets.
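The alternation between automatic clustering and analyst edits can be sketched as follows (a Python analogy, not the original APL implementation); the data are synthetic and the "lump" and "add centroid" steps emulate the analyst's actions.

```python
# Analogy sketch of ICAP-style interaction (not the original APL code): the
# machine assigns pixels to the nearest centroid, and the analyst can lump two
# clusters or add a centroid before re-clustering. Data are synthetic.
import numpy as np

rng = np.random.default_rng(0)
pixels = rng.random((500, 4))                 # 500 pixels, 4 spectral bands
centroids = list(rng.random((3, 4)))          # initial cluster centroids

def assign(pixels, centroids):
    d = np.linalg.norm(pixels[:, None, :] - np.array(centroids)[None, :, :], axis=2)
    return d.argmin(axis=1)

def lump(centroids, i, j):
    """Analyst action: merge clusters i and j into their mean centroid."""
    merged = (centroids[i] + centroids[j]) / 2.0
    return [c for k, c in enumerate(centroids) if k not in (i, j)] + [merged]

labels = assign(pixels, centroids)
print("cluster sizes:", np.bincount(labels))

centroids = lump(centroids, 0, 1)             # analyst lumps two similar clusters
centroids.append(rng.random(4))               # analyst adds a new centroid
labels = assign(pixels, centroids)
print("cluster sizes after edits:", np.bincount(labels))
```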
Gray, B G; Ichise, M; Chung, D G; Kirsh, J C; Franks, W
1992-01-01
The functional imaging modality has potential for demonstrating parenchymal abnormalities not detectable by traditional morphological imaging. Fifty-three patients with a remote history of traumatic brain injury (TBI) were studied with SPECT using 99mTc-hexamethylpropyleneamineoxime (HMPAO) and x-ray computed tomography (CT). Overall, 42 patients (80%) showed regional cerebral blood flow (rCBF) deficits by HMPAO SPECT, whereas 29 patients (55%) showed morphological abnormalities by CT. Out of 20 patients with minor head injury, 12 patients (60%) showed rCBF deficits and 5 patients (25%) showed CT abnormalities. Of 33 patients with major head injury, 30 patients (90%) showed rCBF deficits and 24 patients (72%) showed CT abnormalities. Thus, HMPAO SPECT was more sensitive than CT in detecting abnormalities in patients with a history of TBI, particularly in the minor head injury group. In the major head injury group, three patients showed localized cortical atrophy by CT and normal rCBF by HMPAO SPECT. In the evaluation of TBI patients, HMPAO SPECT is a useful technique to demonstrate regional brain dysfunction in the presence of morphological integrity as assessed by CT.
Remote hardware-reconfigurable robotic camera
NASA Astrophysics Data System (ADS)
Arias-Estrada, Miguel; Torres-Huitzil, Cesar; Maya-Rueda, Selene E.
2001-10-01
In this work, a camera with integrated image processing capabilities is discussed. The camera is based on an imager coupled to an FPGA device (Field Programmable Gate Array) which contains an architecture for real-time computer vision low-level processing. The architecture can be reprogrammed remotely for application specific purposes. The system is intended for rapid modification and adaptation for inspection and recognition applications, with the flexibility of hardware and software reprogrammability. FPGA reconfiguration allows the same ease of upgrade in hardware as a software upgrade process. The camera is composed of a digital imager coupled to an FPGA device, two memory banks, and a microcontroller. The microcontroller is used for communication tasks and FPGA programming. The system implements a software architecture to handle multiple FPGA architectures in the device, and the possibility to download a software/hardware object from the host computer into its internal context memory. System advantages are: small size, low power consumption, and a library of hardware/software functionalities that can be exchanged during run time. The system has been validated with an edge detection and a motion processing architecture, which will be presented in the paper. Applications targeted are in robotics, mobile robotics, and vision based quality control.
Analysis, Mining and Visualization Service at NCSA
NASA Astrophysics Data System (ADS)
Wilhelmson, R.; Cox, D.; Welge, M.
2004-12-01
NCSA's goal is to create a balanced system that fully supports high-end computing as well as: 1) high-end data management and analysis; 2) visualization of massive, highly complex data collections; 3) large databases; 4) geographically distributed Grid computing; and 5) collaboratories, all based on a secure computational environment and driven with workflow-based services. To this end NCSA has defined a new technology path that includes the integration and provision of cyberservices in support of data analysis, mining, and visualization. NCSA has begun to develop and apply a data mining system, NCSA Data-to-Knowledge (D2K), in conjunction with both the application and research communities. NCSA D2K will enable the formation of model-based application workflows and visual programming interfaces for rapid data analysis. The Java-based D2K framework, which integrates analytical data mining methods with data management, data transformation, and information visualization tools, will be configurable from the cyberservices (web and grid services, tools, etc.) viewpoint to solve a wide range of important data mining problems. This effort will use modules, such as new classification methods for the detection of high-risk geoscience events, and existing D2K data management, machine learning, and information visualization modules. A D2K cyberservices interface will be developed to seamlessly connect client applications with remote back-end D2K servers, providing computational resources for data mining and integration with local or remote data stores. This work is being coordinated with SDSC's data and services efforts. The new NCSA Visualization embedded workflow environment (NVIEW) will be integrated with D2K functionality to tightly couple informatics and scientific visualization with the data analysis and management services. Visualization services will access and filter disparate data sources, simplifying tasks such as fusing related data from distinct sources into a coherent visual representation. This approach enables collaboration among geographically dispersed researchers via portals and front-end clients, and the coupling with data management services enables recording associations among datasets and building annotation systems into visualization tools and portals, giving scientists a persistent, shareable, virtual lab notebook. To facilitate provision of these cyberservices to the national community, NCSA will be providing a computational environment for large-scale data assimilation, analysis, mining, and visualization. This will be initially implemented on the new 512-processor shared-memory SGI systems recently purchased by NCSA. In addition to standard batch capabilities, NCSA will provide on-demand capabilities for those projects requiring rapid response (e.g., development of severe weather, earthquake events) for decision makers. It will also be used for non-sequential interactive analysis of data sets where it is important to have access to large data volumes over space and time.
Multiplexer/Demultiplexer Loading Tool (MDMLT)
NASA Technical Reports Server (NTRS)
Brewer, Lenox Allen; Hale, Elizabeth; Martella, Robert; Gyorfi, Ryan
2012-01-01
The purpose of the MDMLT is to improve the reliability and speed of loading multiplexers/demultiplexers (MDMs) in the Software Development and Integration Laboratory (SDIL) by automating the configuration management (CM) of the loads in the MDMs, automating the loading procedure, and providing the capability to load multiple or all MDMs concurrently. Loading may be accomplished in parallel or for single (remote) MDMs. The MDMLT is a Web-based tool that is capable of loading the entire International Space Station (ISS) MDM configuration in parallel. It is able to load Flight Equivalent Units (FEUs), enhanced, standard, and prototype MDMs as well as both EEPROM (Electrically Erasable Programmable Read-Only Memory) and SSMMU (Solid State Mass Memory Unit) (MASS Memory). This software has extensive configuration management to track loading history, and its performance improvement means the entire ISS MDM configuration of 49 MDMs can be loaded in approximately 30 minutes, as opposed to the 36 hours previously required using the flight method of S-Band uplink. The laptop version recently added to the MDMLT suite allows remote lab loading, with the CM information entered into a common database when the laptop is reconnected to the network. This allows the program to reconfigure the test rigs quickly between shifts, allowing the lab to support a variety of onboard configurations during a single day, based on upcoming or current missions. The MDMLT Computer Software Configuration Item (CSCI) supports a Web-based command and control interface to the user. An interface to the SDIL File Transfer Protocol (FTP) server is supported to import Integrated Flight Loads (IFLs) and Internal Product Release Notes (IPRNs) into the database. An interface to the Monitor and Control System (MCS) is supported to control the power state, and to enable or disable the debug port of the MDMs to be loaded. Two direct interfaces to the MDM are supported: a serial interface (debug port) to receive MDM memory dump data and the calculated checksum, and the Small Computer System Interface (SCSI) to transfer load files to MDMs with hard disks. File transfer from the MDM Loading Tool to EEPROM within the MDM is performed via the MIL-STD-1553 bus, making use of the Real-Time Input/Output Processors (RTIOP) when using the rig-based MDMLT, and via a bus box when using the laptop MDMLT. The bus box is a cost-effective alternative to PC-1553 cards for the laptop. It is noted that this system can be modified and adapted to any avionics laboratory for spacecraft computer loading, ship avionics, or aircraft avionics where multiple configurations and strong configuration management of software/firmware loads are required.
NASA Technical Reports Server (NTRS)
Edwards, J. W.; Deets, D. A.
1975-01-01
A cost-effective approach to flight testing advanced control concepts with remotely piloted vehicles is described. The approach utilizes a ground-based digital computer coupled to the remotely piloted vehicle's motion sensors and control surface actuators through telemetry links to provide high-bandwidth feedback control. The system was applied to the control of an unmanned 3/8-scale model of the F-15 airplane. The model was remotely augmented; that is, the F-15 mechanical and control augmentation flight control systems were simulated by the ground-based computer rather than being implemented in the vehicle itself. The results of flight tests of the model at high angles of attack are discussed.
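The remote-augmentation idea, closing the control loop on the ground through telemetry, can be illustrated with a toy feedback loop. The gain, time step, and one-state vehicle response below are illustrative assumptions only and have no relation to the actual F-15 control laws.

```python
# Toy illustration of remote augmentation: a ground-side loop reads a
# downlinked pitch-rate sample, computes a surface command with a simple
# proportional gain, and "uplinks" it. Gains and the one-state vehicle
# model are illustrative only, not the F-15 control laws.
def ground_control_loop(q0=0.2, q_cmd=0.0, gain=0.8, dt=0.02, steps=50):
    q = q0                                 # downlinked pitch rate (rad/s)
    for _ in range(steps):
        error = q_cmd - q                  # augmentation computed on the ground
        elevator_cmd = gain * error        # command sent over the uplink
        q += dt * (5.0 * elevator_cmd)     # crude vehicle response to the command
    return q

print(f"final pitch rate: {ground_control_loop():.4f} rad/s")
```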
NASA Astrophysics Data System (ADS)
Guardo, Roberto; De Siena, Luca
2017-04-01
The timely estimation of short- and long-term volcanic hazard relies on the existence of detailed 3D geophysical images of volcanic structures. High-resolution seismic models of the absorbing uppermost conduit systems and highly heterogeneous shallowest volcanic layers, while particularly challenging to obtain, provide important data to locate feasible eruptive centers and forecast flank collapses and lava ascent paths. Here, we model the volcanic structures of Mt. Etna (Sicily, Italy) and its outskirts using the Horizontal to Vertical Spectral Ratio method, generally applied in industrial and engineering settings. The integration of this technique with a Web-based Geographic Information System improves precision during the acquisition phase. It also integrates geological and geophysical visualization of 3D surface and subsurface structures in a queryable environment representing their exact three-dimensional geographic position, enhancing interpretation. The results show high-resolution 3D images of the shallowest volcanic and feeding systems, which complement (1) deeper seismic tomography imaging and (2) the results of recent remote sensing imaging. The main novelty with respect to previous models is the presence of a vertical structure that divides the pre-existing volcanic complexes of Ellittico and Cuvigghiuni, which could be interpreted as a transitional phase between the two systems. A comparison with recent remote sensing and geological results, however, shows clear connections between the anomaly and the dynamics active during the last 15 years. We infer that seismic noise measurements from miniaturized instruments, when combined with remote sensing techniques, represent an important resource for monitoring volcanic media and eruptions, reducing the risk of loss of human lives and instrumentation.
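The Horizontal to Vertical Spectral Ratio computation itself is straightforward; a minimal sketch is shown below, with synthetic three-component noise standing in for field recordings and the windowing and smoothing choices left as illustrative assumptions.

```python
# Minimal HVSR sketch: ratio of the combined horizontal spectrum to the
# vertical spectrum from three-component noise. Synthetic noise stands in
# for field recordings; windowing choices are illustrative.
import numpy as np
from scipy.signal import welch

fs = 100.0                                   # sampling rate (Hz)
rng = np.random.default_rng(1)
north, east, vert = rng.normal(size=(3, 60 * int(fs)))   # 60 s of noise per component

f, p_n = welch(north, fs=fs, nperseg=2048)
_, p_e = welch(east, fs=fs, nperseg=2048)
_, p_v = welch(vert, fs=fs, nperseg=2048)

# Quadratic mean of the horizontal amplitude spectra over the vertical one.
hvsr = np.sqrt((p_n + p_e) / 2.0) / np.sqrt(p_v)
peak = f[np.argmax(hvsr[1:]) + 1]            # skip the zero-frequency bin
print(f"HVSR peak near {peak:.2f} Hz")
```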
NASA Astrophysics Data System (ADS)
Sava, E.; Cervone, G.; Kalyanapu, A. J.; Sampson, K. M.
2017-12-01
The increasing trend in flooding events, paired with rapid urbanization and an aging infrastructure, is projected to enhance the risk of catastrophic losses and increase the frequency of both flash and large-area floods. During such events, it is critical for decision makers and emergency responders to have access to timely, actionable knowledge regarding preparedness, emergency response, and recovery before, during, and after a disaster. Large volumes of data derived from sophisticated sensors, mobile phones, and social media feeds are increasingly being used to improve citizen services and provide clues to the best way to respond to emergencies through the use of visualization and GIS mapping. Such data, coupled with recent advancements in techniques for fusing remote sensing with near-real-time heterogeneous datasets, have allowed decision makers to more efficiently extract precise and relevant knowledge and better understand how damage caused by disasters affects urban populations in real time. This research assesses the feasibility of integrating multiple sources of contributed data into hydrodynamic models for flood inundation simulation and damage assessment. It integrates multiple sources of high-resolution physiographic data, such as satellite remote sensing imagery, with non-authoritative data such as Civil Air Patrol (CAP) imagery and 'during-event' social media observations of flood inundation in order to improve flood mapping. The goal is to augment remote sensing imagery with new open-source datasets to generate flood extent maps at higher temporal and spatial resolution. The proposed methodology is applied to two test cases, relative to the 2013 Boulder, Colorado flood and the 2015 floods in Texas.
Pediatric digital chest imaging.
Tarver, R D; Cohen, M; Broderick, N J; Conces, D J
1990-01-01
The Philips Computed Radiography system performs well with pediatric portable chest radiographs, handling the throughput of a busy intensive care service 24 hours a day. Images are excellent and routinely provide a conventional (unenhanced) image and an edge-enhanced image. Radiation dose is decreased by the lowered frequency of repeat examinations and the ability of the plates to respond to a much lower dose and still provide an adequate image. The high quality and uniform density of serial PCR portable radiographs greatly enhance the diagnostic content of the films. Decreased resolution has not been a problem clinically. Image manipulation and electronic transfer to remote viewing stations appear to be helpful and are currently being evaluated further. The PCR system provides a marked improvement in pediatric portable chest radiology.
Techniques for using diazo materials in remote sensor data analysis
NASA Technical Reports Server (NTRS)
Whitebay, L. E.; Mount, S.
1978-01-01
The use of data derived from LANDSAT is facilitated when special products or computer-enhanced images can be analyzed. However, the facilities required to produce and analyze such products prevent many users from taking full advantage of the LANDSAT data. A simple, low-cost method is presented by which users can make their own specially enhanced composite images from the four-band black-and-white LANDSAT images by using the diazo process. The diazo process is described and a detailed procedure for making various color composites, such as color infrared, false natural color, and false color, is provided. The advantages and limitations of the diazo process are discussed. A brief discussion of the interpretation of diazo composites for land use mapping, with some typical examples, is included.
NASA Technical Reports Server (NTRS)
Maynard, Nancy G.; Yurchak, Boris S.; Sleptsov, Yuri A.; Turi, Johan Mathis; Mathlesen, Svein D.
2005-01-01
To adapt successfully to the major changes (climate, environmental, economic, social, and industrial) that have taken place across the Arctic in recent years, indigenous communities such as reindeer herders must become increasingly empowered with the best available technologies to add to their storehouse of traditional knowledge. Remotely sensed data and observations are providing increased capabilities for monitoring, risk mapping, and surveillance of parameters critical to the characterization of pasture quality and migratory routes, such as vegetation distribution, snow cover, infrastructure development, and pasture damage due to fires. This paper describes a series of remote sensing capabilities which are useful to reindeer husbandry and gives the results of the first year of a project, "Reindeer Mapper", a remote sensing and GIS-based system that brings together space technologies with indigenous knowledge for sustainable reindeer husbandry in the Russian Arctic. In this project, reindeer herders and scientists are joining together to utilize technologies to create a system for collecting and sharing space-based and indigenous knowledge in the Russian Arctic. The "Reindeer Mapper" system will help make technologies more readily available to the herder community for observing, data collection and analysis, monitoring, sharing, communications, and dissemination of information, to be integrated with traditional, local knowledge. This paper describes some of the technologies which comprise the system, including an intranet system to enable the team members to work together and share information electronically, remote sensing data for monitoring environmental parameters important to reindeer husbandry (e.g., SAR, Landsat, AVHRR, MODIS), indigenous knowledge about important environmental parameters, acquisition of ground-based measurements, and the integration of all useful data sets for more informed decision-making.
NASA Astrophysics Data System (ADS)
Nakamura, T. K. M.; Nakamura, R.; Varsani, A.; Genestreti, K. J.; Baumjohann, W.; Liu, Y.-H.
2018-05-01
A remote sensing technique to infer the local reconnection electric field based on in situ multipoint spacecraft observation at the reconnection separatrix is proposed. In this technique, the increment of the reconnected magnetic flux is estimated by integrating the in-plane magnetic field during the sequential observation of the separatrix boundary by multipoint measurements. We tested this technique by applying it to virtual observations in a two-dimensional fully kinetic particle-in-cell simulation of magnetic reconnection without a guide field and confirmed that the estimated reconnection electric field indeed agrees well with the exact value computed at the X-line. We then applied this technique to an event observed by the Magnetospheric Multiscale mission when crossing an energetic plasma sheet boundary layer during an intense substorm. The estimated reconnection electric field for this event is nearly 1 order of magnitude higher than a typical value of magnetotail reconnection.
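The core of the flux-increment estimate can be sketched in a few lines: integrate the in-plane magnetic field component sampled along each separatrix crossing to get the reconnected flux per unit out-of-plane length, then divide the change between two crossings by their time separation. All numbers below are synthetic and the simple 2D geometry is an assumption for illustration.

```python
# Schematic 2D sketch of the flux-increment estimate: integrate the in-plane
# field component normal to the virtual spacecraft path during each crossing,
# then difference the reconnected flux between two crossings and divide by
# their time separation. Values are synthetic.
import numpy as np

def reconnected_flux(b_normal_nT, dl_km):
    # Flux per unit out-of-plane length: sum of B_n * dl, converted to SI (Wb/m).
    return np.sum(np.asarray(b_normal_nT) * 1e-9 * dl_km * 1e3)

# Two virtual crossings separated by dt seconds (synthetic measurements).
flux_1 = reconnected_flux([2.0, 2.5, 3.0, 2.8], dl_km=500.0)
flux_2 = reconnected_flux([2.4, 3.1, 3.6, 3.3], dl_km=500.0)
dt = 10.0
e_rec = (flux_2 - flux_1) / dt                 # V/m
print(f"estimated reconnection electric field: {e_rec * 1e3:.3f} mV/m")
```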
Concept, Simulation, and Instrumentation for Radiometric Inflight Icing Detection
NASA Technical Reports Server (NTRS)
Ryerson, Charles; Koenig, George G.; Reehorst, Andrew L.; Scott, Forrest R.
2009-01-01
The multi-agency Flight in Icing Remote Sensing Team (FIRST), a consortium of the National Aeronautics and Space Administration (NASA), the Federal Aviation Administration (FAA), the National Center for Atmospheric Research (NCAR), the National Oceanographic and Atmospheric Administration (NOAA), and the Army Corps of Engineers (USACE), has developed technologies for remotely detecting hazardous inflight icing conditions. The USACE Cold Regions Research and Engineering Laboratory (CRREL) assessed the potential of onboard passive microwave radiometers for remotely detecting icing conditions ahead of aircraft. The dual wavelength system differences the brightness temperature of Space and clouds, with greater differences potentially indicating closer and higher magnitude cloud liquid water content (LWC). The Air Force RADiative TRANsfer model (RADTRAN) was enhanced to assess the flight track sensing concept, and a 'flying' RADTRAN was developed to simulate a radiometer system flying through simulated clouds. Neural network techniques were developed to invert brightness temperatures and obtain integrated cloud liquid water. In addition, a dual wavelength Direct-Detection Polarimeter Radiometer (DDPR) system was built for detecting hazardous drizzle drops. This paper reviews technology development to date and addresses initial polarimeter performance.
Ware, Matthew J; Curtis, Louis T; Wu, Min; Ho, Jason C; Corr, Stuart J; Curley, Steven A; Godin, Biana; Frieboes, Hermann B
2017-06-13
Although chemotherapy combined with radiofrequency exposure has shown promise in cancer treatment by coupling drug cytotoxicity with thermal ablation or thermally-induced cytotoxicity, limited access of the drug to tumor loci in hypo-vascularized lesions has hampered clinical application. We recently showed that high-intensity short-wave capacitively coupled radiofrequency (RF) electric-fields may reach inaccessible targets in vivo. This non-invasive RF combined with gemcitabine (Gem) chemotherapy enhanced drug uptake and effect in pancreatic adenocarcinoma (PDAC), notorious for having poor response and limited therapeutic options, but without inducing thermal injury. We hypothesize that the enhanced cytotoxicity derives from RF-facilitated drug transport in the tumor microenvironment. We propose an integrated experimental/computational approach to evaluate chemotherapeutic response combined with RF-induced phenotypic changes in tissue with impaired transport. Results show that RF facilitates diffusive transport in 3D cell cultures representing hypo-vascularized lesions, enhancing drug uptake and effect. Computational modeling evaluates drug vascular extravasation and diffusive transport as key RF-modulated parameters, with transport being dominant. Assessment of hypothetical schedules following current clinical protocol for Stage-IV PDAC suggests that unresponsive lesions may be growth-restrained when exposed to Gem plus RF. Comparison of these projections to experiments in vivo indicates that synergy may result from RF-induced cell phenotypic changes enhancing drug transport and cytotoxicity, thus providing a potential baseline for clinically-focused evaluation.
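The role of diffusive transport in a poorly vascularized lesion can be illustrated with a minimal 1D finite-difference model that contrasts a baseline diffusivity with an "RF-enhanced" one. All parameter values below are illustrative assumptions, not the calibrated model from the study.

```python
# Minimal 1D diffusion sketch contrasting baseline and "RF-enhanced"
# diffusivity for drug penetration into an avascular region. Explicit
# finite differences; all parameter values are illustrative.
import numpy as np

def penetration_depth(D_mm2_per_h, hours=24.0, length_mm=2.0, nx=200):
    dx = length_mm / nx
    dt = 0.4 * dx * dx / D_mm2_per_h          # stable explicit time step
    c = np.zeros(nx)
    c[0] = 1.0                                # constant drug source at the boundary
    for _ in range(int(hours / dt)):
        c[1:-1] += D_mm2_per_h * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
        c[0] = 1.0
    # Depth at which concentration falls below 10% of the source.
    below = np.where(c < 0.1)[0]
    return below[0] * dx if below.size else length_mm

print(f"baseline:    {penetration_depth(0.005):.2f} mm")
print(f"RF-enhanced: {penetration_depth(0.015):.2f} mm")
```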
Application of narrow-band television to industrial and commercial communications
NASA Technical Reports Server (NTRS)
Embrey, B. C., Jr.; Southworth, G. R.
1974-01-01
The development of narrow-band systems for use in space systems is presented. Applications of the technology to future spacecraft requirements are discussed along with narrow-band television's influence in stimulating development within the industry. The transferral of the technology into industrial and commercial communications is described. Major areas included are: (1) medicine; (2) education; (3) remote sensing for traffic control; and (4) weather observation. Applications in data processing, image enhancement, and information retrieval are provided by the combination of the TV camera and the computer.
Identifying environmental features for land management decisions
NASA Technical Reports Server (NTRS)
1981-01-01
The benefits of changes in management organization and facilities for the Center for Remote Sensing and Cartography in Utah are reported, as well as interactions with and outreach to state and local agencies. Completed projects are described which studied (1) Uinta Basin wetland/land use; (2) Davis County foothill development; (3) Farmington Bay shoreline fluctuation; (4) irrigation detection; and (5) satellite investigation of snow cover/mule deer relationships. Techniques developed for composite computer mapping, contrast enhancement, U-2 CIR/LANDSAT digital interface, factor analysis, and multivariate statistical analysis are described.
NASA Astrophysics Data System (ADS)
Papers are presented on ISDN, mobile radio systems and techniques for digital connectivity, centralized and distributed algorithms in computer networks, communications networks, quality assurance and impact on cost, adaptive filters in communications, the spread spectrum, signal processing, video communication techniques, and digital satellite services. Topics discussed include performance evaluation issues for integrated protocols, packet network operations, the computer network theory and multiple-access, microwave single sideband systems, switching architectures, fiber optic systems, wireless local communications, modulation, coding, and synchronization, remote switching, software quality, transmission, and expert systems in network operations. Consideration is given to wide area networks, image and speech processing, office communications application protocols, multimedia systems, customer-controlled network operations, digital radio systems, channel modeling and signal processing in digital communications, earth station/on-board modems, computer communications system performance evaluation, source encoding, compression, and quantization, and adaptive communications systems.
Ponce, Brent A; Menendez, Mariano E; Oladeji, Lasun O; Fryberger, Charles T; Dantuluri, Phani K
2014-11-01
The authors describe the first surgical case adopting the combination of real-time augmented reality and wearable computing devices such as Google Glass (Google Inc, Mountain View, California). A 66-year-old man presented to their institution for a total shoulder replacement after 5 years of progressive right shoulder pain and decreased range of motion. Throughout the surgical procedure, Google Glass was integrated with the Virtual Interactive Presence and Augmented Reality system (University of Alabama at Birmingham, Birmingham, Alabama), enabling the local surgeon to interact with the remote surgeon within the local surgical field. Surgery was well tolerated by the patient and early surgical results were encouraging, with an improvement of shoulder pain and greater range of motion. The combination of real-time augmented reality and wearable computing devices such as Google Glass holds much promise in the field of surgery. Copyright 2014, SLACK Incorporated.
Ishikawa, Rie; Fukushima, Hotaka; Frankland, Paul W; Kida, Satoshi
2016-01-01
Forgetting of recent fear memory is promoted by treatment with memantine (MEM), which increases hippocampal neurogenesis. Approaches to the treatment of post-traumatic stress disorder (PTSD) using rodent models have focused on the extinction and reconsolidation of recent, but not remote, memories. Here we show that, following prolonged re-exposure to the conditioning context, enhancers of hippocampal neurogenesis, including MEM, promote forgetting of remote contextual fear memory. However, these interventions are ineffective following shorter re-exposures. Importantly, we find that long, but not short, re-exposures activate gene expression in the hippocampus and induce hippocampus-dependent reconsolidation of remote contextual fear memory. Furthermore, remote memory retrieval becomes hippocampus-dependent after long-time recall, suggesting that remote fear memory returns to a hippocampus-dependent state after long-time recall, thereby allowing enhanced forgetting through increased hippocampal neurogenesis. Forgetting of traumatic memory may contribute to the development of treatments for PTSD. DOI: http://dx.doi.org/10.7554/eLife.17464.001 PMID:27669409
A high throughput geocomputing system for remote sensing quantitative retrieval and a case study
NASA Astrophysics Data System (ADS)
Xue, Yong; Chen, Ziqiang; Xu, Hui; Ai, Jianwen; Jiang, Shuzheng; Li, Yingjie; Wang, Ying; Guang, Jie; Mei, Linlu; Jiao, Xijuan; He, Xingwei; Hou, Tingting
2011-12-01
The quality and accuracy of remote sensing instruments have improved significantly; however, rapid processing of large-scale remote sensing data has become the bottleneck for remote sensing quantitative retrieval applications. Remote sensing quantitative retrieval is a data-intensive computation application and is one of the research issues of high-throughput computation. The remote sensing quantitative retrieval Grid workflow is a high-level core component of the remote sensing Grid, used to support the modeling, reconstruction, and implementation of large-scale complex applications of remote sensing science. In this paper, we study a middleware component of the remote sensing Grid: the dynamic Grid workflow based on the remote sensing quantitative retrieval application on a Grid platform. We designed a novel architecture for the remote sensing Grid workflow. According to this architecture, we constructed the Remote Sensing Information Service Grid Node (RSSN) with Condor. We developed a graphical user interface (GUI) tool to compose remote sensing processing Grid workflows, and took aerosol optical depth (AOD) retrieval as an example. The case study showed that significant improvement in system performance could be achieved with this implementation. The results also give a perspective on the potential of applying Grid workflow practices to remote sensing quantitative retrieval problems using commodity-class PCs.
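The high-throughput pattern underlying such a retrieval workflow (independent per-tile processing farmed out in parallel) can be sketched as follows. The per-tile "retrieval" here is a placeholder statistic, not an AOD algorithm, and the parallelism uses a local process pool rather than Condor; both are assumptions for illustration.

```python
# Minimal sketch of the high-throughput pattern behind the Grid workflow:
# split a scene into independent tiles and map a per-tile retrieval over
# them in parallel. retrieve_tile() is a placeholder, not an AOD algorithm.
import numpy as np
from multiprocessing import Pool

def split_tiles(scene, tile=256):
    rows, cols = scene.shape
    return [scene[r:r + tile, c:c + tile]
            for r in range(0, rows, tile) for c in range(0, cols, tile)]

def retrieve_tile(tile):
    # Placeholder "retrieval": a per-tile statistic standing in for AOD.
    return float(np.nanmean(tile))

if __name__ == "__main__":
    scene = np.random.rand(1024, 1024)         # stand-in for a calibrated scene
    with Pool(processes=4) as pool:
        results = pool.map(retrieve_tile, split_tiles(scene))
    print(f"{len(results)} tiles processed, mean of means {np.mean(results):.3f}")
```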
Monitoring glacier change: advances in cross-disciplinary research and data sharing methods
NASA Astrophysics Data System (ADS)
Arendt, A. A.; O'Neel, S.; Cogley, G.; Hill, D. F.; Hood, E. W.
2016-12-01
Recent studies have emphasized the importance of understanding interactions between glacier change and downstream ecosystems, ocean dynamics and human infrastructure. Despite the need for integrated assessments, few in-situ and remote sensing glacier monitoring studies also collect concurrent data on surrounding systems affected by glacier change. In addition, the sharing of glacier datasets across disciplines has often been hampered by limitations in data sharing technologies and a lack of data standardization. Here we provide an overview of recent efforts to facilitate distribution of glacier inventory/change datasets under the framework provided by the Global Terrestrial Network for Glaciers (GTN-G). New, web accessible data products include glacier thickness data and updated glacier extents from the Randolph Glacier Inventory. We also highlight a 2016 data collection effort led by the US Geological Survey on the Wolverine Glacier watershed, Alaska, USA. A large international team collected glaciological, water quality, snow cover, firn composition, vegetation and freshwater ecology data, using remote sensing/in-situ data and model simulations. We summarize preliminary results and outline our use of cloud-computing technologies to coordinate the integration of complex data types across multiple research teams.
Optimization of spectral bands for hyperspectral remote sensing of forest vegetation
NASA Astrophysics Data System (ADS)
Dmitriev, Egor V.; Kozoderov, Vladimir V.
2013-10-01
Accounting for the most informative spectral channels in hyperspectral remote sensing data processing, based on optimization principles, serves to enhance the efficiency of the high-performance computers employed. The problem of pattern recognition of remotely sensed land surface objects, with emphasis on forests, is outlined from the point of view of spectral channel optimization for the processed hyperspectral images. The relevant computational procedures are tested using images obtained by a Russian-built hyperspectral camera installed on a gyro-stabilized platform for airborne flight campaigns. A Bayesian classifier is used for the pattern recognition of forests of different tree species and ages. The probabilistically optimal algorithm constructed on the basis of the maximum likelihood principle is described, minimizing the probability of misclassification given by this classifier. Classification error, estimated by the standard holdout cross-validation method, is the principal measure of the accuracy of the applied algorithm. Details of the related techniques are presented. Results are shown for selecting the spectral channels of the camera while processing the images, taking into account radiometric distortions that diminish classification accuracy. Spectral channels are selected for the subclasses obtained from the proposed validation techniques, and confusion matrices are constructed that characterize the age composition of the classified pine species as well as broad age-class recognition for the pine and birch species with the fully illuminated parts of their crowns.
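A sketch of this channel-selection idea is given below: a per-class Gaussian maximum-likelihood classifier (here, scikit-learn's quadratic discriminant analysis) scored by holdout accuracy inside a greedy forward search over bands. The synthetic spectra, class structure, and the choice of five selected bands are assumptions standing in for the airborne data.

```python
# Sketch of channel selection for a Gaussian maximum-likelihood classifier:
# greedy forward selection of spectral bands scored by holdout accuracy.
# Synthetic spectra stand in for the airborne hyperspectral data.
import numpy as np
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_bands, n_per_class = 30, 200
means = rng.normal(size=(3, n_bands))                     # three synthetic classes
X = np.vstack([m + 0.5 * rng.normal(size=(n_per_class, n_bands)) for m in means])
y = np.repeat([0, 1, 2], n_per_class)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

selected, remaining = [], list(range(n_bands))
for _ in range(5):                                        # keep the 5 best channels
    scores = []
    for band in remaining:
        clf = QuadraticDiscriminantAnalysis().fit(X_tr[:, selected + [band]], y_tr)
        scores.append(clf.score(X_te[:, selected + [band]], y_te))
    best = remaining[int(np.argmax(scores))]
    selected.append(best)
    remaining.remove(best)
    print(f"selected bands {selected}, holdout accuracy {max(scores):.3f}")
```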
An Investigation of Multipath Effects on the GPS System During Auto-Rendezvous and Capture
NASA Technical Reports Server (NTRS)
Richie, James E.; Forest, Francis W.
1995-01-01
The proposed use of a Cargo Transport Vehicle (CTV) to carry hardware to the Space Station Freedom (SSF) during the construction phase of the SSF project requires remote maneuvering of the CTV. The CTV is not a manned vehicle. Obtaining the relative positions of the CTV and SSF for remote auto-rendezvous and capture (AR&C) scenarios will rely heavily on the Global Positioning System (GPS). The GPS system is expected to guide the CTV up to a distance of 100 to 300 meters from the SSF. At some point within this range, an optical docking system will take over the remote guidance for capture. During any remote guidance by GPS, it is possible that significant multipath signals may be caused by large objects in the vicinity of the module being remotely guided. This could alter the position obtained by the GPS system from the actual position. Due to the nature of the GPS signals, it has been estimated that if the difference in distance between the line-of-sight (LOS) path and the multipath is greater than 300 meters, the GPS system is capable of discriminating between the direct signal and the reflected (or multipath) signal. However, if the path difference is less than 300 meters, one must be concerned. This report details the work accomplished by the Electromagnetic Simulations Laboratory at Marquette University over the period December 1993 to May 1995. This work is an investigation of the strength and phase of a multipath signal arriving at the CTV relative to the direct or line-of-sight (LOS) signal. The signal originates at a GPS satellite in half-geostationary orbit and takes two paths to the CTV: (1) the direct or LOS path from the GPS satellite to the CTV; and (2) a scattered path from the GPS satellite to the SSF module and then to the CTV. The scattering from a cylinder has been computed using the physical optics approximation for the current. No other approximations or assumptions have been made, including no far-field or Fresnel-field approximations. The integrations required to obtain the scattered field have been computed numerically using an N-dimensional Romberg integration. The total scattered electric field is then projected onto the RCP component in the direction of propagation only. The direct or line-of-sight signal is then used to compute the relative strength and phase of the scattered field. The trajectory of the CTV has been parameterized into 4,214 points that are calculated for each of the geometries investigated. The motion of the CTV between points is small enough for the magnitude data (dB down from the direct signal) to appear very smooth; however, because of the distances and wavelengths involved, the phase of the scattered field relative to the direct signal varies very rapidly.
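The 300 m discrimination criterion and the rapid relative-phase variation can be made concrete with a small geometric sketch: compare the direct path with the reflected path via a nearby structure, test the threshold, and compute the carrier phase offset at the GPS L1 wavelength. The positions below are synthetic and are not a real AR&C geometry.

```python
# Geometric sketch of the multipath discrimination check: compare the direct
# (LOS) path with the reflected path via a nearby structure, test the 300 m
# criterion, and compute the relative carrier phase at GPS L1. Positions are
# synthetic.
import numpy as np

L1_WAVELENGTH = 0.1903                      # m, GPS L1 carrier

sat = np.array([20.2e6, 0.0, 0.0])          # GPS satellite (m)
ctv = np.array([100.0, 30.0, 0.0])          # vehicle near the station (m)
reflector = np.array([0.0, 0.0, 0.0])       # scattering structure (m)

d_los = np.linalg.norm(sat - ctv)
d_multi = np.linalg.norm(sat - reflector) + np.linalg.norm(reflector - ctv)
delta = d_multi - d_los

print(f"path difference: {delta:.1f} m")
print("receiver can separate the signals" if delta > 300.0
      else "multipath falls inside the region of concern")
rel_phase = 2 * np.pi * (delta % L1_WAVELENGTH) / L1_WAVELENGTH
print(f"relative carrier phase: {rel_phase:.2f} rad")
```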
Multimedia as a Means to Enhance Teaching Technical Vocabulary to Physics Undergraduates in Rwanda
ERIC Educational Resources Information Center
Rusanganwa, Joseph
2013-01-01
This study investigates whether the integration of ICT in education can facilitate teaching and learning. An example of such integration is computer assisted language learning (CALL) of English technical vocabulary by undergraduate physics students in Rwanda. The study draws on theories of cognitive load and multimedia learning to explore learning…
ERIC Educational Resources Information Center
Yaacob, Yuzita; Wester, Michael; Steinberg, Stanly
2010-01-01
This paper presents a prototype of a computer learning assistant ILMEV (Interactive Learning-Mathematica Enhanced Vector calculus) package with the purpose of helping students to understand the theory and applications of integration in vector calculus. The main problem for students using Mathematica is to convert a textbook description of a…
Providing Assistive Technology Applications as a Service Through Cloud Computing.
Mulfari, Davide; Celesti, Antonio; Villari, Massimo; Puliafito, Antonio
2015-01-01
Users with disabilities interact with Personal Computers (PCs) using Assistive Technology (AT) software solutions. Such applications run on a PC that a person with a disability commonly uses. However, the configuration of AT applications is not trivial at all, especially whenever the user needs to work on a PC that does not allow him/her to rely on his/her AT tools (e.g., at work, at university, at an Internet point). In this paper, we discuss how cloud computing provides a valid technological solution to enhance such a scenario. With the emergence of cloud computing, many applications are executed on top of virtual machines (VMs). Virtualization allows us to achieve a software implementation of a real computer able to execute a standard operating system and any kind of application. In this paper we propose to build personalized VMs running AT programs and settings. By using remote desktop technology, our solution enables users to control their customized virtual desktop environment by means of an HTML5-based web interface running on any computer equipped with a browser, wherever they are.
Flood Management Enhancement Using Remotely Sensed Data
NASA Technical Reports Server (NTRS)
Romanowski, Gregory J.
1997-01-01
SENTAR, Inc., entered into a cooperative agreement with NASA Goddard Space Flight Center (GSFC) in December 1994. The intent of the NASA Cooperative Agreement was to stimulate broad public use, via the Internet, of the very large remote sensing databases maintained by NASA and other agencies, thus stimulating U.S. economic growth, improving the quality of life, and contributing to the implementation of a National Information Infrastructure. SENTAR headed a team of collaborating organizations in meeting the goals of this project. SENTAR's teammates were the NASA Marshall Space Flight Center (MSFC) Global Hydrology and Climate Center (GHCC), the U.S. Army Space and Strategic Defense Command (USASSDC), and the Alabama Emergency Management Agency (EMA). For this cooperative agreement, SENTAR and its teammates accessed remotely sensed data in the Distributed Active Archive Centers, and other available sources, for use in enhancing the present capabilities for flood disaster management by the Alabama EMA. The project developed a prototype software system for addressing prediction, warning, and damage assessment for floods, though it currently focuses on assessment. The objectives of the prototype system were to demonstrate the added value of remote sensing data for emergency management operations during floods and the ability of the Internet to provide the primary communications medium for the system. To help achieve these objectives, SENTAR developed an integrated interface for the emergency operations staff to simplify acquiring and manipulating source data and data products for use in generating new data products. The prototype system establishes a systems infrastructure designed to expand to include future flood-related data and models or to include other disasters with their associated remote sensing data requirements and distributed data sources. This report covers the specific work performed during the seventh, and final, milestone period of the project, which began on 1 October 1996 and ended on 31 January 1997. In addition, it provides a summary of the entire project.
Scattering and radiative properties of complex soot and soot-containing particles
NASA Astrophysics Data System (ADS)
Liu, L.; Mishchenko, M. I.; Mackowski, D. W.; Dlugach, J.
2012-12-01
Tropospheric soot and soot-containing aerosols often exhibit nonspherical overall shapes and complex morphologies. They can mix externally, semi-externally, and internally with other aerosol species. This poses a tremendous challenge for particle characterization, remote sensing, and global climate modeling studies. To address these challenges, we used the new numerically exact public-domain Fortran-90 code based on the superposition T-matrix method (STMM) and other theoretical models to analyze the potential effects of aggregation and heterogeneity on light scattering and absorption by morphologically complex soot-containing particles. The parameters we computed include the full set of scattering matrix elements, linear depolarization ratios, optical cross sections, asymmetry parameters, and single-scattering albedos. It is shown that the optical characteristics of soot and soot-containing aerosols depend strongly on particle size, composition, and overall shape. The soot particle configurations and heterogeneities can have a substantial effect, resulting in a significant enhancement of extinction and absorption relative to values computed from the Lorenz-Mie theory. Meanwhile, the model-calculated information, combined with in situ and remotely sensed data, can be used to constrain soot particle shapes and sizes, which are much needed in climate models.
The 2009 DOD Cost Research Workshop: Acquisition Reform
2010-02-01
The workshop materials list DASA-CE research tasks, including "Command, Control, Communications, Computers, Intelligence, Surveillance, and ..."; the "... Management Information System (OSMIS) online interactive relational database"; and "ACEIT Enhancement, Help-Desk/Training, Consulting," which provides support and training for the Automated Cost Estimator Integrated Tools (ACEIT) software suite. ACEIT is the Army standard suite of analytical tools for ...
Engineering Research and Development and Technology thrust area report FY92
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langland, R.T.; Minichino, C.
1993-03-01
The mission of the Engineering Research, Development, and Technology Program at Lawrence Livermore National Laboratory (LLNL) is to develop the technical staff and the technology needed to support current and future LLNL programs. To accomplish this mission, the Engineering Research, Development, and Technology Program has two important goals: (1) to identify key technologies and (2) to conduct high-quality work to enhance our capabilities in these key technologies. To help focus our efforts, we identify technology thrust areas and select technical leaders for each area. The thrust areas are integrated engineering activities and, rather than being based on individual disciplines, they are staffed by personnel from Electronics Engineering, Mechanical Engineering, and other LLNL organizations, as appropriate. The thrust area leaders are expected to establish strong links to LLNL program leaders and to industry; to use outside and inside experts to review the quality and direction of the work; to use university contacts to supplement and complement their efforts; and to be certain that we are not duplicating the work of others. This annual report, organized by thrust area, describes activities conducted within the Program for the fiscal year 1992. Its intent is to provide timely summaries of objectives, theories, methods, and results. The nine thrust areas for this fiscal year are: Computational Electronics and Electromagnetics; Computational Mechanics; Diagnostics and Microelectronics; Emerging Technologies; Fabrication Technology; Materials Science and Engineering; Microwave and Pulsed Power; Nondestructive Evaluation; and Remote Sensing and Imaging, and Signal Engineering.
Remote health monitoring of heart failure with data mining via CART method on HRV features.
Pecchia, Leandro; Melillo, Paolo; Bracale, Marcello
2011-03-01
Disease management programs, which use no advanced information and computer technology, are as effective as telemedicine but more efficient because they are less costly. We proposed a platform to enhance the effectiveness and efficiency of home monitoring using data mining for early detection of any worsening in a patient's condition. Such worsenings could require more complex and expensive care if not recognized. In this letter, we briefly describe the remote health monitoring platform we designed and realized, which supports heart failure (HF) severity assessment and offers data mining functions based on the classification and regression tree (CART) method. The system developed achieved an accuracy and precision of 96.39% and 100.00%, respectively, in detecting HF, and of 79.31% and 82.35% in distinguishing severe from mild HF. These preliminary results were achieved on public databases of signals to improve their reproducibility. Clinical trials involving local patients are still running and will require longer experimentation.
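The classification-and-regression-tree step can be sketched with scikit-learn's CART implementation, as below. The HRV-style feature names, the synthetic data, and the toy labeling rule are placeholders, not the public databases or the tuned model used in the letter.

```python
# Minimal CART sketch on HRV-style features using scikit-learn's tree
# implementation. Feature names and data are synthetic placeholders.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score, precision_score

rng = np.random.default_rng(0)
n = 300
# Hypothetical HRV features: mean RR interval, SDNN, RMSSD, LF/HF ratio.
X = np.column_stack([
    rng.normal(800, 60, n), rng.normal(50, 15, n),
    rng.normal(35, 10, n), rng.normal(1.8, 0.6, n),
])
y = (X[:, 1] + rng.normal(0, 5, n) < 45).astype(int)   # 1 = HF-like label (toy rule)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X_tr, y_tr)
pred = tree.predict(X_te)
print(f"accuracy  {accuracy_score(y_te, pred):.3f}")
print(f"precision {precision_score(y_te, pred):.3f}")
```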
Nondestructive remote sensing of hazardous waste sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weil, G.J.; Graf, R.J.
1994-12-31
In the past, government and private industry have produced hazardous waste in ever-increasing quantities. These untold millions of tons of environmentally dangerous wastes have been disposed of by undocumented burial, simple carelessness, and purposeful abandonment. Society has recently dictated that before new construction may be initiated, these wastes must be found and cleaned up. The first step is to locate these undocumented waste depositories. The non-contact, nondestructive remote sensing techniques of Computer Enhanced Infrared Thermography and Ground Penetrating Radar may be used to detect buried waste sites, buried tanks/pits, and tank/pit leak plumes. These technologies may be used from mobile vehicles, helicopters, or man-portable systems and are able to cover tens of acres per day depending upon the system fusion method used. This relatively new combination of technologies will be described in theory, by procedure, and through case studies based upon successful projects.
Revealing livestock effects on bunchgrass vegetation with Landsat ETM+ data across a grazing season
NASA Astrophysics Data System (ADS)
Jansen, Vincent S.
Remote sensing provides monitoring solutions for more informed grazing management. To investigate the ability to detect the effects of cattle grazing on bunchgrass vegetation with Landsat Enhanced Thematic Mapper Plus (ETM+) data, we conducted a study on the Zumwalt Prairie in northeastern Oregon across a gradient of grazing intensities. Biophysical vegetation data was collected on vertical structure, biomass, and cover at three different time periods during the grazing season: June, August, and October 2012. To relate these measures to the remotely sensed Landsat ETM+ data, Pearson's correlations and multiple regression models were computed. Using the best models, predicted vegetation metrics were then mapped across the study area. Results indicated that models using common vegetation indices had the ability to discern different levels of grazing across the study area. Results can be distributed to land managers to help guide grassland conservation by improving monitoring of bunchgrass vegetation for sustainable livestock management.
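The index-to-field-data step (relating a common vegetation index to plot measurements via Pearson's correlation) can be sketched as follows. The reflectance values and the biomass relationship are synthetic, not the Zumwalt Prairie plot data.

```python
# Sketch of relating a common vegetation index to field data: compute NDVI
# from red and near-infrared reflectance and test the Pearson correlation
# with plot biomass. Values are synthetic.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_plots = 40
nir = rng.uniform(0.25, 0.45, n_plots)
red = rng.uniform(0.05, 0.15, n_plots)
ndvi = (nir - red) / (nir + red)
biomass = 1500 * ndvi + rng.normal(0, 60, n_plots)    # kg/ha, toy relationship

r, p = pearsonr(ndvi, biomass)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")
```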
NASA Astrophysics Data System (ADS)
Chirayath, V.
2014-12-01
Fluid Lensing is a theoretical model and algorithm for fluid-optical interactions in turbulent flows and at two-fluid surface boundaries that, when coupled with a unique computer vision and image-processing pipeline, may be used to significantly enhance the angular resolution of a remote sensing optical system, with applicability to high-resolution 3D imaging of subaqueous regions and imaging through turbulent fluid flows. This novel remote sensing technology has recently been implemented on a quadcopter-based UAS for imaging shallow benthic systems, creating the first dataset of a biosphere with unprecedented sub-cm-level imagery in 3D over areas as large as 15 square kilometers. Perturbed two-fluid boundaries with different refractive indices, such as the surface between the ocean and air, may be exploited as lensing elements for imaging targets on either side of the interface with enhanced angular resolution. I present the theoretical developments behind Fluid Lensing and experimental results from its recent implementation for the Reactive Reefs project to image shallow reef ecosystems at cm scales. Preliminary results from petabyte-scale aerial survey efforts using Fluid Lensing to image at-risk coral reefs in American Samoa (August 2013) show broad applicability to large-scale automated species identification, morphology studies, and reef ecosystem characterization for shallow marine environments and terrestrial biospheres, of crucial importance to understanding climate change's impact on coastal zones, global oxygen production, and carbon sequestration.
Xiaohui Zhang; George Ball; Eve Halper
2000-01-01
This paper presents an integrated system to support urban natural resource management. With the application of remote sensing (RS) and geographic information systems (GIS), the paper emphasizes the methodology of integrating information technology and a scientific basis to support ecosystem-based management. First, a systematic integration framework is developed and...
Local Data Integration in East Central Florida
NASA Technical Reports Server (NTRS)
Case, Jonathan L.; Manobianco, John T.
1999-01-01
The Applied Meteorology Unit has configured a Local Data Integration System (LDIS) for east central Florida which assimilates in-situ and remotely sensed observational data into a series of high-resolution gridded analyses. The ultimate goal for running LDIS is to generate products that may enhance weather nowcasts and short-range (less than 6 h) forecasts issued in support of the 45th Weather Squadron (45 WS), Spaceflight Meteorology Group (SMG), and the Melbourne National Weather Service (NWS MLB) operational requirements. LDIS has the potential to provide added value for nowcasts and short-term forecasts for two reasons. First, it incorporates all data operationally available in east central Florida. Second, it is run at finer spatial and temporal resolutions than current national-scale operational models such as the Rapid Update Cycle and Eta models. LDIS combines all available data to produce grid analyses of primary variables (wind, temperature, etc.) at specified temporal and spatial resolutions. These analyses of primary variables can be used to compute diagnostic quantities such as vorticity and divergence. This paper demonstrates the utility of LDIS over east central Florida for a warm season case study. The evolution of a significant thunderstorm outflow boundary is depicted through horizontal and vertical cross-section plots of wind speed, divergence, and circulation. In combination with a suitable visualization tool, LDIS may provide users with a more complete and comprehensive understanding of evolving mesoscale weather than could be developed by individually examining the disparate data sets over the same area and time.
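The diagnostic step mentioned above (deriving divergence and relative vorticity from a gridded wind analysis) reduces to centered differences on the analysis grid. The sketch below uses a synthetic wind field and grid spacing as stand-ins for an actual LDIS analysis.

```python
# Sketch of the diagnostic step: divergence and relative vorticity from a
# gridded wind analysis via centered differences. The wind field below is
# synthetic.
import numpy as np

dx = dy = 2000.0                                   # grid spacing (m)
x = np.arange(0, 100_000, dx)
y = np.arange(0, 100_000, dy)
X, Y = np.meshgrid(x, y)                           # rows vary with y, columns with x

u = 10.0 * np.sin(2 * np.pi * Y / 100_000)         # zonal wind (m/s)
v = 5.0 * np.cos(2 * np.pi * X / 100_000)          # meridional wind (m/s)

du_dy, du_dx = np.gradient(u, dy, dx)              # gradients along (rows, columns)
dv_dy, dv_dx = np.gradient(v, dy, dx)

divergence = du_dx + dv_dy
vorticity = dv_dx - du_dy
print(f"max divergence {divergence.max():.2e} s^-1, max vorticity {vorticity.max():.2e} s^-1")
```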
A Browser-Server-Based Tele-audiology System That Supports Multiple Hearing Test Modalities
Yao, Daoyuan; Givens, Gregg
2015-01-01
Abstract Introduction: Millions of global citizens suffering from hearing disorders have limited or no access to much needed hearing healthcare. Although tele-audiology presents a solution to alleviate this problem, existing remote hearing diagnosis systems support only pure-tone tests, leaving speech and other test procedures unsolved, due to the lack of software and hardware to enable communication required between audiologists and their remote patients. This article presents a comprehensive remote hearing test system that integrates the two most needed hearing test procedures: a pure-tone audiogram and a speech test. Materials and Methods: This enhanced system is composed of a Web application server, an embedded smart Internet-Bluetooth® (Bluetooth SIG, Kirkland, WA) gateway (or console device), and a Bluetooth-enabled audiometer. Several graphical user interfaces and a relational database are hosted on the application server. The console device has been designed to support the tests and auxiliary communication between the local site and the remote site. Results: The study was conducted at an audiology laboratory. Pure-tone audiogram and speech test results from volunteers tested with this tele-audiology system are comparable with results from the traditional face-to-face approach. Conclusions: This browser-server–based comprehensive tele-audiology offers a flexible platform to expand hearing services to traditionally underserved groups. PMID:25919376
A Browser-Server-Based Tele-audiology System That Supports Multiple Hearing Test Modalities.
Yao, Jianchu Jason; Yao, Daoyuan; Givens, Gregg
2015-09-01
Millions of global citizens suffering from hearing disorders have limited or no access to much needed hearing healthcare. Although tele-audiology presents a solution to alleviate this problem, existing remote hearing diagnosis systems support only pure-tone tests, leaving speech and other test procedures unsolved, due to the lack of software and hardware to enable communication required between audiologists and their remote patients. This article presents a comprehensive remote hearing test system that integrates the two most needed hearing test procedures: a pure-tone audiogram and a speech test. This enhanced system is composed of a Web application server, an embedded smart Internet-Bluetooth(®) (Bluetooth SIG, Kirkland, WA) gateway (or console device), and a Bluetooth-enabled audiometer. Several graphical user interfaces and a relational database are hosted on the application server. The console device has been designed to support the tests and auxiliary communication between the local site and the remote site. The study was conducted at an audiology laboratory. Pure-tone audiogram and speech test results from volunteers tested with this tele-audiology system are comparable with results from the traditional face-to-face approach. This browser-server-based comprehensive tele-audiology offers a flexible platform to expand hearing services to traditionally underserved groups.
NASA Technical Reports Server (NTRS)
1995-01-01
Through the Earth Observation Commercial Applications Program (EOCAP) at Stennis Space Center, Applied Analysis, Inc. developed a new tool for analyzing remotely sensed data. The Applied Analysis Spectral Analytical Process (AASAP) detects or classifies objects smaller than a pixel and removes the background. This significantly enhances the discrimination among surface features in imagery. ERDAS, Inc. offers the system as a modular addition to its ERDAS IMAGINE software package for remote sensing applications. EOCAP is a government/industry cooperative program designed to encourage commercial applications of remote sensing. Projects can run three years or more and funding is shared by NASA and the private sector participant. Through the Earth Observation Commercial Applications Program (EOCAP), Ocean and Coastal Environmental Sensing (OCENS) developed SeaStation for marine users. SeaStation is a low-cost, portable, shipboard satellite groundstation integrated with vessel catch and product monitoring software. Linked to the Global Positioning System, SeaStation provides real time relationships between vessel position and data such as sea surface temperature, weather conditions and ice edge location. This allows the user to increase fishing productivity and improve vessel safety. EOCAP is a government/industry cooperative program designed to encourage commercial applications of remote sensing. Projects can run three years or more and funding is shared by NASA and the private sector participant.
Evaluation of a Remote Training Approach for Teaching Seniors to Use a Telehealth System
Lai, Albert M.; Kaufman, David R.; Starren, Justin; Shea, Steven
2009-01-01
Objective There has been a growth of home health care technology in rural areas. However, a significant limitation has been the need for costly and repetitive training in order for patients to efficiently use their home telemedicine unit (HTU). This research describes the evaluation of an architecture for remote training of patients in a telemedicine environment. This work examines the viability of a remote training architecture called REmote Patient Education in a Telemedicine Environment (REPETE). REPETE was implemented and evaluated in the context of the IDEATel project, a large-scale telemedicine project, focusing on Medicare beneficiaries with diabetes in New York State. Methods A number of qualitative and quantitative evaluation tools were developed and used to study the effectiveness of the remote training sessions evaluating: a) task complexity, b) changes in patient performance and c) the communication between trainer and patient. Specifically, the effectiveness of the training was evaluated using a measure of web skills competency, a user satisfaction survey, a cognitive task analysis and an interaction analysis. Results Patients not only reported that the training was beneficial, but also showed significant improvements in their ability to effectively perform tasks. Our qualitative evaluations scrutinizing the interaction between the trainer and patient showed that while there was a learning curve for both the patient and trainer when negotiating the shared workspace, the mutually visible pointer used in REPETE enhanced the computer-mediated instruction. Conclusions REPETE is an effective remote training tool for older adults in the telemedicine environment. Patients demonstrated significant improvements in their ability to perform tasks on their home telemedicine unit. PMID:19620023
Evaluation of a remote training approach for teaching seniors to use a telehealth system.
Lai, Albert M; Kaufman, David R; Starren, Justin; Shea, Steven
2009-11-01
There has been a growth of home healthcare technology in rural areas. However, a significant limitation has been the need for costly and repetitive training in order for patients to efficiently use their home telemedicine unit (HTU). This research describes the evaluation of an architecture for remote training of patients in a telemedicine environment. This work examines the viability of a remote training architecture called REmote Patient Education in a Telemedicine Environment (REPETE). REPETE was implemented and evaluated in the context of the IDEATel project, a large-scale telemedicine project, focusing on Medicare beneficiaries with diabetes in New York State. A number of qualitative and quantitative evaluation tools were developed and used to study the effectiveness of the remote training sessions evaluating: (a) task complexity, (b) changes in patient performance and (c) the communication between trainer and patient. Specifically, the effectiveness of the training was evaluated using a measure of web skills competency, a user satisfaction survey, a cognitive task analysis and an interaction analysis. Patients not only reported that the training was beneficial, but also showed significant improvements in their ability to effectively perform tasks. Our qualitative evaluations scrutinizing the interaction between the trainer and patient showed that while there was a learning curve for both the patient and trainer when negotiating the shared workspace, the mutually visible pointer used in REPETE enhanced the computer-mediated instruction. REPETE is an effective remote training tool for older adults in the telemedicine environment. Patients demonstrated significant improvements in their ability to perform tasks on their home telemedicine unit.
Using NetMeeting for remote configuration of the Otto Bock C-Leg: technical considerations.
Lemaire, E D; Fawcett, J A
2002-08-01
Telehealth has the potential to be a valuable tool for technical and clinical support of computer-controlled prosthetic devices. This pilot study examined the use of Internet-based, desktop video conferencing for remote configuration of the Otto Bock C-Leg. Laboratory tests involved connecting two computers running Microsoft NetMeeting over a local area network (IP protocol). Over 56 kbit/s, DSL/cable, and 10 Mbit/s LAN connections, a prosthetist remotely configured a user's C-Leg by using Application Sharing, Live Video, and Live Audio. A similar test between sites in Ottawa and Toronto, Canada was limited by the notebook computer's 28 kbit/s modem. At the 28 kbit/s Internet-connection speed, NetMeeting's application sharing feature was not able to update the remote Sliders window fast enough to display peak toe loads and peak knee angles. These results support the use of NetMeeting as an accessible and cost-effective tool for remote C-Leg configuration, provided that sufficient Internet data transfer speed is available.
Lab4CE: A Remote Laboratory for Computer Education
ERIC Educational Resources Information Center
Broisin, Julien; Venant, Rémi; Vidal, Philippe
2017-01-01
Remote practical activities have been demonstrated to be efficient when learners come to acquire inquiry skills. In computer science education, virtualization technologies are gaining popularity as this technological advance enables instructors to implement realistic practical learning activities, and learners to engage in authentic and…
Tunable quantum interference in a 3D integrated circuit.
Chaboyer, Zachary; Meany, Thomas; Helt, L G; Withford, Michael J; Steel, M J
2015-04-27
Integrated photonics promises solutions to questions of stability, complexity, and size in quantum optics. Advances in tunable and non-planar integrated platforms, such as laser-inscribed photonics, continue to bring the realisation of quantum advantages in computation and metrology ever closer, perhaps most easily seen in multi-path interferometry. Here we demonstrate control of two-photon interference in a chip-scale 3D multi-path interferometer, showing a reduced periodicity and enhanced visibility compared to single photon measurements. Observed non-classical visibilities are widely tunable, and explained well by theoretical predictions based on classical measurements. With these predictions we extract Fisher information approaching a theoretical maximum. Our results open a path to quantum enhanced phase measurements.
Legacy model integration for enhancing hydrologic interdisciplinary research
NASA Astrophysics Data System (ADS)
Dozier, A.; Arabi, M.; David, O.
2013-12-01
Many challenges are introduced to interdisciplinary research in and around the hydrologic science community due to advances in computing technology and modeling capabilities in different programming languages, across different platforms and frameworks by researchers in a variety of fields with a variety of experience in computer programming. Many new hydrologic models as well as optimization, parameter estimation, and uncertainty characterization techniques are developed in scripting languages such as Matlab, R, Python, or in newer languages such as Java and the .Net languages, whereas many legacy models have been written in FORTRAN and C, which complicates inter-model communication for two-way feedbacks. However, most hydrologic researchers and industry personnel have little knowledge of the computing technologies that are available to address the model integration process. Therefore, the goal of this study is to address these new challenges by utilizing a novel approach based on a publish-subscribe-type system to enhance modeling capabilities of legacy socio-economic, hydrologic, and ecologic software. Enhancements include massive parallelization of executions and access to legacy model variables at any point during the simulation process by another program without having to compile all the models together into an inseparable 'super-model'. Thus, this study provides two-way feedback mechanisms between multiple different process models that can be written in various programming languages and can run on different machines and operating systems. Additionally, a level of abstraction is given to the model integration process that allows researchers and other technical personnel to perform more detailed and interactive modeling, visualization, optimization, calibration, and uncertainty analysis without requiring deep understanding of inter-process communication. To be compatible, a program must be written in a programming language with bindings to a common implementation of the message passing interface (MPI), which includes FORTRAN, C, Java, the .NET languages, Python, R, Matlab, and many others. The system is tested on a longstanding legacy hydrologic model, the Soil and Water Assessment Tool (SWAT), to observe and enhance speed-up capabilities for various optimization, parameter estimation, and model uncertainty characterization techniques, which is particularly important for computationally intensive hydrologic simulations. Initial results indicate that the legacy extension system significantly decreases developer time, computation time, and the cost of purchasing commercial parallel processing licenses, while enhancing interdisciplinary research by providing detailed two-way feedback mechanisms between various process models with minimal changes to legacy code.
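The two-way exchange that such an extension system ultimately rests on can be illustrated with a minimal mpi4py example: a coordinator rank sends a parameter vector to a rank standing in for a legacy model, which returns outputs mid-run. This shows only the MPI plumbing, not the publish-subscribe architecture described above; it assumes an MPI installation and is run with two ranks (e.g., mpirun -n 2 python demo.py).

```python
# Minimal mpi4py sketch of a two-way exchange between a coordinator rank and
# a "legacy model" rank: the coordinator sends parameters, the model rank
# returns outputs. Run with: mpirun -n 2 python demo.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

if rank == 0:                                   # coordinator (e.g., calibration driver)
    params = np.array([0.4, 1.2, 0.07])
    comm.Send(params, dest=1, tag=11)
    outputs = np.empty(3)
    comm.Recv(outputs, source=1, tag=22)
    print("coordinator received model outputs:", outputs)
elif rank == 1:                                 # stands in for the legacy model process
    params = np.empty(3)
    comm.Recv(params, source=0, tag=11)
    outputs = params * 2.0                      # placeholder "simulation"
    comm.Send(outputs, dest=0, tag=22)
```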
Ionospsheric observation of enhanced convection-initiated gravity waves during tornadic storms
NASA Technical Reports Server (NTRS)
Hung, R. J.
1981-01-01
Atmospheric gravity waves associated with tornadoes, with locally severe storms occurring with tornadoes, and with hurricanes were studied through the coupling between the ionosphere and the troposphere. Reverse group ray tracing computations of gravity waves observed by an ionospheric Doppler sounder array were analyzed. The results of the ray tracing computations and comparisons between the computed locations of the wave sources and conventional meteorological data indicate that the computed sources of the waves were near the touchdown of the tornadoes, near the eye of the hurricanes, and directly on the squall line of the severe thunderstorms. The excited signals occurred one hour in advance of the tornadoes and three hours in advance of the hurricanes. Satellite photographs show convective overshooting turrets occurring at the same locations and times the gravity waves were being excited. It is suggested that gravity wave observations, conventional meteorological data, and satellite photographs be combined to develop a remote sensing technique for detecting severe storms.
An introduction to quantitative remote sensing. [data processing]
NASA Technical Reports Server (NTRS)
Lindenlaub, J. C.; Russell, J.
1974-01-01
The quantitative approach to remote sensing is discussed along with the analysis of remote sensing data. Emphasis is placed on the application of pattern recognition in numerically oriented remote sensing systems. A common background and orientation for users of the LARS computer software system is provided.
Ardö, Jonas
2015-12-01
Africa is an important part of the global carbon cycle. It is also a continent facing potential problems due to increasing resource demand in combination with climate change-induced changes in resource supply. Quantifying the pools and fluxes constituting the terrestrial African carbon cycle is a challenge, because of uncertainties in meteorological driver data, lack of validation data, and potentially uncertain representation of important processes in major ecosystems. In this paper, terrestrial primary production estimates derived from remote sensing and a dynamic vegetation model are compared and quantified for major African land cover types. Continental gross primary production estimates derived from remote sensing were higher than corresponding estimates derived from a dynamic vegetation model. However, estimates of continental net primary production from remote sensing were lower than corresponding estimates from the dynamic vegetation model. Variation was found among land cover classes, and the largest differences in gross primary production were found in the evergreen broadleaf forest. Average carbon use efficiency (NPP/GPP) was 0.58 for the vegetation model and 0.46 for the remote sensing method. Validation against in situ data of aboveground net primary production revealed significant positive relationships for both methods. A combination of the remote sensing method with the dynamic vegetation model did not strongly affect this relationship. Observed significant differences in estimated vegetation productivity may have several causes, including model design and temperature sensitivity. Differences in carbon use efficiency reflect underlying model assumptions. Integrating the realistic process representation of dynamic vegetation models with the high resolution observational strength of remote sensing may support realistic estimation of components of the carbon cycle and enhance resource monitoring, provided that suitable validation data are available.
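As a small illustration of the carbon use efficiency ratio the abstract reports (CUE = NPP/GPP, 0.58 for the vegetation model versus 0.46 for the remote sensing method), the snippet below computes CUE per land cover class; the per-class numbers are hypothetical, not the study's data.

```python
# Carbon use efficiency as defined in the abstract: CUE = NPP / GPP.
# The per-class NPP/GPP values below are hypothetical, not the study's data.
gpp = {"evergreen broadleaf forest": 2500.0, "savanna": 1200.0}  # gC m-2 yr-1 (hypothetical)
npp = {"evergreen broadleaf forest": 1150.0, "savanna": 660.0}   # gC m-2 yr-1 (hypothetical)

cue = {lc: npp[lc] / gpp[lc] for lc in gpp}
for lc, value in cue.items():
    print(f"{lc}: CUE = {value:.2f}")
```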
NASA Technical Reports Server (NTRS)
Zwick, H.; Ward, V.; Beaudette, L.
1973-01-01
A critical evaluation of existing optical remote sensors for HCl vapor detection in solid propellant rocket plumes is presented. The P branch of the fundamental vibration-rotation band was selected as the most promising spectral feature to sense. A computation of transmittance for HCl vapor, an estimation of interferent spectra, the application of these spectra to computer modelled remote sensors, and a trade-off study for instrument recommendation are also included.
Design of Remote GPRS-based Gas Data Monitoring System
NASA Astrophysics Data System (ADS)
Yan, Xiyue; Yang, Jianhua; Lu, Wei
2018-01-01
In order to solve the problem of remote data transmission from gas flowmeters and to realize unattended operation on site, an unattended remote monitoring system for gas data based on GPRS is designed in this paper. The slave computer of this system uses an embedded microprocessor to read data from the gas flowmeter over the RS-232 bus and transfers it to the host computer through a DTU. On the host computer, a VB program dynamically binds the Winsock control to receive and parse the data. Using dynamic data exchange, the Kingview configuration software provides history trend curves, real-time trend curves, alarms, printing, web browsing, and other functions.
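The system described uses an embedded slave computer, a DTU, and a VB/Winsock host; purely as a rough analogue of the slave-side data path (not the paper's implementation), the sketch below reads one flowmeter record over RS-232 with pyserial and forwards it over TCP. The device name, baud rate, endpoint, and framing are assumptions.

```python
# Rough analogue of the slave-side data path described in the abstract:
# read a record from a gas flowmeter over RS-232 and forward it over TCP.
# The serial device name, baud rate, host address, and framing are assumptions.
import serial   # pyserial
import socket

SERIAL_PORT = "/dev/ttyS0"       # hypothetical RS-232 device
HOST, PORT = "192.0.2.10", 5000  # hypothetical host-computer endpoint

def forward_one_record() -> None:
    with serial.Serial(SERIAL_PORT, baudrate=9600, timeout=2) as ser, \
         socket.create_connection((HOST, PORT), timeout=5) as sock:
        record = ser.readline()          # one line-terminated flowmeter record
        if record:
            sock.sendall(record)         # pass it through unchanged to the host

if __name__ == "__main__":
    forward_one_record()
```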
General-Purpose Serial Interface For Remote Control
NASA Technical Reports Server (NTRS)
Busquets, Anthony M.; Gupton, Lawrence E.
1990-01-01
Computer controls remote television camera. General-purpose controller developed to serve as interface between host computer and pan/tilt/zoom/focus functions on series of automated video cameras. Interface port based on 8251 programmable communications-interface circuit configured for tristated outputs, and connects controller system to any host computer with RS-232 input/output (I/O) port. Accepts byte-coded data from host, compares them with prestored codes in read-only memory (ROM), and closes or opens appropriate switches. Six output ports control opening and closing of as many as 48 switches. Operator controls remote television camera by speaking commands, in system including general-purpose controller.
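As a loose, non-authoritative sketch of the byte-code lookup the brief describes (incoming bytes compared against stored codes, which then close or open switches), the following Python fragment uses a hypothetical code table in place of the controller's ROM.

```python
# Sketch of the byte-code -> switch-action lookup described in the brief.
# The code table and switch numbering are hypothetical; the real controller
# stores its codes in ROM and drives 48 switches through six output ports.
CODE_TABLE = {
    0x10: ("pan_left", 3, True),    # (function name, switch index, close?)
    0x11: ("pan_left", 3, False),
    0x20: ("zoom_in", 17, True),
    0x21: ("zoom_in", 17, False),
}

switches = [False] * 48  # False = open, True = closed

def handle_byte(code: int) -> None:
    """Compare an incoming byte with the stored codes and act on a switch."""
    entry = CODE_TABLE.get(code)
    if entry is None:
        return                      # unrecognized code: ignore it
    name, index, close = entry
    switches[index] = close
    print(f"{name}: switch {index} {'closed' if close else 'opened'}")

handle_byte(0x20)   # example: close the hypothetical zoom-in switch
```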
Estimating costs and performance of systems for machine processing of remotely sensed data
NASA Technical Reports Server (NTRS)
Ballard, R. J.; Eastwood, L. F., Jr.
1977-01-01
This paper outlines a method for estimating computer processing times and costs incurred in producing information products from digital remotely sensed data. The method accounts for both computation and overhead, and may be applied to any serial computer. The method is applied to estimate the cost and computer time involved in producing Level II Land Use and Vegetative Cover Maps for a five-state midwestern region. The results show that the amount of data to be processed overloads some example computer systems, but that the processing is feasible on others.
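The paper's estimation method itself is not reproduced in the abstract; the sketch below is a simplified stand-in that estimates processing time and cost from data volume, a per-pixel operation count, machine throughput, and an overhead factor. All of the figures are hypothetical, not the paper's parameters.

```python
# Simplified sketch of estimating processing time and cost for a classification job
# on a serial computer. The throughput, overhead factor, and price are hypothetical.
def estimate(pixels: float, ops_per_pixel: float,
             ops_per_second: float, overhead_factor: float = 1.3,
             dollars_per_cpu_hour: float = 50.0) -> tuple[float, float]:
    cpu_seconds = pixels * ops_per_pixel / ops_per_second
    wall_hours = cpu_seconds * overhead_factor / 3600.0   # add I/O and system overhead
    cost = wall_hours * dollars_per_cpu_hour
    return wall_hours, cost

# Example: a multi-state region at ~6e9 pixels, 400 ops/pixel, on a 5 MIPS machine.
hours, dollars = estimate(pixels=6e9, ops_per_pixel=400, ops_per_second=5e6)
print(f"~{hours:.0f} hours, ~${dollars:,.0f}")
```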
Earth Science Informatics - Overview
NASA Technical Reports Server (NTRS)
Ramapriyan, H. K.
2015-01-01
Over the last 10-15 years, significant advances have been made in information management, an increasing number of individuals have entered the field of information management as it applies to Geoscience and Remote Sensing data, and the field of informatics has come into its own. Informatics is the science and technology of applying computers and computational methods to the systematic analysis, management, interchange, and representation of science data, information, and knowledge. Informatics also includes the use of computers and computational methods to support decision making and applications. Earth Science Informatics (ESI, a.k.a. geoinformatics) is the application of informatics in the Earth science domain. ESI is a rapidly developing discipline integrating computer science, information science, and Earth science. Major national and international research and infrastructure projects in ESI have been carried out or are ongoing. Notable among these are: the Global Earth Observation System of Systems (GEOSS), the European Commission's INSPIRE, the U.S. NSDI and Geospatial One-Stop, the NASA EOSDIS, and the NSF DataONE, EarthCube, and Cyberinfrastructure for Geoinformatics. More than 18 departments and agencies in the U.S. federal government have been active in Earth science informatics. All major space agencies in the world have been involved in ESI research and application activities. In the United States, the Federation of Earth Science Information Partners (ESIP), whose membership includes nearly 150 organizations (government, academic, and commercial) dedicated to managing, delivering, and applying Earth science data, has been working on many ESI topics since 1998. The Committee on Earth Observation Satellites (CEOS) Working Group on Information Systems and Services (WGISS) has been actively coordinating ESI activities among the space agencies. Remote Sensing; Earth Science Informatics; Data Systems; Data Services; Metadata
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gao, Wen-Yang; Cai, Rong; Pham, Tony
Copper paddlewheel based molecular building blocks (MBBs) are ubiquitous and have been widely employed for the construction of highly porous metal–organic frameworks (MOFs). However, most copper paddlewheel based MOFs fail to retain their structural integrity in the presence of water. This instability is directly correlated to the plausible displacement of coordinating carboxylates in the copper paddlewheel MBB, [Cu₂(O₂C-)₄], by the strongly coordinating water molecules. In this comprehensive study, we illustrate the chemical stability control in the rht-MOF platform via strengthening the coordinating bonds within the triangular inorganic MBB, [Cu₃O(N₄₋ₓ(CH)ₓC-)₃] (x = 0, 1, or 2). Remotely, the chemical stabilization propagated into the paddlewheel MBB to afford isoreticular rht-MOFs with remarkably enhanced water/chemical stabilities compared to the prototypal rht-MOF-1.
NASA Technical Reports Server (NTRS)
Raymond, William H.; Olson, William S.
1990-01-01
Delay in the spin-up of precipitation early in numerical atmospheric forecasts is a deficiency correctable by diabatic initialization combined with diabatic forcing. For either to be effective, some knowledge of the magnitude and vertical placement of the latent heating fields is required. Until recently the best source of cloud and rain water data was the remotely sensed vertically integrated precipitation rate or liquid water content, which leaves the vertical placement of the condensation unknown. Some information about the vertical distribution of the heating rates and precipitating liquid water and ice can be obtained from retrieval techniques that use a physical model of precipitating clouds to refine and improve the interpretation of the remotely sensed data. A description of this procedure and an examination of its 3-D liquid water products, along with improved modeling methods that enhance or speed up storm development, are discussed.
Evaluation of the Next-Gen Exercise Software Interface in the NEEMO Analog
NASA Technical Reports Server (NTRS)
Hanson, Andrea; Kalogera, Kent; Sandor, Aniko; Hardy, Marc; Frank, Andrew; English, Kirk; Williams, Thomas; Perera, Jeevan; Amonette, William
2017-01-01
NSBRI (National Space Biomedical Research Institute) funded research grant to develop the 'NextGen' exercise software for the NEEMO (NASA Extreme Environment Mission Operations) analog. Develop a software architecture to integrate instructional, motivational and socialization techniques into a common portal to enhance exercise countermeasures in remote environments. Increase user efficiency and satisfaction, and institute commonality across multiple exercise systems. Utilized GUI (Graphical User Interface) design principles focused on intuitive ease of use to minimize training time and realize early user efficiency. Project requirement to test the software in an analog environment. Top Level Project Aims: 1) Improve the usability of crew interface software to exercise CMS (Crew Management System) through common app-like interfaces. 2) Introduce virtual instructional motion training. 3) Use virtual environment to provide remote socialization with family and friends, improve exercise technique, adherence, motivation and ultimately performance outcomes.
Commodity Cluster Computing for Remote Sensing Applications using Red Hat LINUX
NASA Technical Reports Server (NTRS)
Dorband, John
2003-01-01
Since 1994, we have been doing research at Goddard Space Flight Center on implementing a wide variety of applications on commodity-based computing clusters. This talk describes these clusters and how they are used in these applications, including ones for remote sensing.
RADIAL COMPUTED TOMOGRAPHY OF AIR CONTAMINANTS USING OPTICAL REMOTE SENSING
The paper describes the application of an optical remote-sensing (ORS) system to map air contaminants and locate fugitive emissions. Many ORS systems may utilize radial non-overlapping beam geometry and a computed tomography (CT) algorithm to map the concentrations in a plane. In...
ERIC Educational Resources Information Center
Ray, Darrell L.
2013-01-01
Students often enter biology programs deficient in the math and computational skills that would enhance their attainment of a deeper understanding of the discipline. To address some of these concerns, I developed a series of spreadsheet simulation exercises that focus on some of the mathematical foundations of scientific inquiry and the benefits…
Technology Needs for Teachers Web Development and Curriculum Adaptations
NASA Technical Reports Server (NTRS)
Carroll, Christy J.
1999-01-01
Computer-based mathematics and science curricula focusing on NASA inventions and technologies will enhance current teacher knowledge and skills. Materials and interactive software developed by educators will allow students to integrate their various courses, to work cooperatively, and to collaborate with both NASA scientists and students at other locations by using computer networks, email and the World Wide Web.
LVFS: A Big Data File Storage Bridge for the HPC Community
NASA Astrophysics Data System (ADS)
Golpayegani, N.; Halem, M.; Mauoka, E.; Fonseca, L. F.
2015-12-01
Merging Big Data capabilities into High Performance Computing architecture starts at the file storage level. Heterogeneous storage systems are emerging which offer enhanced features for dealing with Big Data, such as the IBM GPFS storage system's integration into Hadoop Map-Reduce. Taking advantage of these capabilities requires file storage systems to be adaptive and to accommodate these new storage technologies. We present the extension of the Lightweight Virtual File System (LVFS), currently running as the production system for the MODIS Level 1 and Atmosphere Archive and Distribution System (LAADS), to incorporate a flexible plugin architecture which allows easy integration of new HPC hardware and/or software storage technologies without disrupting workflows or system architectures and with only minimal impact on existing tools. We consider two essential aspects provided by the LVFS plugin architecture that the future HPC community needs. First, it allows the seamless integration of new and emerging hardware technologies which are significantly different from existing technologies, such as Seagate's Kinetic disks and Intel's 3D XPoint non-volatile storage. Second is the transparent and instantaneous conversion between new software technologies and various file formats. With most current storage systems, a switch in file format would require costly reprocessing and nearly double the storage requirements. We will install LVFS on UMBC's IBM iDataPlex cluster with a heterogeneous storage architecture utilizing local, remote, and Seagate Kinetic storage as a case study. LVFS merges different kinds of storage architectures to show users a uniform layout and, therefore, prevents any disruption in workflows, architecture design, or tool usage. We will show how LVFS converts HDF data, produced by applying machine learning algorithms to XCO2 Level 2 data from the OCO-2 satellite to estimate CO2 surface fluxes, into GeoTIFF for visualization.
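As a stand-alone illustration of the HDF-to-GeoTIFF conversion mentioned above (not the LVFS plugin code itself, which performs this transparently), the following sketch uses GDAL's Python bindings; the file names are hypothetical placeholders, and a real HDF product may require selecting a specific subdataset first.

```python
# Stand-alone illustration of an HDF -> GeoTIFF conversion using GDAL's Python API.
# File names are hypothetical; a real HDF product may require selecting a subdataset.
from osgeo import gdal

src = "oco2_xco2_fluxes.h5"   # hypothetical HDF input
dst = "oco2_xco2_fluxes.tif"  # GeoTIFF output for visualization

gdal.UseExceptions()
gdal.Translate(dst, src, format="GTiff", creationOptions=["COMPRESS=DEFLATE"])
```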
NASA Astrophysics Data System (ADS)
Sushkevich, T. A.
2017-11-01
Sixty years ago, on 4 October 1957, the USSR successfully launched into space the first Sputnik, the first artificial Earth satellite. The space age is counted from this date. Information and mathematical software is an integral component of any space project. The paper discusses the history and future of space exploration and the role of mathematics and computers, and presents a large list of publications for illustration. It is important to pay attention to the role of mathematics and computer science in space projects and research, in remote sensing problems, and in studying the evolution of the Earth's environment and climate, where the theory of radiation transfer plays a key role, as well as to the achievements of Russian scientists at the dawn of the space age.
Optimization of knowledge-based systems and expert system building tools
NASA Technical Reports Server (NTRS)
Yasuda, Phyllis; Mckellar, Donald
1993-01-01
The objectives of the NASA-AMES Cooperative Agreement were to investigate, develop, and evaluate, via test cases, the system parameters and processing algorithms that constrain the overall performance of the Information Sciences Division's Artificial Intelligence Research Facility. Written reports covering various aspects of the work were submitted to the grant's co-investigators. Research studies concentrated on the field of artificial intelligence knowledge-based systems technology. Activities included the following areas: (1) AI training classes; (2) merging optical and digital processing; (3) science experiment remote coaching; (4) SSF data management system tests; (5) computer integrated documentation project; (6) conservation of design knowledge project; (7) project management calendar and reporting system; (8) automation and robotics technology assessment; (9) advanced computer architectures and operating systems; and (10) honors program.
Physics through the 1990s: Scientific interfaces and technological applications
NASA Technical Reports Server (NTRS)
1986-01-01
The volume examines the scientific interfaces and technological applications of physics. Twelve areas are dealt with: biological physics (biophysics, the brain, and theoretical biology); the physics-chemistry interface (instrumentation, surfaces, neutron and synchrotron radiation, polymers, organic electronic materials); materials science; geophysics (tectonics, the atmosphere and oceans, planets, drilling and seismic exploration, and remote sensing); computational physics (complex systems and applications in basic research); mathematics (field theory and chaos); microelectronics (integrated circuits, miniaturization, future trends); optical information technologies (fiber optics and photonics); instrumentation; physics applications to energy needs and the environment; national security (devices, weapons, and arms control); and medical physics (radiology, ultrasonics, NMR, and photonics). An executive summary and many chapters contain recommendations regarding funding, education, industry participation, small-group university research and large facility programs, government agency programs, and computer database needs.
NASA/RAE collaboration on nonlinear control using the F-8C digital fly-by-wire aircraft
NASA Technical Reports Server (NTRS)
Butler, G. F.; Corbin, M. J.; Mepham, S.; Stewart, J. F.; Larson, R. R.
1983-01-01
Design procedures are reviewed for variable integral control to optimize response (VICTOR) algorithms and results of preliminary flight tests are presented. The F-8C aircraft is operated in the remotely augmented vehicle (RAV) mode, with the control laws implemented as FORTRAN programs on a ground-based computer. Pilot commands and sensor information are telemetered to the ground, where the data are processed to form surface commands which are then telemetered back to the aircraft. The RAV mode represents a single-string (simplex) system and is therefore vulnerable to a hardover, since comparison monitoring is not possible. Hence, extensive error checking is conducted on both the ground and airborne computers to prevent the development of potentially hazardous situations. Experience with the RAV monitoring and validation procedures is described.
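The VICTOR design procedures are given in the report rather than the abstract; as a generic, non-authoritative illustration of the idea of varying integral action with the response, the sketch below implements a PI loop whose integral gain is reduced at large tracking errors. It is a stand-in, not the flight algorithm.

```python
# Generic sketch of a PI controller whose integral gain varies with tracking error,
# as a loose illustration of "variable integral control"; it is not the VICTOR
# algorithm flown on the F-8C, whose design procedures appear in the cited report.
class VariableIntegralPI:
    def __init__(self, kp: float, ki_max: float, error_scale: float):
        self.kp, self.ki_max, self.error_scale = kp, ki_max, error_scale
        self.integral = 0.0

    def step(self, error: float, dt: float) -> float:
        # Reduce integral action when the error is large to limit overshoot.
        ki = self.ki_max / (1.0 + abs(error) / self.error_scale)
        self.integral += ki * error * dt
        return self.kp * error + self.integral

ctl = VariableIntegralPI(kp=2.0, ki_max=0.5, error_scale=1.0)
command = ctl.step(error=0.3, dt=0.02)   # hypothetical tracking error, 50 Hz loop
```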
NASA Technical Reports Server (NTRS)
1981-01-01
The Space Transportation System (STS) is discussed, including the launch processing system, the thermal protection subsystem, meteorological research, the sound suppression water system, the rotating service structure, improved hypergol removal systems, fiber optics research, precision positioning, remote controlled solid rocket booster nozzle plugs, ground operations for the Centaur orbital transfer vehicle, parachute drying, STS hazardous waste disposal and recycling, toxic waste technology and control concepts, a fast analytical densitometry study, the shuttle inventory management system, operational intercommunications system improvements, and the protective garment ensemble. Terrestrial applications are also covered, including LANDSAT applications to water resources, the satellite freeze forecast system, the application of ground penetrating radar to soil surveys, turtle tracking, evaluating computer-drawn ground cover maps, a sparkless load pulsar, and coupling a microcomputer and computing integrator with a gas chromatograph.
Operating a wide-area remote observing system for the W. M. Keck Observatory
NASA Astrophysics Data System (ADS)
Wirth, Gregory D.; Kibrick, Robert I.; Goodrich, Robert W.; Lyke, James E.
2008-07-01
For over a decade, the W. M. Keck Observatory's two 10-meter telescopes have been operated remotely from its Waimea headquarters. Over the last 6 years, WMKO remote observing has expanded to allow teams at dedicated sites in California to observe either in collaboration with colleagues in Waimea or entirely from the U.S. mainland. Once an experimental effort, the Observatory's mainland observing capability is now fully operational, supported on all science instruments (except the interferometer) and regularly used by astronomers at eight mainland sites. Establishing a convenient and secure observing capability from those sites required careful planning to ensure that they are properly equipped and configured. It also entailed a significant investment in hardware and software, including both custom scripts to simplify launching the instrument interface at remote sites and automated routers employing ISDN backup lines to ensure continuation of observing during Internet outages. Observers often wait until shortly before their runs to request use of the mainland facilities. Scheduling these requests and ensuring proper system operation prior to observing requires close coordination between personnel at WMKO and the mainland sites. An established protocol for approving requests and carrying out pre-run checkout has proven useful in ensuring success. The Observatory anticipates enhancing and expanding its remote observing system. Future plans include deploying dedicated summit computers for running VNC server software, implementing a web-based tracking system for mainland-based observing requests, expanding the system to additional mainland sites, and converting to full-time VNC operation for all instruments.
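The custom launch scripts are not detailed in the abstract; as an assumption-laden sketch of what a remote-site launcher might look like, the fragment below opens an SSH tunnel to a hypothetical summit VNC server and starts a local viewer. Host names, display numbers, and the viewer command are all placeholders, not WMKO's actual tooling.

```python
# Rough sketch of a remote-site launcher: open an SSH tunnel to a (hypothetical)
# summit VNC display and start a local viewer. Host names, ports, and the choice
# of VNC viewer are assumptions, not WMKO's actual scripts.
import subprocess

SUMMIT_HOST = "vncserver.summit.example.org"  # hypothetical summit VNC host
DISPLAY = 7                                   # hypothetical VNC display number
LOCAL_PORT = 5900 + DISPLAY

def launch() -> None:
    # Forward the VNC port over SSH, then point a local viewer at the tunneled display.
    tunnel = subprocess.Popen(
        ["ssh", "-N", "-L", f"{LOCAL_PORT}:localhost:{5900 + DISPLAY}", SUMMIT_HOST]
    )
    try:
        subprocess.run(["vncviewer", f"localhost:{DISPLAY}"], check=False)
    finally:
        tunnel.terminate()

if __name__ == "__main__":
    launch()
```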
NASA Astrophysics Data System (ADS)
Johnson, Erika; Cowen, Edwin
2013-11-01
The effect of increased bed roughness on the free surface turbulence signature of an open channel flow is investigated with the goal of incorporating the findings into a methodology to remotely monitor volumetric flow rates. Half of a wide (B = 2 m) open channel bed is covered with a 3 cm thick layer of loose gravel (D50 = 0.6 cm). Surface PIV (particle image velocimetry) experiments are conducted for a range of flow depths (B/H = 10-30) and Reynolds numbers (ReH = 10,000-60,000). It is well established that bed roughness in wall-bounded flows enhances the vertical velocity fluctuations (e.g. Krogstad et al. 1992). When the vertical velocity fluctuations approach the free surface they are redistributed (e.g. Cowen et al. 1995) into the surface-parallel component directions. It is anticipated and confirmed that the interaction of these two phenomena results in enhanced turbulence at the free surface. The effect of the rough bed on the integral length scales and the second-order velocity structure functions calculated at the free surface is investigated. These findings have important implications for developing new technologies in stream gaging.
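As a generic illustration of the second-order velocity structure function mentioned above, D(r) = <(u(x + r) - u(x))^2>, the sketch below computes it from a 1-D velocity record with NumPy; the synthetic record and vector spacing are placeholders for PIV-derived surface velocities, and this is not the authors' processing code.

```python
# Generic computation of the second-order longitudinal structure function
# D(r) = <(u(x + r) - u(x))^2> from a 1-D surface velocity record.
# The synthetic record below is a placeholder for PIV-derived surface velocities.
import numpy as np

def structure_function(u: np.ndarray, dx: float, max_sep: int):
    r = np.arange(1, max_sep + 1) * dx
    d2 = np.array([np.mean((u[s:] - u[:-s]) ** 2) for s in range(1, max_sep + 1)])
    return r, d2

rng = np.random.default_rng(0)
u = rng.standard_normal(4096)                           # stand-in velocity fluctuations [m/s]
r, d2 = structure_function(u, dx=0.002, max_sep=200)    # 2 mm vector spacing (hypothetical)
print(d2[:5])
```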
DESIGN AND CONSTRUCTION OF A FORCE-REFLECTING TELEOPERATION SYSTEM
DOE Office of Scientific and Technical Information (OSTI.GOV)
M.A. Ebadian, Ph.D.
1999-01-01
For certain applications, such as space servicing, undersea operations, and hazardous material handling tasks in nuclear reactors, the environments can be uncertain, complex, and hazardous. Lives may be in danger if humans were to work under these conditions. As a result, a man-machine system--a teleoperator system--has been developed to work in these types of environments. In a typical teleoperator system, the actual system operates at a remote site; the operator located away from this system usually receives visual information from a video image and/or graphical animation on the computer screen. Additional feedback, such as aural and force information, can significantly enhance performance of the system. Force reflection is a type of feedback in which forces experienced by the remote manipulator are fed back to the manual controller. Various control methods have been proposed for implementation on a teleoperator system. In order to examine different control schemes, a one Degree-Of-Freedom (DOF) Force-Reflecting Manual Controller (FRMC) is constructed and integrated into a PC. The system parameters are identified and constructed as a mathematical model. The Proportional-Integral-Derivative (PID) and fuzzy logic controllers are developed and tested experimentally. Numerical simulation results obtained from the mathematical model are compared with those of experimental data for both types of controllers. In addition, the concept of a telesensation system is introduced. A telesensation system is an advanced teleoperator system that attempts to provide the operator with sensory feedback. In this context, a telesensation system integrates the use of a Virtual Reality (VR) unit, FRMC, and Graphical User Interface (GUI). The VR unit is used to provide the operator with a 3-D visual effect. Various commercial VR units are reviewed and features compared for use in a telesensation system. As for the FRMC, the conceptual design of a 3-DOF FRMC is developed in an effort to make the system portable, compact, and lightweight. A variety of design alternatives are presented and evaluated. Finally, a GUI software package is developed to interface with several teleoperation unit components. These components include an industrial robot, electric motor, encoder, force/torque sensor, and CCD camera. The software includes features such as position scaling, force scaling, and rereferencing and is intended to provide a sound basis for the development of a multi-DOF FRMC system in the future.
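The report develops both PID and fuzzy logic controllers for the 1-DOF force-reflecting manual controller; as a minimal, non-authoritative sketch of the PID part only, the fragment below implements a discrete PID update with hypothetical gains and loop rate.

```python
# Minimal discrete PID controller of the kind the report develops for the 1-DOF
# force-reflecting manual controller. Gains, the set-point, and the 1 kHz loop
# rate are hypothetical; the fuzzy logic controller is not sketched here.
class PID:
    def __init__(self, kp: float, ki: float, kd: float):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measured: float, dt: float) -> float:
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=5.0, ki=1.0, kd=0.05)
torque_cmd = pid.update(setpoint=0.0, measured=0.12, dt=0.001)  # reflect measured force back
```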
Mills, Jane; Francis, Karen; McLeod, Margaret; Al-Motlaq, Mohammad
2015-01-01
Nurses and midwives collectively represent the largest workforce category in rural and remote areas of Australia. Maintaining currency of practice and attaining annual licensure with the Australian Health Practitioners Regulatory Authority (AHPRA) present challenges for individual nurses and midwives and for their health service managers. Engagement with information and communication technologies, so that geographically isolated clinicians can access ongoing education and training, is considered a useful strategy to address such challenges. This paper presents a pre- and post-test study design. It examines the impact of an online continuing professional development (CPD) program on Australian rural nurses and midwives. The aims of the program were to increase basic skill acquisition in the utilisation of common computer software, the use of the Internet and the enhancement of email communication. Findings from the study demonstrate that participants who complete a relevant CPD program gain confidence in the use of information and communication technologies. Further, increased confidence leads to increased access to contemporary, reliable and important health care information on the Internet, in addition to clinicians adopting email as a regular method of communication. Health care employers commonly assume employees are skilled users of information and communication technologies. However, findings from this study contradict such assumptions. It is argued in the recommendations that health care employees should be given regular access to CPD programs designed to introduce them to information and communication technologies. Developing knowledge and skills in this area has the potential to improve staff productivity, raise health care standards and improve patient outcomes.
DOT National Transportation Integrated Search
2015-03-01
This report presents research examining the feasibility of creating an integrated structural health monitoring and impact/collision detection system for bridges in remote cold regions, where in-person inspection becomes formidable. The research...