DOE Office of Scientific and Technical Information (OSTI.GOV)
Mohamed Abdelrahman; Roger Haggard; Wagdy Mahmoud
The final goal of this project was the development of a system capable of controlling an industrial process effectively through the integration of information obtained through intelligent sensor fusion and intelligent control technologies. The industry of interest in this project was the metal casting industry as represented by cupola iron-melting furnaces. However, the developed technology is generic and hence applicable to several other industries. The system was divided into the following four major interacting components: 1. An object-oriented generic architecture to integrate the developed software and hardware components 2. Generic algorithms for intelligent signal analysis and sensor and model fusion 3. Development of a supervisory structure for integration of intelligent sensor fusion data into the controller 4. Hardware implementation of intelligent signal analysis and fusion algorithms
Health-Enabled Smart Sensor Fusion Technology
NASA Technical Reports Server (NTRS)
Wang, Ray
2012-01-01
A process was designed to fuse data from multiple sensors in order to make a more accurate estimation of the environment and overall health in an intelligent rocket test facility (IRTF), to provide reliable, high-confidence measurements for a variety of propulsion test articles. The objective of the technology is to provide sensor fusion based on a distributed architecture. Specifically, the fusion technology is intended to provide health-condition monitoring capability at the intelligent transceiver, such as RF signal strength, battery readings, computing resource monitoring, and sensor data readings. The technology also provides analytic and diagnostic intelligence at the intelligent transceiver, enhancing the IEEE 1451.x-based standard for sensor data management and distribution, as well as providing appropriate communications protocols to enable complex interactions to support timely and high-quality flow of information among the system elements.
Airborne net-centric multi-INT sensor control, display, fusion, and exploitation systems
NASA Astrophysics Data System (ADS)
Linne von Berg, Dale C.; Lee, John N.; Kruer, Melvin R.; Duncan, Michael D.; Olchowski, Fred M.; Allman, Eric; Howard, Grant
2004-08-01
The NRL Optical Sciences Division has initiated a multi-year effort to develop and demonstrate an airborne net-centric suite of multi-intelligence (multi-INT) sensors and exploitation systems for real-time target detection and targeting product dissemination. The goal of this Net-centric Multi-Intelligence Fusion Targeting Initiative (NCMIFTI) is to develop an airborne real-time intelligence gathering and targeting system that can be used to detect concealed, camouflaged, and mobile targets. The multi-INT sensor suite will include high-resolution visible/infrared (EO/IR) dual-band cameras, hyperspectral imaging (HSI) sensors in the visible-to-near infrared, short-wave and long-wave infrared (VNIR/SWIR/LWIR) bands, Synthetic Aperture Radar (SAR), electronics intelligence sensors (ELINT), and off-board networked sensors. Other sensors are also being considered for inclusion in the suite to address unique target detection needs. Integrating a suite of multi-INT sensors on a single platform should optimize real-time fusion of the on-board sensor streams, thereby improving the detection probability and reducing the false alarms that occur in reconnaissance systems that use single-sensor types on separate platforms, or that use independent target detection algorithms on multiple sensors. In addition to the integration and fusion of the multi-INT sensors, the effort is establishing an open-systems net-centric architecture that will provide a modular "plug and play" capability for additional sensors and system components and provide distributed connectivity to multiple sites for remote system control and exploitation.
NASA Astrophysics Data System (ADS)
Rababaah, Haroun; Shirkhodaie, Amir
2009-04-01
Rapidly advancing hardware technology, smart sensors, and sensor networks are advancing environment sensing. One major potential of this technology is Large-Scale Surveillance Systems (LS3), especially for homeland security, battlefield intelligence, facility guarding, and other civilian applications. The efficient and effective deployment of LS3 requires addressing a number of aspects impacting the scalability of such systems. The scalability factors are related to: computation and memory utilization efficiency; communication bandwidth utilization; network topology (e.g., centralized, ad-hoc, hierarchical, or hybrid); network communication protocol and data routing schemes; and local and global data/information fusion schemes for situational awareness. Although many models have been proposed to address one aspect or another of these issues, few have addressed the need for a multi-modality multi-agent data/information fusion that has characteristics satisfying the requirements of current and future intelligent sensors and sensor networks. In this paper, we present a novel scalable fusion engine for multi-modality multi-agent information fusion for LS3. The new fusion engine is based on a concept we call Energy Logic. Experimental results of this work, as compared to a fuzzy logic model, strongly supported the validity of the new model and inspired future directions for different levels of fusion and different applications.
Extended Logic Intelligent Processing System for a Sensor Fusion Processor Hardware
NASA Technical Reports Server (NTRS)
Stoica, Adrian; Thomas, Tyson; Li, Wei-Te; Daud, Taher; Fabunmi, James
2000-01-01
The paper presents the hardware implementation and initial tests of a low-power, high-speed reconfigurable sensor fusion processor. The Extended Logic Intelligent Processing System (ELIPS) is described, which combines rule-based systems, fuzzy logic, and neural networks to achieve parallel fusion of sensor signals in compact low-power VLSI. The ELIPS concept is being developed to demonstrate interceptor functionality, which particularly underlines the high-speed and low-power requirements. The hardware programmability allows the processor to reconfigure into different machines, taking the most efficient hardware implementation during each phase of information processing. Processing speeds of microseconds have been demonstrated using our test hardware.
Facility Monitoring: A Qualitative Theory for Sensor Fusion
NASA Technical Reports Server (NTRS)
Figueroa, Fernando
2001-01-01
Data fusion and sensor management approaches have largely been implemented with centralized and hierarchical architectures. Numerical and statistical methods are the most common data fusion methods found in these systems. Given the proliferation and low cost of processing power, there is now an emphasis on designing distributed and decentralized systems. These systems use analytical/quantitative techniques or qualitative reasoning methods for data fusion. Based on other work by the author, a sensor may be treated as a highly autonomous (decentralized) unit. Each highly autonomous sensor (HAS) is capable of extracting qualitative behaviours from its data. For example, it detects spikes, disturbances, noise levels, off-limit excursions, step changes, drift, and other typical measured trends. In this context, this paper describes a distributed sensor fusion paradigm and theory where each sensor in the system is a HAS. Hence, given the rich qualitative information from each HAS, a paradigm and formal definitions are given so that sensors and processes can reason and make decisions at the qualitative level. This approach to sensor fusion makes possible the implementation of intuitive (effective) methods to monitor, diagnose, and compensate processes/systems and their sensors. This paradigm facilitates a balanced distribution of intelligence (code and/or hardware) to the sensor level, the process/system level, and a higher controller level. The primary application of interest is in intelligent health management of rocket engine test stands.
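A minimal sketch of the qualitative feature extraction a HAS might perform on its own data stream; the thresholds, window size, and spike/step discrimination rule here are illustrative assumptions, not the author's formal definitions:

```python
import statistics

def qualitative_features(samples, spike_thresh=5.0, step_thresh=3.0, window=5):
    """Label spike and step-change behaviours in a 1-D signal, in the spirit
    of a highly autonomous sensor (HAS). Thresholds and window are assumed."""
    events = []
    for i in range(window, len(samples) - 1):
        recent = samples[i - window:i]
        baseline = statistics.median(recent)
        noise = statistics.pstdev(recent) or 1e-9   # avoid divide-by-zero on flat data
        dev = samples[i] - baseline
        if abs(dev) > spike_thresh * noise and abs(samples[i + 1] - baseline) < step_thresh * noise:
            events.append((i, "spike"))   # brief excursion that returns to baseline
        elif abs(dev) > step_thresh * noise and abs(samples[i + 1] - baseline) > step_thresh * noise:
            events.append((i, "step"))    # sustained level change
    return events
```

Other behaviours the paper lists (drift, off-limit excursions, noise-level changes) would follow the same pattern: a local statistic compared against a qualitative rule.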
Hernandez, Wilmar
2007-01-01
This paper surveys recent applications of optimal signal processing techniques to improve the performance of mechanical sensors. A comparison between classical filters and optimal filters for automotive sensors is made, and the current state of the art in applying robust and optimal control and signal processing techniques to the design of the intelligent (or smart) sensors that today's cars need is presented through several experimental results, which show that the fusion of intelligent sensors and optimal signal processing techniques is the clear way to go. However, the switch from the traditional methods of designing automotive sensors to the new ones cannot be made overnight, because some open research issues remain to be solved. This paper draws attention to one of these open research issues and tries to arouse researchers' interest in the fusion of intelligent sensors and optimal signal processing techniques.
Trust metrics in information fusion
NASA Astrophysics Data System (ADS)
Blasch, Erik
2014-05-01
Trust is an important concept for machine intelligence and is not consistent across many applications. In this paper, we seek to understand trust from a variety of factors: humans, sensors, communications, intelligence processing algorithms and human-machine displays of information. In modeling the various aspects of trust, we provide an example from machine intelligence that supports the various attributes of measuring trust such as sensor accuracy, communication timeliness, machine processing confidence, and display throughput to convey the various attributes that support user acceptance of machine intelligence results. The example used is fusing video and text whereby an analyst needs trust information in the identified imagery track. We use the proportional conflict redistribution rule as an information fusion technique that handles conflicting data from trusted and mistrusted sources. The discussion of the many forms of trust explored in the paper seeks to provide a systems-level design perspective for information fusion trust quantification.
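The proportional conflict redistribution rule mentioned above (PCR5, for two sources) can be sketched as follows; the `frozenset` encoding of hypotheses and the two-source restriction are implementation choices for illustration, not the paper's notation:

```python
from itertools import product

def pcr5(m1, m2):
    """PCR5 combination of two basic belief assignments (dicts mapping
    frozenset hypotheses to masses). Conflicting mass is redistributed
    back to the conflicting hypotheses in proportion to their masses."""
    fused = {}
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            # conjunctive part: compatible hypotheses reinforce each other
            fused[inter] = fused.get(inter, 0.0) + ma * mb
        elif ma + mb > 0:
            # partial conflict ma*mb is split proportionally between a and b
            fused[a] = fused.get(a, 0.0) + ma * ma * mb / (ma + mb)
            fused[b] = fused.get(b, 0.0) + mb * mb * ma / (ma + mb)
    return fused
```

Unlike Dempster's rule, no mass is discarded through normalisation, so a mistrusted source's conflicting evidence is retained but weighted rather than amplified.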
Real-time sensor validation and fusion for distributed autonomous sensors
NASA Astrophysics Data System (ADS)
Yuan, Xiaojing; Li, Xiangshang; Buckles, Bill P.
2004-04-01
Multi-sensor data fusion has found widespread applications in industrial and research sectors. The purpose of real-time multi-sensor data fusion is to dynamically estimate an improved system model from a set of different data sources, i.e., sensors. This paper presents a systematic and unified real-time sensor validation and fusion framework (RTSVFF) based on distributed autonomous sensors. The RTSVFF is an open architecture consisting of four layers: the transaction layer, the process fusion layer, the control layer, and the planning layer. This paradigm facilitates distribution of intelligence to the sensor level and sharing of information among sensors, controllers, and other devices in the system. The openness of the architecture also provides a platform to test different sensor validation and fusion algorithms, and thus facilitates the selection of near-optimal algorithms for a specific sensor fusion application. In the version of the model presented in this paper, confidence-weighted averaging is employed to address the dynamic system state. The state is computed using an adaptive estimator and dynamic validation curve for numeric data fusion and a robust diagnostic map for decision-level qualitative fusion. The framework is then applied to automatic monitoring of a gas-turbine engine, including a performance comparison of the proposed real-time sensor fusion algorithms and a traditional numerical weighted average.
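The confidence-weighted averaging step can be sketched as below; a median/MAD gate stands in here for the paper's adaptive estimator and dynamic validation curve, and the `gate` factor is an assumed value:

```python
import statistics

def validate_and_fuse(readings, gate=3.0):
    """Fuse (value, confidence) pairs by confidence-weighted averaging,
    after rejecting readings far from the consensus. The median/MAD gate
    is a simple stand-in for a dynamic validation curve."""
    values = [v for v, _ in readings]
    center = statistics.median(values)
    mad = statistics.median(abs(v - center) for v in values) or 1e-9
    validated = [(v, c) for v, c in readings if abs(v - center) <= gate * mad]
    total = sum(c for _, c in validated)
    return sum(v * c for v, c in validated) / total
```

A faulty sensor reporting a wild value is excluded by the gate, and among the surviving readings the more confident sensors dominate the estimate.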
Multi-Source Sensor Fusion for Small Unmanned Aircraft Systems Using Fuzzy Logic
NASA Technical Reports Server (NTRS)
Cook, Brandon; Cohen, Kelly
2017-01-01
As the applications for using small Unmanned Aircraft Systems (sUAS) beyond visual line of sight (BVLOS) continue to grow in the coming years, it is imperative that intelligent sensor fusion techniques be explored. In BVLOS scenarios the vehicle position must be accurately tracked over time to ensure no two vehicles collide with one another, no vehicle crashes into surrounding structures, and to identify off-nominal scenarios. Therefore, in this study an intelligent systems approach is used to estimate the position of sUAS given a variety of sensor platforms, including GPS, radar, and on-board detection hardware. Common research challenges include asynchronous sensor rates and sensor reliability. To address these challenges, techniques such as Maximum a Posteriori estimation and a fuzzy-logic-based sensor confidence determination are used.
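One way a fuzzy-logic confidence determination might handle asynchronous sensor rates is to weight each estimate by a membership over its measurement age; the triangular membership and its breakpoints below are assumptions for illustration, not the study's actual rule base:

```python
def triangular(x, a, b, c):
    """Triangular fuzzy membership function with peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuse_position(estimates, now):
    """Fuse asynchronous 1-D position estimates, weighting each sensor by a
    fuzzy 'freshness' confidence derived from measurement age (breakpoints
    are illustrative)."""
    weighted, total = 0.0, 0.0
    for pos, stamp in estimates:
        conf = triangular(now - stamp, -1.0, 0.0, 2.0)  # fresh data -> high confidence
        weighted += conf * pos
        total += conf
    return weighted / total if total else None
```

Stale reports decay toward zero weight instead of being dropped abruptly, which is the usual appeal of the fuzzy formulation.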
The life and death of ATR/sensor fusion and the hope for resurrection
NASA Astrophysics Data System (ADS)
Rogers, Steven K.; Sadowski, Charles; Bauer, Kenneth W.; Oxley, Mark E.; Kabrisky, Matthew; Rogers, Adam; Mott, Stephen D.
2008-04-01
For over half a century, scientists and engineers have worked diligently to advance computational intelligence. One application of interest is how computational intelligence can bring value to our war fighters. Automatic Target Recognition (ATR) and sensor fusion efforts have fallen far short of the desired capabilities. In this article we review the capabilities requested by war fighters. When compared to our current capabilities, it is easy to conclude that current Combat Identification (CID), as a Family of Systems (FoS), does a lousy job. The war fighter needed capable, operationalized ATR and sensor fusion systems ten years ago, but that did not happen. The article reviews the war fighter needs and the current state of the art, then concludes by looking forward to where we are headed to provide the capabilities required.
Identifying and Tracking Pedestrians Based on Sensor Fusion and Motion Stability Predictions
Musleh, Basam; García, Fernando; Otamendi, Javier; Armingol, José Mª; de la Escalera, Arturo
2010-01-01
The lack of trustworthy sensors makes development of Advanced Driver Assistance System (ADAS) applications a tough task. It is necessary to develop intelligent systems by combining reliable sensors and real-time algorithms to send the proper, accurate messages to the drivers. In this article, an application to detect and predict the movement of pedestrians in order to prevent an imminent collision has been developed and tested under real conditions. The proposed application, first, accurately measures the position of obstacles using a two-sensor hybrid fusion approach: a stereo camera vision system and a laser scanner. Second, it correctly identifies pedestrians using intelligent algorithms based on polylines and pattern recognition related to leg positions (laser subsystem) and dense disparity maps and u-v disparity (vision subsystem). Third, it uses statistical validation gates and confidence regions to track the pedestrian within the detection zones of the sensors and predict their position in the upcoming frames. The intelligent sensor application has been experimentally tested with success while tracking pedestrians that cross and move in zigzag fashion in front of a vehicle. PMID:22163639
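The tracking step described above pairs a motion prediction with a statistical validation gate; a 1-D sketch follows, where the constant-velocity model and the 3-sigma gate are common textbook choices rather than the authors' exact formulation:

```python
import math

def predict_next(positions, dt=1.0):
    """Constant-velocity prediction of the next position from a 1-D track."""
    x0, x1 = positions[-2], positions[-1]
    return x1 + (x1 - x0) * dt

def in_validation_gate(pred, meas, variance, gate=3.0):
    """Accept a measurement for the track when its normalised innovation
    lies inside the confidence region (gate is roughly a sigma multiple)."""
    return abs(meas - pred) / math.sqrt(variance) <= gate
```

A laser or vision detection that falls outside the gate is not associated with the pedestrian track, which is how spurious detections are kept from corrupting the motion-stability prediction.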
Sensor and information fusion for improved hostile fire situational awareness
NASA Astrophysics Data System (ADS)
Scanlon, Michael V.; Ludwig, William D.
2010-04-01
A research-oriented Army Technology Objective (ATO) named Sensor and Information Fusion for Improved Hostile Fire Situational Awareness uniquely focuses on the underpinning technologies to detect and defeat any hostile threat before, during, and after its occurrence. This is a joint effort led by the Army Research Laboratory, with the Armaments and the Communications and Electronics Research, Development, and Engineering Centers (ARDEC and CERDEC) as partners. It addresses distributed sensor fusion and collaborative situational awareness enhancements, focusing on the underpinning technologies to detect/identify potential hostile shooters prior to firing a shot and to detect/classify/locate the firing point of hostile small arms, mortars, rockets, RPGs, and missiles after the first shot. A field experiment addressed not only diverse-modality sensor performance and sensor fusion benefits, but also gathered useful data to develop and demonstrate the ad hoc networking and dissemination of relevant data and actionable intelligence. Represented at this field experiment were various sensor platforms such as UGS, soldier-worn, manned ground vehicles, UGVs, UAVs, and helicopters. This ATO continues to evaluate applicable technologies, including retro-reflection, UV, IR, visible, glint, LADAR, radar, acoustic, seismic, E-field, narrow-band emission, and image processing techniques, to detect the threats with very high confidence. Networked fusion of multi-modal data will reduce false alarms and improve actionable intelligence by distributing grid coordinates, detection report features, and imagery of threats.
Multi Sensor Fusion Using Fitness Adaptive Differential Evolution
NASA Astrophysics Data System (ADS)
Giri, Ritwik; Ghosh, Arnob; Chowdhury, Aritra; Das, Swagatam
The rising popularity of multi-source, multi-sensor networks in real-life applications calls for an efficient and intelligent approach to information fusion. Traditional optimization techniques often fail to meet the demands. The evolutionary approach provides a valuable alternative due to its inherent parallel nature and its ability to deal with difficult problems. We present a new evolutionary approach based on a modified version of Differential Evolution (DE), called Fitness Adaptive Differential Evolution (FiADE). FiADE treats sensors in the network as distributed intelligent agents with various degrees of autonomy. Existing approaches based on intelligent agents cannot completely answer the question of how their agents could coordinate their decisions in a complex environment. The proposed approach is formulated to produce good results for problems that are high-dimensional, highly nonlinear, and random. The proposed approach gives better results for the optimal allocation of sensors. The performance of the proposed approach is compared with an evolutionary algorithm, the coordination generalized particle model (C-GPM).
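The fitness-adaptive idea behind FiADE can be sketched with a standard DE/rand/1/bin loop in which the scale factor F grows for poorly performing individuals (more exploration) and shrinks for good ones (more exploitation); the adaptation range and schedule below are assumptions, not the authors' exact rule:

```python
import random

def fiade_minimise(f, bounds, pop_size=20, iters=200, cr=0.9, seed=1):
    """Differential Evolution with a fitness-adaptive scale factor F
    (a sketch of the FiADE idea). bounds is a list of (lo, hi) per dimension."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    fit = [f(x) for x in pop]
    for _ in range(iters):
        best, worst = min(fit), max(fit)
        for i in range(pop_size):
            # adapt F in [0.3, 0.9] from the target's relative fitness
            rel = (fit[i] - best) / (worst - best + 1e-12)
            F = 0.3 + 0.6 * rel
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)   # forced crossover dimension
            trial = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                     if (rng.random() < cr or d == jr) else pop[i][d]
                     for d in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            ft = f(trial)
            if ft <= fit[i]:          # greedy selection
                pop[i], fit[i] = trial, ft
    i = min(range(pop_size), key=fit.__getitem__)
    return pop[i], fit[i]
```

In the sensor-allocation setting, `f` would score a candidate assignment of sensors to tasks rather than a continuous test function.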
Evolution of an Intelligent Information Fusion System
NASA Technical Reports Server (NTRS)
Campbell, William J.; Cromp, Robert F.
1990-01-01
Consideration is given to the hardware and software needed to manage the enormous amount and complexity of data that the next generation of space-borne sensors will provide. An anthology is presented illustrating the evolution of artificial intelligence, science data processing, and management from the 1960s to the near future. Problems and limitations of technologies, data structures, data standards, and conceptual thinking are addressed. The development of an end-to-end Intelligent Information Fusion System that embodies knowledge of the user's domain-specific goals is proposed.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, J.R. (Netrologic, Inc., San Diego, CA)
1988-01-01
Topics presented include integrating neural networks and expert systems, neural networks and signal processing, machine learning, cognition and avionics applications, artificial intelligence and man-machine interface issues, real time expert systems, artificial intelligence, and engineering applications. Also considered are advanced problem solving techniques, combinational optimization for scheduling and resource control, data fusion/sensor fusion, back propagation with momentum, shared weights and recurrency, automatic target recognition, cybernetics, optical neural networks.
Adaptive neural network/expert system that learns fault diagnosis for different structures
NASA Astrophysics Data System (ADS)
Simon, Solomon H.
1992-08-01
Corporations need better real-time monitoring and control systems to improve productivity by watching quality and increasing production flexibility. The innovative technology to achieve this goal is evolving in the form of artificial intelligence and neural networks applied to sensor processing, fusion, and interpretation. By using these advanced AI techniques, we can leverage existing systems and add value to conventional techniques. Neural networks and knowledge-based expert systems can be combined into intelligent sensor systems which provide real-time monitoring, control, evaluation, and fault diagnosis for production systems. Neural network-based intelligent sensor systems are more reliable because they can provide continuous, non-destructive monitoring and inspection. Use of neural networks can result in sensor fusion and the ability to model highly non-linear systems. Improved models can provide a foundation for more accurate performance parameters and predictions. We discuss a research software/hardware prototype which integrates neural networks, expert systems, and sensor technologies and which can adapt across a variety of structures to perform fault diagnosis. The flexibility and adaptability of the prototype in learning two structures is presented. Potential applications are discussed.
NASA Astrophysics Data System (ADS)
Newman, Andrew J.; Richardson, Casey L.; Kain, Sean M.; Stankiewicz, Paul G.; Guseman, Paul R.; Schreurs, Blake A.; Dunne, Jeffrey A.
2016-05-01
This paper introduces the game of reconnaissance blind multi-chess (RBMC) as a paradigm and test bed for understanding and experimenting with autonomous decision making under uncertainty and in particular managing a network of heterogeneous Intelligence, Surveillance and Reconnaissance (ISR) sensors to maintain situational awareness informing tactical and strategic decision making. The intent is for RBMC to serve as a common reference or challenge problem in fusion and resource management of heterogeneous sensor ensembles across diverse mission areas. We have defined a basic rule set and a framework for creating more complex versions, developed a web-based software realization to serve as an experimentation platform, and developed some initial machine intelligence approaches to playing it.
Feasibility study on sensor data fusion for the CP-140 aircraft: fusion architecture analyses
NASA Astrophysics Data System (ADS)
Shahbazian, Elisa
1995-09-01
Loral Canada completed (May 1995) a Department of National Defense (DND) Chief of Research and Development (CRAD) contract to study the feasibility of implementing a multi-sensor data fusion (MSDF) system onboard the CP-140 Aurora aircraft. This system is expected to fuse data from: (a) attributed measurement oriented sensors (ESM, IFF, etc.); (b) imaging sensors (FLIR, SAR, etc.); (c) tracking sensors (radar, acoustics, etc.); (d) data from remote platforms (data links); and (e) non-sensor data (intelligence reports, environmental data, visual sightings, encyclopedic data, etc.). Based on purely theoretical considerations, a central-level fusion architecture will lead to a higher-performance fusion system. However, there are a number of systems and fusion architecture issues involving fusion of such dissimilar data: (1) the currently existing sensors are not designed to provide the type of data required by a fusion system; (2) the different types (attribute, imaging, tracking, etc.) of data may require different degrees of processing before they can be used within a fusion system efficiently; (3) the data quality from different sensors, and more importantly from remote platforms via the data links, must be taken into account before fusing; and (4) the non-sensor data may impose specific requirements on the fusion architecture (e.g. variable weight/priority for the data from different sensors). This paper presents the analyses performed for the selection of the fusion architecture for the enhanced sensor suite planned for the CP-140 aircraft in the context of the mission requirements and environmental conditions.
Integrating Sensor-Collected Intelligence
2008-11-01
collecting, processing, data storage and fusion, and the dissemination of information collected by Intelligence, Surveillance, and Reconnaissance (ISR...Grid – Bandwidth Expansion (GIG-BE) program) to provide the capability to transfer data from sensors to accessible storage and satellite and airborne...based ISR is much more fragile. There was a purposeful drawdown of these systems following the Cold War and modernization programs were planned to
An Approach to Automated Fusion System Design and Adaptation
Fritze, Alexander; Mönks, Uwe; Holst, Christoph-Alexander; Lohweg, Volker
2017-01-01
Industrial applications are in transition towards modular and flexible architectures that are capable of self-configuration and -optimisation. This is due to the demand of mass customisation and the increasing complexity of industrial systems. The conversion to modular systems is related to challenges in all disciplines. Consequently, diverse tasks such as information processing, extensive networking, or system monitoring using sensor and information fusion systems need to be reconsidered. The focus of this contribution is on distributed sensor and information fusion systems for system monitoring, which must reflect the increasing flexibility of fusion systems. This contribution thus proposes an approach, which relies on a network of self-descriptive intelligent sensor nodes, for the automatic design and update of sensor and information fusion systems. This article encompasses the fusion system configuration and adaptation as well as communication aspects. Manual interaction with the flexibly changing system is reduced to a minimum. PMID:28300762
Context-Aided Sensor Fusion for Enhanced Urban Navigation
Martí, Enrique David; Martín, David; García, Jesús; de la Escalera, Arturo; Molina, José Manuel; Armingol, José María
2012-01-01
The deployment of Intelligent Vehicles in urban environments requires reliable estimation of positioning for urban navigation. The inherent complexity of this kind of environments fosters the development of novel systems which should provide reliable and precise solutions to the vehicle. This article details an advanced GNSS/IMU fusion system based on a context-aided Unscented Kalman filter for navigation in urban conditions. The constrained non-linear filter is here conditioned by a contextual knowledge module which reasons about sensor quality and driving context in order to adapt it to the situation, while at the same time it carries out a continuous estimation and correction of INS drift errors. An exhaustive analysis has been carried out with available data in order to characterize the behavior of available sensors and take it into account in the developed solution. The performance is then analyzed with an extensive dataset containing representative situations. The proposed solution suits the use of fusion algorithms for deploying Intelligent Transport Systems in urban environments. PMID:23223080
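A 1-D sketch of how a contextual module might condition the filter: measurement noise is switched when the driving context degrades GNSS (e.g. an urban canyon). The noise values and context labels are illustrative, and a simple linear Kalman filter stands in for the paper's constrained Unscented Kalman filter:

```python
def context_aided_kf(measurements, contexts, q=0.01, r_open=1.0, r_canyon=25.0):
    """1-D position filter whose measurement-noise variance is selected by a
    contextual reasoning module; q, r_open, r_canyon are assumed values."""
    x, p = measurements[0], 1.0
    out = []
    for z, ctx in zip(measurements, contexts):
        p += q                                  # predict (static motion model)
        r = r_canyon if ctx == "canyon" else r_open
        k = p / (p + r)                         # Kalman gain
        x += k * (z - x)                        # update with context-weighted trust
        p *= (1 - k)
        out.append(x)
    return out
```

When the context says "canyon", the gain shrinks and a GNSS outlier barely moves the estimate; in open sky the same innovation is trusted much more.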
Intelligence Fusion Modeling. A Proposed Approach.
1983-09-16
based techniques developed by artificial intelligence researchers. This paper describes the application of these techniques in the modeling of an... intelligence requirements, although the methods presented are applicable . We treat PIR/IR as given. -7- -- -W V"W v* 1.- . :71.,v It k*~ ~-- Movement...items from the PIR/IR/HVT decomposition are received from the CMDS. Formatted tactical intelligence reports are received from sensors of like types
ELIPS: Toward a Sensor Fusion Processor on a Chip
NASA Technical Reports Server (NTRS)
Daud, Taher; Stoica, Adrian; Tyson, Thomas; Li, Wei-te; Fabunmi, James
1998-01-01
The paper presents the concept and initial tests from the hardware implementation of a low-power, high-speed reconfigurable sensor fusion processor. The Extended Logic Intelligent Processing System (ELIPS) processor is developed to seamlessly combine rule-based systems, fuzzy logic, and neural networks to achieve parallel fusion of sensor signals in compact low-power VLSI. The first demonstration of the ELIPS concept targets interceptor functionality; other applications, mainly in robotics and autonomous systems, are considered for the future. The main assumption behind ELIPS is that fuzzy, rule-based, and neural forms of computation can serve as the main primitives of an "intelligent" processor. Thus, in the same way classic processors are designed to optimize the hardware implementation of a set of fundamental operations, ELIPS is developed as an efficient implementation of computational intelligence primitives, and relies on a set of fuzzy set, fuzzy inference, and neural modules, built in programmable analog hardware. The hardware programmability allows the processor to reconfigure into different machines, taking the most efficient hardware implementation during each phase of information processing. Following software demonstrations on several interceptor data, three important ELIPS building blocks (a fuzzy set preprocessor, a rule-based fuzzy system, and a neural network) have been fabricated in analog VLSI hardware and demonstrated microsecond processing times.
NASA Astrophysics Data System (ADS)
Thomas, Paul A.; Marshall, Gillian; Faulkner, David; Kent, Philip; Page, Scott; Islip, Simon; Oldfield, James; Breckon, Toby P.; Kundegorski, Mikolaj E.; Clark, David J.; Styles, Tim
2016-05-01
Currently, most land Intelligence, Surveillance and Reconnaissance (ISR) assets (e.g. EO/IR cameras) are simply data collectors. Understanding, decision making and sensor control are performed by the human operators, involving high cognitive load. Any automation in the system has traditionally involved bespoke design of centralised systems that are highly specific for the assets/targets/environment under consideration, resulting in complex, non-flexible systems that exhibit poor interoperability. We address a concept of Autonomous Sensor Modules (ASMs) for land ISR, where these modules have the ability to make low-level decisions on their own in order to fulfil a higher-level objective, and plug in, with the minimum of preconfiguration, to a High Level Decision Making Module (HLDMM) through a middleware integration layer. The dual requisites of autonomy and interoperability create challenges around information fusion and asset management in an autonomous hierarchical system, which are addressed in this work. This paper presents the results of a demonstration system, known as Sensing for Asset Protection with Integrated Electronic Networked Technology (SAPIENT), which was shown in realistic base protection scenarios with live sensors and targets. The SAPIENT system performed sensor cueing, intelligent fusion, sensor tasking, target hand-off and compensation for compromised sensors, without human control, and enabled rapid integration of ISR assets at the time of system deployment, rather than at design-time. Potential benefits include rapid interoperability for coalition operations, situation understanding with low operator cognitive burden and autonomous sensor management in heterogeneous sensor systems.
Distributed video data fusion and mining
NASA Astrophysics Data System (ADS)
Chang, Edward Y.; Wang, Yuan-Fang; Rodoplu, Volkan
2004-09-01
This paper presents an event sensing paradigm for intelligent event-analysis in a wireless, ad hoc, multi-camera, video surveillance system. In particular, we present statistical methods that we have developed to support three aspects of event sensing: 1) energy-efficient, resource-conserving, and robust sensor data fusion and analysis, 2) intelligent event modeling and recognition, and 3) rapid deployment, dynamic configuration, and continuous operation of the camera networks. We outline our preliminary results, and discuss future directions that research might take.
Fusion or confusion: knowledge or nonsense?
NASA Astrophysics Data System (ADS)
Rothman, Peter L.; Denton, Richard V.
1991-08-01
The terms 'data fusion,' 'sensor fusion,' 'multi-sensor integration,' and 'multi-source integration' have been used widely in the technical literature to refer to a variety of techniques, technologies, systems, and applications which employ and/or combine data derived from multiple information sources. Applications of data fusion range from real-time fusion of sensor information for the navigation of mobile robots to the off-line fusion of both human and technical strategic intelligence data. The Department of Defense Critical Technologies Plan lists data fusion in the highest priority group of critical technologies, but just what is data fusion? The DoD Critical Technologies Plan states that data fusion involves 'the acquisition, integration, filtering, correlation, and synthesis of useful data from diverse sources for the purposes of situation/environment assessment, planning, detecting, verifying, diagnosing problems, aiding tactical and strategic decisions, and improving system performance and utility.' More simply stated, sensor fusion refers to the combination of data from multiple sources to provide enhanced information quality and availability over that which is available from any individual source alone. This paper presents a survey of the state-of-the-art in data fusion technologies, system components, and applications. A set of characteristics which can be utilized to classify data fusion systems is presented. Additionally, a unifying mathematical and conceptual framework within which to understand and organize fusion technologies is described. A discussion of often overlooked issues in the development of sensor fusion systems is also presented.
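The survey's core claim, that combining sources yields better information quality than any single source, has a standard minimal illustration: inverse-variance (maximum-likelihood) weighting of independent measurements of the same quantity, whose fused variance is smaller than that of every input. The sketch below is that textbook rule, not anything specific to this paper.

```python
def fuse(measurements):
    """Inverse-variance fusion of independent scalar measurements
    of the same quantity. measurements: list of (value, variance)."""
    weights = [1.0 / var for _, var in measurements]
    fused_value = sum(w * v for (v, _), w in zip(measurements, weights)) / sum(weights)
    fused_var = 1.0 / sum(weights)   # always below the smallest input variance
    return fused_value, fused_var

# Two sensors disagree; the fused estimate sits closer to the more precise one.
value, var = fuse([(10.0, 4.0), (12.0, 1.0)])   # -> (11.6, 0.8)
```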
A practical approach for active camera coordination based on a fusion-driven multi-agent system
NASA Astrophysics Data System (ADS)
Bustamante, Alvaro Luis; Molina, José M.; Patricio, Miguel A.
2014-04-01
In this paper, we propose a multi-agent system (MAS) architecture to manage spatially distributed active (or pan-tilt-zoom) cameras. Traditional video surveillance algorithms are of no use for active cameras, and we have to look at different approaches. Such multi-sensor surveillance systems have to be designed to solve two related problems: data fusion and coordinated sensor-task management. Generally, architectures proposed for the coordinated operation of multiple cameras are based on the centralisation of management decisions at the fusion centre. However, the existence of intelligent sensors capable of decision making brings with it the possibility of conceiving alternative decentralised architectures. This problem is approached by means of a MAS, integrating data fusion as an integral part of the architecture for distributed coordination purposes. This paper presents the MAS architecture and system agents.
Visualization of multi-INT fusion data using Java Viewer (JVIEW)
NASA Astrophysics Data System (ADS)
Blasch, Erik; Aved, Alex; Nagy, James; Scott, Stephen
2014-05-01
Visualization is important for multi-intelligence fusion and we demonstrate issues for presenting physics-derived (i.e., hard) and human-derived (i.e., soft) fusion results. Physics-derived solutions (e.g., imagery) typically involve sensor measurements that are objective, while human-derived (e.g., text) typically involve language processing. Both results can be geographically displayed for user-machine fusion. Attributes of an effective and efficient display are not well understood, so we demonstrate issues and results for filtering, correlation, and association of data for users - be they operators or analysts. Operators require near-real time solutions while analysts have the opportunities of non-real time solutions for forensic analysis. In a use case, we demonstrate examples using the JVIEW concept that has been applied to piloting, space situation awareness, and cyber analysis. Using the open-source JVIEW software, we showcase a big data solution for multi-intelligence fusion application for context-enhanced information fusion.
De La Iglesia, Daniel H.; Villarrubia, Gabriel; De Paz, Juan F.; Bajo, Javier
2017-01-01
The use of electric bikes (e-bikes) has grown in popularity, especially in large cities where overcrowding and traffic congestion are common. This paper proposes an intelligent engine management system for e-bikes which uses the information collected from sensors to optimize battery energy and time. The intelligent engine management system consists of a built-in network of sensors in the e-bike, which is used for multi-sensor data fusion; the collected data is analysed and fused and on the basis of this information the system can provide the user with optimal and personalized assistance. The user is given recommendations related to battery consumption, sensors, and other parameters associated with the route travelled, such as duration, speed, or variation in altitude. To provide a user with these recommendations, artificial neural networks are used to estimate speed and consumption for each of the segments of a route. These estimates are incorporated into evolutionary algorithms in order to make the optimizations. A comparative analysis of the results obtained has been conducted for when routes were travelled with and without the optimization system. From the experiments, it is evident that the use of an engine management system results in significant energy and time savings. Moreover, user satisfaction increases as the level of assistance adapts to user behavior and the characteristics of the route. PMID:29088087
Intelligent lead: a novel HRI sensor for guide robots.
Cho, Keum-Bae; Lee, Beom-Hee
2012-01-01
This paper addresses the introduction of a new Human Robot Interaction (HRI) sensor for guide robots. Guide robots for geriatric patients or the visually impaired should follow the user's control commands, keeping a certain desired distance that allows the user to move freely. Therefore, it is necessary to acquire control commands and the user's position on a real-time basis. We suggest a new sensor fusion system to achieve this objective and we will call this sensor the "intelligent lead". The objective of the intelligent lead is to acquire a stable distance from the user to the robot, speed-control volume and turn-control volume, even when the robot platform with the intelligent lead is shaken on uneven ground. In this paper we explain a precise Extended Kalman Filter (EKF) procedure for this. The intelligent lead physically consists of a Kinect sensor, a serial linkage fitted with eight rotary encoders, and an IMU (Inertial Measurement Unit), and their measurements are fused by the EKF. A mobile robot was designed to test the performance of the proposed sensor system. After installing the intelligent lead in the mobile robot, several tests were conducted to verify that the mobile robot with the intelligent lead is capable of achieving its goal points while maintaining the appropriate distance between the robot and the user. The results show that the intelligent lead proposed in this paper can serve as a new HRI sensor, combining a joystick and a distance measure, in mobile environments where the robot and the user are moving at the same time.
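The EKF that fuses the Kinect, encoder, and IMU measurements follows the standard predict/update cycle. The paper's actual state and measurement models are not reproduced here; the sketch below is the generic cycle, where the process function `f`, measurement function `h`, their Jacobians `F`, `H`, and the noise covariances are all caller-supplied assumptions.

```python
import numpy as np

def ekf_step(x, P, u, z, f, F, h, H, Q, R):
    """One generic EKF predict/update cycle.
    f, h: process and measurement functions; F, H: their Jacobians."""
    # Predict: propagate the state and its covariance through the process model
    x_pred = f(x, u)
    Fk = F(x, u)
    P_pred = Fk @ P @ Fk.T + Q
    # Update: correct with the measurement innovation
    y = z - h(x_pred)                        # innovation
    Hk = H(x_pred)
    S = Hk @ P_pred @ Hk.T + R               # innovation covariance
    K = P_pred @ Hk.T @ np.linalg.inv(S)     # Kalman gain
    x_new = x_pred + K @ y
    P_new = (np.eye(len(x)) - K @ Hk) @ P_pred
    return x_new, P_new
```

With a linear model plugged in, the step reduces to the ordinary Kalman filter, which is a quick way to sanity-check an implementation.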
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Qishi; Zhu, Mengxia; Rao, Nageswara S
We propose an intelligent decision support system based on sensor and computer networks that incorporates various component techniques for sensor deployment, data routing, distributed computing, and information fusion. The integrated system is deployed in a distributed environment composed of both wireless sensor networks for data collection and wired computer networks for data processing in support of homeland security defense. We present the system framework, formulate the analytical problems, and develop approximate or exact solutions for the subtasks: (i) a sensor deployment strategy based on a two-dimensional genetic algorithm to achieve maximum coverage with cost constraints; (ii) a data routing scheme to achieve maximum signal strength with minimum path loss, high energy efficiency, and effective fault tolerance; (iii) a network mapping method to assign computing modules to network nodes for high-performance distributed data processing; and (iv) binary decision fusion rules that derive threshold bounds to improve the system hit rate and false alarm rate. These component solutions are implemented and evaluated through either experiments or simulations in various application scenarios. The extensive results demonstrate that these component solutions imbue the integrated system with the desirable and useful quality of intelligence in decision making.
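Component (iv), binary decision fusion, can be illustrated with the classical k-out-of-n counting rule. The sketch below is an illustrative stand-in (it assumes n independent, identical local detectors and does not reproduce the paper's derived threshold bounds): the fusion centre declares a detection when at least k local detectors do, and the system-level hit and false-alarm rates follow from the binomial distribution.

```python
from math import comb

def fused_rates(n, k, p_hit, p_fa):
    """System-level (hit, false-alarm) rates for a k-out-of-n fusion rule
    over n independent, identical local binary detectors."""
    def at_least_k(p):
        # P(at least k of n fire), each firing independently with prob. p
        return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))
    return at_least_k(p_hit), at_least_k(p_fa)

# Majority rule (2-of-3): the hit rate rises while false alarms drop
hit, fa = fused_rates(3, 2, 0.9, 0.1)   # -> (0.972, 0.028)
```

Sweeping k from 1 (OR rule) to n (AND rule) traces out the threshold trade-off between hit rate and false-alarm rate that such rules are tuned over.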
Fusion of imaging and nonimaging data for surveillance aircraft
NASA Astrophysics Data System (ADS)
Shahbazian, Elisa; Gagnon, Langis; Duquet, Jean Remi; Macieszczak, Maciej; Valin, Pierre
1997-06-01
This paper describes a phased incremental integration approach for application of image analysis and data fusion technologies to provide automated intelligent target tracking and identification for airborne surveillance on board an Aurora Maritime Patrol Aircraft. The sensor suite of the Aurora consists of a radar, an identification friend or foe (IFF) system, an electronic support measures (ESM) system, a spotlight synthetic aperture radar (SSAR), a forward looking infra-red (FLIR) sensor and a link-11 tactical datalink system. Lockheed Martin Canada (LMCan) is developing a testbed, which will be used to analyze and evaluate approaches for combining the data provided by the existing sensors, which were initially not designed to feed a fusion system. Three concurrent research proof-of-concept activities provide techniques, algorithms and methodology into three sequential phases of integration of this testbed. These activities are: (1) analysis of the fusion architecture (track/contact/hybrid) most appropriate for the type of data available, (2) extraction and fusion of simple features from the imaging data into the fusion system performing automatic target identification, and (3) development of a unique software architecture which will permit integration and independent evolution, enhancement and optimization of various decision aid capabilities, such as multi-sensor data fusion (MSDF), situation and threat assessment (STA) and resource management (RM).
Ouyang, Qin; Zhao, Jiewen; Chen, Quansheng
2014-09-02
Instrumental testing of food quality using perception sensors instead of a human panel test has attracted massive attention recently. A novel cross-perception multi-sensor data fusion imitating multiple mammalian perception was proposed for instrumental testing in this work. First, three mimic sensors, electronic eye, electronic nose and electronic tongue, were used in sequence for data acquisition of rice wine samples. Then all data from the three different sensors were preprocessed and merged. Next, three cross-perception variables, i.e., color, aroma and taste, were constructed using principal component analysis (PCA) and multiple linear regression (MLR), and used as the input of models. MLR, back-propagation artificial neural network (BPANN) and support vector machine (SVM) models were comparatively used for modeling, and the instrumental test was achieved for the comprehensive quality of samples. Results showed the proposed cross-perception multi-sensor data fusion was clearly superior to traditional data fusion methodologies, and achieved a high correlation coefficient (>90%) with the human panel test results. This work demonstrated that instrumental testing based on cross-perception multi-sensor data fusion can actually mimic human test behavior, and is therefore of great significance for ensuring product quality and decreasing manufacturers' losses.
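The PCA-then-MLR pipeline described above can be sketched with plain NumPy: each instrument's data block is reduced to a "perception" variable by PCA, the variables are concatenated, and MLR maps them to the panel score. Everything below (block sizes, the synthetic data, the weights) is invented for illustration; only the pipeline shape follows the abstract.

```python
import numpy as np

def pca_scores(X, n_components):
    """Project rows of X onto its top principal components (via SVD)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:n_components].T

def mlr_fit(X, y):
    """Ordinary least squares with an intercept column; returns coefficients."""
    A = np.hstack([np.ones((len(X), 1)), X])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

# Synthetic stand-ins for the three instrument blocks (30 samples x 8 channels)
rng = np.random.default_rng(0)
eye, nose, tongue = (rng.normal(size=(30, 8)) for _ in range(3))

# One cross-perception variable per instrument, then MLR to a panel score
Z = np.hstack([pca_scores(block, 1) for block in (eye, nose, tongue)])
y = Z @ np.array([0.5, 0.3, 0.2]) + 0.01 * rng.normal(size=30)  # simulated panel score
coef = mlr_fit(Z, y)   # recovers roughly (0, 0.5, 0.3, 0.2)
```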
Conflict management based on belief function entropy in sensor fusion.
Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong
2016-01-01
Wireless sensor networks play an important role in intelligent navigation. They incorporate a group of sensors to overcome the limitations of a single detection system. Dempster-Shafer evidence theory can combine the sensor data of the wireless sensor network by data fusion, which contributes to improving the accuracy and reliability of the detection system. However, because the data come from different sensors, there may be conflict among them in uncertain environments. Thus, this paper proposes a new method combining Deng entropy and evidence distance to address the issue. First, Deng entropy is adopted to measure the uncertain information. Then, evidence distance is applied to measure the conflict degree. The new method can cope with conflict effectively and improve the accuracy and reliability of the detection system. An example illustrates the efficiency of the new method, and the result is compared with that of existing methods.
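Deng entropy, the uncertainty measure the paper adopts, generalises Shannon entropy to basic probability assignments (BPAs): each focal element's mass is normalised by the number of non-empty subsets it spans, 2^|A| - 1. A direct implementation:

```python
from math import log2

def deng_entropy(m):
    """Deng entropy of a basic probability assignment.
    m: dict mapping focal elements (frozensets) to mass values."""
    return -sum(mass * log2(mass / (2**len(A) - 1))
                for A, mass in m.items() if mass > 0)

# With only singleton focal elements it reduces to Shannon entropy:
# a fair binary split gives exactly 1 bit.
m_singletons = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}

# Mass on a larger focal element carries extra (non-specificity) uncertainty:
# total mass on {'a','b'} gives log2(3) > 1 bit.
m_vacuous = {frozenset({'a', 'b'}): 1.0}
```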
Jing, Luyang; Wang, Taiyong; Zhao, Ming; Wang, Peng
2017-01-01
A fault diagnosis approach based on multi-sensor data fusion is a promising tool to deal with complicated damage detection problems of mechanical systems. Nevertheless, this approach suffers from two challenges, which are (1) the feature extraction from various types of sensory data and (2) the selection of a suitable fusion level. It is usually difficult to choose an optimal feature or fusion level for a specific fault diagnosis task, and extensive domain expertise and human labor are also highly required during these selections. To address these two challenges, we propose an adaptive multi-sensor data fusion method based on deep convolutional neural networks (DCNN) for fault diagnosis. The proposed method can learn features from raw data and optimize a combination of different fusion levels adaptively to satisfy the requirements of any fault diagnosis task. The proposed method is tested through a planetary gearbox test rig. Handcrafted features, manually selected fusion levels, single sensory data, and two traditional intelligent models, back-propagation neural networks (BPNN) and a support vector machine (SVM), are used as comparisons in the experiment. The results demonstrate that the proposed method is able to detect the conditions of the planetary gearbox effectively with the best diagnosis accuracy among all comparative methods in the experiment. PMID:28230767
Towards a Unified Approach to Information Integration - A review paper on data/information fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitney, Paul D.; Posse, Christian; Lei, Xingye C.
2005-10-14
Information or data fusion of data from different sources is ubiquitous in many applications, from epidemiology, medical, biological, political, and intelligence to military applications. Data fusion involves integration of spectral, imaging, text, and many other sensor data. For example, in epidemiology, information is often obtained based on many studies conducted by different researchers at different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-Ray, CT), clinical examination, and lab results. In the biological field, information is obtained based on studies conducted on many different species. In the military field, information is obtained based on data from radar sensors, text messages, chemical and biological sensors, acoustic sensors, optical warning and many other sources. Many methodologies are used in the data integration process, from classical and Bayesian to evidence-based expert systems. The implementation of the data integration ranges from pure software design to a mixture of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.
Multi-sensor information fusion method for vibration fault diagnosis of rolling bearing
NASA Astrophysics Data System (ADS)
Jiao, Jing; Yue, Jianhai; Pei, Di
2017-10-01
Bearings are key elements in high-speed electric multiple units (EMUs), and any defect can cause major malfunctioning of an EMU at high operating speed. This paper presents a new method for bearing fault diagnosis based on least squares support vector machine (LS-SVM) in feature-level fusion and Dempster-Shafer (D-S) evidence theory in decision-level fusion, which addresses the low detection accuracy, the difficulty of extracting sensitive characteristics, and the instability of single-sensor diagnosis in rolling bearing fault diagnosis. A wavelet de-noising technique was used to remove signal noise. LS-SVM was used for pattern recognition of the bearing vibration signal, and the fusion process was then made according to D-S evidence theory, so as to realize recognition of the bearing fault. The results indicated that the data fusion method improved the performance of the intelligent approach in rolling bearing fault detection significantly. Moreover, the results showed that this method can efficiently improve the accuracy of fault diagnosis.
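The decision-level step above combines per-sensor classifier outputs with Dempster's rule of combination. The sketch below is the standard rule over frozenset focal elements, not the paper's specific BPAs; the example masses for a hypothetical "fault vs. normal" frame are invented for illustration.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two BPAs over frozenset focal
    elements. K is the conflicting mass, normalised out of the result."""
    combined, K = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            if C:
                combined[C] = combined.get(C, 0.0) + a * b
            else:
                K += a * b   # mass assigned to the empty set: conflict
    if K >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {A: v / (1.0 - K) for A, v in combined.items()}

# Hypothetical two-sensor evidence on the frame {fault, normal}
FAULT = frozenset({'fault'})
THETA = frozenset({'fault', 'normal'})
fused = dempster_combine({FAULT: 0.6, THETA: 0.4},
                         {FAULT: 0.7, THETA: 0.3})
# -> belief in 'fault' rises to 0.88; residual ignorance drops to 0.12
```

Agreeing sources reinforce each other: each sensor alone supports the fault at 0.6-0.7, while the combination supports it at 0.88.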
Sensor fusion: lane marking detection and autonomous intelligent cruise control system
NASA Astrophysics Data System (ADS)
Baret, Marc; Baillarin, S.; Calesse, C.; Martin, Lionel
1995-12-01
In the past few years MATRA and RENAULT have developed an Autonomous Intelligent Cruise Control (AICC) system based on a LIDAR sensor. This sensor, incorporating a charge coupled device, was designed to acquire pulsed laser diode emission reflected by standard car reflectors. The absence of moving mechanical parts, the large field of view, the high measurement rate and the very good accuracy for distance range and angular position of targets make this sensor very interesting. It provides the equipped car with the distance and the relative speed of other vehicles, enabling the safety distance to be controlled by acting on the throttle and the automatic gear box. Experiments in various real traffic situations have shown the limitations of this kind of system, especially on bends. All AICC sensors are unable to distinguish between a bend and a change of lane. This is easily understood if we consider a road without lane markings. This fact has led MATRA to improve its AICC system by providing the lane marking information. Also, in the scope of the EUREKA PROMETHEUS project, MATRA and RENAULT have developed a lane keeping system in order to warn of the driver's lack of vigilance. Thus, MATRA has extended this system to far-field lane marking detection and has coupled it with the AICC system. Experiments will be carried out on roads to estimate the gain in performance and comfort due to this fusion.
Composable Analytic Systems for next-generation intelligence analysis
NASA Astrophysics Data System (ADS)
DiBona, Phil; Llinas, James; Barry, Kevin
2015-05-01
Lockheed Martin Advanced Technology Laboratories (LM ATL) is collaborating with Professor James Llinas, Ph.D., of the Center for Multisource Information Fusion at the University at Buffalo (State of NY), researching concepts for a mixed-initiative associate system for intelligence analysts to facilitate reduced analysis and decision times while proactively discovering and presenting relevant information based on the analyst's needs, current tasks and cognitive state. Today's exploitation and analysis systems have largely been designed for a specific sensor, data type, and operational context, leading to difficulty in directly supporting the analyst's evolving tasking and work product development preferences across complex Operational Environments. Our interactions with analysts illuminate the need to impact the information fusion, exploitation, and analysis capabilities in a variety of ways, including understanding data options, algorithm composition, hypothesis validation, and work product development. Composable Analytic Systems, an analyst-driven system that increases flexibility and capability to effectively utilize Multi-INT fusion and analytics tailored to the analyst's mission needs, holds promise to address the current and future intelligence analysis needs, as US forces engage threats in contested and denied environments.
Depth and thermal sensor fusion to enhance 3D thermographic reconstruction.
Cao, Yanpeng; Xu, Baobei; Ye, Zhangyu; Yang, Jiangxin; Cao, Yanlong; Tisse, Christel-Loic; Li, Xin
2018-04-02
Three-dimensional geometrical models with incorporated surface temperature data provide important information for various applications such as medical imaging, energy auditing, and intelligent robots. In this paper, we present a robust method for mobile and real-time 3D thermographic reconstruction through depth and thermal sensor fusion. A multimodal imaging device consisting of a thermal camera and an RGB-D sensor is calibrated geometrically and used for data capturing. Based on the underlying principle that temperature information remains robust against illumination and viewpoint changes, we present a Thermal-guided Iterative Closest Point (T-ICP) methodology to facilitate reliable 3D thermal scanning applications. The pose of the sensing device is initially estimated using correspondences found through maximizing the thermal consistency between consecutive infrared images. The coarse pose estimate is further refined by finding the motion parameters that minimize a combined geometric and thermographic loss function. Experimental results demonstrate that complementary information captured by multimodal sensors can be utilized to improve the performance of 3D thermographic reconstruction. Through effective fusion of thermal and depth data, the proposed approach generates more accurate 3D thermal models using significantly less scanning data.
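The refinement step minimises a combined geometric and thermographic loss over the motion parameters. The paper's exact loss is not reproduced here; a plausible minimal form, sketched below under assumed correspondences, sums squared point-to-point distances with a weighted thermal-consistency term (the weight `lam` is a hypothetical tuning parameter).

```python
import numpy as np

def combined_loss(src_pts, dst_pts, src_temp, dst_temp, lam=0.1):
    """Illustrative T-ICP-style objective over corresponding points:
    geometric point-to-point residual plus lam * thermal residual."""
    geo = np.sum(np.linalg.norm(src_pts - dst_pts, axis=1) ** 2)
    thermal = np.sum((src_temp - dst_temp) ** 2)
    return geo + lam * thermal

# One correspondence, 1 m apart and 2 degrees apart: 1.0 + 0.1 * 4.0 = 1.4
loss = combined_loss(np.array([[0.0, 0.0, 0.0]]), np.array([[1.0, 0.0, 0.0]]),
                     np.array([20.0]), np.array([22.0]))
```

In a full pipeline this scalar would be minimised over a rigid transform applied to `src_pts`, with `lam` trading off geometric against thermographic agreement.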
NASA Astrophysics Data System (ADS)
Shahini Shamsabadi, Salar
A web-based PAVEment MONitoring system, PAVEMON, is a GIS-oriented platform for accommodating, representing, and leveraging data from a multi-modal mobile sensor system. This sensor system consists of acoustic, optical, electromagnetic, and GPS sensors and is capable of producing as much as 1 Terabyte of data per day. Multi-channel raw sensor data (microphone, accelerometer, tire pressure sensor, video) and processed results (road profile, crack density, international roughness index, micro texture depth, etc.) are outputs of this sensor system. By correlating the sensor measurements and positioning data collected in tight time synchronization, PAVEMON attaches a spatial component to all the datasets. These spatially indexed outputs are placed into an Oracle database which integrates seamlessly with PAVEMON's web-based system. The web-based system of PAVEMON consists of two major modules: 1) a GIS module for visualizing and spatial analysis of pavement condition information layers, and 2) a decision-support module for managing maintenance and repair (M&R) activities and predicting future budget needs. PAVEMON weaves together sensor data with third-party climate and traffic information from the National Oceanic and Atmospheric Administration (NOAA) and Long Term Pavement Performance (LTPP) databases for an organized data-driven approach to conduct pavement management activities. PAVEMON deals with heterogeneous and redundant observations by fusing them for jointly-derived higher-confidence results. A prominent example of the fusion algorithms developed within PAVEMON is a data fusion algorithm used for estimating the overall pavement conditions in terms of ASTM's Pavement Condition Index (PCI). PAVEMON predicts PCI by undertaking a statistical fusion approach and selecting a subset of all the sensor measurements.
Other fusion algorithms include noise-removal algorithms to remove false negatives in the sensor data in addition to fusion algorithms developed for identifying features on the road. PAVEMON offers an ideal research and monitoring platform for rapid, intelligent and comprehensive evaluation of tomorrow's transportation infrastructure based on up-to-date data from heterogeneous sensor systems.
Track classification within wireless sensor network
NASA Astrophysics Data System (ADS)
Doumerc, Robin; Pannetier, Benjamin; Moras, Julien; Dezert, Jean; Canevet, Loic
2017-05-01
In this paper, we present our study on track classification by taking into account environmental information and target estimated states. The tracker uses several motion models adapted to different target dynamics (pedestrian, ground vehicle and SUAV, i.e. small unmanned aerial vehicle) and works in a centralized architecture. The main idea is to explore both the classification given by heterogeneous sensors and the classification obtained with our fusion module. The fusion module, presented in this paper, provides a class for each track according to track location, velocity and associated uncertainty. To model the likelihood of each class, a fuzzy approach is used, considering constraints on target capability to move in the environment. Then the evidential reasoning approach based on Dempster-Shafer Theory (DST) is used to perform a time integration of this classifier output. The fusion rules are tested and compared on real data obtained with our wireless sensor network. In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deployed in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of this system is evaluated in a real exercise for intelligence operation (a "hunter hunt" scenario).
A survey of body sensor networks.
Lai, Xiaochen; Liu, Quanli; Wei, Xin; Wang, Wei; Zhou, Guoqiao; Han, Guangyi
2013-04-24
The technologies of sensors, pervasive computing, and intelligent information processing are widely used in Body Sensor Networks (BSNs), which are a branch of wireless sensor networks (WSNs). BSNs are playing an increasingly important role in the fields of medical treatment, social welfare and sports, and are changing the way humans use computers. Existing surveys have placed emphasis on the concept and architecture of BSNs, signal acquisition, context-aware sensing, and system technology, while this paper focuses on sensors, data fusion, and network communication. We introduce the research status of BSNs, analyse hotspots and future development trends, and discuss the major challenges and technical problems currently faced. Typical research projects and practical applications of BSNs are introduced as well. BSNs are progressing along the direction of multi-technology integration and intelligence. Although there are still many problems, the future of BSNs is fundamentally promising, profoundly changing human-machine relationships and improving the quality of people's lives.
Hernandez, Wilmar
2005-01-01
In the present paper, robust and optimal multivariable estimation techniques are used to estimate the response of both a wheel speed sensor and an accelerometer placed in a car under performance tests. In this case, the disturbances and noises corrupting the relevant information coming from the sensors' outputs are serious enough that their negative influence on the electrical systems degrades the general performance of the car. In short, this is a safety-related problem that deserves full attention. Therefore, in order to diminish the negative effects of the disturbances and noises on the car's electrical and electromechanical systems, an optimum observer is used. The experimental results show a satisfactory improvement in the signal-to-noise ratio of the relevant signals and demonstrate the importance of fusing several intelligent sensor design techniques when designing the intelligent sensors that today's cars need.
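The observer idea behind this work, predict the signal from a model and correct the prediction with the measurement innovation, can be shown in its simplest form. The sketch below is a fixed-gain scalar observer for a locally constant signal, a minimal stand-in for the paper's optimum multivariable observer (the gain value and signal model are illustrative assumptions).

```python
def fixed_gain_observer(measurements, gain=0.2, x0=0.0):
    """Fixed-gain (Luenberger-style) observer for a scalar signal modelled
    as locally constant; low gain attenuates measurement noise."""
    x, estimates = x0, []
    for z in measurements:
        x = x + gain * (z - x)   # correct the prediction with the innovation
        estimates.append(x)
    return estimates

# A step in the true signal is tracked gradually, smoothing out noise
out = fixed_gain_observer([1.0, 1.0, 1.0, 1.0], gain=0.2)  # -> ends at 0.5904
```

In the paper's setting the gain is not fixed by hand but chosen by optimal estimation, which is what yields the reported signal-to-noise improvement.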
A Survey of Body Sensor Networks
Lai, Xiaochen; Liu, Quanli; Wei, Xin; Wang, Wei; Zhou, Guoqiao; Han, Guangyi
2013-01-01
The technology of sensors, pervasive computing, and intelligent information processing is widely used in Body Sensor Networks (BSNs), a branch of wireless sensor networks (WSNs). BSNs play an increasingly important role in the fields of medical treatment, social welfare, and sports, and are changing the way humans use computers. Existing surveys have emphasized the concept and architecture of BSNs, signal acquisition, context-aware sensing, and system technology; this paper instead focuses on sensors, data fusion, and network communication. We introduce the research status of BSNs, analyze current hotspots and future development trends, and discuss the major challenges and technical problems currently faced. Typical research projects and practical applications of BSNs are introduced as well. BSNs are progressing in the direction of multi-technology integration and intelligence. Although many problems remain, the future of BSNs is fundamentally promising: they will profoundly change human-machine relationships and improve the quality of people's lives. PMID:23615581
Distributed data fusion across multiple hard and soft mobile sensor platforms
NASA Astrophysics Data System (ADS)
Sinsley, Gregory
One of the biggest challenges currently facing the robotics field is sensor data fusion. Unmanned robots carry many sophisticated sensors including visual and infrared cameras, radar, laser range finders, chemical sensors, accelerometers, gyros, and global positioning systems. By effectively fusing the data from these sensors, a robot would be able to form a coherent view of its world that could then be used to facilitate both autonomous and intelligent operation. Another distinct fusion problem is that of fusing data from teammates with data from onboard sensors. If an entire team of vehicles has the same worldview they will be able to cooperate much more effectively. Sharing worldviews is made even more difficult if the teammates have different sensor types. The final fusion challenge the robotics field faces is that of fusing data gathered by robots with data gathered by human teammates (soft sensors). Humans sense the world completely differently from robots, which makes this problem particularly difficult. The advantage of fusing data from humans is that it makes more information available to the entire team, thus helping each agent to make the best possible decisions. This thesis presents a system for fusing data from multiple unmanned aerial vehicles, unmanned ground vehicles, and human observers. The first issue this thesis addresses is that of centralized data fusion. This is a foundational data fusion issue, which has been very well studied. Important issues in centralized fusion include data association, classification, tracking, and robotics problems. Because these problems are so well studied, this thesis does not make any major contributions in this area, but does review it for completeness. The chapter on centralized fusion concludes with an example unmanned aerial vehicle surveillance problem that demonstrates many of the traditional fusion methods. The second problem this thesis addresses is that of distributed data fusion. 
Distributed data fusion is a younger field than centralized fusion. The main issues in distributed fusion that are addressed are distributed classification and distributed tracking. There are several well established methods for performing distributed fusion that are first reviewed. The chapter on distributed fusion concludes with a multiple unmanned vehicle collaborative test involving an unmanned aerial vehicle and an unmanned ground vehicle. The third issue this thesis addresses is that of soft sensor only data fusion. Soft-only fusion is a newer field than centralized or distributed hard sensor fusion. Because of the novelty of the field, the chapter on soft only fusion contains less background information and instead focuses on some new results in soft sensor data fusion. Specifically, it discusses a novel fuzzy logic based soft sensor data fusion method. This new method is tested using both simulations and field measurements. The biggest issue addressed in this thesis is that of combined hard and soft fusion. Fusion of hard and soft data is the newest area for research in the data fusion community; therefore, some of the largest theoretical contributions in this thesis are in the chapter on combined hard and soft fusion. This chapter presents a novel combined hard and soft data fusion method based on random set theory, which processes random set data using a particle filter. Furthermore, the particle filter is designed to be distributed across multiple robots and portable computers (used by human observers) so that there is no centralized failure point in the system. After laying out a theoretical groundwork for hard and soft sensor data fusion the thesis presents practical applications for hard and soft sensor data fusion in simulation. Through a series of three progressively more difficult simulations, some important hard and soft sensor data fusion capabilities are demonstrated. 
The first simulation demonstrates fusing data from a single soft sensor and a single hard sensor in order to track a car that could be driving normally or erratically. The second simulation adds the extra complication of classifying the type of target to the simulation. The third simulation uses multiple hard and soft sensors, with a limited field of view, to track a moving target and classify it as a friend, foe, or neutral. The final chapter builds on the work done in previous chapters by performing a field test of the algorithms for hard and soft sensor data fusion. The test utilizes an unmanned aerial vehicle, an unmanned ground vehicle, and a human observer with a laptop. The test is designed to mimic a collaborative human and robot search and rescue problem. This test makes some of the most important practical contributions of the thesis by showing that the algorithms that have been developed for hard and soft sensor data fusion are capable of running in real time on relatively simple hardware.
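The random-set particle filter at the core of the thesis is considerably more sophisticated, but the basic idea of weighting particles by both a precise hard-sensor likelihood and a coarse soft (human) report can be sketched as follows. The interval-style soft report, the 1-D state, and all parameter values are illustrative assumptions, not the thesis's formulation:

```python
import numpy as np

rng = np.random.default_rng(1)

def pf_step(particles, hard_z, soft_interval, motion_std=0.5, hard_std=1.0):
    """One bootstrap particle-filter step fusing a precise 'hard' sensor
    range reading with a coarse 'soft' human report of the form
    'the target is between a and b'."""
    # propagate particles through a random-walk motion model
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # hard-sensor likelihood: Gaussian around the measurement
    w = np.exp(-0.5 * ((particles - hard_z) / hard_std) ** 2)
    # soft-sensor likelihood: high inside the reported interval,
    # small but nonzero outside (human reports can be wrong)
    a, b = soft_interval
    w = w * np.where((particles >= a) & (particles <= b), 1.0, 0.05)
    w = w / w.sum()
    # multinomial resampling
    return rng.choice(particles, size=len(particles), p=w)

particles = rng.uniform(-10.0, 10.0, 2000)
for _ in range(10):
    particles = pf_step(particles, hard_z=3.0, soft_interval=(2.0, 5.0))
estimate = float(particles.mean())   # concentrates near the true position 3.0
```

Because the soft likelihood never hits exactly zero, a mistaken human report cannot annihilate the particle cloud, one motivation for fuzzy or random-set soft-sensor models.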
NASA Astrophysics Data System (ADS)
Paramanandham, Nirmala; Rajendiran, Kishore
2018-01-01
A novel image fusion technique is presented for integrating infrared and visible images. Integration of images from the same or various sensing modalities can deliver the required information that cannot be obtained by viewing the sensor outputs individually and consecutively. In this paper, a swarm intelligence based image fusion technique in the discrete cosine transform (DCT) domain is proposed for surveillance applications, which integrates the infrared image with the visible image to generate a single informative fused image. Particle swarm optimization (PSO) is used in the fusion process to obtain the optimized weighting factor. These optimized weighting factors are used for fusing the DCT coefficients of the visible and infrared images. The inverse DCT is applied to obtain the initial fused image. An enhanced fused image is obtained through adaptive histogram equalization for better visual understanding and target detection. The proposed framework is evaluated using quantitative metrics such as standard deviation, spatial frequency, entropy, and mean gradient. The experimental results demonstrate that the proposed algorithm outperforms many other state-of-the-art techniques reported in the literature.
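Because the 2-D DCT is linear, the coefficient-fusion step reduces to a weighted average of coefficient arrays. The sketch below uses a fixed weight standing in for the PSO-optimized weighting factor, and omits the PSO search and the adaptive histogram equalization stage; the synthetic images are illustrative:

```python
import numpy as np
from scipy.fft import dctn, idctn

def fuse_dct(visible, infrared, w=0.6):
    """Fuse two registered single-channel images by weighted averaging
    of their 2-D DCT coefficients. The paper optimizes w with PSO;
    a fixed w is used here purely for illustration."""
    cv = dctn(visible.astype(float), norm='ortho')
    ci = dctn(infrared.astype(float), norm='ortho')
    fused = idctn(w * cv + (1 - w) * ci, norm='ortho')
    return np.clip(fused, 0, 255)

vis = np.tile(np.linspace(0, 255, 64), (64, 1))   # synthetic visible gradient
ir = np.zeros((64, 64))
ir[24:40, 24:40] = 255.0                          # synthetic IR hot target
out = fuse_dct(vis, ir)
```

The fused image retains the IR hot spot against the visible-band gradient, so the center pixel ends up brighter than the dark corner.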
Data fusion for target tracking and classification with wireless sensor network
NASA Astrophysics Data System (ADS)
Pannetier, Benjamin; Doumerc, Robin; Moras, Julien; Dezert, Jean; Canevet, Loic
2016-10-01
In this paper, we address the problem of multiple ground target tracking and classification with information obtained from an unattended wireless sensor network. A multiple target tracking (MTT) algorithm, taking into account road and vegetation information, is proposed based on a centralized architecture. One of the key issues is how to adapt the classical MTT approach to satisfy embedded processing constraints. Based on track statistics, the classification algorithm uses estimated location, velocity, and acceleration to help classify targets. The algorithm enables tracking of humans and vehicles moving both on and off road. We integrate road or trail width and vegetation cover as constraints in the target motion models to improve the performance of constrained tracking with classification fusion. Our algorithm also employs different dynamic models to accommodate target maneuvers. The tracking and classification algorithms are integrated into an operational platform (the fusion node). In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deployed in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of the system is evaluated in a real exercise for an intelligence operation ("hunter hunt" scenario).
High Level Information Fusion (HLIF) with nested fusion loops
NASA Astrophysics Data System (ADS)
Woodley, Robert; Gosnell, Michael; Fischer, Amber
2013-05-01
Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analysis within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analysts' efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.
Markov logic network based complex event detection under uncertainty
NASA Astrophysics Data System (ADS)
Lu, Jingyang; Jia, Bin; Chen, Genshe; Chen, Hua-mei; Sullivan, Nichole; Pham, Khanh; Blasch, Erik
2018-05-01
In a cognitive reasoning system, the four-stage Observe-Orient-Decide-Act (OODA) reasoning loop is of interest. The OODA loop is essential for situational awareness, especially in heterogeneous data fusion. Cognitive reasoning for making decisions can take advantage of different formats of information, such as symbolic observations, various real-world sensor readings, or the relationships between intelligent modalities. A Markov Logic Network (MLN) provides a mathematically sound technique for representing and fusing data at multiple levels of abstraction, and across multiple intelligent sensors, to conduct complex decision-making tasks. In this paper, a scenario about vehicle interaction is investigated, in which uncertainty is taken into consideration since no systematic approach can perfectly characterize the complex event scenario. MLNs are applied to the terrestrial domain, where the dynamic features and relationships among vehicles are captured through multiple sensors and information sources while accounting for data uncertainty.
Intelligent imaging systems for automotive applications
NASA Astrophysics Data System (ADS)
Thompson, Chris; Huang, Yingping; Fu, Shan
2004-03-01
In common with many other application areas, visual signals are becoming an increasingly important information source for many automotive applications. For several years CCD cameras have been used as research tools for a range of automotive applications. Infrared cameras, RADAR and LIDAR are other types of imaging sensors that have also been widely investigated for use in cars. This paper will describe work in this field performed in C2VIP over the last decade - starting with Night Vision Systems and looking at various other Advanced Driver Assistance Systems. Emerging from this experience, we make the following observations which are crucial for "intelligent" imaging systems: 1. Careful arrangement of sensor array. 2. Dynamic-Self-Calibration. 3. Networking and processing. 4. Fusion with other imaging sensors, both at the image level and the feature level, provides much more flexibility and reliability in complex situations. We will discuss how these problems can be addressed and what are the outstanding issues.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sen, Satyabrata; Rao, Nageswara S; Wu, Qishi
There have been increasingly large deployments of radiation detection networks that require computationally fast algorithms to produce prompt results over ad-hoc sub-networks of mobile devices, such as smart-phones. These algorithms are in sharp contrast to complex network algorithms that require all measurements to be sent to powerful central servers. In this work, at individual sensors, we employ Wald-statistic-based detection algorithms, which are computationally very fast and are implemented as one of three Z-tests and four chi-square tests. At the fusion center, we apply K-out-of-N fusion to combine the sensors' hard decisions. We characterize the performance of the detection methods by deriving analytical expressions for the distributions of the underlying test statistics, and by analyzing the fusion performance in terms of K, N, and the false-alarm rates of individual detectors. We experimentally validate our methods using measurements from indoor and outdoor characterization tests of the Intelligence Radiation Sensors Systems (IRSS) program. In particular, utilizing the outdoor measurements, we construct two important real-life scenarios, boundary surveillance and portal monitoring, and present the results of our algorithms.
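A hedged sketch of the per-sensor Z-test and the K-out-of-N fusion rule described above; the background statistics and count data are synthetic, and the actual IRSS implementations differ in detail:

```python
import numpy as np

def z_test_detect(counts, bg_mean, bg_std, threshold=3.0):
    """Per-sensor Z-test: flag a source when the sample mean exceeds
    the background mean by `threshold` standard errors.
    Returns a hard 0/1 decision."""
    z = (np.mean(counts) - bg_mean) / (bg_std / np.sqrt(len(counts)))
    return int(z > threshold)

def k_out_of_n(decisions, k):
    """Fusion-center rule: declare detection when at least k of the
    n sensors' hard decisions are positive."""
    return int(sum(decisions) >= k)

rng = np.random.default_rng(2)
bg_mean, bg_std = 100.0, 10.0
# five sensors; the first three see an elevated count rate (shifts in cps)
readings = [rng.normal(bg_mean + d, bg_std, 50) for d in (15, 12, 14, 0, 0)]
decisions = [z_test_detect(r, bg_mean, bg_std) for r in readings]
alarm = k_out_of_n(decisions, k=3)
```

The false-alarm rate of the fused decision follows from the binomial tail over the individual detectors' false-alarm rates, which is the K/N trade-off the abstract analyzes.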
Scalable sensor management for automated fusion and tactical reconnaissance
NASA Astrophysics Data System (ADS)
Walls, Thomas J.; Wilson, Michael L.; Partridge, Darin C.; Haws, Jonathan R.; Jensen, Mark D.; Johnson, Troy R.; Petersen, Brad D.; Sullivan, Stephanie W.
2013-05-01
The capabilities of tactical intelligence, surveillance, and reconnaissance (ISR) payloads are expanding from single sensor imagers to integrated systems-of-systems architectures. Increasingly, these systems-of-systems include multiple sensing modalities that can act as force multipliers for the intelligence analyst. Currently, the separate sensing modalities operate largely independent of one another, providing a selection of operating modes but not an integrated intelligence product. We describe here a Sensor Management System (SMS) designed to provide a small, compact processing unit capable of managing multiple collaborative sensor systems on-board an aircraft. Its purpose is to increase sensor cooperation and collaboration to achieve intelligent data collection and exploitation. The SMS architecture is designed to be largely sensor and data agnostic and provide flexible networked access for both data providers and data consumers. It supports pre-planned and ad-hoc missions, with provisions for on-demand tasking and updates from users connected via data links. Management of sensors and user agents takes place over standard network protocols such that any number and combination of sensors and user agents, either on the local network or connected via data link, can register with the SMS at any time during the mission. The SMS provides control over sensor data collection to handle logging and routing of data products to subscribing user agents. It also supports the addition of algorithmic data processing agents for feature/target extraction and provides for subsequent cueing from one sensor to another. The SMS architecture was designed to scale from a small UAV carrying a limited number of payloads to an aircraft carrying a large number of payloads. The SMS system is STANAG 4575 compliant as a removable memory module (RMM) and can act as a vehicle specific module (VSM) to provide STANAG 4586 compliance (level-3 interoperability) to a non-compliant sensor system. 
The SMS architecture will be described and results from several flight tests and simulations will be shown.
NASA Astrophysics Data System (ADS)
McMullen, Sonya A. H.; Henderson, Troy; Ison, David
2017-05-01
The miniaturization of unmanned systems and spacecraft, as well as computing and sensor technologies, has opened new opportunities in the areas of remote sensing and multi-sensor data fusion for a variety of applications. Remote sensing and data fusion historically have been the purview of large government organizations, such as the Department of Defense (DoD), National Aeronautics and Space Administration (NASA), and National Geospatial-Intelligence Agency (NGA) due to the high cost and complexity of developing, fielding, and operating such systems. However, miniaturized computers with high capacity processing capabilities, small and affordable sensors, and emerging, commercially available platforms such as UAS and CubeSats to carry such sensors, have allowed for a vast range of novel applications. In order to leverage these developments, Embry-Riddle Aeronautical University (ERAU) has developed an advanced sensor and data fusion laboratory to research component capabilities and their employment on a wide-range of autonomous, robotic, and transportation systems. This lab is unique in several ways, for example, it provides a traditional campus laboratory for students and faculty to model and test sensors in a range of scenarios, process multi-sensor data sets (both simulated and experimental), and analyze results. Moreover, such allows for "virtual" modeling, testing, and teaching capability reaching beyond the physical confines of the facility for use among ERAU Worldwide students and faculty located around the globe. Although other institutions such as Georgia Institute of Technology, Lockheed Martin, University of Dayton, and University of Central Florida have optical sensor laboratories, the ERAU virtual concept is the first such lab to expand to multispectral sensors and data fusion, while focusing on the data collection and data products and not on the manufacturing aspect. 
Further, the initiative is a unique effort among Embry-Riddle faculty to develop multi-disciplinary, cross-campus research to facilitate faculty- and student-driven research. Specifically, the ERAU Worldwide Campus, with locations across the globe and delivering curricula online, will be leveraged to provide novel approaches to remote sensor experimentation and simulation. The purpose of this paper and presentation is to present this new laboratory, research, education, and collaboration process.
Design of an auto change mechanism and intelligent gripper for the space station
NASA Technical Reports Server (NTRS)
Dehoff, Paul H.; Naik, Dipak P.
1989-01-01
Robot gripping of objects in space is inherently demanding and dangerous, and nowhere is this more clearly reflected than in the design of the robot gripper. An object which escapes the gripper in a micro-g environment is launched, not dropped. To prevent this, the gripper must have sensors and signal processing to determine that the object is properly grasped, e.g., grip points and gripping forces, and, if not, to provide information to the robot to enable closed loop corrections to be made. The sensors and sensor strategies employed in the NASA/GSFC Split-Rail Parallel Gripper are described. Objectives and requirements are given, followed by the design of the sensor suite, sensor fusion techniques, and supporting algorithms.
A Hybrid Positioning Strategy for Vehicles in a Tunnel Based on RFID and In-Vehicle Sensors
Song, Xiang; Li, Xu; Tang, Wencheng; Zhang, Weigong; Li, Bin
2014-01-01
Many intelligent transportation system applications require accurate, reliable, and continuous vehicle positioning. How to achieve such positioning performance in extended GPS-denied environments such as tunnels is the main challenge for land vehicles. This paper proposes a hybrid multi-sensor fusion strategy for vehicle positioning in tunnels. First, the preliminary positioning algorithm is developed. The Radio Frequency Identification (RFID) technology is introduced to achieve preliminary positioning in the tunnel. The received signal strength (RSS) is used as an indicator to calculate the distances between the RFID tags and reader, and then a Least Mean Square (LMS) federated filter is designed to provide the preliminary position information for subsequent global fusion. Further, to improve the positioning performance in the tunnel, an interactive multiple model (IMM)-based global fusion algorithm is developed to fuse the data from preliminary positioning results and low-cost in-vehicle sensors, such as electronic compasses and wheel speed sensors. In the actual implementation of IMM, the strong tracking extended Kalman filter (STEKF) algorithm is designed to replace the conventional extended Kalman filter (EKF) to achieve model individual filtering. Finally, the proposed strategy is evaluated through experiments. The results validate the feasibility and effectiveness of the proposed strategy. PMID:25490581
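The RSS-to-distance step in the preliminary positioning stage typically inverts a log-distance path-loss model. A minimal sketch with illustrative, uncalibrated parameters (in practice the reference power and path-loss exponent are calibrated for the tunnel environment):

```python
def rss_to_distance(rss_dbm, rss_at_1m=-40.0, path_loss_exp=2.2):
    """Invert the log-distance path-loss model
        RSS(d) = RSS(1 m) - 10 * n * log10(d)
    to estimate the tag-reader distance d in metres from a received
    signal strength reading. Parameter values are hypothetical."""
    return 10 ** ((rss_at_1m - rss_dbm) / (10 * path_loss_exp))

# a reading 22 dB below the 1 m reference with n = 2.2 maps to 10 m
d = rss_to_distance(-62.0)
```

Distances estimated this way from several tags are what the LMS federated filter then combines into the preliminary position fix.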
Dynamic multisensor fusion for mobile robot navigation in an indoor environment
NASA Astrophysics Data System (ADS)
Jin, Taeseok; Lee, Jang-Myung; Luk, Bing L.; Tso, Shiu K.
2001-10-01
This study is a preliminary step toward developing a multi-purpose, robust autonomous carrier mobile robot to transport trolleys or heavy goods and serve as a robotic nursing assistant in hospital wards. The aim of this paper is to present the use of multi-sensor data fusion, combining sonar, a CCD camera, and IR sensors, for map-building mobile robot navigation, and to present an experimental mobile robot designed to operate autonomously within both indoor and outdoor environments. Smart sensory systems are crucial for successful autonomous systems. We explain the robot system architecture designed and implemented in this study and give only a short review of existing techniques, since several recent thorough books and review papers cover this area; instead we focus on the main results relevant to the intelligent service robot project at the Centre for Intelligent Design, Automation & Manufacturing (CIDAM). We conclude by discussing some possible future extensions of the project. The paper first deals with the general principles of the navigation and guidance architecture, and then with the detailed functions of environment map updating, obstacle detection, and motion assessment, together with the first results from the simulation runs.
Shamwell, E Jared; Nothwang, William D; Perlis, Donald
2018-05-04
Aimed at improving size, weight, and power (SWaP)-constrained robotic vision-aided state estimation, we describe our unsupervised, deep convolutional-deconvolutional sensor fusion network, Multi-Hypothesis DeepEfference (MHDE). MHDE learns to intelligently combine noisy heterogeneous sensor data to predict several probable hypotheses for the dense, pixel-level correspondence between a source image and an unseen target image. We show how our multi-hypothesis formulation provides increased robustness against dynamic, heteroscedastic sensor and motion noise by computing hypothesis image mappings and predictions at 76–357 Hz depending on the number of hypotheses being generated. MHDE fuses noisy, heterogeneous sensory inputs using two parallel, inter-connected architectural pathways and n (1–20 in this work) multi-hypothesis generating sub-pathways to produce n global correspondence estimates between a source and a target image. We evaluated MHDE on the KITTI Odometry dataset and benchmarked it against the vision-only DeepMatching and Deformable Spatial Pyramids algorithms and were able to demonstrate a significant runtime decrease and a performance increase compared to the next-best performing method.
Intelligent data processing of an ultrasonic sensor system for pattern recognition improvements
NASA Astrophysics Data System (ADS)
Na, Seung You; Park, Min-Sang; Hwang, Won-Gul; Kee, Chang-Doo
1999-05-01
Though conventional time-of-flight ultrasonic sensor systems are popular due to their low cost and simplicity, their usage is rather narrowly restricted to object detection and distance readings. There is a strong need to enlarge the amount of environmental information available to mobile applications in order to provide intelligent autonomy. Wide sectors of such neighboring-object recognition problems can be satisfactorily handled with coarse vision data such as sonar maps instead of accurate laser or optic measurements. For object pattern recognition, ultrasonic sensors have the inherent shortcomings of poor directionality and specularity, which result in low spatial resolution and indistinctness of object patterns. To resolve these problems, arrays with increased numbers of sensor elements have been used for large objects. In this paper we propose a sensor array system with improved recognition capability, using electronic circuits accompanying the sensor array and neuro-fuzzy processing for data fusion. The circuit changes the transmitter output voltages of the array elements in several steps. Relying upon the known sensor characteristics, a set of different return signals from neighboring sensors is manipulated to provide enhanced pattern recognition of the inclination angle, size, and shift, as well as the distance, of objects. The results show improved resolution of the measurements for smaller targets.
Improving Car Navigation with a Vision-Based System
NASA Astrophysics Data System (ADS)
Kim, H.; Choi, K.; Lee, I.
2015-08-01
The real-time acquisition of accurate positions is very important for the proper operation of driver assistance systems and autonomous vehicles. Since current systems mostly depend on a GPS and map-matching technique, they show poor and unreliable performance in areas where GPS signals are blocked or weak. In this study, we propose a vision-oriented car navigation method based on sensor fusion of a GPS and in-vehicle sensors. We employ a single photo resection process to derive the position and attitude of the camera, and thus those of the car. These image georeferencing results are combined with other sensory data under the sensor fusion framework for more accurate estimation of the positions using an extended Kalman filter. The proposed system estimated the positions with an accuracy of 15 m even though GPS signals were unavailable during the entire 15-minute test drive. The proposed vision-based system can be effectively utilized for the low-cost yet highly accurate and reliable navigation systems required for intelligent or autonomous vehicles.
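For a single scalar position, the Kalman measurement update that blends an image-georeferenced position with another sensor's estimate is equivalent to inverse-variance weighting. A simplified sketch (the paper's full EKF also carries attitude and vehicle-dynamics states; all numbers below are illustrative):

```python
def fuse_positions(z_gps, var_gps, z_vision, var_vision):
    """Minimum-variance fusion of two independent scalar position
    estimates (e.g. a GPS fix and an image-georeferenced position).
    Weights are the inverse measurement variances."""
    w_gps = 1.0 / var_gps
    w_vis = 1.0 / var_vision
    fused = (w_gps * z_gps + w_vis * z_vision) / (w_gps + w_vis)
    fused_var = 1.0 / (w_gps + w_vis)   # never worse than the best input
    return fused, fused_var

# the vision estimate (var 4 m^2) dominates a degraded GPS fix (var 100 m^2)
pos, var = fuse_positions(z_gps=105.0, var_gps=100.0,
                          z_vision=100.0, var_vision=4.0)
```

The fused variance is always smaller than either input variance, which is why adding even a modest vision sensor helps most exactly where GPS quality collapses.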
Design and Implementation of a Smart Home System Using Multisensor Data Fusion Technology.
Hsu, Yu-Liang; Chou, Po-Huan; Chang, Hsing-Cheng; Lin, Shyan-Lung; Yang, Shih-Chin; Su, Heng-Yi; Chang, Chih-Chien; Cheng, Yuan-Sheng; Kuo, Yu-Chen
2017-07-15
This paper aims to develop a multisensor data fusion technology-based smart home system by integrating wearable intelligent technology, artificial intelligence, and sensor fusion technology. We have developed the following three systems to create an intelligent smart home environment: (1) a wearable motion sensing device to be placed on residents' wrists and its corresponding 3D gesture recognition algorithm to implement a convenient automated household appliance control system; (2) a wearable motion sensing device mounted on a resident's feet and its indoor positioning algorithm to realize an effective indoor pedestrian navigation system for smart energy management; (3) a multisensor circuit module and an intelligent fire detection and alarm algorithm to realize a home safety and fire detection system. In addition, an intelligent monitoring interface is developed to provide real-time information about the smart home system, such as environmental temperatures, CO concentrations, communicative environmental alarms, household appliance status, human motion signals, and the results of gesture recognition and indoor positioning. Furthermore, an experimental testbed for validating the effectiveness and feasibility of the smart home system was built and verified experimentally. The results showed that the 3D gesture recognition algorithm could achieve recognition rates for automated household appliance control of 92.0%, 94.8%, 95.3%, and 87.7% under the 2-fold cross-validation, 5-fold cross-validation, 10-fold cross-validation, and leave-one-subject-out cross-validation strategies, respectively. For indoor positioning and smart energy management, the distance accuracy and positioning accuracy were around 0.22% and 3.36% of the total traveled distance in the indoor environment. For home safety and fire detection, the classification rate achieved 98.81% accuracy in determining the conditions of the indoor living environment.
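The k-fold evaluation protocol used for the gesture recognizer can be sketched generically. The nearest-class-mean classifier below is a hypothetical stand-in for the paper's 3D gesture recognition algorithm, and the 1-D synthetic data is purely illustrative:

```python
import numpy as np

def k_fold_accuracy(features, labels, k, train_fn, seed=3):
    """Average accuracy over k folds: each fold is held out once for
    testing while the remaining folds train the classifier, mirroring
    the paper's 2/5/10-fold evaluation."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(labels))
    folds = np.array_split(idx, k)
    accs = []
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        predict = train_fn(features[train], labels[train])
        accs.append(np.mean(predict(features[test]) == labels[test]))
    return float(np.mean(accs))

def train_ncm(x, y):
    """Toy nearest-class-mean classifier (hypothetical stand-in)."""
    means = {c: x[y == c].mean() for c in np.unique(y)}
    return lambda q: np.array([min(means, key=lambda c: abs(v - means[c]))
                               for v in q])

rng = np.random.default_rng(5)
x = np.concatenate([rng.normal(0.0, 0.3, 40), rng.normal(3.0, 0.3, 40)])
y = np.array([0] * 40 + [1] * 40)
acc = k_fold_accuracy(x, y, k=5, train_fn=train_ncm)  # classes well separated
```

Leave-one-subject-out evaluation is the same loop with folds defined by subject identity rather than random partition, which is why it usually yields the lowest (most honest) recognition rate.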
Design and Implementation of a Smart Home System Using Multisensor Data Fusion Technology
Chou, Po-Huan; Chang, Hsing-Cheng; Lin, Shyan-Lung; Yang, Shih-Chin; Su, Heng-Yi; Chang, Chih-Chien; Cheng, Yuan-Sheng; Kuo, Yu-Chen
2017-01-01
This paper aims to develop a multisensor data fusion technology-based smart home system by integrating wearable intelligent technology, artificial intelligence, and sensor fusion technology. We have developed the following three systems to create an intelligent smart home environment: (1) a wearable motion sensing device to be placed on residents’ wrists and its corresponding 3D gesture recognition algorithm to implement a convenient automated household appliance control system; (2) a wearable motion sensing device mounted on a resident’s feet and its indoor positioning algorithm to realize an effective indoor pedestrian navigation system for smart energy management; (3) a multisensor circuit module and an intelligent fire detection and alarm algorithm to realize a home safety and fire detection system. In addition, an intelligent monitoring interface is developed to provide in real-time information about the smart home system, such as environmental temperatures, CO concentrations, communicative environmental alarms, household appliance status, human motion signals, and the results of gesture recognition and indoor positioning. Furthermore, an experimental testbed for validating the effectiveness and feasibility of the smart home system was built and verified experimentally. The results showed that the 3D gesture recognition algorithm could achieve recognition rates for automated household appliance control of 92.0%, 94.8%, 95.3%, and 87.7% by the 2-fold cross-validation, 5-fold cross-validation, 10-fold cross-validation, and leave-one-subject-out cross-validation strategies. For indoor positioning and smart energy management, the distance accuracy and positioning accuracy were around 0.22% and 3.36% of the total traveled distance in the indoor environment. For home safety and fire detection, the classification rate achieved 98.81% accuracy for determining the conditions of the indoor living environment. PMID:28714884
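The cross-validation protocols used above to score the gesture recognizer can be sketched generically in a few lines. This is an illustration, not the authors' code; `train_fn` and `predict_fn` are hypothetical placeholders standing in for any classifier's fit and predict routines.

```python
import random

def k_fold_cross_validation(samples, labels, k, train_fn, predict_fn, seed=0):
    """Estimate recognition accuracy by k-fold cross-validation:
    shuffle once, split into k folds, train on k-1 folds, test on the held-out fold."""
    idx = list(range(len(samples)))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]      # k disjoint index folds
    accuracies = []
    for f in range(k):
        test_idx = set(folds[f])
        train = [(samples[i], labels[i]) for i in idx if i not in test_idx]
        model = train_fn(train)
        correct = sum(predict_fn(model, samples[i]) == labels[i] for i in folds[f])
        accuracies.append(correct / len(folds[f]))
    return sum(accuracies) / k                  # mean held-out accuracy
```

The leave-one-subject-out variant reported in the paper works the same way, except folds are grouped by subject rather than drawn at random.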
NASA Astrophysics Data System (ADS)
Yue, Haosong; Chen, Weihai; Wu, Xingming; Wang, Jianhua
2016-03-01
Three-dimensional (3-D) simultaneous localization and mapping (SLAM) is a crucial technique for intelligent robots to navigate autonomously and execute complex tasks. It can also be applied to shape measurement, reverse engineering, and many other scientific or engineering fields. A widespread SLAM algorithm, named KinectFusion, performs well in environments with complex shapes. However, it cannot handle translation uncertainties well in highly structured scenes. This paper improves the KinectFusion algorithm and makes it competent in both structured and unstructured environments. 3-D line features are first extracted from both the color and depth data captured by a Kinect sensor. Then the lines in the current data frame are matched with the lines extracted from the entire constructed world model. Finally, we fuse the distance errors of these line pairs into the standard KinectFusion framework and estimate sensor poses using an iterative closest point (ICP)-based algorithm. Comparative experiments with the KinectFusion algorithm and one state-of-the-art method were carried out in a corridor scene. The experimental results demonstrate that, after our improvement, the KinectFusion algorithm can also be applied to structured environments and has higher accuracy. Experiments on two open-access datasets further validated our improvements.
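The pose-estimation core of an ICP-style pipeline like the one above reduces, at each iteration, to a best-fit rigid transform between matched point sets. A minimal sketch of that least-squares step (the classical Kabsch/SVD solution), offered as a generic illustration rather than the paper's implementation:

```python
import numpy as np

def estimate_rigid_transform(P, Q):
    """One ICP alignment step: find rotation R and translation t minimizing
    sum ||R p_i + t - q_i||^2 for matched point pairs (P[i], Q[i]), Nx3 arrays."""
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)          # centroids
    H = (P - cP).T @ (Q - cQ)                        # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t
```

A full ICP loop alternates this step with re-matching points (or, in the paper's variant, line pairs) against the world model until the pose converges.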
NASA Astrophysics Data System (ADS)
Sabeur, Zoheir; Middleton, Stuart; Veres, Galina; Zlatev, Zlatko; Salvo, Nicola
2010-05-01
The advancement of smart sensor technology in the last few years has led to an increase in the deployment of affordable sensors for monitoring the environment around Europe. This is generating large amounts of sensor observation information, inevitably leading to problems of how to manage large volumes of data as well as how to make sense of the data for decision-making. In addition, the various European Directives (Water Framework Directive, Bathing Water Directive, Habitats Directive, etc.) which regulate human activities in the environment, together with the INSPIRE Directive on spatial information management, have implicitly led the designated environment agencies and authorities of the European Member States to put in place new sensor monitoring infrastructure and share information about environmental regions under their statutory responsibilities. They will need to work across borders and collectively reach environmental quality standards. They will also need to report regularly to the EC on the quality of the environments for which they are responsible and make such information accessible to members of the public. In recent years, early pioneering work on the design of service-oriented architectures using sensor networks has been achieved. Information web-services infrastructure using existing data catalogues and web-GIS map services can now be enriched with the deployment of new sensor observation, data fusion, and modelling services using OGC standards. The new services, which describe sensor observations and intelligent data processing using data fusion techniques, can now be implemented to provide added-value information with spatio-temporal uncertainties to the next generation of decision-support service systems. These decision-support service systems have become key to implement across Europe in order to comply with EU environmental regulations and INSPIRE.
In this paper, data fusion services using OGC standards with sensor observation data streams are described in the context of a geo-distributed service infrastructure specialising in multiple environmental risk management and decision-support. The sensor data fusion services are deployed and validated in two use cases, concerned respectively with: 1) microbial risk forecasts in bathing waters; and 2) geohazards in urban zones during underground tunnelling activities. This research was initiated in the SANY Integrated Project (www.sany-ip.org) and funded by the European Commission under the 6th Framework Programme.
INL Control System Situational Awareness Technology Annual Report 2012
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon Rueff; Bryce Wheeler; Todd Vollmer
The overall goal of this project is to develop an interoperable set of tools to provide a comprehensive, consistent implementation of cyber security and overall situational awareness of control and sensor network implementations. The operation and interoperability of these tools will fill voids in current technological offerings and address issues that remain an impediment to the security of control systems. This report provides an FY 2012 update on the Sophia, Mesh Mapper, Intelligent Cyber Sensor, and Data Fusion projects with respect to the year-two tasks and annual reporting requirements of the INL Control System Situational Awareness Technology report (July 2010).
Image Registration of High-Resolution Uav Data: the New Hypare Algorithm
NASA Astrophysics Data System (ADS)
Bahr, T.; Jin, X.; Lasica, R.; Giessel, D.
2013-08-01
Unmanned aerial vehicles play an important role in present-day civilian and military intelligence. Equipped with a variety of sensors, such as SAR imaging modes and E/O and IR sensor technology, they are, owing to their agility, suitable for many applications. Hence the necessity arises to use fusion technologies and to develop them continuously. Here an exact image-to-image registration is essential. It serves as the basis for important image processing operations such as georeferencing, change detection, and data fusion. Therefore we developed the Hybrid Powered Auto-Registration Engine (HyPARE). HyPARE combines all available spatial reference information with a number of image registration approaches to improve the accuracy, performance, and automation of tie point generation and image registration. We demonstrate this approach by the registration of 39 still images from a high-resolution image stream, acquired with an Aeryon Photo3S™ camera on an Aeryon Scout micro-UAV™.
Li, Shaobo; Liu, Guokai; Tang, Xianghong; Lu, Jianguang; Hu, Jianjun
2017-07-28
Intelligent machine health monitoring and fault diagnosis are becoming increasingly important for modern manufacturing industries. Current fault diagnosis approaches mostly depend on expert-designed features for building prediction models. In this paper, we proposed IDSCNN, a novel bearing fault diagnosis algorithm based on ensemble deep convolutional neural networks and an improved Dempster-Shafer theory based evidence fusion. The convolutional neural networks take the root mean square (RMS) maps from the FFT (Fast Fourier Transformation) features of the vibration signals from two sensors as inputs. The improved D-S evidence theory is implemented via distance matrix from evidences and modified Gini Index. Extensive evaluations of the IDSCNN on the Case Western Reserve Dataset showed that our IDSCNN algorithm can achieve better fault diagnosis performance than existing machine learning methods by fusing complementary or conflicting evidences from different models and sensors and adapting to different load conditions.
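The evidence-fusion stage of IDSCNN builds on Dempster's classical rule of combination. A minimal sketch of the classical rule follows, without the paper's distance-matrix and Gini-index modifications; mass functions are dictionaries mapping focal elements (frozensets of hypotheses) to belief masses.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic probability assignments.
    Masses of intersecting focal elements multiply and accumulate; mass assigned
    to empty intersections is conflict, normalized away at the end."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence cannot be combined")
    norm = 1.0 - conflict
    return {k: v / norm for k, v in combined.items()}
```

In the bearing-diagnosis setting, each CNN's softmax output over fault classes would be converted into such a mass function before combination; that conversion step is where the paper's improvements apply.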
Intelligent Melting Probes - How to Make the Most out of our Data
NASA Astrophysics Data System (ADS)
Kowalski, J.; Clemens, J.; Chen, S.; Schüller, K.
2016-12-01
Direct exploration of glaciers, ice sheets, or subglacial environments poses a big challenge. Different technological solutions have been proposed and deployed in recent decades, examples being hot-water drills and different melting probe designs. Most recent engineering concepts integrate a variety of on-board sensors, e.g. temperature sensors, pressure sensors, or an inertial measurement unit. Not only do individual sensors provide valuable insight into the current state of the probe; analyzed collectively, they often also contain a wealth of additional information. This quite naturally raises the question: how can we make the most out of our data? We find that it is necessary to implement intelligent data integration and sensor fusion strategies to retrieve the maximum amount of information from the observations. In this contribution, we are inspired by the engineering design of the IceMole, a minimally invasive, steerable melting probe. We discuss two sensor integration strategies relevant to IceMole melting scenarios. First, we present a multi-sensor fusion approach to accurately retrieve subsurface position and attitude information. It uses an extended Kalman filter to integrate data from an on-board IMU, a differential magnetometer system, the screw feed, and the travel times of acoustic signals originating from emitters at the ice surface. Furthermore, an evidential mapping algorithm estimates a map of the environment from data of ultrasound phased arrays in the probe's head. Various results from tests in a swimming pool and in glacier ice will be shown during the presentation. A second block considers the fluid-dynamical state in the melting channel, as well as the ambient cryo-environment. It is devoted to retrieving information from on-board temperature and pressure sensors. Here, we report on preliminary results from re-analysing past field test data.
Knowledge from integrated sensor data likewise provides valuable input for the parameter identification and verification of data-based models. Because such models do not rely on explicit physical laws, the approach remains applicable with suitable modifications; it is highly transferable and has not yet been exploited rigorously. This could be a potential future direction.
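The multi-sensor fusion described for the IceMole uses an extended Kalman filter over a full position/attitude state; the core idea, weighting each reading by its uncertainty, is already visible in the scalar linear case. A minimal sketch under that simplification (the actual EKF fuses IMU, magnetometer, screw feed, and acoustic travel times, which this toy does not attempt):

```python
def kalman_update(x, P, z, R):
    """Scalar Kalman measurement update: fuse state estimate (mean x, variance P)
    with a sensor reading z of measurement variance R."""
    K = P / (P + R)                 # Kalman gain: trust ratio of prior vs sensor
    x_new = x + K * (z - x)         # pull estimate toward the reading
    P_new = (1.0 - K) * P           # uncertainty shrinks after fusing evidence
    return x_new, P_new

def fuse_sensors(x0, P0, readings):
    """Sequentially fuse readings [(z, R), ...] from multiple sensors."""
    x, P = x0, P0
    for z, R in readings:
        x, P = kalman_update(x, P, z, R)
    return x, P
```

With a diffuse prior and two equally trusted sensors, the fused estimate lands at their average, and the posterior variance is roughly halved, which is the qualitative payoff of fusing redundant sensors.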
NASA Technical Reports Server (NTRS)
Panangadan, Anand; Monacos, Steve; Burleigh, Scott; Joswig, Joseph; James, Mark; Chow, Edward
2012-01-01
In this paper, we describe the architecture of both the PATS and SAP systems and how these two systems interoperate with each other forming a unified capability for deploying intelligence in hostile environments with the objective of providing actionable situational awareness of individuals. The SAP system works in concert with the UICDS information sharing middleware to provide data fusion from multiple sources. UICDS can then publish the sensor data using the OGC's Web Mapping Service, Web Feature Service, and Sensor Observation Service standards. The system described in the paper is able to integrate a spatially distributed sensor system, operating without the benefit of the Web infrastructure, with a remote monitoring and control system that is equipped to take advantage of SWE.
Semiotic foundation for multisensor-multilook fusion
NASA Astrophysics Data System (ADS)
Myler, Harley R.
1998-07-01
This paper explores the application of semiotic principles to the design of a multisensor-multilook fusion system. Semiotics is an approach to analysis that attempts to process media in a unified way using qualitative rather than quantitative methods. The term semiotic refers to signs, or signatory data that encapsulates information. Semiotic analysis involves the extraction of signs from information sources and the subsequent processing of the signs into meaningful interpretations of the information content of the source. The multisensor fusion problem, predicated on a semiotic system structure and incorporating semiotic analysis techniques, is explored, along with the design of a multisensor system as an information fusion system. Semiotic analysis opens the possibility of using non-traditional sensor sources and modalities in the fusion process, such as verbal and textual intelligence derived from human observers. Examples of how multisensor/multimodality data might be analyzed semiotically are shown, and how a semiotic system for multisensor fusion could be realized is outlined. The architecture of a semiotic multisensor fusion processor that can accept situational awareness data is described, although an implementation has not yet been constructed.
NASA Astrophysics Data System (ADS)
Câmara, F.; Oliveira, J.; Hormigo, T.; Araújo, J.; Ribeiro, R.; Falcão, A.; Gomes, M.; Dubois-Matra, O.; Vijendran, S.
2015-06-01
This paper discusses the design and evaluation of data fusion strategies to perform tiered fusion of several heterogeneous sensors and a priori data. The aim is to increase robustness and performance of hazard detection and avoidance systems, while enabling safe planetary and small body landings anytime, anywhere. The focus is on Mars and asteroid landing mission scenarios and three distinct data fusion algorithms are introduced and compared. The first algorithm consists of a hybrid camera-LIDAR hazard detection and avoidance system, the H2DAS, in which data fusion is performed at both sensor-level data (reconstruction of the point cloud obtained with a scanning LIDAR using the navigation motion states and correcting the image for motion compensation using IMU data), feature-level data (concatenation of multiple digital elevation maps, obtained from consecutive LIDAR images, to achieve higher accuracy and resolution maps while enabling relative positioning) as well as decision-level data (fusing hazard maps from multiple sensors onto a single image space, with a single grid orientation and spacing). The second method presented is a hybrid reasoning fusion, the HRF, in which innovative algorithms replace the decision-level functions of the previous method, by combining three different reasoning engines—a fuzzy reasoning engine, a probabilistic reasoning engine and an evidential reasoning engine—to produce safety maps. Finally, the third method presented is called Intelligent Planetary Site Selection, the IPSIS, an innovative multi-criteria, dynamic decision-level data fusion algorithm that takes into account historical information for the selection of landing sites and a piloting function with a non-exhaustive landing site search capability, i.e., capable of finding local optima by searching a reduced set of global maps. All the discussed data fusion strategies and algorithms have been integrated, verified and validated in a closed-loop simulation environment. 
Monte Carlo simulation campaigns were performed for the algorithms performance assessment and benchmarking. The simulations results comprise the landing phases of Mars and Phobos landing mission scenarios.
Object Detection and Classification by Decision-Level Fusion for Intelligent Vehicle Systems.
Oh, Sang-Il; Kang, Hang-Bong
2017-01-22
To understand driving environments effectively, it is important to achieve accurate detection and classification of objects detected by sensor-based intelligent vehicle systems. Object detection is performed for the localization of objects, whereas object classification recognizes object classes from the detected object regions. For accurate object detection and classification, fusing multiple sensor information into a key component of the representation and perception processes is necessary. In this paper, we propose a new object-detection and classification method using decision-level fusion. We fuse the classification outputs from independent unary classifiers on 3D point clouds and image data using a convolutional neural network (CNN). The unary classifiers for the two sensors are CNNs with five layers, which use more than two pre-trained convolutional layers to consider local-to-global features as the data representation. To represent data using convolutional layers, we apply region of interest (ROI) pooling to the outputs of each layer on the object candidate regions generated by object proposal generation, realizing color flattening and semantic grouping for charge-coupled device and Light Detection And Ranging (LiDAR) sensors. We evaluate our proposed method on the KITTI benchmark dataset to detect and classify three object classes: cars, pedestrians, and cyclists. The evaluation results show that the proposed method achieves better performance than previous methods. Our proposed method extracted approximately 500 proposals on a 1226 × 370 image, whereas the original selective search method extracted approximately 10^6 × n proposals. We obtained classification performance of 77.72% mean average precision over all classes at the moderate detection level of the KITTI benchmark dataset.
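Decision-level fusion of the kind described, combining per-class confidences from independent per-sensor classifiers, can be sketched generically as a weighted score average followed by an argmax. The paper's CNN-based fusion is more involved; the class names and equal weights here are illustrative only.

```python
def decision_level_fusion(score_maps, weights=None):
    """Fuse per-class confidence scores from independent unary classifiers
    (e.g. one per sensor) by weighted averaging, then pick the winning class.
    score_maps: list of dicts, each mapping class name -> confidence in [0, 1]."""
    classes = score_maps[0].keys()
    if weights is None:
        weights = [1.0 / len(score_maps)] * len(score_maps)
    fused = {c: sum(w * m[c] for w, m in zip(weights, score_maps)) for c in classes}
    return max(fused, key=fused.get), fused
```

In practice the weights would reflect per-sensor reliability (for instance, down-weighting the camera at night), which is one reason learned fusion such as the paper's CNN outperforms a fixed average.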
Appearance-based multimodal human tracking and identification for healthcare in the digital home.
Yang, Mau-Tsuen; Huang, Shen-Yen
2014-08-05
There is an urgent need for intelligent home surveillance systems to provide home security, monitor health conditions, and detect emergencies of family members. One of the fundamental problems to realize the power of these intelligent services is how to detect, track, and identify people at home. Compared to RFID tags that need to be worn all the time, vision-based sensors provide a natural and nonintrusive solution. Observing that body appearance and body build, as well as face, provide valuable cues for human identification, we model and record multi-view faces, full-body colors and shapes of family members in an appearance database by using two Kinects located at a home's entrance. Then the Kinects and another set of color cameras installed in other parts of the house are used to detect, track, and identify people by matching the captured color images with the registered templates in the appearance database. People are detected and tracked by multisensor fusion (Kinects and color cameras) using a Kalman filter that can handle duplicate or partial measurements. People are identified by multimodal fusion (face, body appearance, and silhouette) using a track-based majority voting. Moreover, the appearance-based human detection, tracking, and identification modules can cooperate seamlessly and benefit from each other. Experimental results show the effectiveness of the human tracking across multiple sensors and human identification considering the information of multi-view faces, full-body clothes, and silhouettes. The proposed home surveillance system can be applied to domestic applications in digital home security and intelligent healthcare.
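The track-based majority voting used above for identification can be sketched simply: each frame's multimodal match (face, body appearance, silhouette) casts one identity vote, and the track takes the modal identity. A generic illustration, not the authors' implementation:

```python
from collections import Counter

def track_identity(frame_votes):
    """Track-based majority voting: each tracked frame contributes one identity
    vote; return the winning identity and its share of the votes."""
    counts = Counter(frame_votes)
    label, n = counts.most_common(1)[0]
    return label, n / len(frame_votes)
```

Voting over a whole track rather than per frame is what makes the identification robust to occasional misclassified frames, e.g. when a face is briefly occluded.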
NASA Astrophysics Data System (ADS)
Fink, Wolfgang; George, Thomas; Tarbell, Mark A.
2007-04-01
Robotic reconnaissance operations are called for in extreme environments: not only in space, including planetary atmospheres, surfaces, and subsurfaces, but also in potentially hazardous or inaccessible operational areas on Earth, such as mine fields, battlefield environments, enemy-occupied territories, terrorist-infiltrated environments, or areas that have been exposed to biochemical agents or radiation. Real-time reconnaissance enables the identification and characterization of transient events. A fundamentally new mission concept for tier-scalable reconnaissance of operational areas, originated by Fink et al., is aimed at replacing the engineering- and safety-constrained mission designs of the past. The tier-scalable paradigm integrates multi-tier (orbit, atmosphere, surface/subsurface) and multi-agent (satellite, UAV/blimp, surface/subsurface sensing platforms) hierarchical mission architectures, introducing not only mission redundancy and safety, but also enabling and optimizing intelligent, less constrained, and distributed reconnaissance in real time. Given the mass, size, and power constraints faced by such a multi-platform approach, this is an ideal application scenario for a diverse set of MEMS sensors. To support such mission architectures, a high degree of operational autonomy is required. Essential elements of such operational autonomy are: (1) automatic mapping of an operational area from different vantage points (including vehicle health monitoring); (2) automatic feature extraction and target/region-of-interest identification within the mapped operational area; and (3) automatic target prioritization for close-up examination. These requirements imply the optimal deployment of MEMS sensors and sensor platforms, sensor fusion, and sensor interoperability.
NASA Astrophysics Data System (ADS)
Sabeur, Z. A.; Wächter, J.; Middleton, S. E.; Zlatev, Z.; Häner, R.; Hammitzsch, M.; Loewe, P.
2012-04-01
The intelligent management of large volumes of environmental monitoring data for early tsunami warning requires the deployment of a robust and scalable service-oriented infrastructure that is supported by an agile knowledge base for critical decision-support. In the TRIDEC project (TRIDEC 2010-2013), a sensor observation service bus of the TRIDEC system is being developed for the advancement of complex tsunami event processing and management. Further, a dedicated TRIDEC system knowledge base is being implemented to enable on-demand access to semantically rich OGC SWE-compliant hydrodynamic observations and operationally oriented meta-information for multiple subscribers. TRIDEC decision support requires a scalable and agile real-time processing architecture which enables fast response to evolving subscriber requirements as a tsunami crisis develops. This is also achieved with the support of intelligent processing services which specialise in multi-level fusion methods with relevance feedback and deep learning. The TRIDEC knowledge-base development work, coupled with that of the generic sensor bus platform, will be presented to demonstrate advanced decision-support with situation awareness in the context of tsunami early warning and crisis management.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chandrasekhar Potluri,; Madhavi Anugolu; Marco P. Schoen
2013-08-01
In this work, an array of three surface electromyography (sEMG) sensors is used to acquire muscle extension and contraction signals from 18 healthy test subjects. The skeletal muscle force is estimated from the acquired sEMG signals using a nonlinear Wiener-Hammerstein model relating the two signals in a dynamic fashion. The model is obtained using a System Identification (SI) algorithm. The obtained force models for each sensor are fused using a proposed fuzzy logic concept with the intent to improve the force estimation accuracy and resilience to sensor failure or misalignment. For the fuzzy logic inference system, the sEMG entropy, the relative error, and the correlation of the force signals are considered for defining the membership functions. The proposed fusion algorithm yields an average of 92.49% correlation between the actual force and the overall estimated force output. In addition, the proposed fusion-based approach is implemented on a test platform. Experiments indicate an improvement in finger/hand force estimation.
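The fuzzy fusion of per-sensor force models can be illustrated by a degenerate case in which the membership functions collapse to normalized per-sensor quality weights (here, each model's correlation to the measured force). This is a toy sketch, not the paper's three-input fuzzy inference system:

```python
def fuzzy_fuse_forces(estimates, correlations):
    """Fuse per-sensor force estimates by weights derived from each model's
    correlation score, a stand-in for full fuzzy membership evaluation over
    entropy, relative error, and correlation."""
    total = sum(correlations)
    return sum(c / total * f for c, f in zip(correlations, estimates))
```

The point of the weighting is resilience: a misaligned or failed sensor whose model correlates poorly with the true force is automatically down-weighted rather than corrupting the fused estimate.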
NASA Astrophysics Data System (ADS)
Noah, Paul V.; Noah, Meg A.; Schroeder, John W.; Chernick, Julian A.
1990-09-01
The U.S. Army has a requirement to develop systems for the detection and identification of ground targets in a clutter environment. Autonomous Homing Munitions (AHM) using infrared, visible, millimeter wave, and other sensors are being investigated for this application. Advanced signal processing and computational approaches using pattern recognition and artificial intelligence techniques combined with multisensor data fusion have the potential to meet the Army's requirements for next-generation AHM.
Integration of language and sensor information
NASA Astrophysics Data System (ADS)
Perlovsky, Leonid I.; Weijers, Bertus
2003-04-01
The talk describes the development of basic technologies of intelligent systems fusing data from multiple domains and leading to automated computational techniques for understanding data contents. Understanding involves inferring appropriate decisions and recommending proper actions, which in turn requires fusion of data and knowledge about objects, situations, and actions. Data might include sensory data, verbal reports, intelligence intercepts, or public records, whereas knowledge ought to encompass the whole range of objects, situations, people and their behavior, and knowledge of languages. In the past, a fundamental difficulty in combining knowledge with data was the combinatorial complexity of the computations: too many combinations of data and knowledge pieces had to be evaluated. Recent progress in understanding natural intelligent systems, including the human mind, has led to the development of neurophysiologically motivated architectures for solving these challenging problems, in particular exploiting the role of emotional neural signals in overcoming the combinatorial complexity of older logic-based approaches. Whereas past approaches based on logic tended to identify logic with language and thinking, recent studies in cognitive linguistics have led to an appreciation of the more complicated nature of linguistic models. Little is known about the details of the brain mechanisms integrating language and thinking. Understanding and fusion of linguistic information with sensory data represent a novel and challenging aspect of the development of integrated fusion systems. The presentation will describe a non-combinatorial approach to this problem and outline techniques that can be used for fusing diverse and uncertain knowledge with sensory and linguistic data.
Application of online measures to monitor and evaluate multiplatform fusion performance
NASA Astrophysics Data System (ADS)
Stubberud, Stephen C.; Kowalski, Charlene; Klamer, Dale M.
1999-07-01
A primary concern of multiplatform data fusion is assessing the quality and utility of data shared among platforms. Constraints such as platform and sensor capability and task load necessitate development of an on-line system that computes a metric to determine which other platform can provide the best data for processing. To determine data quality, we are implementing an approach based on entropy coupled with intelligent agents. Entropy measures quality of processed information such as localization, classification, and ambiguity in measurement-to-track association. Lower entropy scores imply less uncertainty about a particular target. When new information is provided, we compute the level of improvement a particular track obtains from one measurement to another. The measure permits us to evaluate the utility of the new information. We couple entropy with intelligent agents that provide two main data gathering functions: estimation of another platform's performance and evaluation of the new measurement data's quality. Both functions result from the entropy metric. The intelligent agent on a platform makes an estimate of another platform's measurement and provides it to its own fusion system, which can then incorporate it for a particular target. A resulting entropy measure is then calculated and returned to its own agent. From this metric, the agent determines a perceived value of the offboard platform's measurement. If the value is satisfactory, the agent requests the measurement from the other platform, usually by interacting with the other platform's agent. Once the actual measurement is received, entropy is again computed and the agent assesses its estimation process and refines it accordingly.
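The entropy metric described in this abstract can be illustrated with a short sketch. This is not the authors' implementation; it is a minimal example, assuming tracks carry discrete class-probability distributions, of how a drop in Shannon entropy quantifies the improvement a track obtains from a new measurement.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy (in nats) of a discrete distribution, e.g. a track's
    class probabilities; lower values mean less uncertainty about the target."""
    return -sum(p * math.log(p) for p in probs if p > 0.0)

def information_gain(before, after):
    """Improvement a track obtains from new information: the reduction in
    entropy between the prior and updated class distributions."""
    return shannon_entropy(before) - shannon_entropy(after)

# Prior: the track could be any of three classes with equal probability.
prior = [1 / 3, 1 / 3, 1 / 3]
# After fusing an offboard measurement, the distribution sharpens.
posterior = [0.8, 0.1, 0.1]
gain = information_gain(prior, posterior)  # positive => the measurement was useful
```

A platform's agent could compare such gains across candidate offboard sources and request data only from the platform whose estimated measurement yields the largest reduction.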
Multi-intelligence critical rating assessment of fusion techniques (MiCRAFT)
NASA Astrophysics Data System (ADS)
Blasch, Erik
2015-06-01
Assessment of multi-intelligence fusion techniques includes credibility of algorithm performance, quality of results against mission needs, and usability in a work-domain context. Situation awareness (SAW) brings together low-level information fusion (tracking and identification), high-level information fusion (threat and scenario-based assessment), and information fusion level 5 user refinement (physical, cognitive, and information tasks). To measure SAW, we discuss SAGAT (the Situational Awareness Global Assessment Technique) for a multi-intelligence fusion (MIF) system assessment that focuses on the advantages of MIF over single intelligence sources. Building on the NASA TLX (Task Load Index), SAGAT probes, SART (Situational Awareness Rating Technique) questionnaires, and CDM (Critical Decision Method) decision points, we highlight these tools for use in a Multi-Intelligence Critical Rating Assessment of Fusion Techniques (MiCRAFT). The focus is to measure user refinement of a situation over the information fusion quality of service (QoS) metrics: timeliness, accuracy, confidence, workload (cost), and attention (throughput). A key component of any user analysis includes correlation, association, and summarization of data, so we also seek measures of product quality and QuEST of information. Building a notion of product quality from multi-intelligence tools is typically subjective and needs to be aligned with objective machine metrics.
2006-06-01
scenarios. The demonstration planned for May 2006, in Chiang Mai, Thailand, will have a first-responder, law enforcement, and counter-terrorism and counter...to local (Chiang Mai), theater (Bangkok), and global (Alameda, California) command and control centers. This fusion of information validates using...network performance to be tested during moderate environmental conditions. The third and fourth scenarios were conducted in Chiang Mai, Thailand
An RFID-based intelligent vehicle speed controller using active traffic signals.
Pérez, Joshué; Seco, Fernando; Milanés, Vicente; Jiménez, Antonio; Díaz, Julio C; de Pedro, Teresa
2010-01-01
These days, mass-produced vehicles benefit from research on Intelligent Transportation Systems (ITS). One prime example of ITS is vehicle Cruise Control (CC), which allows a vehicle to maintain a pre-defined reference speed, to economize on fuel or energy consumption, to avoid speeding fines, or to let the driver focus all attention on steering. However, achieving efficient Cruise Control is not easy on roads or urban streets where sudden changes of the speed limit can occur, due to the presence of unexpected obstacles or maintenance work, which can cause traffic accidents among inattentive drivers. In this communication we present a new Infrastructure-to-Vehicle (I2V) communication and control system for intelligent speed control, based upon Radio Frequency Identification (RFID) technology for identification of traffic signals on the road, and high-accuracy vehicle speed measurement with a Hall-effect-based sensor. A fuzzy logic controller, based on sensor fusion of the information provided by the I2V infrastructure, allows efficient adaptation of the vehicle's speed to the circumstances of the road. The performance of the system is checked empirically, with promising results.
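The fuzzy-logic speed adaptation idea can be sketched in a few lines. This is a simplified illustration, not the controller from the paper: the membership ranges, rule consequents, and function names below are assumptions chosen for clarity.

```python
def triangular(x, a, b, c):
    """Triangular membership function peaking at b, zero outside [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_speed_command(current_kmh, limit_kmh):
    """Map the speed error onto three fuzzy sets and defuzzify by a
    weighted average of the rule outputs (a hypothetical accel command)."""
    error = limit_kmh - current_kmh  # positive => below the signalled limit
    mu_slow = triangular(error, -40.0, -20.0, 0.0)   # braking needed
    mu_hold = triangular(error, -10.0, 0.0, 10.0)    # keep current speed
    mu_accel = triangular(error, 0.0, 20.0, 40.0)    # speed up
    weights = [mu_slow, mu_hold, mu_accel]
    outputs = [-1.5, 0.0, 1.0]  # rule consequents: decelerate / hold / accelerate
    total = sum(weights)
    if total == 0.0:
        return 0.0
    return sum(w * o for w, o in zip(weights, outputs)) / total

cmd = fuzzy_speed_command(30.0, 50.0)  # 20 km/h under an RFID-signalled limit
```

In the paper's setting, `limit_kmh` would come from the RFID-identified traffic signal and `current_kmh` from the Hall-effect speed sensor, with the fused result driving throttle and brake.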
Cognitive foundations for model-based sensor fusion
NASA Astrophysics Data System (ADS)
Perlovsky, Leonid I.; Weijers, Bertus; Mutz, Chris W.
2003-08-01
Target detection, tracking, and sensor fusion are complicated problems, which usually are performed sequentially: first detecting targets, then tracking, then fusing multiple sensors reduces computations. This procedure, however, is inapplicable to difficult targets that cannot be reliably detected using individual sensors on individual scans or frames. In such more complicated cases one has to perform the functions of fusing, tracking, and detecting concurrently. This often has led to prohibitive combinatorial complexity and, as a consequence, to sub-optimal performance as compared to the information-theoretic content of all the available data. It is well appreciated that in this task the human mind is qualitatively far superior to existing mathematical methods of sensor fusion; however, the human mind is limited in the amount of information and speed of computation it can cope with. Therefore, research efforts have been devoted toward incorporating "biological lessons" into smart algorithms, yet success has been limited. Why is this so, and how can existing limitations be overcome? The fundamental reasons for current limitations are analyzed and a potentially breakthrough research and development effort is outlined. We utilize the way our mind combines emotions and concepts in the thinking process and present a mathematical approach to accomplishing this on current computers. The presentation will summarize the difficulties encountered by intelligent systems over the last 50 years related to combinatorial complexity, analyze the fundamental limitations of existing algorithms and neural networks, and relate them to the type of logic underlying the computational structure: formal, multivalued, and fuzzy logic. A new concept of dynamic logic will be introduced, along with algorithms capable of pulling together all the available information from multiple sources.
This new mathematical technique, like our brain, combines conceptual understanding with emotional evaluation and overcomes the combinatorial complexity of concurrent fusion, tracking, and detection. The presentation will discuss examples of performance, where computational speedups of many orders of magnitude were attained leading to performance improvements of up to 10 dB (and better).
Si, Lei; Wang, Zhongbin; Liu, Xinhua; Tan, Chao; Xu, Jing; Zheng, Kehong
2015-11-13
In order to efficiently and accurately identify the cutting condition of a shearer, this paper proposes an intelligent multi-sensor data fusion identification method using the parallel quasi-Newton neural network (PQN-NN) and Dempster-Shafer (DS) theory. The vibration acceleration signals and current signal of six cutting conditions were collected from a self-designed experimental system, and state features were extracted from the intrinsic mode functions (IMFs) obtained by ensemble empirical mode decomposition (EEMD). In the experiment, three classifiers were trained and tested on the selected features of the measured data, and DS theory was used to combine the identification results of the three single classifiers. Furthermore, comparisons with other methods were carried out. The experimental results indicate that the proposed method achieves higher detection accuracy and credibility than the competing algorithms. Finally, an industrial application example at a fully mechanized coal mining face demonstrates the effectiveness of the proposed system.
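The DS combination step used to merge the classifiers' outputs can be sketched generically. This is a textbook implementation of Dempster's rule, not the paper's code; the two-class frame and the mass values below are illustrative assumptions.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions whose focal
    elements are frozensets over a common frame of discernment."""
    combined = {}
    conflict = 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb  # mass that falls on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    norm = 1.0 - conflict  # renormalize away the conflicting mass
    return {k: v / norm for k, v in combined.items()}

A, B = frozenset({"cutting"}), frozenset({"idle"})
theta = A | B  # the full frame: "don't know"
m1 = {A: 0.6, B: 0.3, theta: 0.1}  # classifier 1's belief masses
m2 = {A: 0.5, B: 0.2, theta: 0.3}  # classifier 2's belief masses
fused = dempster_combine(m1, m2)
```

Combining a third classifier is just another call: `dempster_combine(fused, m3)`. Note how the fused mass on `A` exceeds either source's individual belief, which is the effect the paper exploits to boost identification credibility.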
The Modular Design and Production of an Intelligent Robot Based on a Closed-Loop Control Strategy.
Zhang, Libo; Zhu, Junjie; Ren, Hao; Liu, Dongdong; Meng, Dan; Wu, Yanjun; Luo, Tiejian
2017-10-14
Intelligent robots are part of a new generation of robots that are able to sense the surrounding environment, plan their own actions and eventually reach their targets. In recent years, reliance upon robots in both daily life and industry has increased. The protocol proposed in this paper describes the design and production of a handling robot with an intelligent search algorithm and an autonomous identification function. First, the various working modules are mechanically assembled to complete the construction of the work platform and the installation of the robotic manipulator. Then, we design a closed-loop control system and a four-quadrant motor control strategy with the aid of debugging software, and set the steering gear identity (ID), baud rate and other working parameters to ensure that the robot achieves the desired dynamic performance and low energy consumption. Next, we debug the sensors to achieve multi-sensor fusion and accurately acquire environmental information. Finally, we implement the relevant algorithms, which verify the success of the robot's functions for a given application. The advantage of this approach is its reliability and flexibility, as users can develop a variety of hardware construction programs and utilize the comprehensive debugger to implement an intelligent control strategy. This allows users to set personalized requirements based on their needs with high efficiency and robustness.
Smart Networked Elements in Support of ISHM
NASA Technical Reports Server (NTRS)
Oostdyk, Rebecca; Mata, Carlos; Perotti, Jose M.
2008-01-01
At the core of ISHM is the ability to extract information and knowledge from raw data. Conventional data acquisition systems sample and convert physical measurements to engineering units, which higher-level systems use to derive health and information about processes and systems. Although health management is essential at the top level, there are considerable advantages to implementing health-related functions at the sensor level. The distribution of processing to lower levels reduces bandwidth requirements, enhances data fusion, and improves the resolution for detection and isolation of failures in a system, subsystem, component, or process. The Smart Networked Element (SNE) has been developed to implement intelligent functions and algorithms at the sensor level in support of ISHM.
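The benefit of pushing health checks down to the sensor level can be made concrete with a small sketch. The class and checks below are hypothetical, not the SNE's actual interface: they merely illustrate range and rate-of-change validation running next to the transducer, so only flagged engineering-unit data travels upstream.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    value: float
    healthy: bool
    reason: str = ""

class SmartSensorElement:
    """Illustrative smart element: simple self-assessment at the source
    (hypothetical interface, assumed for this sketch)."""
    def __init__(self, lo, hi, max_step):
        self.lo, self.hi, self.max_step = lo, hi, max_step
        self.last = None

    def sample(self, raw):
        # Range check: the physical quantity must be plausible at all.
        if not (self.lo <= raw <= self.hi):
            return Reading(raw, False, "out of range")
        # Rate-of-change check: flag jumps the process cannot produce.
        if self.last is not None and abs(raw - self.last) > self.max_step:
            self.last = raw
            return Reading(raw, False, "implausible jump")
        self.last = raw
        return Reading(raw, True)

sensor = SmartSensorElement(lo=0.0, hi=500.0, max_step=25.0)
ok = sensor.sample(101.3)     # first in-range sample -> healthy
spike = sensor.sample(400.0)  # ~300-unit jump -> flagged at the source
```

Because the flag is attached where the data originates, higher-level fusion can discount the suspect sample instead of re-deriving its validity from raw telemetry.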
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1990-01-01
Various papers on human and machine strategies in sensor fusion are presented. The general topics addressed include: active vision, measurement and analysis of visual motion, decision models for sensor fusion, implementation of sensor fusion algorithms, applying sensor fusion to image analysis, perceptual modules and their fusion, perceptual organization and object recognition, planning and the integration of high-level knowledge with perception, using prior knowledge and context in sensor fusion.
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1991-01-01
The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.
Multisource information fusion applied to ship identification for the recognized maritime picture
NASA Astrophysics Data System (ADS)
Simard, Marc-Alain; Lefebvre, Eric; Helleur, Christopher
2000-04-01
The Recognized Maritime Picture (RMP) is defined as a composite picture of activity over a maritime area of interest. In simplistic terms, building an RMP comes down to finding whether an object of interest, a ship in our case, is there or not, determining what it is, determining what it is doing and determining if some type of follow-on action is required. The Canadian Department of National Defence currently has access to, or may in the near future have access to, a number of civilian, military and allied information or sensor systems to accomplish these purposes. These systems include automatic self-reporting positional systems, air patrol surveillance systems, high frequency surface radars, electronic intelligence systems, radar space systems and high frequency direction finding sensors. The ability to make full use of these systems is limited by the existing capability to fuse data from all sources in a timely, accurate and complete manner. This paper presents an information fusion system under development that correlates and fuses these information and sensor data sources. This fusion system, named the Adaptive Fuzzy Logic Correlator, correlates the information in batch but fuses and constructs ship tracks sequentially. It applies standard Kalman filter techniques and fuzzy logic correlation techniques. We propose a set of recommendations that should improve the ship identification process. In particular, it is proposed to utilize as many non-redundant sources of information as possible that address specific vessel attributes. Another important recommendation states that the information fusion and data association techniques should be capable of dealing with incomplete and imprecise information. Some fuzzy logic techniques capable of tolerating imprecise and dissimilar data are proposed.
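The Kalman-filter-based correlation step that such a system performs can be illustrated with a one-dimensional gating sketch. This is a generic textbook gate, not the Adaptive Fuzzy Logic Correlator itself; the variances and the 3-sigma gate value are assumptions for the example.

```python
def normalized_distance(z, x_pred, innovation_var):
    """Squared innovation normalized by its variance (a 1-D Mahalanobis
    distance); small values mean the report plausibly matches the track."""
    return (z - x_pred) ** 2 / innovation_var

def correlate(z, x_pred, P, R, gate=9.0):
    """Gate a contact report z against a track prediction x_pred.
    P is the track's prediction variance, R the report's measurement
    variance; gate=9 corresponds to roughly a 3-sigma acceptance window."""
    d2 = normalized_distance(z, x_pred, P + R)
    return d2 <= gate, d2

accept, d2 = correlate(z=102.0, x_pred=100.0, P=3.0, R=1.0)      # 2 units off
reject, d2_far = correlate(z=130.0, x_pred=100.0, P=3.0, R=1.0)  # 30 units off
```

A fuzzy correlator generalizes this crisp accept/reject decision by turning the distance into a graded membership, which is what lets it tolerate the imprecise and dissimilar data the paper discusses.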
Data Fusion for Enhanced Aircraft Engine Prognostics and Health Management
NASA Technical Reports Server (NTRS)
Volponi, Al
2005-01-01
Aircraft gas-turbine engine data are available from a variety of sources, including on-board sensor measurements, maintenance histories, and component models. An ultimate goal of Propulsion Health Management (PHM) is to maximize the amount of meaningful information that can be extracted from disparate data sources to obtain comprehensive diagnostic and prognostic knowledge regarding the health of the engine. Data fusion is the integration of data or information from multiple sources for the achievement of improved accuracy and more specific inferences than can be obtained from the use of a single sensor alone. The basic tenet underlying the data/information fusion concept is to leverage all available information to enhance diagnostic visibility, increase diagnostic reliability and reduce the number of diagnostic false alarms. This report describes a basic PHM data fusion architecture being developed in alignment with the NASA C-17 PHM Flight Test program. The challenge of how to maximize the meaningful information extracted from disparate data sources to obtain enhanced diagnostic and prognostic information regarding the health and condition of the engine is the primary goal of this endeavor. To address this challenge, NASA Glenn Research Center, NASA Dryden Flight Research Center, and Pratt & Whitney have formed a team with several small innovative technology companies to plan and conduct a research project in the area of data fusion, as it applies to PHM. Methodologies being developed and evaluated have been drawn from a wide range of areas including artificial intelligence, pattern recognition, statistical estimation, and fuzzy logic. This report will provide a chronology and summary of the work accomplished under this research contract.
Intelligent Sensors: Strategies for an Integrated Systems Approach
NASA Technical Reports Server (NTRS)
Chitikeshi, Sanjeevi; Mahajan, Ajay; Bandhil, Pavan; Utterbach, Lucas; Figueroa, Fernando
2005-01-01
This paper proposes the development of intelligent sensors as an integrated systems approach, i.e. one treats each sensor as a complete system with its own sensing hardware (the traditional sensor), A/D converters, processing and storage capabilities, software drivers, self-assessment algorithms, communication protocols and evolutionary methodologies that allow it to get better with time. Under a project being undertaken at the Stennis Space Center, an integrated framework is being developed for the intelligent monitoring of smart elements. These smart elements can be sensors, actuators or other devices. The immediate application is the monitoring of the rocket test stands, but the technology should be generally applicable to the Intelligent Systems Health Monitoring (ISHM) vision. This paper outlines progress made in the development of intelligent sensors by describing the work done to date on Physical Intelligent Sensors (PIS) and Virtual Intelligent Sensors (VIS).
Evaluation of the Jonker-Volgenant-Castanon (JVC) assignment algorithm for track association
NASA Astrophysics Data System (ADS)
Malkoff, Donald B.
1997-07-01
The Jonker-Volgenant-Castanon (JVC) assignment algorithm was used by Lockheed Martin Advanced Technology Laboratories (ATL) for track association in the Rotorcraft Pilot's Associate (RPA) program. RPA is Army Aviation's largest science and technology program, involving an integrated hardware/software system approach for a next generation helicopter containing advanced sensor equipment and applying artificial intelligence `associate' technologies. ATL is responsible for the multisensor, multitarget, onboard/offboard track fusion. McDonnell Douglas Helicopter Systems is the prime contractor and Lockheed Martin Federal Systems is responsible for developing much of the cognitive decision aiding and controls-and-displays subsystems. RPA is scheduled for flight testing beginning in 1997. RPA is unique in requiring real-time tracking and fusion for large numbers of highly-maneuverable ground (and air) targets in a target-dense environment. It uses diverse sensors and is concerned with a large area of interest. Target class and identification data are tightly integrated with spatial and kinematic data throughout the processing. Because of platform constraints, processing hardware for track fusion was quite limited. No previous experience using JVC in this type of environment had been reported. ATL performed extensive testing of the JVC, concentrating on error rates and run-times under a variety of conditions. These included wide-ranging numbers and types of targets, sensor uncertainties, target attributes, differing degrees of target maneuverability, and diverse combinations of sensors. Testing utilized Monte Carlo approaches, as well as many kinds of challenging scenarios. Comparisons were made with a nearest-neighbor algorithm and a new, proprietary algorithm (the `Competition' algorithm). The JVC proved to be an excellent choice for the RPA environment, providing a good balance between speed of operation and accuracy of results.
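What JVC computes is the minimum-cost solution of the linear assignment problem (track-to-measurement pairing). The stdlib-only sketch below solves a tiny instance by brute force purely to show what the algorithm optimizes; JVC reaches the same optimum in polynomial time, which is why it suits real-time track fusion, and the cost matrix here is an invented example.

```python
from itertools import permutations

def optimal_assignment(cost):
    """Exhaustive solution of the assignment problem that JVC solves
    efficiently: pick one measurement per track minimizing total cost.
    Only feasible for tiny matrices; illustrative, not the JVC algorithm."""
    n = len(cost)
    best_perm, best_cost = None, float("inf")
    for perm in permutations(range(n)):
        c = sum(cost[row][col] for row, col in enumerate(perm))
        if c < best_cost:
            best_perm, best_cost = perm, c
    return best_perm, best_cost

# cost[i][j]: statistical distance between track i's prediction and measurement j
cost = [
    [4, 1, 3],
    [2, 0, 5],
    [3, 2, 2],
]
assignment, total = optimal_assignment(cost)  # track 0 -> meas 1, 1 -> 0, 2 -> 2
```

Note that the greedy per-track choice (track 1 would grab measurement 1, its zero-cost option) is not globally optimal here, which is exactly why a global assignment solver outperforms nearest-neighbor association.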
A novel multisensor traffic state assessment system based on incomplete data.
Zeng, Yiliang; Lan, Jinhui; Ran, Bin; Jiang, Yaoliang
2014-01-01
A novel multisensor system that handles incomplete data is presented for traffic state assessment. The system comprises probe vehicle detection sensors, fixed detection sensors, and a traffic state assessment algorithm. First, validity checking of the traffic flow data is performed as a preprocessing step. Then a new method based on historical data is proposed to fuse and recover the incomplete data. Exploiting the spatially complementary characteristics of the data from the probe vehicle detector and the fixed detector, a fusion model based on space matching is presented to estimate the mean travel speed of the road. Finally, the traffic flow data, including flow, speed and occupancy rate, detected between the Deshengmen and Drum Tower bridges in Beijing, are fused to assess the traffic state of the road using a fusion decision model based on rough sets and the cloud model. The accuracy of the experimental results exceeds 98%, and the results are in accordance with the actual road traffic state. This system is effective for assessing traffic state and is suitable for urban intelligent transportation systems.
Interactive Scene Analysis Module - A sensor-database fusion system for telerobotic environments
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; Vazquez, Sixto L.; Goode, Plesent W.
1992-01-01
Accomplishing a task with telerobotics typically involves a combination of operator control/supervision and a 'script' of preprogrammed commands. These commands usually assume that the location of various objects in the task space conforms to some internal representation (database) of that task space. The ability to quickly and accurately verify the task environment against the internal database would improve the robustness of these preprogrammed commands. In addition, the on-line initialization and maintenance of a task space database is difficult for operators using Cartesian coordinates alone. This paper describes the Interactive Scene Analysis Module (ISAM) developed to provide taskspace database initialization and verification utilizing 3-D graphic overlay modelling, video imaging, and laser radar based range imaging. Through the fusion of taskspace database information and image sensor data, a verifiable taskspace model is generated, providing location and orientation data for objects in a task space. This paper also describes applications of the ISAM in the Intelligent Systems Research Laboratory (ISRL) at NASA Langley Research Center, and discusses its performance relative to representation accuracy and operator interface efficiency.
NASA Astrophysics Data System (ADS)
Low, Kerwin; Elhadidi, Basman; Glauser, Mark
2009-11-01
Understanding the different noise production mechanisms caused by the free shear flows in a turbulent jet provides insight to improve ``intelligent'' feedback mechanisms to control the noise. Toward this effort, a control scheme is based on feedback of azimuthal pressure measurements in the near field of the jet at two streamwise locations. Previous studies suggested that noise reduction can be achieved by azimuthal actuators perturbing the shear layer at the jet lip. The closed-loop actuation will be based on a low-dimensional Fourier representation of the hydrodynamic pressure measurements. Preliminary results show that control authority and a reduction in the overall sound pressure level were possible. These results provide motivation to move forward with the overall vision of developing innovative multi-mode sensing methods to improve state estimation and derive dynamical systems. It is envisioned that, by estimating velocity-field and dynamic pressure information at various locations in both the local and far-field regions, sensor fusion techniques can be utilized to attain greater overall control authority.
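The low-dimensional Fourier representation of ring-mounted pressure sensors amounts to a discrete Fourier transform around the azimuth. The sketch below is an illustrative decomposition under the assumption of equally spaced sensors on a ring; the synthetic signal and sensor count are invented for the example.

```python
import cmath
import math

def azimuthal_modes(pressures):
    """DFT of pressure samples taken around a ring of equally spaced
    sensors; |coefficient k| is the strength of azimuthal mode k."""
    n = len(pressures)
    return [
        sum(p * cmath.exp(-2j * math.pi * k * j / n)
            for j, p in enumerate(pressures)) / n
        for k in range(n)
    ]

# Synthetic ring of 8 sensors dominated by azimuthal mode 1.
n = 8
ring = [math.cos(2 * math.pi * 1 * j / n) for j in range(n)]
modes = azimuthal_modes(ring)
strengths = [abs(c) for c in modes]  # mode 1 (and its conjugate, mode n-1) dominate
```

Feeding only the first few mode coefficients to the controller, rather than the raw sensor signals, is what makes the representation "low-dimensional" and keeps the closed-loop actuation tractable.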
2016-09-01
other associated grants. SUBJECT TERMS: SUNY Poly, STEM, Artificial Intelligence, Command and Control ...neuromorphic system has the potential to be widely used in a high-efficiency artificial intelligence system. Simulation results have indicated that the...novel multiresolution fusion and advanced fusion performance evaluation tool for an Artificial Intelligence based natural language annotation engine for
Intelligent Sensors: An Integrated Systems Approach
NASA Technical Reports Server (NTRS)
Mahajan, Ajay; Chitikeshi, Sanjeevi; Bandhil, Pavan; Utterbach, Lucas; Figueroa, Fernando
2005-01-01
The need for intelligent sensors as a critical component for Integrated System Health Management (ISHM) is fairly well recognized by now. Even the definition of what constitutes an intelligent sensor (or smart sensor) is well documented, and stems from an intuitive desire to get the best quality measurement data that forms the basis of any complex health monitoring and/or management system. If the sensors, i.e. the elements closest to the measurand, are unreliable, then the whole system works with a tremendous handicap. Hence, there has always been a desire to distribute intelligence down to the sensor level, and give each sensor the ability to assess its own health, thereby improving confidence in the quality of the data at all times. This paper proposes the development of intelligent sensors as an integrated systems approach, i.e. one treats each sensor as a complete system with its own sensing hardware (the traditional sensor), A/D converters, processing and storage capabilities, software drivers, self-assessment algorithms, communication protocols and evolutionary methodologies that allow it to get better with time. Under a project being undertaken at the NASA Stennis Space Center, an integrated framework is being developed for the intelligent monitoring of smart elements. These smart elements can be sensors, actuators or other devices. The immediate application is the monitoring of the rocket test stands, but the technology should be generally applicable to the Intelligent Systems Health Monitoring (ISHM) vision. This paper outlines some fundamental issues in the development of intelligent sensors under the following two categories: Physical Intelligent Sensors (PIS) and Virtual Intelligent Sensors (VIS).
NASA Astrophysics Data System (ADS)
Mahajan, Ajay; Chitikeshi, Sanjeevi; Utterbach, Lucas; Bandhil, Pavan; Figueroa, Fernando
2006-05-01
This paper describes the application of intelligent sensors in Integrated Systems Health Monitoring (ISHM) as applied to a rocket test stand. The development of intelligent sensors is attempted as an integrated systems approach, i.e. one treats each sensor as a complete system with its own physical transducer, A/D converters, processing and storage capabilities, software drivers, self-assessment algorithms, communication protocols and evolutionary methodologies that allow it to get better with time. Under a project being undertaken at the NASA Stennis Space Center, an integrated framework is being developed for the intelligent monitoring of smart elements associated with the rocket test stands. These smart elements can be sensors, actuators or other devices. Though the immediate application is the monitoring of the rocket test stands, the technology should be generally applicable to the ISHM vision. This paper outlines progress made in the development of intelligent sensors by describing the work done to date on Physical Intelligent Sensors (PIS) and Virtual Intelligent Sensors (VIS).
NASA Astrophysics Data System (ADS)
Saadeddin, Kamal; Abdel-Hafez, Mamoun F.; Jaradat, Mohammad A.; Jarrah, Mohammad Amin
2013-12-01
In this paper, a low-cost navigation system that fuses the measurements of the inertial navigation system (INS) and the global positioning system (GPS) receiver is developed. First, the system's dynamics are obtained based on a vehicle's kinematic model. Second, the INS and GPS measurements are fused using an extended Kalman filter (EKF) approach. Subsequently, an artificial intelligence based approach for the fusion of INS/GPS measurements is developed based on an Input-Delayed Adaptive Neuro-Fuzzy Inference System (IDANFIS). Experimental tests are conducted to demonstrate the performance of the two sensor fusion approaches. It is found that the use of the proposed IDANFIS approach achieves a reduction in the integration development time and an improvement in the estimation accuracy of the vehicle's position and velocity compared to the EKF based approach.
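The EKF-based INS/GPS fusion described above can be illustrated with a minimal sketch. The fragment below is not the authors' implementation: it is a one-dimensional linear Kalman filter in which an accelerometer (INS) reading drives the prediction step and a GPS position fix drives the update step; the function name and the noise parameters `q` and `r` are illustrative assumptions.

```python
import numpy as np

def kf_ins_gps_step(x, P, accel, gps_pos, dt, q=0.1, r=4.0):
    """One predict/update cycle of a 1-D Kalman filter.

    x: state [position, velocity]; P: 2x2 covariance.
    accel: INS accelerometer reading (control input).
    gps_pos: GPS position measurement.
    q, r: process and measurement noise (illustrative values).
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity model
    B = np.array([0.5 * dt**2, dt])         # how acceleration enters the state
    H = np.array([[1.0, 0.0]])              # GPS observes position only

    # Predict: propagate the INS mechanization forward in time
    x = F @ x + B * accel
    P = F @ P @ F.T + q * np.eye(2)

    # Update: correct the drifting INS estimate with the GPS fix
    y = gps_pos - (H @ x)[0]                # innovation
    S = (H @ P @ H.T)[0, 0] + r             # innovation variance
    K = (P @ H.T).ravel() / S               # Kalman gain
    x = x + K * y
    P = (np.eye(2) - np.outer(K, H)) @ P
    return x, P

x = np.array([0.0, 0.0])
P = np.eye(2)
x, P = kf_ins_gps_step(x, P, accel=1.0, gps_pos=0.6, dt=1.0)
```

An extended Kalman filter, as used in the paper, follows the same predict/update shape but linearizes a nonlinear vehicle kinematic model at each step.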
Development of an Information Fusion System for Engine Diagnostics and Health Management
NASA Technical Reports Server (NTRS)
Volponi, Allan J.; Brotherton, Tom; Luppold, Robert; Simon, Donald L.
2004-01-01
Aircraft gas-turbine engine data are available from a variety of sources including on-board sensor measurements, maintenance histories, and component models. An ultimate goal of Propulsion Health Management (PHM) is to maximize the amount of meaningful information that can be extracted from disparate data sources to obtain comprehensive diagnostic and prognostic knowledge regarding the health of the engine. Data Fusion is the integration of data or information from multiple sources, to achieve improved accuracy and more specific inferences than can be obtained from the use of a single sensor alone. The basic tenet underlying the data/information fusion concept is to leverage all available information to enhance diagnostic visibility, increase diagnostic reliability and reduce the number of diagnostic false alarms. This paper describes a basic PHM Data Fusion architecture being developed in alignment with the NASA C17 Propulsion Health Management (PHM) Flight Test program. The challenge of how to maximize the meaningful information extracted from disparate data sources to obtain enhanced diagnostic and prognostic information regarding the health and condition of the engine is the primary goal of this endeavor. To address this challenge, NASA Glenn Research Center (GRC), NASA Dryden Flight Research Center (DFRC) and Pratt & Whitney (P&W) have formed a team with several small innovative technology companies to plan and conduct a research project in the area of data fusion as applied to PHM. Methodologies being developed and evaluated have been drawn from a wide range of areas including artificial intelligence, pattern recognition, statistical estimation, and fuzzy logic. This paper will provide a broad overview of this work, discuss some of the methodologies employed and give some illustrative examples.
Advances in data representation for hard/soft information fusion
NASA Astrophysics Data System (ADS)
Rimland, Jeffrey C.; Coughlin, Dan; Hall, David L.; Graham, Jacob L.
2012-06-01
Information fusion is becoming increasingly human-centric. While past systems typically relegated humans to the role of analyzing a finished fusion product, current systems are exploring the role of humans as integral elements in a modular and extensible distributed framework where many tasks can be accomplished by either human or machine performers. For example, "participatory sensing" campaigns give humans the role of "soft sensors" by uploading their direct observations, or of "soft sensor platforms" by using mobile devices to record human-annotated, GPS-encoded high-quality photographs, video, or audio. Additionally, in the "human-in-the-loop" role, individuals or teams using advanced human-computer interface (HCI) tools such as stereoscopic 3D visualization, haptic interfaces, or aural "sonification" interfaces can effectively engage the innate human capability to perform pattern matching, anomaly identification, and semantic-based contextual reasoning to interpret an evolving situation. The Pennsylvania State University is participating in a Multi-disciplinary University Research Initiative (MURI) program funded by the U.S. Army Research Office to investigate fusion of hard and soft data in counterinsurgency (COIN) situations. In addition to the importance of this research for Intelligence Preparation of the Battlefield (IPB), many of the same challenges and techniques apply to health and medical informatics, crisis management, crowd-sourced "citizen science", and monitoring environmental concerns. One of the key challenges that we have encountered is the development of data formats, protocols, and methodologies to establish an information architecture and framework for the effective capture, representation, transmission, and storage of the vastly heterogeneous data and accompanying metadata, including capabilities and characteristics of human observers, uncertainty of human observations, "soft" contextual data, and information pedigree.
This paper describes our findings and offers insights into the role of data representation in hard/soft fusion.
Sensor fusion for antipersonnel landmine detection: a case study
NASA Astrophysics Data System (ADS)
den Breejen, Eric; Schutte, Klamer; Cremer, Frank
1999-08-01
In this paper the multisensor fusion results obtained within the European research project GEODE are presented. The layout of the test lane and the individual sensors used are described. The implementation of the SCOOP algorithm improves the ROC curves, as both the false-alarm surface and the number of false alarms are taken into account. The confidence grids produced by the sensor manufacturers are used as input for the different sensor fusion methods implemented: Bayes, Dempster-Shafer, fuzzy probabilities, and rules. The mapping of the confidence grids to the input parameters of the fusion methods is an important step. Due to the limited amount of available data, the entire test lane is used for both training and evaluation. All four sensor fusion methods provide better detection results than the individual sensors.
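The first of the fusion methods named above, Bayes, can be sketched for the confidence-grid setting: each sensor's confidence at a grid cell is treated as a conditionally independent posterior and the posteriors are combined via a likelihood-ratio (odds) product. This is a textbook naive-Bayes sketch, not the GEODE project's implementation; the prior and the example confidences are assumptions.

```python
def bayes_fuse(confidences, prior=0.5):
    """Fuse per-sensor confidences P(mine | sensor_i) at one grid cell
    into a single posterior, assuming conditionally independent sensors.
    Each confidence is converted to a likelihood ratio against the prior."""
    prior_odds = prior / (1.0 - prior)
    odds = prior_odds
    for c in confidences:
        c = min(max(c, 1e-9), 1 - 1e-9)   # guard against exact 0 or 1
        odds *= (c / (1.0 - c)) / prior_odds
    return odds / (1.0 + odds)

# Three sensors: two fairly confident, one uninformative (0.5 leaves
# the fused result unchanged under a 0.5 prior):
p = bayes_fuse([0.8, 0.7, 0.5])
```

Note how agreement between sensors sharpens the fused confidence above any single input, which is the intended benefit of the fusion step.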
Hybrid Arrays for Chemical Sensing
NASA Astrophysics Data System (ADS)
Kramer, Kirsten E.; Rose-Pehrsson, Susan L.; Johnson, Kevin J.; Minor, Christian P.
In recent years, multisensory approaches to environment monitoring for chemical detection as well as other forms of situational awareness have become increasingly popular. A hybrid sensor is a multimodal system that incorporates several sensing elements and thus produces data that are multivariate in nature and may be significantly increased in complexity compared to data provided by single-sensor systems. Though a hybrid sensor is itself an array, hybrid sensors are often organized into more complex sensing systems through an assortment of network topologies. Part of the reason for the shift to hybrid sensors is advancements in sensor technology and in the computational power available for processing larger amounts of data. There is also ample evidence to support the claim that a multivariate analytical approach is generally superior to univariate measurements because it provides additional redundant and complementary information (Hall, D. L.; Llinas, J., Eds., Handbook of Multisensor Data Fusion, CRC, Boca Raton, FL, 2001). However, the benefits of a multisensory approach are not automatically achieved. Interpretation of data from hybrid arrays of sensors requires the analyst to develop an application-specific methodology to optimally fuse the disparate sources of data generated by the hybrid array into useful information characterizing the sample or environment being observed. Consequently, multivariate data analysis techniques such as those employed in the field of chemometrics have become more important in analyzing sensor array data. Depending on the nature of the acquired data, a number of chemometric algorithms may prove useful in the analysis and interpretation of data from hybrid sensor arrays. It is important to note, however, that the challenges posed by the analysis of hybrid sensor array data are not unique to the field of chemical sensing.
Applications in electrical and process engineering, remote sensing, medicine, and of course, artificial intelligence and robotics, all share the same essential data fusion challenges. The design of a hybrid sensor array should draw on this extended body of knowledge. In this chapter, various techniques for data preprocessing, feature extraction, feature selection, and modeling of sensor data will be introduced and illustrated with data fusion approaches that have been implemented in applications involving data from hybrid arrays. The example systems discussed in this chapter involve the development of prototype sensor networks for damage control event detection aboard US Navy vessels and the development of analysis algorithms to combine multiple sensing techniques for enhanced remote detection of unexploded ordnance (UXO) in both ground surveys and wide area assessments.
A data fusion framework for meta-evaluation of intelligent transportation system effectiveness
DOT National Transportation Integrated Search
This study presents a framework for the meta-evaluation of Intelligent Transportation System effectiveness. The framework is based on data fusion approaches that adjust for data biases and violations of other standard statistical assumptions. Operati...
1984-12-01
RD-A150 867: Research Needs for Artificial Intelligence Applications in Support of C3 (U). Naval Ocean Systems Center, San Diego, CA; R. R. Dillard. Interim report, through December 1984. Keywords: artificial intelligence, command control, data fusion. From the text: "The reconstruction process is simply data fusion after all data are in. After reconstruction, artificial intelligence (AI) techniques may be..."
NASA Technical Reports Server (NTRS)
Foyle, David C.
1993-01-01
Based on existing integration models in the psychological literature, an evaluation framework is developed to assess sensor fusion displays as might be implemented in an enhanced/synthetic vision system. The proposed evaluation framework for evaluating the operator's ability to use such systems is a normative approach: The pilot's performance with the sensor fusion image is compared to models' predictions based on the pilot's performance when viewing the original component sensor images prior to fusion. This allows for the determination as to when a sensor fusion system leads to: poorer performance than one of the original sensor displays, clearly an undesirable system in which the fused sensor system causes some distortion or interference; better performance than with either single sensor system alone, but at a sub-optimal level compared to model predictions; optimal performance compared to model predictions; or, super-optimal performance, which may occur if the operator were able to use some highly diagnostic 'emergent features' in the sensor fusion display, which were unavailable in the original sensor displays.
Optimal Sensor Fusion for Structural Health Monitoring of Aircraft Composite Components
2011-09-01
Sensor networks combine or fuse different types of sensors. Fiber Bragg Grating (FBG) sensors can be inserted in layers of composite structures to provide local damage detection, while surface mounted ... consideration. This paper describes an example of optimal sensor fusion, which combines FBG sensors and PZT sensors. Optimal sensor fusion tries to find...
NASA Astrophysics Data System (ADS)
Preece, Alun; Gwilliams, Chris; Parizas, Christos; Pizzocaro, Diego; Bakdash, Jonathan Z.; Braines, Dave
2014-05-01
Recent developments in sensing technologies, mobile devices and context-aware user interfaces have made it possible to represent information fusion and situational awareness for Intelligence, Surveillance and Reconnaissance (ISR) activities as a conversational process among actors at or near the tactical edges of a network. Motivated by use cases in the domain of Company Intelligence Support Team (CoIST) tasks, this paper presents an approach to information collection, fusion and sense-making based on the use of natural language (NL) and controlled natural language (CNL) to support richer forms of human-machine interaction. The approach uses a conversational protocol to facilitate a flow of collaborative messages from NL to CNL and back again in support of interactions such as: turning eyewitness reports from human observers into actionable information (from both soldier and civilian sources); fusing information from humans and physical sensors (with associated quality metadata); and assisting human analysts to make the best use of available sensing assets in an area of interest (governed by management and security policies). CNL is used as a common formal knowledge representation for both machine and human agents to support reasoning, semantic information fusion and generation of rationale for inferences, in ways that remain transparent to human users. Examples are provided of various alternative styles for user feedback, including NL, CNL and graphical feedback. A pilot experiment with human subjects shows that a prototype conversational agent is able to gather usable CNL information from untrained human subjects.
Study on the multi-sensors monitoring and information fusion technology of dangerous cargo container
NASA Astrophysics Data System (ADS)
Xu, Shibo; Zhang, Shuhui; Cao, Wensheng
2017-10-01
In this paper, a monitoring system for dangerous cargo containers based on multi-sensors is presented. In order to improve monitoring accuracy, multiple sensors are applied inside the dangerous cargo container. A multi-sensor information fusion solution for monitoring dangerous cargo containers is put forward, and information pre-processing, the fusion algorithm for homogeneous sensors, and information fusion based on a BP neural network are illustrated. Applying multi-sensors in the field of container monitoring has some novelty.
Intelligent Sensors and Components for On-Board ISHM
NASA Technical Reports Server (NTRS)
Figueroa, Jorge; Morris, Jon; Nickles, Donald; Schmalzel, Jorge; Rauth, David; Mahajan, Ajay; Utterbach, L.; Oesch, C.
2006-01-01
A viewgraph presentation on the development of intelligent sensors and components for on-board Integrated Systems Health Management (ISHM) is shown. The topics include: 1) Motivation; 2) Integrated Systems Health Management (ISHM); 3) Intelligent Components; 4) IEEE 1451; 5) Intelligent Sensors; 6) Application; and 7) Future Directions.
An Innovative Thinking-Based Intelligent Information Fusion Algorithm
Hu, Liang; Liu, Gang; Zhou, Jin
2013-01-01
This study proposes an intelligent algorithm that realizes information fusion, drawing on research achievements in brain cognitive theory and innovative computation. The algorithm treats knowledge as its core and information fusion as a knowledge-based innovative thinking process. Its five key parts, namely information sense and perception, memory storage, divergent thinking, convergent thinking, and an evaluation system, are simulated and modeled. The algorithm fully develops the innovative-thinking capabilities of knowledge in information fusion and is an attempt to convert the abstract concepts of brain cognitive science into specific, operable research routes and strategies. Furthermore, the influence of each of the algorithm's parameters on its performance is analyzed and compared with that of classical intelligent algorithms through tests. Test results suggest that the proposed algorithm can obtain the optimum problem solution with fewer objective evaluations, improve optimization effectiveness, and achieve effective fusion of information. PMID:23956699
An innovative thinking-based intelligent information fusion algorithm.
Lu, Huimin; Hu, Liang; Liu, Gang; Zhou, Jin
2013-01-01
This study proposes an intelligent algorithm that realizes information fusion, drawing on research achievements in brain cognitive theory and innovative computation. The algorithm treats knowledge as its core and information fusion as a knowledge-based innovative thinking process. Its five key parts, namely information sense and perception, memory storage, divergent thinking, convergent thinking, and an evaluation system, are simulated and modeled. The algorithm fully develops the innovative-thinking capabilities of knowledge in information fusion and is an attempt to convert the abstract concepts of brain cognitive science into specific, operable research routes and strategies. Furthermore, the influence of each of the algorithm's parameters on its performance is analyzed and compared with that of classical intelligent algorithms through tests. Test results suggest that the proposed algorithm can obtain the optimum problem solution with fewer objective evaluations, improve optimization effectiveness, and achieve effective fusion of information.
Wireless structural monitoring for homeland security applications
NASA Astrophysics Data System (ADS)
Kiremidjian, Garo K.; Kiremidjian, Anne S.; Lynch, Jerome P.
2004-07-01
This paper addresses the development of a robust, low-cost, low power, and high performance autonomous wireless monitoring system for civil assets such as large facilities, new construction, bridges, dams, commercial buildings, etc. The role of the system is to identify the onset, development, location and severity of structural vulnerability and damage. The proposed system represents an enabling infrastructure for addressing structural vulnerabilities specifically associated with homeland security. The system concept is based on dense networks of "intelligent" wireless sensing units. The fundamental properties of a wireless sensing unit include: (a) interfaces to multiple sensors for measuring structural and environmental data (such as acceleration, displacements, pressure, strain, material degradation, temperature, gas agents, biological agents, humidity, corrosion, etc.); (b) processing of sensor data with embedded algorithms for assessing damage and environmental conditions; (c) peer-to-peer wireless communications for information exchange among units (thus enabling joint "intelligent" processing coordination) and storage of data and processed information in servers for information fusion; (d) ultra low power operation; (e) cost-effectiveness and compact size through the use of low-cost small-size off-the-shelf components. An integral component of the overall system concept is a decision support environment for interpretation and dissemination of information to various decision makers.
A small, cheap, and portable reconnaissance robot
NASA Astrophysics Data System (ADS)
Kenyon, Samuel H.; Creary, D.; Thi, Dan; Maynard, Jeffrey
2005-05-01
While there is much interest in human-carriable mobile robots for defense/security applications, existing examples are still too large/heavy, and there are not many successful small human-deployable mobile ground robots, especially ones that can survive being thrown/dropped. We have developed a prototype small short-range teleoperated indoor reconnaissance/surveillance robot that is semi-autonomous. It is self-powered, self-propelled, spherical, and meant to be carried and thrown by humans into indoor, yet relatively unstructured, dynamic environments. The robot uses multiple channels for wireless control and feedback, with the potential for inter-robot communication, swarm behavior, or distributed sensor network capabilities. The primary reconnaissance sensor for this prototype is visible-spectrum video. This paper focuses more on the software issues, both the onboard intelligent real time control system and the remote user interface. The communications, sensor fusion, intelligent real time controller, etc. are implemented with onboard microcontrollers. We based the autonomous and teleoperation controls on a simple finite state machine scripting layer. Minimal localization and autonomous routines were designed to best assist the operator, execute whatever mission the robot may have, and promote its own survival. We also discuss the advantages and pitfalls of an inexpensive, rapidly-developed semi-autonomous robotic system, especially one that is spherical, and the importance of human-robot interaction as considered for the human-deployment and remote user interface.
Begum, Shahina; Barua, Shaibal; Ahmed, Mobyen Uddin
2014-07-03
Today, clinicians often diagnose and classify diseases based on information collected from several physiological sensor signals. However, sensor signals can be vulnerable to noise or interference, and, owing to large individual variations, sensitivity to different physiological sensors can also vary. Therefore, fusing multiple sensor signals is valuable for providing a more robust and reliable decision. This paper demonstrates a physiological sensor signal classification approach using sensor signal fusion and case-based reasoning. The proposed approach has been evaluated on classifying individuals as Stressed or Relaxed using sensor data fusion. Physiological sensor signals, i.e., Heart Rate (HR), Finger Temperature (FT), Respiration Rate (RR), Carbon dioxide (CO2) and Oxygen Saturation (SpO2), are collected during the data collection phase. Here, sensor fusion has been done in two different ways: (i) decision-level fusion using features extracted through traditional approaches; and (ii) data-level fusion using features extracted by means of Multivariate Multiscale Entropy (MMSE). Case-Based Reasoning (CBR) is applied for the classification of the signals. The experimental results show that the proposed system classified Stressed or Relaxed individuals with 87.5% accuracy compared to an expert in the domain. The approach thus shows promising results in the psychophysiological domain and could be adapted to other relevant healthcare systems.
Zhang, Wenyu; Zhang, Zhenjiang
2015-01-01
Decision fusion in sensor networks enables sensors to improve classification accuracy while reducing the energy consumption and bandwidth demand for data transmission. In this paper, we focus on the decentralized multi-class classification fusion problem in wireless sensor networks (WSNs) and a new simple but effective decision fusion rule based on belief function theory is proposed. Unlike existing belief function based decision fusion schemes, the proposed approach is compatible with any type of classifier because the basic belief assignments (BBAs) of each sensor are constructed on the basis of the classifier’s training output confusion matrix and real-time observations. We also derive explicit global BBA in the fusion center under Dempster’s combinational rule, making the decision making operation in the fusion center greatly simplified. Also, sending the whole BBA structure to the fusion center is avoided. Experimental results demonstrate that the proposed fusion rule has better performance in fusion accuracy compared with the naïve Bayes rule and weighted majority voting rule. PMID:26295399
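The Dempster's-rule combination step referred to in the abstract above can be sketched as follows. This is a generic implementation of the rule over a two-class frame of discernment, not the paper's confusion-matrix-based BBA construction; the example masses are invented for illustration.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic belief assignments (dicts mapping frozenset
    focal elements to mass) with Dempster's rule, normalizing out the
    mass assigned to conflicting (empty-intersection) pairs."""
    combined, conflict = {}, 0.0
    for (s1, w1), (s2, w2) in product(m1.items(), m2.items()):
        inter = s1 & s2
        if inter:
            combined[inter] = combined.get(inter, 0.0) + w1 * w2
        else:
            conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("total conflict: sources fully contradict")
    return {s: w / (1.0 - conflict) for s, w in combined.items()}

A, B = frozenset("A"), frozenset("B")
theta = A | B                       # frame of discernment {A, B}
m1 = {A: 0.6, theta: 0.4}           # sensor 1: leans towards class A
m2 = {A: 0.5, B: 0.3, theta: 0.2}   # sensor 2: mixed evidence
m = dempster_combine(m1, m2)
```

Combining in the fusion center in this way concentrates mass on the class both sensors support while the residual mass on the full frame theta reflects remaining ignorance.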
Spatiotemporal Local-Remote Sensor Fusion (ST-LRSF) for Cooperative Vehicle Positioning.
Jeong, Han-You; Nguyen, Hoa-Hung; Bhawiyuga, Adhitya
2018-04-04
Vehicle positioning plays an important role in the design of protocols, algorithms, and applications in the intelligent transport systems. In this paper, we present a new framework of spatiotemporal local-remote sensor fusion (ST-LRSF) that cooperatively improves the accuracy of absolute vehicle positioning based on two state estimates of a vehicle in the vicinity: a local sensing estimate, measured by the on-board exteroceptive sensors, and a remote sensing estimate, received from neighbor vehicles via vehicle-to-everything communications. Given both estimates of vehicle state, the ST-LRSF scheme identifies the set of vehicles in the vicinity, determines the reference vehicle state, proposes a spatiotemporal dissimilarity metric between two reference vehicle states, and presents a greedy algorithm to compute a minimal weighted matching (MWM) between them. Given the outcome of MWM, the theoretical position uncertainty of the proposed refinement algorithm is proven to be inversely proportional to the square root of matching size. To further reduce the positioning uncertainty, we also develop an extended Kalman filter model with the refined position of ST-LRSF as one of the measurement inputs. The numerical results demonstrate that the proposed ST-LRSF framework can achieve high positioning accuracy for many different scenarios of cooperative vehicle positioning.
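The greedy minimal weighted matching (MWM) step described above can be sketched as a sort-and-gate procedure over pairwise distances between local tracks and remote reports. This is a generic greedy matching, not the authors' exact algorithm; the function name, gate threshold, and example coordinates are assumptions.

```python
import math

def greedy_match(local, remote, gate=5.0):
    """Greedily pair local-sensing tracks with remote (V2X) reports in
    ascending order of Euclidean distance; each track and report is used
    at most once. `gate` rejects pairs too far apart to be the same
    vehicle (illustrative value, in meters)."""
    pairs = sorted(
        (math.dist(l, r), i, j)
        for i, l in enumerate(local)
        for j, r in enumerate(remote)
    )
    used_l, used_r, matching = set(), set(), []
    for d, i, j in pairs:
        if d <= gate and i not in used_l and j not in used_r:
            matching.append((i, j))
            used_l.add(i)
            used_r.add(j)
    return matching

# Two local tracks and two remote reports arriving in swapped order:
local = [(0.0, 0.0), (10.0, 0.0)]
remote = [(9.5, 0.4), (0.3, -0.2)]
m = greedy_match(local, remote)
```

The matched pairs would then feed the refinement step, whose output the paper uses as a measurement input to an extended Kalman filter.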
Intelligent Sensors for Integrated Systems Health Management (ISHM)
NASA Technical Reports Server (NTRS)
Schmalzel, John L.
2008-01-01
IEEE 1451 smart sensors contribute to a number of ISHM goals, including cost reduction achieved through: a) improved configuration management (TEDS); and b) plug-and-play re-configuration. Intelligent sensors are an adaptation of smart sensors to include ISHM algorithms; this offers further benefits: a) sensor validation; b) confidence assessment of measurements; and c) distributed ISHM processing. Space-qualified intelligent sensors are possible, subject to: a) size, mass, and power constraints; and b) bus structure/protocol.
Color regeneration from reflective color sensor using an artificial intelligent technique.
Saracoglu, Ömer Galip; Altural, Hayriye
2010-01-01
A low-cost optical sensor based on reflective color sensing is presented. Artificial neural network models are used to improve color regeneration from the sensor signals. Analog voltages of the sensor are successfully converted to RGB colors. The artificial intelligence models presented in this work enable color regeneration from the analog outputs of the color sensor. In addition, inverse modeling supported by an intelligent technique enables the sensor probe to be used as a colorimetric sensor that relates color changes to analog voltages.
Selected examples of intelligent (micro) sensor systems: state-of-the-art and tendencies
NASA Astrophysics Data System (ADS)
Hauptmann, Peter R.
2006-03-01
The capability to build ever more intelligence into sensors continues to drive their application in areas including automotive, aerospace and defense, industrial automation, the intelligent house, wearables, medicine and homeland security. In principle, it is difficult to overestimate the importance of intelligent (micro) sensors or sensor systems within advanced societies; one telling indicator is the global market for sensors, now about 20 billion annually. Sensors and sensor systems therefore play a dominant role in many fields, from macro sensors in the manufacturing industry down to miniaturized sensors for medical applications. The diversity of sensors precludes a complete description of the state of the art; selected examples will illustrate the current situation. MEMS (microelectromechanical systems) devices are of special interest in the context of micro sensor systems. In the past, the main requirements on a sensor were metrological: the electrical (or optical) signal produced by the sensor needed to match the measurand relatively accurately. Such basic functionality is no longer sufficient. Data processing near the sensor, the extraction of more information than just the direct sensor reading by signal analysis, system aspects and multi-sensor information are the new demands. A shift can be observed away from designing perfect single-function transducers and towards using sensors as system components. In the ideal case, such systems contain sensors, actuators and electronics; they can be realized in monolithic, hybrid or discrete form, depending on the application. In this article the state of the art of intelligent sensors and sensor systems is reviewed using selected examples, and future trends are deduced.
Proposed evaluation framework for assessing operator performance with multisensor displays
NASA Technical Reports Server (NTRS)
Foyle, David C.
1992-01-01
Despite aggressive work on the development of sensor fusion algorithms and techniques, no formal evaluation procedures have been proposed. Based on existing integration models in the literature, an evaluation framework is developed to assess an operator's ability to use multisensor, or sensor fusion, displays. The proposed evaluation framework for evaluating the operator's ability to use such systems is a normative approach: The operator's performance with the sensor fusion display can be compared to the models' predictions based on the operator's performance when viewing the original sensor displays prior to fusion. This allows for the determination as to when a sensor fusion system leads to: 1) poorer performance than one of the original sensor displays (clearly an undesirable system in which the fused sensor system causes some distortion or interference); 2) better performance than with either single sensor system alone, but at a sub-optimal (compared to the model predictions) level; 3) optimal performance (compared to model predictions); or, 4) super-optimal performance, which may occur if the operator were able to use some highly diagnostic 'emergent features' in the sensor fusion display, which were unavailable in the original sensor displays. An experiment demonstrating the usefulness of the proposed evaluation framework is discussed.
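The framework's normative logic, classifying fused-display performance as interference, sub-optimal, optimal, or super-optimal relative to a model prediction, can be sketched with a simple probability-summation benchmark. The independence model, tolerance, and function name below are illustrative assumptions, not the specific integration models the paper draws from the literature.

```python
def classify_fusion_performance(p1, p2, p_fused, tol=0.02):
    """Compare observed detection performance with a fused display
    against a probability-summation benchmark built from the operator's
    single-sensor performance p1 and p2. `tol` absorbs measurement
    noise around the model prediction (assumed value)."""
    predicted = 1.0 - (1.0 - p1) * (1.0 - p2)   # independent-cues benchmark
    if p_fused < max(p1, p2) - tol:
        return "interference"     # worse than the better single sensor
    if p_fused < predicted - tol:
        return "sub-optimal"      # better than either sensor, below model
    if p_fused <= predicted + tol:
        return "optimal"          # matches the model prediction
    return "super-optimal"        # exceeds model (emergent features)

# Single-sensor hit rates of 0.70 and 0.60 predict 0.88 when fused:
label = classify_fusion_performance(0.70, 0.60, 0.90)
```

The benchmark makes the four outcome categories operational: only performance above the independent-cues prediction would suggest the operator is exploiting emergent features unavailable in the component displays.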
A force vector and surface orientation sensor for intelligent grasping
NASA Technical Reports Server (NTRS)
Mcglasson, W. D.; Lorenz, R. D.; Duffie, N. A.; Gale, K. L.
1991-01-01
The paper discusses a force vector and surface orientation sensor suitable for intelligent grasping. The use of a novel four degree-of-freedom force vector robotic fingertip sensor allows efficient, real time intelligent grasping operations. The basis of sensing for intelligent grasping operations is presented and experimental results demonstrate the accuracy and ease of implementation of this approach.
Liao, Yi-Hung; Chou, Jung-Chuan; Lin, Chin-Yi
2013-01-01
Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of the measured data. Data fusion is a very useful statistical method for sensor arrays in many fields, while fault diagnosis is used to avoid sensor faults and measurement errors in the electrochemical measurement system. In this study, therefore, we use fault diagnosis to remove any faulty sensors in advance and then proceed with data fusion in the sensor array. The average, self-adaptive, and coefficient-of-variance data fusion methods are used. The pH electrode is fabricated by sputtering a ruthenium dioxide (RuO2) sensing membrane onto a silicon substrate, and eight RuO2 pH electrodes are fabricated to form the sensor array used in this study. PMID:24351636
Liao, Yi-Hung; Chou, Jung-Chuan; Lin, Chin-Yi
2013-12-13
Fault diagnosis (FD) and data fusion (DF) technologies implemented in the LabVIEW program were used for a ruthenium dioxide pH sensor array. The purpose of the fault diagnosis and data fusion technologies is to increase the reliability of the measured data. Data fusion is a very useful statistical method for sensor arrays in many fields, while fault diagnosis is used to avoid sensor faults and measurement errors in the electrochemical measurement system. In this study, therefore, we use fault diagnosis to remove any faulty sensors in advance and then proceed with data fusion in the sensor array. The average, self-adaptive, and coefficient-of-variance data fusion methods are used. The pH electrode is fabricated by sputtering a ruthenium dioxide (RuO2) sensing membrane onto a silicon substrate, and eight RuO2 pH electrodes are fabricated to form the sensor array used in this study.
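The average and self-adaptive fusion methods mentioned in the abstract above can be sketched for a redundant electrode array. The inverse-squared-deviation weighting below is one common way to realize self-adaptive fusion and may differ from the paper's exact formula; the readings and function name are invented for illustration.

```python
import statistics

def fuse_readings(readings):
    """Fuse redundant pH readings from a sensor array.

    Returns (average, self_adaptive): the plain mean, and an estimate
    that weights each electrode by the inverse of its squared deviation
    from the mean, so drifting electrodes contribute little."""
    mean = statistics.fmean(readings)
    eps = 1e-6    # avoid division by zero for readings exactly at the mean
    weights = [1.0 / ((x - mean) ** 2 + eps) for x in readings]
    self_adaptive = (
        sum(w * x for w, x in zip(weights, readings)) / sum(weights)
    )
    return mean, self_adaptive

# Five electrodes, one of them drifting high:
avg, fused = fuse_readings([7.01, 7.02, 6.99, 7.00, 7.45])
```

With one drifting electrode the plain average is pulled upward, while the self-adaptive estimate stays near the consensus reading, which is the reliability gain the fusion step is meant to provide.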
Advances in multi-sensor data fusion: algorithms and applications.
Dong, Jiang; Zhuang, Dafang; Huang, Yaohuan; Fu, Jingying
2009-01-01
With the development of satellite and remote sensing techniques, more and more image data from airborne/satellite sensors have become available. Multi-sensor image fusion seeks to combine information from different images to obtain more inferences than can be derived from a single sensor. In image-based application fields, image fusion has emerged as a promising research area since the end of the last century. This paper presents an overview of recent advances in multi-sensor satellite image fusion. Firstly, the most popular existing fusion algorithms are introduced, with emphasis on their recent improvements. Advances in the main application fields of remote sensing, including object identification, classification, change detection and maneuvering-target tracking, are then described, and both the advantages and limitations of those applications are discussed. Finally, recommendations are given, including: (1) improvement of fusion algorithms; (2) development of "algorithm fusion" methods; (3) establishment of an automatic quality assessment scheme.
Enhanced chemical weapon warning via sensor fusion
NASA Astrophysics Data System (ADS)
Flaherty, Michael; Pritchett, Daniel; Cothren, Brian; Schwaiger, James
2011-05-01
Torch Technologies Inc. is actively involved in chemical sensor networking and data fusion via multi-year efforts with Dugway Proving Ground (DPG) and the Defense Threat Reduction Agency (DTRA). The objective of these efforts is to develop innovative concepts and advanced algorithms that enhance our national Chemical Warfare (CW) test and warning capabilities via the fusion of traditional and non-traditional CW sensor data. Under Phase I, II, and III Small Business Innovative Research (SBIR) contracts with DPG, Torch developed the Advanced Chemical Release Evaluation System (ACRES) software to support non-real-time CW sensor data fusion. Under Phase I and II SBIRs with DTRA, in conjunction with the Edgewood Chemical Biological Center (ECBC), Torch is using the DPG ACRES CW sensor data fuser as a framework from which to develop the Cloud state Estimation in a Networked Sensor Environment (CENSE) data fusion system. Torch is currently developing CENSE to implement and test innovative real-time sensor-network-based data fusion concepts using CW and non-CW ancillary sensor data to improve CW warning and detection in tactical scenarios.
Multi-sources data fusion framework for remote triage prioritization in telehealth.
Salman, O H; Rasid, M F A; Saripan, M I; Subramaniam, S K
2014-09-01
The healthcare industry is streamlining processes to offer more timely and effective services to all patients. Computerized software algorithm and smart devices can streamline the relation between users and doctors by providing more services inside the healthcare telemonitoring systems. This paper proposes a multi-sources framework to support advanced healthcare applications. The proposed framework named Multi Sources Healthcare Architecture (MSHA) considers multi-sources: sensors (ECG, SpO2 and Blood Pressure) and text-based inputs from wireless and pervasive devices of Wireless Body Area Network. The proposed framework is used to improve the healthcare scalability efficiency by enhancing the remote triaging and remote prioritization processes for the patients. The proposed framework is also used to provide intelligent services over telemonitoring healthcare services systems by using data fusion method and prioritization technique. As telemonitoring system consists of three tiers (Sensors/ sources, Base station and Server), the simulation of the MSHA algorithm in the base station is demonstrated in this paper. The achievement of a high level of accuracy in the prioritization and triaging patients remotely, is set to be our main goal. Meanwhile, the role of multi sources data fusion in the telemonitoring healthcare services systems has been demonstrated. In addition to that, we discuss how the proposed framework can be applied in a healthcare telemonitoring scenario. Simulation results, for different symptoms relate to different emergency levels of heart chronic diseases, demonstrate the superiority of our algorithm compared with conventional algorithms in terms of classify and prioritize the patients remotely.
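The fusion-then-prioritization idea can be illustrated with a small sketch. The thresholds and scoring rule below are hypothetical illustrations, not the MSHA algorithm's actual rules, which the abstract does not specify.

```python
# Hypothetical sketch of remote triage prioritization: fuse ECG heart rate,
# SpO2, and systolic blood pressure into an emergency level by scoring each
# vital against illustrative (assumed) thresholds.

def triage_level(heart_rate, spo2, systolic_bp):
    """Return 'high', 'medium', or 'low' priority from three fused vitals."""
    score = 0
    if heart_rate < 50 or heart_rate > 120:   # bradycardia / tachycardia
        score += 1
    if spo2 < 92:                             # low oxygen saturation
        score += 1
    if systolic_bp < 90 or systolic_bp > 180: # hypotension / hypertensive crisis
        score += 1
    if score >= 2:
        return "high"
    if score == 1:
        return "medium"
    return "low"
```

A base station could sort its patient queue by this level so the server tier sees the most urgent cases first.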
Integrative Multi-Spectral Sensor Device for Far-Infrared and Visible Light Fusion
NASA Astrophysics Data System (ADS)
Qiao, Tiezhu; Chen, Lulu; Pang, Yusong; Yan, Gaowei
2018-06-01
Infrared and visible light image fusion technology has been a hot spot in multi-sensor fusion research in recent years. Existing infrared and visible light fusion technologies must register the images before fusion because they use two cameras, and the performance of registration technology still needs improvement. Hence, a novel integrative multi-spectral sensor device is proposed for infrared and visible light fusion: using a beam splitter prism, the coaxial light entering a single lens is projected onto an infrared charge coupled device (CCD) and a visible light CCD, respectively. In this paper, the imaging mechanism of the proposed sensor device is studied along with the process of signal acquisition and fusion. A simulation experiment covering the entire process of the optic system, signal acquisition, and signal fusion is constructed based on an imaging effect model, and a quality evaluation index is adopted to analyze the simulation result. The experimental results demonstrate that the proposed sensor device is effective and feasible.
Hand-writing motion tracking with vision-inertial sensor fusion: calibration and error correction.
Zhou, Shengli; Fei, Fei; Zhang, Guanglie; Liu, Yunhui; Li, Wen J
2014-08-25
The purpose of this study was to improve the accuracy of real-time ego-motion tracking through inertial sensor and vision sensor fusion. Due to the low sampling rates supported by web-based vision sensors and the accumulation of errors in inertial sensors, ego-motion tracking with vision sensors is commonly afflicted by slow updating rates, while motion tracking with inertial sensors suffers from rapid deterioration in accuracy over time. This paper starts with a discussion of the developed algorithms for calibrating two relative rotations of the system using only one reference image. Next, stochastic noises associated with the inertial sensor are identified using Allan variance analysis and modeled according to their characteristics. Finally, the proposed models are incorporated into an extended Kalman filter for inertial sensor and vision sensor fusion. Compared with results from conventional sensor fusion models, we have shown that ego-motion tracking can be greatly enhanced using the proposed error correction model.
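The complementary roles of the two sensors can be sketched in one dimension: the filter predicts at the inertial rate and corrects whenever a vision fix arrives. The noise values below are illustrative assumptions, not the paper's Allan-variance-derived models.

```python
# Minimal 1-D illustration of vision-inertial fusion with a Kalman filter:
# high-rate accelerometer data drives the prediction step, and low-rate
# vision position fixes drive the correction step.

def predict(x, v, P, accel, dt, q):
    """Propagate position x, velocity v, and scalar position variance P."""
    x = x + v * dt + 0.5 * accel * dt * dt
    v = v + accel * dt
    P = P + q            # process noise grows the uncertainty each step
    return x, v, P

def update(x, P, z, r):
    """Correct the position with a vision measurement z of variance r."""
    k = P / (P + r)      # Kalman gain: trust vision more when P >> r
    x = x + k * (z - x)
    P = (1.0 - k) * P
    return x, P
```

Running many `predict` steps between `update` calls mirrors the mismatched sampling rates: inertial drift accumulates in `P` until the next vision fix pulls the estimate back.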
Communications for unattended sensor networks
NASA Astrophysics Data System (ADS)
Nemeroff, Jay L.; Angelini, Paul; Orpilla, Mont; Garcia, Luis; DiPierro, Stefano
2004-07-01
The future model of the US Army's Future Combat Systems (FCS) and the Future Force reflects a combat force that utilizes lighter armor protection than the current standard. Survival on the future battlefield will be increased by the use of advanced situational awareness provided by unattended tactical and urban sensors that detect, identify, and track enemy targets and threats. Successful implementation of these critical sensor fields requires the development of advanced sensors, sensor and data-fusion processors, and a specialized communications network. To ensure warfighter and asset survivability, the communications must be capable of near real-time dissemination of the sensor data using robust, secure, stealthy, and jam-resistant links so that proper and decisive action can be taken. Communications will be provided to a wide array of mission-specific sensors that are capable of processing data from acoustic, magnetic, seismic, and/or Chemical, Biological, Radiological, and Nuclear (CBRN) sensors. Other, more powerful, sensor node configurations will be capable of fusing sensor data and intelligently collecting and processing image data from infrared or visual imaging cameras. The radio waveform and networking protocols being developed under the Soldier Level Integrated Communications Environment (SLICE) Soldier Radio Waveform (SRW) and the Networked Sensors for the Future Force Advanced Technology Demonstration are part of an effort to develop a common waveform family that will operate across multiple tactical domains, including dismounted soldiers, ground sensors, munitions, missiles, and robotics. These waveform technologies will ultimately be transitioned to the JTRS library, specifically the Cluster 5 requirement.
Sensor fusion V; Proceedings of the Meeting, Boston, MA, Nov. 15-17, 1992
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1992-01-01
Topics addressed include 3D object perception, human-machine interface in multisensor systems, sensor fusion architecture, fusion of multiple and distributed sensors, interface and decision models for sensor fusion, computational networks, simple sensing for complex action, multisensor-based control, and metrology and calibration of multisensor systems. Particular attention is given to controlling 3D objects by sketching 2D views, the graphical simulation and animation environment for flexible structure robots, designing robotic systems from sensorimotor modules, cylindrical object reconstruction from a sequence of images, an accurate estimation of surface properties by integrating information using Bayesian networks, an adaptive fusion model for a distributed detection system, multiple concurrent object descriptions in support of autonomous navigation, robot control with multiple sensors and heuristic knowledge, and optical array detectors for image sensors calibration. (No individual items are abstracted in this volume)
REVIEW ARTICLE: Sensor communication technology towards ambient intelligence
NASA Astrophysics Data System (ADS)
Delsing, J.; Lindgren, P.
2005-04-01
This paper is a review of the fascinating development of sensors and the communication of sensor data. A brief historical introduction is given, followed by a discussion on architectures for sensor networks. Further, realistic specifications on sensor devices suitable for ambient intelligence and ubiquitous computing are given. Based on these specifications, the status and current frontline development are discussed. In total, it is shown that future technology for ambient intelligence based on sensor and actuator devices using standardized Internet communication is within the range of possibilities within five years.
NASA Technical Reports Server (NTRS)
Mckee, James W.
1988-01-01
This final report describes the accomplishments of the General Purpose Intelligent Sensor Interface task of the Applications of Artificial Intelligence to Space Station grant for the period from October 1, 1987 through September 30, 1988. Portions of the First Biannual Report not revised will not be included but only referenced. The goal is to develop an intelligent sensor system that will simplify the design and development of expert systems using sensors of the physical phenomena as a source of data. This research will concentrate on the integration of image processing sensors and voice processing sensors with a computer designed for expert system development. The result of this research will be the design and documentation of a system in which the user will not need to be an expert in such areas as image processing algorithms, local area networks, image processor hardware selection or interfacing, television camera selection, voice recognition hardware selection, or analog signal processing. The user will be able to access data from video or voice sensors through standard LISP statements without any need to know about the sensor hardware or software.
NASA Technical Reports Server (NTRS)
Schmalzel, John L.; Morris, Jon; Turowski, Mark; Figueroa, Fernando; Oostdyk, Rebecca
2008-01-01
There are a number of architecture models for implementing Integrated Systems Health Management (ISHM) capabilities, for example, approaches based on the OSA-CBM and OSA-EAI models, or specific architectures developed in response to local needs. NASA's John C. Stennis Space Center (SSC) has developed one such version of an extensible architecture in support of rocket engine testing that integrates a palette of functions in order to achieve an ISHM capability. Among the functional capabilities supported by the framework are: prognostic models, anomaly detection, a database of supporting health information, root cause analysis, intelligent elements, and integrated awareness. This paper focuses on the role that intelligent elements can play in ISHM architectures. We define an intelligent element as a smart element with sufficient computing capacity to support anomaly detection or other algorithms in support of ISHM functions; a smart element has the capability of supporting networked implementations of the IEEE 1451.x smart sensor and actuator protocols. The ISHM group at SSC has been actively developing intelligent elements in conjunction with several partners at other Centers, universities, and companies as part of our ISHM approach for better supporting rocket engine testing. We have developed several implementations. Among the key features of these intelligent sensors are support for IEEE 1451.1 and incorporation of a suite of algorithms for determining sensor health. Regardless of the potential advantages that can be achieved using intelligent sensors, existing large-scale systems are still based on conventional sensors and data acquisition systems. In order to bring the benefits of intelligent sensors to these environments, we have also developed virtual implementations of intelligent sensors.
Bialas, Andrzej
2010-01-01
The paper discusses the security issues of intelligent sensors that are able to measure and process data and communicate with other information technology (IT) devices or systems. Such sensors are often used in high-risk applications. To improve their robustness, the sensor systems should be developed in a restricted way that provides them with assurance. One such assurance-creation methodology is the Common Criteria (ISO/IEC 15408), used for IT products and systems. The contribution of the paper is a Common Criteria-compliant and pattern-based method for intelligent sensor security development. The paper concisely presents this method and its evaluation for a sensor detecting methane in a mine, focusing on the security problem definition and solution for the intelligent sensor. The aim of the validation is to evaluate and improve the introduced method.
Spatiotemporal Local-Remote Sensor Fusion (ST-LRSF) for Cooperative Vehicle Positioning
Bhawiyuga, Adhitya
2018-01-01
Vehicle positioning plays an important role in the design of protocols, algorithms, and applications in the intelligent transport systems. In this paper, we present a new framework of spatiotemporal local-remote sensor fusion (ST-LRSF) that cooperatively improves the accuracy of absolute vehicle positioning based on two state estimates of a vehicle in the vicinity: a local sensing estimate, measured by the on-board exteroceptive sensors, and a remote sensing estimate, received from neighbor vehicles via vehicle-to-everything communications. Given both estimates of vehicle state, the ST-LRSF scheme identifies the set of vehicles in the vicinity, determines the reference vehicle state, proposes a spatiotemporal dissimilarity metric between two reference vehicle states, and presents a greedy algorithm to compute a minimal weighted matching (MWM) between them. Given the outcome of MWM, the theoretical position uncertainty of the proposed refinement algorithm is proven to be inversely proportional to the square root of matching size. To further reduce the positioning uncertainty, we also develop an extended Kalman filter model with the refined position of ST-LRSF as one of the measurement inputs. The numerical results demonstrate that the proposed ST-LRSF framework can achieve high positioning accuracy for many different scenarios of cooperative vehicle positioning. PMID:29617341
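The greedy matching step described above can be sketched as follows. The dissimilarity metric here is plain Euclidean distance between 2-D position estimates, an assumed simplification of the paper's richer spatiotemporal metric; the cutoff parameter is also illustrative.

```python
# Illustrative sketch of greedy minimal weighted matching in an ST-LRSF-style
# scheme: pair locally sensed vehicle states with remotely reported ones by
# repeatedly taking the lowest-dissimilarity unmatched pair.

def greedy_min_weight_matching(local, remote, max_dissim):
    """local, remote: lists of (x, y) position estimates.
    Returns a list of (local_index, remote_index) pairs."""
    pairs = sorted(
        (((lx - rx) ** 2 + (ly - ry) ** 2) ** 0.5, i, j)
        for i, (lx, ly) in enumerate(local)
        for j, (rx, ry) in enumerate(remote)
    )
    used_l, used_r, match = set(), set(), []
    for d, i, j in pairs:
        if d > max_dissim:
            break                          # remaining pairs are even more dissimilar
        if i not in used_l and j not in used_r:
            match.append((i, j))
            used_l.add(i)
            used_r.add(j)
    return match
```

Each matched pair identifies the same physical vehicle seen by two sources, after which the two estimates can be averaged or fed to the filter stage as a refined position.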
Advances in Multi-Sensor Information Fusion: Theory and Applications 2017.
Jin, Xue-Bo; Sun, Shuli; Wei, Hong; Yang, Feng-Bao
2018-04-11
The information fusion technique can integrate a large amount of data and knowledge representing the same real-world object and obtain a consistent, accurate, and useful representation of that object. The data may be independent or redundant, and can be obtained by different sensors at the same time or at different times. A suitable combination of investigative methods can yield substantially more information than a single sensor can provide. Multi-sensor information fusion has been a key issue in sensor research since the 1970s, and it has been applied in many fields. For example, manufacturing and process control industries can generate a lot of data with real, actionable business value, and the fusion of these data can greatly improve productivity through digitization. The goal of this special issue is to report innovative ideas and solutions for multi-sensor information fusion in the era of emerging applications, focusing on development, adoption, and applications.
Pires, Ivan Miguel; Garcia, Nuno M.; Pombo, Nuno; Flórez-Revuelta, Francisco
2016-01-01
This paper focuses on the research on the state of the art for sensor fusion techniques, applied to the sensors embedded in mobile devices, as a means to help identify the mobile device user’s daily activities. Sensor data fusion techniques are used to consolidate the data collected from several sensors, increasing the reliability of the algorithms for the identification of the different activities. However, mobile devices have several constraints, e.g., low memory, low battery life and low processing power, and some data fusion techniques are not suited to this scenario. The main purpose of this paper is to present an overview of the state of the art to identify examples of sensor data fusion techniques that can be applied to the sensors available in mobile devices aiming to identify activities of daily living (ADLs). PMID:26848664
NASA Technical Reports Server (NTRS)
Oneil, William F.
1993-01-01
The fusion of radar and electro-optic (E-O) sensor images presents unique challenges. The two sensors measure different properties of the real three-dimensional (3-D) world. Forming the sensor outputs into a common format does not mask these differences. In this paper, the conditions under which fusion of the two sensor signals is possible are explored. The program currently planned to investigate this problem is briefly discussed.
Multiple estimation channel decoupling and optimization method based on inverse system
NASA Astrophysics Data System (ADS)
Wu, Peng; Mu, Rongjun; Zhang, Xin; Deng, Yanpeng
2018-03-01
This paper addresses the intelligent autonomous navigation requirements of an intelligent deformation missile. Based on dynamics and kinematics modeling of the missile, together with navigation subsystem solution methods and error modeling, it focuses on the corresponding data fusion and decision fusion technology: the sensitive channel of the filter input is decoupled through an inverse system of the design dynamics, reducing the influence of sudden changes in the measurement information on the filter input. A series of simulation experiments verified the effectiveness of the inverse-system decoupling algorithm.
Average Throughput Performance of Myopic Policy in Energy Harvesting Wireless Sensor Networks.
Gul, Omer Melih; Demirekler, Mubeccel
2017-09-26
This paper considers a single-hop wireless sensor network where a fusion center collects data from M energy harvesting wireless sensors. The harvested energy is stored losslessly in an infinite-capacity battery at each sensor. In each time slot, the fusion center schedules K sensors for data transmission over K orthogonal channels. The fusion center does not have direct knowledge on the battery states of sensors, or the statistics of their energy harvesting processes. The fusion center only has information of the outcomes of previous transmission attempts. It is assumed that the sensors are data backlogged, there is no battery leakage and the communication is error-free. An energy harvesting sensor can transmit data to the fusion center whenever being scheduled only if it has enough energy for data transmission. We investigate average throughput of Round-Robin type myopic policy both analytically and numerically under an average reward (throughput) criterion. We show that Round-Robin type myopic policy achieves optimality for some class of energy harvesting processes although it is suboptimal for a broad class of energy harvesting processes.
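The scheduling policy described above can be sketched directly. This is a simplified illustration under the abstract's assumptions (backlogged data, lossless batteries, error-free links); the circular-order scheduling rule and unit energy cost are assumptions of the sketch.

```python
# Sketch of a Round-Robin-type myopic policy: the fusion center cycles through
# the M sensors, scheduling the next K in circular order each time slot. A
# scheduled sensor transmits only if its (hidden) battery holds >= 1 energy unit.

def round_robin_schedule(m, k, slot):
    """Indices of the K sensors scheduled in the given time slot."""
    start = (slot * k) % m
    return [(start + i) % m for i in range(k)]

def run_slot(batteries, scheduled):
    """Each scheduled sensor with energy transmits (costing 1 unit).
    Mutates `batteries`; returns the slot's throughput."""
    served = 0
    for i in scheduled:
        if batteries[i] >= 1:
            batteries[i] -= 1
            served += 1
    return served
```

The policy is myopic because it uses no battery-state or harvesting statistics, only the fixed rotation, which is why its average throughput is optimal for some harvesting processes but not for all.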
Banos, Oresti; Damas, Miguel; Pomares, Hector; Rojas, Ignacio
2012-01-01
The main objective of fusion mechanisms is to increase the individual reliability of the systems through the use of collective knowledge. Moreover, fusion models are also intended to guarantee a certain level of robustness. This is particularly required for problems such as human activity recognition, where runtime changes in the sensor setup seriously disturb the reliability of the initially deployed systems. For commonly used recognition systems based on inertial sensors, these changes are primarily characterized as sensor rotations, displacements, or faults related to the batteries or calibration. In this work we show the robustness capabilities of a sensor-weighted fusion model when dealing with such disturbances under different circumstances. Using the proposed method, an improvement of up to 60% is obtained when a minority of the sensors are artificially rotated or degraded, independent of the level of disturbance (noise) imposed. These robustness capabilities also apply for any number of sensors affected by a low to moderate noise level. The presented fusion mechanism compensates for the poor performance that would otherwise be obtained when just a single sensor is considered. PMID:22969386
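A sensor-weighted decision fusion step can be sketched as weighted voting. The reliability-weighted vote below is an assumed, generic formulation for illustration, not the paper's specific model.

```python
# Hypothetical sketch of sensor-weighted decision fusion for activity
# recognition: each sensor's classifier votes for an activity class, and votes
# are weighted by a per-sensor reliability score (e.g. validation accuracy),
# so a rotated or degraded sensor with low reliability cannot dominate.

def weighted_decision_fusion(votes, weights):
    """votes: per-sensor predicted class labels; weights: per-sensor reliability.
    Returns the class with the highest total weight."""
    scores = {}
    for label, w in zip(votes, weights):
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)
```

Lowering a disturbed sensor's weight at runtime is what lets the ensemble keep working when a minority of sensors are rotated or degraded.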
A review of physical security robotics at Sandia National Laboratories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Roerig, S.C.
1990-01-01
As an outgrowth of research into physical security technologies, Sandia is investigating the role of robotics in security systems. Robotics may allow more effective utilization of guard forces, especially in scenarios where personnel would be exposed to harmful environments. Robots can provide intrusion detection and assessment functions for failed sensors or transient assets, can test existing fixed-site sensors, and can gather additional intelligence and dispense delaying elements. The Robotic Security Vehicle (RSV) program for DOE/OSS is developing a fieldable prototype for an exterior physical security robot based upon a commercial four wheel drive vehicle. The RSV will be capable of driving itself, being driven remotely, or being driven by an onboard operator around a site, and will utilize its sensors to alert an operator to unusual conditions. The Remote Security Station (RSS) program for the Defense Nuclear Agency is developing a proof-of-principle robotic system which will be used to evaluate the role, and associated cost, of robotic technologies in exterior security systems. The RSS consists of an independent sensor pod, a mobile sensor platform, and a control and display console. Sensor data fusion is used to optimize the system's intrusion detection performance. These programs are complementary: the RSV concentrates on developing autonomous mobility, while the RSS thrust is on mobile sensor employment. 3 figs.
Assessing Intelligence Operation/Fusion/Coordination Centers for Efficiency Opportunities
2013-02-28
Wu, Chunxue; Wu, Wenliang; Wan, Caihua
2017-01-01
Sensors are increasingly used in mobile environments with wireless network connections. Multiple sensor types measure distinct aspects of the same event, and their measurements are then combined to produce integrated, reliable results. As the number of sensors in networks increases, low energy requirements and changing network connections complicate event detection and measurement. We present a data fusion scheme for use in mobile wireless sensor networks that offers high energy efficiency and low network delay while still producing reliable results. In the first phase, we used a network simulation in which mobile agents dynamically select the next-hop migration node based on the stability parameter of the link and perform the data fusion at the migration node. Agents use the fusion results to decide whether to return the fusion results to the processing center or to continue collecting more data. In the second phase, the feasibility of data fusion at the node level is confirmed by an experimental design in which fused data from color sensors show near-identical results to actual physical temperatures. These results are potentially important for new large-scale sensor network applications. PMID:29099793
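The first-phase hop selection can be sketched as scoring candidate nodes by a stability parameter. The abstract does not define that parameter, so the score below (equal weighting of residual energy and link quality) is entirely a hypothetical placeholder.

```python
# Hypothetical sketch of a mobile agent's next-hop choice: score each neighbor
# by an assumed link-stability parameter combining residual energy and link
# quality (both normalized to [0, 1]) with equal, assumed weights.

def next_hop(neighbors):
    """neighbors: list of (node_id, residual_energy, link_quality) tuples.
    Returns the id of the highest-scoring node, or None if the list is empty."""
    best_id, best_score = None, float("-inf")
    for node_id, energy, quality in neighbors:
        score = 0.5 * energy + 0.5 * quality   # assumed equal weighting
        if score > best_score:
            best_id, best_score = node_id, score
    return best_id
```

Favoring stable links keeps the agent's migration path alive long enough to fuse data at each node rather than losing partial results to dropped connections.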
Driving in traffic: short-range sensing for urban collision avoidance
NASA Astrophysics Data System (ADS)
Thorpe, Chuck E.; Duggins, David F.; Gowdy, Jay W.; MacLaughlin, Rob; Mertz, Christoph; Siegel, Mel; Suppe, Arne; Wang, Chieh-Chih; Yata, Teruko
2002-07-01
Intelligent vehicles are beginning to appear on the market, but so far their sensing and warning functions only work on the open road. Functions such as runoff-road warning or adaptive cruise control are designed for the uncluttered environments of open highways. We are working on the much more difficult problem of sensing and driver interfaces for driving in urban areas. We need to sense cars and pedestrians and curbs and fire plugs and bicycles and lamp posts; we need to predict the paths of our own vehicle and of other moving objects; and we need to decide when to issue alerts or warnings to both the driver of our own vehicle and (potentially) to nearby pedestrians. No single sensor is currently able to detect and track all relevant objects. We are working with radar, ladar, stereo vision, and a novel light-stripe range sensor. We have installed a subset of these sensors on a city bus, driving through the streets of Pittsburgh on its normal runs. We are using different kinds of data fusion for different subsets of sensors, plus a coordinating framework for mapping objects at an abstract level.
Overview of the Smart Network Element Architecture and Recent Innovations
NASA Technical Reports Server (NTRS)
Perotti, Jose M.; Mata, Carlos T.; Oostdyk, Rebecca L.
2008-01-01
In industrial environments, system operators rely on the availability and accuracy of sensors to monitor processes and detect failures of components and/or processes. The sensors must be networked in such a way that their data is reported to a central human interface, where operators are tasked with making real-time decisions based on the state of the sensors and the components that are being monitored. Incorporating health management functions at this central location aids the operator by automating the decision-making process to suggest, and sometimes perform, the action required by current operating conditions. Integrated Systems Health Management (ISHM) aims to incorporate data from many sources, including real-time and historical data and user input, and extract information and knowledge from that data to diagnose failures and predict future failures of the system. By distributing health management processing to lower levels of the architecture, less bandwidth is required for ISHM, data fusion is enhanced, systems and processes become more robust, and the resolution for detecting and isolating failures in a system, subsystem, component, or process is improved. The Smart Network Element (SNE) has been developed at NASA Kennedy Space Center to perform intelligent functions at the sensor and actuator level in support of ISHM.
NASA Astrophysics Data System (ADS)
Nagel, David J.
2000-11-01
The coordinated exploitation of modern communication, micro-sensor and computer technologies makes it possible to give global reach to our senses. Web-cameras for vision, web-microphones for hearing and web-'noses' for smelling, plus the abilities to sense many factors we cannot ordinarily perceive, are either available or will be soon. Applications include (1) determination of weather and environmental conditions on dense grids or over large areas, (2) monitoring of energy usage in buildings, (3) sensing the condition of hardware in electrical power distribution and information systems, (4) improving process control and other manufacturing, (5) development of intelligent terrestrial, marine, aeronautical and space transportation systems, (6) managing the continuum of routine security monitoring, diverse crises and military actions, and (7) medicine, notably the monitoring of the physiology and living conditions of individuals. Some of the emerging capabilities, such as the ability to measure remotely the conditions inside of people in real time, raise interesting social concerns centered on privacy issues. Methods for sensor data fusion and designs for human-computer interfaces are both crucial for the full realization of the potential of pervasive sensing. Computer-generated virtual reality, augmented with real-time sensor data, should be an effective means for presenting information from distributed sensors.
Bergamini, Elena; Ligorio, Gabriele; Summa, Aurora; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria
2014-10-09
Magnetic and inertial measurement units are an emerging technology for obtaining the 3D orientation of body segments in human movement analysis. In this respect, sensor fusion is used to limit the drift errors resulting from gyroscope data integration by exploiting accelerometer and magnetic aiding sensors. The present study aims at investigating the effectiveness of sensor fusion methods under different experimental conditions. Manual and locomotion tasks, differing in time duration, measurement volume, presence/absence of static phases, and out-of-plane movements, were performed by six subjects, and recorded by one unit located on the forearm or the lower trunk, respectively. Two sensor fusion methods, representative of stochastic (Extended Kalman Filter) and complementary (Non-linear observer) filtering, were selected, and their accuracy was assessed in terms of attitude (pitch and roll angles) and heading (yaw angle) errors using stereophotogrammetric data as a reference. The sensor fusion approaches provided significantly more accurate results than gyroscope data integration. Accuracy improved most for heading and when the movement exhibited stationary phases and evenly distributed 3D rotations, occurred in a small volume, and lasted longer than approximately 20 s. These results were independent of the specific sensor fusion method used. Practice guidelines for improving the outcome accuracy are provided.
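The complementary filtering the abstract contrasts with Kalman filtering can be illustrated with a minimal pitch estimator: the gyro term is smooth but drifts, the accelerometer term is drift-free but noisy, and a fixed blend combines them. The sampling rate, blending weight, and axis conventions below are illustrative assumptions, not the study's implementation.

```python
import math

def complementary_filter(gyro_rates, accel_samples, dt, alpha=0.98):
    """Estimate pitch (rad) by blending gyro integration (smooth, drifting)
    with the accelerometer gravity reference (noisy, drift-free).
    alpha close to 1 trusts the gyro on short time scales."""
    pitch = 0.0
    history = []
    for omega, (ax, ay, az) in zip(gyro_rates, accel_samples):
        pitch_gyro = pitch + omega * dt                        # integrate rate
        pitch_acc = math.atan2(ax, math.sqrt(ay**2 + az**2))   # gravity reference
        pitch = alpha * pitch_gyro + (1 - alpha) * pitch_acc
        history.append(pitch)
    return history
```

With a stationary sensor the estimate converges to the accelerometer's pitch at a rate set by `alpha`, which is the low-pass/high-pass split the nonlinear observer generalizes.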
Distributed multimodal data fusion for large scale wireless sensor networks
NASA Astrophysics Data System (ADS)
Ertin, Emre
2006-05-01
Sensor network technology has enabled new surveillance systems where sensor nodes equipped with processing and communication capabilities can collaboratively detect, classify and track targets of interest over a large surveillance area. In this paper we study distributed fusion of multimodal sensor data for extracting target information from a large-scale sensor network. Optimal tracking, classification, and reporting of threat events require joint consideration of multiple sensor modalities. Multiple sensor modalities improve tracking by reducing the uncertainty in the track estimates as well as resolving track-sensor data association problems. Our approach to solving the fusion problem with a large number of multimodal sensors is the construction of likelihood maps. The likelihood maps provide summary data for the solution of the detection, tracking and classification problem. The likelihood map presents the sensory information in a format that is easy for decision makers to interpret and is suitable for fusion with spatial prior information such as maps and imaging data from stand-off imaging sensors. We follow a statistical approach to combine sensor data at different levels of uncertainty and resolution. The likelihood map transforms each sensor data stream into a spatio-temporal likelihood map ideally suitable for fusion with imaging sensor outputs and prior geographic information about the scene. We also discuss distributed computation of the likelihood map using a gossip-based algorithm and present simulation results.
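The likelihood-map construction described above can be sketched for the simple case of independent range-only sensors on a discretized surveillance grid: each sensor contributes a per-cell likelihood, and fusion is a pointwise product followed by normalization. The sensor geometry and Gaussian noise model are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def sensor_likelihood(grid_x, grid_y, sensor_pos, measured_range, sigma):
    """Likelihood of a target at each grid cell given one noisy range reading."""
    d = np.hypot(grid_x - sensor_pos[0], grid_y - sensor_pos[1])
    return np.exp(-0.5 * ((d - measured_range) / sigma) ** 2)

def fuse_likelihood_maps(maps):
    """Pointwise product of per-sensor maps (independent measurements),
    normalized so the result reads like a posterior surface over the grid."""
    fused = np.ones_like(maps[0])
    for m in maps:
        fused *= m
    return fused / fused.sum()
```

Three range circles intersect only at the target, so the fused surface peaks there even though each individual map is ambiguous.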
A Distributed Artificial Intelligence Approach To Object Identification And Classification
NASA Astrophysics Data System (ADS)
Sikka, Digvijay I.; Varshney, Pramod K.; Vannicola, Vincent C.
1989-09-01
This paper presents an application of Distributed Artificial Intelligence (DAI) tools to the data fusion and classification problem. Our approach is to use a blackboard for information management and hypothesis formulation. The blackboard is used by the knowledge sources (KSs) for sharing information and posting their hypotheses, just as experts sitting around a round table would do. The present simulation performs classification of an aircraft (AC), after identifying it by its features, into disjoint sets (object classes) comprising five commercial ACs: the Boeing 747, Boeing 707, DC-10, Concorde and Boeing 727. A situation data base is characterized by experimental data available from the three levels of expert reasoning; this experimental data was provided by the Ohio State University ElectroScience Laboratory. To validate the architecture presented, we employ two KSs for modeling the sensors, the aspect angle polarization feature and the ellipticity data. The system has been implemented on a Symbolics 3645, under Genera 7.1, in Common LISP.
2011-07-01
AFRL-SA-WP-TR-2011-0006, Multiple Aptitude Normative Intelligence Testing That Distinguishes U.S. Air Force MQ-1 Predator Sensor ... The testing is fashioned after the Wechsler Adult Intelligence Scale (Ref 11), the most widely used, individually administered test of intellectual ability. Cited references include the Multidimensional Aptitude Battery-II Manual (Sigma Assessment Systems Inc., London, 2003) and Wechsler D, Wechsler Adult Intelligence Scale® – Third ...
An Architecture for Intelligent Systems Based on Smart Sensors
NASA Technical Reports Server (NTRS)
Schmalzel, John; Figueroa, Fernando; Morris, Jon; Mandayam, Shreekanth; Polikar, Robi
2004-01-01
Based on requirements for a next-generation rocket test facility, elements of a prototype Intelligent Rocket Test Facility (IRTF) have been implemented. A key component is distributed smart sensor elements integrated using a knowledgeware environment. One of the specific goals is to imbue sensors with the intelligence needed to perform self-diagnosis of health and to participate in a hierarchy of health determination at the sensor, process, and system levels. The preliminary results provide the basis for future advanced development and validation using rocket test stand facilities at Stennis Space Center (SSC). We have identified issues important to the further development of health-enabled networks, which should be of interest to others working with smart sensors and intelligent health management systems.
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1992-01-01
Various papers on control paradigms and data structures in sensor fusion are presented. The general topics addressed include: decision models and computational methods, sensor modeling and data representation, active sensing strategies, geometric planning and visualization, task-driven sensing, motion analysis, models motivated by biology and psychology, decentralized detection and distributed decision, data fusion architectures, robust estimation of shapes and features, and application and implementation. Some of the individual subjects considered are: the Firefly experiment on neural networks for distributed sensor data fusion, manifold traversing as a model for learning control of autonomous robots, choice of coordinate systems for multiple sensor fusion, continuous motion using task-directed stereo vision, interactive and cooperative sensing and control for advanced teleoperation, knowledge-based imaging for terrain analysis, and physical and digital simulations for IVA robotics.
A visual analytic framework for data fusion in investigative intelligence
NASA Astrophysics Data System (ADS)
Cai, Guoray; Gross, Geoff; Llinas, James; Hall, David
2014-05-01
Intelligence analysis depends on data fusion systems to provide capabilities for detecting and tracking important objects, events, and their relationships in connection with an analytical situation. However, automated data fusion technologies are not mature enough to offer reliable and trustworthy information for situation awareness. Given the trend of increasing sophistication of data fusion algorithms and the loss of transparency in the data fusion process, analysts are left out of the data fusion process cycle with little to no control over, and little confidence in, the data fusion outcome. Following the recent rethinking of data fusion as a human-centered process, this paper proposes a conceptual framework toward developing an alternative data fusion architecture. This idea is inspired by recent advances in our understanding of human cognitive systems, the science of visual analytics, and the latest thinking about human-centered data fusion. Our conceptual framework is supported by an analysis of the limitations of existing fully automated data fusion systems, where the effectiveness of important algorithmic decisions depends on the availability of expert knowledge or knowledge of the analyst's mental state in an investigation. The success of this effort will result in next-generation data fusion systems that can be better trusted while maintaining high throughput.
Chen, Te; Chen, Long; Xu, Xing; Cai, Yingfeng; Jiang, Haobin; Sun, Xiaoqiang
2018-04-20
Exact estimation of longitudinal force and sideslip angle is important for lateral stability and path-following control of four-wheel independently driven electric vehicles. This paper presents an effective method for longitudinal force and sideslip angle estimation by observer iteration and information fusion for four-wheel independent drive electric vehicles. The electric driving wheel model is introduced into the vehicle modeling process and used for longitudinal force estimation; the longitudinal force reconstruction equation is obtained via model decoupling; a Luenberger observer and a high-order sliding mode observer are united for longitudinal force observer design; and a Kalman filter is applied to restrain the influence of noise. Using the estimated longitudinal force, an estimation strategy is then proposed based on observer iteration and information fusion, in which the Luenberger observer is applied to achieve a transcendental estimation using fewer sensor measurements, the extended Kalman filter is used for a posteriori estimation with higher accuracy, and a fuzzy weight controller is used to enhance the adaptive ability of the observer system. Simulations and experiments are carried out, and the effectiveness of the proposed estimation method is verified.
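One building block named in the abstract, the Luenberger observer, reconstructs unmeasured states by injecting the output error through a gain chosen so that the error dynamics (A - LC) are stable. The double-integrator model and gain below are illustrative assumptions, not the paper's vehicle model.

```python
import numpy as np

def luenberger_observer(A, B, C, L, u_seq, y_seq, x0_hat):
    """Discrete-time Luenberger observer:
    x_hat[k+1] = A x_hat[k] + B u[k] + L (y[k] - C x_hat[k]).
    The gain L places the eigenvalues of the error dynamics (A - L C)."""
    x_hat = np.asarray(x0_hat, dtype=float)
    estimates = []
    for u, y in zip(u_seq, y_seq):
        innovation = y - C @ x_hat          # measured minus predicted output
        x_hat = A @ x_hat + B * u + L * innovation
        estimates.append(x_hat.copy())
    return estimates
```

With only position measured, the observer recovers the unmeasured velocity once the error dynamics settle; the same mechanism underlies the paper's longitudinal-force reconstruction.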
Chen, Long; Xu, Xing; Cai, Yingfeng; Jiang, Haobin; Sun, Xiaoqiang
2018-01-01
Exact estimation of longitudinal force and sideslip angle is important for lateral stability and path-following control of four-wheel independently driven electric vehicles. This paper presents an effective method for longitudinal force and sideslip angle estimation by observer iteration and information fusion for four-wheel independent drive electric vehicles. The electric driving wheel model is introduced into the vehicle modeling process and used for longitudinal force estimation; the longitudinal force reconstruction equation is obtained via model decoupling; a Luenberger observer and a high-order sliding mode observer are united for longitudinal force observer design; and a Kalman filter is applied to restrain the influence of noise. Using the estimated longitudinal force, an estimation strategy is then proposed based on observer iteration and information fusion, in which the Luenberger observer is applied to achieve a transcendental estimation using fewer sensor measurements, the extended Kalman filter is used for a posteriori estimation with higher accuracy, and a fuzzy weight controller is used to enhance the adaptive ability of the observer system. Simulations and experiments are carried out, and the effectiveness of the proposed estimation method is verified. PMID:29677124
Cooperative angle-only orbit initialization via fusion of admissible areas
NASA Astrophysics Data System (ADS)
Jia, Bin; Pham, Khanh; Blasch, Erik; Chen, Genshe; Shen, Dan; Wang, Zhonghai
2017-05-01
For the short-arc, angle-only orbit initialization problem, the admissible area is often used. However, the accuracy achievable with a single sensor is often limited, and for high-value space objects more accurate results are desired. Fortunately, multiple sensors dedicated to space situational awareness are available. The work in this paper uses information from multiple sensors to cooperatively initialize the orbit based on the fusion of multiple admissible areas. Both centralized and decentralized fusion are discussed. Simulation results verify the expectation that orbit initialization accuracy is improved by using information from multiple sensors.
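A minimal sketch of centralized fusion of admissible areas: treat each sensor's admissible area as a constraint set over a discretized (range, range-rate) grid and intersect them, since a candidate orbit must be admissible to every sensor. The grid bounds and the constraint functions are illustrative assumptions; in practice the constraints come from energy and eccentricity bounds rather than simple thresholds.

```python
import numpy as np

def admissible_mask(rho, rho_dot, constraint):
    """Boolean mask of (range, range-rate) cells one sensor deems admissible."""
    return constraint(rho, rho_dot)

def fuse_admissible_areas(masks):
    """Centralized fusion: the fused admissible area is the cell-wise
    intersection of every sensor's mask."""
    fused = masks[0].copy()
    for m in masks[1:]:
        fused = fused & m
    return fused
```

Each additional sensor can only shrink the fused area, which is how multi-sensor fusion tightens the initial orbit estimate.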
INL Control System Situational Awareness Technology Final Report 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gordon Rueff; Bryce Wheeler; Todd Vollmer
The Situational Awareness project is a comprehensive undertaking of Idaho National Laboratory (INL) in an effort to produce technologies capable of defending the country’s energy sector infrastructure from cyber attack. INL has addressed this challenge through research and development of an interoperable suite of tools that safeguard critical energy sector infrastructure. The technologies in this project include the Sophia Tool, Mesh Mapper (MM) Tool, Intelligent Cyber Sensor (ICS) Tool, and Data Fusion Tool (DFT). Each is designed to function effectively on its own, or they can be integrated in a variety of customized configurations based on the end user’s risk profile and security needs.
Discrete shaped strain sensors for intelligent structures
NASA Technical Reports Server (NTRS)
Andersson, Mark S.; Crawley, Edward F.
1992-01-01
Design of discrete, highly distributed sensor systems for intelligent structures has been studied. Data obtained indicate that discrete strain-averaging sensors satisfy the functional requirements for distributed sensing of intelligent structures. Bartlett and Gauss-Hanning sensors, in particular, provide good wavenumber characteristics while meeting the functional requirements. They are characterized by good rolloff rates and positive Fourier transforms for all wavenumbers. For the numerical integration schemes, Simpson's rule is considered to be very simple to implement and consistently provides accurate results for five sensors or more. It is shown that a sensor system that satisfies the functional requirements can be applied to a structure that supports mode shapes with purely sinusoidal curvature.
Wireless Monitoring of Automobile Tires for Intelligent Tires
Matsuzaki, Ryosuke; Todoroki, Akira
2008-01-01
This review discusses key technologies of intelligent tires focusing on sensors and wireless data transmission. Intelligent automobile tires, which monitor their pressure, deformation, wheel loading, friction, or tread wear, are expected to improve the reliability of tires and tire control systems. However, in installing sensors in a tire, many problems have to be considered, such as compatibility of the sensors with tire rubber, wireless transmission, and battery installments. As regards sensing, this review discusses indirect methods using existing sensors, such as that for wheel speed, and direct methods, such as surface acoustic wave sensors and piezoelectric sensors. For wireless transmission, passive wireless methods and energy harvesting are also discussed. PMID:27873979
Interagency and Multinational Information Sharing Architecture and Solutions (IMISAS) Project
2012-02-01
The report cites the Defense (DOD) Enterprise Unclassified Information Sharing Service (August 10, 2010) and Lindenmayer, Martin J., Civil Information and Intelligence Fusion: Making "Non-Traditional" into "New Traditional" ... The proposed architecture was perceived as a good start which needs more development. References: Badke-Schaub, Petra; Hofinger, Gesine; Lauche ...
Secure, Autonomous, Intelligent Controller for Integrating Distributed Sensor Webs
NASA Technical Reports Server (NTRS)
Ivancic, William D.
2007-01-01
This paper describes the infrastructure and protocols necessary to enable near-real-time commanding, access to space-based assets, and secure interoperation between sensor webs owned and controlled by various entities. Select terrestrial and aeronautics-based sensor webs will be used to demonstrate time-critical interoperability among integrated, intelligent sensor webs, both between terrestrial assets and between terrestrial and space-based assets. For this work, a Secure, Autonomous, Intelligent Controller and knowledge generation unit is implemented using Virtual Mission Operation Center technology.
Recent progress in distributed optical fiber Raman photon sensors at China Jiliang University
NASA Astrophysics Data System (ADS)
Zhang, Zaixuan; Wang, Jianfeng; Li, Yi; Gong, Huaping; Yu, Xiangdong; Liu, Honglin; Jin, Yongxing; Kang, Juan; Li, Chenxia; Zhang, Wensheng; Zhang, Wenping; Niu, Xiaohui; Sun, Zhongzhou; Zhao, Chunliu; Dong, Xinyong; Jin, Shangzhong
2012-06-01
A brief review of recent progress in research, production and applications of fully distributed fiber Raman photon sensors at China Jiliang University (CJLU) is presented. In order to improve the measurement distance, the accuracy, the spatial resolution, the ability to measure multiple parameters, and the intelligence of fully distributed fiber sensor systems, a new-generation fiber sensor technology based on the optical fiber nonlinear scattering fusion principle is proposed. A series of new-generation fully distributed fiber sensors are investigated and designed, which consist of new-generation ultra-long-distance fully distributed fiber Raman and Rayleigh scattering photon sensors integrated with a fiber Raman amplifier, auto-correction fully distributed fiber Raman photon temperature sensors based on Raman correlation dual sources, fully distributed fiber Raman photon temperature sensors based on a pulse-coded source, fully distributed fiber Raman photon temperature sensors using a fiber Raman wavelength shifter, a new type of Brillouin optical time domain analyzer (BOTDA) integrated with a fiber Raman amplifier replacing a fiber Brillouin amplifier, fully distributed fiber Raman and Brillouin photon sensors integrated with a fiber Raman amplifier, and fully distributed fiber Brillouin photon sensors integrated with a fiber Brillouin frequency shifter. The Internet of Things is regarded as one candidate for the next technological revolution and has driven hundred-million-class markets. Sensor networks are important components of the Internet of Things. The fully distributed optical fiber sensor network (Rayleigh, Raman, and Brillouin scattering) is a 3S (smart materials, smart structure, and smart skill) system, from which it is easy to construct smart fiber sensor networks.
The distributed optical fiber sensor can be embedded in the power grids, railways, bridges, tunnels, roads, constructions, water supply systems, dams, oil and gas pipelines and other facilities, and can be integrated with wireless networks.
Multisensor data fusion for IED threat detection
NASA Astrophysics Data System (ADS)
Mees, Wim; Heremans, Roel
2012-10-01
In this paper we present the multi-sensor registration and fusion algorithms that were developed for a force protection research project in order to detect threats against military patrol vehicles. The fusion is performed at the object level, using a hierarchical evidence aggregation approach. The first level uses expert domain knowledge about the features that characterize the detected threats, implemented in the form of a fuzzy expert system. The next level consists of fusing intra-sensor and inter-sensor information; here an ordered weighted averaging operator is used. The object-level fusion between candidate threats that are detected asynchronously on a moving vehicle by sensors with different imaging geometries requires an accurate sensor-to-world coordinate transformation. This image registration is also discussed in this paper.
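The ordered weighted averaging (OWA) operator used at the inter-sensor fusion level can be sketched in a few lines: weights attach to ranks rather than to sources, so the same weight vector can express anything from a max (optimistic) to a min (pessimistic) aggregation. The example weights below are illustrative, not the paper's tuning.

```python
def owa(scores, weights):
    """Ordered weighted averaging: sort the inputs in descending order,
    then take the weighted sum, so weights apply to ranks, not sources."""
    assert len(weights) == len(scores)
    assert abs(sum(weights) - 1.0) < 1e-9
    ordered = sorted(scores, reverse=True)
    return sum(w * s for w, s in zip(weights, ordered))
```

For example, the weight vector `[0.5, 0.3, 0.2]` emphasizes the strongest evidence regardless of which sensor produced it, while uniform weights reduce OWA to the plain mean.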
Miniature fiber Fabry-Perot sensors based on fusion splicing
NASA Astrophysics Data System (ADS)
Zhu, Jia-li; Wang, Ming; Yang, Chun-di; Wang, Ting-ting
2013-03-01
Fiber-optic Fabry-Perot (F-P) sensors are widely investigated because they have several advantages over conventional sensors, such as immunity to electromagnetic interference, the ability to operate in harsh environments, high sensitivity, and the potential for multiplexing. A new method to fabricate a micro-cavity Fabry-Perot interferometer is introduced: fusion splicing a section of conventional single-mode fiber (SMF) and a section of hollow-core or solid-core photonic crystal fiber (PCF) together to form a micro-cavity at the splice joint. The technology of fusion splicing is discussed, and two miniature optical fiber sensors based on Fabry-Perot interference using fusion splicing are presented. The two sensors are completely made of fused silica and have good high-temperature capability.
A Fusion Architecture for Tracking a Group of People Using a Distributed Sensor Network
2013-07-01
... ultrasonic sensors. Determining the composition of the group is done using several classifiers. The fusion is done at the UGS level to fuse information from all the modalities prior to classification and counting of the targets. Section III also presents the algorithms for fusion of distributed sensor data at the UGS level.
Deep data fusion method for missile-borne inertial/celestial system
NASA Astrophysics Data System (ADS)
Zhang, Chunxi; Chen, Xiaofei; Lu, Jiazhen; Zhang, Hao
2018-05-01
Strap-down inertial-celestial integrated navigation systems have the advantages of autonomy and high precision and are very useful for ballistic missiles. The star sensor installation error and inertial measurement error have a great influence on system performance. Based on deep data fusion, this paper establishes measurement equations that include the star sensor installation error and proposes a deep fusion filter method. Simulations including misalignment error, star sensor installation error, and IMU error are analyzed. Simulation results indicate that the deep fusion method can estimate the star sensor installation error and IMU error; meanwhile, the method can restrain the misalignment errors caused by instrument errors.
NASA Technical Reports Server (NTRS)
Bandhil, Pavan; Chitikeshi, Sanjeevi; Mahajan, Ajay; Figueroa, Fernando
2005-01-01
This paper proposes the development of intelligent sensors as part of an integrated systems approach, i.e. one treats the sensors as a complete system with its own sensing hardware (the traditional sensor), A/D converters, processing and storage capabilities, software drivers, self-assessment algorithms, communication protocols and evolutionary methodologies that allow them to get better with time. Under a project being undertaken at NASA's Stennis Space Center, an integrated framework is being developed for the intelligent monitoring of smart elements. These smart elements can be sensors, actuators or other devices. The immediate application is the monitoring of the rocket test stands, but the technology should be generally applicable to the Integrated Systems Health Monitoring (ISHM) vision. This paper outlines progress made in the development of intelligent sensors by describing the work done to date on Physical Intelligent Sensors (PIS). The PIS discussed here consists of a thermocouple used to read temperature in analog form, which is then converted into digital values. A microprocessor collects the sensor readings and runs numerous embedded event detection routines on the collected data, and if any event is detected, it is reported, stored and sent to a remote system through an Ethernet connection. Hence the output of the PIS is data coupled with a confidence factor in the reliability of the data, which leads to information on the health of the sensor at all times. All protocols are consistent with IEEE 1451.X standards. This work lays the foundation for the next generation of smart devices that have embedded intelligence for distributed decision-making capabilities.
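The PIS concept of data coupled with a confidence factor can be sketched by reducing the embedded event-detection routines to a single rolling spike check. The window size, z-score threshold, and confidence formula are assumptions made for illustration, not the implementation described in the paper.

```python
from collections import deque
import statistics

class PhysicalIntelligentSensor:
    """Sketch of the PIS idea: wrap each raw temperature reading with an
    embedded spike check and report the value plus a confidence factor."""
    def __init__(self, window=10, z_limit=3.0):
        self.history = deque(maxlen=window)   # recent readings
        self.z_limit = z_limit                # spike threshold in std devs
    def read(self, value):
        confidence, event = 1.0, None
        if len(self.history) >= 5:            # need a baseline first
            mean = statistics.mean(self.history)
            std = statistics.pstdev(self.history) or 1e-9
            z = abs(value - mean) / std
            if z > self.z_limit:              # reading far outside baseline
                event = "spike"
                confidence = max(0.0, 1.0 - z / (2 * self.z_limit))
        self.history.append(value)
        return {"value": value, "confidence": confidence, "event": event}
```

The returned record is the "data plus confidence factor" the abstract describes: a downstream ISHM layer can discount or flag low-confidence readings instead of consuming them blindly.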
Fan, Bingfei; Li, Qingguo; Liu, Tao
2017-12-28
With the advancements in micro-electromechanical systems (MEMS) technologies, magnetic and inertial sensors are becoming more accurate, lightweight, smaller in size and lower in cost, which in turn boosts their applications in human movement analysis. However, challenges still exist in the field of sensor orientation estimation, where magnetic disturbance represents one of the obstacles limiting their practical application. The objective of this paper is to systematically analyze exactly how magnetic disturbance affects attitude and heading estimation for a magnetic and inertial sensor. First, we reviewed four major components dealing with magnetic disturbance, namely decoupling attitude estimation from the magnetic reading, gyro bias estimation, adaptive strategies for compensating magnetic disturbance, and sensor fusion algorithms, and analyzed the features of existing methods for each component. Second, to understand each component in magnetic disturbance rejection, four representative sensor fusion methods were implemented: the gradient descent algorithm, the improved explicit complementary filter, the dual-linear Kalman filter and the extended Kalman filter. Finally, a new standardized testing procedure was developed to objectively assess the performance of each method against magnetic disturbance. Based upon the testing results, the strengths and weaknesses of the existing sensor fusion methods were easily examined, and suggestions were presented for selecting a proper sensor fusion algorithm or developing new sensor fusion methods.
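The first component reviewed, decoupling attitude estimation from the magnetic reading, can be sketched as follows: roll and pitch come from the accelerometer alone, and the magnetometer contributes only a tilt-compensated heading, so a magnetic disturbance cannot corrupt attitude. The axis and sign conventions below are assumptions; real devices vary.

```python
import math

def attitude_from_accel(ax, ay, az):
    """Roll and pitch (rad) from the gravity vector alone, so magnetic
    disturbance has no path into the attitude estimate."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.hypot(ay, az))
    return roll, pitch

def heading_from_mag(mx, my, mz, roll, pitch):
    """Tilt-compensated heading: rotate the magnetometer reading into the
    horizontal plane using the attitude, then take the planar angle."""
    xh = mx * math.cos(pitch) + mz * math.sin(pitch)
    yh = (mx * math.sin(roll) * math.sin(pitch) + my * math.cos(roll)
          - mz * math.sin(roll) * math.cos(pitch))
    return math.atan2(-yh, xh)
```

With this split, a disturbed magnetometer can only bias the yaw angle; pitch and roll remain anchored to gravity, which is exactly the decoupling property the review examines.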
Staggered scheduling of sensor estimation and fusion for tracking over long-haul links
Liu, Qiang; Rao, Nageswara S. V.; Wang, Xin
2016-08-01
Networked sensing can be found in a multitude of real-world applications. Here, we focus on communication- and computation-constrained long-haul sensor networks, where sensors are remotely deployed over a vast geographical area to perform certain tasks. Of special interest is a class of such networks where sensors take measurements of one or more dynamic targets and send their state estimates to a remote fusion center via long-haul satellite links. The severe loss and delay over such links can easily reduce the amount of sensor data received by the fusion center, thereby limiting the potential information fusion gain and resulting in suboptimal tracking performance. In this paper, starting with the temporal-domain staggered estimation for an individual sensor, we explore the impact of the so-called intra-state prediction and retrodiction on estimation errors. We then investigate the effect of such estimation scheduling across different sensors on the spatial-domain fusion performance, where the sensing time epochs across sensors are scheduled in an asynchronous and staggered manner. In particular, the impact of communication delay and loss as well as sensor bias on such scheduling is explored by means of numerical and simulation studies that demonstrate the validity of our analysis.
NASA Astrophysics Data System (ADS)
Leal-Junior, Arnaldo G.; Vargas-Valencia, Laura; dos Santos, Wilian M.; Schneider, Felipe B. A.; Siqueira, Adriano A. G.; Pontes, Maria José; Frizera, Anselmo
2018-07-01
This paper presents a low cost and highly reliable system for angle measurement based on a sensor fusion between inertial and fiber optic sensors. The system consists of the sensor fusion through Kalman filter of two inertial measurement units (IMUs) and an intensity variation-based polymer optical fiber (POF) curvature sensor. In addition, the IMU was applied as a reference for a compensation technique of POF curvature sensor hysteresis. The proposed system was applied on the knee angle measurement of a lower limb exoskeleton in flexion/extension cycles and in gait analysis. Results show the accuracy of the system, where the Root Mean Square Error (RMSE) between the POF-IMU sensor system and the encoder was below 4° in the worst case and about 1° in the best case. Then, the POF-IMU sensor system was evaluated as a wearable sensor for knee joint angle assessment without the exoskeleton, where its suitability for this purpose was demonstrated. The results obtained in this paper pave the way for future applications of sensor fusion between electronic and fiber optic sensors in movement analysis.
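The POF-IMU fusion described above can be reduced to a scalar predict/correct sketch: integrate the IMU angular rate to predict the joint angle, then correct with the POF angle reading. The process and measurement noise parameters are illustrative assumptions, not the paper's Kalman filter tuning, and the hysteresis compensation is omitted.

```python
class AngleFusionKF:
    """Scalar Kalman filter: predict the joint angle by integrating an IMU
    angular rate, correct with a POF curvature-sensor angle reading."""
    def __init__(self, q=0.01, r=4.0, angle0=0.0, p0=10.0):
        self.x, self.p, self.q, self.r = angle0, p0, q, r
    def step(self, rate, dt, pof_angle):
        # predict with the IMU rate (q accounts for rate/integration error)
        self.x += rate * dt
        self.p += self.q
        # correct with the POF angle (r models its noise and hysteresis)
        k = self.p / (self.p + self.r)
        self.x += k * (pof_angle - self.x)
        self.p *= (1 - k)
        return self.x
```

The gain settles to a small steady-state value, so the drift-free but noisy POF reading slowly anchors the smooth gyro-integrated prediction, which is the same division of labor as in the paper's POF-IMU system.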
2013-09-01
Office of the Inspector General OSINT Open Source Intelligence PPD Presidential Policy Directive SIGINT Signals Intelligence SLFC State/Local Fusion...Geospatial Intelligence (GEOINT) from Geographic Information Systems (GIS), and Open Source Intelligence ( OSINT ) from Social Media. GIS is widely...and monitor make it a feasible tool to capitalize on for OSINT . A formalized EM intelligence process would help expedite the processing of such
Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter
Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Gu, Chengfan
2018-01-01
This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as the local filter to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate the globally optimal state estimation by fusion of local estimations. The proposed methodology effectively suppresses the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation. PMID:29415509
Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter.
Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Zhong, Yongmin; Gu, Chengfan
2018-02-06
This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as local filters to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate globally optimal state estimation by fusion of local estimations. The proposed methodology effectively suppresses the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation.
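Under simplifying assumptions (scalar states with mutually uncorrelated local estimation errors), the top-level fusion by the principle of linear minimum variance reduces to inverse-variance weighting of the local estimates. A minimal sketch of that reduced case, not the paper's unscented-transformation-based implementation:

```python
def lmv_fuse(estimates, variances):
    """Fuse N local state estimates by linear minimum variance.

    Assumes scalar states and mutually uncorrelated estimation errors,
    in which case the optimal weights are the inverse variances."""
    inv = [1.0 / v for v in variances]
    s = sum(inv)
    fused = sum(w * x for w, x in zip(inv, estimates)) / s
    fused_var = 1.0 / s   # never worse than the best local variance
    return fused, fused_var
```

For example, two equally uncertain local estimates are averaged, and the fused variance is half of either local variance, which is the sense in which the global estimate improves on every local filter.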
NASA Astrophysics Data System (ADS)
Ward, Dennis W.; Bennett, Kelly W.
2017-05-01
The Sensor Information Testbed COllaborative Research Environment (SITCORE) and the Automated Online Data Repository (AODR) are significant enablers of the U.S. Army Research Laboratory (ARL)'s Open Campus Initiative and together create a highly collaborative research laboratory and testbed environment focused on sensor data and information fusion. SITCORE creates a virtual research and development environment allowing collaboration from other locations, including DoD, industry, academia, and coalition facilities. SITCORE combined with AODR provides end-to-end algorithm development, experimentation, demonstration, and validation. The AODR enterprise allows ARL, as well as other government organizations, industry, and academia, to store and disseminate multiple-intelligence (Multi-INT) datasets collected at field exercises and demonstrations, and to facilitate research and development (R&D) and the advancement of analytical tools and algorithms supporting the Intelligence, Surveillance, and Reconnaissance (ISR) community. The AODR provides a potential central repository for standards-compliant datasets to serve as the "go-to" location for lessons learned and reference products. Many of the AODR datasets have associated ground truth and other metadata, which provides a rich and robust data suite for researchers to develop, test, and refine their algorithms. Researchers download the test data to their own environments using a sophisticated web interface. The AODR allows researchers to request copies of stored datasets and the government to process the requests and approvals in an automated fashion. Access to the AODR requires two-factor authentication in the form of a Common Access Card (CAC) or External Certificate Authority (ECA).
NASA Astrophysics Data System (ADS)
Arnhardt, C.; Fernández-Steeger, T. M.; Azzam, R.
2009-04-01
In most mountainous regions, landslides represent a major threat to human life, property and infrastructure. Existing landslide monitoring systems are often characterized by high costs of purchase, installation, maintenance, manpower and material, and (partly as a consequence) only small areas or selected points of the endangered zone can be observed. The improvement of existing monitoring and warning systems, and the development of new ones, is therefore highly relevant. The joint project "Sensor based Landslide Early Warning Systems" (SLEWS) deals with the development of a prototype alarm and early warning system (EWS) for different types of landslides using low-cost micro-sensors (MEMS) integrated in a wireless sensor network (WSN). Modern so-called ad-hoc, multi-hop wireless sensor networks are characterized by the self-organizing and self-healing capacity of the system (autonomous systems). The network consists of numerous individual sensor nodes, each with its own energy supply, that can send data packages from their measuring devices (here: MEMS) over other nodes (multi-hop) to a collection point (gateway). The gateway provides the interface to central processing and data retrieval units (PC, laptop or server) outside the network. In order to detect and monitor the different landslide processes (such as fall, topple, spreading or sliding), 3D MEMS capacitive sensors made from single silicon crystals and glass were chosen to measure acceleration, tilting and altitude changes. Based on MEMS (Micro-Electro-Mechanical Systems) technology, the sensors combine very small mechanical and electronic units, sensing elements and transducers on a small microchip. The mass production of this type of sensor allows low-cost applications in different areas (such as the automotive industry, medicine, and automation technology).
Apart from their small, space-saving size and low cost, another advantage is the energy efficiency that permits measurements over a long period of time. A special sensor board that accommodates the measuring sensors and the node of the WSN was developed. The standardized interfaces of the measuring sensors permit easy interaction with the node and thus enable uncomplicated data transfer to the gateway. The 3-axis acceleration sensor (measuring range: +/- 2 g), the 2-axis inclination sensor (measuring range: +/- 30°) for measuring tilt, and the barometric pressure sensor (measuring range: 30 kPa - 120 kPa) for measuring sub-meter height changes (altimeter) are currently integrated into the sensor network and are tested in realistic experiments. In addition, sensor nodes with precise potentiometric displacement and linear magnetostrictive position transducers are used for extension and convergence measurements. Within the accuracy of the first test stations, the experimental results showed that the selected sensors meet the requirement profile, as the stability is satisfactory and the spread of the data is quite low. The sensor boards developed so far can therefore be tested in a larger sensor-network environment. In order to obtain more detailed information about accuracy, experiments in a new, more precise test bed and tests with different sampling rates will follow. Another increasingly important aspect for the future is the fusion of sensor data (i.e., combination and comparison) to identify malfunctions and to reduce false alarm rates while increasing data quality at the same time. The correlation of different (complementary sensor fusion) but also identical sensor types (redundant sensor fusion) permits validation of the measuring data. The development of special algorithms allows, in a further step, the data from all nodes of the network to be analyzed and evaluated together (sensor node fusion).
Sensor fusion contributes to the decision making of alarm and early warning systems and allows a better interpretation of the data. The network data are processed outside the network in a service-oriented spatial data infrastructure (SDI) by standardized OGC (Open Geospatial Consortium) conformant services and visualized according to the requirements of the end user. The modular setup of the hardware, combined with standardized interfaces and open services for data processing, allows easy adaptation to, or integration into, existing solutions and other networks. The monitoring system described here is characterized by a very flexible structure, cost efficiency and a high fail-safe level. The application of WSNs in combination with MEMS provides an inexpensive, easy-to-set-up and intelligent monitoring system for spatial data gathering over large areas.
A Comprehensive Fusion Liaison Officer Program: The Arizona Model
2015-03-01
Office of Intelligence and Analysis, Office of Intelligence and Analysis Strategic Plan Fiscal Year 2011–Fiscal Year 2018 (Washington, DC: U.S... needs. The second chapter will provide a historical perspective to the reader on the creation of the post-9/11 city of Phoenix's Liaison Officer... fusion centers' benefit to address baseline capabilities and further benefit their home agencies. Chapter VI provides the reader recommendations and
A system for activity recognition using multi-sensor fusion.
Gao, Lei; Bourke, Alan K; Nelson, John
2011-01-01
This paper proposes a system for activity recognition using multi-sensor fusion. In this system, four sensors are attached to the waist, chest, thigh, and side of the body. In the study we present solutions for two factors that affect activity recognition accuracy: calibration drift and changes in sensor orientation. The datasets used to evaluate this system were collected from 8 subjects who were asked to perform 8 scripted normal activities of daily living (ADL), three times each. A Naïve Bayes classifier using multi-sensor fusion is adopted and achieves 70.88%-97.66% recognition accuracy for 1-4 sensors.
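A minimal Gaussian naive Bayes over a concatenated multi-sensor feature vector gives the flavor of the classifier the abstract names; the feature layout, class labels and variance floor below are invented for illustration and are not the paper's actual feature set.

```python
import math
from collections import defaultdict

class GaussianNB:
    """Minimal Gaussian naive Bayes over a concatenated feature vector
    (e.g. one block of features per body-worn sensor)."""

    def fit(self, X, y):
        groups = defaultdict(list)
        for xi, yi in zip(X, y):
            groups[yi].append(xi)
        self.stats, self.priors = {}, {}
        n = len(X)
        for c, rows in groups.items():
            cols = list(zip(*rows))
            means = [sum(col) / len(col) for col in cols]
            # small floor keeps the variance strictly positive
            varis = [sum((v - m) ** 2 for v in col) / len(col) + 1e-6
                     for col, m in zip(cols, means)]
            self.stats[c] = (means, varis)
            self.priors[c] = len(rows) / n
        return self

    def predict(self, x):
        best, best_lp = None, -math.inf
        for c, (means, varis) in self.stats.items():
            lp = math.log(self.priors[c])        # log prior
            for v, m, s2 in zip(x, means, varis):
                # log of the Gaussian likelihood for each feature
                lp += -0.5 * (math.log(2 * math.pi * s2) + (v - m) ** 2 / s2)
            if lp > best_lp:
                best, best_lp = c, lp
        return best
```

Fusing more sensors simply lengthens the feature vector, which is one reason naive Bayes scales so easily from 1 to 4 sensors in studies like this one.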
NASA Astrophysics Data System (ADS)
Maimaitijiang, Maitiniyazi; Ghulam, Abduwasit; Sidike, Paheding; Hartling, Sean; Maimaitiyiming, Matthew; Peterson, Kyle; Shavers, Ethan; Fishman, Jack; Peterson, Jim; Kadam, Suhas; Burken, Joel; Fritschi, Felix
2017-12-01
Estimating crop biophysical and biochemical parameters with high accuracy at low cost is imperative for high-throughput phenotyping in precision agriculture. Although fusion of data from multiple sensors is a common application in remote sensing, less is known about the contribution of low-cost RGB, multispectral and thermal sensors to rapid crop phenotyping. This is due to the fact that (1) simultaneous collection of multi-sensor data using satellites is rare and (2) multi-sensor data collected during a single flight have not been accessible until recent developments in Unmanned Aerial Systems (UASs) and UAS-friendly sensors that allow efficient information fusion. The objective of this study was to evaluate the power of high-spatial-resolution RGB, multispectral and thermal data fusion to estimate soybean (Glycine max) biochemical parameters, including chlorophyll content and nitrogen concentration, and biophysical parameters, including Leaf Area Index (LAI) and above-ground fresh and dry biomass. Multiple low-cost sensors integrated on UASs were used to collect RGB, multispectral, and thermal images throughout the growing season at a site established near Columbia, Missouri, USA. From these images, vegetation indices were extracted, a Crop Surface Model (CSM) was generated, and a model to extract the vegetation fraction was developed. Spectral indices/features were then combined to model and predict crop biophysical and biochemical parameters using Partial Least Squares Regression (PLSR), Support Vector Regression (SVR), and Extreme Learning Machine based Regression (ELR) techniques.
Results showed that: (1) for biochemical variable estimation, multispectral and thermal data fusion provided the best estimates for nitrogen concentration and chlorophyll (Chl) a content (RMSE of 9.9% and 17.1%, respectively), while fusion of RGB color-information-based indices and multispectral data exhibited the largest RMSE (22.6%); the highest accuracy for Chl a + b content estimation was obtained by fusing information from all three sensors, with an RMSE of 11.6%. (2) Among the plant biophysical variables, LAI was best predicted by RGB and thermal data fusion, while multispectral and thermal data fusion was found to be best for biomass estimation. (3) For estimation of the above-mentioned plant traits of soybean from multi-sensor data fusion, ELR yields promising results compared to PLSR and SVR in this study. This research indicates that fusion of low-cost multi-sensor data within a machine learning framework can provide relatively accurate estimates of plant traits and valuable insight for high-precision agriculture and plant stress assessment.
Intelligence Fusion for Combined Operations
1994-06-03
Database ISE - Intelligence Support Element JASMIN - Joint Analysis System for Military Intelligence JIC - Joint Intelligence Center JDISS - Joint Defense... has made accessible otherwise inaccessible networks such as connectivity to the German Joint Analysis System for Military Intelligence (JASMIN) and the... successfully any mission in the Battlespace is the essence of the C4I for the Warrior concept." It recognizes that the current C4I systems do not
Biology-inspired Architecture for Situation Management
NASA Technical Reports Server (NTRS)
Jones, Kennie H.; Lodding, Kenneth N.; Olariu, Stephan; Wilson, Larry; Xin, Chunsheng
2006-01-01
Situation Management is a rapidly developing science combining new techniques for data collection with advanced methods of data fusion to facilitate the process leading to correct decisions prescribing action. Current research focuses on reducing increasing amounts of diverse data to knowledge used by decision makers and on reducing time between observations, decisions and actions. No new technology is more promising for increasing the diversity and fidelity of observations than sensor networks. However, current research on sensor networks concentrates on a centralized network architecture. We believe this trend will not realize the full potential of situation management. We propose a new architecture modeled after biological ecosystems where motes are autonomous and intelligent, yet cooperate with local neighborhoods. Providing a layered approach, they sense and act independently when possible, and cooperate with neighborhoods when necessary. The combination of their local actions results in global effects. While situation management research is currently dominated by military applications, advances envisioned for industrial and business applications have similar requirements. NASA has requirements for intelligent and autonomous systems in future missions that can benefit from advances in situation management. We describe requirements for the Integrated Vehicle Health Management program where our biology-inspired architecture provides a layered approach and decisions can be made at the proper level to improve safety, reduce costs, and improve efficiency in making diagnostic and prognostic assessments of the structural integrity, aerodynamic characteristics, and operation of aircraft.
Sensors-network and its application in the intelligent storage security
NASA Astrophysics Data System (ADS)
Zhang, Qingying; Nicolescu, Mihai; Jiang, Xia; Zhang, Ying; Yue, Weihong; Xiao, Weihong
2004-11-01
Intelligent storage systems run on several advanced technologies, such as linear layout, business intelligence and data mining. Security, the basic requirement of a storage system, has come into focus with the introduction of multimedia communication technology and sensor networks. Along with the development of science and social demands, multifarious alarm systems have been designed and improved to be intelligent, modular and network-connected. Modern science and technology make it possible to render the storage system, and more broadly the logistics system, ever more efficient and complete. Diversified on-site information is captured by different kinds of sensors; the signals are processed and communicated to the control center, which decides on further actions. For fire protection, broad-spectrum gas sensors, fume sensors, flame sensors and temperature sensors capture information in their own ways. Once a fire breaks out somewhere, the sensors respond immediately to the fumes, temperature, flame and gases, and the intelligent control system starts: it passes the alarm to the central unit and at the same time sets the movable walls to work quickly to obstruct the fire's spreading. For guarding the warehouse against theft, cut-off sensors, body sensors, photoelectric sensors, microwave sensors and closed-circuit television, as well as electronic clocks, are available to monitor the warehouse sensibly. All of these sensors work in a networked way. The intelligent control system is built with digital circuitry instead of a traditional switched circuit and can work better in many cases: its reliability is high and its cost is low.
Bialas, Andrzej
2010-01-01
The paper is focused on the security issues of sensors provided with processors and software and used for high-risk applications. Common IT related threats may cause serious consequences for sensor system users. To improve their robustness, sensor systems should be developed in a restricted way that would provide them with assurance. One assurance creation methodology is Common Criteria (ISO/IEC 15408) used for IT products and systems. The paper begins with a primer on the Common Criteria, and then a general security model of the intelligent sensor as an IT product is discussed. The paper presents how the security problem of the intelligent sensor is defined and solved. The contribution of the paper is to provide Common Criteria (CC) related security design patterns and to improve the effectiveness of the sensor development process. PMID:22315571
Further Structural Intelligence for Sensors Cluster Technology in Manufacturing
Mekid, Samir
2006-01-01
With ever more complex sensing and actuating tasks in manufacturing plants, intelligent sensor clusters in hybrid networks are a rapidly expanding area. They play a dominant role in many fields, from the macro to the micro scale. Global object control and the ability to self-organize into fault-tolerant and scalable systems are expected for high-level applications. In this paper, new structural concepts for intelligent sensors and networks with new intelligent agents are presented. Embedding new functionality to dynamically manage cooperative agents for autonomous machines is a key enabling technology much needed in manufacturing for zero-defect production.
Data fusion: principles and applications in air defense
NASA Astrophysics Data System (ADS)
Maltese, Dominique; Lucas, Andre
1998-07-01
Within a surveillance and reconnaissance system, the fusion process is an essential part of the software package, since the measurements of the different sensors are combined by this process; each sensor sends its data to a fusion center whose task is to construct the best tactical picture. In this paper, a practical data fusion algorithm applied in a military context is presented; the case studied here is a medium-range surveillance situation featuring a dual-sensor platform that combines a surveillance radar and an IRST; both sensors are collocated. The presented performances were obtained on validation scenarios via simulations performed by SAGEM with the ESSOR ('Environnement de Simulation de Senseurs Optroniques et Radar') multisensor simulation test bench.
Ligorio, Gabriele; Bergamini, Elena; Pasciuto, Ilaria; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria
2016-01-26
Information from complementary and redundant sensors are often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor's uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to obtain simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process.
Activity recognition using dynamic multiple sensor fusion in body sensor networks.
Gao, Lei; Bourke, Alan K; Nelson, John
2012-01-01
Multi-sensor fusion is a main research direction for activity recognition. However, such systems face two challenges: the energy consumed by wireless transmission and the classifier design required by a dynamic feature vector. This paper proposes a multi-sensor fusion framework consisting of a sensor selection module and a hierarchical classifier. The sensor selection module uses convex optimization to select the sensor subset in real time. The hierarchical classifier combines a Decision Tree classifier with a Naïve Bayes classifier. A dataset collected from 8 subjects, who performed 8 scenario activities, was used to evaluate the proposed system. The results show that the proposed system can substantially reduce energy consumption while maintaining recognition accuracy.
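The paper's sensor-selection module solves a convex program; as a hedged stand-in, the sketch below does greedy utility-per-cost selection under an energy budget, which captures the same energy/accuracy trade-off in a few lines. The utility and cost numbers are placeholders, not values from the paper.

```python
def select_sensors(utilities, costs, budget):
    """Greedy energy-aware sensor selection (a simple stand-in for the
    paper's convex-optimization module): pick sensors in order of
    utility per unit energy cost until the budget is spent.

    utilities[i] -- expected contribution of sensor i to accuracy
    costs[i]     -- energy cost of keeping sensor i transmitting
    budget       -- total energy available this selection round
    """
    order = sorted(range(len(utilities)),
                   key=lambda i: utilities[i] / costs[i], reverse=True)
    chosen, spent = [], 0.0
    for i in order:
        if spent + costs[i] <= budget:
            chosen.append(i)
            spent += costs[i]
    return sorted(chosen)
```

Re-running the selection each round is what makes the downstream feature vector dynamic, which is why the paper pairs this module with a hierarchical classifier.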
NASA Astrophysics Data System (ADS)
Olivares, A.; Górriz, J. M.; Ramírez, J.; Olivares, G.
2011-02-01
Inertial sensors are widely used in human body motion monitoring systems since they permit us to determine the position of the subject's limbs. Limb angle measurement is carried out through the integration of the angular velocity measured by a rate sensor and the decomposition of the components of static gravity acceleration measured by an accelerometer. Different factors derived from the sensors' nature, such as the angle random walk and dynamic bias, lead to erroneous measurements. Dynamic bias effects can be reduced through the use of adaptive filtering based on sensor fusion concepts. Most existing published works use a Kalman filtering sensor fusion approach. Our aim is to perform a comparative study among different adaptive filters. Several least mean squares (LMS), recursive least squares (RLS) and Kalman filtering variations are tested for the purpose of finding the best method leading to a more accurate and robust limb angle measurement. A new angle wander compensation sensor fusion approach based on LMS and RLS filters has been developed.
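The Kalman, LMS and RLS approaches compared in this work all build on the same idea: trust the gyroscope at high frequency and the accelerometer-derived angle at low frequency. A baseline complementary filter (not one of the paper's adaptive variants) makes that idea concrete; note how a constant gyro bias produces a bounded angle error instead of the unbounded drift pure integration would give.

```python
def complementary_filter(gyro_rates, accel_angles, dt=0.01, alpha=0.98):
    """Baseline fixed-gain complementary filter for limb-angle estimation.

    gyro_rates   -- angular rate samples (deg/s) from the rate sensor
    accel_angles -- angle samples (deg) decomposed from static gravity
    alpha        -- blend factor: high-pass on the gyro path,
                    low-pass on the accelerometer path
    """
    angle = accel_angles[0]
    out = []
    for rate, acc in zip(gyro_rates, accel_angles):
        angle = alpha * (angle + rate * dt) + (1.0 - alpha) * acc
        out.append(angle)
    return out
```

With a 1 deg/s gyro bias and a truly stationary limb, pure integration drifts linearly, while the filtered angle settles near alpha*bias*dt/(1-alpha); adaptive schemes such as LMS/RLS essentially tune that blending online instead of fixing it.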
A Cyber-ITS Framework for Massive Traffic Data Analysis Using Cyber Infrastructure
Fontaine, Michael D.
2013-01-01
Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which means to integrate heterogeneous traffic data from different kinds of sensors and apply it for ITS applications. This research, taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution to the problems, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), by nature parallel-computing hardware and software systems, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized as a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is presented later based on a traffic state estimation application that uses the fusion of massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results prove that the Cyber-ITS-based implementation can achieve a high accuracy rate of traffic state estimation and provide a significant computational speedup for the data fusion by parallel computing. PMID:23766690
A Cyber-ITS framework for massive traffic data analysis using cyber infrastructure.
Xia, Yingjie; Hu, Jia; Fontaine, Michael D
2013-01-01
Traffic data is commonly collected from widely deployed sensors in urban areas. This brings up a new research topic, data-driven intelligent transportation systems (ITSs), which means to integrate heterogeneous traffic data from different kinds of sensors and apply it for ITS applications. This research, taking into consideration the significant increase in the amount of traffic data and the complexity of data analysis, focuses mainly on the challenge of solving data-intensive and computation-intensive problems. As a solution to the problems, this paper proposes a Cyber-ITS framework to perform data analysis on Cyber Infrastructure (CI), by nature parallel-computing hardware and software systems, in the context of ITS. The techniques of the framework include data representation, domain decomposition, resource allocation, and parallel processing. All these techniques are based on data-driven and application-oriented models and are organized as a component-and-workflow-based model in order to achieve technical interoperability and data reusability. A case study of the Cyber-ITS framework is presented later based on a traffic state estimation application that uses the fusion of massive Sydney Coordinated Adaptive Traffic System (SCATS) data and GPS data. The results prove that the Cyber-ITS-based implementation can achieve a high accuracy rate of traffic state estimation and provide a significant computational speedup for the data fusion by parallel computing.
Tannous, Halim; Istrate, Dan; Benlarbi-Delai, Aziz; Sarrazin, Julien; Gamet, Didier; Ho Ba Tho, Marie Christine; Dao, Tien Tuan
2016-11-15
Exergames have been proposed as a potential tool to improve the current practice of musculoskeletal rehabilitation. Inertial or optical motion capture sensors are commonly used to track the subject's movements. However, the use of these motion capture tools suffers from the lack of accuracy in estimating joint angles, which could lead to wrong data interpretation. In this study, we proposed a real time quaternion-based fusion scheme, based on the extended Kalman filter, between inertial and visual motion capture sensors, to improve the estimation accuracy of joint angles. The fusion outcome was compared to angles measured using a goniometer. The fusion output shows a better estimation, when compared to inertial measurement units and Kinect outputs. We noted a smaller error (3.96°) compared to the one obtained using inertial sensors (5.04°). The proposed multi-sensor fusion system is therefore accurate enough to be applied, in future works, to our serious game for musculoskeletal rehabilitation.
NASA Astrophysics Data System (ADS)
Mitilineos, Stelios A.; Argyreas, Nick D.; Thomopoulos, Stelios C. A.
2009-05-01
A fusion-based localization technique for location-based services in indoor environments is introduced herein, based on ultrasound time-of-arrival measurements from multiple off-the-shelf range-estimating sensors used in a market-available localization system. In-situ field measurement results indicated that the off-the-shelf system was unable to estimate position in most cases, while the underlying sensors are of low quality and yield highly inaccurate range and position estimates. An extensive analysis is performed and a model of the sensor performance characteristics is established. A low-complexity but accurate sensor fusion and localization technique is then developed, which consists of evaluating multiple sensor measurements and selecting the one considered most accurate based on the underlying sensor model. Optimality, in the sense of a genie selecting the optimum sensor, is subsequently evaluated and compared to the proposed technique. The experimental results indicate that the proposed fusion method exhibits near-optimal performance and, albeit theoretically suboptimal, largely overcomes most flaws of the underlying single-sensor system, resulting in a localization system of increased accuracy, robustness and availability.
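The "evaluate all measurements, keep the one the sensor model says is most accurate" strategy can be sketched as follows; the dict-based error model and the sensor names are placeholders for the field-characterized performance model the paper establishes.

```python
def fuse_by_selection(measurements, predicted_error):
    """Selection-based fusion: from simultaneous range measurements,
    keep the one whose modeled error is smallest.

    measurements    -- list of {"sensor": name, "range": meters}
    predicted_error -- per-sensor expected range error (meters) for the
                       current operating conditions (hypothetical model)
    """
    best = min(measurements, key=lambda m: predicted_error[m["sensor"]])
    return best["range"]
```

A genie selector would instead pick the measurement closest to ground truth; the gap between the two is exactly the near-optimality margin the paper evaluates.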
Li, Qingguo
2017-01-01
With the advancements in micro-electromechanical systems (MEMS) technologies, magnetic and inertial sensors are becoming more accurate, lightweight, smaller in size and low-cost, which in turn boosts their application in human movement analysis. However, challenges still exist in sensor orientation estimation, where magnetic disturbance represents one of the obstacles limiting practical application. The objective of this paper is to systematically analyze exactly how magnetic disturbances affect attitude and heading estimation for a magnetic and inertial sensor. First, we reviewed the four major components of dealing with magnetic disturbance, namely decoupling attitude estimation from the magnetic reading, gyro bias estimation, adaptive strategies for compensating magnetic disturbance, and sensor fusion algorithms, and analyzed the features of existing methods for each component. Second, to understand each component in magnetic disturbance rejection, four representative sensor fusion methods were implemented: gradient descent algorithms, the improved explicit complementary filter, the dual-linear Kalman filter and the extended Kalman filter. Finally, a new standardized testing procedure was developed to objectively assess the performance of each method against magnetic disturbance. Based upon the testing results, the strengths and weaknesses of the existing sensor fusion methods are easily examined, and suggestions are presented for selecting a proper sensor fusion algorithm or developing new sensor fusion methods. PMID:29283432
NASA Astrophysics Data System (ADS)
Zan, Tao; Wang, Min; Hu, Jianzhong
2010-12-01
Machining-status monitoring by multiple sensors can acquire and analyze machining-process information to implement abnormality diagnosis and fault warning. Statistical quality control is normally used to distinguish abnormal fluctuations from normal ones through statistical methods. This paper compares the advantages and disadvantages of the two methods and introduces the necessity and feasibility of their integration and fusion. An approach is then proposed that integrates multi-sensor status monitoring and statistical process control based on artificial intelligence, internet and database techniques. Based on virtual instrument techniques, the authors developed the machining quality assurance system MoniSysOnline, which has been used to monitor the grinding process. By analyzing the quality data and AE-signal information of the wheel-dressing process, the cause of the machining-quality fluctuation was identified. The experimental results indicate that the approach is suitable for monitoring and analyzing the machining process.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S; Sen, Satyabrata; Berry, M. L.
The Domestic Nuclear Detection Office's (DNDO) Intelligence Radiation Sensors Systems (IRSS) program supported the development of networks of commercial-off-the-shelf (COTS) radiation counters for detecting, localizing, and identifying low-level radiation sources. Under this program, a series of indoor and outdoor tests were conducted with multiple source strengths and types, different background profiles, and various types of source and detector movements. Following the tests, network algorithms were replayed in various re-constructed scenarios using sub-networks. These measurements and algorithm traces together provide a rich collection of highly valuable datasets for testing current and next-generation radiation network algorithms, including those (to be) developed by broader R&D communities such as distributed detection, information fusion, and sensor networks. From this multi-terabyte IRSS database, we distilled and packaged the first batch of canonical datasets for public release. They include measurements from ten indoor and two outdoor tests, which represent increasingly challenging baseline scenarios for robustly testing radiation network algorithms.
The Fusion Model of Intelligent Transportation Systems Based on the Urban Traffic Ontology
NASA Astrophysics Data System (ADS)
Yang, Wang-Dong; Wang, Tao
To address these issues, urban transport information is represented uniformly using an urban traffic ontology, which defines the statutes and algebraic operations of semantic fusion at the ontology level in order to fuse urban traffic information with semantic completeness and consistency. This paper takes advantage of the semantic completeness of the ontology to build an urban traffic ontology model that resolves problems such as ontology merging and equivalence verification in the semantic fusion of integrated traffic information. Adding semantic fusion to urban transport information integration reduces the amount of urban traffic data to be integrated and enhances the efficiency and integrity of traffic information queries. Through its practical application in the intelligent traffic information integration platform of Changde city, the paper demonstrates that ontology-based semantic fusion increases the effectiveness and efficiency of urban traffic information integration, reduces storage requirements, and improves query efficiency and information completeness.
Sensor Fusion Techniques for Phased-Array Eddy Current and Phased-Array Ultrasound Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arrowood, Lloyd F.
Sensor (or data) fusion is the process of integrating multiple data sources to produce more consistent, accurate, and comprehensive information than is provided by any single data source. Sensor fusion may also be used to combine multiple signals from a single modality to improve the performance of a particular inspection technique. Industrial nondestructive testing may utilize multiple sensors to acquire inspection data, depending upon the object under inspection and the anticipated types of defects to be identified. Sensor fusion can be performed at various levels of signal abstraction, each having its strengths and weaknesses. A multimodal data fusion strategy first proposed by Heideklang and Shokouhi, which combines spatially scattered detection locations to improve detection of surface-breaking and near-surface cracks in ferromagnetic metals, is demonstrated on a surface inspection example and then extended to volumetric inspections. Using data acquired with an Olympus Omniscan MX2 from both phased-array eddy current and phased-array ultrasound probes on test phantoms, single-level and multilevel fusion techniques are employed to integrate signals from the two modalities. Preliminary results demonstrate how confidence in defect identification and interpretation benefits from sensor fusion techniques. Lastly, techniques for integrating data into radiographic and volumetric imagery from computed tomography are described and results are presented.
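A decision-level flavor of the multimodal fusion described above can be sketched as a pixel-wise weighted combination of per-modality detection confidence maps followed by thresholding. This is a generic illustration, not the Heideklang–Shokouhi strategy itself; the maps, weights, and threshold are invented for the example.

```python
def fuse_confidences(ec_map, ut_map, w_ec=0.5, w_ut=0.5):
    """Pixel-wise weighted fusion of two modality confidence maps (values 0..1)."""
    return [[w_ec * e + w_ut * u for e, u in zip(er, ur)]
            for er, ur in zip(ec_map, ut_map)]

def detect(fused, threshold=0.6):
    """Binary defect mask from the fused confidence map."""
    return [[1 if c >= threshold else 0 for c in row] for row in fused]

ec = [[0.2, 0.9], [0.1, 0.7]]   # hypothetical phased-array eddy current confidences
ut = [[0.3, 0.8], [0.2, 0.9]]   # hypothetical phased-array ultrasound confidences
mask = detect(fuse_confidences(ec, ut))
```

A defect is only declared where both modalities lend it support, which is one way fusion raises confidence in defect identification relative to either modality alone.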
Ali, Abdulbaset; Hu, Bing; Ramahi, Omar M.
2015-05-15
This work presents a real-life experiment implementing an artificial intelligence model for detecting sub-millimeter cracks in metallic surfaces, on a dataset obtained from a waveguide sensor loaded with metamaterial elements. Crack detection using microwave sensors is typically based on human observation of change in the sensor's signal (pattern) depicted on a high-resolution screen of the test equipment. However, as demonstrated in this work, implementing artificial intelligence to classify cracked from non-cracked surfaces has appreciable impact in terms of sensing sensitivity, cost, and automation. Furthermore, applying artificial intelligence for post-processing data collected from microwave sensors is a cornerstone for handheld test equipment that can outperform rack equipment with large screens and sophisticated plotting features. The proposed method was tested on a metallic plate with different cracks, and the experimental results showed good crack classification accuracy rates.
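Replacing human pattern inspection with a classifier can be illustrated with a deliberately tiny example: a nearest-centroid classifier over two hypothetical sensor features. The feature names, values, and classifier choice are assumptions for illustration; the paper's actual model and features may differ.

```python
import math

def centroid(rows):
    """Mean feature vector of a list of samples."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(x, c_crack, c_intact):
    """Label a feature vector by its nearer class centroid."""
    if math.dist(x, c_crack) < math.dist(x, c_intact):
        return "cracked"
    return "intact"

# Hypothetical features per scan, e.g. (resonance shift, magnitude dip), normalized.
cracked = [[0.80, 0.70], [0.90, 0.60], [0.85, 0.75]]
intact  = [[0.10, 0.20], [0.15, 0.10], [0.20, 0.15]]

c_crack, c_intact = centroid(cracked), centroid(intact)
label = classify([0.82, 0.68], c_crack, c_intact)
```

The point is architectural: once the decision is a function of extracted features rather than a plotted trace, it can run on a handheld device with no display at all.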
Double Cluster Heads Model for Secure and Accurate Data Fusion in Wireless Sensor Networks
Fu, Jun-Song; Liu, Yun
2015-01-01
Secure and accurate data fusion is an important issue in wireless sensor networks (WSNs) and has been extensively researched in the literature. In this paper, by combining clustering techniques, reputation and trust systems, and data fusion algorithms, we propose a novel cluster-based data fusion model called the Double Cluster Heads Model (DCHM) for secure and accurate data fusion in WSNs. Unlike traditional clustering models in WSNs, two cluster heads are selected for each cluster after clustering, based on the reputation and trust system, and they perform data fusion independently of each other. The results are then sent to the base station, where a dissimilarity coefficient is computed. If the dissimilarity coefficient of the two data fusion results exceeds a threshold preset by the users, the cluster heads are added to a blacklist and must be reelected by the sensor nodes in the cluster. Meanwhile, feedback is sent from the base station to the reputation and trust system, which helps identify and remove compromised sensor nodes in time. Through a series of extensive simulations, we found that the DCHM performs very well in both data fusion security and accuracy. PMID:25608211
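The base-station check at the heart of DCHM can be sketched as follows. The dissimilarity coefficient here is a simple normalized mean absolute difference; the paper's exact coefficient and threshold handling may differ, so treat this as a hedged sketch of the control flow only.

```python
def dissimilarity(a, b):
    """Mean absolute difference between two fusion result vectors."""
    return sum(abs(x - y) for x, y in zip(a, b)) / len(a)

def base_station_check(fusion_a, fusion_b, threshold):
    """Compare the two cluster heads' independent fusion results.

    If they disagree beyond the preset threshold, blacklist both heads and
    request re-election; otherwise accept and merge the reports.
    """
    if dissimilarity(fusion_a, fusion_b) > threshold:
        return {"accepted": False, "action": "blacklist-and-reelect"}
    merged = [(x + y) / 2 for x, y in zip(fusion_a, fusion_b)]
    return {"accepted": True, "fused": merged}

# Hypothetical fused temperature readings from the two heads of one cluster.
ok  = base_station_check([20.1, 20.3], [20.2, 20.4], threshold=0.5)
bad = base_station_check([20.1, 20.3], [35.0, 34.8], threshold=0.5)
```

The redundancy is the security mechanism: a single compromised head cannot silently bias the fused value without producing a detectable disagreement.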
An Improved Multi-Sensor Fusion Navigation Algorithm Based on the Factor Graph
Zeng, Qinghua; Chen, Weina; Liu, Jianye; Wang, Huizhe
2017-01-01
An integrated navigation system coupled with additional sensors can be used in Micro Unmanned Aerial Vehicle (MUAV) applications because the multi-sensor information is redundant and complementary, which can markedly improve system accuracy. How to deal efficiently with the information gathered from different sensors is an important problem. The fact that different sensors provide measurements asynchronously may complicate their processing. In addition, the output signals of some sensors have a non-linear character. In order to incorporate these measurements and calculate a navigation solution in real time, a multi-sensor fusion algorithm based on the factor graph is proposed. The globally optimal solution is factorized according to the chain structure of the factor graph, which allows for a more general form of the conditional probability density. This converts the fusion problem into connecting the factors defined by these measurements to the graph, without regard to the relationship between the sensor update frequency and the fusion period. An experimental MUAV system has been built and experiments have been performed to prove the effectiveness of the proposed method. PMID:28335570
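The key idea — attach one factor per measurement whenever it arrives, regardless of sensor rate, then solve the joint objective — can be sketched in one dimension. Below, 1 Hz odometry factors connect consecutive states while asynchronous absolute fixes attach only where available, and the whole graph is solved as a linear least-squares problem via its normal equations. This is a toy sketch of the factor-graph formulation, not the paper's MUAV algorithm; all numbers are invented.

```python
def solve(A, b):
    """Gauss-Jordan elimination with partial pivoting for a small dense system."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(n):
            if r != c and M[r][c]:
                f = M[r][c] / M[c][c]
                M[r] = [a - f * ac for a, ac in zip(M[r], M[c])]
    return [M[i][n] / M[i][i] for i in range(n)]

n = 4                                   # four 1 Hz states x0..x3
odometry = [1.0, 1.0, 1.0]              # factors: x_{k+1} - x_k = u_k
fixes = {0: 0.1, 3: 3.0}                # asynchronous absolute fixes: x_k = z_k

rows, rhs = [], []
for k, u in enumerate(odometry):
    r = [0.0] * n
    r[k], r[k + 1] = -1.0, 1.0
    rows.append(r), rhs.append(u)
for k, z in fixes.items():
    r = [0.0] * n
    r[k] = 1.0
    rows.append(r), rhs.append(z)

# Normal equations A^T A x = A^T b of the joint factor-graph objective.
AtA = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
Atb = [sum(r[i] * v for r, v in zip(rows, rhs)) for i in range(n)]
x = solve(AtA, Atb)
```

Adding a slower or faster sensor just appends rows; nothing in the solve step cares about the relationship between update frequencies, which is exactly the property the abstract highlights.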
Practical considerations in Bayesian fusion of point sensors
NASA Astrophysics Data System (ADS)
Johnson, Kevin; Minor, Christian
2012-06-01
Sensor data fusion has long been a topic of considerable research, but a rigorous, quantitative understanding of the benefits of fusing specific types of sensor data remains elusive. Often, sensor fusion is performed on an ad hoc basis with the assumption that overall detection capabilities will improve, only to discover later, after expensive and time-consuming laboratory and/or field testing, that little advantage was gained. The work presented here discusses these issues, with theoretical and practical considerations, in the context of fusing chemical sensors with binary outputs. Results are given for the potential performance gains one could expect with such systems, as well as the practical difficulties involved in implementing an optimal Bayesian fusion strategy in realistic scenarios. Finally, a discussion of the biases that inaccurate statistical estimates introduce into the results, and their consequences, is presented.
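For binary-output sensors, the optimal Bayesian fusion rule under an independence assumption is a naive-Bayes posterior built from each sensor's detection and false-alarm probabilities. The sketch below shows both the rule and, implicitly, the fragility the abstract warns about: the result is only as good as the Pd/Pfa estimates fed in (all numbers here are invented).

```python
def fuse_binary(outputs, pd, pfa, prior=0.5):
    """Naive-Bayes fusion of binary detector outputs.

    outputs[i] is 1 if sensor i alarmed; pd[i] / pfa[i] are its detection and
    false-alarm probabilities, assumed independent across sensors.
    Returns the posterior probability that the target/chemical is present.
    """
    l_target, l_clutter = prior, 1.0 - prior
    for o, d, f in zip(outputs, pd, pfa):
        l_target *= d if o else (1.0 - d)
        l_clutter *= f if o else (1.0 - f)
    return l_target / (l_target + l_clutter)

pd  = [0.9, 0.8, 0.7]    # assumed per-sensor detection probabilities
pfa = [0.1, 0.2, 0.3]    # assumed per-sensor false-alarm probabilities

p_all   = fuse_binary([1, 1, 1], pd, pfa)   # unanimous alarm
p_mixed = fuse_binary([1, 0, 0], pd, pfa)   # only the best sensor alarms
```

If the true Pd/Pfa differ from the values assumed here, the posterior is biased exactly in the way the paper's discussion of inaccurate statistical estimates describes.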
A Brief Overview of NASA Glenn Research Center Sensor and Electronics Activities
NASA Technical Reports Server (NTRS)
Hunter, Gary W.
2012-01-01
Aerospace applications require a range of sensing technologies. A variety of sensor and sensor-system technologies are being developed using microfabrication and micromachining to form smart sensor systems and intelligent microsystems, driving system intelligence to the local (sensor) level through distributed smart sensor systems. Sensor and sensor system development examples include: (1) thin-film physical sensors, (2) high-temperature electronics and wireless devices, and (3) "lick and stick" technology. NASA GRC is a world leader in aerospace sensor technology with a broad range of development and application experience, and its core microsystems technology is applicable to a range of application environments.
A Comparison of Vibration and Oil Debris Gear Damage Detection Methods Applied to Pitting Damage
NASA Technical Reports Server (NTRS)
Dempsey, Paula J.
2000-01-01
Helicopter Health and Usage Monitoring Systems (HUMS) must provide reliable, real-time monitoring of helicopter operating parameters to prevent damage to flight-critical components. Helicopter transmission diagnostics are an important part of a helicopter HUMS. To improve the reliability of transmission diagnostics, many researchers propose combining two technologies, vibration and oil monitoring, using data fusion and intelligent systems. Benefits of combining multiple sensors to make decisions include improved detection capabilities and an increased probability that an event is detected. However, if the sensors are inaccurate, or the features extracted from the sensors are poor predictors of transmission health, integrating these sensors will decrease the accuracy of damage prediction. For this reason, one must verify the individual integrity of vibration and oil analysis methods prior to integrating the two technologies. This research compares the capability of two vibration algorithms, FM4 and NA4, and a commercially available on-line oil debris monitor to detect pitting damage on spur gears in the NASA Glenn Research Center Spur Gear Fatigue Test Rig. Results from this research indicate that the rate of change of debris mass measured by the oil debris monitor is comparable to the vibration algorithms in detecting gear pitting damage.
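The FM4 metric mentioned above is commonly defined as the normalized fourth moment (kurtosis) of the difference signal, i.e. the vibration signal after the regular gear-mesh components are removed; impulsive pitting damage fattens the tails and raises FM4. The sketch below uses a synthetic sine as a stand-in for a healthy residual (kurtosis exactly 1.5) rather than real gearbox data, so the numbers are illustrative only.

```python
import math

def fm4(difference_signal):
    """FM4: N * m4 / m2^2 of the difference (residual) signal."""
    n = len(difference_signal)
    mean = sum(difference_signal) / n
    m2 = sum((x - mean) ** 2 for x in difference_signal)
    m4 = sum((x - mean) ** 4 for x in difference_signal)
    return n * m4 / (m2 ** 2)

# Synthetic residuals: a smooth periodic baseline, then the same signal with
# one large impulsive spike standing in for a tooth-pitting impact.
healthy = [math.sin(2 * math.pi * k / 32) for k in range(256)]
damaged = healthy[:]
damaged[100] += 5.0
```

In practice a HUMS would trend FM4 over time and fuse its trajectory with the oil-debris mass rate, which is precisely the integration whose prerequisites this paper examines.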
Network-Capable Application Process and Wireless Intelligent Sensors for ISHM
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Morris, Jon; Turowski, Mark; Wang, Ray
2011-01-01
Intelligent sensor technology and systems are increasingly becoming attractive means to serve as frameworks for intelligent rocket test facilities with embedded intelligent sensor elements, distributed data acquisition elements, and onboard data acquisition elements. Networked intelligent processors enable users and systems integrators to automatically configure their measurement automation systems for analog sensors. NASA and leading sensor vendors are working together to apply the IEEE 1451 standard for adding plug-and-play capabilities for wireless analog transducers through the use of a Transducer Electronic Data Sheet (TEDS), in order to simplify sensor setup, use, and maintenance, to automatically obtain calibration data, and to eliminate manual data entry and error. A TEDS contains the critical information needed by an instrument or measurement system to identify, characterize, interface with, and properly use the signal from an analog sensor. A TEDS is deployed for a sensor in one of two ways. First, the TEDS can reside in embedded, nonvolatile memory (typically flash memory) within the intelligent processor. Second, a virtual TEDS can exist as a separate file, downloadable from the Internet. This concept of virtual TEDS extends the benefits of the standardized TEDS to legacy sensors and applications where embedded memory is not available. An HTML-based user interface provides a visual tool to interface with the distributed sensors that a TEDS is associated with, to automate the sensor management process. Implementing and deploying the IEEE 1451.1-based Network-Capable Application Process (NCAP) provides support for intelligent processes in Integrated Systems Health Management (ISHM): monitoring, detection of anomalies, diagnosis of the causes of anomalies, prediction of future anomalies, mitigation to maintain operability, and integrated awareness of system health by the operator. It can also support local data collection and storage.
This invention enables wide-area sensing and employs numerous globally distributed sensing devices that observe the physical world through the existing sensor network. This innovation enables distributed storage, distributed processing, distributed intelligence, and the availability of DiaK (Data, Information, and Knowledge) to any element as needed. It also enables the simultaneous execution of multiple processes, and represents models that contribute to the determination of the condition and health of each element in the system. The NCAP (intelligent process) can configure data-collection and filtering processes in reaction to sensed data, allowing it to decide when and how to adapt collection and processing with regard to sophisticated analysis of data derived from multiple sensors. The user will be able to view the sensing device network as a single unit that supports a high-level query language. Each query would be able to operate over data collected from across the global sensor network just as a search query encompasses millions of Web pages. The sensor web can preserve ubiquitous information access between the querier and the queried data. Pervasive monitoring of the physical world raises significant data and privacy concerns. This innovation enables different authorities to control portions of the sensing infrastructure, and sensor service authors may wish to compose services across authority boundaries.
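The TEDS idea described above — self-describing sensors whose identity, range, and calibration travel with them — can be illustrated with a minimal virtual-TEDS stand-in. The field names and values below are simplified assumptions for illustration, not the binary template layout defined by the IEEE 1451.4 standard, and the transducer model is hypothetical.

```python
# Simplified stand-in for a virtual TEDS file (not the IEEE 1451.4 binary layout).
virtual_teds = {
    "manufacturer_id": 0x1234,
    "model": "PX-409",         # hypothetical pressure transducer
    "serial": 58201,
    "units": "kPa",
    "min_value": 0.0,
    "max_value": 700.0,
    "cal_slope": 175.0,        # kPa per volt
    "cal_offset": -3.5,        # kPa at 0 V
}

def to_engineering_units(raw_volts, teds):
    """Convert a raw analog reading using the sensor's own TEDS calibration."""
    value = teds["cal_slope"] * raw_volts + teds["cal_offset"]
    lo, hi = teds["min_value"], teds["max_value"]
    if not lo <= value <= hi:
        raise ValueError(f"{value} {teds['units']} outside TEDS-declared range")
    return value

reading = to_engineering_units(2.0, virtual_teds)
```

Because the calibration lives with the sensor description rather than in operator-entered configuration, swapping a transducer requires no manual data entry — the plug-and-play property the NCAP builds on.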
Towards an autonomous sensor architecture for persistent area protection
NASA Astrophysics Data System (ADS)
Thomas, Paul A.; Marshall, Gillian F.; Stubbins, Daniel J.; Faulkner, David A.
2016-10-01
The majority of sensor installations for area protection (e.g. critical national infrastructure, military forward operating bases, etc.) make use of banks of screens each containing one or more sensor feeds, such that the burden of combining data from the various sources, understanding the situation, and controlling the sensors all lies with the human operator. Any automation in the system is generally heavily bespoke for the particular installation, leading to an inflexible system which is difficult to change or upgrade. We have developed a modular system architecture consisting of intelligent autonomous sensor modules, a high level decision making module, a middleware integration layer and an end-user GUI. The modules are all effectively "plug and play", and we have demonstrated that it is relatively simple to incorporate legacy sensors into the architecture. We have extended our previously-reported SAPIENT demonstration system to operate with a larger number and variety of sensor modules, over an extended area, detecting and classifying a wider variety of "threat activities", both vehicular and pedestrian. We report the results of a demonstration of the SAPIENT system containing multiple autonomous sensor modules with a range of modalities including laser scanners, radar, TI, EO, acoustic and seismic sensors. They operate from a combination of mains, generator and battery power, and communicate with the central "hub" over Ethernet, point-to-point wireless links and Wi-Fi. The system has been configured to protect an extended area in a complex semi-urban environment. We discuss the operation of the SAPIENT system in a realistic demonstration environment (which included significant activity not under trial control), showing sensor cueing, multi-modal sensor fusion, threat prioritisation and target hand-off.
A robust vision-based sensor fusion approach for real-time pose estimation.
Assa, Akbar; Janabi-Sharifi, Farrokh
2014-02-01
Object pose estimation is of great importance to many applications, such as augmented reality, localization and mapping, motion capture, and visual servoing. Although many approaches based on a monocular camera have been proposed, only a few works have concentrated on applying multicamera sensor fusion techniques to pose estimation. Higher accuracy and enhanced robustness toward sensor defects or failures are some of the advantages of these schemes. This paper presents a new Kalman-based sensor fusion approach for pose estimation that offers higher accuracy and precision, and is robust to camera motion and image occlusion, compared to its predecessors. Extensive experiments are conducted to validate the superiority of this fusion method over currently employed vision-based pose estimation algorithms.
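The core of any Kalman-based multicamera fusion is the measurement update, applied once per camera with that camera's noise variance. The scalar sketch below fuses two cameras' observations of a single pose component; the real filter in the paper operates on a full pose state with camera projection models, so this is a deliberately reduced illustration with invented numbers.

```python
def kf_update(x, p, z, r):
    """Scalar Kalman measurement update: state x, variance p, measurement z, noise r."""
    k = p / (p + r)                      # Kalman gain
    return x + k * (z - x), (1 - k) * p

x, p = 0.0, 1e6                          # vague prior on one pose component
cam_a = (1.02, 0.04)                     # (measurement, noise variance)
cam_b = (0.95, 0.09)                     # second, noisier camera

for z, r in (cam_a, cam_b):
    x, p = kf_update(x, p, z, r)
```

The fused estimate lands between the two measurements, weighted toward the more precise camera, and the posterior variance drops below either camera's alone — the accuracy and robustness gain the abstract attributes to multicamera fusion (if one camera is occluded, its update is simply skipped).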
An Expert System For Multispectral Threat Assessment And Response
NASA Astrophysics Data System (ADS)
Steinberg, Alan N.
1987-05-01
A concept has been defined for an automatic system to manage the self-defense of a combat aircraft. Distinctive new features of this concept include: a. the flexible prioritization of tasks and coordinated use of sensor, countermeasures, flight systems and weapons assets by means of an automated planning function; b. the integration of state-of-the-art data fusion algorithms with event prediction processing; c. the use of advanced Artificial Intelligence tools to emulate the decision processes of tactical EW experts. Threat Assessment functions (a) estimate threat identity, lethality and intent on the basis of multi-spectral sensor data, and (b) predict the time to critical events in threat engagements (e.g., target acquisition, tracking, weapon launch, impact). Response Management functions (a) select candidate responses to reported threat situations; (b) estimate the effects of candidate actions on survival; and (c) coordinate the assignment of sensors, weapons and countermeasures with the flight plan. The system employs Finite State Models to represent current engagements and to predict subsequent events. Each state in a model is associated with a set of observable features, allowing interpretation of sensor data and adaptive use of sensor assets. Defined conditions on state transitions allow prediction of times to critical future states and are used in planning self-defensive responses, which are designed either to impede a particular state transition or to force a transition to a lower threat state.
2016-12-21
Planning to Counter Threat Networks (table-of-contents fragments: joint intelligence preparation of the operational environment and threat networks; Army Expeditionary Forensic Facility in Afghanistan; exploitation support to intelligence fusion and decision making). The groundwork for successful countering-threat-networks activities starts with information and intelligence to develop an understanding …
2011-02-15
Fragments: the already robust lineup of 57 National Guard Combat Support Teams (CSTs) and 17 CBRNE Enhanced Response Force Packages (CERFPs); analysis of disparate data sources, identification of intelligence gaps, and proactive collection of intelligence against those gaps …
2012-12-01
Fragments: flows, diversity, emergence, networks, fusion, strategic planning, information sharing, ecosystem, hierarchy, NJ Regional Operations Intelligence …; production of disaster intelligence; intelligence for field personnel; focused collection efforts to support FEMA and NJ OEM operations.
Enhanced image capture through fusion
NASA Technical Reports Server (NTRS)
Burt, Peter J.; Hanna, Keith; Kolczynski, Raymond J.
1993-01-01
Image fusion may be used to combine images from different sensors, such as IR and visible cameras, to obtain a single composite with extended information content. Fusion may also be used to combine multiple images from a given sensor to form a composite image in which information of interest is enhanced. We present a general method for performing image fusion and show that this method is effective for diverse fusion applications. We suggest that fusion may provide a powerful tool for enhanced image capture with broad utility in image processing and computer vision.
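The paper's method is pyramid-based; a single-scale analogue of its "select the more salient source" rule can still convey the idea: at each pixel, keep the sample that deviates more from its own image's mean, so that the IR hot spot and the visible-band bright feature both survive in one composite. This simplification (and the tiny images) is an assumption for illustration, not the authors' pyramid algorithm.

```python
def fuse_images(img_a, img_b):
    """Pixel-wise selection fusion: keep the sample with larger local salience,
    approximated here by deviation from each image's global mean."""
    flat_a = [v for row in img_a for v in row]
    flat_b = [v for row in img_b for v in row]
    mean_a = sum(flat_a) / len(flat_a)
    mean_b = sum(flat_b) / len(flat_b)
    return [[a if abs(a - mean_a) >= abs(b - mean_b) else b
             for a, b in zip(ra, rb)]
            for ra, rb in zip(img_a, img_b)]

visible  = [[10, 10], [10, 200]]    # bright feature bottom-right
infrared = [[90, 10], [10,  10]]    # hot spot top-left
fused = fuse_images(visible, infrared)
```

In the actual method the selection happens per pyramid level, which avoids the seams and contrast artifacts a single-scale rule like this one can produce.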
Science of Land Target Spectral Signatures
2013-04-03
Dr. J. Michael Cathcart, Georgia Tech Research Corporation. Keywords: signal processing, detection algorithms, sensor fusion, spectral signature modeling. Cited: F. Meriaudeau, T. Downey, A. Wig, A. Passian, M. Buncick, T. L. Ferrell, "Fiber optic sensor based on gold island plasmon resonance," Sensors and … The phenomenology research continued to focus on spectroscopic soil measurements, optical property analyses, field …
Ligorio, Gabriele; Bergamini, Elena; Pasciuto, Ilaria; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria
2016-01-01
Information from complementary and redundant sensors are often combined within sensor fusion algorithms to obtain a single accurate observation of the system at hand. However, measurements from each sensor are characterized by uncertainties. When multiple data are fused, it is often unclear how all these uncertainties interact and influence the overall performance of the sensor fusion algorithm. To address this issue, a benchmarking procedure is presented, where simulated and real data are combined in different scenarios in order to quantify how each sensor’s uncertainties influence the accuracy of the final result. The proposed procedure was applied to the estimation of the pelvis orientation using a waist-worn magnetic-inertial measurement unit. Ground-truth data were obtained from a stereophotogrammetric system and used to obtain simulated data. Two Kalman-based sensor fusion algorithms were submitted to the proposed benchmarking procedure. For the considered application, gyroscope uncertainties proved to be the main error source in orientation estimation accuracy for both tested algorithms. Moreover, although different performances were obtained using simulated data, these differences became negligible when real data were considered. The outcome of this evaluation may be useful both to improve the design of new sensor fusion methods and to drive the algorithm tuning process. PMID:26821027
NASA Astrophysics Data System (ADS)
Gruber, Thomas; Grim, Larry; Fauth, Ryan; Tercha, Brian; Powell, Chris; Steinhardt, Kristin
2011-05-01
Large networks of disparate chemical/biological (C/B) sensors, MET sensors, and intelligence, surveillance, and reconnaissance (ISR) sensors reporting to various command/display locations can lead to conflicting threat information, questions of alarm confidence, and confused situational awareness. Sensor netting algorithms (SNA) are being developed to resolve these conflicts and to report high-confidence consensus threat map data products on a common operating picture (COP) display. A data fusion algorithm design was completed in a Phase I SBIR effort, and development continues in the Phase II SBIR effort. The initial implementation and testing of the algorithm has produced some performance results. The algorithm accepts point and/or standoff sensor data, and event detection data (e.g., the location of an explosion) from various ISR sensors (e.g., acoustic, infrared cameras, etc.). These input data are preprocessed to assign an estimated uncertainty to each incoming piece of data. The data are then sent to a weighted tomography process to obtain a consensus threat map, including estimated threat concentration level uncertainty. The threat map is then tested for consistency, and the overall confidence for the map result is estimated. The map and confidence results are displayed on a COP. The benefits of a modular implementation of the algorithm and comparisons of fused and unfused data results will be presented. The metrics for judging the sensor-netting algorithm performance are warning time, threat map accuracy (as compared to ground truth), false alarm rate, and false alarm rate vs. reported threat confidence level.
Estimation and Fusion for Tracking Over Long-Haul Links Using Artificial Neural Networks
Liu, Qiang; Brigham, Katharine; Rao, Nageswara S. V.
2017-02-01
In a long-haul sensor network, sensors are remotely deployed over a large geographical area to perform certain tasks, such as tracking and/or monitoring of one or more dynamic targets. A remote fusion center fuses the information provided by these sensors so that a final estimate of certain target characteristics – such as the position – is expected to possess much improved quality. In this paper, we pursue learning-based approaches for estimation and fusion of target states in long-haul sensor networks. In particular, we consider learning based on various implementations of artificial neural networks (ANNs). Finally, the joint effect of (i) imperfect communication conditions, namely link-level loss and delay, and (ii) computation constraints, in the form of low-quality sensor estimates, on ANN-based estimation and fusion is investigated by means of analytical and simulation studies.
Data Fusion for a Vision-Radiological System for Source Tracking and Discovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enqvist, Andreas; Koppal, Sanjeev
2015-07-01
A multidisciplinary approach to tracking the movement of radioactive sources by fusing data from multiple radiological and visual sensors is under development. The goal is to improve the ability to detect, locate, track, and identify nuclear/radiological threats. The key concept is that widely available visual and depth sensors can impact radiological detection, since the intensity fall-off in the count rate can be correlated to movement in three dimensions. To enable this, we pose an important question: what is the right combination of sensing modalities and vision algorithms that can best complement a radiological sensor for the purpose of detecting and tracking radioactive material? Similarly, which radiation detection methods and unfolding algorithms are best suited for data fusion with tracking data? Data fusion of multi-sensor data for radiation detection has seen some interesting developments lately. Significant examples include intelligent radiation sensor systems (IRSS), which are based on large numbers of distributed, similar or identical radiation sensors coupled with position data, forming networks capable of detecting and locating radiation sources. Other developments are gamma-ray imaging systems based on Compton scatter in segmented detector arrays. Similar developments using coded apertures or scatter cameras for neutrons have recently occurred. The main limitation of such systems is not so much their capability as their complexity and cost, which are prohibitive for large-scale deployment. Presented here is a fusion system based on simple, low-cost computer vision and radiological sensors for tracking multiple objects and identifying potential radiological materials being transported or shipped. The main focus of this work is the development of two separate calibration algorithms for characterizing the fused sensor system.
The deviation from a simple inverse-square fall-off of radiation intensity is explored and accounted for. In particular, the computer vision system enables a map of the distance dependence of the sources being tracked. Infrared, laser, or stereoscopic vision sensors are all options for the computer-vision implementation, depending on interior vs. exterior deployment, desired resolution, and other factors. Similarly, the radiation sensors focus on gamma-ray or neutron detection, due to the long travel length of these particles and their ability to penetrate even moderate shielding. There is a significant difference between vision sensors and radiation sensors in the way the 'source' or signal is generated. A vision sensor needs an external light source to illuminate the object, and then detects the re-emitted illumination (or lack thereof). For a radiation detector, however, the radioactive material is the source itself. The only exception is the field of active interrogation, where radiation is beamed into a material to induce new or additional radiation emission beyond what the material would emit spontaneously. Because the nuclear material is the source itself, all other objects in the environment are 'illuminated', or irradiated, by the source. Most radiation will readily penetrate regular material, scatter in new directions, or be absorbed. Thus, if a radiation source is located near a larger object, that object will in turn scatter some radiation that was initially emitted in directions other than that of the radiation detector, and this can add to the observed count rate. The effect of this scatter is a deviation from the traditional distance dependence of the radiation signal, and it is a key challenge that requires a combined system calibration solution and algorithms.
Thus, both an algebraic approach and a statistical approach have been developed and independently evaluated to investigate the sensitivity to this deviation from the simplified radiation fall-off as a function of distance. The resulting calibrated system algorithms are used and demonstrated in various laboratory scenarios, and later in realistic tracking scenarios. The selection and testing of radiological and computer-vision sensors for additional specific scenarios will be the subject of ongoing and future work. (authors)
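The algebraic calibration idea, fitting the count rate against vision-derived distances, can be sketched as follows. This is a minimal illustration under an assumed count-rate model C(d) = S/d² + b with a constant background term b; it is not the authors' actual algorithm, and the scatter-induced deviation discussed above would enter as additional model terms.

```python
import numpy as np

def calibrate_inverse_square(distances, counts):
    """Least-squares fit of counts = S / d**2 + b, where the distances d
    come from the vision system's tracking of the source."""
    d = np.asarray(distances, dtype=float)
    A = np.column_stack([1.0 / d**2, np.ones(len(d))])
    (S, b), *_ = np.linalg.lstsq(A, np.asarray(counts, dtype=float), rcond=None)
    return S, b

# Synthetic check: assumed source strength 400 (counts * m^2/s), background 5 counts/s
d = np.array([1.0, 2.0, 4.0, 8.0])
counts = 400.0 / d**2 + 5.0
S, b = calibrate_inverse_square(d, counts)
```

The statistical approach mentioned in the abstract would instead treat the counts as Poisson-distributed and maximize the corresponding likelihood rather than minimizing squared error.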
Multisensor fusion for 3D target tracking using track-before-detect particle filter
NASA Astrophysics Data System (ADS)
Moshtagh, Nima; Romberg, Paul M.; Chan, Moses W.
2015-05-01
This work presents a novel fusion mechanism for estimating the three-dimensional trajectory of a moving target using images collected by multiple imaging sensors. The proposed projective particle filter avoids explicit target detection prior to fusion. In the projective particle filter, particles that represent the posterior density (of the target state in a high-dimensional space) are projected onto the lower-dimensional observation space. Measurements are generated directly in the observation space (image plane) and a marginal (sensor) likelihood is computed. The particles' states and their weights are updated using the joint likelihood computed from all the sensors. The 3D state estimate of the target (system track) is then generated from the states of the particles. This approach is similar to track-before-detect particle filters, which are known to perform well in tracking dim and stealthy targets in image collections. Our approach extends track-before-detect to 3D tracking using the projective particle filter. The performance of this measurement-level fusion method is compared with that of a track-level fusion algorithm using the projective particle filter. In the track-level fusion algorithm, the 2D sensor tracks are generated separately and transmitted to a fusion center, where they are treated as measurements to the state estimator and fused to reconstruct the system track. A realistic synthetic scenario with a boosting target was generated and used to study the performance of the two fusion mechanisms.
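The core weighting step of a projective particle filter, projecting 3D particles into each sensor's image plane and accumulating a joint likelihood across sensors, can be sketched as follows. This is a toy illustration with hypothetical pinhole camera matrices and a Gaussian pixel likelihood, not the authors' implementation.

```python
import numpy as np

def project(P, X):
    """Pinhole projection of 3D points X (N,3) with a 3x4 matrix P to (N,2) pixels."""
    Xh = np.hstack([X, np.ones((len(X), 1))])
    uvw = Xh @ P.T
    return uvw[:, :2] / uvw[:, 2:3]

def update(particles, weights, cameras, detections, sigma=2.0):
    """Weight 3D particles by the joint likelihood of their projections in
    every sensor's image plane (no explicit per-sensor detection step)."""
    for P, z in zip(cameras, detections):
        uv = project(P, particles)
        d2 = np.sum((uv - z) ** 2, axis=1)
        weights = weights * np.exp(-0.5 * d2 / sigma**2)
    weights = weights / weights.sum()
    est = (weights[:, None] * particles).sum(axis=0)   # weighted-mean 3D estimate
    return est, weights

rng = np.random.default_rng(0)
truth = np.array([0.0, 0.0, 5.0])
particles = truth + rng.normal(scale=1.0, size=(2000, 3))
weights = np.full(2000, 1.0 / 2000)

# Two assumed cameras, focal length 100 px: one at the origin looking along +z,
# one at x = -10 looking along +x, so together they constrain all three axes.
P1 = np.array([[100.0, 0, 0, 0], [0, 100.0, 0, 0], [0, 0, 1.0, 0]])
P2 = np.array([[0, 0, 100.0, 0], [0, 100.0, 0, 0], [1.0, 0, 0, 10.0]])
detections = [project(P, truth[None])[0] for P in (P1, P2)]

est, weights = update(particles, weights, (P1, P2), detections)
```

Note how the fusion happens in the measurement (image) space: no 2D track is ever formed, which is the distinction from the track-level baseline in the abstract.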
Motion Artifact Quantification and Sensor Fusion for Unobtrusive Health Monitoring.
Hoog Antink, Christoph; Schulz, Florian; Leonhardt, Steffen; Walter, Marian
2017-12-25
Sensors integrated into objects of everyday life potentially allow unobtrusive health monitoring at home. However, since the coupling of sensors and subject is not as well defined as in a clinical setting, the signal quality is much more variable and can be disturbed significantly by motion artifacts. One way of tackling this challenge is the combined evaluation of multiple channels via sensor fusion. For robust and accurate sensor fusion, analyzing the influence of motion on different modalities is crucial. In this work, a multimodal sensor setup integrated into an armchair is presented that combines capacitively coupled electrocardiography, reflective photoplethysmography, two high-frequency impedance sensors and two types of ballistocardiography sensors. To quantify motion artifacts, a motion protocol performed by healthy volunteers is recorded with a motion capture system, and reference sensors perform cardiorespiratory monitoring. The shape-based signal-to-noise ratio SNR S is introduced and used to quantify the effect of motion on different sensing modalities. Based on this analysis, an optimal combination of sensors and fusion methodology is developed and evaluated. Using the proposed approach, beat-to-beat heart rate is estimated with a coverage of 99.5% and a mean absolute error of 7.9 ms on 425 min of data from seven volunteers in a proof-of-concept measurement scenario.
Expanding the Role of Emergency Medical Services in Homeland Security
2013-03-01
Contents: A. Background and Overview; B. Data Analysis; III. Analysis and Evaluation—EMS as Intelligence Sensors: A. Acting as Intelligence Sensors; B. Prevention Models.
Vision Guided Intelligent Robot Design And Experiments
NASA Astrophysics Data System (ADS)
Slutzky, G. D.; Hall, E. L.
1988-02-01
The concept of an intelligent robot is an important topic combining sensors, manipulators, and artificial intelligence to design a useful machine. Vision systems, tactile sensors, proximity switches and other sensors provide the elements necessary for simple game playing as well as industrial applications. These sensors permit adaptation to a changing environment. The AI techniques permit advanced forms of decision making, adaptive responses, and learning, while the manipulator provides the ability to perform various tasks. Computer languages such as LISP and OPS5 have been utilized to achieve expert-systems approaches to solving real-world problems. The purpose of this paper is to describe several examples of visually guided intelligent robots, including both stationary and mobile robots. Demonstrations are presented of a system for constructing and solving a popular peg game, a robot lawn mower, and a box-stacking robot. The experience gained from these and other systems provides insight into what may realistically be expected from the next generation of intelligent machines.
Multisensor Parallel Largest Ellipsoid Distributed Data Fusion with Unknown Cross-Covariances
Liu, Baoyu; Zhan, Xingqun; Zhu, Zheng H.
2017-01-01
As the largest ellipsoid (LE) data fusion algorithm can only be applied to a two-sensor system, this contribution proposes a parallel fusion structure to introduce the LE algorithm into multisensor systems with unknown cross-covariances; three parallel fusion structures based on different estimate-pairing methods are presented and analyzed. To assess the influence of fusion structure on fusion performance, two assessment parameters are defined: Fusion Distance and Fusion Index. Moreover, the formula for calculating the upper bounds of the actual fused error covariances of the presented multisensor LE fusers is provided. As demonstrated with simulation examples, the Fusion Index indicates a fuser's actual fused accuracy, its sensitivity to sensor order, and its robustness to the accuracy of newly added sensors. Compared to the LE fuser with a sequential structure, the LE fusers with the proposed parallel structures not only significantly improve these properties but also achieve better consistency and computational efficiency. The presented multisensor LE fusers are generally more accurate than the covariance intersection (CI) fusion algorithm and are consistent when the local estimates are weakly correlated. PMID:28661442
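Covariance intersection, the baseline the LE fusers are compared against, admits a compact sketch: fuse two estimates whose cross-covariance is unknown by a convex combination of their information matrices, choosing the weight that minimizes the fused trace. This is a generic two-estimate illustration (weight found by grid search); the LE algorithm itself is not reproduced here.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=101):
    """CI fusion of two estimates with unknown cross-correlation:
    P_f^-1 = w P1^-1 + (1-w) P2^-1, with w minimizing trace(P_f)."""
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        P = np.linalg.inv(w * I1 + (1.0 - w) * I2)
        if best is None or np.trace(P) < best[0]:
            x = P @ (w * I1 @ x1 + (1.0 - w) * I2 @ x2)
            best = (np.trace(P), x, P)
    return best[1], best[2]

# Two local estimates that are each accurate in a different component.
x1, P1 = np.zeros(2), np.diag([1.0, 4.0])
x2, P2 = np.zeros(2), np.diag([4.0, 1.0])
xf, Pf = covariance_intersection(x1, P1, x2, P2)
```

By symmetry the optimal weight here is w = 0.5, giving a fused trace of 3.2 versus 5.0 for either input alone, while remaining consistent for any actual cross-covariance.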
Low-Cost Ultrasonic Distance Sensor Arrays with Networked Error Correction
Dai, Hongjun; Zhao, Shulin; Jia, Zhiping; Chen, Tianzhou
2013-01-01
Distance is one of the basic quantities in manufacturing and control, and ultrasonic distance sensors have been widely used as a low-cost measuring tool. However, the propagation of ultrasonic waves is greatly affected by environmental factors such as temperature, humidity and atmospheric pressure. To solve the industrially significant problem of inaccurate measurement, this paper presents a novel ultrasonic distance sensor model using networked error correction (NEC) trained on experimental data. This is more accurate than other existing approaches because it uses information from indirect association with neighboring sensors, which has not been considered before. The NEC technique, focusing on optimization of the relationship of the topological structure of sensor arrays, is implemented to compensate for erroneous measurements caused by the environment. We apply the maximum likelihood method to determine the optimal fusion data set and use a neighbor discovery algorithm to identify neighbor nodes quickly. Furthermore, we adopt the NEC optimization algorithm, which takes full advantage of the correlation coefficients of neighboring sensors. The experimental results demonstrate that the ranging errors of the NEC system are within 2.20%; furthermore, the mean absolute percentage error is reduced to 0.01% after three iterations of the method. The proposed distance measurement method, with the capability of NEC, would bring a significant advantage to intelligent industrial automation. PMID:24013491
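For independent Gaussian readings, the maximum-likelihood fusion step mentioned above reduces to inverse-variance weighting of neighboring sensors' readings. A minimal sketch (illustrative only; the paper's full NEC algorithm additionally exploits topology and correlation coefficients):

```python
def ml_fuse(readings, variances):
    """Maximum-likelihood fusion of independent Gaussian distance readings:
    inverse-variance weighted mean, with the fused variance 1 / sum(1/var)."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    estimate = sum(w * r for w, r in zip(weights, readings)) / total
    return estimate, 1.0 / total

# A node's own reading (1.0 m, var 0.01) fused with a noisier neighbor (1.2 m, var 0.03).
est, var = ml_fuse([1.0, 1.2], [0.01, 0.03])
```

The fused variance is always smaller than the best individual variance, which is the statistical basis for letting neighbor information correct a locally disturbed sensor.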
SVM-based multi-sensor fusion for free-living physical activity assessment.
Liu, Shaopeng; Gao, Robert X; John, Dinesh; Staudenmayer, John; Freedson, Patty S
2011-01-01
This paper presents a sensor fusion method for assessing the physical activity (PA) of human subjects, based on support vector machines (SVMs). Specifically, acceleration and ventilation measured by a wearable multi-sensor device on 50 test subjects performing 13 types of activities of varying intensities are analyzed, from which the activity types and related energy expenditures are derived. The results show that the method correctly recognized the 13 activity types 84.7% of the time, which is 26% higher than using a hip accelerometer alone. The method also predicted the associated energy expenditure with a root mean square error of 0.43 METs, 43% lower than using a hip accelerometer alone. Furthermore, the fusion method was effective in reducing the subject-to-subject variability (standard deviation of recognition accuracies across subjects) in activity recognition, especially when data from the ventilation sensor were added to the fusion model. These results demonstrate that the presented multi-sensor fusion technique is more effective in assessing activities of varying intensities than traditional accelerometer-alone methods.
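Feature-level fusion with an SVM can be sketched as follows. This is a synthetic toy in which a ventilation feature is informative and the accelerometer features are deliberately uninformative, so fusion visibly helps; the linear SVM is trained by Pegasos-style sub-gradient descent as a self-contained stand-in for the authors' setup.

```python
import numpy as np

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """Primal linear SVM trained by Pegasos-style sub-gradient descent
    on the hinge loss; labels y must be in {-1, +1}."""
    w, b, t = np.zeros(X.shape[1]), 0.0, 0
    for _ in range(epochs):
        for i in range(len(X)):
            t += 1
            eta = 1.0 / (lam * t)
            margin = y[i] * (X[i] @ w + b)
            w *= (1.0 - eta * lam)          # regularization shrinkage
            if margin < 1.0:                # hinge-loss violation
                w += eta * y[i] * X[i]
                b += eta * y[i]
    return w, b

def accuracy(w, b, X, y):
    return float(np.mean(np.sign(X @ w + b) == y))

rng = np.random.default_rng(1)
n = 200
vent = rng.normal(size=n)            # ventilation feature: informative (by construction)
accel = rng.normal(size=(n, 2))      # accelerometer features: pure noise here
y = np.where(vent > 0, 1, -1)        # activity label driven by ventilation

X_accel = accel
X_fused = np.column_stack([accel, vent])
w_a, b_a = train_linear_svm(X_accel[:150], y[:150])
w_f, b_f = train_linear_svm(X_fused[:150], y[:150])
acc_accel = accuracy(w_a, b_a, X_accel[150:], y[150:])
acc_fused = accuracy(w_f, b_f, X_fused[150:], y[150:])
```

The fused feature vector lets the classifier exploit whichever modality carries the discriminative signal, mirroring the abstract's finding that adding the ventilation channel improves recognition.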
Protecting Networks Via Automated Defense of Cyber Systems
2016-09-01
Automated Defense of Cyber Systems is built upon three core technological components: sensors, autonomics, and artificial intelligence. Our conclusion is that automation is the future of cyber defense, and that advances are being made in each of these areas. Subject terms: Internet of Things, autonomics, sensors, artificial intelligence, cyber defense, active cyber defense, automated indicator sharing.
Smart Distributed Sensor Fields: Algorithms for Tactical Sensors
2013-12-23
Tasks ranging from detecting, identifying, and localizing/tracking interesting events, to discarding irrelevant data, to providing actionable intelligence currently require significant human supervision. The main idea is to reduce the problem to the relevant data, and then reason intelligently over that data while maintaining a view of the overall system.
Multisensor fusion with non-optimal decision rules: the challenges of open world sensing
NASA Astrophysics Data System (ADS)
Minor, Christian; Johnson, Kevin
2014-05-01
In this work, simple, generic models of chemical sensing are used to simulate sensor array data and to illustrate the impact that specific design choices have on overall system performance. The ability of multisensor systems to perform multianalyte detection (i.e., distinguish multiple targets) is explored by examining the distinction between fundamental design-related limitations, stemming from mismatches between mixture composition and fused sensor measurement spaces, and limitations that arise from measurement uncertainty. Insight into the limits and potential of sensor fusion to robustly address detection tasks in realistic field conditions can be gained by examining a) the underlying geometry of both the composition space of sources one hopes to elucidate and the measurement space a fused sensor system is capable of generating, and b) the informational impact of uncertainty on both of these spaces. For instance, what is the potential impact on sensor fusion in an open-world scenario where unknown interferents may contaminate target signals? Under complex and dynamic backgrounds, decision rules may implicitly become non-optimal, and adding sensors may increase the amount of conflicting information observed. This suggests that the manner in which a decision rule handles sensor conflict can be critical in leveraging sensor fusion for effective open-world sensing, and becomes exponentially more important as more sensors are added. Results and design considerations for handling conflicting evidence in Bayes and Dempster-Shafer fusion frameworks are presented. Bayesian decision theory is used to provide an upper limit on the detection performance of the simulated sensor systems.
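Dempster's rule of combination, including the conflict mass K that the discussion above highlights, can be sketched as follows. This is a generic illustration with hypothetical 'target'/'clutter' hypotheses, not the paper's simulation framework.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule for two mass functions whose focal elements are
    frozensets of hypotheses. Returns (combined masses, conflict K)."""
    raw, K = {}, 0.0
    for (A, a), (B, b) in product(m1.items(), m2.items()):
        C = A & B
        if C:
            raw[C] = raw.get(C, 0.0) + a * b
        else:
            K += a * b                      # mass assigned to the empty set
    # Normalize the non-conflicting mass by 1 - K.
    return {C: v / (1.0 - K) for C, v in raw.items()}, K

T, TC, C = frozenset({'target'}), frozenset({'target', 'clutter'}), frozenset({'clutter'})
m1 = {T: 0.8, TC: 0.2}     # sensor 1 strongly favors 'target'
m2 = {C: 0.6, TC: 0.4}     # sensor 2 favors 'clutter' -> conflicting evidence
combined, K = dempster_combine(m1, m2)
```

Here K = 0.48: nearly half the joint mass is conflicting, and the normalization by 1 - K is exactly the step whose behavior under open-world conflict the abstract calls critical.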
Reconfigurable intelligent sensors for health monitoring: a case study of pulse oximeter sensor.
Jovanov, E; Milenkovic, A; Basham, S; Clark, D; Kelley, D
2004-01-01
Design of low-cost, miniature, lightweight, ultra low-power, intelligent sensors capable of customization and seamless integration into a body area network for health monitoring applications presents one of the most challenging tasks for system designers. To answer this challenge we propose a reconfigurable intelligent sensor platform featuring a low-power microcontroller, a low-power programmable logic device, a communication interface, and a signal conditioning circuit. The proposed solution promises a cost-effective, flexible platform that allows easy customization, run-time reconfiguration, and energy-efficient computation and communication. The development of a common platform for multiple physical sensors and a repository of both software procedures and soft intellectual property cores for hardware acceleration will increase reuse and alleviate costs of transition to a new generation of sensors. As a case study, we present an implementation of a reconfigurable pulse oximeter sensor.
Sensor fusion for synthetic vision
NASA Technical Reports Server (NTRS)
Pavel, M.; Larimer, J.; Ahumada, A.
1991-01-01
Display methodologies are explored for fusing images gathered by millimeter wave sensors with images rendered from an on-board terrain data base to facilitate visually guided flight and ground operations in low visibility conditions. An approach to fusion based on multiresolution image representation and processing is described which facilitates fusion of images differing in resolution within and between images. To investigate possible fusion methods, a workstation-based simulation environment is being developed.
NASA Astrophysics Data System (ADS)
Hanson, Jeffrey A.; McLaughlin, Keith L.; Sereno, Thomas J.
2011-06-01
We have developed a flexible, target-driven, multi-modal, physics-based fusion architecture that efficiently searches sensor detections for targets and rejects clutter while controlling the combinatoric problems that commonly arise in data-driven fusion systems. The informational constraints imposed by long lifetime requirements make systems vulnerable to false alarms. We demonstrate that our data fusion system significantly reduces false alarms while maintaining high sensitivity to threats. In addition, mission goals can vary substantially in terms of targets-of-interest, required characterization, acceptable latency, and false alarm rates. Our fusion architecture provides the flexibility to match these trade-offs with mission requirements unlike many conventional systems that require significant modifications for each new mission. We illustrate our data fusion performance with case studies that span many of the potential mission scenarios including border surveillance, base security, and infrastructure protection. In these studies, we deployed multi-modal sensor nodes - including geophones, magnetometers, accelerometers and PIR sensors - with low-power processing algorithms and low-bandwidth wireless mesh networking to create networks capable of multi-year operation. The results show our data fusion architecture maintains high sensitivities while suppressing most false alarms for a variety of environments and targets.
2018-01-30
Due to this, Fusion was built with the goal of extensibility throughout the architecture. The Fusion infrastructure enables software... Two modes were developed in IMPACT (i.e., normal full coverage patrol (NFCP) and highly mobile (HM)). In both NFCP and HM, all UxVs patrol their assigned...
Intelligent Data Fusion for Wide-Area Assessment of UXO Contamination
2008-02-29
Neural network fusion capabilities for efficient implementation of tracking algorithms
NASA Astrophysics Data System (ADS)
Sundareshan, Malur K.; Amoozegar, Farid
1997-03-01
The ability to efficiently fuse information of different forms to facilitate intelligent decision making is one of the major capabilities of trained multilayer neural networks that is now being recognized. While development of innovative adaptive control algorithms for nonlinear dynamical plants that attempt to exploit these capabilities seems to be more popular, a corresponding development of nonlinear estimation algorithms using these approaches, particularly for application in target surveillance and guidance operations, has not received similar attention. We describe the capabilities and functionality of neural network algorithms for data fusion and implementation of tracking filters. To discuss details and to serve as a vehicle for quantitative performance evaluations, the illustrative case of estimating the position and velocity of surveillance targets is considered. Efficient target-tracking algorithms that can utilize data from a host of sensing modalities and are capable of reliably tracking even uncooperative targets executing fast and complex maneuvers are of interest in a number of applications. The primary motivation for employing neural networks in these applications comes from the efficiency with which more features extracted from different sensor measurements can be utilized as inputs for estimating target maneuvers. A system architecture that efficiently integrates the fusion capabilities of a trained multilayer neural net with the tracking performance of a Kalman filter is described. The innovation lies in the way the fusion of multisensor data is accomplished to facilitate improved estimation without increasing the computational complexity of the dynamical state estimator itself.
Review on the Traction System Sensor Technology of a Rail Transit Train.
Feng, Jianghua; Xu, Junfeng; Liao, Wu; Liu, Yong
2017-06-11
The development of high-speed intelligent rail transit has increased the number of sensors applied on trains. These play an important role in train state control and monitoring. These sensors generally work in a severe environment, so the key problem for sensor data acquisition is to ensure data accuracy and reliability. In this paper, we follow the sequence of sensor signal flow, present sensor signal sensing technology, sensor data acquisition, and processing technology, as well as sensor fault diagnosis technology based on the voltage, current, speed, and temperature sensors which are commonly used in train traction systems. Finally, intelligent sensors and future research directions of rail transit train sensors are discussed.
Proceedings of the Augmented VIsual Display (AVID) Research Workshop
NASA Technical Reports Server (NTRS)
Kaiser, Mary K. (Editor); Sweet, Barbara T. (Editor)
1993-01-01
The papers, abstracts, and presentations collected here are from a three-day workshop focused on sensor modeling and simulation, and image enhancement, processing, and fusion. The technical sessions emphasized how sensor technology can be used to create visual imagery adequate for aircraft control and operations. Participants from industry, government, and academic laboratories contributed to panels on Sensor Systems, Sensor Modeling, Sensor Fusion, Image Processing (Computer and Human Vision), and Image Evaluation and Metrics.
Application of sensor networks to intelligent transportation systems.
DOT National Transportation Integrated Search
2009-12-01
The objective of the research performed is the application of wireless sensor networks to intelligent transportation infrastructures, with the aim of increasing their dependability and improving the efficacy of data collection and utilization. Exampl...
Intelligent modular star and target tracker: a new generation of attitude sensors
NASA Astrophysics Data System (ADS)
Schmidt, Uwe; Strobel, Rainer; Wunder, Dietmar; Graf, Eberhart
2018-04-01
This paper, "Intelligent modular star and target tracker: a new generation of attitude sensors," was presented as part of International Conference on Space Optics—ICSO 1997, held in Toulouse, France.
Sensor Data Fusion with Z-Numbers and Its Application in Fault Diagnosis
Jiang, Wen; Xie, Chunhe; Zhuang, Miaoyan; Shou, Yehang; Tang, Yongchuan
2016-01-01
Sensor data fusion technology is widely employed in fault diagnosis. The information in a sensor data fusion system is characterized by not only fuzziness, but also partial reliability. Uncertain information of sensors, including randomness, fuzziness, etc., has been extensively studied recently. However, the reliability of a sensor is often overlooked or cannot be analyzed adequately. A Z-number, Z = (A, B), can represent the fuzziness and the reliability of information simultaneously, where the first component A represents a fuzzy restriction on the values of uncertain variables and the second component B is a measure of the reliability of A. In order to model and process the uncertainties in a sensor data fusion system reasonably, in this paper, a novel method combining the Z-number and Dempster–Shafer (D-S) evidence theory is proposed, where the Z-number is used to model the fuzziness and reliability of the sensor data and the D-S evidence theory is used to fuse the uncertain information of Z-numbers. The main advantages of the proposed method are that it provides a more robust measure of reliability to the sensor data, and the complementary information of multi-sensors reduces the uncertainty of the fault recognition, thus enhancing the reliability of fault detection. PMID:27649193
Sensor Fusion Based Model for Collision Free Mobile Robot Navigation
Almasri, Marwah; Elleithy, Khaled; Alajlan, Abrar
2015-01-01
Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each is equipped with various types of sensors, such as GPS, camera, infrared and ultrasonic sensors, used to observe the surrounding environment. However, these sensors sometimes fail or produce inaccurate readings. Therefore, the integration of sensor fusion helps to solve this dilemma and enhances overall performance. This paper presents collision-free mobile robot navigation based on a fuzzy logic fusion model. Eight distance sensors and a range-finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs (the eight distance sensors and the camera), two outputs (the left and right velocities of the mobile robot’s wheels), and 24 fuzzy rules for the robot’s movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes the collision avoidance based on the fuzzy logic fusion model and the line following robot, has been implemented and tested through simulation and real-time experiments. Various scenarios are presented, with static and dynamic obstacles, using one robot and two robots, while avoiding obstacles of different shapes and sizes. PMID:26712766
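A fuzzy controller of this kind, mapping distance-sensor memberships through rules to wheel velocities by weighted-average defuzzification, can be sketched as follows. This is a three-rule toy with assumed membership shapes and a 0.5 m ramp, far smaller than the paper's nine-input, 24-rule system.

```python
def near(d):
    """Membership of 'obstacle near' for a distance d in meters (assumed 0.5 m ramp)."""
    return max(0.0, min(1.0, (0.5 - d) / 0.5))

def far(d):
    """Membership of 'path clear' on the same assumed ramp."""
    return max(0.0, min(1.0, d / 0.5))

def fuzzy_controller(d_left, d_right, v_max=1.0):
    """Three illustrative rules, defuzzified by a weighted average:
    near-left -> turn right, near-right -> turn left, both-far -> go straight."""
    rules = [
        (near(d_left),                   ( v_max, -v_max)),   # turn right
        (near(d_right),                  (-v_max,  v_max)),   # turn left
        (min(far(d_left), far(d_right)), ( v_max,  v_max)),   # go straight
    ]
    total = sum(w for w, _ in rules) or 1.0
    v_left  = sum(w * out[0] for w, out in rules) / total
    v_right = sum(w * out[1] for w, out in rules) / total
    return v_left, v_right
```

With a clear path both wheels run at full speed; as an obstacle approaches on one side, the rules blend smoothly into a turn away from it, which is the behavior the 24-rule system generalizes across nine inputs.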
Computational Intelligence for Medical Imaging Simulations.
Chang, Victor
2017-11-25
This paper describes how to simulate medical imaging by computational intelligence to explore areas that cannot be easily achieved by traditional ways, including genes and proteins simulations related to cancer development and immunity. This paper has presented simulations and virtual inspections of BIRC3, BIRC6, CCL4, KLKB1 and CYP2A6 with their outputs and explanations, as well as brain segment intensity due to dancing. Our proposed MapReduce framework with the fusion algorithm can simulate medical imaging. The concept is very similar to the digital surface theories to simulate how biological units can get together to form bigger units, until the formation of the entire unit of biological subject. The M-Fusion and M-Update function by the fusion algorithm can achieve a good performance evaluation which can process and visualize up to 40 GB of data within 600 s. We conclude that computational intelligence can provide effective and efficient healthcare research offered by simulations and visualization.
Zhang, Xinzheng; Rad, Ahmad B; Wong, Yiu-Kwong
2012-01-01
This paper presents a sensor fusion strategy applied to Simultaneous Localization and Mapping (SLAM) in dynamic environments. The designed approach consists of two features: (i) the first is a fusion module which synthesizes line segments obtained from a laser rangefinder and line features extracted from a monocular camera. This policy eliminates pseudo-segments that appear from momentary pauses of dynamic objects in the laser data. (ii) The second is a modified multi-sensor point estimation fusion SLAM (MPEF-SLAM) that incorporates two individual Extended Kalman Filter (EKF) based SLAM algorithms: monocular and laser SLAM. The localization error of the fused SLAM is reduced compared with that of each individual SLAM. Additionally, a new data association technique based on the homography transformation matrix is developed for monocular SLAM. This data association method reduces redundant computation. The experimental results validate the performance of the proposed sensor fusion and data association methods.
Kim, Min Young; Lee, Hyunkee; Cho, Hyungsuck
2008-04-10
One major research issue associated with 3D perception by robotic systems is the creation of efficient sensor systems that can generate dense range maps reliably. A visual sensor system for robotic applications is developed that is inherently equipped with two types of sensor, an active trinocular vision and a passive stereo vision. Unlike conventional active vision systems, which use a large number of images with variations of projected patterns for dense range map acquisition, or conventional passive vision systems, which work well only in specific environments with sufficient feature information, a cooperative bidirectional sensor fusion method for this visual sensor system enables us to acquire a reliable dense range map using active and passive information simultaneously. The fusion algorithms are composed of two parts, one in which the passive stereo vision helps the active vision and the other in which the active trinocular vision helps the passive one. The first part matches the laser patterns in stereo laser images with the help of intensity images; the second part utilizes an information fusion technique using dynamic programming, in which image regions between laser patterns are matched pixel-by-pixel with the help of the fusion results obtained in the first part. To determine how the proposed sensor system and fusion algorithms work in real applications, the sensor system is implemented on a robotic system and the proposed algorithms are applied. A series of experimental tests is performed for a variety of configurations of robot and environments. The performance of the sensor system is discussed in detail.
Kalman filter-based EM-optical sensor fusion for needle deflection estimation.
Jiang, Baichuan; Gao, Wenpeng; Kacher, Daniel; Nevo, Erez; Fetics, Barry; Lee, Thomas C; Jayender, Jagadeesan
2018-04-01
In many clinical procedures involving needle insertion, such as cryoablation, accurate placement of the needle's tip at the desired target is critical for optimizing the treatment and minimizing damage to the neighboring anatomy. However, due to the interaction force between the needle and tissue, considerable error in intraoperative tracking of the needle tip can be observed as the needle deflects. In this paper, measurement data from an optical sensor at the needle base and a magnetic resonance (MR) gradient field-driven electromagnetic (EM) sensor placed 10 cm from the needle tip are used within a model-integrated Kalman filter-based sensor fusion scheme. Bending model-based estimations and the EM-based direct estimation are used as the measurement vectors in the Kalman filter, thus establishing an online estimation approach. Static tip bending experiments show that the fusion method can reduce the mean error of the tip position estimation from 29.23 mm for the optical sensor-based approach to 3.15 mm for the fusion-based approach at the MRI isocenter, and from 39.96 to 6.90 mm at the MRI entrance. This work establishes a novel sensor fusion scheme that incorporates model information, enabling real-time tracking of needle deflection with MRI compatibility in a free-hand operating setup.
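One common way to fuse two sensors in a Kalman filter, stacking both readings into a single measurement vector so each contributes according to its noise covariance, can be sketched as follows. This is a one-dimensional toy with assumed noise variances, not the paper's bending-model formulation.

```python
import numpy as np

def kf_update(x, P, z, H, R):
    """Standard Kalman measurement update for state x with covariance P,
    stacked measurement z, observation matrix H and noise covariance R."""
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# One scalar state (tip position) observed by two sensors at once.
x = np.array([0.0])                 # prior mean
P = np.array([[1.0]])               # prior variance
H = np.array([[1.0], [1.0]])        # both sensors observe the position directly
R = np.diag([0.04, 0.01])           # assumed: sensor 1 noisier than sensor 2
z = np.array([0.9, 1.02])           # the two simultaneous readings
x, P = kf_update(x, P, z, H, R)
```

The posterior pulls toward the less noisy sensor (here the second), and the fused variance is smaller than either sensor could achieve alone, which is the mechanism behind the error reductions reported in the abstract.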
Evaluation of taste solutions by sensor fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kojima, Yohichiro; Sato, Eriko; Atobe, Masahiko
In our previous studies, properties of taste solutions were discriminated based on the sound velocity and amplitude of ultrasonic waves propagating through the solutions. However, to make this method applicable to beverages, which contain many taste substances, further studies are required. In this study, the waveform of an ultrasonic wave with a frequency of approximately 5 MHz propagating through a solution was measured and subjected to frequency analysis. Furthermore, taste sensing requires sensor fusion techniques to effectively capture the chemical and physical parameters of taste solutions. This report proposes and examines a sensor fusion method that combines the ultrasonic sensor with other sensors, such as a surface plasmon resonance (SPR) sensor, to estimate taste. As a result, differences among pure water and two basic taste solutions were clearly observed as differences in their properties. Furthermore, a self-organizing neural network was applied to the obtained data to clarify the differences among the solutions.
Desensitized Optimal Filtering and Sensor Fusion Toolkit
NASA Technical Reports Server (NTRS)
Karlgaard, Christopher D.
2015-01-01
Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions, as well as a Monte Carlo analysis capability, is included to enable statistical performance evaluations.
Higher-level fusion for military operations based on abductive inference: proof of principle
NASA Astrophysics Data System (ADS)
Pantaleev, Aleksandar V.; Josephson, John
2006-04-01
The ability of contemporary military commanders to estimate and understand complicated situations already suffers from information overload, and the situation can only grow worse. We describe a prototype application that uses abductive inferencing to fuse information from multiple sensors and evaluate the evidence for higher-level hypotheses that are close to the levels of abstraction needed for decision making (approximately JDL levels 2 and 3). Abductive inference (abduction, inference to the best explanation) is a pattern of reasoning that occurs naturally in diverse settings such as medical diagnosis, criminal investigations, scientific theory formation, and military intelligence analysis. Because abduction is part of common-sense reasoning, implementations of it can produce reasoning traces that are readily understandable to humans. Automated abductive inferencing can be deployed to augment human reasoning, taking advantage of computation to process large amounts of information and to bypass limits to human attention and short-term memory. We illustrate the workings of the prototype system by describing an example of its use for small-unit military operations in an urban setting. Knowledge was encoded as it might be captured prior to engagement from a standard military decision making process (MDMP) and analysis of the commander's priority intelligence requirements (PIR). The system is able to reasonably estimate the evidence for higher-level hypotheses based on information from multiple sensors. Its inference processes can be examined closely to verify correctness. Decision makers can override conclusions at any level, and changes will propagate appropriately.
Intelligent multi-sensor integrations
NASA Technical Reports Server (NTRS)
Volz, Richard A.; Jain, Ramesh; Weymouth, Terry
1989-01-01
Growth in the intelligence of space systems requires the use and integration of data from multiple sensors. Generic tools are being developed for extracting and integrating information obtained from multiple sources. The full spectrum of issues is addressed, ranging from data acquisition, to characterization of sensor data, to adaptive systems for utilizing the data. In particular, the project has three major aspects: multisensor processing, an adaptive approach to object recognition, and distributed sensor system integration.
Multisensor configurations for early sniper detection
NASA Astrophysics Data System (ADS)
Lindgren, D.; Bank, D.; Carlsson, L.; Dulski, R.; Duval, Y.; Fournier, G.; Grasser, R.; Habberstad, H.; Jacquelard, C.; Kastek, M.; Otterlei, R.; Piau, G.-P.; Pierre, F.; Renhorn, I.; Sjöqvist, L.; Steinvall, O.; Trzaskawka, P.
2011-11-01
This contribution reports some of the fusion results from the EDA SNIPOD project, in which different multisensor configurations for sniper detection and localization have been studied. A project aim has been to cover the whole time line from sniper transport and establishment to shot. To do so, different optical sensors with and without laser illumination have been tested, as well as acoustic arrays and solid-state projectile radar. A sensor fusion node collects detections and background statistics from all sensors and employs hypothesis testing and multisensor estimation programs to produce unified and reliable sniper alarms and accurate sniper localizations. Operator interfaces that connect to the fusion node should be able to support both sniper countermeasures and the guidance of personnel to safety. Although the integrated platform has not actually been built, the sensors have been evaluated at common field trials with military ammunition in calibers from 5.56 to 12.7 mm and at sniper distances up to 900 m. It is concluded that integrating complementary sensors for pre- and post-shot sniper detection in a common system with automatic detection and fusion gives superior performance compared to stand-alone sensors. A practical system is most likely designed with a cost-effective subset of the available complementary sensors.
Intelligent On-Board Processing in the Sensor Web
NASA Astrophysics Data System (ADS)
Tanner, S.
2005-12-01
Most existing sensing systems are designed as passive, independent observers. They are rarely aware of the phenomena they observe, and are even less likely to be aware of what other sensors are observing within the same environment. Increasingly, intelligent processing of sensor data is taking place in real-time, using computing resources on-board the sensor or the platform itself. One can imagine a sensor network consisting of intelligent and autonomous space-borne, airborne, and ground-based sensors. These sensors will act independently of one another, yet each will be capable of both publishing and receiving sensor information, observations, and alerts among other sensors in the network. Furthermore, these sensors will be capable of acting upon this information, perhaps altering acquisition properties of their instruments, changing the location of their platform, or updating processing strategies for their own observations to provide responsive information or additional alerts. Such autonomous and intelligent sensor networking capabilities provide significant benefits for collections of heterogeneous sensors within any environment. They are crucial for multi-sensor observations and surveillance, where real-time communication with external components and users may be inhibited, and the environment may be hostile. In all environments, mission automation and communication capabilities among disparate sensors will enable quicker response to interesting, rare, or unexpected events. Additionally, an intelligent network of heterogeneous sensors provides the advantage that all of the sensors can benefit from the unique capabilities of each sensor in the network. The University of Alabama in Huntsville (UAH) is developing a unique approach to data processing, integration and mining through the use of the Adaptive On-Board Data Processing (AODP) framework. 
AODP is a key foundation technology for autonomous internetworking capabilities to support situational awareness by sensors and their on-board processes. The two primary research areas for this project are (1) the on-board processing and communications framework itself, and (2) data mining algorithms targeted to the needs and constraints of the on-board environment. The team is leveraging its experience in on-board processing, data mining, custom data processing, and sensor network design. Several unique UAH-developed technologies are employed in the AODP project, including EVE, an EnVironmEnt for on-board processing, and the data mining tools included in the Algorithm Development and Mining (ADaM) toolkit.
Optical gateway for intelligent buildings: a new open-up window to the optical fibre sensors market?
NASA Astrophysics Data System (ADS)
Fernandez-Valdivielso, Carlos; Matias, Ignacio R.; Arregui, Francisco J.; Bariain, Candido; Lopez-Amo, Manuel
2004-06-01
This paper presents the first optical fiber sensor gateway for integrating these special measurement devices into home automation systems, specifically in buildings that use the KNX European Intelligent Buildings Standard.
Standards-based sensor interoperability and networking SensorWeb: an overview
NASA Astrophysics Data System (ADS)
Bolling, Sam
2012-06-01
The warfighter lacks a unified Intelligence, Surveillance, and Reconnaissance (ISR) environment in which to conduct mission planning, command and control (C2), tasking, collection, exploitation, processing, and data discovery of disparate sensor data across the ISR Enterprise. Legacy sensors and applications are not standardized or integrated for assured, universal access. Existing tasking and collection capabilities are not unified across the enterprise, inhibiting robust C2 of ISR, including near-real-time cross-cueing operations. To address these critical needs, the National Measurement and Signature Intelligence (MASINT) Office (NMO) and partnering Combatant Commands and Intelligence Agencies are developing SensorWeb, an architecture that harmonizes heterogeneous sensor data to a common standard so that users can discover, access, observe, subscribe to, and task sensors. The long-term goal of the SensorWeb initiative is to establish an open, commercial standards-based, service-oriented framework to facilitate plug-and-play sensors. The current development effort will produce non-proprietary deliverables, intended as a Government off the Shelf (GOTS) solution to address the U.S. and Coalition nations' inability to quickly and reliably detect, identify, map, track, and fully understand security threats and operational activities.
Sensor Needs for Control and Health Management of Intelligent Aircraft Engines
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Garg, Sanjay; Hunter, Gary W.; Guo, Ten-Huei; Semega, Kenneth J.
2004-01-01
NASA and the U.S. Department of Defense are conducting programs which support the future vision of "intelligent" aircraft engines for enhancing the affordability, performance, operability, safety, and reliability of aircraft propulsion systems. Intelligent engines will have advanced control and health management capabilities enabling these engines to be self-diagnostic, self-prognostic, and adaptive to optimize performance based upon the current condition of the engine or the current mission of the vehicle. Sensors are a critical technology necessary to enable the intelligent engine vision as they are relied upon to accurately collect the data required for engine control and health management. This paper reviews the anticipated sensor requirements to support the future vision of intelligent engines from a control and health management perspective. Propulsion control and health management technologies are discussed in the broad areas of active component controls, propulsion health management and distributed controls. In each of these three areas individual technologies will be described, input parameters necessary for control feedback or health management will be discussed, and sensor performance specifications for measuring these parameters will be summarized.
Shi, Yunbo; Luo, Yi; Zhao, Wenjie; Shang, Chunxue; Wang, Yadong; Chen, Yinsheng
2013-01-01
This paper describes the design and implementation of a radiosonde which can measure the meteorological temperature, humidity, pressure, and other atmospheric data. The system is composed of a CPU, microwave module, temperature sensor, pressure sensor and humidity sensor array. In order to effectively solve the humidity sensor condensation problem due to the low temperatures in the high altitude environment, a capacitive humidity sensor including four humidity sensors to collect meteorological humidity and a platinum resistance heater was developed using micro-electro-mechanical-system (MEMS) technology. A platinum resistance wire with 99.999% purity and 0.023 mm in diameter was used to obtain the meteorological temperature. A multi-sensor data fusion technique was applied to process the atmospheric data. Static and dynamic experimental results show that the designed humidity sensor with platinum resistance heater can effectively tackle the sensor condensation problem, shorten response times and enhance sensitivity. The humidity sensor array can improve measurement accuracy and obtain a reliable initial meteorological humidity data, while the multi-sensor data fusion technique eliminates the uncertainty in the measurement. The radiosonde can accurately reflect the meteorological changes.
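Fusing the redundant readings of a sensor array can be sketched as a simple inverse-variance weighted average. This is an illustrative sketch of the standard technique, not the paper's specific fusion method, and the numeric readings and variances below are invented for the example:

```python
def inverse_variance_fusion(readings, variances):
    """Fuse redundant sensor readings of one quantity.

    Each reading is weighted by 1/variance, so more trustworthy sensors
    contribute more; the fused variance is lower than any single sensor's."""
    weights = [1.0 / v for v in variances]
    total = sum(weights)
    fused = sum(w * r for w, r in zip(weights, readings)) / total
    fused_var = 1.0 / total
    return fused, fused_var

# Four humidity readings (%RH) with per-sensor noise variances
fused, var = inverse_variance_fusion([62.1, 61.8, 63.0, 62.3],
                                     [0.4, 0.4, 1.5, 0.6])
```

The fused variance 1/Σ(1/σᵢ²) is smaller than the smallest individual variance, which is the basic reason a fused array improves accuracy and reduces measurement uncertainty.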
NASA Astrophysics Data System (ADS)
McCullough, Claire L.; Novobilski, Andrew J.; Fesmire, Francis M.
2006-04-01
Faculty from the University of Tennessee at Chattanooga and the University of Tennessee College of Medicine, Chattanooga Unit, have used data mining techniques and neural networks to examine a set of fourteen features, data items, and HUMINT assessments for 2,148 emergency room patients with symptoms possibly indicative of Acute Coronary Syndrome (ACS). Specifically, the authors have generated Bayesian networks describing linkages and causality in the data and have compared them with neural networks. The data include objective information routinely collected during triage and the physician's initial case assessment, a HUMINT appraisal. Both the neural network and the Bayesian network were used to fuse the disparate types of information with the goal of forecasting thirty-day adverse patient outcome. This paper presents details of the methods of data fusion, including both the data mining techniques and the neural network. Results of the two methods are compared using receiver operating characteristic curves, both when using only objective features and when including the subjective physician's assessment. While preliminary, the results of this continuing study are significant both from the perspective of the potential use of intelligent fusion of biomedical informatics to aid the physician in prescribing treatment necessary to prevent serious adverse outcomes from ACS and as a model of fusing objective data with subjective HUMINT assessment. Possible future work includes extension of the successfully demonstrated intelligent fusion methods to other medical applications, and use of decision-level fusion to combine results from the data mining and neural network approaches for even more accurate outcome prediction.
Kim, Dae-Hee; Choi, Jae-Hun; Lim, Myung-Eun; Park, Soo-Jun
2008-01-01
This paper proposes a method of correcting the distance between an ambient intelligence display and a user based on linear regression and smoothing, by which the distance to a user approaching the display can be output accurately even under unanticipated conditions using a passive infrared (PIR) sensor and an ultrasonic device. The developed system consists of an ambient intelligence display, an ultrasonic transmitter, and a sensor gateway. The modules communicate with each other via RF (radio frequency) links. The ambient intelligence display includes an ultrasonic receiver and a PIR sensor for motion detection. In particular, the system dynamically selects between smoothing and linear regression to process the current input data, based on a judgment process that uses previously stored reliable data held in a queue. In addition, we implemented GUI software in Java for real-time location tracking and the ambient intelligence display.
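The queue-based selection between smoothing and regression-based prediction might be sketched as follows. This is an illustrative reconstruction only; the window size, jump threshold, and class name are assumptions, not the paper's design:

```python
from collections import deque

class DistanceCorrector:
    """Keeps a queue of recent trusted distances; smooths plausible samples
    and substitutes a linear-regression prediction for implausible ones."""

    def __init__(self, window=5, max_jump=50.0):
        self.history = deque(maxlen=window)
        self.max_jump = max_jump

    def _predict(self):
        # Least-squares line through (index, distance), extrapolated one step.
        n = len(self.history)
        xs = range(n)
        mean_x = sum(xs) / n
        mean_y = sum(self.history) / n
        sxx = sum((x - mean_x) ** 2 for x in xs)
        if sxx == 0:
            return mean_y
        slope = sum((x - mean_x) * (y - mean_y)
                    for x, y in zip(xs, self.history)) / sxx
        return mean_y + slope * (n - mean_x)

    def update(self, raw):
        if self.history and abs(raw - self.history[-1]) > self.max_jump:
            value = self._predict()      # implausible jump: trust the trend
        else:
            value = raw                  # plausible sample: accept it
        self.history.append(value)
        return sum(self.history) / len(self.history)   # smoothed output

corrector = DistanceCorrector(window=5, max_jump=50.0)
outputs = [corrector.update(d) for d in [100.0, 102.0, 104.0, 500.0, 106.0]]
```

The spurious 500.0 sample (e.g. an ultrasonic multipath echo) is replaced by the regression prediction, so the smoothed output stays on the user's approach trajectory.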
Sensor and Actuator Needs for More Intelligent Gas Turbine Engines
NASA Technical Reports Server (NTRS)
Garg, Sanjay; Schadow, Klaus; Horn, Wolfgang; Pfoertner, Hugo; Stiharu, Ion
2010-01-01
This paper provides an overview of the controls and diagnostics technologies that are seen as critical for more intelligent gas turbine engines (GTE), with an emphasis on the sensor and actuator technologies that need to be developed for the controls and diagnostics implementation. The objective of the paper is to help the "Customers" of advanced technologies, defense acquisition and aerospace research agencies, understand the state of the art of intelligent GTE technologies, and to help the "Researchers" and "Technology Developers" of GTE sensors and actuators identify what technologies need to be developed to enable the "Intelligent GTE" concepts and focus their research efforts on closing the technology gap. To keep the effort manageable, the focus of the paper is on "On-Board Intelligence" to enable safe and efficient operation of the engine over its lifetime, with an emphasis on gas path performance.
Improved blood glucose estimation through multi-sensor fusion.
Xiong, Feiyu; Hipszer, Brian R; Joseph, Jeffrey; Kam, Moshe
2011-01-01
Continuous glucose monitoring systems are an integral component of diabetes management. Efforts to improve the accuracy and robustness of these systems are at the forefront of diabetes research. Towards this goal, a multi-sensor approach was evaluated in hospitalized patients. In this paper, we report on a multi-sensor fusion algorithm to combine glucose sensor measurements in a retrospective fashion. The results demonstrate the algorithm's ability to improve the accuracy and robustness of the blood glucose estimation with current glucose sensor technology.
Multi-Sensor Fusion with Interaction Multiple Model and Chi-Square Test Tolerant Filter.
Yang, Chun; Mohammadi, Arash; Chen, Qing-Wei
2016-11-02
Motivated by the key importance of multi-sensor information fusion algorithms in state-of-the-art integrated navigation systems due to recent advancements in sensor, telecommunication, and navigation technologies, the paper proposes an improved and innovative fault-tolerant fusion framework. An integrated navigation system is considered consisting of four sensory sub-systems, i.e., Strap-down Inertial Navigation System (SINS), Global Positioning System (GPS), Bei-Dou2 (BD2), and Celestial Navigation System (CNS) navigation sensors. In such multi-sensor applications, on the one hand, the design of an efficient fusion methodology is extremely constrained, especially when no information regarding the system's error characteristics is available. On the other hand, the development of an accurate fault detection and integrity monitoring solution is both challenging and critical. The paper addresses the sensitivity issues of conventional fault detection solutions and the unavailability of a precisely known system model by jointly designing the fault detection and information fusion algorithms. In particular, using ideas from Interacting Multiple Model (IMM) filters, the uncertainty of the system is adjusted adaptively by model probabilities and the proposed fuzzy-based fusion framework. The paper also addresses the problem of using corrupted measurements for fault detection purposes by designing a two-state-propagator chi-square test jointly with the fusion algorithm. Two IMM predictors, running in parallel, are used and alternately reactivated based on the information received from the fusion filter to increase the reliability and accuracy of the proposed detection solution. With the combination of the IMM and the proposed fusion method, we increase the failure sensitivity of the detection system and thereby significantly increase the overall reliability and accuracy of the integrated navigation system.
Simulation results indicate that the proposed fault tolerant fusion framework provides superior performance over its traditional counterparts.
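The chi-square consistency check at the heart of such fault detection can be sketched as the standard normalized-innovation-squared test. This is a generic sketch, not the paper's two-state-propagator design; the threshold shown is the 99% chi-square point for two degrees of freedom:

```python
import numpy as np

def chi_square_gate(innovation, S, threshold=9.21):
    """Flag a measurement as faulty when the normalized innovation squared
    d = v' S^{-1} v exceeds a chi-square threshold (here 99%, 2 DOF)."""
    v = np.atleast_1d(np.asarray(innovation, dtype=float))
    d = float(v @ np.linalg.solve(np.atleast_2d(S), v))
    return d, d > threshold

S = np.eye(2)                                        # innovation covariance from the filter
d_ok, fault_ok = chi_square_gate([0.5, -0.3], S)     # healthy measurement
d_bad, fault_bad = chi_square_gate([8.0, -6.0], S)   # corrupted measurement
```

Running two predictors in parallel, as the paper proposes, lets the system fall back on the propagator unaffected by the corrupted measurement once the gate rejects it.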
Design of a multisensor data fusion system for target detection
NASA Astrophysics Data System (ADS)
Thomopoulos, Stelios C.; Okello, Nickens N.; Kadar, Ivan; Lovas, Louis A.
1993-09-01
The objective of this paper is to discuss the issues that are involved in the design of a multisensor fusion system and provide a systematic analysis and synthesis methodology for the design of the fusion system. The system under consideration consists of multifrequency (similar) radar sensors. However, the fusion design must be flexible to accommodate additional dissimilar sensors such as IR, EO, ESM, and Ladar. The motivation for the system design is the proof of the fusion concept for enhancing the detectability of small targets in clutter. In the context of down-selecting the proper configuration for multisensor (similar and dissimilar, and centralized vs. distributed) data fusion, the issues of data modeling, fusion approaches, and fusion architectures need to be addressed for the particular application being considered. Although the study of different approaches may proceed in parallel, the interplay among them is crucial in selecting a fusion configuration for a given application. The natural sequence for addressing the three different issues is to begin from the data modeling, in order to determine the information content of the data. This information will dictate the appropriate fusion approach. This, in turn, will lead to a global fusion architecture. Both distributed and centralized fusion architectures are used to illustrate the design issues along with Monte-Carlo simulation performance comparison of a single sensor versus a multisensor centrally fused system.
Implementation of Integrated System Fault Management Capability
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Schmalzel, John; Morris, Jon; Smith, Harvey; Turowski, Mark
2008-01-01
The goal is fault management that supports the rocket engine test mission with highly reliable and accurate measurements while improving availability and lifecycle costs. Core elements include an architecture, taxonomy, and ontology (ATO) for DIaK management, together with intelligent sensor, element, controller, subsystem, system, and component processes.
The Intelligent Technologies of Electronic Information System
NASA Astrophysics Data System (ADS)
Li, Xianyu
2017-08-01
Based on a synopsis of system intelligence and information services, this paper sets out the attributes and logical structure of information services, presents an intelligent-technology framework for electronic information systems, and proposes a series of measures, such as optimizing business information flow, advancing data-driven decision capability, improving information fusion precision, strengthening deep learning applications, and enhancing prognostics and health management, and it demonstrates the effectiveness of system operation. These measures benefit the enhancement of system intelligence.
Research on the strategy of underwater united detection fusion and communication using multi-sensor
NASA Astrophysics Data System (ADS)
Xu, Zhenhua; Huang, Jianguo; Huang, Hai; Zhang, Qunfei
2011-09-01
In order to solve the distributed detection fusion problem in underwater target detection when the signal-to-noise ratio (SNR) of the acoustic channel is low, a new strategy for united detection fusion and communication using multiple sensors was proposed. The performance of detection fusion was studied and compared under the Neyman-Pearson criterion when the local sensors use binary phase shift keying (BPSK) and on-off keying (OOK) modulation. A comparative simulation and analysis of the optimal likelihood ratio test and the proposed strategy was completed, and both the theoretical analysis and the simulation indicate that the proposed strategy can improve detection performance effectively. In theory, the proposed strategy of united detection fusion and communication is of great significance to the establishment of an underwater target detection system.
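Fusion of local binary decisions under the Neyman-Pearson framework is commonly formalized with the Chair-Varshney log-likelihood-ratio rule. The sketch below shows that generic rule, not the paper's specific joint detection-communication strategy, and the sensor qualities and threshold are invented for the example:

```python
import math

def fuse_local_decisions(decisions, pd_list, pf_list, threshold=0.0):
    """Chair-Varshney fusion: weight each local detector's binary vote by the
    log-likelihood ratio implied by its detection (pd) and false-alarm (pf)
    probabilities; declare a target when the sum exceeds the threshold."""
    llr = 0.0
    for u, pd, pf in zip(decisions, pd_list, pf_list):
        llr += math.log(pd / pf) if u == 1 else math.log((1.0 - pd) / (1.0 - pf))
    return llr, llr > threshold

pd = [0.9, 0.8, 0.6]    # local detection probabilities
pf = [0.1, 0.1, 0.3]    # local false-alarm probabilities
llr_hit, target_hit = fuse_local_decisions([1, 1, 0], pd, pf)
llr_miss, target_miss = fuse_local_decisions([0, 0, 0], pd, pf)
```

Because each vote is weighted by its sensor's quality, a "detect" from a reliable sensor outweighs a "no detect" from a poor one, which is what makes fusion outperform any single sensor at low acoustic SNR.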
Data analysis and integration of environmental sensors to meet human needs
NASA Astrophysics Data System (ADS)
Santamaria, Amilcare Francesco; De Rango, Floriano; Barletta, Domenico; Falbo, Domenico; Imbrogno, Alessandro
2014-05-01
Nowadays one of the main tasks of technology is to make people's lives simpler and easier. Ambient intelligence is an emerging discipline that brings intelligence to environments, making them sensitive to us. This discipline has developed following the spread of sensor devices, sensor networks, pervasive computing, and artificial intelligence. In this work, we attempt to enhance the Internet of Things (IoT) with intelligence, exploring various interactions between human beings and the environment they live in. In particular, the core of the system is an automation system made up of a domotic control unit and several sensors installed in the environment. The task of the sensors is to collect information from the environment and send it to the control unit. Once the information is collected, the core combines it in order to infer the most accurate human needs. The inferred human needs and the current environment status form the inputs of the intelligence block, whose main goal is to find the right automations to satisfy human needs in real time. The system also provides a speech recognition service that allows users to interact with the system by voice, so human speech can be considered an additional input for smart automatisms.
Batchu, S; Narasimhachar, H; Mayeda, J C; Hall, T; Lopez, J; Nguyen, T; Banister, R E; Lie, D Y C
2017-07-01
Doppler-based non-contact vital signs (NCVS) sensors can monitor heart rates, respiration rates, and motions of patients without physically touching them. We have developed a novel single-board Doppler-based phased-array antenna NCVS biosensor system that can perform robust overnight continuous NCVS monitoring with intelligent automatic subject tracking and optimal beam steering algorithms. Our NCVS sensor achieved overnight continuous vital signs monitoring with an impressive heart-rate monitoring accuracy of over 94% (i.e., within ±5 Beats-Per-Minute vs. a reference sensor), analyzed from over 400,000 data points collected during each overnight monitoring period of ~ 6 hours at a distance of 1.75 meters. The data suggests our intelligent phased-array NCVS sensor can be very attractive for continuous monitoring of low-acuity patients.
Distributed Sensor Fusion for Scalar Field Mapping Using Mobile Sensor Networks.
La, Hung Manh; Sheng, Weihua
2013-04-01
In this paper, autonomous mobile sensor networks are deployed to measure a scalar field and build its map. We develop a novel method for multiple mobile sensor nodes to build this map using noisy sensor measurements. Our method consists of two parts. First, we develop a distributed sensor fusion algorithm by integrating two different distributed consensus filters to achieve cooperative sensing among sensor nodes. This fusion algorithm has two phases. In the first phase, the weighted average consensus filter is developed, which allows each sensor node to find an estimate of the value of the scalar field at each time step. In the second phase, the average consensus filter is used to allow each sensor node to find a confidence of the estimate at each time step. The final estimate of the value of the scalar field is iteratively updated during the movement of the mobile sensors via weighted average. Second, we develop the distributed flocking-control algorithm to drive the mobile sensors to form a network and track the virtual leader moving along the field when only a small subset of the mobile sensors know the information of the leader. Experimental results are provided to demonstrate our proposed algorithms.
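The average-consensus building block of the first phase can be sketched as a standard consensus iteration. This is a minimal sketch on a fixed four-node ring; the step size and topology are assumptions, and the paper's weighted variant additionally carries a per-node confidence alongside each estimate:

```python
def consensus_step(x, neighbors, epsilon=0.2):
    """One consensus iteration: each node moves toward its neighbors' values.

    For a connected graph and epsilon < 1/max_degree, repeated application
    drives every node's value to the network-wide average."""
    return [xi + epsilon * sum(x[j] - xi for j in neighbors[i])
            for i, xi in enumerate(x)]

# Four mobile nodes in a ring, each starting from its own noisy field sample
x = [4.0, 6.0, 5.5, 4.5]
neighbors = [(1, 3), (0, 2), (1, 3), (0, 2)]
for _ in range(60):
    x = consensus_step(x, neighbors)
```

After enough iterations every node agrees on the mean of the initial samples, so each mobile sensor ends up with a shared estimate of the field value without any central fusion node.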
Frequency domain surface EMG sensor fusion for estimating finger forces.
Potluri, Chandrasekhar; Kumar, Parmod; Anugolu, Madhavi; Urfer, Alex; Chiu, Steve; Naidu, D; Schoen, Marco P
2010-01-01
Extracting or estimating skeletal hand/finger forces from surface electromyographic (sEMG) signals poses many challenges due to cross-talk, noise, and temporally and spatially modulated signal characteristics. Conventional sEMG measurements are based on single-sensor data. In this paper, array sensors are used along with a proposed sensor fusion scheme that results in a simple Multi-Input-Single-Output (MISO) transfer function. Experimental data are used along with system identification to find this MISO system. A Genetic Algorithm (GA) approach is employed to optimize the characteristics of the MISO system. The proposed fusion-based approach is tested experimentally and indicates improvement in finger/hand force estimation.
Information processing for aerospace structural health monitoring
NASA Astrophysics Data System (ADS)
Lichtenwalner, Peter F.; White, Edward V.; Baumann, Erwin W.
1998-06-01
Structural health monitoring (SHM) technology provides a means to significantly reduce the life cycle costs of aerospace vehicles by eliminating unnecessary inspections, minimizing inspection complexity, and providing accurate diagnostics and prognostics to support vehicle life extension. In order to accomplish this, a comprehensive SHM system will need to acquire data from a wide variety of diverse sensors including strain gages, accelerometers, acoustic emission sensors, crack growth gages, corrosion sensors, and piezoelectric transducers. Significant amounts of computer processing will then be required to convert this raw sensor data into meaningful information indicating both the current structural integrity (diagnostics) and the projections necessary for planning and managing the future health of the structure in a cost-effective manner (prognostics). This paper provides a description of the key types of information processing technologies required in an effective SHM system. These include artificial intelligence techniques such as neural networks, expert systems, and fuzzy logic for nonlinear modeling, pattern recognition, and complex decision making; signal processing techniques such as Fourier and wavelet transforms for spectral analysis and feature extraction; statistical algorithms for optimal detection, estimation, prediction, and fusion; and a wide variety of other algorithms for data analysis and visualization. The intent of this paper is to provide an overview of the role of information processing for SHM, discuss various technologies which can contribute to accomplishing this role, and present some example applications of information processing for SHM implemented at the Boeing Company.
NASA Astrophysics Data System (ADS)
Salehi, Hadi; Das, Saptarshi; Chakrabartty, Shantanu; Biswas, Subir; Burgueño, Rigoberto
2017-04-01
This study proposes a novel strategy for damage identification in aircraft structures. The strategy was evaluated based on the simulation of the binary data generated from self-powered wireless sensors employing a pulse switching architecture. The energy-aware pulse switching communication protocol uses single pulses instead of multi-bit packets for information delivery resulting in discrete binary data. A system employing this energy-efficient technology requires dealing with time-delayed binary data due to the management of power budgets for sensing and communication. This paper presents an intelligent machine-learning framework based on combination of the low-rank matrix decomposition and pattern recognition (PR) methods. Further, data fusion is employed as part of the machine-learning framework to take into account the effect of data time delay on its interpretation. Simulated time-delayed binary data from self-powered sensors was used to determine damage indicator variables. Performance and accuracy of the damage detection strategy was examined and tested for the case of an aircraft horizontal stabilizer. Damage states were simulated on a finite element model by reducing stiffness in a region of the stabilizer's skin. The proposed strategy shows satisfactory performance to identify the presence and location of the damage, even with noisy and incomplete data. It is concluded that PR is a promising machine-learning algorithm for damage detection for time-delayed binary data from novel self-powered wireless sensors.
Affordable and personalized lighting using inverse modeling and virtual sensors
NASA Astrophysics Data System (ADS)
Basu, Chandrayee; Chen, Benjamin; Richards, Jacob; Dhinakaran, Aparna; Agogino, Alice; Martin, Rodney
2014-03-01
Wireless sensor networks (WSN) have great potential to enable personalized intelligent lighting systems while reducing building energy use by 50%-70%. As a result, WSN systems are being increasingly integrated in state-of-the-art intelligent lighting systems. In the future these systems will enable participation of lighting loads as ancillary services. However, such systems can be expensive to install and lack the plug-and-play quality necessary for user-friendly commissioning. In this paper we present an integrated system of wireless sensor platforms and modeling software to enable affordable and user-friendly intelligent lighting. It requires ~60% fewer sensor deployments compared to current commercial systems. The reduction in sensor deployments has been achieved by optimally replacing the actual photo-sensors with real-time discrete predictive inverse models. Spatially sparse and clustered sub-hourly photo-sensor data captured by the WSN platforms are used to develop and validate a piece-wise linear regression of indoor light distribution. This deterministic data-driven model accounts for sky conditions and solar position. The optimal placement of photo-sensors is performed iteratively to achieve the best predictability of the light field desired for indoor lighting control. Using two weeks of daylight and artificial light training data acquired at the Sustainability Base at NASA Ames, the model was able to predict the light level at seven monitored workstations with 80%-95% accuracy. We estimate that 10% adoption of this intelligent wireless sensor system in commercial buildings could save 0.2-0.25 quads of energy nationwide.
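The virtual-sensor idea, replacing a physical photo-sensor with a data-driven predictive model, can be illustrated with a single-predictor least-squares fit. All lux values and the sensor/workstation pairing below are invented for illustration; the paper's actual model is piece-wise and conditioned on sky state and solar position:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b * x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

# Invented training pairs: physical photo-sensor lux vs. desk lux.
lux_sensor = [100, 150, 200, 250, 300]
lux_desk = [82, 120, 161, 198, 240]
a, b = fit_line(lux_sensor, lux_desk)

# The "virtual sensor": predict desk illuminance from the real sensor.
pred = a + b * 180
```

In the full system, one such model per removed photo-sensor lets the controller act on predicted light levels instead of direct measurements.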
State Level Intelligence Doctrine: Bridging the Gap
2013-12-01
intelligence operations that transcend the local/federal membrane and perceived glass ceilings. Currently, articles specifically give New York state police...oriented policing to most effectively conduct domestic intelligence operations. Currently, most state and local fusion centers operate as a...and effective as guidance and direction is clearly defined and prescribed. RESEARCH QUESTION How can state police agencies, in conjunction with DHS
Liu, Bailing; Zhang, Fumin; Qu, Xinghua
2015-01-01
An improvement method for the pose accuracy of a robot manipulator using a multiple-sensor combination measuring system (MCMS) is presented. It is composed of a visual sensor, an angle sensor and a serial robot. The visual sensor is utilized to measure the position of the manipulator in real time, and the angle sensor is rigidly attached to the manipulator to obtain its orientation. To exploit the higher accuracy of the multiple sensors, two efficient data fusion approaches, the Kalman filter (KF) and the multi-sensor optimal information fusion algorithm (MOIFA), are used to fuse the position and orientation of the manipulator. The simulation and experimental results show that the pose accuracy of the robot manipulator is improved dramatically, by 38%-78%, with multi-sensor data fusion. Compared with reported pose accuracy improvement methods, the primary advantage of this method is that it does not require solving complex kinematic parameter equations, adding motion constraints, or the complicated procedures of traditional vision-based methods. It makes robot processing more autonomous and accurate. To improve the reliability and accuracy of the pose measurements of the MCMS, the visual sensor repeatability was studied experimentally. An optimal range of 1 × 0.8 × 1 m to 2 × 0.8 × 1 m in the field of view (FOV) is indicated by the experimental results. PMID:25850067
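In the scalar case, the kind of accuracy gain reported here rests on the standard inverse-variance fusion rule that underlies both KF-based and optimal information fusion approaches. The measurement values and variances below are illustrative assumptions, not the paper's data:

```python
def fuse(x1, var1, x2, var2):
    """Inverse-variance weighted fusion of two independent unbiased
    estimates of the same quantity."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    var = 1.0 / (w1 + w2)
    return x, var

# Illustrative numbers: a visual-sensor position estimate and a
# second, more precise estimate of the same coordinate (metres).
x, var = fuse(1.02, 0.04, 0.95, 0.01)
```

The fused variance 1/(1/var1 + 1/var2) is always smaller than either input variance, which is the mechanism behind the reported multi-sensor improvement.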
Design and implementation of green intelligent lights based on the ZigBee
NASA Astrophysics Data System (ADS)
Gan, Yong; Jia, Chunli; Zou, Dongyao; Yang, Jiajia; Guo, Qianqian
2013-03-01
Motivated by the low degree of intelligence of traditional lighting control methods, this paper uses a single-chip microcomputer as the control core, pyroelectric infrared sensing to detect the presence of a human body, and light sensors to sense the light intensity; the interface uses an infrared sensor module, a photosensitive sensor module and a relay module to transmit the signals over a ZigBee wireless network. The main function of the design is to let the lighting intelligently adjust its brightness according to the indoor light intensity when people are indoors, and turn off when people leave. The circuit and program design of this system is flexible, and the system achieves intelligent, energy-saving control.
Flexible Fusion Structure-Based Performance Optimization Learning for Multisensor Target Tracking
Ge, Quanbo; Wei, Zhongliang; Cheng, Tianfa; Chen, Shaodong; Wang, Xiangfeng
2017-01-01
Compared with the fixed fusion structure, the flexible fusion structure with mixed fusion methods has better adjustment performance for complex air task network systems, and it can effectively help the system achieve its goal under the given constraints. Because of the time-varying situation of the task network system induced by moving nodes and non-cooperative targets, and limitations such as communication bandwidth and measurement distance, it is necessary to dynamically adjust the system fusion structure, including sensors and fusion methods, in a given adjustment period. To this end, this paper studies the design of a flexible fusion algorithm using an optimization learning technique. The purpose is to dynamically determine the number of sensors and which sensors take part in the centralized and distributed fusion processes, respectively, herein termed sensor subset selection. Firstly, two system performance indexes are introduced. In particular, the survivability index is presented and defined. Secondly, based on the two indexes and considering other conditions such as communication bandwidth and measurement distance, optimization models for both single-target tracking and multi-target tracking are established. Correspondingly, solution steps are given for the two optimization models in detail. Simulation examples are demonstrated to validate the proposed algorithms. PMID:28481243
A Simulation Environment for Benchmarking Sensor Fusion-Based Pose Estimators.
Ligorio, Gabriele; Sabatini, Angelo Maria
2015-12-19
In-depth analysis and performance evaluation of sensor fusion-based estimators may be critical when performed using real-world sensor data. For this reason, simulation is widely recognized as one of the most powerful tools for algorithm benchmarking. In this paper, we present a simulation framework suitable for assessing the performance of sensor fusion-based pose estimators. The systems used for implementing the framework were magnetic/inertial measurement units (MIMUs) and a camera, although the addition of further sensing modalities is straightforward. Typical nuisance factors were also included for each sensor. The proposed simulation environment was validated using real-life sensor data employed for motion tracking. The highest mismatch between real and simulated sensors was about 5% of the measured quantity (for the camera simulation), whereas a lower correlation was found for one axis of the gyroscope (0.90). In addition, a real benchmarking example of an extended Kalman filter for pose estimation from MIMU and camera data is presented.
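A minimal version of this kind of sensor simulation with nuisance factors, here a gyroscope axis with an assumed constant bias and white noise, together with the real-vs-simulated correlation check, might look like the following sketch (all parameters invented):

```python
import math
import random

random.seed(0)

# True angular rate: a 0.5 Hz sinusoid sampled at 100 Hz.
t = [k * 0.01 for k in range(1000)]
true_rate = [math.sin(2 * math.pi * 0.5 * ti) for ti in t]

# Simulated gyroscope axis: truth plus constant bias and white noise.
bias = 0.05
sim = [r + bias + random.gauss(0.0, 0.1) for r in true_rate]

def corr(a, b):
    """Pearson correlation coefficient."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

r = corr(true_rate, sim)
# r stays high despite the nuisance factors, mirroring the agreement
# metrics used to validate the simulated sensors.
```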
An epidemic model for biological data fusion in ad hoc sensor networks
NASA Astrophysics Data System (ADS)
Chang, K. C.; Kotari, Vikas
2009-05-01
Bioterrorism is a sophisticated and potentially catastrophic means of attacking a nation. Countering it requires the development of a complete architecture dedicated to this purpose, which includes but is not limited to sensing/detection, tracking and fusion, communication, and other functions. In this paper we focus on one such architecture and evaluate its performance. Various sensors for this specific purpose have been studied. The emphasis has been on the use of distributed systems such as ad hoc networks and on the application of epidemic data fusion algorithms to better manage bio-threat data, and on understanding the performance characteristics of these algorithms under diverse real-time scenarios, which are implemented through extensive Java-based simulations. Through comparative studies on communication and fusion, the performance of the channel filter algorithm for biological sensor data fusion is validated.
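The channel filter evaluated here removes the information two nodes already share before fusing their beliefs; for a binary event this reduces to dividing out the common prior. The probabilities in this sketch are illustrative, not from the study:

```python
def channel_filter_fuse(p1, p2, p_common):
    """Fuse two posterior probabilities of the same binary event while
    dividing out the information the two nodes already share."""
    num = p1 * p2 / p_common
    den = num + (1.0 - p1) * (1.0 - p2) / (1.0 - p_common)
    return num / den

# Two nodes both believe a threat is present; they started from a
# shared 0.5 prior, which must not be counted twice in the fusion.
p_fused = channel_filter_fuse(0.8, 0.7, 0.5)
```

Without the division by the common term, repeated exchanges between nodes would recirculate the same evidence and drive the network toward overconfident conclusions.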
Chowdhury, Amor; Sarjaš, Andrej
2016-01-01
The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation. PMID:27649197
Chowdhury, Amor; Sarjaš, Andrej
2016-09-15
The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation.
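At the core of the Unscented Kalman Filter used in this work is the unscented transform: propagating sigma points through the nonlinearity instead of linearizing it. A scalar sketch follows, with an invented inverse-square measurement model loosely evoking a field-strength sensor and an assumed kappa; none of these values come from the paper:

```python
import math

def unscented_transform(mean, var, f, kappa=2.0):
    """Scalar unscented transform: push sigma points through the
    nonlinearity f and recover the output mean and variance."""
    n = 1
    spread = math.sqrt((n + kappa) * var)
    points = [mean, mean + spread, mean - spread]
    weights = [kappa / (n + kappa), 0.5 / (n + kappa), 0.5 / (n + kappa)]
    ys = [f(p) for p in points]
    y_mean = sum(w * y for w, y in zip(weights, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(weights, ys))
    return y_mean, y_var

# Invented nonlinear measurement: field strength falling off as 1/d^2.
m, v = unscented_transform(2.0, 0.04, lambda d: 1.0 / d ** 2)
# m exceeds the naive 1/2.0**2 = 0.25 because the transform captures
# the curvature that a first-order linearization would miss.
```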
Sensor fusion approaches for EMI and GPR-based subsurface threat identification
NASA Astrophysics Data System (ADS)
Torrione, Peter; Morton, Kenneth, Jr.; Besaw, Lance E.
2011-06-01
Despite advances in both electromagnetic induction (EMI) and ground penetrating radar (GPR) sensing and related signal processing, neither sensor alone provides a perfect tool for detecting the myriad of possible buried objects that threaten the lives of Soldiers and civilians. However, while neither GPR nor EMI sensing alone can provide optimal detection across all target types, the two approaches are highly complementary. As a result, many landmine systems seek to make use of both sensing modalities simultaneously and fuse the results from both sensors to improve detection performance for targets with widely varying metal content and GPR responses. Despite this, little work has focused on large-scale comparisons of different approaches to sensor fusion and machine learning for combining data from these highly orthogonal phenomenologies. In this work we explore a wide array of pattern recognition techniques for algorithm development and sensor fusion. Results with the ARA Nemesis landmine detection system suggest that nonlinear and non-parametric classification algorithms provide significant performance benefits for single-sensor algorithm development, and that fusion of multiple algorithms can be performed satisfactorily using basic parametric approaches, such as logistic discriminant classification, for the targets under consideration in our data sets.
Algorithms for Efficient Intelligence Collection
2013-09-01
2006. Cortical substrates for exploratory decisions in humans. Nature 441(7095) 876-879. Deitchman, S. J. 1962. A Lanchester model of guerrilla...Monterey, CA. Pearl, J. 1986. Fusion, propagation and structuring in belief networks. Artificial Intelligence 29 241-288. Schaffer, M. B. 1968. Lanchester
Effect of retransmission and retrodiction on estimation and fusion in long-haul sensor networks
Liu, Qiang; Wang, Xin; Rao, Nageswara S. V.; ...
2016-01-01
In a long-haul sensor network, sensors are remotely deployed over a large geographical area to perform certain tasks, such as target tracking. In this work, we study the scenario where sensors take measurements of one or more dynamic targets and send state estimates of the targets to a fusion center via satellite links. The severe loss and delay inherent to the satellite channels reduce the number of estimates successfully arriving at the fusion center, thereby limiting the potential fusion gain and resulting in suboptimal accuracy of the fused estimates. In addition, errors in target-sensor data association can also degrade the estimation performance. To mitigate the effect of imperfect communications on state estimation and fusion, we consider retransmission and retrodiction. The system adopts certain retransmission-based transport protocols so that lost messages can be recovered over time. Besides, retrodiction/smoothing techniques are applied so that the chances of incurring excess delay due to retransmission are greatly reduced. We analyze the extent to which retransmission and retrodiction can improve the performance of delay-sensitive target tracking tasks under variable communication loss and delay conditions. Lastly, simulation results of a ballistic target tracking application are shown to demonstrate the validity of our analysis.
Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving
Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice
2016-01-01
The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture. PMID:27727171
Minimum energy information fusion in sensor networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapline, G
1999-05-11
In this paper we consider how to organize the sharing of information in a distributed network of sensors and data processors so as to provide explanations for sensor readings with minimal expenditure of energy. We point out that the Minimum Description Length principle provides an approach to information fusion that is more naturally suited to energy minimization than traditional Bayesian approaches. In addition, we show that for networks consisting of a large number of identical sensors Kohonen self-organization provides an exact solution to the problem of combining the sensor outputs into minimal description length explanations.
Sensor Technologies for Intelligent Transportation Systems
Guerrero-Ibáñez, Juan; Zeadally, Sherali
2018-01-01
Modern society faces serious problems with transportation systems, including but not limited to traffic congestion, safety, and pollution. Information communication technologies have gained increasing attention and importance in modern transportation systems. Automotive manufacturers are developing in-vehicle sensors and their applications in different areas including safety, traffic management, and infotainment. Government institutions are implementing roadside infrastructures such as cameras and sensors to collect data about environmental and traffic conditions. By seamlessly integrating vehicles and sensing devices, their sensing and communication capabilities can be leveraged to achieve smart and intelligent transportation systems. We discuss how sensor technology can be integrated with the transportation infrastructure to achieve a sustainable Intelligent Transportation System (ITS) and how safety, traffic control and infotainment applications can benefit from multiple sensors deployed in different elements of an ITS. Finally, we discuss some of the challenges that need to be addressed to enable a fully operational and cooperative ITS environment. PMID:29659524
Sensor Technologies for Intelligent Transportation Systems.
Guerrero-Ibáñez, Juan; Zeadally, Sherali; Contreras-Castillo, Juan
2018-04-16
Modern society faces serious problems with transportation systems, including but not limited to traffic congestion, safety, and pollution. Information communication technologies have gained increasing attention and importance in modern transportation systems. Automotive manufacturers are developing in-vehicle sensors and their applications in different areas including safety, traffic management, and infotainment. Government institutions are implementing roadside infrastructures such as cameras and sensors to collect data about environmental and traffic conditions. By seamlessly integrating vehicles and sensing devices, their sensing and communication capabilities can be leveraged to achieve smart and intelligent transportation systems. We discuss how sensor technology can be integrated with the transportation infrastructure to achieve a sustainable Intelligent Transportation System (ITS) and how safety, traffic control and infotainment applications can benefit from multiple sensors deployed in different elements of an ITS. Finally, we discuss some of the challenges that need to be addressed to enable a fully operational and cooperative ITS environment.
Bialas, Andrzej
2011-01-01
Intelligent sensors experience security problems very similar to those inherent in other kinds of IT products or systems. Assurance methodologies for the creation of these products or systems, such as Common Criteria (ISO/IEC 15408), can be used to improve the robustness of sensor systems in high-risk environments. The paper presents the background and results of previous research on pattern-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process using a sensor example. The contribution of the paper concerns the application of the knowledge engineering methodology to the previously developed Common Criteria-compliant and pattern-based method for intelligent sensor security development. The issue presented in the paper has broader significance in that it can address information security problems in many application domains.
Posturing Tactical ISR Beyond The Umbilical Cord
2017-02-03
intelligence sensors, it carries a lethal payload of ordnance for strike and/or close air support missions. In fact, world media have discussed the MQ-9's...awareness all their visual and signal intelligence sensors provide is a force multiplier that enhances mission success significantly. For example, when...on C-17 Photo Source http://www.aircav.com/dodphoto/dod98/mh60-002rs.jpg 407MRH multirole armed ISR (intelligence, surveillance, reconnaissance
Capturing and Modeling Domain Knowledge Using Natural Language Processing Techniques
2005-06-01
Intelligence Artificielle, France, May 2001, p. 109-118 [Barrière, 2001] -----. "Investigating the Causal Relation in Informative Texts". Terminology, 7:2...out of the flood of information, the military have to create new ways of processing sensor and intelligence information, and of providing the results to commanders who must take timely operational
Zhang, Ying; Wang, Jun; Hao, Guan
2018-01-08
With the development of autonomous unmanned intelligent systems, such as unmanned boats, unmanned aircraft and autonomous underwater vehicles, studies on Wireless Sensor-Actor Networks (WSANs) have attracted more attention. Network connectivity algorithms play an important role in data exchange, collaborative detection and information fusion. Due to the harsh application environment, abnormal nodes often appear, and network connectivity is easily lost. Network self-healing mechanisms have thus become critical for these systems. In order to decrease the movement overhead of the sensor-actor nodes, an autonomous connectivity restoration algorithm based on a finite state machine is proposed. The idea is to identify whether a node is a critical node by using a finite state machine, and to update the connected dominating set in a timely way. If an abnormal node is a critical node, the nearest non-critical node is relocated to replace it. For the case of multiple abnormal nodes, a regional network restoration algorithm is introduced, designed to reduce the overhead of node movements during restoration. Simulation results indicate the proposed algorithm has better performance on the total moving distance and the number of relocated nodes compared with other representative restoration algorithms.
Zhang, Ying; Wang, Jun; Hao, Guan
2018-01-01
With the development of autonomous unmanned intelligent systems, such as unmanned boats, unmanned aircraft and autonomous underwater vehicles, studies on Wireless Sensor-Actor Networks (WSANs) have attracted more attention. Network connectivity algorithms play an important role in data exchange, collaborative detection and information fusion. Due to the harsh application environment, abnormal nodes often appear, and network connectivity is easily lost. Network self-healing mechanisms have thus become critical for these systems. In order to decrease the movement overhead of the sensor-actor nodes, an autonomous connectivity restoration algorithm based on a finite state machine is proposed. The idea is to identify whether a node is a critical node by using a finite state machine, and to update the connected dominating set in a timely way. If an abnormal node is a critical node, the nearest non-critical node is relocated to replace it. For the case of multiple abnormal nodes, a regional network restoration algorithm is introduced, designed to reduce the overhead of node movements during restoration. Simulation results indicate the proposed algorithm has better performance on the total moving distance and the number of relocated nodes compared with other representative restoration algorithms. PMID:29316702
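The critical-node test at the heart of such restoration algorithms — a node is critical if its removal disconnects the network — can be sketched with a plain reachability check. The four-node topology below is an invented example, and this brute-force version ignores the paper's finite-state-machine bookkeeping:

```python
def is_connected(adj, removed=None):
    """Depth-first reachability over all nodes except `removed`."""
    nodes = [n for n in adj if n != removed]
    if not nodes:
        return True
    seen, stack = {nodes[0]}, [nodes[0]]
    while stack:
        u = stack.pop()
        for v in adj[u]:
            if v != removed and v not in seen:
                seen.add(v)
                stack.append(v)
    return len(seen) == len(nodes)

def critical_nodes(adj):
    """Nodes whose failure would disconnect the remaining network."""
    return [n for n in adj if not is_connected(adj, removed=n)]

# Invented topology: a chain a-b-c with a spur d hanging off b.
adj = {'a': ['b'], 'b': ['a', 'c', 'd'], 'c': ['b'], 'd': ['b']}
```

Here only `b` is critical, so under the proposed scheme a failure of `b` would trigger relocation of the nearest non-critical node, while failures of `a`, `c` or `d` would not.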
Lampoltshammer, Thomas J.; de Freitas, Edison Pignaton; Nowotny, Thomas; Plank, Stefan; da Costa, João Paulo Carvalho Lustosa; Larsson, Tony; Heistracher, Thomas
2014-01-01
The percentage of elderly people in European countries is increasing. Such conjuncture affects socio-economic structures and creates demands for resourceful solutions, such as Ambient Assisted Living (AAL), which is a possible methodology to foster health care for elderly people. In this context, sensor-based devices play a leading role in surveying, e.g., health conditions of elderly people, to alert care personnel in case of an incident. However, the adoption of such devices strongly depends on the comfort of wearing the devices. In most cases, the bottleneck is the battery lifetime, which impacts the effectiveness of the system. In this paper we propose an approach to reduce the energy consumption of sensors by use of local sensor intelligence. By increasing the intelligence of the sensor node, a substantial decrease in the necessary communication payload can be achieved. The results show a significant potential to preserve energy and decrease the actual size of the sensor device units. PMID:24618777
Lampoltshammer, Thomas J; Pignaton de Freitas, Edison; Nowotny, Thomas; Plank, Stefan; da Costa, João Paulo Carvalho Lustosa; Larsson, Tony; Heistracher, Thomas
2014-03-11
The percentage of elderly people in European countries is increasing. Such conjuncture affects socio-economic structures and creates demands for resourceful solutions, such as Ambient Assisted Living (AAL), which is a possible methodology to foster health care for elderly people. In this context, sensor-based devices play a leading role in surveying, e.g., health conditions of elderly people, to alert care personnel in case of an incident. However, the adoption of such devices strongly depends on the comfort of wearing the devices. In most cases, the bottleneck is the battery lifetime, which impacts the effectiveness of the system. In this paper we propose an approach to reduce the energy consumption of sensors by use of local sensor intelligence. By increasing the intelligence of the sensor node, a substantial decrease in the necessary communication payload can be achieved. The results show a significant potential to preserve energy and decrease the actual size of the sensor device units.
NASA Astrophysics Data System (ADS)
Gregorio, Massimo De
In this paper we present an intelligent active video surveillance system currently adopted in two different application domains: railway tunnels and outdoor storage areas. The system takes advantage of the integration of Artificial Neural Networks (ANN) and symbolic Artificial Intelligence (AI). This hybrid system is formed by virtual neural sensors (implemented as WiSARD-like systems) and BDI agents. The coupling of virtual neural sensors with symbolic reasoning for interpreting their outputs makes this approach both very light from a computational and hardware point of view, and rather robust in performance. The system works in different scenarios and in difficult light conditions.
Decentralized Hypothesis Testing in Energy Harvesting Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Tarighati, Alla; Gross, James; Jalden, Joakim
2017-09-01
We consider the problem of decentralized hypothesis testing in a network of energy harvesting sensors, where sensors make noisy observations of a phenomenon and send quantized information about the phenomenon towards a fusion center. The fusion center makes a decision about the present hypothesis using the aggregate received data during a time interval. We explicitly consider a scenario under which the messages are sent through parallel access channels towards the fusion center. To avoid limited lifetime issues, we assume each sensor is capable of harvesting all the energy it needs for the communication from the environment. Each sensor has an energy buffer (battery) to save its harvested energy for use in other time intervals. Our key contribution is to formulate the problem of decentralized detection in a sensor network with energy harvesting devices. Our analysis is based on a queuing-theoretic model for the battery and we propose a sensor decision design method by considering long term energy management at the sensors. We show how the performance of the system changes for different battery capacities. We then numerically show how our findings can be used in the design of sensor networks with energy harvesting sensors.
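The setting above can be sketched in a few lines. All numbers below (the observation model, harvest probability, quantizer threshold, transmission cost) are illustrative assumptions, not values from the paper: each sensor quantizes its noisy observation to one bit, transmits only when its energy buffer can cover the cost, and the fusion center applies a simple counting rule over whatever bits arrive.

```python
import random

def simulate_interval(h1_active, batteries, harvest_p, tx_cost, rng):
    """One time interval: each sensor observes, harvests energy, and sends a
    1-bit decision to the fusion center only if its battery can cover it."""
    votes = []
    for i in range(len(batteries)):
        # Assumed observation model: unit-mean signal under H1, zero under H0.
        obs = (1.0 if h1_active else 0.0) + rng.gauss(0.0, 0.5)
        bit = 1 if obs > 0.5 else 0               # local 1-bit quantizer
        if rng.random() < harvest_p:              # Bernoulli energy arrival
            batteries[i] += 1
        if batteries[i] >= tx_cost:               # transmit on a parallel channel
            batteries[i] -= tx_cost
            votes.append(bit)
    # Counting (majority) rule over the bits that actually arrived.
    return 1 if votes and 2 * sum(votes) > len(votes) else 0

rng = random.Random(7)
batteries = [1] * 50
decision_h1 = simulate_interval(True, batteries, 0.8, 1, rng)
decision_h0 = simulate_interval(False, batteries, 0.8, 1, rng)
```

With 50 sensors and a reasonable harvest rate, the counting rule is reliable even though some sensors occasionally lack the energy to report.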
Sensor Fusion of Gaussian Mixtures for Ballistic Target Tracking in the Re-Entry Phase
Lu, Kelin; Zhou, Rui
2016-01-01
A sensor fusion methodology for Gaussian mixture models is proposed for ballistic target tracking with unknown ballistic coefficients. To improve the estimation accuracy, a track-to-track fusion architecture is proposed to fuse tracks provided by the local interacting multiple model filters. During the fusion process, duplicate information is removed by accounting for the first-order redundant information between the local tracks. With extensive simulations, we show that the proposed algorithm improves the tracking accuracy in re-entry phase ballistic target tracking applications. PMID:27537883
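The track-to-track step can be illustrated in a deliberately simplified scalar form that neglects the cross-track redundant information the paper removes: an information-weighted convex combination of two local estimates.

```python
def fuse_tracks(x1, p1, x2, p2):
    """Information-weighted convex combination of two local track estimates
    (scalar state; the cross-covariance term is neglected for clarity)."""
    p_fused = 1.0 / (1.0 / p1 + 1.0 / p2)       # fused variance
    x_fused = p_fused * (x1 / p1 + x2 / p2)     # fused state estimate
    return x_fused, p_fused

# Two local IMM tracks with equal variance: the fusion splits the difference
# and halves the uncertainty.
x_f, p_f = fuse_tracks(10.0, 4.0, 12.0, 4.0)
```

When the local tracks are correlated, ignoring the cross term makes the fused covariance optimistic, which is exactly why the paper treats the redundant information explicitly.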
2011-06-22
accessible by intelligence professionals and intelligence organizations frequently do not dedicate enough effort to support the process of...In every theater, Commanders have developed non-doctrinal organizations uniquely suited to their mission in an effort to integrate socio-cultural...information into military decision-making processes. A prime example of a non-traditional organization is the Stability Operations Information
Sabatini, Angelo Maria; Genovese, Vincenzo
2014-07-24
A sensor fusion method was developed for vertical channel stabilization by fusing inertial measurements from an Inertial Measurement Unit (IMU) and pressure altitude measurements from a barometric altimeter integrated in the same device (baro-IMU). An Extended Kalman Filter (EKF) estimated the quaternion from the sensor frame to the navigation frame; the sensed specific force was rotated into the navigation frame and compensated for gravity, yielding the vertical linear acceleration; finally, a complementary filter driven by the vertical linear acceleration and the measured pressure altitude produced estimates of height and vertical velocity. A method was also developed to condition the measured pressure altitude using a whitening filter, which helped to remove the short-term correlation due to environment-dependent pressure changes from raw pressure altitude. The sensor fusion method was implemented to work on-line using data from a wireless baro-IMU and tested for the capability of tracking low-frequency small-amplitude vertical human-like motions that can be critical for stand-alone inertial sensor measurements. Validation tests were performed in different experimental conditions, namely no motion, free-fall motion, forced circular motion and squatting. Accurate on-line tracking of height and vertical velocity was achieved, giving confidence to the use of the sensor fusion method for tracking typical vertical human motions: velocity Root Mean Square Error (RMSE) was in the range 0.04-0.24 m/s; height RMSE was in the range 5-68 cm, with statistically significant performance gains when the whitening filter was used by the sensor fusion method to track relatively high-frequency vertical motions.
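The final stage described above can be sketched as a first-order complementary filter. The gain and time step below are illustrative assumptions, and the quaternion rotation and gravity compensation are taken as already applied to the input acceleration:

```python
def complementary_height(acc_z, baro_h, dt=0.01, gain=1.0):
    """First-order complementary filter: integrate gravity-compensated vertical
    acceleration (high-frequency motion) and pull the estimate toward the
    measured pressure altitude (low-frequency reference)."""
    h, v = baro_h[0], 0.0
    heights = []
    for a, z in zip(acc_z, baro_h):
        v += a * dt                # velocity from vertical linear acceleration
        h += v * dt                # height from velocity
        h += gain * (z - h) * dt   # slow correction from barometric altitude
        heights.append(h)
    return heights

# Stationary case: zero vertical acceleration and a constant 5 m altitude
# should leave the estimate pinned at 5 m.
est = complementary_height([0.0] * 100, [5.0] * 100)
```

The gain trades inertial responsiveness against barometric drift rejection; the paper's whitening filter addresses the correlated part of the pressure error that this simple correction cannot.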
Rivera, José; Carrillo, Mariano; Chacón, Mario; Herrera, Gilberto; Bojorquez, Gilberto
2007-01-01
The development of smart sensors involves the design of reconfigurable systems capable of working with different input sensors. Reconfigurable systems ideally should spend the least possible amount of time on their calibration. An autocalibration algorithm for intelligent sensors should be able to correct major problems such as offset, gain variation and lack of linearity as accurately as possible. This paper describes a new autocalibration methodology for nonlinear intelligent sensors based on artificial neural networks (ANN). The methodology involves analysis of several network topologies and training algorithms. The proposed method was compared against the piecewise and polynomial linearization methods. Method comparison was carried out using different numbers of calibration points and several nonlinearity levels of the input signal. This paper also shows that the proposed method turned out to have better overall accuracy than the other two methods. Besides the experimental results and the analysis of the complete study, the paper describes the implementation of the ANN in a microcontroller unit (MCU). To illustrate the method's capability to build autocalibrated and reconfigurable systems, a temperature measurement system was designed and tested. The proposed method is an improvement over classic autocalibration methodologies because it improves the design process of intelligent sensors and their autocalibration methodologies, together with associated factors such as time and cost.
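A toy version of the idea, with a hypothetical nonlinear sensor response and a tiny one-hidden-layer tanh network trained by plain stochastic gradient descent (not the specific topologies or training algorithms the paper compares), might look like:

```python
import math, random

def train_autocal(raw, true, hidden=4, lr=0.05, epochs=3000, seed=1):
    """Train a 1-hidden-layer tanh network by stochastic gradient descent to
    map a raw nonlinear sensor reading to the calibrated value."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1.0, 1.0) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        return sum(w2[j] * h[j] for j in range(hidden)) + b2, h

    for _ in range(epochs):
        for x, t in zip(raw, true):
            y, h = forward(x)
            e = y - t                                # squared-error gradient
            for j in range(hidden):
                gh = e * w2[j] * (1.0 - h[j] ** 2)   # backprop through tanh
                w2[j] -= lr * e * h[j]
                w1[j] -= lr * gh * x
                b1[j] -= lr * gh
            b2 -= lr * e
    return forward

# Hypothetical sensor with offset, gain error and saturation-like nonlinearity.
xs = [i / 10.0 for i in range(11)]
raw = [0.1 + 0.8 * math.tanh(1.5 * x) for x in xs]
net = train_autocal(raw, xs)
mse = sum((net(r)[0] - t) ** 2 for r, t in zip(raw, xs)) / len(xs)
```

After training, the network approximately inverts the assumed offset, gain and saturation, which is the essence of ANN-based autocalibration; a fixed-point version of the trained weights is what would be ported to an MCU.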
Deng, Changjian; Lv, Kun; Shi, Debo; Yang, Bo; Yu, Song; He, Zhiyi; Yan, Jia
2018-06-12
In this paper, a novel feature selection and fusion framework is proposed to enhance the discrimination ability of gas sensor arrays for odor identification. Firstly, we put forward an efficient feature selection method based on separability and dissimilarity to determine the feature selection order for each type of feature when increasing the dimension of the selected feature subsets. Secondly, the K-nearest neighbor (KNN) classifier is applied to determine the dimensions of the optimal feature subsets for the different types of features. Finally, in establishing the feature fusion, we propose a classification-dominance feature fusion strategy that builds on an effective base feature. Experimental results on two datasets show that the recognition rates on Database I and Database II reach 97.5% and 80.11%, respectively, when k = 1 for the KNN classifier and the distance metric is the correlation distance (COR), which demonstrates the superiority of the proposed feature selection and fusion framework in representing signal features. The novel feature selection method proposed in this paper can effectively select feature subsets that are conducive to classification, while the feature fusion framework can fuse various features that describe different characteristics of the sensor signals, enhancing the discrimination ability of gas sensors and, to a certain extent, suppressing the drift effect.
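The classification stage can be sketched as a k-NN vote under the correlation distance mentioned above. The feature vectors below are illustrative, and constant (zero-variance) vectors are assumed not to occur:

```python
import math

def corr_distance(a, b):
    """Correlation distance: 1 minus the Pearson correlation of two vectors
    (assumes neither vector is constant)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    da = [x - ma for x in a]
    db = [y - mb for y in b]
    num = sum(x * y for x, y in zip(da, db))
    den = math.sqrt(sum(x * x for x in da)) * math.sqrt(sum(y * y for y in db))
    return 1.0 - num / den

def knn_predict(train_x, train_y, query, k=1):
    """k-NN majority vote under correlation distance (k = 1 matches the
    best-performing setting reported in the abstract)."""
    ranked = sorted((corr_distance(x, query), y) for x, y in zip(train_x, train_y))
    votes = [y for _, y in ranked[:k]]
    return max(set(votes), key=votes.count)

# Illustrative feature vectors for two odor classes.
label = knn_predict([[1.0, 2.0, 3.0], [3.0, 2.0, 1.0]], ["rising", "falling"],
                    [2.0, 4.0, 6.0], k=1)
```

Because correlation distance compares the shape of the feature vector rather than its magnitude, it can be less sensitive to gain-like drift than Euclidean distance, which fits the drift-suppression goal above.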
Activity recognition using Video Event Segmentation with Text (VEST)
NASA Astrophysics Data System (ADS)
Holloway, Hillary; Jones, Eric K.; Kaluzniacki, Andrew; Blasch, Erik; Tierno, Jorge
2014-06-01
Multi-Intelligence (multi-INT) data includes video, text, and signals that require analysis by operators. Analysis methods include information fusion approaches such as filtering, correlation, and association. In this paper, we discuss the Video Event Segmentation with Text (VEST) method, which provides event boundaries of an activity to compile related message and video clips for future interest. VEST infers meaningful activities by clustering multiple streams of time-sequenced multi-INT intelligence data and derived fusion products. We discuss exemplar results that segment raw full-motion video (FMV) data by using extracted commentary message timestamps, FMV metadata, and user-defined queries.
A spatial data handling system for retrieval of images by unrestricted regions of user interest
NASA Technical Reports Server (NTRS)
Dorfman, Erik; Cromp, Robert F.
1992-01-01
The Intelligent Data Management (IDM) project at NASA/Goddard Space Flight Center has prototyped an Intelligent Information Fusion System (IIFS), which automatically ingests metadata from remote sensor observations into a large catalog which is directly queryable by end-users. The greatest challenge in the implementation of this catalog was supporting spatially-driven searches, where the user has a possibly complex region of interest and wishes to recover those images that overlap all or simply a part of that region. A spatial data management system is described, which is capable of storing and retrieving records of image data regardless of their source. This system was designed and implemented as part of the IIFS catalog. A new data structure, called a hypercylinder, is central to the design. The hypercylinder is specifically tailored for data distributed over the surface of a sphere, such as satellite observations of the Earth or space. Operations on the hypercylinder are regulated by two expert systems. The first governs the ingest of new metadata records, and maintains the efficiency of the data structure as it grows. The second translates, plans, and executes users' spatial queries, performing incremental optimization as partial query results are returned.
The fusion of satellite and UAV data: simulation of high spatial resolution band
NASA Astrophysics Data System (ADS)
Jenerowicz, Agnieszka; Siok, Katarzyna; Woroszkiewicz, Malgorzata; Orych, Agata
2017-10-01
Remote sensing techniques used in precision agriculture and farming that apply imagery data obtained with sensors mounted on UAV platforms have become more popular in the last few years due to the availability of low-cost UAV platforms and low-cost sensors. Data obtained from low altitudes with low-cost sensors can be characterised by high spatial and radiometric resolution but quite low spectral resolution; therefore the application of imagery data obtained with such technology is quite limited and can be used only for basic land cover classification. To enrich the spectral resolution of imagery data acquired with low-cost sensors from low altitudes, the authors proposed the fusion of RGB data obtained with a UAV platform with multispectral satellite imagery. The fusion is based on the pansharpening process, which aims to integrate the spatial details of the high-resolution panchromatic image with the spectral information of lower-resolution multispectral or hyperspectral imagery to obtain multispectral or hyperspectral images with high spatial resolution. The key to pansharpening is to properly estimate the missing spatial details of the multispectral images while preserving their spectral properties. In the research, the authors presented the fusion of RGB images (with high spatial resolution) obtained with sensors mounted on low-cost UAV platforms and multispectral imagery acquired with satellite sensors, i.e., Landsat 8 OLI. To perform the fusion of UAV data with satellite imagery, the simulation of panchromatic bands from RGB data, based on a linear combination of the spectral channels, was conducted. Next, the Gram-Schmidt pansharpening method was applied to the simulated bands and the multispectral satellite images. As a result of the fusion, the authors obtained several multispectral images with very high spatial resolution and then analysed the spatial and spectral accuracies of the processed images.
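The band-simulation step can be illustrated with assumed spectral weights, followed by a simplified component-substitution (Brovey-style) sharpening used here only as a stand-in for the Gram-Schmidt method applied in the study:

```python
def simulate_pan(r, g, b, weights=(0.3, 0.5, 0.2)):
    """Simulated panchromatic value as a linear combination of the RGB bands;
    the weights are illustrative assumptions, not calibrated coefficients."""
    return weights[0] * r + weights[1] * g + weights[2] * b

def component_substitution_sharpen(ms_pixel, pan_high):
    """Brovey-style sharpening: scale every multispectral band by the ratio of
    the high-resolution pan value to the intensity implied by the MS pixel."""
    pan_low = sum(ms_pixel) / len(ms_pixel)
    return [band * pan_high / pan_low for band in ms_pixel]

# A dark MS pixel injected with a brighter high-resolution pan value keeps its
# band ratios (spectral shape) while gaining the pan's spatial brightness.
sharp = component_substitution_sharpen([10.0, 20.0, 30.0], 40.0)
```

Ratio-based substitution preserves the relative spectral signature of each pixel, which is the property the spectral-accuracy analysis in the study checks.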
A Knowledge-Based Approach to Information Fusion for the Support of Military Intelligence
2004-03-01
and most reliable an appropriate picture of the battlespace. The presented approach of knowledge based information fusion is focussing on the...incomplete and imperfect information of military reports and background knowledge can be supported substantially in an automated system. Keywords
DARHT Multi-intelligence Seismic and Acoustic Data Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stevens, Garrison Nicole; Van Buren, Kendra Lu; Hemez, Francois M.
The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein is obtained from two tri-axial seismic sensors and three acoustic sensors, resulting in a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework to determine internal operations at DARHT using informative features extracted from measurements collected external to the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and finally (4) robustness analysis. Two approaches are taken for extracting features from the data. The first of these, generic feature extraction, involves extraction of statistical features from the nine data channels. The second approach, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection is completed using a two stage method, first utilizing signatures in the frequency domain to identify outliers and second extracting short duration events of interest among these outliers by evaluating residuals of an autoregressive exogenous time series model. Features extracted from each data set are then fused to perform analysis with a multi-intelligence paradigm, where information from multiple data sets is combined to generate more information than is available through analysis of each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both generic statistical features and event detection and provide a comparison of the two methods.
Finally, the concept of decision robustness is presented through a preliminary analysis where uncertainty is added to the system through noise in the measurements.
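The second-stage event detection can be sketched with a plain AR(1) model standing in for the report's autoregressive exogenous model; the threshold and the test signal below are illustrative:

```python
import math

def ar1_event_detect(x, thresh=4.0):
    """Fit an AR(1) model by least squares and flag samples whose one-step
    prediction residual lies more than `thresh` standard deviations from the
    mean residual."""
    num = sum(x[t] * x[t - 1] for t in range(1, len(x)))
    den = sum(x[t - 1] ** 2 for t in range(1, len(x)))
    a = num / den                                    # AR(1) coefficient
    res = [x[t] - a * x[t - 1] for t in range(1, len(x))]
    mu = sum(res) / len(res)
    sd = math.sqrt(sum((r - mu) ** 2 for r in res) / len(res))
    return [t for t, r in enumerate(res, start=1) if abs(r - mu) > thresh * sd]

# Smooth background signal with one short-duration injected event.
sig = [0.1 * math.sin(0.2 * t) for t in range(120)]
sig[60] += 5.0
events = ar1_event_detect(sig)
```

The AR model absorbs the predictable part of the background, so short-duration events survive as large residuals; exogenous inputs (as in the report's ARX model) would enter as extra regression terms.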
Dissolved oxygen content prediction in crab culture using a hybrid intelligent method
Yu, Huihui; Chen, Yingyi; Hassan, ShahbazGul; Li, Daoliang
2016-01-01
A precise predictive model is needed to obtain a clear understanding of the changing dissolved oxygen content in outdoor crab ponds, to assess how to reduce risk and to optimize water quality management. The uncertainties in the data from multiple sensors are a significant factor when building a dissolved oxygen content prediction model. To increase prediction accuracy, a new hybrid dissolved oxygen content forecasting model based on the radial basis function neural networks (RBFNN) data fusion method and a least squares support vector machine (LSSVM) with an optimal improved particle swarm optimization (IPSO) is developed. In the modelling process, the RBFNN data fusion method is used to improve information accuracy and provide more trustworthy training samples for the IPSO-LSSVM prediction model. The LSSVM is a powerful tool for achieving nonlinear dissolved oxygen content forecasting. In addition, an improved particle swarm optimization algorithm is developed to determine the optimal parameters for the LSSVM with high accuracy and generalizability. In this study, the comparison of the prediction results of different traditional models validates the effectiveness and accuracy of the proposed hybrid RBFNN-IPSO-LSSVM model for dissolved oxygen content prediction in outdoor crab ponds. PMID:27270206
Enhancing hyperspectral spatial resolution using multispectral image fusion: A wavelet approach
NASA Astrophysics Data System (ADS)
Jazaeri, Amin
High spectral and spatial resolution images have a significant impact in remote sensing applications. Because both the spatial and spectral resolutions of spaceborne sensors are fixed by design and cannot be further increased, techniques such as image fusion must be applied to achieve such goals. This dissertation introduces the concept of wavelet fusion between hyperspectral and multispectral sensors in order to enhance the spectral and spatial resolution of a hyperspectral image. To test the robustness of this concept, images from Hyperion (a hyperspectral sensor) and the Advanced Land Imager (a multispectral sensor) were first co-registered and then fused using different wavelet algorithms. A regression-based fusion algorithm was also implemented for comparison purposes. The results show that the images fused using a combined bi-linear wavelet-regression algorithm have less error than those from other methods when compared to the ground truth. In addition, the combined regression-wavelet algorithm shows more immunity to misalignment of the pixels due to the lack of proper registration. The quantitative measures of average mean square error show that the performance of wavelet-based methods degrades when the spatial resolution of the hyperspectral images becomes eight times lower than that of the corresponding multispectral image. Regardless of which method of fusion is utilized, the main challenge in image fusion is image registration, which is also a very time-intensive process. Because the combined regression-wavelet technique is computationally expensive, a hybrid technique based on regression and wavelet methods was also implemented to decrease computational overhead. However, the gain in faster computation was offset by the introduction of more error in the outcome.
The secondary objective of this dissertation is to examine the feasibility and sensor requirements for image fusion for future NASA missions in order to be able to perform onboard image fusion. In this process, the main challenge of image registration was resolved by registering the input images using transformation matrices of previously acquired data. The composite image resulted from the fusion process remarkably matched the ground truth, indicating the possibility of real time onboard fusion processing.
Audio-visual affective expression recognition
NASA Astrophysics Data System (ADS)
Huang, Thomas S.; Zeng, Zhihong
2007-11-01
Automatic affective expression recognition has attracted more and more attention from researchers in different disciplines, and will significantly contribute to a new paradigm for human computer interaction (affect-sensitive interfaces, socially intelligent environments) and advance research in affect-related fields including psychology, psychiatry, and education. Multimodal information integration is a process that enables humans to assess affective states robustly and flexibly. In order to understand the richness and subtleness of human emotion behavior, the computer should be able to integrate information from multiple sensors. We introduce in this paper our efforts toward machine understanding of audio-visual affective behavior, based on both deliberate and spontaneous displays. Some promising methods are presented to integrate information from both audio and visual modalities. Our experiments show the advantage of audio-visual fusion in affective expression recognition over audio-only or visual-only approaches.
Automotive Radar and Lidar Systems for Next Generation Driver Assistance Functions
NASA Astrophysics Data System (ADS)
Rasshofer, R. H.; Gresser, K.
2005-05-01
Automotive radar and lidar sensors represent key components for next generation driver assistance functions (Jones, 2001). Today, their use is limited to comfort applications in premium segment vehicles, although an evolution towards more safety-oriented functions is taking place. Radar sensors available on the market today suffer from low angular resolution and poor target detection at medium ranges (30 to 60 m) over azimuth angles larger than ±30°. In contrast, lidar sensors show high sensitivity to environmental influences (e.g. snow, fog, dirt). Both sensor technologies today have a rather high cost level, preventing their widespread usage in mass markets. A common approach to overcoming individual sensor drawbacks is the employment of data fusion techniques (Bar-Shalom, 2001). Raw data fusion requires a common, standardized data interface to easily integrate a variety of asynchronous sensor data into a fusion network. Moreover, next generation sensors should be able to dynamically adapt to new situations and should have the ability to work in cooperative sensor environments. As vehicular function development today is shifting more and more towards virtual prototyping, mathematical sensor models should be available. These models should take into account the sensor's functional principle as well as all typical measurement errors generated by the sensor.
NASA Technical Reports Server (NTRS)
LeMoigne, Jacqueline; Laporte, Nadine; Netanyahuy, Nathan S.; Zukor, Dorothy (Technical Monitor)
2001-01-01
The characterization and mapping of the land cover/land use of forest areas, such as the Central African rainforest, is a very complex task. This complexity is mainly due to the extent of such areas and, as a consequence, to the lack of full and continuous cloud-free coverage of those large regions by any single remote sensing instrument. In order to provide improved vegetation maps of Central Africa and to develop forest monitoring techniques for applications at the local and regional scales, we propose to utilize multi-sensor remote sensing observations coupled with in-situ data. Fusion and clustering of multi-sensor data are the first steps towards the development of such a forest monitoring system. In this paper, we describe some preliminary experiments involving the fusion of SAR and Landsat image data of the Lope Reserve in Gabon. As in previous fusion studies, our fusion method is wavelet-based. The fusion provides a new image data set which contains more detailed texture features and preserves the large homogeneous regions that are observed by the Thematic Mapper sensor. The fusion step is followed by unsupervised clustering and provides a vegetation map of the area.
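The wavelet-based fusion idea can be sketched on one-dimensional rows with a single-level Haar transform: keep the approximation of the Thematic Mapper data (homogeneous regions) and inject, coefficient by coefficient, the larger-magnitude detail from either sensor (texture). The max-absolute detail rule is a common choice assumed here, not necessarily the one used in the study:

```python
def haar_decompose(row):
    """Single-level Haar transform: pairwise averages (approximation) and
    pairwise half-differences (detail)."""
    approx = [(row[2 * i] + row[2 * i + 1]) / 2.0 for i in range(len(row) // 2)]
    detail = [(row[2 * i] - row[2 * i + 1]) / 2.0 for i in range(len(row) // 2)]
    return approx, detail

def haar_reconstruct(approx, detail):
    """Invert the single-level Haar transform."""
    out = []
    for a, d in zip(approx, detail):
        out += [a + d, a - d]
    return out

def wavelet_fuse(tm_row, sar_row):
    """Keep the TM approximation (spectral content, homogeneous regions) and
    take, per coefficient, the larger-magnitude detail from either sensor."""
    a_tm, d_tm = haar_decompose(tm_row)
    _, d_sar = haar_decompose(sar_row)
    d_fused = [dt if abs(dt) >= abs(ds) else ds for dt, ds in zip(d_tm, d_sar)]
    return haar_reconstruct(a_tm, d_fused)

# A flat TM segment gains SAR texture while keeping its TM mean levels.
fused = wavelet_fuse([10.0, 10.0, 20.0, 20.0], [5.0, 1.0, 3.0, 3.0])
```

On images the same scheme is applied separably over rows and columns and over several decomposition levels; the 1-D form keeps the mechanics visible.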
Sensor Fusion and Smart Sensor in Sports and Biomedical Applications.
Mendes, José Jair Alves; Vieira, Mário Elias Marinho; Pires, Marcelo Bissi; Stevan, Sergio Luiz
2016-09-23
The following work presents an overview of smart sensors and sensor fusion targeted at biomedical applications and sports areas. In this work, the integration of these areas is demonstrated, promoting a reflection about techniques and applications to collect, quantify and qualify some physical variables associated with the human body. These techniques are presented in various biomedical and sports applications, which cover areas related to diagnostics, rehabilitation, physical monitoring, and the development of performance in athletes, among others. Although some applications are described in only one of two fields of study (biomedicine and sports), it is very likely that the same application fits in both, with small peculiarities or adaptations. To illustrate the contemporaneity of applications, an analysis of specialized papers published in the last six years has been made. In this context, the main characteristic of this review is to present the largest quantity of relevant examples of sensor fusion and smart sensors focusing on their utilization and proposals, without deeply addressing one specific system or technique, to the detriment of the others.
Intelligent Information Fusion in the Aviation Domain: A Semantic-Web based Approach
NASA Technical Reports Server (NTRS)
Ashish, Naveen; Goforth, Andre
2005-01-01
Information fusion from multiple sources is a critical requirement for System Wide Information Management in the National Airspace (NAS). NASA and the FAA envision creating an "integrated pool" of information originally coming from different sources, which users, intelligent agents and NAS decision support tools can tap into. In this paper we present the results of our initial investigations into the requirements and prototype development of such an integrated information pool for the NAS. We have attempted to ascertain key requirements for such an integrated pool based on a survey of DSS tools that will benefit from this integrated pool. We then advocate key technologies from computer science research areas such as the semantic web, information integration, and intelligent agents that we believe are well suited to achieving the envisioned system wide information management capabilities.
A Data Fusion Method in Wireless Sensor Networks
Izadi, Davood; Abawajy, Jemal H.; Ghanavati, Sara; Herawan, Tutut
2015-01-01
The success of a Wireless Sensor Network (WSN) deployment strongly depends on the quality of service (QoS) it provides regarding issues such as data accuracy, data aggregation delays and network lifetime maximisation. This is especially challenging in data fusion mechanisms, where a small fraction of low-quality data in the fusion input may negatively impact the overall fusion result. In this paper, we present a fuzzy-based data fusion approach for WSN with the aim of increasing the QoS whilst reducing the energy consumption of the sensor network. The proposed approach is able to distinguish and aggregate only the true values of the collected data, thus reducing the burden of processing the entire data set at the base station (BS). It is also able to eliminate redundant data and consequently reduce energy consumption, thus increasing the network lifetime. We studied the effectiveness of the proposed data fusion approach experimentally and compared it with two baseline approaches in terms of data collection, number of transferred data packets and energy consumption. The results of the experiments show that the proposed approach achieves better results than the baseline approaches. PMID:25635417
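One way to sketch the fuzzy weighting idea, under an assumed triangular membership function centered on the median (the paper's actual membership design may differ):

```python
def triangular_membership(x, center, width):
    """Triangular fuzzy membership: 1 at the center, falling to 0 at +/- width."""
    return max(0.0, 1.0 - abs(x - center) / width)

def fuzzy_fuse(readings, width=5.0):
    """Weight each reading by its membership around the median and return the
    weighted aggregate; zero-membership readings (outliers) are dropped and
    never need to be forwarded to the base station."""
    ordered = sorted(readings)
    median = ordered[len(ordered) // 2]
    weighted = [(r, triangular_membership(r, median, width)) for r in readings]
    kept = [(r, w) for r, w in weighted if w > 0.0]
    return sum(r * w for r, w in kept) / sum(w for _, w in kept)

# Three consistent temperature readings and one faulty sensor stuck at 80.
fused = fuzzy_fuse([20.0, 21.0, 19.0, 80.0])
```

Discarding zero-membership readings at the aggregation point is where the energy saving comes from: fewer packets travel toward the BS, and the fused value is not dragged toward faulty data.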
Decision Fusion with Channel Errors in Distributed Decode-Then-Fuse Sensor Networks
Yan, Yongsheng; Wang, Haiyan; Shen, Xiaohong; Zhong, Xionghu
2015-01-01
Decision fusion for distributed detection in sensor networks under non-ideal channels is investigated in this paper. Usually, the local decisions are transmitted to the fusion center (FC) and decoded, and a fusion rule is then applied to achieve a global decision. We propose an optimal likelihood ratio test (LRT)-based fusion rule to take the uncertainty of the decoded binary data due to modulation, reception mode and communication channel into account. The average bit error rate (BER) is employed to characterize such an uncertainty. Further, the detection performance is analyzed under both non-identical and identical local detection performance indices. In addition, the performance of the proposed method is compared with the existing optimal and suboptimal LRT fusion rules. The results show that the proposed fusion rule is more robust compared to these existing ones. PMID:26251908
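A minimal sketch of a Chair-Varshney-style LRT fusion rule that folds the channel BER into each bit's likelihood, with illustrative local detector operating points (the paper's exact rule and channel model may differ):

```python
import math

def llr_fusion(bits, pd, pf, ber, tau=0.0):
    """LRT fusion under non-ideal channels: each received bit is weighted by a
    log-likelihood ratio built from the local detector's (Pd, Pf) after folding
    in the channel's average bit error rate."""
    def flip(p):
        # Probability the FC decodes a 1 when the sensor asserts with prob. p.
        return p * (1.0 - ber) + (1.0 - p) * ber
    llr = 0.0
    for u, d, f in zip(bits, pd, pf):
        p1, p0 = flip(d), flip(f)            # effective P(u=1|H1), P(u=1|H0)
        llr += math.log(p1 / p0) if u == 1 else math.log((1.0 - p1) / (1.0 - p0))
    return 1 if llr > tau else 0

pd, pf = [0.9] * 3, [0.1] * 3
alarm = llr_fusion([1, 1, 0], pd, pf, ber=0.05)
quiet = llr_fusion([0, 0, 1], pd, pf, ber=0.05)
```

As the BER grows, the effective probabilities move toward each other, each bit's LLR shrinks, and the rule automatically discounts unreliable links, which is the robustness property highlighted above.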
Application of Sensor Fusion to Improve Uav Image Classification
NASA Astrophysics Data System (ADS)
Jabari, S.; Fathollahi, F.; Zhang, Y.
2017-08-01
Image classification is one of the most important tasks of remote sensing projects, including the ones that are based on using UAV images. Improving the quality of UAV images directly affects the classification results and can save a huge amount of time and effort in this area. In this study, we show that sensor fusion can improve image quality, which results in increased accuracy of image classification. Here, we tested two sensor fusion configurations by using a Panchromatic (Pan) camera along with either a colour camera or a four-band multi-spectral (MS) camera. We use the Pan camera to benefit from its higher sensitivity and the colour or MS camera to benefit from its spectral properties. The resulting images are then compared to the ones acquired by a high resolution single Bayer-pattern colour camera (here referred to as HRC). We assessed the quality of the output images by performing image classification tests. The results show that the proposed sensor fusion configurations can achieve higher accuracies compared to the images of the single Bayer-pattern colour camera. Therefore, incorporating a Pan camera on-board in UAV missions and performing image fusion can help achieve higher quality images and accordingly higher accuracy classification results.
NASA Astrophysics Data System (ADS)
Vajdic, Stevan M.; Katz, Henry E.; Downing, Andrew R.; Brooks, Michael J.
1994-09-01
A 3D relational image matching/fusion algorithm is introduced. It is implemented in the domain of medical imaging and is based on Artificial Intelligence paradigms--in particular, knowledge base representation and tree search. The 2D reference and target images are selected from 3D sets and segmented into non-touching and non-overlapping regions, using iterative thresholding and/or knowledge about the anatomical shapes of human organs. Selected image region attributes are calculated. Region matches are obtained using a tree search, and the error is minimized by evaluating a `goodness' of matching function based on similarities of region attributes. Once the matched regions are found and the spline geometric transform is applied to regional centers of gravity, images are ready for fusion and visualization into a single 3D image of higher clarity.
A New Multi-Sensor Track Fusion Architecture for Multi-Sensor Information Integration
2004-09-01
Performing organization: Lockheed Martin Aeronautical Systems Company, Marietta, GA. …tracking process and degrades the track accuracy. ARCHITECTURE OF MULTI-SENSOR TRACK FUSION MODEL: The Alpha…
The Multidimensional Integrated Intelligent Imaging project (MI-3)
NASA Astrophysics Data System (ADS)
Allinson, N.; Anaxagoras, T.; Aveyard, J.; Arvanitis, C.; Bates, R.; Blue, A.; Bohndiek, S.; Cabello, J.; Chen, L.; Chen, S.; Clark, A.; Clayton, C.; Cook, E.; Cossins, A.; Crooks, J.; El-Gomati, M.; Evans, P. M.; Faruqi, W.; French, M.; Gow, J.; Greenshaw, T.; Greig, T.; Guerrini, N.; Harris, E. J.; Henderson, R.; Holland, A.; Jeyasundra, G.; Karadaglic, D.; Konstantinidis, A.; Liang, H. X.; Maini, K. M. S.; McMullen, G.; Olivo, A.; O'Shea, V.; Osmond, J.; Ott, R. J.; Prydderch, M.; Qiang, L.; Riley, G.; Royle, G.; Segneri, G.; Speller, R.; Symonds-Tayler, J. R. N.; Triger, S.; Turchetta, R.; Venanzi, C.; Wells, K.; Zha, X.; Zin, H.
2009-06-01
MI-3 is a consortium of 11 universities and research laboratories whose mission is to develop complementary metal-oxide semiconductor (CMOS) active pixel sensors (APS) and to apply these sensors to a range of imaging challenges. A range of sensors has been developed: On-Pixel Intelligent CMOS (OPIC)—designed for in-pixel intelligence; FPN—designed to develop novel techniques for reducing fixed pattern noise; HDR—designed to develop novel techniques for increasing dynamic range; Vanilla/PEAPS—with digital and analogue modes and regions of interest, which has also been back-thinned; Large Area Sensor (LAS)—a novel, stitched LAS; and eLeNA—which develops a range of low noise pixels. Applications being developed include autoradiography, a gamma camera system, radiotherapy verification, tissue diffraction imaging, X-ray phase-contrast imaging, DNA sequencing and electron microscopy.
Greenhouse intelligent control system based on microcontroller
NASA Astrophysics Data System (ADS)
Zhang, Congwei
2018-04-01
As one of the hallmarks of agricultural modernization, the intelligent greenhouse has the advantages of high yield, excellent quality, no pollution and continuous planting. Using an AT89S52 microcontroller as the main controller, the greenhouse intelligent control system employs a soil moisture sensor, temperature and humidity sensors, a light intensity sensor and a CO2 concentration sensor to collect measurements and display them on a 12864 LCD screen in real time. Meanwhile, climate parameter values can be set manually online. The collected measurements are compared with the set standard values, and the lighting, ventilation fans, warming lamps, water pumps and other facilities then start automatically to adjust greenhouse climate parameters such as light intensity, CO2 concentration, temperature, air humidity and soil moisture. As a result, the greenhouse environment stabilizes and the crops grow in suitable conditions.
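The compare-measurements-to-setpoints logic described above amounts to simple bang-bang control. The sketch below assumes hypothetical setpoints and a deadband, neither of which is specified in the abstract:

```python
# hypothetical setpoints; the paper's actual target values are not given
SETPOINTS = {"temperature_c": 25.0, "humidity_pct": 60.0, "co2_ppm": 800.0}

def control_step(readings, deadband=0.05):
    """One control cycle: compare each sensor reading with its setpoint
    and return actuator commands.

    deadband is a +/- fraction of the setpoint inside which no action
    is taken, which avoids actuator chatter around the target value.
    """
    actions = {}
    for key, target in SETPOINTS.items():
        value = readings[key]
        if value < target * (1 - deadband):
            actions[key] = "raise"   # e.g. warming lamp / CO2 valve on
        elif value > target * (1 + deadband):
            actions[key] = "lower"   # e.g. ventilation fan on
        else:
            actions[key] = "hold"
    return actions

cmds = control_step({"temperature_c": 20.0, "humidity_pct": 61.0, "co2_ppm": 950.0})
# → {'temperature_c': 'raise', 'humidity_pct': 'hold', 'co2_ppm': 'lower'}
```

On the actual AT89S52 this loop would run in firmware, with the dict replaced by fixed-point thresholds and the actions mapped to relay outputs.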
NASA Astrophysics Data System (ADS)
Duquet, Jean Remi; Bergeron, Pierre; Blodgett, Dale E.; Couture, Jean; Macieszczak, Maciej; Mayrand, Michel; Chalmers, Bruce A.; Paradis, Stephane
1998-03-01
The Research and Development group at Lockheed Martin Canada, in collaboration with the Defence Research Establishment Valcartier, has undertaken a research project to capture and analyze the real-time and functional requirements of a next-generation Command and Control System (CCS) for the Canadian Patrol Frigates, integrating Multi-Sensor Data Fusion (MSDF), Situation and Threat Assessment (STA) and Resource Management (RM). One important aspect of the project is to define how the use of Artificial Intelligence may optimize the performance of an integrated, real-time MSDF/STA/RM system. A closed-loop simulation environment is being developed to facilitate the evaluation of MSDF/STA/RM concepts, algorithms and architectures. This environment comprises (1) a scenario generator, (2) complex sensor, hardkill and softkill weapon models, (3) a real-time monitoring tool, and (4) a distributed Knowledge-Based System (KBS) shell. The latter is being completely redesigned and implemented in-house since no commercial KBS shell could adequately satisfy all the project requirements. The closed-loop capability of the simulation environment, together with its `simulated real-time' capability, allows interaction between the MSDF/STA/RM system and the targets in the environment during the execution of a scenario. This capability is essential to measure the performance of many STA and RM functionalities. Some benchmark scenarios have been selected to demonstrate quantitatively the capabilities of the selected MSDF/STA/RM algorithms. The paper describes the simulation environment and discusses the MSDF/STA/RM functionalities currently implemented and their performance as an automatic CCS.
Analysis of the frontier technology of agricultural IoT and its predication research
NASA Astrophysics Data System (ADS)
Han, Shuqing; Zhang, Jianhua; Zhu, Mengshuai; Wu, Jianzhai; Shen, Chen; Kong, Fantao
2017-09-01
Agricultural IoT (Internet of Things) is developing rapidly. Nanotechnology, biotechnology and optoelectronic technology have been successfully integrated into agricultural sensor technology, and big data, cloud computing and artificial intelligence have also been successfully applied in the IoT. This paper studies the integration of agricultural sensor technology with nanotechnology, biotechnology and optoelectronic technology, and the application of big data, cloud computing and artificial intelligence in the agricultural IoT. The advantages and development of this integration are discussed, and the application of big data, cloud computing and artificial intelligence in the IoT, together with their development trends, is analysed.
Fusion Centers: Issues and Options for Congress
2008-01-18
Fusion centers are largely financed and staffed by the states, and there is no one "model" for how a center should be structured. State and local law enforcement and… Information and Receive "Feedback"… Establish a Mechanism for Fusion Centers to Have Input into the NIPF… Intelligence fusion centers, particularly when networked together nationally, represent a proactive tool to be used to fight a global jihadist adversary which…
Development of a head impact monitoring "Intelligent Mouthguard".
Hedin, Daniel S; Gibson, Paul L; Bartsch, Adam J; Samorezov, Sergey
2016-08-01
The authors present the development and laboratory system-level testing of an impact-monitoring "Intelligent Mouthguard" intended to help identify potentially concussive head impacts and cumulative head impact dosage. The goal of the Intelligent Mouthguard is to provide an indicator of potential concussion risk and to help caregivers identify athletes needing sideline concussion protocol testing. The Intelligent Mouthguard may also help identify individuals who are at higher risk based on historical dosage. It integrates inertial sensors to provide three-degree-of-freedom linear and rotational kinematics. The electronics are fully integrated into a custom mouthguard that couples tightly to the upper teeth. The combination of tight coupling and highly accurate sensor data means the Intelligent Mouthguard meets the National Football League (NFL) Level I validity specification, based on the laboratory system-level test data presented in this study.
Sabatini, Angelo Maria; Genovese, Vincenzo
2014-01-01
A sensor fusion method was developed for vertical channel stabilization by fusing inertial measurements from an Inertial Measurement Unit (IMU) and pressure altitude measurements from a barometric altimeter integrated in the same device (baro-IMU). An Extended Kalman Filter (EKF) estimated the quaternion from the sensor frame to the navigation frame; the sensed specific force was rotated into the navigation frame and compensated for gravity, yielding the vertical linear acceleration; finally, a complementary filter driven by the vertical linear acceleration and the measured pressure altitude produced estimates of height and vertical velocity. A method was also developed to condition the measured pressure altitude using a whitening filter, which helped to remove the short-term correlation due to environment-dependent pressure changes from raw pressure altitude. The sensor fusion method was implemented to work on-line using data from a wireless baro-IMU and tested for the capability of tracking low-frequency small-amplitude vertical human-like motions that can be critical for stand-alone inertial sensor measurements. Validation tests were performed in different experimental conditions, namely no motion, free-fall motion, forced circular motion and squatting. Accurate on-line tracking of height and vertical velocity was achieved, giving confidence to the use of the sensor fusion method for tracking typical vertical human motions: velocity Root Mean Square Error (RMSE) was in the range 0.04–0.24 m/s; height RMSE was in the range 5–68 cm, with statistically significant performance gains when the whitening filter was used by the sensor fusion method to track relatively high-frequency vertical motions. PMID:25061835
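The last stage of the pipeline described above, the complementary filter that blends integrated vertical acceleration with barometric altitude, can be sketched as follows. The gain form and time constant are assumptions; the EKF attitude stage, gravity compensation, and whitening filter are omitted:

```python
import numpy as np

def complementary_height_filter(acc_vert, baro_alt, dt, tau=1.0):
    """Fuse vertical linear acceleration (already gravity-compensated and
    rotated into the navigation frame) with barometric altitude using a
    first-order complementary filter.

    acc_vert : sequence of vertical accelerations (m/s^2)
    baro_alt : sequence of barometric altitudes (m), same length
    dt       : sample period (s)
    tau      : assumed filter time constant (s)

    Returns arrays of height and vertical-velocity estimates: the inertial
    path supplies the high-frequency content, the baro the low-frequency.
    """
    alpha = tau / (tau + dt)              # blend factor
    h = baro_alt[0]                       # initialize height from the baro
    v = 0.0
    heights, velocities = [], []
    for a, z in zip(acc_vert, baro_alt):
        v += a * dt                       # integrate acceleration -> velocity
        h_pred = h + v * dt               # inertial height prediction
        h = alpha * h_pred + (1 - alpha) * z  # correct drift with baro altitude
        heights.append(h)
        velocities.append(v)
    return np.array(heights), np.array(velocities)
```

For a stationary device the estimate settles on the barometric altitude with zero velocity, which matches the "no motion" validation condition mentioned in the abstract.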
An intelligent rollator for mobility impaired persons, especially stroke patients.
Hellström, Thomas; Lindahl, Olof; Bäcklund, Tomas; Karlsson, Marcus; Hohnloser, Peter; Bråndal, Anna; Hu, Xiaolei; Wester, Per
2016-07-01
An intelligent rollator (IRO) was developed that aims at obstacle detection and guidance to avoid collisions and accidental falls. The IRO is a retrofitted four-wheeled rollator with an embedded computer, two solenoid brakes, rotation sensors on the wheels and IR distance sensors. The value reported by each distance sensor is compared in the computer to a nominal distance; a deviation indicates a present obstacle and activates one of the brakes in order to influence the direction of motion and avoid the obstacle. The IRO was tested by seven healthy subjects with simulated restricted and blurred sight and five stroke subjects on a standardised indoor track with obstacles. All tested subjects walked faster with the intelligence deactivated. Three out of five stroke patients experienced more obstacle detections with the intelligence activated. This suggests enhanced safety during walking with the IRO. Further studies are required to explore the full value of the IRO.
Mixed H2/H∞-Based Fusion Estimation for Energy-Limited Multi-Sensors in Wearable Body Networks
Li, Chao; Zhang, Zhenjiang; Chao, Han-Chieh
2017-01-01
In wireless sensor networks, sensor nodes collect a large amount of data in each time period. If all of these data were transmitted to a Fusion Center (FC), the power of the sensor nodes would run out rapidly. On the other hand, the data also need filtering to remove noise. Therefore, an efficient fusion estimation model, which can save the energy of the sensor nodes while maintaining high accuracy, is needed. This paper proposes a novel mixed H2/H∞-based energy-efficient fusion estimation model (MHEEFE) for energy-limited Wearable Body Networks. In the proposed model, the communication cost is first reduced efficiently while preserving estimation accuracy. Then, the parameters of the quantization method are discussed and determined by an optimization method using prior knowledge. In addition, calculation methods for important parameters are investigated, which make the final estimates more stable. Finally, an iteration-based weight calculation algorithm is presented, which improves the fault tolerance of the final estimate. In the simulation, the impacts of several pivotal parameters are discussed. Compared with related models, the MHEEFE shows better performance in accuracy, energy efficiency and fault tolerance. PMID:29280950
Autonomous Mission Operations for Sensor Webs
NASA Astrophysics Data System (ADS)
Underbrink, A.; Witt, K.; Stanley, J.; Mandl, D.
2008-12-01
We present interim results of a 2005 ROSES AIST project entitled "Using Intelligent Agents to Form a Sensor Web for Autonomous Mission Operations", or SWAMO. The goal of the SWAMO project is to shift the control of spacecraft missions from a ground-based, centrally controlled architecture to a collaborative, distributed set of intelligent agents. The network of intelligent agents is intended to reduce management requirements by utilizing model-based system prediction and autonomic model/agent collaboration. SWAMO agents are distributed throughout the Sensor Web environment, which may include multiple spacecraft, aircraft, ground systems, and ocean systems, as well as manned operations centers. The agents monitor and manage sensor platforms, Earth sensing systems, and Earth sensing models and processes. The SWAMO agents form a Sensor Web of agents via peer-to-peer coordination. Some of the intelligent agents are mobile and able to traverse between on-orbit and ground-based systems. Other agents in the network are responsible for encapsulating system models to perform prediction of future behavior of the modeled subsystems and components to which they are assigned. The software agents use semantic web technologies to enable improved information sharing among the operational entities of the Sensor Web. The semantics include ontological conceptualizations of the Sensor Web environment, plus conceptualizations of the SWAMO agents themselves. By conceptualizations of the agents, we mean knowledge of their state, operational capabilities, current operational capacities, Web Service search and discovery results, agent collaboration rules, etc. Ontological conceptualizations over the agents are needed to enable autonomous and autonomic operations of the Sensor Web. The SWAMO ontology enables automated decision making and responses to the dynamic Sensor Web environment and to end user science requests.
The current ontology is compatible with Open Geospatial Consortium (OGC) Sensor Web Enablement (SWE) Sensor Model Language (SensorML) concepts and structures. The agents are currently deployed on the U.S. Naval Academy MidSTAR-1 satellite and are actively managing the power subsystem on-orbit without the need for human intervention.
A review of potential image fusion methods for remote sensing-based irrigation management: Part II
USDA-ARS?s Scientific Manuscript database
Satellite-based sensors provide data at either greater spectral and coarser spatial resolutions, or lower spectral and finer spatial resolutions due to complementary spectral and spatial characteristics of optical sensor systems. In order to overcome this limitation, image fusion has been suggested ...
A novel framework for command and control of networked sensor systems
NASA Astrophysics Data System (ADS)
Chen, Genshe; Tian, Zhi; Shen, Dan; Blasch, Erik; Pham, Khanh
2007-04-01
In this paper, we propose a highly innovative command and control framework for sensor networks used for future Integrated Fire Control (IFC). The primary goal is to enable and enhance target detection, validation, and mitigation for future military operations through graphical game theory and advanced knowledge information fusion infrastructures. The problem is approached by representing distributed sensor and weapon systems as generic warfare resources which must be optimized in order to achieve the operational benefits afforded by enabling a system of systems. This paper addresses the importance of achieving a Network Centric Warfare (NCW) foundation of information superiority: shared, accurate, and timely situational awareness upon which advanced automated management aids for IFC can be built. The approach uses the Data Fusion Information Group (DFIG) fusion hierarchy of Level 0 through Level 4 to fuse the input data into assessments of the enemy target system threats in a battlespace to which military force is being applied. Compact graph models are employed across all levels of the fusion hierarchy to accomplish integrative data fusion and information flow control, as well as cross-layer sensor management. The functional block at each fusion level has a set of innovative algorithms that not only exploit the corresponding graph model in a computationally efficient manner, but also permit combined functional experiments across levels by virtue of the unifying graphical model approach.
Tang, Yongchuan; Zhou, Deyun; Chan, Felix T S
2018-06-11
Quantification of the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and remains largely unexplored under the open-world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper extends a belief entropy to the open world by simultaneously considering the uncertain information represented by the FOD and the nonzero mass function of the empty set. An extension of Deng's entropy to the open-world assumption (EDEOW) is proposed as a generalization of Deng's entropy; it degenerates to the Deng entropy in the closed world wherever necessary. To test the reasonability and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertainty. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. A few open issues remain: the necessary properties of a belief entropy under the open-world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what the most appropriate fusion frame is for sensor data fusion under uncertainty.
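For reference, the closed-world Deng entropy that the paper generalizes is straightforward to compute from a mass function. The sketch below implements only the base measure, not the proposed open-world extension (EDEOW):

```python
from math import log2

def deng_entropy(masses):
    """Closed-world Deng entropy of a Dempster-Shafer mass function.

    masses: dict mapping focal elements (frozensets over the frame of
            discernment) to their mass values.

    E_d(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) )

    The (2^|A| - 1) term counts the nonempty subsets of A, so mass
    assigned to larger (more ambiguous) focal elements yields higher
    entropy than mass on singletons.
    """
    e = 0.0
    for a, m in masses.items():
        if m > 0:
            e -= m * log2(m / (2 ** len(a) - 1))
    return e

# all mass on one singleton: no uncertainty at all
certain = deng_entropy({frozenset({"x"}): 1.0})       # 0.0
# all mass on a two-element set: log2(3) bits of uncertainty
ambiguous = deng_entropy({frozenset({"x", "y"}): 1.0})
```

The open-world extension additionally has to assign uncertainty to a nonzero mass on the empty set and to the FOD's incompleteness, which this closed-world formula cannot express.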
1998-04-01
The result of the project is a demonstration of the fusion process, the sensor management and the real-time capabilities using simulated sensors… demonstrator (TAD) is a system that demonstrates the core element of a battlefield ground surveillance system by simulation in near real-time. The core… Management and Sensor/Platform simulation. The surveillance system observes the real world through a non-collocated heterogeneous multisensory system.
A Motion Tracking and Sensor Fusion Module for Medical Simulation.
Shen, Yunhe; Wu, Fan; Tseng, Kuo-Shih; Ye, Ding; Raymond, John; Konety, Badrinath; Sweet, Robert
2016-01-01
Here we introduce a motion tracking or navigation module for medical simulation systems. Our main contribution is a sensor fusion method for proximity or distance sensors integrated with inertial measurement unit (IMU). Since IMU rotation tracking has been widely studied, we focus on the position or trajectory tracking of the instrument moving freely within a given boundary. In our experiments, we have found that this module reliably tracks instrument motion.
Intelligent Transportation Systems (ITS) plan for Canada : en route to intelligent mobility
DOT National Transportation Integrated Search
1999-11-01
Intelligent Transportation Systems (ITS) include the application of advanced information processing, communications, sensor and control technologies and management strategies in an integrated manner to improve the functioning of the transportation sy...
Decentralized sensor fusion for Ubiquitous Networking Robotics in Urban Areas.
Sanfeliu, Alberto; Andrade-Cetto, Juan; Barbosa, Marco; Bowden, Richard; Capitán, Jesús; Corominas, Andreu; Gilbert, Andrew; Illingworth, John; Merino, Luis; Mirats, Josep M; Moreno, Plínio; Ollero, Aníbal; Sequeira, João; Spaan, Matthijs T J
2010-01-01
In this article we explain the architecture for the environment and sensors that has been built for the European project URUS (Ubiquitous Networking Robotics in Urban Sites), a project whose objective is to develop an adaptable network robot architecture for cooperation between network robots and human beings and/or the environment in urban areas. The project goal is to deploy a team of robots in an urban area to give a set of services to a user community. This paper addresses the sensor architecture devised for URUS and the type of robots and sensors used, including environment sensors and sensors onboard the robots. Furthermore, we also explain how sensor fusion takes place to achieve urban outdoor execution of robotic services. Finally some results of the project related to the sensor network are highlighted.
Lennernäs, B; Edgren, M; Nilsson, S
1999-01-01
The purpose of this study was to evaluate the precision of a sensor, and to ascertain the maximum workable distance between the sensor and the magnet, in a magnetic positioning system for external beam radiotherapy using a trained artificial intelligence neural network for position determination. Magnetic positioning for radiotherapy, previously described by Lennernäs and Nilsson, is a functional technique, but it is time consuming: the sensors are large and the distance between the sensor and the magnetic implant is limited to short ranges. This paper presents a new positioning technique using an artificial neural network, which was trained to position the magnetic implant with at least 0.5 mm resolution in the X and Y dimensions. The possibility of using the system for determination in the Z dimension, that is, the distance between the magnet and the sensor, was also investigated. After training, the system positioned the magnet with a mean error of at most 0.15 mm in all dimensions, at distances of up to 13 mm from the sensor. Of 400 test positions, 8 determinations had an error larger than 0.5 mm (maximum 0.55 mm). A position was determined in approximately 0.01 s.
An intelligent surveillance platform for large metropolitan areas with dense sensor deployment.
Fernández, Jorge; Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M; Carro, Belén; Sánchez-Esguevillas, Antonio; Alonso-López, Jesus A; Smilansky, Zeev
2013-06-07
This paper presents an intelligent surveillance platform based on the usage of large numbers of inexpensive sensors, designed and developed within the European Eureka Celtic project HuSIMS. With the aim of maximizing the number of deployable units while keeping monetary and resource/bandwidth costs at a minimum, the surveillance platform is based on inexpensive visual sensors which apply efficient motion detection and tracking algorithms to transform the video signal into a set of motion parameters. In order to automate the analysis of the myriad of data streams generated by the visual sensors, the platform's control center includes an alarm detection engine which comprises three components applying three different Artificial Intelligence strategies in parallel. These strategies are generic, domain-independent approaches which are able to operate in several domains (traffic surveillance, vandalism prevention, perimeter security, etc.). The architecture is completed with a versatile communication network which facilitates data collection from the visual sensors and the distribution of alarms and video streams towards the emergency teams. The resulting surveillance system is extremely suitable for deployment in metropolitan areas, smart cities, and large facilities, mainly because cheap visual sensors and autonomous alarm detection facilitate dense sensor network deployments for wide and detailed coverage.
A Bluetooth-Based Device Management Platform for Smart Sensor Environment
NASA Astrophysics Data System (ADS)
Lim, Ivan Boon-Kiat; Yow, Kin Choong
In this paper, we propose the use of Bluetooth as the device management platform for the various embedded sensors and actuators in an ambient intelligent environment. We demonstrate the ease of adding Bluetooth capability to common sensor circuits (e.g. motion sensor circuit based on a pyroelectric infrared (PIR) sensor). A central logic application is proposed which controls the operation of controller devices, based on values returned by sensors via Bluetooth. The operation of devices depends on rules that are learnt from user behavior using an Elman recurrent neural network. Overall, Bluetooth has shown its potential in being used as a device management platform in an ambient intelligent environment, which allows sensors and controllers to be deployed even in locations where power sources are not readily available, by using battery power.
Intergraph video and images exploitation capabilities
NASA Astrophysics Data System (ADS)
Colla, Simone; Manesis, Charalampos
2013-08-01
The current paper focuses on the capture, fusion and processing of aerial imagery in order to leverage full motion video, giving analysts the ability to collect, analyze, and maximize the value of video assets. Unmanned aerial vehicles (UAV) have provided critical real-time surveillance and operational support to military organizations, and are a key source of intelligence, particularly when integrated with other geospatial data. In the current workflow, the UAV operators first plan the flight using flight planning software. During the flight the UAV sends a live video stream directly to the field to be processed by Intergraph software, which generates and disseminates georeferenced images through a service-oriented architecture based on the ERDAS Apollo suite. The raw video-based data sources provide the most recent view of a situation and can augment other forms of geospatial intelligence - such as satellite imagery and aerial photos - to provide a richer, more detailed view of the area of interest. To effectively use video as a source of intelligence, however, the analyst needs to seamlessly fuse the video with these other types of intelligence, such as map features and annotations. Intergraph has developed an application that automatically generates mosaicked georeferenced images and tags along the video route, which can then be seamlessly integrated with other forms of static data, such as aerial photos, satellite imagery, or geospatial layers and features. Consumers will finally have the ability to use a single, streamlined system to complete the entire geospatial information lifecycle: capturing geospatial data using sensor technology; processing vector, raster and terrain data into actionable information; managing, fusing, and sharing geospatial data and video together; and finally, rapidly and securely delivering integrated information products, ensuring individuals can make timely decisions.
Summary of sensor evaluation for the Fusion Electromagnetic Induction Experiment (FELIX)
NASA Astrophysics Data System (ADS)
Knott, M. J.
1982-08-01
As part of the First Wall/Blanket/Shield Engineering Test Program, a test bed called FELIX (fusion electromagnetic induction experiment) is under construction. Its purpose is to test, evaluate, and develop computer codes for the prediction of electromagnetically induced phenomena in a magnetic environment modeling that of a fusion reactor. Crucial to this process is the sensing and recording of the various induced effects. Sensor evaluation for FELIX has reached the point where most sensor types have been evaluated and preliminary decisions are being made as to the type and quantity for the initial FELIX experiments. These early experiments, the first (flat plate) experiment in particular, will be aimed at testing the sensors as well as the pertinent theories involved. The reason for these evaluations, decisions, and proof tests is the harsh electrical and magnetic environment that FELIX presents.
Fiber optic medical pressure-sensing system employing intelligent self-calibration
NASA Astrophysics Data System (ADS)
He, Gang
1996-01-01
In this article, we describe a fiber-optic catheter-type pressure-sensing system that has been successfully introduced for medical diagnostic applications. We present the overall sensor and optoelectronics designs, and highlight the product development efforts that led to a reliable and accurate disposable pressure-sensing system. In particular, the incorporation of an intelligent on-site self-calibration approach allows limited sensor reuse, reducing end-user costs and enabling system adaptation to the wide sensor variability associated with low-cost manufacturing processes. We demonstrate that fiber-optic sensors can be cost-effectively produced to satisfy the needs of certain medical market segments.
Intelligent sensor-model automated control of PMR-15 autoclave processing
NASA Technical Reports Server (NTRS)
Hart, S.; Kranbuehl, D.; Loos, A.; Hinds, B.; Koury, J.
1992-01-01
An intelligent sensor-model system has been built and used for automated control of the PMR-15 cure process in the autoclave. The system uses frequency-dependent electromagnetic sensing (FDEMS), the Loos processing model, and the Air Force QPAL intelligent software shell. The Loos model is used to predict and optimize the cure process, including the time-temperature dependence of the extent of reaction, flow, and part consolidation. The FDEMS sensing system in turn monitors, in situ, the removal of solvent, changes in the viscosity, reaction advancement and cure completion in the mold continuously throughout the processing cycle. The sensor information is compared with the optimum processing conditions from the model. The QPAL composite cure control system allows the comparison of sensor monitoring with model predictions to be broken down into a series of discrete steps, and provides a language for making decisions on what to do next regarding time, temperature and pressure.
Bialas, Andrzej
2011-01-01
Intelligent sensors experience security problems very similar to those inherent in other kinds of IT products or systems. Assurance methodologies for the creation of such products or systems, like Common Criteria (ISO/IEC 15408), can be used to improve the robustness of sensor systems in high-risk environments. The paper presents the background and results of previous research on pattern-based security specifications and introduces a new ontological approach. The elaborated ontology and knowledge base were validated on the IT security development process using the sensor example. The contribution of the paper concerns the application of knowledge engineering methodology to the previously developed Common Criteria-compliant, pattern-based method for intelligent sensor security development. The issue presented in the paper has broader significance in that it can help solve information security problems in many application domains. PMID:22164064
Multisensor Fusion for Change Detection
NASA Astrophysics Data System (ADS)
Schenk, T.; Csatho, B.
2005-12-01
Combining sensors that record different properties of a 3-D scene leads to complementary and redundant information. If fused properly, a more robust and complete scene description becomes available. Moreover, fusion facilitates automatic procedures for object reconstruction and modeling. For example, aerial imaging sensors, hyperspectral scanning systems, and airborne laser scanning systems generate complementary data. We describe how data from these sensors can be fused for such diverse applications as mapping surface erosion and landslides, reconstructing urban scenes, monitoring urban land use and urban sprawl, and deriving velocities and surface changes of glaciers and ice sheets. An absolute prerequisite for successful fusion is a rigorous co-registration of the sensors involved. We establish a common 3-D reference frame by using sensor-invariant features. Such features are caused by the same object space phenomena and are extracted in multiple steps from the individual sensors. After extracting, segmenting and grouping the features into more abstract entities, we discuss ways to automatically establish correspondences. This is followed by a brief description of rigorous mathematical models suitable to deal with linear and areal features. In contrast to traditional, point-based registration methods, linear and areal features lend themselves to a more robust and more accurate registration. More importantly, the chances of automating the registration process increase significantly. The result of the co-registration of the sensors is a unique transformation between the individual sensors and the object space. This makes spatial reasoning over extracted information more versatile; reasoning can be performed in sensor space or in 3-D space, where domain knowledge about features and objects constrains reasoning processes, reduces the search space, and helps to make the problem well-posed.
We demonstrate the feasibility of the proposed multisensor fusion approach by detecting surface elevation changes on the Byrd Glacier, Antarctica, with aerial imagery from the 1980s and ICESat laser altimetry data from 2003-05. Change detection from such disparate data sets is an intricate fusion problem, beginning with sensor alignment and continuing on to reasoning with spatial information as to where changes occurred and to what extent.
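The rigorous co-registration step described above ultimately amounts to estimating a transformation between sensor frames from matched features. As a minimal sketch of that idea (using matched point features and the standard SVD/Procrustes solution rather than the paper's lineal and areal features; all data here are synthetic):

```python
import numpy as np

def estimate_rigid(src, dst):
    """Least-squares rotation R and translation t with dst ~ R @ src + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(0), dst.mean(0)
    # Cross-covariance of the centered, matched feature sets.
    U, _, Vt = np.linalg.svd((src - cs).T @ (dst - cd))
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

# Synthetic check: recover a known rotation + translation.
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [3.0, 1.0]])
dst = src @ R_true.T + np.array([4.0, -2.0])
R, t = estimate_rigid(src, dst)
```

With clean correspondences the estimator recovers the transform exactly; in practice the matched lineal/areal features and their grouping are what make this step robust.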
Projection-based circular constrained state estimation and fusion over long-haul links
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Qiang; Rao, Nageswara S.
In this paper, we consider a scenario where sensors are deployed over a large geographical area for tracking a target with circular nonlinear constraints on its motion dynamics. The sensor state estimates are sent over long-haul networks to a remote fusion center for fusion. We are interested in different ways to incorporate the constraints into the estimation and fusion process in the presence of communication loss. In particular, we consider closed-form projection-based solutions, including rules for fusing the estimates and for incorporating the constraints, which jointly can guarantee the timely fusion often required in real-time systems. We test the performance of these methods in the long-haul tracking environment using a simple example.
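The projection-plus-fusion idea can be illustrated in a few lines. This is a hedged sketch, not the authors' algorithm: the projection maps a 2-D position estimate onto the circular constraint in closed form, and the fusion rule is a plain information-weighted combination; all numbers are synthetic.

```python
import numpy as np

def project_to_circle(x, center, radius):
    """Closed-form projection of a 2-D position estimate onto a circle."""
    d = x - center
    norm = np.linalg.norm(d)
    if norm == 0.0:                    # degenerate case: pick any point
        return center + np.array([radius, 0.0])
    return center + radius * d / norm

def fuse(x1, P1, x2, P2):
    """Information-weighted fusion of two independent estimates."""
    W1, W2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(W1 + W2)
    return P @ (W1 @ x1 + W2 @ x2), P

# Two sensors' estimates of a target known to move on a circle:
center, radius = np.zeros(2), 5.0
x1 = project_to_circle(np.array([3.0, 4.5]), center, radius)
x2 = project_to_circle(np.array([2.5, 4.0]), center, radius)
xf, Pf = fuse(x1, np.eye(2) * 0.4, x2, np.eye(2) * 0.8)
```

Projecting before fusing keeps each estimate consistent with the constraint; whether to project before or after fusion is exactly the kind of ordering question the paper studies.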
Smart and intelligent sensor payload project
2009-04-01
Engineers working on the smart and intelligent sensor payload project include (l to r): Ed Conley (NASA), Mark Mitchell (Jacobs Technology), Luke Richards (NASA), Robert Drackett (Jacobs Technology), Mark Turowski (Jacobs Technology) , Richard Franzl (seated, Jacobs Technology), Greg McVay (Jacobs Technology), Brianne Guillot (Jacobs Technology), Jon Morris (Jacobs Technology), Stephen Rawls (NASA), John Schmalzel (NASA) and Andrew Bracey (NASA).
Multisource image fusion method using support value transform.
Zheng, Sheng; Shi, Wen-Zhong; Liu, Jian; Zhu, Guang-Xi; Tian, Jin-Wen
2007-07-01
With the development of numerous imaging sensors, many images can be simultaneously pictured by various sensors. However, there are many scenarios where no one sensor can give the complete picture. Image fusion is an important approach to solving this problem, producing a single image which preserves all relevant information from a set of different sensors. In this paper, we propose a new image fusion method using the support value transform, which uses support values to represent the salient features of an image. This is based on the fact that, in support vector machines (SVMs), data with larger support values have a physical meaning in the sense that they reveal the relative importance of the data points for contributing to the SVM model. The mapped least squares SVM (mapped LS-SVM) is used to efficiently compute the support values of an image. The support value analysis is developed by using a series of multiscale support value filters, which are obtained by filling zeros in the basic support value filter deduced from the mapped LS-SVM to match the resolution of the desired level. Compared with widely used image fusion methods, such as the Laplacian pyramid and discrete wavelet transform methods, the proposed method is an undecimated transform-based approach. The fusion experiments are undertaken on multisource images. The results demonstrate that the proposed approach is effective and is superior to the conventional image fusion methods in terms of the pertinent quantitative fusion evaluation indexes, such as quality of visual information (Q(AB/F)), mutual information, etc.
An oil fraction neural sensor developed using electrical capacitance tomography sensor data.
Zainal-Mokhtar, Khursiah; Mohamad-Saleh, Junita
2013-08-26
This paper presents novel research on the development of a generic intelligent oil fraction sensor based on Electrical Capacitance Tomography (ECT) data. An artificial Neural Network (ANN) has been employed as the intelligent system to sense and estimate oil fractions from the cross-sections of two-component flows comprising oil and gas in a pipeline. Previous works only focused on estimating the oil fraction in the pipeline based on fixed ECT sensor parameters. With fixed ECT design sensors, an oil fraction neural sensor can be trained to deal with ECT data based on the particular sensor parameters, hence the neural sensor is not generic. This work focuses on development of a generic neural oil fraction sensor based on training a Multi-Layer Perceptron (MLP) ANN with various ECT sensor parameters. On average, the proposed oil fraction neural sensor has shown to be able to give a mean absolute error of 3.05% for various ECT sensor sizes.
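A toy version of the MLP-based oil-fraction estimator can be sketched as follows. The synthetic "capacitance" frames, network size, and training schedule are all invented for illustration; the real system trains on ECT data from sensors of varying geometry.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for ECT data: 12 normalized inter-electrode
# capacitance readings per frame, driven by the true oil fraction.
def make_frames(n):
    frac = rng.uniform(0.0, 1.0, size=n)
    x = frac[:, None] + 0.05 * rng.standard_normal((n, 12))
    return x, frac

X, y = make_frames(400)

# One-hidden-layer MLP (tanh), trained by plain full-batch gradient descent.
W1 = 0.5 * rng.standard_normal((12, 8)); b1 = np.zeros(8)
W2 = 0.5 * rng.standard_normal(8);       b2 = 0.0

lr = 0.05
for _ in range(2000):
    h = np.tanh(X @ W1 + b1)              # hidden activations
    pred = h @ W2 + b2                    # predicted oil fraction
    err = pred - y
    gW2 = h.T @ err / len(y); gb2 = err.mean()
    gh = np.outer(err, W2) * (1 - h**2)   # backprop through tanh
    gW1 = X.T @ gh / len(y); gb1 = gh.mean(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

# Mean absolute error on held-out synthetic frames.
Xt, yt = make_frames(100)
mae = np.abs(np.tanh(Xt @ W1 + b1) @ W2 + b2 - yt).mean()
```

The paper's contribution is making such a regressor generic across ECT sensor parameters; this sketch only shows the regression mechanics for one fixed input size.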
Efficient sensor network vehicle classification using peak harmonics of acoustic emissions
NASA Astrophysics Data System (ADS)
William, Peter E.; Hoffman, Michael W.
2008-04-01
An application is proposed for the detection and classification of battlefield ground vehicles using the emitted acoustic signal captured at individual sensor nodes of an ad hoc Wireless Sensor Network (WSN). We make use of the harmonic characteristics of the acoustic emissions of battlefield vehicles to reduce both the computations carried out on the sensor node and the data transmitted to the fusion center, for reliable and efficient classification of targets. Previous approaches focus on the lower frequency band of the acoustic emissions, up to 500 Hz; however, we show in the proposed application how efficient discrimination between battlefield vehicles is performed using features extracted from higher frequency bands (50-1500 Hz). The application shows that selective time-domain acoustic features surpass equivalent spectral features. Collaborative signal processing is utilized, such that estimation of certain signal model parameters is carried out by the sensor node, in order to reduce the communication between the sensor node and the fusion center, while the remaining model parameters are estimated at the fusion center. The data transmitted from the sensor node to the fusion center amount to 1-5% of the sampled acoustic signal at the node. A variety of classification schemes were examined, such as maximum likelihood, vector quantization and artificial neural networks. Evaluation of the proposed application, through processing of an acoustic data set with comparison to previous results, shows improvement not only in the number of computations but also in the detection and false alarm rates.
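The harmonic-feature idea, summarizing a quasi-periodic engine signature by a handful of harmonic amplitudes so only a small fraction of the raw samples must be sent to the fusion center, can be sketched as follows (the fundamental frequency and the signal are synthetic):

```python
import numpy as np

fs = 8000                                   # sample rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
f0 = 60.0                                   # assumed engine fundamental (Hz)
# Synthetic vehicle emission: fundamental plus a third harmonic.
sig = (1.0 * np.sin(2 * np.pi * f0 * t)
       + 0.5 * np.sin(2 * np.pi * 3 * f0 * t))

# Single-sided amplitude spectrum.
spec = np.abs(np.fft.rfft(sig)) / len(t) * 2
freqs = np.fft.rfftfreq(len(t), 1 / fs)

# Feature vector: amplitudes at the first few harmonics of f0 only.
harmonics = [spec[np.argmin(np.abs(freqs - k * f0))] for k in (1, 2, 3)]
```

Three numbers now stand in for 8000 samples; in the paper the node additionally estimates some model parameters locally and leaves the rest to the fusion center.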
Sensor fusion to enable next generation low cost Night Vision systems
NASA Astrophysics Data System (ADS)
Schweiger, R.; Franz, S.; Löhlein, O.; Ritter, W.; Källhammer, J.-E.; Franks, J.; Krekels, T.
2010-04-01
The next generation of automotive Night Vision Enhancement systems offers automatic pedestrian recognition with a performance beyond current Night Vision systems at a lower cost. This will allow high market penetration, covering the luxury as well as compact car segments. Improved performance can be achieved by fusing a Far Infrared (FIR) sensor with a Near Infrared (NIR) sensor. However, fusing with today's FIR systems would be too costly to achieve high market penetration. The main cost drivers of the FIR system are its resolution and its sensitivity. Sensor cost is largely determined by sensor die size. Fewer and smaller pixels reduce die size but also resolution and sensitivity. Sensitivity limits are mainly determined by inclement weather performance. Sensitivity requirements should be matched to the possibilities of low-cost FIR optics, especially the implications of molding highly complex optical surfaces. As an FIR sensor specified for fusion can have lower resolution as well as lower sensitivity, fusing FIR and NIR can solve both performance and cost problems. To compensate for the effects of FIR-sensor degradation on the pedestrian detection capabilities, a fusion approach called MultiSensorBoosting is presented that produces a classifier holding highly discriminative sub-pixel features from both sensors at once. The algorithm is applied on data with different resolution and on data obtained from cameras with varying optics to incorporate various sensor sensitivities. As it is not feasible to record representative data with all different sensor configurations, transformation routines on existing high-resolution data recorded with high-sensitivity cameras are investigated in order to determine the effects of lower resolution and lower sensitivity on the overall detection performance.
This paper also gives an overview of first results, showing that a reduction of FIR sensor resolution can be compensated for using fusion techniques, as can a reduction of sensitivity.
Multi-sensor fusion of Landsat 8 thermal infrared (TIR) and panchromatic (PAN) images.
Jung, Hyung-Sup; Park, Sung-Whan
2014-12-18
Data fusion is defined as the combination of data from multiple sensors such that the resulting information is better than would be possible when the sensors are used individually. The multi-sensor fusion of panchromatic (PAN) and thermal infrared (TIR) images is a good example of this data fusion. While a PAN image has higher spatial resolution, a TIR one has lower spatial resolution. In this study, we have proposed an efficient method to fuse Landsat 8 PAN and TIR images using an optimal scaling factor in order to control the trade-off between the spatial details and the thermal information. We have compared the fused images created from different scaling factors and then tested the performance of the proposed method at urban and rural test areas. The test results show that the proposed method merges the spatial resolution of PAN image and the temperature information of TIR image efficiently. The proposed method may be applied to detect lava flows of volcanic activity, radioactive exposure of nuclear power plants, and surface temperature change with respect to land-use change.
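The scaling-factor trade-off can be illustrated with a toy fusion of this form: inject high-pass PAN detail into the TIR image, weighted by a factor k. The box-blur kernel, the value of k, and the images below are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def box_blur(img, r=1):
    """Simple box low-pass filter with edge padding."""
    p = np.pad(img, r, mode="edge")
    out = np.zeros_like(img, dtype=float)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out += p[r + dy : r + dy + img.shape[0],
                     r + dx : r + dx + img.shape[1]]
    return out / (2 * r + 1) ** 2

def fuse_pan_tir(pan, tir, k):
    detail = pan - box_blur(pan)   # spatial detail from the PAN image
    return tir + k * detail        # k trades detail against thermal fidelity

pan = np.tile(np.linspace(0, 1, 8), (8, 1))   # toy PAN with a gradient
tir = np.full((8, 8), 300.0)                   # toy TIR (kelvin-like values)
fused = fuse_pan_tir(pan, tir, k=0.5)
```

With k = 0 the thermal information is untouched; larger k sharpens spatial detail at the cost of perturbing the temperature values, which is the trade-off the optimal scaling factor controls.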
NASA Astrophysics Data System (ADS)
Couture, Jean; Boily, Edouard; Simard, Marc-Alain
1996-05-01
The research and development group at Loral Canada is now at the second phase of the development of a data fusion demonstration model (DFDM) for naval anti-air warfare, to be used as a workbench tool for exploratory research. This project has emphatically addressed how the concepts related to fusion could be implemented within the Canadian Patrol Frigate (CPF) software environment. The project has been designed to read data passively on the CPF bus without any modification to the CPF software. This has brought to light important time alignment issues, since the CPF sensors and the CPF command and control system were not originally designed to support a track management function which fuses information. The fusion of data from non-organic sensors with the tactical Link-11 data has produced stimulating spatial alignment problems, which have been overcome by the use of a geodetic referencing coordinate system. Some benchmark scenarios have been selected to quantitatively demonstrate the capabilities of this fusion implementation. This paper describes the implementation design of DFDM (version 2) and summarizes the results obtained so far when fusing the scenarios' simulated data.
Intelligence Sharing, Fusion Centers, and Homeland Security
2008-06-01
NASA Astrophysics Data System (ADS)
Benaskeur, Abder R.; Roy, Jean
2001-08-01
Sensor Management (SM) has to do with how best to manage, coordinate and organize the use of sensing resources in a manner that synergistically improves the process of data fusion. Based on contextual information, SM develops options for collecting further information, allocates and directs the sensors towards the achievement of the mission goals and/or tunes the parameters for the real-time improvement of the effectiveness of the sensing process. Conscious of the important role that SM has to play in modern data fusion systems, we are currently studying advanced SM concepts that would help increase the survivability of the current Halifax and Iroquois Class ships, as well as their possible future upgrades. For this purpose, a hierarchical scheme has been proposed for data fusion and resource management adaptation, based on control theory, within the process refinement paradigm of the JDL data fusion model, and taking into account the multi-agent model put forward by the SASS Group for the situation analysis process. The novelty of this work lies in the unified framework that has been defined for tackling the adaptation of both the fusion process and the sensor/weapon management.
NASA Astrophysics Data System (ADS)
Weisenseel, Robert A.; Karl, William C.; Castanon, David A.; DiMarzio, Charles A.
1999-02-01
We present an analysis of statistical model based data-level fusion for near-IR polarimetric and thermal data, particularly for the detection of mines and mine-like targets. Typical detection-level data fusion methods, approaches that fuse detections from individual sensors rather than fusing at the level of the raw data, do not account rationally for the relative reliability of different sensors, nor the redundancy often inherent in multiple sensors. Representative examples of such detection-level techniques include logical AND/OR operations on detections from individual sensors and majority vote methods. In this work, we exploit a statistical data model for the detection of mines and mine-like targets to compare and fuse multiple sensor channels. Our purpose is to quantify the amount of knowledge that each polarimetric or thermal channel supplies to the detection process. With this information, we can make reasonable decisions about the usefulness of each channel. We can use this information to improve the detection process, or we can use it to reduce the number of required channels.
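The detection-level rules mentioned above (logical AND/OR and majority vote) are simple to state; a minimal sketch with three hypothetical per-sensor detections:

```python
# Detection-level fusion rules over boolean per-sensor detections.
# These rules weight every sensor equally, which is exactly the
# limitation the data-level statistical approach above addresses.

def and_fusion(dets):    return all(dets)          # every sensor must fire
def or_fusion(dets):     return any(dets)          # any sensor suffices
def majority_vote(dets): return sum(dets) * 2 > len(dets)

# Three sensors (e.g. two polarimetric channels and one thermal)
# looking at the same candidate mine location:
dets = [True, False, True]
```

None of these rules can express that one channel is more reliable than another, which is why the paper fuses at the data level under a statistical model instead.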
Sensor Fusion and Smart Sensor in Sports and Biomedical Applications
Mendes, José Jair Alves; Vieira, Mário Elias Marinho; Pires, Marcelo Bissi; Stevan, Sergio Luiz
2016-01-01
The following work presents an overview of smart sensors and sensor fusion targeted at biomedical applications and sports areas. In this work, the integration of these areas is demonstrated, promoting a reflection about techniques and applications to collect, quantify and qualify some physical variables associated with the human body. These techniques are presented in various biomedical and sports applications, which cover areas related to diagnostics, rehabilitation, physical monitoring, and the development of performance in athletes, among others. Although some applications are described in only one of two fields of study (biomedicine and sports), it is very likely that the same application fits in both, with small peculiarities or adaptations. To illustrate the contemporaneity of applications, an analysis of specialized papers published in the last six years has been made. In this context, the main characteristic of this review is to present the largest quantity of relevant examples of sensor fusion and smart sensors focusing on their utilization and proposals, without deeply addressing one specific system or technique, to the detriment of the others. PMID:27669260
Large Efficient Intelligent Heating Relay Station System
NASA Astrophysics Data System (ADS)
Wu, C. Z.; Wei, X. G.; Wu, M. Q.
2017-12-01
The design of the large efficient intelligent heating relay station system aims to remedy shortcomings of the existing heating systems in our country, such as low heating efficiency, energy waste, serious pollution, and dependence on manual control. In this design, we first improve the existing plate heat exchanger. Secondly, an AT89C51 microcontroller is used to control the whole system and realize intelligent control. The detection part uses a PT100 temperature sensor, a pressure sensor, and a turbine flowmeter to measure the heating temperature, user-end liquid flow, and hydraulic pressure with real-time feedback to the microcontroller, which adjusts the heating for users, making the whole system more efficient, intelligent, and energy-saving.
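The feedback idea, sensors report the supply temperature and the controller adjusts the heating toward a setpoint, can be reduced to a toy proportional rule. The setpoint and gain are invented for illustration; the actual system runs this kind of loop on the microcontroller.

```python
# Toy proportional heating control in the spirit of the system above.
# Setpoint (60 °C) and gain (0.05 per °C) are illustrative values.

def valve_command(temp_c, setpoint_c=60.0, gain=0.05):
    """Proportional valve opening, clamped to [0, 1]."""
    return min(1.0, max(0.0, gain * (setpoint_c - temp_c)))

# Far below setpoint: valve fully open; at/above setpoint: closed.
cold, at_setpoint = valve_command(40.0), valve_command(60.0)
```

A real controller would also use the flow and pressure readings and add integral action; this only shows the feedback shape.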
Simulating Operation of a Complex Sensor Network
NASA Technical Reports Server (NTRS)
Jennings, Esther; Clare, Loren; Woo, Simon
2008-01-01
Simulation Tool for ASCTA Microsensor Network Architecture (STAMiNA) ["ASCTA" denotes the Advanced Sensors Collaborative Technology Alliance.] is a computer program for evaluating conceptual sensor networks deployed over terrain to provide military situational awareness. This or a similar program is needed because of the complexity of interactions among such diverse phenomena as sensing and communication portions of a network, deployment of sensor nodes, effects of terrain, data-fusion algorithms, and threat characteristics. STAMiNA is built upon a commercial network-simulator engine, with extensions to include both sensing and communication models in a discrete-event simulation environment. Users can define (1) a mission environment, including terrain features; (2) objects to be sensed; (3) placements and modalities of sensors, abilities of sensors to sense objects of various types, and sensor false alarm rates; (4) trajectories of threatening objects; (5) means of dissemination and fusion of data; and (6) various network configurations. By use of STAMiNA, one can simulate detection of targets through sensing, dissemination of information by various wireless communication subsystems under various scenarios, and fusion of information, incorporating such metrics as target-detection probabilities, false-alarm rates, and communication loads, and capturing effects of terrain and threat.
Tkach, D C; Hargrove, L J
2013-01-01
Advances in battery and actuator technology have enabled clinical use of powered lower limb prostheses such as the BiOM Powered Ankle. To allow ambulation over various types of terrain, such devices rely on built-in mechanical sensors or manual actuation by the amputee to transition into an operational mode that is suitable for a given terrain. It is unclear whether mechanical sensors alone can accurately modulate operational modes, while voluntary actuation prevents seamless, naturalistic gait. Ensuring that the prosthesis is ready to accommodate new terrain types at the first step is critical for user safety. EMG signals from the patient's residual leg muscles may provide additional information to accurately choose the proper mode of prosthesis operation. Using a pattern recognition classifier, we compared the accuracy of predicting 8 different mode transitions based on (1) prosthesis mechanical sensor output, (2) EMG recorded from the residual limb, and (3) fusion of EMG and mechanical sensor data. Our findings indicate that the neuromechanical sensor fusion significantly decreases errors in predicting mode transitions as compared to using either mechanical sensors or EMG alone (2.3±0.7% vs. 7.8±0.9% and 20.2±2.0%, respectively).
Automatic intelligibility classification of sentence-level pathological speech
Kim, Jangwon; Kumar, Naveen; Tsiartas, Andreas; Li, Ming; Narayanan, Shrikanth S.
2014-01-01
Pathological speech usually refers to the condition of speech distortion resulting from atypicalities in voice and/or in the articulatory mechanisms owing to disease, illness or other physical or biological insult to the production system. Although automatic evaluation of speech intelligibility and quality could come in handy in these scenarios to assist experts in diagnosis and treatment design, the many sources and types of variability often make it a very challenging computational processing problem. In this work we propose novel sentence-level features to capture abnormal variation in the prosodic, voice quality and pronunciation aspects in pathological speech. In addition, we propose a post-classification posterior smoothing scheme which refines the posterior of a test sample based on the posteriors of other test samples. Finally, we perform feature-level fusions and subsystem decision fusion for arriving at a final intelligibility decision. The performances are tested on two pathological speech datasets, the NKI CCRT Speech Corpus (advanced head and neck cancer) and the TORGO database (cerebral palsy or amyotrophic lateral sclerosis), by evaluating classification accuracy without overlapping subjects’ data among training and test partitions. Results show that the feature sets of each of the voice quality subsystem, prosodic subsystem, and pronunciation subsystem, offer significant discriminating power for binary intelligibility classification. We observe that the proposed posterior smoothing in the acoustic space can further reduce classification errors. The smoothed posterior score fusion of subsystems shows the best classification performance (73.5% for unweighted, and 72.8% for weighted, average recalls of the binary classes). PMID:25414544
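The post-classification posterior smoothing step can be sketched as refining each test sample's posterior using the posteriors of its neighbours in feature space. The neighbourhood size and averaging weights below are assumptions, not the authors' exact scheme.

```python
import numpy as np

def smooth_posteriors(feats, post, k=2, alpha=0.5):
    """Blend each sample's posterior with the mean posterior of its
    k nearest neighbours in (acoustic) feature space."""
    feats, post = np.asarray(feats, float), np.asarray(post, float)
    out = np.empty_like(post)
    for i in range(len(post)):
        d = np.linalg.norm(feats - feats[i], axis=1)
        nbrs = np.argsort(d)[1 : k + 1]        # skip the sample itself
        out[i] = alpha * post[i] + (1 - alpha) * post[nbrs].mean()
    return out

# Four test samples: two acoustically similar pairs with one
# outlying raw posterior in each pair.
feats = [[0.0], [0.1], [5.0], [5.1]]
post = [0.9, 0.2, 0.1, 0.15]   # raw P(intelligible) per test sample
sm = smooth_posteriors(feats, post, k=1)
```

Samples that sound alike are pulled toward a common posterior, which is the intuition behind refining a test posterior using the other test samples.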
Intelligent Wireless Sensor Networks for System Health Monitoring
NASA Technical Reports Server (NTRS)
Alena, Rick
2011-01-01
Wireless sensor networks (WSN) based on the IEEE 802.15.4 Personal Area Network (PAN) standard are finding increasing use in the home automation and emerging smart energy markets. The network and application layers, based on the ZigBee 2007 Standard, provide a convenient framework for component-based software that supports customer solutions from multiple vendors. WSNs provide the inherent fault tolerance required for aerospace applications. The Discovery and Systems Health Group at NASA Ames Research Center has been developing WSN technology for use aboard aircraft and spacecraft for System Health Monitoring of structures and life support systems, using funding from the NASA Engineering and Safety Center and the Exploration Technology Development and Demonstration Program. This technology provides key advantages for low-power, low-cost ancillary sensing systems, particularly across pressure interfaces and in areas where it is difficult to run wires. Intelligence for sensor networks could be defined as the capability of forming dynamic sensor networks, allowing high-level application software to identify and address any sensor that joins the network without the use of any centralized database defining the sensors' characteristics. The IEEE 1451 Standard defines methods for the management of intelligent sensor systems, and the IEEE 1451.4 section defines Transducer Electronic Datasheets (TEDS), which contain key information regarding the sensor characteristics such as name, description, serial number, calibration information and user information such as location within a vehicle. By locating the TEDS information on the wireless sensor itself and enabling access to this information base from the application software, the application can identify the sensor unambiguously and interpret and present the sensor data stream without reference to any other information.
The application software is able to read the status of each sensor module, responding in real-time to changes of PAN configuration, providing the appropriate response for maintaining overall sensor system function, even when sensor modules fail or the WSN is reconfigured. The session will present the architecture and technical feasibility of creating fault-tolerant WSNs for aerospace applications based on our application of the technology to a Structural Health Monitoring testbed. The interim results of WSN development and testing including our software architecture for intelligent sensor management will be discussed in the context of the specific tradeoffs required for effective use. Initial certification measurement techniques and test results gauging WSN susceptibility to Radio Frequency interference are introduced as key challenges for technology adoption. A candidate Developmental and Flight Instrumentation implementation using intelligent sensor networks for wind tunnel and flight tests is developed as a guide to understanding key aspects of the aerospace vehicle design, test and operations life cycle.
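The TEDS idea, each node carrying its own machine-readable datasheet so the application needs no central database, can be sketched as a simple lookup. The field names and linear calibration model below are illustrative, not the IEEE 1451.4 binary layout.

```python
# Hypothetical TEDS-like record held on the wireless node itself.
# Field names and values are invented for illustration.
teds = {
    "name": "press-07",
    "description": "cabin pressure transducer",
    "serial": "SN-0042",
    "units": "kPa",
    "cal_gain": 0.25,       # engineering units per raw count (assumed)
    "cal_offset": 10.0,
    "location": "module A, bulkhead 3",
}

def to_engineering(raw_count, teds):
    """Convert a raw count using the node's own calibration fields."""
    return teds["cal_gain"] * raw_count + teds["cal_offset"]

value = to_engineering(360, teds)
```

Because the calibration and identity travel with the sensor, the application can interpret the data stream of any node that joins the PAN without consulting anything else.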
Adding intelligence to scientific data management
NASA Technical Reports Server (NTRS)
Campbell, William J.; Short, Nicholas M., Jr.; Treinish, Lloyd A.
1989-01-01
NASA's plans to solve some of the problems of handling large-scale scientific databases by turning to artificial intelligence (AI) are discussed. The growth of the information glut and the ways that AI can help alleviate the resulting problems are reviewed. The employment of the Intelligent User Interface prototype, in which the user generates his own natural-language query with the assistance of the system, is examined. Spatial data management, scientific data visualization, and data fusion are discussed.
Decentralized Sensor Fusion for Ubiquitous Networking Robotics in Urban Areas
Sanfeliu, Alberto; Andrade-Cetto, Juan; Barbosa, Marco; Bowden, Richard; Capitán, Jesús; Corominas, Andreu; Gilbert, Andrew; Illingworth, John; Merino, Luis; Mirats, Josep M.; Moreno, Plínio; Ollero, Aníbal; Sequeira, João; Spaan, Matthijs T.J.
2010-01-01
In this article we explain the architecture for the environment and sensors that has been built for the European project URUS (Ubiquitous Networking Robotics in Urban Sites), a project whose objective is to develop an adaptable network robot architecture for cooperation between network robots and human beings and/or the environment in urban areas. The project goal is to deploy a team of robots in an urban area to give a set of services to a user community. This paper addresses the sensor architecture devised for URUS and the type of robots and sensors used, including environment sensors and sensors onboard the robots. Furthermore, we also explain how sensor fusion takes place to achieve urban outdoor execution of robotic services. Finally some results of the project related to the sensor network are highlighted. PMID:22294927
Sood, Chetan; Marin, Mariana; Mason, Caleb S; Melikyan, Gregory B
2016-01-01
HIV-1 fusion leading to productive entry has long been thought to occur at the plasma membrane. However, our previous single virus imaging data imply that, after Env engagement of CD4 and coreceptors at the cell surface, the virus enters into and fuses with intracellular compartments. We were unable to reliably detect viral fusion at the plasma membrane. Here, we implement a novel virus labeling strategy that biases towards detection of virus fusion that occurs in a pH-neutral environment-at the plasma membrane or, possibly, in early pH-neutral vesicles. Virus particles are co-labeled with an intra-viral content marker, which is released upon fusion, and an extra-viral pH sensor consisting of ecliptic pHluorin fused to the transmembrane domain of ICAM-1. This sensor fully quenches upon virus trafficking to a mildly acidic compartment, thus precluding subsequent detection of viral content release. As an interesting secondary observation, the incorporation of the pH-sensor revealed that HIV-1 particles occasionally shuttle between neutral and acidic compartments in target cells expressing CD4, suggesting a small fraction of viral particles is recycled to the plasma membrane and re-internalized. By imaging viruses bound to living cells, we found that HIV-1 content release in neutral-pH environment was a rare event (~0.4% particles). Surprisingly, viral content release was not significantly reduced by fusion inhibitors, implying that content release was due to spontaneous formation of viral membrane defects occurring at the cell surface. We did not measure a significant occurrence of HIV-1 fusion at neutral pH above this defect-mediated background loss of content, suggesting that the pH sensor may destabilize the membrane of the HIV-1 pseudovirus and, thus, preclude reliable detection of single virus fusion events at neutral pH.
Machine learning-based augmented reality for improved surgical scene understanding.
Pauly, Olivier; Diotte, Benoit; Fallavollita, Pascal; Weidert, Simon; Euler, Ekkehard; Navab, Nassir
2015-04-01
In orthopedic and trauma surgery, AR technology can support surgeons in the challenging task of understanding the spatial relationships between the anatomy, the implants and their tools. In this context, we propose a novel augmented visualization of the surgical scene that intelligently mixes the different sources of information provided by a mobile C-arm combined with a Kinect RGB-Depth sensor. We introduce a learning-based paradigm that aims at (1) identifying the relevant objects or anatomy in both Kinect and X-ray data, and (2) creating an object-specific pixel-wise alpha map that permits relevance-based fusion of the video and the X-ray images within one single view. In 12 simulated surgeries, we show very promising results, providing surgeons with a better surgical scene understanding as well as improved depth perception.
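Once the object-specific alpha map is available, the relevance-based fusion step reduces to per-pixel alpha blending; a minimal sketch with a synthetic alpha map standing in for the learned one:

```python
import numpy as np

def alpha_fuse(video, xray, alpha):
    """Per-pixel blend: alpha=1 shows video, alpha=0 shows X-ray."""
    return alpha * video + (1 - alpha) * xray

# Toy 2x2 intensity images and a synthetic relevance (alpha) map.
video = np.full((2, 2), 200.0)
xray = np.full((2, 2), 50.0)
alpha = np.array([[1.0, 0.5],
                  [0.0, 0.25]])
fused = alpha_fuse(video, xray, alpha)
```

The learning-based part of the paper is producing that alpha map per object; the blend itself is this one line.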
NASA Astrophysics Data System (ADS)
Williams, Arnold C.; Pachowicz, Peter W.
2004-09-01
Current mine detection research indicates that no single sensor or single look from a sensor will detect mines/minefields in real time at a performance level suitable for a forward maneuver unit. Hence, the integrated development of detectors and fusion algorithms is of primary importance. A problem in this development process has been the evaluation of these algorithms with relatively small data sets, leading to anecdotal and frequently overtrained results. These anecdotal results are often unreliable and conflicting among various sensors and algorithms. Consequently, the physical phenomena that ought to be exploited and the performance benefits of this exploitation are often ambiguous. The Army RDECOM CERDEC Night Vision and Electronic Sensors Directorate has collected large amounts of multisensor data such that statistically significant evaluations of detection and fusion algorithms can be obtained. Even with these large data sets, care must be taken in algorithm design and data processing to achieve statistically significant performance results for combined detectors and fusion algorithms. This paper discusses statistically significant detection and combined multilook fusion results for the Ellipse Detector (ED) and the Piecewise Level Fusion Algorithm (PLFA). These statistically significant performance results are characterized by ROC curves that have been obtained by processing this multilook data for the high-resolution SAR data of the Veridian X-Band radar. We discuss the implications of these results for mine detection and the importance of statistical significance, sample size, ground truth, and algorithm design in performance evaluation.
Lightweight autonomous chemical identification system (LACIS)
NASA Astrophysics Data System (ADS)
Lozos, George; Lin, Hai; Burch, Timothy
2012-06-01
Smiths Detection and Intelligent Optical Systems have developed prototypes of the Lightweight Autonomous Chemical Identification System (LACIS) for the US Department of Homeland Security. LACIS is to be a handheld detection system for Chemical Warfare Agents (CWAs) and Toxic Industrial Chemicals (TICs). LACIS is designed to have a low limit of detection and a rapid response time for use by emergency responders, and could allow determination of areas with dangerous concentration levels and of whether protective garments will be required. Procedures for protecting responders in hazardous materials incidents require the use of protective equipment until the hazard can be assessed; accurate analysis can therefore accelerate operations and increase effectiveness. LACIS is to be an improved point detector employing novel CBRNE detection modalities: a military-proven ruggedized ion mobility spectrometer (IMS) with an array of electro-resistive sensors to extend the range of chemical threats detected in a single device. It uses a novel sensor data fusion and threat classification architecture to interpret the independent sensor responses and provide robust detection at low levels in complex backgrounds with minimal false alarms. The performance of the LACIS prototypes has been characterized in independent third-party laboratory tests at the Battelle Memorial Institute (BMI, Columbus, OH) and in indoor and outdoor field tests at the Nevada National Security Site (NNSS). LACIS prototypes will enter operational assessment by key government emergency response groups to determine their capabilities versus requirements.
Sensor data monitoring and decision level fusion scheme for early fire detection
NASA Astrophysics Data System (ADS)
Rizogiannis, Constantinos; Thanos, Konstantinos Georgios; Astyakopoulos, Alkiviadis; Kyriazanos, Dimitris M.; Thomopoulos, Stelios C. A.
2017-05-01
The aim of this paper is to present the sensor monitoring and decision-level fusion scheme for early fire detection developed in the context of the AF3 (Advanced Forest Fire Fighting) European FP7 research project. The scheme was adopted in the OCULUS-Fire control and command system and tested during a firefighting field test in Greece with a prescribed real fire, generating early-warning detection alerts and notifications. To improve the reliability of the fire detection system, a two-level fusion scheme is developed that exploits a variety of observation sources from the air (e.g., UAV infrared cameras), the ground (e.g., meteorological and atmospheric sensors), and ancillary channels (e.g., public information channels, citizens' smartphone applications, and social media). In the first level, a change-point detection technique is applied to detect changes in the mean value of each parameter measured by the ground sensors, such as temperature, humidity, and CO2, and the rate of rise of each changed parameter is then calculated. In the second level, the fire-event Basic Probability Assignment (BPA) function is determined for each ground sensor using fuzzy-logic theory, and the corresponding mass values are combined in a decision-level fusion process using Evidential Reasoning theory to estimate the final fire-event probability.
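The second-level combination of per-sensor mass values can be sketched with Dempster's rule of combination. The sensor BPAs below are hypothetical illustrations; the paper's fuzzy-logic construction of the BPAs is not reproduced here.

```python
def combine_dempster(m1, m2):
    """Combine two basic probability assignments (BPAs) with Dempster's rule.
    Each BPA maps frozenset hypotheses to masses that sum to 1."""
    combined = {}
    conflict = 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:  # compatible evidence reinforces the intersection
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:      # incompatible evidence accumulates as conflict mass
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("total conflict; sources cannot be combined")
    # renormalize by the non-conflicting mass
    return {h: m / (1.0 - conflict) for h, m in combined.items()}

# Hypothetical frame of discernment {fire, no_fire} and two ground sensors
FIRE, NOFIRE = frozenset({"fire"}), frozenset({"no_fire"})
THETA = FIRE | NOFIRE                      # "don't know" hypothesis
temp_bpa = {FIRE: 0.6, NOFIRE: 0.1, THETA: 0.3}   # temperature sensor
co2_bpa  = {FIRE: 0.5, NOFIRE: 0.2, THETA: 0.3}   # CO2 sensor
fused = combine_dempster(temp_bpa, co2_bpa)
```

Two sensors that each lean weakly toward "fire" yield a fused mass on "fire" that is higher than either input, which is the reinforcement effect the decision-level fusion relies on.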
Information Fusion in Ad hoc Wireless Sensor Networks for Aircraft Health Monitoring
NASA Astrophysics Data System (ADS)
Fragoulis, Nikos; Tsagaris, Vassilis; Anastassopoulos, Vassilis
In this paper the use of an ad hoc wireless sensor network for implementing a structural health monitoring system is discussed. The network consists of sensors deployed throughout the aircraft. These sensors, each in the form of a microelectronic chip comprising sensing, data processing, and communication components, can easily be embedded in any mechanical aircraft component. The established sensor network, due to its ad hoc nature, is easily scalable, allowing any number of sensors to be added or removed. The positions of the sensor nodes need not be engineered or predetermined, which allows them to be deployed at inaccessible points. Information collected from various sensors of different modalities throughout the aircraft is then fused to provide a more comprehensive picture of the aircraft's structural health. Sensor-level fusion along with decision-quality information is used in order to enhance detection performance.
Günzel, Karsten; Cash, Hannes; Buckendahl, John; Königbauer, Maximilian; Asbach, Patrick; Haas, Matthias; Neymeyer, Jörg; Hinz, Stefan; Miller, Kurt; Kempkensteffen, Carsten
2017-01-13
To explore the diagnostic benefit of an additional image fusion of the sagittal plane in addition to the standard axial image fusion, using a sensor-based MRI/US fusion platform. Between July 2013 and September 2015, 251 patients with at least one suspicious lesion on mpMRI (rated by PI-RADS) were included in the analysis. All patients underwent MRI/US targeted biopsy (TB) in combination with a 10-core systematic prostate biopsy (SB). All biopsies were performed on a sensor-based fusion system. Group A included 162 men who received TB by an axial MRI/US image fusion. Group B comprised 89 men in whom the TB was performed with an additional sagittal image fusion. The median age in group A was 67 years (IQR 61-72) and in group B 68 years (IQR 60-71). The median PSA level in group A was 8.10 ng/ml (IQR 6.05-14) and in group B 8.59 ng/ml (IQR 5.65-12.32). In group A the proportion of patients with a suspicious digital rectal examination (DRE) (14 vs. 29%, p = 0.007) and the proportion of primary biopsies (33 vs. 46%, p = 0.046) were significantly lower. PI-RADS 3 lesions were overrepresented in group A compared to group B (19 vs. 9%; p = 0.044). Classified according to PI-RADS 3, 4 and 5, the detection rates of TB were 42, 48, 75% in group A and 25, 74, 90% in group B. The rate of PCa with a Gleason score ≥7 missed by TB was 33% (18 cases) in group A and 9% (5 cases) in group B; p-value 0.072. An explorative multivariate binary logistic regression analysis revealed that PI-RADS, a suspicious DRE and performing an additional sagittal image fusion were significant predictors for PCa detection in TB. Nine PCa were detected only by TB with sagittal fusion (sTB), and sTB identified 10 additional clinically significant PCa (Gleason ≥7). Performing an additional sagittal image fusion besides the standard axial fusion appears to improve the accuracy of the sensor-based MRI/US fusion platform.
Multi-Sensor Fusion with Interacting Multiple Model Filter for Improved Aircraft Position Accuracy
Cho, Taehwan; Lee, Changho; Choi, Sangbang
2013-01-01
The International Civil Aviation Organization (ICAO) has decided to adopt Communications, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) as the 21st century standard for navigation. Accordingly, ICAO members have provided an impetus to develop related technology and build sufficient infrastructure. For aviation surveillance with CNS/ATM, Ground-Based Augmentation System (GBAS), Automatic Dependent Surveillance-Broadcast (ADS-B), multilateration (MLAT) and wide-area multilateration (WAM) systems are being established. These sensors can track aircraft positions more accurately than existing radar and can compensate for the blind spots in aircraft surveillance. In this paper, we applied a novel sensor fusion method with Interacting Multiple Model (IMM) filter to GBAS, ADS-B, MLAT, and WAM data in order to improve the reliability of the aircraft position. Results of performance analysis show that the position accuracy is improved by the proposed sensor fusion method with the IMM filter. PMID:23535715
Multi-sensor fusion with interacting multiple model filter for improved aircraft position accuracy.
Cho, Taehwan; Lee, Changho; Choi, Sangbang
2013-03-27
The International Civil Aviation Organization (ICAO) has decided to adopt Communications, Navigation, and Surveillance/Air Traffic Management (CNS/ATM) as the 21st century standard for navigation. Accordingly, ICAO members have provided an impetus to develop related technology and build sufficient infrastructure. For aviation surveillance with CNS/ATM, Ground-Based Augmentation System (GBAS), Automatic Dependent Surveillance-Broadcast (ADS-B), multilateration (MLAT) and wide-area multilateration (WAM) systems are being established. These sensors can track aircraft positions more accurately than existing radar and can compensate for the blind spots in aircraft surveillance. In this paper, we applied a novel sensor fusion method with Interacting Multiple Model (IMM) filter to GBAS, ADS-B, MLAT, and WAM data in order to improve the reliability of the aircraft position. Results of performance analysis show that the position accuracy is improved by the proposed sensor fusion method with the IMM filter.
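The IMM filter in the paper switches among several motion models, which is more than a short sketch can show; the underlying benefit of fusing redundant surveillance reports can, however, be illustrated with static inverse-variance weighting. The sensor names and numbers below are hypothetical:

```python
def fuse_estimates(estimates):
    """Fuse independent scalar estimates (mean, variance) by
    inverse-variance weighting; returns the fused mean and variance."""
    info = sum(1.0 / var for _, var in estimates)          # total information
    var_fused = 1.0 / info
    mean_fused = var_fused * sum(mean / var for mean, var in estimates)
    return mean_fused, var_fused

# Hypothetical along-track position reports (metres, variance in m^2)
adsb, mlat, wam = (100.0, 25.0), (102.0, 9.0), (101.0, 16.0)
x, v = fuse_estimates([adsb, mlat, wam])
# the fused variance is smaller than that of the best single sensor
```

The fused position lies between the individual reports, weighted toward the most precise sensor, and its variance is strictly smaller than any single sensor's, which is the accuracy improvement the combined GBAS/ADS-B/MLAT/WAM processing is after.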
Tsinganos, Panagiotis; Skodras, Athanassios
2018-02-14
In the context of the ageing global population, researchers and scientists have tried to find solutions to many challenges faced by older people. Falls, the leading cause of injury among the elderly, are usually severe enough to require immediate medical attention; thus, their detection is of primary importance. To this effect, many fall detection systems that utilize wearable and ambient sensors have been proposed. In this study, we compare three newly proposed data fusion schemes that have been applied in human activity recognition and fall detection. Furthermore, these algorithms are compared to our recent work on fall detection in which only one type of sensor is used. The results show that the fusion algorithms differ in their performance, and that a machine learning strategy should be preferred. In conclusion, the methods presented and the comparison of their performance provide useful insights into the problem of fall detection.
All-IP-Ethernet architecture for real-time sensor-fusion processing
NASA Astrophysics Data System (ADS)
Hiraki, Kei; Inaba, Mary; Tezuka, Hiroshi; Tomari, Hisanobu; Koizumi, Kenichi; Kondo, Shuya
2016-03-01
Serendipter is a device that distinguishes and selects very rare particles and cells from a huge population. We are currently designing and constructing the information processing system for a Serendipter. This system is a kind of sensor-fusion system, but with much greater difficulties. To fulfill these requirements, we adopt an all-IP based architecture: the all-IP-Ethernet based data processing system consists of (1) sensors/detectors that directly output their data as an IP-Ethernet packet stream, (2) merging into single Ethernet/TCP/IP streams by an L2 100 Gbps Ethernet switch, and (3) an FPGA board with a 100 Gbps Ethernet interface connected to the switch and a Xeon based server. Circuits in the FPGA include a 100 Gbps Ethernet MAC, buffers, preprocessing, and real-time deep learning circuits using multi-layer neural networks. The proposed all-IP architecture solves existing problems in constructing large-scale sensor-fusion systems.
Joint FACET: the Canada-Netherlands initiative to study multisensor data fusion systems
NASA Astrophysics Data System (ADS)
Bosse, Eloi; Theil, Arne; Roy, Jean; Huizing, Albert G.; van Aartsen, Simon
1998-09-01
This paper presents the progress of a collaborative effort between Canada and The Netherlands in analyzing multi-sensor data fusion systems, e.g. for potential application to their respective frigates. In view of their overlapping interest in studying and comparing the applicability and performance of advanced state-of-the-art Multi-Sensor Data Fusion (MSDF) techniques, the two research establishments involved have decided to join their efforts in the development of MSDF testbeds. This resulted in the so-called Joint FACET, a highly modular and flexible series of applications that is capable of processing both real and synthetic input data. Joint FACET allows the user to create and edit test scenarios with multiple ships, sensors, and targets, to generate realistic sensor outputs, and to process these outputs with a variety of MSDF algorithms. These MSDF algorithms can also be tested using typical experimental data collected during live military exercises.
Mohanasundaram, Ranganathan; Periasamy, Pappampalayam Sanmugam
2015-01-01
The growth of data storage has become a high-profile, strategic concern in the world of networking. Data storage in wireless sensor networks involves the sensor nodes called producers, the base stations, and the consumers (users and sensor nodes) that retrieve and use the data. The main concern addressed here is finding optimal data storage positions in wireless sensor networks. Earlier work did not utilize swarm-intelligence-based optimization approaches to find the optimal data storage positions. To achieve this goal, an efficient swarm intelligence approach is used to choose suitable positions for a storage node. Thus, a hybrid particle swarm optimization algorithm is used to find suitable positions for storage nodes while minimizing the total energy cost of data transmission. Clustering-based distributed data storage is utilized, solving the clustering problem with the fuzzy C-means algorithm. This work also considers the data rates and locations of multiple producers and consumers to find optimal data storage positions. The algorithm is implemented in a network simulator, and the experimental results show that the proposed clustering and swarm intelligence based ODS strategy is more effective than earlier approaches.
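The hybrid PSO used by the authors is not specified in the abstract; the sketch below shows a plain particle swarm search for a single storage-node position minimizing a rate-weighted distance cost. The node coordinates, rates, and PSO coefficients are invented for illustration:

```python
import random

def pso_storage_position(points, rates, iters=100, n=30):
    """Minimal particle swarm search for a 2-D storage-node position that
    minimizes the rate-weighted sum of distances to producers/consumers."""
    def cost(x, y):
        return sum(r * ((x - px) ** 2 + (y - py) ** 2) ** 0.5
                   for (px, py), r in zip(points, rates))
    random.seed(1)
    pos = [[random.uniform(0, 100), random.uniform(0, 100)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in pos]                       # per-particle best
    gbest = min(pbest, key=lambda p: cost(*p))[:]     # swarm best
    for _ in range(iters):
        for i in range(n):
            for d in range(2):
                r1, r2 = random.random(), random.random()
                # inertia + cognitive pull + social pull
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if cost(*pos[i]) < cost(*pbest[i]):
                pbest[i] = pos[i][:]
                if cost(*pos[i]) < cost(*gbest):
                    gbest = pos[i][:]
    return gbest, cost(*gbest)

nodes = [(0, 0), (100, 0), (50, 90)]   # hypothetical producer/consumer sites
rates = [1.0, 1.0, 1.0]                # equal data rates
best_pos, best_cost = pso_storage_position(nodes, rates)
```

With equal rates this converges toward the geometric median of the three sites; unequal rates pull the storage position toward the heavier traffic sources, which mirrors how the total energy cost of data transmission is minimized.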
Data Fusion for a Vision-Radiological System: a Statistical Calibration Algorithm
DOE Office of Scientific and Technical Information (OSTI.GOV)
Enqvist, Andreas; Koppal, Sanjeev; Riley, Phillip
2015-07-01
Presented here is a fusion system based on simple, low-cost computer vision and radiological sensors for tracking multiple objects and identifying potential radiological materials being transported or shipped. The main focus of this work is the development of calibration algorithms for characterizing the fused sensor system as a single entity. There is an apparent need to correct for scene deviation from the basic inverse distance-squared law governing the detection rates, even when evaluating system calibration algorithms. In particular, the computer vision system enables a map of the distance-dependence of the sources being tracked, into which the time-dependent radiological data can be incorporated by means of data fusion of the two sensors' output data. (authors)
A Novel Strain-Based Method to Estimate Tire Conditions Using Fuzzy Logic for Intelligent Tires.
Garcia-Pozuelo, Daniel; Olatunbosun, Oluremi; Yunta, Jorge; Yang, Xiaoguang; Diaz, Vicente
2017-02-10
The so-called intelligent tires are one of the most promising research fields for automotive engineers. These tires are equipped with sensors which provide information about vehicle dynamics. Up to now, the commercial intelligent tires only provide information about inflation pressure and their contribution to stability control systems is currently very limited. Nowadays one of the major problems for intelligent tire development is how to embed feasible and low cost sensors to obtain reliable information such as inflation pressure, vertical load or rolling speed. These parameters provide key information for vehicle dynamics characterization. In this paper, we propose a novel algorithm based on fuzzy logic to estimate the mentioned parameters by means of a single strain-based system. Experimental tests have been carried out in order to prove the suitability and durability of the proposed on-board strain sensor system, as well as its low cost advantages, and the accuracy of the obtained estimations by means of fuzzy logic.
A Wireless and Batteryless Intelligent Carbon Monoxide Sensor.
Chen, Chen-Chia; Sung, Gang-Neng; Chen, Wen-Ching; Kuo, Chih-Ting; Chue, Jin-Ju; Wu, Chieh-Ming; Huang, Chun-Ming
2016-09-23
Carbon monoxide (CO) poisoning from natural gas water heaters is a common household accident in Taiwan. We propose a wireless and batteryless intelligent CO sensor for improving the safety of operating natural gas water heaters. A micro-hydropower generator supplies power to a CO sensor without battery (COSWOB) (2.5 W at a flow rate of 4.2 L/min), and the power consumption of the COSWOB is only ~13 mW. The COSWOB monitors the CO concentration in ambient conditions around natural gas water heaters and transmits it to an intelligent gateway. When the CO level reaches a dangerous level, the COSWOB alarm sounds loudly. Meanwhile, the intelligent gateway also sends a trigger to activate Wi-Fi alarms and sends notifications to the mobile device through the Internet. Our strategy can warn people indoors and outdoors, thereby reducing CO poisoning accidents. We also believe that our technique not only can be used for home security but also can be used in industrial applications (for example, to monitor leak occurrence in a pipeline).
A Wireless and Batteryless Intelligent Carbon Monoxide Sensor
Chen, Chen-Chia; Sung, Gang-Neng; Chen, Wen-Ching; Kuo, Chih-Ting; Chue, Jin-Ju; Wu, Chieh-Ming; Huang, Chun-Ming
2016-01-01
Carbon monoxide (CO) poisoning from natural gas water heaters is a common household accident in Taiwan. We propose a wireless and batteryless intelligent CO sensor for improving the safety of operating natural gas water heaters. A micro-hydropower generator supplies power to a CO sensor without battery (COSWOB) (2.5 W at a flow rate of 4.2 L/min), and the power consumption of the COSWOB is only ~13 mW. The COSWOB monitors the CO concentration in ambient conditions around natural gas water heaters and transmits it to an intelligent gateway. When the CO level reaches a dangerous level, the COSWOB alarm sounds loudly. Meanwhile, the intelligent gateway also sends a trigger to activate Wi-Fi alarms and sends notifications to the mobile device through the Internet. Our strategy can warn people indoors and outdoors, thereby reducing CO poisoning accidents. We also believe that our technique not only can be used for home security but also can be used in industrial applications (for example, to monitor leak occurrence in a pipeline). PMID:27669255
A Novel Strain-Based Method to Estimate Tire Conditions Using Fuzzy Logic for Intelligent Tires
Garcia-Pozuelo, Daniel; Olatunbosun, Oluremi; Yunta, Jorge; Yang, Xiaoguang; Diaz, Vicente
2017-01-01
The so-called intelligent tires are one of the most promising research fields for automotive engineers. These tires are equipped with sensors which provide information about vehicle dynamics. Up to now, the commercial intelligent tires only provide information about inflation pressure and their contribution to stability control systems is currently very limited. Nowadays one of the major problems for intelligent tire development is how to embed feasible and low cost sensors to obtain reliable information such as inflation pressure, vertical load or rolling speed. These parameters provide key information for vehicle dynamics characterization. In this paper, we propose a novel algorithm based on fuzzy logic to estimate the mentioned parameters by means of a single strain-based system. Experimental tests have been carried out in order to prove the suitability and durability of the proposed on-board strain sensor system, as well as its low cost advantages, and the accuracy of the obtained estimations by means of fuzzy logic. PMID:28208631
Data fusion algorithm for rapid multi-mode dust concentration measurement system based on MEMS
NASA Astrophysics Data System (ADS)
Liao, Maohao; Lou, Wenzhong; Wang, Jinkui; Zhang, Yan
2018-03-01
As a single measurement method cannot fully meet the technical requirements of dust concentration measurement, a multi-mode detection method is put forward, which imposes new requirements on data processing. This paper presents a new dust concentration measurement system containing a MEMS ultrasonic sensor and a MEMS capacitance sensor, and presents a new data fusion algorithm for this multi-mode dust concentration measurement system. After analyzing the relation between the data of the composite measurement methods, a data fusion algorithm based on Kalman filtering is established, which effectively improves the measurement accuracy and ultimately forms a rapid data fusion model for dust concentration measurement. Test results show that the data fusion algorithm is able to realize rapid and accurate concentration detection.
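As a rough illustration of Kalman-filter fusion (not the authors' actual filter design), the sketch below treats the two MEMS readings as two measurement updates of a single concentration state with a random-walk process model; all noise values are assumed:

```python
def kalman_fuse(z_ultra, z_cap, r_ultra, r_cap, q=0.01):
    """1-D Kalman filter treating the ultrasonic and capacitance readings
    as two measurement updates of one dust-concentration state."""
    x, p = z_ultra[0], 1.0           # initialize from the first reading
    out = []
    for zu, zc in zip(z_ultra, z_cap):
        p += q                       # predict: random-walk process noise
        for z, r in ((zu, r_ultra), (zc, r_cap)):
            k = p / (p + r)          # Kalman gain for this sensor
            x += k * (z - x)         # blend the reading into the state
            p *= (1 - k)             # shrink the state uncertainty
        out.append(x)
    return out

# Hypothetical noisy readings around a true concentration of 50 mg/m^3
ultra = [51.2, 49.0, 50.8, 48.9, 50.3]
cap   = [48.5, 50.9, 49.4, 51.1, 49.8]
fused = kalman_fuse(ultra, cap, r_ultra=4.0, r_cap=4.0)
```

The fused trace stays closer to the true value than either raw sequence because each update discounts a reading in proportion to its assumed noise variance.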
Data fusion of multiple kinect sensors for a rehabilitation system.
Huibin Du; Yiwen Zhao; Jianda Han; Zheng Wang; Guoli Song
2016-08-01
Kinect-like depth sensors have been widely used in rehabilitation systems. However, a single depth sensor handles limb blocking, data loss, and data errors poorly, making it less reliable. This paper focuses on using two Kinect sensors and a data fusion method to solve these problems. First, the two Kinect sensors capture the motion data of the healthy arm of the hemiplegic patient; second, the data are merged using the Set-Membership Filter (SMF) method; then, this motion data is mirrored across the middle plane; finally, the wearable robotic arm is controlled to drive the patient's paralytic arm so that the patient can interactively and proactively complete a variety of recovery actions prompted by computer 3D animation games.
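The middle-plane mirroring step can be sketched as a point reflection across a plane; the plane location and joint coordinates below are hypothetical stand-ins for the patient's mid-sagittal plane and tracked skeleton points:

```python
def mirror_point(p, plane_point, plane_normal):
    """Reflect a 3-D joint position across a plane (e.g. the patient's
    mid-sagittal plane) so healthy-arm motion can drive the paralytic arm."""
    # normalize the plane normal
    nx, ny, nz = plane_normal
    norm = (nx * nx + ny * ny + nz * nz) ** 0.5
    nx, ny, nz = nx / norm, ny / norm, nz / norm
    # signed distance from the point to the plane
    d = ((p[0] - plane_point[0]) * nx
         + (p[1] - plane_point[1]) * ny
         + (p[2] - plane_point[2]) * nz)
    # move the point twice that distance back through the plane
    return (p[0] - 2 * d * nx, p[1] - 2 * d * ny, p[2] - 2 * d * nz)

# Middle plane x = 0 with normal along x: a left-hand joint maps to the right
left_wrist = (0.30, 0.10, 0.50)
right_wrist = mirror_point(left_wrist, plane_point=(0, 0, 0),
                           plane_normal=(1, 0, 0))
```

Applying this to every fused joint of the healthy arm produces the mirrored trajectory that the wearable robotic arm then tracks.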
Knowledge Flow Mesh and Its Dynamics: A Decision Support Environment
2008-06-01
paper was the ability of the United States military to achieve dominance through information superiority. The use of intelligent sensors and... Intelligence Agency, National Security Agency, Defense Intelligence Agency, and individual Service intelligence agencies). In fact, these edge entities would... intelligence, design, choice, and implementation. 6. Support variety of decision processes and styles. 7. DSS should be adaptable and flexible. 8. DSS
NASA Astrophysics Data System (ADS)
Emmerman, Philip J.
2005-05-01
Teams of robots or mixed teams of warfighters and robots on reconnaissance and other missions can benefit greatly from a local fusion station. A local fusion station is defined here as a small mobile processor with interfaces to enable the ingestion of multiple heterogeneous sensor data and information streams, including blue force tracking data. These data streams are fused and integrated with contextual information (terrain features, weather, maps, dynamic background features, etc.), and displayed or processed to provide real time situational awareness to the robot controller or to the robots themselves. These blue and red force fusion applications remove redundancies, lessen ambiguities, correlate, aggregate, and integrate sensor information with context such as high resolution terrain. Applications such as safety, team behavior, asset control, training, pattern analysis, etc. can be generated or enhanced by these fusion stations. This local fusion station should also enable the interaction between these local units and a global information world.
An Intelligent Surveillance Platform for Large Metropolitan Areas with Dense Sensor Deployment
Fernández, Jorge; Calavia, Lorena; Baladrón, Carlos; Aguiar, Javier M.; Carro, Belén; Sánchez-Esguevillas, Antonio; Alonso-López, Jesus A.; Smilansky, Zeev
2013-01-01
This paper presents an intelligent surveillance platform based on the usage of large numbers of inexpensive sensors designed and developed inside the European Eureka Celtic project HuSIMS. With the aim of maximizing the number of deployable units while keeping monetary and resource/bandwidth costs at a minimum, the surveillance platform is based on the usage of inexpensive visual sensors which apply efficient motion detection and tracking algorithms to transform the video signal in a set of motion parameters. In order to automate the analysis of the myriad of data streams generated by the visual sensors, the platform's control center includes an alarm detection engine which comprises three components applying three different Artificial Intelligence strategies in parallel. These strategies are generic, domain-independent approaches which are able to operate in several domains (traffic surveillance, vandalism prevention, perimeter security, etc.). The architecture is completed with a versatile communication network which facilitates data collection from the visual sensors and alarm and video stream distribution towards the emergency teams. The resulting surveillance system is extremely suitable for its deployment in metropolitan areas, smart cities, and large facilities, mainly because cheap visual sensors and autonomous alarm detection facilitate dense sensor network deployments for wide and detailed coverage. PMID:23748169
Using multiple sensors for printed circuit board insertion
NASA Technical Reports Server (NTRS)
Sood, Deepak; Repko, Michael C.; Kelley, Robert B.
1989-01-01
As more and more activities are performed in space, there will be a greater demand placed on the information handling capacity of people who are to direct and accomplish these tasks. A promising alternative to full-time human involvement is the use of semi-autonomous, intelligent robot systems. To automate tasks such as assembly, disassembly, repair and maintenance, the issues presented by environmental uncertainties need to be addressed. These uncertainties are introduced by variations in the computed position of the robot at different locations in its work envelope, variations in part positioning, and tolerances of part dimensions. As a result, the robot system may not be able to accomplish the desired task without the help of sensor feedback. Measurements on the environment allow real time corrections to be made to the process. A design and implementation of an intelligent robot system which inserts printed circuit boards into a card cage are presented. Intelligent behavior is accomplished by coupling the task execution sequence with information derived from three different sensors: an overhead three-dimensional vision system, a fingertip infrared sensor, and a six degree of freedom wrist-mounted force/torque sensor.
Study on robot motion control for intelligent welding processes based on the laser tracking sensor
NASA Astrophysics Data System (ADS)
Zhang, Bin; Wang, Qian; Tang, Chen; Wang, Ju
2017-06-01
A robot motion control method is presented for intelligent welding of complex spatial free-form curve seams based on a laser tracking sensor. First, the tip position of the welding torch is calculated according to the velocity of the torch and the seam trajectory detected by the sensor. Then, the optimal pose of the torch is searched for under constraints using genetic algorithms, so that the intersection point of the weld seam and the laser plane of the sensor remains within the detectable range of the sensor, while the angle between the axis of the welding torch and the tangent of the weld seam meets the requirements. The feasibility of the control method is proved by simulation.
Intelligence, Surveillance, and Reconnaissance Fusion for Coalition Operations
2008-07-01
classification of the targets of interest. The MMI features extracted in this manner have two properties that provide a sound justification for...are generalizations of well-known feature extraction methods such as Principal Components Analysis (PCA) and Independent Component Analysis (ICA)...augment (without degrading performance) a large class of generic fusion processes. Ontologies Classifications Feature extraction Feature analysis
Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device
He, Xiang; Aloi, Daniel N.; Li, Jia
2015-01-01
Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design. PMID:26694387
Probabilistic Multi-Sensor Fusion Based Indoor Positioning System on a Mobile Device.
He, Xiang; Aloi, Daniel N; Li, Jia
2015-12-14
Nowadays, smart mobile devices include more and more sensors on board, such as motion sensors (accelerometer, gyroscope, magnetometer), wireless signal strength indicators (WiFi, Bluetooth, Zigbee), and visual sensors (LiDAR, camera). People have developed various indoor positioning techniques based on these sensors. In this paper, the probabilistic fusion of multiple sensors is investigated in a hidden Markov model (HMM) framework for mobile-device user-positioning. We propose a graph structure to store the model constructed by multiple sensors during the offline training phase, and a multimodal particle filter to seamlessly fuse the information during the online tracking phase. Based on our algorithm, we develop an indoor positioning system on the iOS platform. The experiments carried out in a typical indoor environment have shown promising results for our proposed algorithm and system design.
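The online tracking phase rests on a particle filter; a minimal 1-D sketch of the predict/update/resample cycle is given below. The motion step, noise levels, and measurement values are hypothetical stand-ins for the paper's multimodal sensor likelihoods:

```python
import math
import random

def pf_step(particles, weights, move, z, sigma=0.5):
    """One predict/update/resample cycle of a 1-D particle filter."""
    # predict: propagate each particle through a noisy motion model
    particles = [p + move + random.gauss(0.0, 0.1) for p in particles]
    # update: reweight by a Gaussian measurement likelihood
    weights = [w * math.exp(-((z - p) ** 2) / (2 * sigma ** 2))
               for w, p in zip(weights, particles)]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # resample: draw a fresh, equally weighted particle set
    particles = random.choices(particles, weights=weights, k=len(particles))
    return particles, [1.0 / len(particles)] * len(particles)

random.seed(0)
n = 500
particles = [random.uniform(0.0, 10.0) for _ in range(n)]  # uniform prior
weights = [1.0 / n] * n
for z in (2.1, 3.0, 4.2, 5.1):        # simulated position fixes (metres)
    particles, weights = pf_step(particles, weights, move=1.0, z=z)
estimate = sum(particles) / n          # posterior mean position
```

In the paper's setting, the single Gaussian likelihood would be replaced by a product of per-sensor likelihoods (motion sensors, signal strength, visual), which is where the "multimodal" fusion enters.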
Joint sparsity based heterogeneous data-level fusion for target detection and estimation
NASA Astrophysics Data System (ADS)
Niu, Ruixin; Zulch, Peter; Distasio, Marcello; Blasch, Erik; Shen, Dan; Chen, Genshe
2017-05-01
Typical surveillance systems employ decision- or feature-level fusion approaches to integrate heterogeneous sensor data, which are sub-optimal and incur information loss. In this paper, we investigate data-level heterogeneous sensor fusion. Since the sensors monitor the common targets of interest, whose states can be determined by only a few parameters, it is reasonable to assume that the measurement domain has a low intrinsic dimensionality. For heterogeneous sensor data, we develop a joint-sparse data-level fusion (JSDLF) approach based on the emerging joint sparse signal recovery techniques by discretizing the target state space. This approach is applied to fuse signals from multiple distributed radio frequency (RF) signal sensors and a video camera for joint target detection and state estimation. The JSDLF approach is data-driven and requires minimum prior information, since there is no need to know the time-varying RF signal amplitudes, or the image intensity of the targets. It can handle non-linearity in the sensor data due to state space discretization and the use of frequency/pixel selection matrices. Furthermore, for a multi-target case with J targets, the JSDLF approach only requires discretization in a single-target state space, instead of discretization in a J-target state space, as in the case of the generalized likelihood ratio test (GLRT) or the maximum likelihood estimator (MLE). Numerical examples are provided to demonstrate that the proposed JSDLF approach achieves excellent performance with near real-time accurate target position and velocity estimates.
Improved chemical identification from sensor arrays using intelligent algorithms
NASA Astrophysics Data System (ADS)
Roppel, Thaddeus A.; Wilson, Denise M.
2001-02-01
Intelligent signal processing algorithms are shown to improve identification rates significantly in chemical sensor arrays. This paper focuses on the use of independently derived sensor status information to modify the processing of sensor array data by using a fast, easily implemented "best-match" approach to filling in missing sensor data. Most fault conditions of interest (e.g., stuck high, stuck low, sudden jumps, excess noise, etc.) can be detected relatively simply by adjunct data processing, or by on-board circuitry. The objective then is to devise, implement, and test methods for using this information to improve the identification rates in the presence of faulted sensors. In one typical example studied, utilizing separately derived a priori knowledge about the health of the sensors in the array improved the chemical identification rate by an artificial neural network from below 10 percent correct to over 99 percent correct. While this study focuses experimentally on chemical sensor arrays, the results are readily extensible to other types of sensor platforms.
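A minimal sketch of the "best-match" fill-in idea, assuming the fault locations are already known from the adjunct diagnostics; the library patterns and readings are invented for illustration:

```python
def best_match_fill(reading, faulted, library):
    """Replace faulted channels with values from the library pattern whose
    healthy channels are closest (squared Euclidean) to the current reading."""
    healthy = [i for i in range(len(reading)) if i not in faulted]
    def dist(pattern):
        return sum((reading[i] - pattern[i]) ** 2 for i in healthy)
    best = min(library, key=dist)       # closest known response pattern
    return [best[i] if i in faulted else reading[i]
            for i in range(len(reading))]

# Hypothetical 4-element array; channel 2 is stuck low
library = [[0.9, 0.1, 0.8, 0.2],   # training pattern: analyte A
           [0.2, 0.7, 0.1, 0.9]]   # training pattern: analyte B
reading = [0.85, 0.15, 0.0, 0.25]
repaired = best_match_fill(reading, faulted={2}, library=library)
```

The repaired vector can then be passed to the downstream classifier (e.g. the neural network) as if all sensors were healthy, which is what recovers the identification rate.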
Hsu, Ling-Yuan; Chen, Tsung-Lin
2012-11-13
This paper presents a vehicle dynamics prediction system, which consists of a sensor fusion system and a vehicle parameter identification system. The sensor fusion system can obtain the six degree-of-freedom vehicle dynamics and two road angles without using a vehicle model. The vehicle parameter identification system uses the vehicle dynamics from the sensor fusion system to identify ten vehicle parameters in real time, including vehicle mass, moment of inertia, and road friction coefficients. With the above two systems, the future vehicle dynamics is predicted by using a vehicle dynamics model, obtained from the parameter identification system, to propagate with time the current vehicle state values, obtained from the sensor fusion system. Compared with most existing literature in this field, the proposed approach improves the prediction accuracy both by incorporating more vehicle dynamics into the prediction system and by on-line identification to minimize the vehicle modeling errors. Simulation results show that the proposed method successfully predicts the vehicle dynamics in a left-hand turn event and a rollover event. The prediction inaccuracy is 0.51% in the left-hand turn event and 27.3% in the rollover event.
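On-line identification of a single vehicle parameter can be sketched with scalar recursive least squares (RLS); identifying mass from force/acceleration pairs stands in for the ten-parameter scheme, whose internals the abstract does not give. All numbers below are invented.

```python
def rls_scalar(phis, ys, lam=0.99):
    """Scalar recursive least squares for y = theta * phi + noise,
    with forgetting factor lam to track slowly varying parameters."""
    theta, P = 0.0, 1e6     # initial estimate and (large) covariance
    for phi, y in zip(phis, ys):
        k = P * phi / (lam + phi * P * phi)   # gain
        theta += k * (y - phi * theta)        # innovation update
        P = (P - k * phi * P) / lam
    return theta

# identify a hypothetical 1500 kg vehicle mass from F = m * a samples
accels = [1.0, 2.0, 0.5, 1.5, 1.2]
forces = [1500.0 * a for a in accels]
mass_hat = rls_scalar(accels, forces)
```

The forgetting factor lets the estimate follow parameters such as the friction coefficient as they drift during a maneuver.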
AGSM Intelligent Devices/Smart Sensors Project
NASA Technical Reports Server (NTRS)
Harp, Janicce Leshay
2014-01-01
This project provides development and qualification of Smart Sensors capable of self-diagnosis and assessment of their capability/readiness to support operations. These sensors will provide pressure and temperature measurements to use in ground systems.
Rizvi, Sanam Shahla; Chung, Tae-Sun
2010-01-01
Flash memory has become a widespread storage medium for modern wireless devices because of its effective characteristics like non-volatility, small size, light weight, fast access speed, shock resistance, high reliability and low power consumption. Sensor nodes are highly resource constrained in terms of limited processing speed, runtime memory, persistent storage, communication bandwidth and finite energy. Therefore, for wireless sensor networks supporting sense, store, merge and send schemes, an efficient and reliable file system that takes the sensor node constraints into account is highly desirable. In this paper, we propose a novel log-structured external NAND flash memory based file system, called Proceeding to Intelligent service oriented memorY Allocation for flash based data centric Sensor devices in wireless sensor networks (PIYAS). This is the extended version of our previously proposed PIYA [1]. The main goals of the PIYAS scheme are to achieve instant mounting and reduced SRAM usage by keeping the memory mapping information very small, and to provide high query response throughput by allocating memory to sensor data according to network business rules. The scheme intelligently samples and stores the raw data and provides high in-network data availability by keeping the aggregate data for a longer period of time than any other scheme has done before. We propose effective garbage collection and wear-leveling schemes as well. The experimental results show that PIYAS is an optimized memory management scheme allowing high performance for wireless sensor networks.
An Autonomous Sensor System Architecture for Active Flow and Noise Control Feedback
NASA Technical Reports Server (NTRS)
Humphreys, William M, Jr.; Culliton, William G.
2008-01-01
Multi-channel sensor fusion represents a powerful technique to simply and efficiently extract information from complex phenomena. While the technique has traditionally been used for military target tracking and situational awareness, a study has been successfully completed that demonstrates that sensor fusion can be applied equally well to aerodynamic applications. A prototype autonomous hardware processor was successfully designed and used to detect in real-time the two-dimensional flow reattachment location generated by a simple separated-flow wind tunnel model. The success of this demonstration illustrates the feasibility of using autonomous sensor processing architectures to enhance flow control feedback signal generation.
Embedded Relative Navigation Sensor Fusion Algorithms for Autonomous Rendezvous and Docking Missions
NASA Technical Reports Server (NTRS)
DeKock, Brandon K.; Betts, Kevin M.; McDuffie, James H.; Dreas, Christine B.
2008-01-01
bd Systems (a subsidiary of SAIC) has developed a suite of embedded relative navigation sensor fusion algorithms to enable NASA autonomous rendezvous and docking (AR&D) missions. Translational and rotational Extended Kalman Filters (EKFs) were developed to integrate measurements based on the vehicles' orbital mechanics and high-fidelity sensor error models, providing a solution with increased accuracy and robustness relative to any single relative navigation sensor. The filters were tested through stand-alone covariance analysis, closed-loop testing with a high-fidelity multi-body orbital simulation, and hardware-in-the-loop (HWIL) testing in the Marshall Space Flight Center (MSFC) Flight Robotics Laboratory (FRL).
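The measurement-integration idea can be sketched with a 1-D constant-velocity Kalman filter that sequentially updates with two range sensors of differing noise. The real filters are EKFs over orbital mechanics with high-fidelity error models; this linear toy with invented noise values only illustrates how multiple sensors tighten a single state estimate.

```python
def kf_fuse(z1, z2, r1, r2, dt=1.0, q=0.01):
    """1-D constant-velocity Kalman filter fusing two range sensors
    by sequential measurement updates (H = [1, 0])."""
    x = [z1[0], 0.0]                      # position, velocity
    P = [[1.0, 0.0], [0.0, 1.0]]
    est = []
    for m1, m2 in zip(z1, z2):
        # predict through constant-velocity dynamics
        x = [x[0] + dt * x[1], x[1]]
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1], P[1][1] + q]]
        # one update per sensor, each with its own noise variance
        for z, r in ((m1, r1), (m2, r2)):
            s = P[0][0] + r
            k0, k1 = P[0][0] / s, P[1][0] / s
            innov = z - x[0]
            x = [x[0] + k0 * innov, x[1] + k1 * innov]
            P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
                 [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        est.append(x[0])
    return est
```

With clean measurements the fused position estimate locks onto the true trajectory within a few steps.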
Application of data fusion techniques and technologies for wearable health monitoring.
King, Rachel C; Villeneuve, Emma; White, Ruth J; Sherratt, R Simon; Holderbaum, William; Harwin, William S
2017-04-01
Technological advances in sensors and communications have enabled discrete integration into everyday objects, both in the home and about the person. Information gathered by monitoring physiological, behavioural, and social aspects of our lives, can be used to achieve a positive impact on quality of life, health, and well-being. Wearable sensors are at the cusp of becoming truly pervasive, and could be woven into the clothes and accessories that we wear such that they become ubiquitous and transparent. To interpret the complex multidimensional information provided by these sensors, data fusion techniques are employed to provide a meaningful representation of the sensor outputs. This paper is intended to provide a short overview of data fusion techniques and algorithms that can be used to interpret wearable sensor data in the context of health monitoring applications. The application of these techniques are then described in the context of healthcare including activity and ambulatory monitoring, gait analysis, fall detection, and biometric monitoring. A snap-shot of current commercially available sensors is also provided, focusing on their sensing capability, and a commentary on the gaps that need to be bridged to bring research to market. Copyright © 2017. Published by Elsevier Ltd.
The role of data fusion in predictive maintenance using digital twin
NASA Astrophysics Data System (ADS)
Liu, Zheng; Meyendorf, Norbert; Mrad, Nezih
2018-04-01
Modern aerospace industry is migrating from reactive to proactive and predictive maintenance to increase platform operational availability and efficiency, extend its useful life cycle and reduce its life cycle cost. Multiphysics modeling together with data-driven analytics generate a new paradigm called "Digital Twin." The digital twin is actually a living model of the physical asset or system, which continually adapts to operational changes based on the collected online data and information, and can forecast the future of the corresponding physical counterpart. This paper reviews the overall framework to develop a digital twin coupled with the industrial Internet of Things technology to advance aerospace platforms autonomy. Data fusion techniques particularly play a significant role in the digital twin framework. The flow of information from raw data to high-level decision making is propelled by sensor-to-sensor, sensor-to-model, and model-to-model fusion. This paper further discusses and identifies the role of data fusion in the digital twin framework for aircraft predictive maintenance.
Fusion solution for soldier wearable gunfire detection systems
NASA Astrophysics Data System (ADS)
Cakiades, George; Desai, Sachi; Deligeorges, Socrates; Buckland, Bruce E.; George, Jemin
2012-06-01
Currently existing acoustic based Gunfire Detection Systems (GDS) such as soldier wearable, vehicle mounted, and fixed site devices provide enemy detection and localization capabilities to the user. However, the solution to the problem of portability versus performance tradeoff remains elusive. The Data Fusion Module (DFM), described herein, is a sensor/platform agnostic software supplemental tool that addresses this tradeoff problem by leveraging existing soldier networks to enhance GDS performance across a Tactical Combat Unit (TCU). The DFM software enhances performance by leveraging all available acoustic GDS information across the TCU synergistically to calculate highly accurate solutions more consistently than any individual GDS in the TCU. The networked sensor architecture provides additional capabilities addressing the multiple shooter/fire-fight problems in addition to sniper detection/localization. The addition of the fusion solution to the overall Size, Weight and Power & Cost (SWaP&C) is zero to negligible. At the end of the first-year effort, the DFM integrated sensor network's performance was impressive showing improvements upwards of 50% in comparison to a single sensor solution. Further improvements are expected when the networked sensor architecture created in this effort is fully exploited.
Integrated intelligent sensor for the textile industry
NASA Astrophysics Data System (ADS)
Peltie, Philippe; David, Dominique
1996-08-01
A new sensor has been developed for pantyhose inspection. Unlike a first complete inspection machine devoted to post-manufacturing control of the whole panty, this sensor will be directly integrated on currently existing manufacturing machines and will combine the advantages of miniaturization. The aim is to design an intelligent, compact and very cheap product, which should be integrated without requiring any modifications of host machines. The sensor part was designed to achieve close-range acquisition, and various solutions have been explored to maintain adequate depth of field. The illumination source will be integrated in the device. The processing part will include correction facilities and electronic processing. Finally, high-level information will be output in order to interface directly with the manufacturing machine's controller.
Performance Evaluation of Fusing Protected Fingerprint Minutiae Templates on the Decision Level
Yang, Bian; Busch, Christoph; de Groot, Koen; Xu, Haiyun; Veldhuis, Raymond N. J.
2012-01-01
In a biometric authentication system using protected templates, a pseudonymous identifier is the part of a protected template that can be directly compared. Each compared pair of pseudonymous identifiers results in a decision testing whether both identifiers are derived from the same biometric characteristic. Compared to an unprotected system, most existing biometric template protection methods cause a certain degree of degradation in biometric performance. Fusion is therefore a promising way to enhance the biometric performance in template-protected biometric systems. Compared to feature level fusion and score level fusion, decision level fusion has not only the least fusion complexity, but also the maximum interoperability across different biometric features, template protection and recognition algorithms, template formats, and comparison score rules. However, performance improvement via decision level fusion is not obvious. It is influenced by both the dependency and the performance gap among the tests conducted for fusion. We investigate in this paper several fusion scenarios (multi-sample, multi-instance, multi-sensor, multi-algorithm, and their combinations) on the binary decision level, and evaluate their biometric performance and fusion efficiency on a multi-sensor fingerprint database with 71,994 samples. PMID:22778583
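Binary decision-level fusion of the kind evaluated above can be sketched directly: each comparator emits an accept/reject bit, and the bits are combined by a rule. AND, OR, and majority rules are standard; the comparator outputs below are invented.

```python
def fuse_decisions(decisions, rule="majority"):
    """Fuse binary accept(1)/reject(0) decisions from several
    comparators on the decision level."""
    n, s = len(decisions), sum(decisions)
    if rule == "and":
        return int(s == n)   # strict: every comparator must accept
    if rule == "or":
        return int(s > 0)    # lenient: any single acceptance suffices
    return int(2 * s > n)    # majority vote
```

The AND rule trades false accepts for false rejects, the OR rule the opposite; the dependency between comparators governs how much either rule actually helps.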
Mass classification in mammography with multi-agent based fusion of human and machine intelligence
NASA Astrophysics Data System (ADS)
Xi, Dongdong; Fan, Ming; Li, Lihua; Zhang, Juan; Shan, Yanna; Dai, Gang; Zheng, Bin
2016-03-01
Although computer-aided diagnosis (CAD) systems can be applied to classifying breast masses, the effects of this method on improving radiologists' accuracy in distinguishing malignant from benign lesions still remain unclear. This study provided a novel method to classify breast masses by integrating the intelligence of human and machine. In this research, 224 breast masses were selected in mammography from the DDSM database with Breast Imaging Reporting and Data System (BI-RADS) categories. Three observers (a senior and a junior radiologist, as well as a radiology resident) were employed to independently read and classify these masses utilizing the Positive Predictive Values (PPV) for each BI-RADS category. Meanwhile, a CAD system was also implemented for classification of these breast masses as malignant or benign. To combine the decisions from the radiologists and CAD, a Multi-Agent fusion method was provided. Significant improvements are observed for the fusion system over either the radiologists or CAD alone. The area under the receiver operating characteristic curve (AUC) of the fusion system increased by 9.6%, 10.3% and 21% compared to that of the radiologists at the senior, junior and resident level, respectively. In addition, the AUCs of this method based on the fusion of each radiologist and CAD are 3.5%, 3.6% and 3.3% higher than that of CAD alone. Finally, the fusion of the three radiologists with CAD achieved an AUC value of 0.957, which was 5.6% larger compared to CAD. Our results indicated that the proposed fusion method has better performance than radiologist or CAD alone.
Vehicle following controller design for autonomous intelligent vehicles
NASA Technical Reports Server (NTRS)
Chien, C. C.; Lai, M. C.; Mayr, R.
1994-01-01
A new vehicle following controller is proposed for autonomous intelligent vehicles. The proposed vehicle following controller not only provides smooth transient maneuvers for unavoidable nonzero initial conditions but also guarantees the asymptotic platoon stability without the availability of feedforward information. Furthermore, the achieved asymptotic platoon stability is shown to be robust to sensor delays and an upper bound for the allowable sensor delays is also provided in this paper.
Mohanasundaram, Ranganathan; Periasamy, Pappampalayam Sanmugam
2015-01-01
The current high-profile debate with regard to data storage and its growth has become a strategic concern in the world of networking. Data storage mainly depends on the sensor nodes called producers, the base stations, and the consumers (users and sensor nodes) that retrieve and use the data. The main concern dealt with here is to find an optimal data storage (ODS) position in wireless sensor networks. The works that have been carried out earlier did not utilize swarm intelligence based optimization approaches to find the optimal data storage positions. To achieve this goal, an efficient swarm intelligence approach is used to choose suitable positions for a storage node. Thus, a hybrid particle swarm optimization algorithm has been used to find suitable positions for storage nodes while the total energy cost of data transmission is minimized. Clustering-based distributed data storage is utilized to solve the clustering problem using the fuzzy C-means algorithm. This research work also considers the data rates and locations of multiple producers and consumers to find optimal data storage positions. The algorithm is implemented in a network simulator and the experimental results show that the proposed clustering and swarm intelligence based ODS strategy is more effective than earlier approaches. PMID:25734182
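A minimal particle swarm optimization for locating a single storage node can be sketched as below. The cost function (weighted squared distance from producers/consumers to the storage position, a crude proxy for transmission energy), swarm parameters, and coordinates are all invented; the paper's hybrid PSO with fuzzy C-means clustering is not reproduced.

```python
import random

def pso_storage(points, weights, iters=200, n=20, seed=1):
    """Minimal PSO: place one storage node minimizing weighted
    squared distance to producer/consumer positions."""
    rng = random.Random(seed)
    def cost(p):
        return sum(w * ((p[0] - x) ** 2 + (p[1] - y) ** 2)
                   for (x, y), w in zip(points, weights))
    swarm = [[rng.uniform(0, 10), rng.uniform(0, 10)] for _ in range(n)]
    vel = [[0.0, 0.0] for _ in range(n)]
    pbest = [p[:] for p in swarm]
    gbest = min(pbest, key=cost)
    for _ in range(iters):
        for i, p in enumerate(swarm):
            for d in range(2):
                # inertia + cognitive + social terms (standard PSO form)
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * rng.random() * (pbest[i][d] - p[d])
                             + 1.5 * rng.random() * (gbest[d] - p[d]))
                p[d] += vel[i][d]
            if cost(p) < cost(pbest[i]):
                pbest[i] = p[:]
        gbest = min(pbest, key=cost)
    return gbest
```

For this quadratic cost the optimum is the weighted centroid of the points, which gives an easy correctness check.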
Discrete distributed strain sensing of intelligent structures
NASA Technical Reports Server (NTRS)
Anderson, Mark S.; Crawley, Edward F.
1992-01-01
Techniques are developed for the design of discrete highly distributed sensor systems for use in intelligent structures. First the functional requirements for such a system are presented. Discrete spatially averaging strain sensors are then identified as satisfying the functional requirements. A variety of spatial weightings for spatially averaging sensors are examined, and their wave number characteristics are determined. Preferable spatial weightings are identified. Several numerical integration rules used to integrate such sensors in order to determine the global deflection of the structure are discussed. A numerical simulation is conducted using point and rectangular sensors mounted on a cantilevered beam under static loading. Gage factor and sensor position uncertainties are incorporated to assess the absolute error and standard deviation of the error in the estimated tip displacement found by numerically integrating the sensor outputs. An experiment is carried out using a statically loaded cantilevered beam with five point sensors. It is found that in most cases the actual experimental error is within one standard deviation of the absolute error as found in the numerical simulation.
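Estimating a global deflection from discrete strain sensors can be sketched by integrating beam curvature twice along the span, as the numerical integration rules above do. The sensor spacing and the strain-to-curvature distance c (from the neutral axis) below are invented; a uniform strain field gives the closed-form check kappa * L^2 / 2.

```python
def tip_displacement(xs, strains, c):
    """Cantilever tip deflection from point strain readings:
    curvature kappa = strain / c, integrated twice (trapezoidal)
    with w(0) = w'(0) = 0 at the clamped root."""
    slope, w = 0.0, 0.0
    for i in range(1, len(xs)):
        dx = xs[i] - xs[i - 1]
        k_avg = (strains[i] + strains[i - 1]) / (2.0 * c)  # mean curvature
        w += slope * dx + 0.5 * k_avg * dx * dx
        slope += k_avg * dx
    return w
```

Gage-factor and position uncertainties would perturb `strains` and `xs`, which is exactly the error propagation the simulation in the paper assesses.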
A Trusted National Fusion Center Network: Are Baseline Capabilities and Accreditation Needed?
2010-09-01
Detection of buried objects by fusing dual-band infrared images
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, G.A.; Sengupta, S.K.; Sherwood, R.J.
1993-11-01
We have conducted experiments to demonstrate the enhanced detectability of buried land mines using sensor fusion techniques. Multiple sensors, including visible imagery, infrared imagery, and ground penetrating radar (GPR), have been used to acquire data on a number of buried mines and mine surrogates. Because the visible wavelength and GPR data are currently incomplete, this paper focuses on the fusion of two-band infrared images. We use feature-level fusion and supervised learning with the probabilistic neural network (PNN) to evaluate detection performance. The novelty of the work lies in the application of advanced target recognition algorithms, the fusion of dual-band infrared images, and the evaluation of the techniques using two real data sets.
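The PNN classifier used for supervised, feature-level fusion can be sketched as a Gaussian Parzen-window density estimate per class, with the fused dual-band features forming the input vector. The feature values, class labels, and smoothing parameter sigma below are invented.

```python
import math

def pnn_classify(x, train, sigma=0.5):
    """Probabilistic neural network sketch: score each class by the
    average Gaussian kernel response of its training exemplars."""
    scores = {}
    for label, pts in train.items():
        s = 0.0
        for p in pts:
            d2 = sum((a - b) ** 2 for a, b in zip(x, p))
            s += math.exp(-d2 / (2.0 * sigma * sigma))
        scores[label] = s / len(pts)           # class-conditional density
    return max(scores, key=scores.get)
```

Each fused feature vector is assigned to the class whose training exemplars give it the highest estimated density.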
The Human Factors of Sensor Fusion
2008-05-01
This report discusses select, cognitively based principles associated with the sensor fusion process.
Biometric image enhancement using decision rule based image fusion techniques
NASA Astrophysics Data System (ADS)
Sagayee, G. Mary Amirtha; Arumugam, S.
2010-02-01
Introducing biometrics into information systems may result in considerable benefits. Most researchers confirm that the fingerprint is more widely used than the iris or face, and moreover it is the primary choice for most privacy-concerned applications. For fingerprint applications, choosing a proper sensor is critical. The proposed work addresses how image quality can be improved by introducing an image fusion technique at the sensor level. The results of the images after applying the decision rule based image fusion technique are evaluated and analyzed with respect to their entropy levels and root mean square error.
NASA Technical Reports Server (NTRS)
Litt, Jonathan; Kurtkaya, Mehmet; Duyar, Ahmet
1994-01-01
This paper presents an application of a fault detection and diagnosis scheme for the sensor faults of a helicopter engine. The scheme utilizes a model-based approach with real time identification and hypothesis testing which can provide early detection, isolation, and diagnosis of failures. It is an integral part of a proposed intelligent control system with health monitoring capabilities. The intelligent control system will allow for accommodation of faults, reduce maintenance cost, and increase system availability. The scheme compares the measured outputs of the engine with the expected outputs of an engine whose sensor suite is functioning normally. If the differences between the real and expected outputs exceed threshold values, a fault is detected. The isolation of sensor failures is accomplished through a fault parameter isolation technique where parameters which model the faulty process are calculated on-line with a real-time multivariable parameter estimation algorithm. The fault parameters and their patterns can then be analyzed for diagnostic and accommodation purposes. The scheme is applied to the detection and diagnosis of sensor faults of a T700 turboshaft engine. Sensor failures are induced in a T700 nonlinear performance simulation and data obtained are used with the scheme to detect, isolate, and estimate the magnitude of the faults.
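The detection step — comparing measured engine outputs against the expected outputs of a healthy-sensor model and thresholding the residuals — can be sketched as below. The channel values and thresholds are invented, and the real scheme additionally runs on-line multivariable parameter estimation to isolate and size each fault.

```python
def detect_sensor_faults(measured, expected, threshold):
    """Return indices of channels whose model residual |m - e|
    exceeds the detection threshold."""
    return [i for i, (m, e) in enumerate(zip(measured, expected))
            if abs(m - e) > threshold]
```

A flagged index would then trigger the fault-parameter estimation stage for isolation and diagnosis.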
A Universal Intelligent System-on-Chip Based Sensor Interface
Mattoli, Virgilio; Mondini, Alessio; Mazzolai, Barbara; Ferri, Gabriele; Dario, Paolo
2010-01-01
The need for real-time/reliable/low-maintenance distributed monitoring systems, e.g., wireless sensor networks, has become increasingly evident in many applications in the environmental, agro-alimentary, medical, and industrial fields. The growing interest in technologies related to sensors is an important indicator of these new needs. The design and realization of complex and/or distributed monitoring systems is often difficult due to the multitude of different electronic interfaces presented by the sensors available on the market. To address these issues the authors propose the concept of a Universal Intelligent Sensor Interface (UISI), a new low-cost system based on a single commercial chip able to convert a generic transducer into an intelligent sensor with multiple standardized interfaces. The device presented offers a flexible analog and/or digital front-end, able to interface with different transducer types (such as conditioned, unconditioned, resistive, current output, capacitive and digital transducers). The device also provides enhanced processing and storage capabilities, as well as a configurable multi-standard output interface (including a plug-and-play interface based on IEEE 1451.3). In this work the general concept of the UISI and the design of its reconfigurable hardware are presented, together with experimental test results validating the proposed device. PMID:22163624
Determination of feature generation methods for PTZ camera object tracking
NASA Astrophysics Data System (ADS)
Doyle, Daniel D.; Black, Jonathan T.
2012-06-01
Object detection and tracking using computer vision (CV) techniques have been widely applied to sensor fusion applications. Many papers continue to be written that speed up performance and increase learning of artificially intelligent systems through improved algorithms, workload distribution, and information fusion. Military application of real-time tracking systems is becoming more and more complex with an ever-increasing need for fusion and CV techniques to actively track and control dynamic systems. Examples include the use of metrology systems for tracking and measuring micro air vehicles (MAVs) and autonomous navigation systems for controlling MAVs. This paper seeks to contribute to the determination of select tracking algorithms that best track a moving object using a pan/tilt/zoom (PTZ) camera, applicable to both of the examples presented. The select feature generation algorithms compared in this paper are the trained Scale-Invariant Feature Transform (SIFT) and Speeded Up Robust Features (SURF), the Mixture of Gaussians (MoG) background subtraction method, the Lucas-Kanade optical flow method (2000) and the Farneback optical flow method (2003). The matching algorithm used in this paper for the trained feature generation algorithms is the Fast Library for Approximate Nearest Neighbors (FLANN). The BSD licensed OpenCV library is used extensively to demonstrate the viability of each algorithm and its performance. Initial testing is performed on a sequence of images using a stationary camera. Further testing is performed on a sequence of images such that the PTZ camera is moving in order to capture the moving object. Comparisons are made based upon accuracy, speed and memory.
NASA Astrophysics Data System (ADS)
Lebedev, M. A.; Stepaniants, D. G.; Komarov, D. V.; Vygolov, O. V.; Vizilter, Yu. V.; Zheltov, S. Yu.
2014-08-01
The paper addresses a promising visualization concept related to the combination of sensor and synthetic images in order to enhance the situation awareness of a pilot during aircraft landing. A real-time algorithm for the fusion of a sensor image, acquired by an onboard camera, and a synthetic 3D image of the external view, generated in an onboard computer, is proposed. The pixel correspondence between the sensor and the synthetic images is obtained by an exterior orientation of a "virtual" camera using runway points as a geospatial reference. The runway points are detected by the Projective Hough Transform, whose idea is to project the edge map onto a horizontal plane in the object space (the runway plane) and then to calculate intensity projections of edge pixels along different directions of the intensity gradient. The performed experiments on simulated images show that on a base glide path the algorithm provides image fusion with pixel accuracy, even in the case of significant navigation errors.
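The projection step of the Projective Hough Transform — mapping edge pixels from the image into the runway (ground) plane — can be sketched with a planar homography. The 3x3 matrix H would come from camera calibration and navigation data; the identity matrix used in the check below is purely hypothetical.

```python
def project_to_ground(pixels, H):
    """Map image pixels (u, v) onto the ground plane via a 3x3
    homography H, using homogeneous coordinates."""
    out = []
    for (u, v) in pixels:
        x = H[0][0] * u + H[0][1] * v + H[0][2]
        y = H[1][0] * u + H[1][1] * v + H[1][2]
        w = H[2][0] * u + H[2][1] * v + H[2][2]
        out.append((x / w, y / w))   # perspective divide
    return out
```

Once edge pixels live in the runway plane, runway edges become straight lines at known spacing, which makes the subsequent gradient-direction voting tractable.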
URREF Reliability Versus Credibility in Information Fusion
2013-07-01
Development of a fusion approach selection tool
NASA Astrophysics Data System (ADS)
Pohl, C.; Zeng, Y.
2015-06-01
During the last decades the number and quality of available remote sensing satellite sensors for Earth observation have grown significantly. The amount of available multi-sensor images, along with their increased spatial and spectral resolution, provides new challenges to Earth scientists. With a Fusion Approach Selection Tool (FAST) the remote sensing community would obtain access to an optimized and improved image processing technology. Remote sensing image fusion is a means to produce images containing information that is not inherent in any single image alone. In the meantime the user has access to sophisticated commercial image fusion techniques, plus the option to tune the parameters of each individual technique to match the anticipated application. This leaves the operator with an uncountable number of options to combine remote sensing images, not to mention the selection of the appropriate images, resolution and bands. Image fusion can be a machine- and time-consuming endeavour. In addition it requires knowledge about remote sensing, image fusion, digital image processing and the application. FAST shall provide the user with a quick overview of processing flows to choose from to reach the target. FAST will ask for available images, application parameters and desired information, and process this input to come up with a workflow to quickly obtain the best results. It will optimize data and image fusion techniques. It provides an overview of the possible results from which the user can choose the best. FAST will enable even inexperienced users to apply advanced processing methods to maximize the benefit of multi-sensor image exploitation.
Hall, Travis; Lie, Donald Y. C.; Nguyen, Tam Q.; Mayeda, Jill C.; Lie, Paul E.; Lopez, Jerry; Banister, Ron E.
2017-01-01
It has been the dream of many scientists and engineers to realize a non-contact remote sensing system that can perform continuous, accurate and long-term monitoring of human vital signs, as we have seen in many Sci-Fi movies. Having an intelligent sensor system that can measure and record key vital signs (such as heart rate and respiration rate) remotely and continuously without touching the patients, for example, can be an invaluable tool for physicians who need to make rapid life-and-death decisions. Such a sensor system can also effectively help physicians and patients make better informed decisions when patients' long-term vital signs data are available. Therefore, there has been a lot of research activity on developing a non-contact sensor system that can monitor a patient's vital signs and quickly transmit the information to healthcare professionals. Doppler-based radio-frequency (RF) non-contact vital signs (NCVS) monitoring systems are particularly attractive for long-term vital signs monitoring because there are no wires, electrodes, wearable devices, nor any contact-based sensors involved, so the subjects may not even be aware of the ubiquitous monitoring. In this paper, we provide a brief review of some of the latest developments in NCVS sensors and compare them against a few novel and intelligent phased-array Doppler-based RF NCVS biosensors we have built in our labs. Some of our NCVS sensor tests were performed within a clutter-free anechoic chamber to mitigate environmental clutter, while most tests were conducted within a typical Herman-Miller type office cubicle setting to mimic a more practical monitoring environment. Additionally, we show measurement data to demonstrate the feasibility of long-term NCVS monitoring.
The measured data strongly suggests that our latest phased array NCVS system should be able to perform long-term vital signs monitoring intelligently and robustly, especially for situations where the subject is sleeping without hectic movements nearby. PMID:29140281
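Extracting a vital-sign rate from an NCVS baseband signal can be sketched with a naive DFT peak search over a window of samples. The sampling rate, window length, and simulated chest motion below are invented, and a real Doppler system would additionally need clutter rejection and motion artifact handling.

```python
import math

def dominant_rate(signal, fs):
    """Dominant frequency (Hz) of a real signal via a naive DFT:
    scan bins 1..N/2 and return the one with maximum power."""
    n = len(signal)
    mean = sum(signal) / n
    sig = [s - mean for s in signal]      # remove DC before the search
    best_k, best_p = 1, -1.0
    for k in range(1, n // 2):
        re = sum(sig[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(sig[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        p = re * re + im * im
        if p > best_p:
            best_k, best_p = k, p
    return best_k * fs / n

# simulated chest displacement: a 1.2 Hz (72 beats/min) sinusoid
fs = 20.0
sig = [math.sin(2 * math.pi * 1.2 * t / fs) for t in range(200)]
rate_hz = dominant_rate(sig, fs)
```

Multiplying the recovered rate by 60 converts it to beats (or breaths) per minute.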
Hall, Travis; Lie, Donald Y C; Nguyen, Tam Q; Mayeda, Jill C; Lie, Paul E; Lopez, Jerry; Banister, Ron E
2017-11-15
It has been the dream of many scientists and engineers to realize a non-contact remote sensing system that can perform continuous, accurate and long-term monitoring of human vital signs as we have seen in many Sci-Fi movies. Having an intelligible sensor system that can measure and record key vital signs (such as heart rates and respiration rates) remotely and continuously without touching the patients, for example, can be an invaluable tool for physicians who need to make rapid life-and-death decisions. Such a sensor system can also effectively help physicians and patients making better informed decisions when patients' long-term vital signs data is available. Therefore, there has been a lot of research activities on developing a non-contact sensor system that can monitor a patient's vital signs and quickly transmit the information to healthcare professionals. Doppler-based radio-frequency (RF) non-contact vital signs (NCVS) monitoring system are particularly attractive for long term vital signs monitoring because there are no wires, electrodes, wearable devices, nor any contact-based sensors involved so the subjects may not be even aware of the ubiquitous monitoring. In this paper, we will provide a brief review on some latest development on NCVS sensors and compare them against a few novel and intelligent phased-array Doppler-based RF NCVS biosensors we have built in our labs. Some of our NCVS sensor tests were performed within a clutter-free anechoic chamber to mitigate the environmental clutters, while most tests were conducted within the typical Herman-Miller type office cubicle setting to mimic a more practical monitoring environment. Additionally, we will show the measurement data to demonstrate the feasibility of long-term NCVS monitoring. 
Advanced Ground Systems Maintenance Intelligent Devices/Smart Sensors Project
NASA Technical Reports Server (NTRS)
Perotti, Jose M. (Compiler)
2015-01-01
This project provides development and qualification of Smart Sensors capable of self-diagnosis and assessment of their capability/readiness to support operations. These sensors will provide pressure and temperature measurements for use in ground systems.
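The self-diagnosis capability described above can be illustrated with a minimal range-and-rate check. The thresholds, status labels, and function name below are illustrative assumptions, not part of the project:

```python
def self_diagnose(reading, lo, hi, history, max_step):
    """Sketch of a smart-sensor self-check: flag a reading that falls
    outside the calibrated range [lo, hi], or that jumps more than
    `max_step` from the previous sample in `history`.
    Returns (status, reading), where status is 'ok', 'out_of_range'
    or 'spike'."""
    if not lo <= reading <= hi:
        return ("out_of_range", reading)
    if history and abs(reading - history[-1]) > max_step:
        return ("spike", reading)
    return ("ok", reading)
```

A sensor reporting such a status alongside each measurement lets the ground system assess the sensor's readiness before trusting its data.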
Self-powered Real-time Movement Monitoring Sensor Using Triboelectric Nanogenerator Technology.
Jin, Liangmin; Tao, Juan; Bao, Rongrong; Sun, Li; Pan, Caofeng
2017-09-05
The triboelectric nanogenerator (TENG) has great potential in the field of self-powered sensor fabrication. Recently, smart electronic devices and movement monitoring sensors have attracted the attention of scientists because of their application in the field of artificial intelligence. In this article, a self-powered TENG sensor for finger-movement monitoring has been designed and analysed. Under finger movements, the TENG undergoes contact and separation, converting mechanical energy into an electrical signal. A pulse output current of 7.8 μA is generated by the bending and straightening motions of the artificial finger. The optimal output power is realized when the external resistance is approximately 30 MΩ. The random motions of the finger are detected by the system with multiple TENG sensors in series. This type of flexible, self-powered sensor has potential applications in artificial intelligence and robot manufacturing.
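The reported optimum near 30 MΩ is consistent with maximum power transfer to a matched load. A small sketch, assuming an idealized Thevenin model of the TENG with a hypothetical internal impedance of 30 MΩ and an illustrative open-circuit voltage (neither value is stated in the abstract), shows why delivered power peaks at the matched load:

```python
def load_power(v_oc, r_internal, r_load):
    """Power delivered to r_load by a Thevenin source (v_oc, r_internal)."""
    i = v_oc / (r_internal + r_load)
    return i * i * r_load

def best_load(v_oc, r_internal, loads):
    """Return the load resistance from `loads` that maximizes delivered power."""
    return max(loads, key=lambda r: load_power(v_oc, r_internal, r))

if __name__ == "__main__":
    sweep = [1e6 * k for k in range(1, 101)]   # 1..100 MOhm
    # With an assumed 30 MOhm internal impedance, the sweep peaks at 30 MOhm.
    print(best_load(200.0, 30e6, sweep))
```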
A heuristic for deriving the optimal number and placement of reconnaissance sensors
NASA Astrophysics Data System (ADS)
Nanda, S.; Weeks, J.; Archer, M.
2008-04-01
A key to mastering asymmetric warfare is the acquisition of accurate intelligence on adversaries and their assets in urban and open battlefields. To achieve this, one needs adequate numbers of tactical sensors placed in locations to optimize coverage, where optimality is realized by covering a given area of interest with the least number of sensors, or covering the largest possible subsection of an area of interest with a fixed set of sensors. Unfortunately, neither problem admits a polynomial time algorithm as a solution, and therefore, the placement of such sensors must utilize intelligent heuristics instead. In this paper, we present a scheme implemented on parallel SIMD processing architectures to yield significantly faster results, and that is highly scalable with respect to dynamic changes in the area of interest. Furthermore, the solution to the first problem immediately translates to serve as a solution to the latter if and when any sensors are rendered inoperable.
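The first coverage problem above is an instance of set cover, and the standard greedy heuristic is the classic polynomial-time approximation for it. A minimal sketch (this does not reproduce the paper's parallel SIMD scheme; the grid-cell representation is an assumption):

```python
def greedy_sensor_placement(area, candidates):
    """Greedy set-cover heuristic: repeatedly pick the candidate sensor
    location covering the most still-uncovered cells. `area` is a set of
    cell ids; `candidates` maps a location to the set of cells it covers.
    Returns the chosen locations (a classic ln(n)-factor approximation)."""
    uncovered = set(area)
    chosen = []
    while uncovered:
        best = max(candidates, key=lambda s: len(candidates[s] & uncovered))
        gain = candidates[best] & uncovered
        if not gain:            # remaining cells cannot be covered at all
            break
        chosen.append(best)
        uncovered -= gain
    return chosen
```

Stopping the loop after a fixed number of picks turns the same routine into a heuristic for the second problem (maximum coverage with a fixed sensor budget).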
A survey on bio inspired meta heuristic based clustering protocols for wireless sensor networks
NASA Astrophysics Data System (ADS)
Datta, A.; Nandakumar, S.
2017-11-01
Recent studies have shown that utilizing a mobile sink to harvest and carry data from a Wireless Sensor Network (WSN) can improve network operational efficiency as well as maintain uniform energy consumption by the sensor nodes in the network. Due to sink mobility, the path between two sensor nodes continuously changes, and this has a profound effect on the operational longevity of the network; a need therefore arises for a protocol that utilizes minimal resources in maintaining routes between the mobile sink and the sensor nodes. Swarm-intelligence-based techniques inspired by the foraging behavior of ants, termites and honey bees can be artificially simulated and utilized to solve real wireless network problems. The authors present a brief survey of various bio-inspired, swarm-intelligence-based protocols used in routing data in wireless sensor networks, outlining their general principles and operation.
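The ant-inspired routing that these protocols build on can be sketched with a pheromone table: trails evaporate each round, paths that ants actually use are reinforced, and next hops are chosen with pheromone-weighted probabilities. A minimal illustration, not any specific surveyed protocol:

```python
import random

def update_pheromone(pher, path, deposit=1.0, evaporation=0.1):
    """One reinforcement round: evaporate every trail, then deposit
    pheromone along the edges of the path just traversed (shorter paths
    can deposit more by passing deposit = Q / path_length)."""
    for edge in pher:
        pher[edge] *= (1.0 - evaporation)
    for edge in zip(path, path[1:]):
        pher[edge] = pher.get(edge, 0.0) + deposit
    return pher

def choose_next_hop(pher, node, neighbours, rng=random.random):
    """Probabilistically pick the next hop, weighted by pheromone level."""
    weights = [pher.get((node, n), 1e-6) for n in neighbours]
    total = sum(weights)
    r, acc = rng() * total, 0.0
    for n, w in zip(neighbours, weights):
        acc += w
        if r <= acc:
            return n
    return neighbours[-1]
```

Evaporation is what lets routes adapt as the mobile sink moves: stale trails fade unless ants keep reinforcing them.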
NASA Technical Reports Server (NTRS)
Pavel, M.
1993-01-01
The topics covered include the following: a system overview of the basic components of a system designed to improve the ability of a pilot to fly through low-visibility conditions such as fog; the role of visual sciences; fusion issues; sensor characterization; sources of information; image processing; and image fusion.
Pan, Leilei; Yang, Simon X
2007-12-01
This paper introduces a new portable intelligent electronic nose system developed especially for measuring and analysing livestock and poultry farm odours. It can be used in both laboratory and field settings. The sensor array of the proposed electronic nose consists of 14 gas sensors, a humidity sensor, and a temperature sensor. The gas sensors were specially selected for the main compounds in livestock farm odours. An expert system called "Odour Expert" was developed to support researchers' and farmers' decision making on odour control strategies for livestock and poultry operations. "Odour Expert" utilises several advanced artificial intelligence technologies tailored to livestock and poultry farm odours. It can provide more advanced odour analysis than existing commercially available products. In addition, a ranking of odour generation factors is provided, which refines the focus of odour control research. Field experiments were conducted downwind from the barns on 14 livestock and poultry farms. Experimental results show that the odour strengths predicted by the electronic nose are more consistent than the odour intensities perceived by a human panel. The "Odour Expert" is a useful tool for assisting farmers' odour management practices.
New hydrologic instrumentation in the U.S. Geological Survey
Latkovich, V.J.; Shope, W.G.; ,
1991-01-01
New water-level sensing and recording instrumentation is being used by the U.S. Geological Survey for monitoring water levels, stream velocities, and water-quality characteristics. Several of these instruments are briefly described. The Basic Data Recorder (BDR) is an electronic data logger that interfaces to sensor systems through a serial-digital interface standard (SDI-12), which was proposed by the data-logger industry; the Incremental Shaft Encoder is an intelligent water-level sensor, which interfaces to the BDR through the SDI-12; the Pressure Sensor is an intelligent, nonsubmersible pressure sensor, which interfaces to the BDR through the SDI-12 and monitors water levels from 0 to 50 feet; the Ultrasonic Velocity Meter is an intelligent water-velocity sensor, which interfaces to the BDR through the SDI-12 and measures the velocity across a stream up to 500 feet in width; the Collapsible Hand Sampler can be collapsed for insertion through holes in the ice and opened under the ice to collect a water sample; the Lightweight Ice Auger, weighing only 32 pounds, can auger 6- and 8-inch holes through approximately 3.5 feet of ice; and the Ice Chisel has a specially hardened steel blade and a 6-foot-long hickory D-handle.
Sensor data fusion for textured reconstruction and virtual representation of alpine scenes
NASA Astrophysics Data System (ADS)
Häufel, Gisela; Bulatov, Dimitri; Solbrig, Peter
2017-10-01
The concept of remote sensing is to provide information about a wide-range area without making physical contact with this area. If, in addition to satellite imagery, images and videos taken by drones provide more up-to-date data at a higher resolution, or accurate vector data is downloadable from the Internet, one speaks of sensor data fusion. The concept of sensor data fusion is relevant for many applications, such as virtual tourism, automatic navigation, hazard assessment, etc. In this work, we describe sensor data fusion aiming to create a semantic 3D model of an extremely interesting yet challenging dataset: an alpine region in Southern Germany. A particular challenge of this work is that rock faces including overhangs are present in the input airborne laser point cloud. The proposed procedure for identification and reconstruction of overhangs from point clouds comprises four steps: point cloud preparation, filtering out vegetation, mesh generation, and texturing. Further object types are extracted in several interesting subsections of the dataset: building models with textures from UAV (Unmanned Aerial Vehicle) videos, hills reconstructed as generic surfaces and textured by the orthophoto, individual trees detected by the watershed algorithm, as well as the vector data for roads retrieved from openly available shapefiles and GPS-device tracks. We pursue geo-specific reconstruction by assigning texture and width to roads of several pre-determined types and modeling isolated trees and rocks using commercial software. For visualization and simulation of the area, we have chosen the simulation system Virtual Battlespace 3 (VBS3). It becomes clear that the proposed concept of sensor data fusion allows a coarse reconstruction of a large scene and, at the same time, an accurate and up-to-date representation of its relevant subsections, in which simulation can take place.
Research of home energy management system based on technology of PLC and ZigBee
NASA Astrophysics Data System (ADS)
Wei, Qi; Shen, Jiaojiao
2015-12-01
To address the problem of effectively saving and managing energy in the home, this paper designs a home energy intelligent control system based on power line carrier communication and wireless ZigBee sensor networks. The system uses an ARM controller, with power line carrier communication and a wireless ZigBee sensor network as the terminal communication modes, and realizes centralized, intelligent control of home appliances. Through the combination of these two technologies, their advantages complement each other, providing a feasible plan for the construction of an energy-efficient, intelligent home energy management system.
Introduction to the Special Issue on "State-of-the-Art Sensor Technology in Japan 2015".
Tokumitsu, Masahiro; Ishida, Yoshiteru
2016-08-23
This Special Issue, "State-of-the-Art Sensor Technology in Japan 2015", collected papers on different kinds of sensing technology: fundamental technology for intelligent sensors, information processing for monitoring humans, and information processing for adaptive and survivable sensor systems.[...].
Distributed Multisensor Data Fusion under Unknown Correlation and Data Inconsistency
Abu Bakr, Muhammad; Lee, Sukhan
2017-01-01
The paradigm of multisensor data fusion has evolved from a centralized architecture to a decentralized or distributed architecture along with the advancement in sensor and communication technologies. These days, distributed state estimation and data fusion have been widely explored in diverse fields of engineering and control due to their superior performance over the centralized approach in terms of flexibility, robustness to failure, and cost effectiveness in infrastructure and communication. However, distributed multisensor data fusion is not without technical challenges to overcome: namely, dealing with cross-correlation and inconsistency among state estimates and sensor data. In this paper, we review the key theories and methodologies of distributed multisensor data fusion available to date, with a specific focus on handling unknown correlation and data inconsistency. We aim at providing readers with a unifying view of individual theories and methodologies by presenting a formal analysis of their implications. Finally, several directions of future research are highlighted. PMID:29077035
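A standard tool for fusing estimates whose cross-correlation is unknown, of the kind such reviews cover, is covariance intersection. A scalar sketch that searches the mixing weight minimizing the fused variance (the review itself is not limited to this method):

```python
def covariance_intersection(x1, p1, x2, p2, steps=100):
    """Scalar covariance intersection: fuse two estimates (x1, p1) and
    (x2, p2) without knowing their cross-correlation. A grid search over
    the mixing weight w in [0, 1] picks the fused estimate with the
    smallest variance; the result is consistent for any correlation."""
    best = None
    for k in range(steps + 1):
        w = k / steps
        inv_p = w / p1 + (1.0 - w) / p2
        if inv_p <= 0:
            continue
        p = 1.0 / inv_p
        x = p * (w * x1 / p1 + (1.0 - w) * x2 / p2)
        if best is None or p < best[1]:
            best = (x, p, w)
    return best  # (fused mean, fused variance, chosen weight)
```

In the scalar case the optimum degenerates to trusting the lower-variance estimate; the same weighted-inverse form generalizes to matrices, where the blend is non-trivial.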
Multisensor data fusion for integrated maritime surveillance
NASA Astrophysics Data System (ADS)
Premji, A.; Ponsford, A. M.
1995-01-01
A prototype Integrated Coastal Surveillance system has been developed on Canada's East Coast to provide effective surveillance out to and beyond the 200 nautical mile Exclusive Economic Zone. The system has been designed to protect Canada's natural resources, and to monitor and control the coastline for smuggling, drug trafficking, and similar illegal activity. This paper describes the Multiple Sensor - Multiple Target data fusion system that has been developed. The fusion processor has been developed around the celebrated Multiple Hypothesis Tracking algorithm which accommodates multiple targets, new targets, false alarms, and missed detections. This processor performs four major functions: plot-to-track association to form individual radar tracks; fusion of radar tracks with secondary sensor reports; track identification and tagging using secondary reports; and track level fusion to form common tracks. Radar data from coherent and non-coherent radars has been used to evaluate the performance of the processor. This paper presents preliminary results.
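Full Multiple Hypothesis Tracking maintains and scores many competing association hypotheses. As a much simpler illustration of the plot-to-track association step alone, here is a greedy, gated nearest-neighbour sketch (an assumption for exposition, not the system's actual algorithm):

```python
def associate_plots(tracks, plots, gate=5.0):
    """Greedy gated nearest-neighbour plot-to-track association.
    `tracks` and `plots` map ids to (x, y) positions. Each track takes
    the nearest unused plot within the gate radius; a track left with
    None is a missed detection, and unclaimed plots are candidate new
    targets or false alarms."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    assignments, used = {}, set()
    for tid, tpos in tracks.items():
        best, best_d = None, gate * gate
        for pid, ppos in plots.items():
            d = dist2(tpos, ppos)
            if pid not in used and d <= best_d:
                best, best_d = pid, d
        assignments[tid] = best
        if best is not None:
            used.add(best)
    return assignments
```

MHT improves on this by deferring ambiguous assignments across several scans instead of committing greedily.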
A methodology for hard/soft information fusion in the condition monitoring of aircraft
NASA Astrophysics Data System (ADS)
Bernardo, Joseph T.
2013-05-01
Condition-based maintenance (CBM) refers to the philosophy of performing maintenance when the need arises, based upon indicators of deterioration in the condition of the machinery. Traditionally, CBM involves equipping machinery with electronic sensors that continuously monitor components and collect data for analysis. The addition of the multisensory capability of human cognitive functions (i.e., sensemaking, problem detection, planning, adaptation, coordination, naturalistic decision making) to traditional CBM may create a fuller picture of machinery condition. Cognitive systems engineering techniques provide an opportunity to utilize a dynamic resource: people acting as soft sensors. The literature is extensive on techniques to fuse data from electronic sensors, but little work exists on fusing data from humans with that from electronic sensors (i.e., hard/soft fusion). The purpose of my research is to explore, observe, investigate, analyze, and evaluate the fusion of pilot and maintainer knowledge, experiences, and sensory perceptions with digital maintenance resources. Hard/soft information fusion has the potential to increase problem detection capability, improve flight safety, and increase mission readiness. This proposed project consists of the creation of a methodology based upon the Living Laboratories framework, a research methodology built upon cognitive engineering principles [1]. This study performs a critical assessment of the concept, which will support development of activities to demonstrate hard/soft information fusion in operationally relevant scenarios of aircraft maintenance. It consists of fieldwork and knowledge elicitation to inform a simulation and a prototype.
Intelligent Network-Centric Sensors Development Program
2012-07-31
Zakaria, Ammar; Shakaff, Ali Yeon Md; Masnan, Maz Jamilah; Saad, Fathinul Syahir Ahmad; Adom, Abdul Hamid; Ahmad, Mohd Noor; Jaafar, Mahmad Nor; Abdullah, Abu Hassan; Kamarudin, Latifah Munirah
2012-01-01
In recent years, there have been a number of reported studies on the use of non-destructive techniques to evaluate and determine mango maturity and ripeness levels. However, most of these reported works were conducted using single-modality sensing systems, either using an electronic nose, acoustics or other non-destructive measurements. This paper presents work on the classification of mango (Mangifera indica cv. Harumanis) maturity and ripeness levels using fusion of data from an electronic nose and an acoustic sensor. Three groups of samples each from two different harvesting times (week 7 and week 8) were evaluated by the e-nose and then by the acoustic sensor. Principal Component Analysis (PCA) and Linear Discriminant Analysis (LDA) were able to discriminate the mangoes harvested at week 7 and week 8 based solely on the aroma and volatile gases released from the mangoes. However, when six different groups of different maturity and ripeness levels were combined in one classification analysis, both PCA and LDA were unable to discriminate the age difference of the Harumanis mangoes. Instead of six different groups, only four were observed using the LDA, while PCA showed only two distinct groups. By applying a low-level data fusion technique to the e-nose and acoustic data, the classification of maturity and ripeness levels using LDA was improved. However, no significant improvement was observed using PCA with the data fusion technique. Further work using a hybrid LDA-Competitive Learning Neural Network was performed to validate the fusion technique and classify the samples. It was found that the LDA-CLNN results also improved significantly when data fusion was applied. PMID:22778629
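Low-level (feature-level) fusion of this kind amounts to normalising each modality and concatenating the per-sample feature vectors before feeding them to PCA or LDA. A minimal sketch; the z-score normalisation step is an assumption for illustration, not stated in the abstract:

```python
def zscore(rows):
    """Column-wise z-score normalisation of a list-of-rows matrix."""
    n = len(rows)
    means = [sum(r[j] for r in rows) / n for j in range(len(rows[0]))]
    stds = []
    for j in range(len(rows[0])):
        var = sum((r[j] - means[j]) ** 2 for r in rows) / n
        stds.append(var ** 0.5 or 1.0)  # guard constant columns
    return [[(r[j] - means[j]) / stds[j] for j in range(len(r))] for r in rows]

def low_level_fusion(enose_rows, acoustic_rows):
    """Feature-level fusion: normalise each modality separately, then
    concatenate each sample's feature vectors into one fused matrix."""
    a, b = zscore(enose_rows), zscore(acoustic_rows)
    return [ra + rb for ra, rb in zip(a, b)]
```

Per-modality normalisation keeps the modality with larger raw magnitudes (here, likely the acoustic channel) from dominating the fused feature space.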
Bluetooth-based distributed measurement system
NASA Astrophysics Data System (ADS)
Tang, Baoping; Chen, Zhuo; Wei, Yuguo; Qin, Xiaofeng
2007-07-01
A novel distributed wireless measurement system, consisting of a base station, wireless intelligent sensors, relay nodes, etc., is established by combining Bluetooth-based wireless transmission, virtual instrumentation, intelligent sensors, and networking. The intelligent sensors mounted on the equipment to be measured acquire various parameters, and the Bluetooth relay nodes modulate the acquired data and send it to the base station, where data analysis and processing are performed so that the operational condition of the equipment can be evaluated. The establishment of the distributed measurement system is discussed with a measurement flow chart for the Bluetooth-based system, and the advantages and disadvantages of the system are analyzed at the end of the paper. The measurement system has been used successfully in the Daqing oilfield, China, for measuring parameters such as temperature, flow rate and oil pressure at an electromotor-pump unit.
Learning to Classify with Possible Sensor Failures
2014-05-04
NASA Astrophysics Data System (ADS)
Boldyreff, Anton S.; Bespalov, Dmitry A.; Adzhiev, Anatoly Kh.
2017-05-01
Methods of artificial intelligence are a good solution for weather phenomena forecasting, as they make it possible to process large amounts of diverse data. In this paper, recirculation neural networks are implemented in a system for predicting thunderstorm events. Large amounts of experimental data from lightning sensors and electric-field-mill networks are received and analyzed, and the average recognition accuracy for the sensor signals is calculated. It is shown that recirculation neural networks are a promising solution for forecasting thunderstorms and related weather phenomena: they recognize elements of the sensor signals with high efficiency, and they can compress images and highlight their characteristic features for subsequent recognition.
NASA Astrophysics Data System (ADS)
Parhad, Ashutosh
Intelligent transportation systems use in-pavement inductive loop sensors to collect real-time traffic data. This method is very expensive in terms of installation and maintenance. Our research is focused on developing advanced algorithms capable of generating high amounts of energy that can charge a battery. This electromechanical energy conversion is an optimal way of energy scavenging that makes use of piezoelectric sensors. The power generated is sufficient to run the vehicle detection module, which has several sensors embedded together. To achieve these goals, we have developed a simulation module using software such as LabVIEW and Multisim. The simulation module recreates a practical scenario that takes into consideration vehicle weight, speed, wheel width and traffic frequency.
Design of an Intelligent Front-End Signal Conditioning Circuit for IR Sensors
NASA Astrophysics Data System (ADS)
de Arcas, G.; Ruiz, M.; Lopez, J. M.; Gutierrez, R.; Villamayor, V.; Gomez, L.; Montojo, Mª. T.
2008-02-01
This paper presents the design of an intelligent front-end signal conditioning system for IR sensors. The system has been developed as an interface between a PbSe IR sensor matrix and a TMS320C67x digital signal processor. The system architecture ensures scalability, so it can be used for sensors with different matrix sizes. It includes an integrator-based signal conditioning circuit, a data acquisition converter block, and an FPGA-based advanced control block that permits the inclusion of high-level image preprocessing routines, such as faulty-pixel detection and sensor calibration, in the signal conditioning front-end. During the design phase, virtual instrumentation technologies proved to be a very valuable prototyping tool when choosing the best A/D converter type for the application. Development time was significantly reduced due to the use of this technology.
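Faulty-pixel detection of the kind mentioned for the preprocessing front-end is often done by comparing each pixel with the median of its neighbourhood. A software sketch of that idea (the absolute threshold rule is illustrative; it is not the paper's FPGA implementation):

```python
def repair_faulty_pixels(frame, threshold=3.0):
    """Flag pixels that deviate from the median of their 8-neighbourhood
    by more than `threshold` counts and replace them with that median.
    `frame` is a list-of-lists of floats; returns a repaired copy."""
    h, w = len(frame), len(frame[0])
    out = [row[:] for row in frame]
    for y in range(h):
        for x in range(w):
            neigh = [frame[j][i]
                     for j in range(max(0, y - 1), min(h, y + 2))
                     for i in range(max(0, x - 1), min(w, x + 2))
                     if (j, i) != (y, x)]
            neigh.sort()
            med = neigh[len(neigh) // 2]
            if abs(frame[y][x] - med) > threshold:
                out[y][x] = med
    return out
```

On an FPGA this maps naturally onto a streaming 3x3 window with a small sorting network for the median.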
Noncontact Sleep Study by Multi-Modal Sensor Fusion.
Chung, Ku-Young; Song, Kwangsub; Shin, Kangsoo; Sohn, Jinho; Cho, Seok Hyun; Chang, Joon-Hyuk
2017-07-21
Polysomnography (PSG) is considered the gold standard for determining sleep stages, but due to the obtrusiveness of its sensor attachments, sleep stage classification algorithms using noninvasive sensors have been developed over the years. However, the previous studies have not yet been proven reliable. In addition, most of the products are designed for healthy customers rather than for patients with sleep disorders. We present a novel approach to classifying sleep stages via low-cost and noncontact multi-modal sensor fusion, which extracts sleep-related vital signals from radar signals and a sound-based context-awareness technique. This work is uniquely designed based on the PSG data of sleep disorder patients, which were received and certified by professionals at Hanyang University Hospital. The proposed algorithm further incorporates medical/statistical knowledge to determine personally adjusted thresholds and devise post-processing. The efficiency of the proposed algorithm is highlighted by contrasting sleep stage classification performance between single-sensor and sensor-fusion algorithms. To validate the possibility of commercializing this work, the classification results of this algorithm were compared with those of the commercial sleep monitoring device ResMed S+. The proposed algorithm was investigated with random patients following PSG examination, and the results show a promising novel approach for determining sleep stages in a low-cost and unobtrusive manner.
NASA Astrophysics Data System (ADS)
Liu, Jie; Hu, Youmin; Wang, Yan; Wu, Bo; Fan, Jikai; Hu, Zhongxu
2018-05-01
The diagnosis of complicated fault severity problems in rotating machinery systems is an important issue that affects the productivity and quality of manufacturing processes and industrial applications. However, it usually suffers from several deficiencies. (1) A considerable degree of prior knowledge and expertise is required not only to extract and select specific features from raw sensor signals, but also to choose a suitable fusion of sensor information. (2) Traditional artificial neural networks with shallow architectures are usually adopted, and they have a limited ability to learn complex and variable operating conditions. In multi-sensor-based diagnosis applications in particular, massive high-dimensional and high-volume raw sensor signals need to be processed. In this paper, an integrated multi-sensor fusion-based deep feature learning (IMSFDFL) approach is developed to identify the fault severity in rotating machinery processes. First, traditional statistics and energy spectrum features are extracted from multiple sensors with multiple channels and combined. Then, a fused feature vector is constructed from all of the acquisition channels. Further, deep feature learning with stacked auto-encoders is used to obtain the deep features. Finally, the traditional softmax model is applied to identify the fault severity. The effectiveness of the proposed IMSFDFL approach is primarily verified by a one-stage gearbox experimental platform that uses several accelerometers under different operating conditions. This approach can identify fault severity more effectively than the traditional approaches.
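Two of the stages above, feature-level fusion across channels and softmax classification, can be sketched as follows (the stacked auto-encoder stage is omitted; function names are illustrative, not from the paper):

```python
import math

def fuse_channels(channel_features):
    """Feature-level fusion: concatenate the statistical/spectral features
    extracted from every sensor channel into one fused feature vector."""
    fused = []
    for feats in channel_features:
        fused.extend(feats)
    return fused

def softmax(scores):
    """Numerically stable softmax turning per-severity-class scores into
    a probability distribution over fault-severity classes."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]
```

In the full approach, the fused vector would pass through the stacked auto-encoders first, and the softmax layer would act on the learned deep features rather than the raw fused vector.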
Ontology-Based Architecture for Intelligent Transportation Systems Using a Traffic Sensor Network.
Fernandez, Susel; Hadfi, Rafik; Ito, Takayuki; Marsa-Maestre, Ivan; Velasco, Juan R
2016-08-15
Intelligent transportation systems are a set of technological solutions used to improve the performance and safety of road transportation. A crucial element for the success of these systems is the exchange of information, not only between vehicles, but also among other components in the road infrastructure through different applications. One of the most important information sources in this kind of system is sensors. Sensors can be within vehicles or part of the infrastructure, such as bridges, roads or traffic signs. Sensors can provide information related to weather conditions and the traffic situation, which is useful for improving the driving process. To facilitate the exchange of information between the different applications that use sensor data, a common framework of knowledge is needed to allow interoperability. In this paper, an ontology-driven architecture to improve the driving environment through a traffic sensor network is proposed. The system performs different tasks automatically to increase driver safety and comfort using the information provided by the sensors.
Intelligent fiber optic sensor for solution concentration examination
NASA Astrophysics Data System (ADS)
Borecki, Michal; Kruszewski, Jerzy
2003-09-01
This paper presents the working principles of an intelligent fiber-optic intensity sensor used for examining solution concentration. The sensor head is the end of a large-core polymer optical fiber and works on a reflection-intensity basis. The reflected signal level depends on Fresnel reflection and reflection from suspended matter when the head is submersed in the solution. The sensor head is mounted on a lift, and for detection purposes the signal is measured throughout the head's submerging, submersion, emerging and emergence, so that viscosity, turbidity and the refraction coefficient all affect the measured signal. The signal coming from the head is processed electrically in an opto-electronic interface and then fed to a neural network. The novelty of the presented sensor is the implementation of a neural network that works in generalization mode. The sensor resolution depends on the precision of the opto-electronic signal conversion and the accuracy of the neural network learning; therefore, the number and quality of the points used in the learning process are very important. An example application of the sensor, examining the concentration of liquid soap in water, is presented in the paper.
IMU-Based Gait Recognition Using Convolutional Neural Networks and Multi-Sensor Fusion.
Dehzangi, Omid; Taherisadr, Mojtaba; ChangalVala, Raghvendar
2017-11-27
The widespread usage of wearable sensors, such as those in smart watches, has provided continuous access to valuable user-generated data such as human motion, which could be used to identify an individual based on his or her motion patterns, such as gait. Several methods have been suggested for extracting various heuristic and high-level features from gait motion data to identify discriminative gait signatures and distinguish the target individual from others. However, manual, hand-crafted feature extraction is error-prone and subjective. Furthermore, the motion data collected from inertial sensors have a complex structure, and the detachment between the manual feature-extraction module and the predictive learning models might limit generalization capabilities. In this paper, we propose a novel approach to human gait identification using a time-frequency (TF) expansion of human gait cycles in order to capture joint two-dimensional (2D) spectral and temporal patterns of gait cycles. We then design a deep convolutional neural network (DCNN) to extract discriminative features from the 2D expanded gait cycles and jointly optimize the identification model and the spectro-temporal features in a discriminative fashion. We collect raw motion data synchronously from five inertial sensors placed at the chest, lower back, right wrist, right knee, and right ankle of each human subject in order to investigate the impact of sensor location on gait identification performance. We then present two methods for early (input-level) and late (decision-score-level) multi-sensor fusion to improve the generalization performance of gait identification. We specifically propose the minimum error score fusion (MESF) method, which discriminatively learns the linear fusion weights of individual DCNN scores at the decision level by iteratively minimizing the error rate on the training data. Ten subjects participated in this study; the problem is therefore a 10-class identification task.
Based on our experimental results, 91% subject identification accuracy was achieved using the best individual IMU and the 2DTF-DCNN. Our proposed early and late sensor fusion approaches then improved the gait identification accuracy of the system to 93.36% and 97.06%, respectively.
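The decision-level fusion idea can be sketched as a toy version of MESF: learn linear fusion weights for two sensors' decision scores by searching for the combination that minimizes the error rate on training data. The data set and the simple grid search below are illustrative assumptions, not the paper's actual DCNN scores or optimizer.

```python
import numpy as np

# Toy training set: (score_sensor1, score_sensor2) with labels in {+1, -1}.
# Each sensor alone misclassifies half of these points; a weighted
# combination of the two scores classifies all of them correctly.
scores = np.array([[1.0, -0.2], [-1.0, 0.2], [-0.2, 1.0], [0.2, -1.0]])
labels = np.array([1, -1, 1, -1])

def error_rate(w1, w2):
    """Error rate of the linear score fusion sign(w1*s1 + w2*s2)."""
    fused = np.sign(w1 * scores[:, 0] + w2 * scores[:, 1])
    return np.mean(fused != labels)

# Minimum-error search over normalized weight pairs (w1 + w2 = 1),
# a crude stand-in for the paper's iterative weight learning.
best_w1 = min(np.linspace(0.0, 1.0, 11), key=lambda w: error_rate(w, 1.0 - w))
best_err = error_rate(best_w1, 1.0 - best_w1)
print(best_w1, best_err)
```

On this toy set, either sensor alone misclassifies two of the four points, while the learned fused decision is error-free, which is the effect the MESF method exploits at the decision level.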
ERIC Educational Resources Information Center
Kanagarajan, Sujith; Ramakrishnan, Sivakumar
2018-01-01
The Ubiquitous Learning Environment (ULE) has, over the past few years, become a mobile, sensor-based, technology-equipped environment that suits the requirements of the modern education discipline. Ambient Intelligence (AmI) makes the ULE much smarter through the support of optimization and intelligent techniques. Various efforts have been so far…
Study on intelligent processing system of man-machine interactive garment frame model
NASA Astrophysics Data System (ADS)
Chen, Shuwang; Yin, Xiaowei; Chang, Ruijiang; Pan, Peiyun; Wang, Xuedi; Shi, Shuze; Wei, Zhongqian
2018-05-01
A man-machine interactive garment frame model intelligent processing system is studied in this paper. The system consists of several sensor devices, a voice processing module, mechanical moving parts, and a centralized data acquisition device. The sensor devices collect information on the environmental changes caused by a body approaching the garment frame model; the data acquisition device collects the environmental-change information sensed by the sensor devices; the voice processing module performs speaker-independent speech recognition to achieve man-machine interaction; and the mechanical moving parts produce the corresponding mechanical responses to the information processed by the data acquisition device. The sensor devices have a one-way connection to the data acquisition device, the data acquisition device has a two-way connection to the voice processing module, and the data acquisition device has a one-way connection to the mechanical moving parts. The intelligent processing system can judge whether it needs to interact with a customer, realizing man-machine interaction in place of the current rigid frame model.
Distributed cluster management techniques for unattended ground sensor networks
NASA Astrophysics Data System (ADS)
Essawy, Magdi A.; Stelzig, Chad A.; Bevington, James E.; Minor, Sharon
2005-05-01
Smart sensor networks are becoming important target detection and tracking tools. The challenging problems in such networks include sensor fusion, data management, and communication schemes. This work discusses techniques used to distribute sensor management and multi-target tracking responsibilities across an ad hoc, self-healing cluster of sensor nodes. Although miniaturized computing resources possess the ability to host complex tracking and data fusion algorithms, there still exist inherent bandwidth constraints on the RF channel. Therefore, special attention is placed on reducing node-to-node communications within the cluster by minimizing unsolicited messaging and by distributing the sensor fusion and tracking tasks onto local portions of the network. Several challenging problems are addressed in this work, including track initialization and conflict resolution, track ownership handling, and communication control optimization. Emphasis is also placed on increasing the overall robustness of the sensor cluster through independent decision capabilities on all sensor nodes. Track initiation is performed using collaborative sensing within a neighborhood of sensor nodes, allowing each node to independently determine whether it should assume initial track ownership. This autonomous track initiation prevents the formation of duplicate tracks while eliminating the need for a central "management" node to assign tracking responsibilities. Track updates are performed as an ownership node requests sensor reports from neighboring nodes, based on the track error covariance and the neighboring nodes' geo-positional locations. Track ownership is periodically recomputed using propagated track states to determine which sensing node provides the desired coverage characteristics.
High-fidelity multi-target simulation results are presented, indicating that distributing sensor management and tracking capabilities not only reduces communication bandwidth consumption but also simplifies multi-target tracking within the cluster.
Heart rate estimation from FBG sensors using cepstrum analysis and sensor fusion.
Zhu, Yongwei; Fook, Victor Foo Siang; Jianzhong, Emily Hao; Maniyeri, Jayachandran; Guan, Cuntai; Zhang, Haihong; Jiliang, Eugene Phua; Biswas, Jit
2014-01-01
This paper presents a method of estimating heart rate from arrays of fiber Bragg grating (FBG) sensors embedded in a mat. A cepstral-domain signal analysis technique is proposed to characterize ballistocardiogram (BCG) signals. With this technique, the average heartbeat interval can be estimated by detecting the dominant peak in the cepstrum, and the signals of multiple sensors can be fused to obtain a higher signal-to-noise ratio than any individual sensor. Experiments were conducted with 10 human subjects lying in 2 different postures on a bed. The heart rate estimated from the BCG was compared with ground-truth heart rate from ECG, and the mean estimation error obtained is below 1 beat per minute (BPM). The results show that the proposed fusion method can achieve promising heart rate measurement accuracy and robustness against various sensor contact conditions.
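The cepstral peak-picking step can be sketched as follows: compute a real cepstrum per sensor, average the cepstra across sensors (the fusion step), and take the dominant peak within a physiologically plausible lag range as the mean beat interval. The synthetic pulse-train signal, sampling rate, and search range below are assumptions for illustration; the paper's FBG preprocessing is not reproduced.

```python
import numpy as np

FS = 50.0  # sampling rate in Hz (assumed)

def real_cepstrum(x):
    """Inverse FFT of the log power spectrum."""
    spectrum = np.abs(np.fft.rfft(x)) ** 2
    return np.fft.irfft(np.log(spectrum + 1e-12))

def heart_rate_bpm(signals, fs=FS):
    """Fuse sensors by averaging their cepstra, then pick the dominant
    quefrency peak in the 40-150 BPM range as the mean beat interval."""
    fused = np.mean([real_cepstrum(s) for s in signals], axis=0)
    lo, hi = int(fs * 60 / 150), int(fs * 60 / 40)  # lag bounds in samples
    lag = lo + np.argmax(fused[lo:hi])
    return 60.0 * fs / lag

# Synthetic BCG-like pulse train: one narrow Gaussian pulse per beat at 60 BPM.
t = np.arange(0, 20, 1 / FS)
beat = lambda phase: np.exp(-0.5 * ((t % 1.0) - phase) ** 2 / 0.02 ** 2)
sensors = [beat(0.3), 0.5 * beat(0.3)]  # two sensors, different amplitudes
print(heart_rate_bpm(sensors))
```

A periodic pulse train has harmonics at regular frequency spacing, so its log spectrum is comb-like and the cepstrum shows a sharp peak at the beat interval, which is what the lag search picks up.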
Non-verbal communication through sensor fusion
NASA Astrophysics Data System (ADS)
Tairych, Andreas; Xu, Daniel; O'Brien, Benjamin M.; Anderson, Iain A.
2016-04-01
When we communicate face to face, we subconsciously engage our whole body to convey our message. In telecommunication, e.g. during phone calls, this powerful information channel cannot be used. Capturing nonverbal information from body motion and transmitting it to the receiver parallel to speech would make these conversations feel much more natural. This requires a sensing device that is capable of capturing different types of movements, such as the flexion and extension of joints, and the rotation of limbs. In a first embodiment, we developed a sensing glove that is used to control a computer game. Capacitive dielectric elastomer (DE) sensors measure finger positions, and an inertial measurement unit (IMU) detects hand roll. These two sensor technologies complement each other, with the IMU allowing the player to move an avatar through a three-dimensional maze, and the DE sensors detecting finger flexion to fire weapons or open doors. After demonstrating the potential of sensor fusion in human-computer interaction, we take this concept to the next level and apply it in nonverbal communication between humans. The current fingerspelling glove prototype uses capacitive DE sensors to detect finger gestures performed by the sending person. These gestures are mapped to corresponding messages and transmitted wirelessly to another person. A concept for integrating an IMU into this system is presented. The fusion of the DE sensor and the IMU combines the strengths of both sensor types, and therefore enables very comprehensive body motion sensing, which makes a large repertoire of gestures available to nonverbal communication over distances.
Fusion of intraoperative force sensoring, surface reconstruction and biomechanical modeling
NASA Astrophysics Data System (ADS)
Röhl, S.; Bodenstedt, S.; Küderle, C.; Suwelack, S.; Kenngott, H.; Müller-Stich, B. P.; Dillmann, R.; Speidel, S.
2012-02-01
Minimally invasive surgery is medically complex and can benefit heavily from computer assistance. One way to help the surgeon is to integrate preoperative planning data into the surgical workflow. This information can be represented as a customized preoperative model of the surgical site. To use it intraoperatively, it has to be updated during the intervention due to the constantly changing environment. Hence, intraoperative sensor data has to be acquired and registered with the preoperative model. Haptic information, which could complement the visual sensor data, is still not established. In addition, biomechanical modeling of the surgical site can help reflect changes that cannot be captured by intraoperative sensors. We present a setting in which a force sensor is integrated into a laparoscopic instrument. In a test scenario using a silicone liver phantom, we register the measured forces with a surface model reconstructed from stereo endoscopic images and with a finite element model. The endoscope, the instrument, and the liver phantom are tracked with a Polaris optical tracking system. By fusing this information, we can transfer the deformation onto the finite element model. The purpose of this setting is to demonstrate the principles needed and the methods developed for intraoperative sensor data fusion. One emphasis lies on the calibration of the force sensor with the instrument and on first experiments with soft tissue. We also present our solution and first results concerning the integration of the force sensor, as well as the accuracy of the fusion of force measurements, surface reconstruction, and biomechanical modeling.
Neuro-Analogical Gate Tuning of Trajectory Data Fusion for a Mecanum-Wheeled Special Needs Chair
ElSaharty, M. A.; Zakzouk, Ezz Eldin
2017-01-01
Trajectory tracking of mobile wheeled chairs using internal shaft encoders and an inertial measurement unit (IMU) exhibits several complications and accumulated errors in the tracking process due to wheel slippage, offset drift, and integration approximations. These errors can be seen when comparing localization results from such sensors with a camera tracking system. In long trajectory tracking, such errors can accumulate and result in significant deviations, which make data from these sensors unreliable for tracking. Meanwhile, the use of an external camera tracking system is not always a feasible solution, depending on the implementation environment. This paper presents a novel sensor fusion method that combines the measurements of internal sensors to accurately predict the location of the wheeled chair in an environment. The method introduces a new analogical OR gate structure whose parameters are tuned using a multi-layer feedforward neural network, denoted the “Neuro-Analogical Gate” (NAG). The resulting system minimizes deviation errors caused by the sensors, thus accurately tracking the wheeled chair's location without requiring an external camera tracking system. The fusion methodology has been tested with a prototype Mecanum-wheel-based chair, and significant improvements in tracking response, error, and performance have been observed. PMID:28045973
Fusion of footsteps and face biometrics on an unsupervised and uncontrolled environment
NASA Astrophysics Data System (ADS)
Vera-Rodriguez, Ruben; Tome, Pedro; Fierrez, Julian; Ortega-Garcia, Javier
2012-06-01
This paper reports, for the first time, experiments on the fusion of footsteps and face in an unsupervised and uncontrolled environment for person authentication. Footstep recognition is a relatively new biometric based on signals extracted from people walking over floor sensors. The idea of fusing footsteps and face starts from the premise that in an area where footstep sensors are installed, it is very simple to also place a camera to capture the face of the person walking over the sensors. This setup may find application in scenarios like ambient assisted living, smart homes, eldercare, or security access. The paper reports a comparative assessment of both biometrics using the same database and experimental protocols. In the experimental work we consider two different applications: smart homes (a small group of users with a large set of training data) and security access (a larger group of users with a small set of training data), obtaining results of 0.9% and 5.8% EER, respectively, for the fusion of both modalities. This is a significant performance improvement compared with the results obtained by the individual systems.
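As a sketch of the score-level evaluation behind such EER figures, the snippet below fuses two modalities by a weighted sum of match scores and estimates the equal error rate (EER) by sweeping a decision threshold. The scores and the equal fusion weight are made-up toy values; the paper's actual matchers and protocols are not reproduced.

```python
import numpy as np

def eer(genuine, impostor):
    """Equal error rate: sweep thresholds and return the operating point
    where the false-accept and false-reject rates are closest."""
    genuine, impostor = np.asarray(genuine), np.asarray(impostor)
    best_far, best_frr, best_gap = 1.0, 0.0, 2.0
    for t in np.linspace(0.0, 1.0, 1001):
        far = np.mean(impostor >= t)  # false accept rate
        frr = np.mean(genuine < t)    # false reject rate
        if abs(far - frr) < best_gap:
            best_far, best_frr, best_gap = far, frr, abs(far - frr)
    return (best_far + best_frr) / 2.0

# Toy per-modality scores in [0, 1]; weighted-sum fusion of footsteps + face.
foot_gen, foot_imp = np.array([0.6, 0.9, 0.4]), np.array([0.5, 0.2, 0.1])
face_gen, face_imp = np.array([0.8, 0.5, 0.7]), np.array([0.3, 0.6, 0.2])
w = 0.5  # fusion weight (assumed equal here; tuned on training data in practice)
fused_gen = w * foot_gen + (1 - w) * face_gen
fused_imp = w * foot_imp + (1 - w) * face_imp
print(eer(foot_gen, foot_imp), eer(fused_gen, fused_imp))
```

In this toy case each modality's score distributions overlap, but the fused scores separate genuine from impostor attempts completely, mirroring the improvement the paper reports for fusion over the individual systems.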
Estimating time available for sensor fusion exception handling
NASA Astrophysics Data System (ADS)
Murphy, Robin R.; Rogers, Erika
1995-09-01
In previous work, we developed a generate, test, and debug methodology for detecting, classifying, and responding to sensing failures in autonomous and semi-autonomous mobile robots. An important issue has arisen from these efforts: how much time is available to classify the cause of a failure and determine an alternative sensing strategy before the robot mission must be terminated? In this paper, we consider the impact of time for teleoperation applications in which a remote robot attempts to autonomously maintain sensing in the presence of failures, yet has the option to contact the local for further assistance. Time limits are determined by using evidential reasoning with a novel generalization of Dempster-Shafer theory. The generalized Dempster-Shafer theory is used to estimate the time remaining until the robot behavior must be suspended because of uncertainty; this becomes the time limit on autonomous exception handling at the remote. If the remote cannot complete exception handling in this time or needs assistance, responsibility is passed to the local, while the remote assumes a `safe' state. An intelligent assistant then facilitates human intervention, either directing the remote without human assistance or coordinating data collection and presentation to the operator within the time limits imposed by the mission. The impact of time on exception handling activities is demonstrated using video camera sensor data.
NASA Technical Reports Server (NTRS)
1990-01-01
Various papers on remote sensing (RS) for the nineties are presented. The general topics addressed include: subsurface methods, radar scattering, oceanography, microwave models, atmospheric correction, passive microwave systems, RS in tropical forests, moderate resolution land analysis, SAR geometry and SNR improvement, image analysis, inversion and signal processing for geoscience, surface scattering, rain measurements, sensor calibration, wind measurements, terrestrial ecology, agriculture, geometric registration, subsurface sediment geology, radar modulation mechanisms, radar ocean scattering, SAR calibration, airborne radar systems, water vapor retrieval, forest ecosystem dynamics, land analysis, multisensor data fusion. Also considered are: geologic RS, RS sensor optical measurements, RS of snow, temperature retrieval, vegetation structure, global change, artificial intelligence, SAR processing techniques, geologic RS field experiment, stochastic modeling, topography and Digital Elevation model, SAR ocean waves, spaceborne lidar and optical, sea ice field measurements, millimeter waves, advanced spectroscopy, spatial analysis and data compression, SAR polarimetry techniques. Also discussed are: plant canopy modeling, optical RS techniques, optical and IR oceanography, soil moisture, sea ice back scattering, lightning cloud measurements, spatial textural analysis, SAR systems and techniques, active microwave sensing, lidar and optical, radar scatterometry, RS of estuaries, vegetation modeling, RS systems, EOS/SAR Alaska, applications for developing countries, SAR speckle and texture.
Using generic tool kits to build intelligent systems
NASA Technical Reports Server (NTRS)
Miller, David J.
1994-01-01
The Intelligent Systems and Robots Center at Sandia National Laboratories is developing technologies for the automation of processes associated with environmental remediation and information-driven manufacturing. These technologies, which focus on automated planning and programming and on sensor-based and model-based control, are used to build intelligent systems that are able to generate plans of action, program the necessary devices, and use sensors to react to changes in the environment. By automating tasks through the use of programmable devices tied to computer models which are augmented by sensing, requirements for faster, safer, and cheaper systems are being satisfied. However, because of the need for rapid, cost-effective prototyping and multi-laboratory teaming, it is also necessary to define a consistent approach to the construction of controllers for such systems. As a result, the Generic Intelligent System Controller (GISC) concept has been developed. This concept promotes the philosophy of producing generic tool kits which can be used and reused to build intelligent control systems.
Intelligence Control System for Landfills Based on Wireless Sensor Network
NASA Astrophysics Data System (ADS)
Zhang, Qian; Huang, Chuan; Gong, Jian
2018-06-01
This paper puts forward an intelligence system for controlling landfill gas (LFG) in landfills so that the gas is exhausted controllably and actively. The system, which is coordinated by a wireless sensor network, was developed to be supervised by remote applications in the workshop instead of by manual work. An automatic control valve driven by embedded sensor units is installed in each tube; the air pressure and the concentration of LFG are detected to decide the level of the valve switch. The paper also proposes a modified algorithm to solve the transmission problem, so that the system can maintain high efficiency and a long service life.
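A minimal sketch of the embedded valve decision, with hypothetical threshold values (the paper does not give its setpoints): the valve opens when gas pressure or LFG concentration exceeds a high threshold and closes again only once both readings fall below a lower one, the hysteresis band preventing rapid toggling.

```python
# Illustrative thresholds (assumed, not from the paper).
PRESSURE_HIGH, PRESSURE_LOW = 120.0, 100.0   # kPa
CH4_HIGH, CH4_LOW = 40.0, 30.0               # % LFG concentration

def next_valve_state(is_open, pressure_kpa, ch4_pct):
    """Hysteresis control: open on any high reading, close only once both
    readings fall below the low thresholds."""
    if pressure_kpa >= PRESSURE_HIGH or ch4_pct >= CH4_HIGH:
        return True
    if pressure_kpa <= PRESSURE_LOW and ch4_pct <= CH4_LOW:
        return False
    return is_open  # inside the hysteresis band: keep the current state

state = False
for pressure, ch4 in [(90, 20), (125, 20), (110, 20), (95, 25)]:
    state = next_valve_state(state, pressure, ch4)
    print(state)
```

Note how the third reading (110 kPa) sits between the thresholds, so the valve stays open rather than chattering shut and open again.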
Sensor data fusion for spectroscopy-based detection of explosives
NASA Astrophysics Data System (ADS)
Shah, Pratik V.; Singh, Abhijeet; Agarwal, Sanjeev; Sedigh, Sahra; Ford, Alan; Waterbury, Robert
2009-05-01
In-situ trace detection of explosive compounds such as RDX, TNT, and ammonium nitrate is an important problem for the detection of IEDs and IED precursors. Spectroscopic techniques such as LIBS and Raman have shown promise for the detection of residues of explosive compounds on surfaces from standoff distances. Individually, both the LIBS and Raman techniques suffer from various limitations; for example, their robustness and reliability suffer from variations in peak strengths and locations. However, the orthogonal nature of the spectral and compositional information provided by these techniques makes them suitable candidates for sensor fusion to improve overall detection performance. In this paper, we utilize peak energies in a region by fitting Lorentzian or Gaussian peaks around the locations of interest. Ratios of peak energies are used for discrimination, in order to normalize out changes in overall signal strength. Two data fusion techniques are discussed in this paper. Multi-spot fusion is performed on a set of independent samples from the same region, based on a maximum-likelihood formulation. In addition, the results from the LIBS and Raman sensors are fused using linear discriminators. Improved detection performance with significantly reduced false-alarm rates is reported using these fusion techniques on data collected for a sponsor demonstration at Fort Leonard Wood.
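The peak-energy-ratio idea can be sketched as follows: integrate the spectrum over a window around each peak of interest, use the ratio of the two energies as a strength-invariant feature, and combine per-sensor features with a linear discriminator. The Lorentzian test spectrum, window widths, and discriminator weights are illustrative assumptions (the paper fits Lorentzian/Gaussian shapes rather than integrating windows directly).

```python
import numpy as np

def peak_energy(wavelength, intensity, center, half_width):
    """Riemann-sum energy of the spectrum in a window around a peak."""
    mask = np.abs(wavelength - center) <= half_width
    return np.sum(intensity[mask]) * (wavelength[1] - wavelength[0])

def lorentzian(x, x0, gamma, amp):
    return amp * gamma**2 / ((x - x0) ** 2 + gamma**2)

# Synthetic spectrum with two well-separated Lorentzian lines; scaling the
# whole spectrum changes both energies but not their ratio.
x = np.linspace(0, 400, 4001)
spec = lorentzian(x, 100, 1.0, 2.0) + lorentzian(x, 300, 1.0, 1.0)
ratio = peak_energy(x, spec, 100, 10) / peak_energy(x, spec, 300, 10)
scaled = peak_energy(x, 3 * spec, 100, 10) / peak_energy(x, 3 * spec, 300, 10)

# Decision-level fusion: a linear discriminator over per-sensor ratio features
# (weights here are arbitrary placeholders, not trained values).
def fused_score(libs_ratio, raman_ratio, w=(0.6, 0.4), bias=-1.5):
    return w[0] * libs_ratio + w[1] * raman_ratio + bias

print(ratio, fused_score(ratio, ratio))
```

Because both peak energies scale together with overall signal strength, the ratio feature is unchanged when the spectrum is tripled, which is exactly the normalization property the abstract describes.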
Conception d'un capteur intelligent pour la détection des vapeurs de styrène dans l'industrie
NASA Astrophysics Data System (ADS)
Agbossou, Kodjo; Agbebavi, T. James; Koffi, Demagna; Elhiri, Mohammed
1994-10-01
The techniques for measuring toxic gases are nowadays based on semiconductor-type sensors. Modelling and electronic processing of their signals can be used to improve the accuracy and efficiency of the measurement. In this paper, an intelligent system using a semiconductor sensor has been designed for the detection of styrene vapors. A set of environmental-parameter sensors, for temperature, pressure, and humidity, is added to the basic sensor and allows precise detection of styrene vapors in air. A microcontroller and a communication interface, included in the control system and in the data processing system, provide the local intelligence; the linearization routines for the different sensors are stored in the microcontroller's memory. The system, made up of the sensors, the amplification circuits, the microcontroller, and the communication network between the smart sensor and the computer, is analysed. A laboratory test of the device is presented, and the accuracies, sensitivities, and efficiencies of the different sensors are given.
Sarker, Muzaddid; de Antueno, Roberto; Langelaan, David N.; Parmar, Hiren B.; Shin, Kyungsoo; Rainey, Jan K.; Duncan, Roy
2015-01-01
Pore formation is the most energy-demanding step during virus-induced membrane fusion, where high curvature of the fusion pore rim increases the spacing between lipid headgroups, exposing the hydrophobic interior of the membrane to water. How protein fusogens breach this thermodynamic barrier to pore formation is unclear. We identified a novel fusion-inducing lipid packing sensor (FLiPS) in the cytosolic endodomain of the baboon reovirus p15 fusion-associated small transmembrane (FAST) protein that is essential for pore formation during cell-cell fusion and syncytiogenesis. NMR spectroscopy and mutational studies indicate the dependence of this FLiPS on a hydrophobic helix-loop-helix structure. Biochemical and biophysical assays reveal the p15 FLiPS preferentially partitions into membranes with high positive curvature, and this partitioning is impeded by bis-ANS, a small molecule that inserts into hydrophobic defects in membranes. Most notably, the p15 FLiPS can be functionally replaced by heterologous amphipathic lipid packing sensors (ALPS) but not by other membrane-interactive amphipathic helices. Furthermore, a previously unrecognized amphipathic helix in the cytosolic domain of the reptilian reovirus p14 FAST protein can functionally replace the p15 FLiPS, and is itself replaceable by a heterologous ALPS motif. Anchored near the cytoplasmic leaflet by the FAST protein transmembrane domain, the FLiPS is perfectly positioned to insert into hydrophobic defects that begin to appear in the highly curved rim of nascent fusion pores, thereby lowering the energy barrier to stable pore formation. PMID:26061049
Selected Tracking and Fusion Applications for the Defence and Security Domain
2010-05-01
…characterized, for example, by sensor ranges from less than a meter to hundreds of kilometers, by time scales ranging from less than a second to a few… The work has been carried out within the framework of a multinational technology program called MAJIIC (Multi-Sensor Aerospace-Ground Joint ISR Interoperability…
Design of intelligent composites with life-cycle health management capabilities
NASA Astrophysics Data System (ADS)
Rosania, Colleen L.; Larrosa, Cecilia C.; Chang, Fu-Kuo
2015-03-01
Use of carbon fiber reinforced polymers (CFRPs) presents challenges because of their complex manufacturing processes and damage mechanics that differ from those of legacy metal materials. New monitoring methods for manufacturing, quality verification, damage estimation, and prognosis are needed to use CFRPs safely and efficiently. This work evaluates the development of intelligent composite materials using integrated piezoelectric sensors to monitor the material during cure and throughout its service life. These sensors are used to propagate ultrasonic waves through the structure for health monitoring. During manufacturing, data are collected at different stages of the cure cycle, detecting the changing material properties during cure and verifying quality and degree of cure. The same sensors can then be used with previously developed techniques to perform damage detection, such as impact detection and matrix crack density estimation. Real-time damage estimation can be combined with prognostic models to predict the future propagation of damage in the material. In this work, experimental results will be presented from composite coupons with embedded piezoelectric sensors. Cure monitoring and damage detection results derived from analysis of the ultrasonic sensor signal will be shown. Signal parameters in both the time and frequency domains that are sensitive to the different stimuli will be explored in this analysis. From these results, use of the same sensor networks from manufacturing throughout the life of the composite material will demonstrate the full life-cycle monitoring capability of these intelligent materials.
Development of a Low-Cost Attitude Sensor for Agricultural Vehicles
USDA-ARS?s Scientific Manuscript database
The objective of this research was to develop a low-cost attitude sensor for agricultural vehicles. The attitude sensor was composed of three vibratory gyroscopes and two inclinometers. A sensor fusion algorithm was developed to estimate the tilt angles (roll and pitch) by the least-squares method. In the a...
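The excerpt does not detail the least-squares fusion, so as a hedged stand-in the sketch below fuses a drifting gyroscope rate with a drift-free but noisy inclinometer reading using a standard complementary filter, a simpler technique that addresses the same gyro-bias problem. The gain, bias, and slope values are assumed.

```python
def complementary_filter(angle, gyro_rate, incl_angle, dt, alpha=0.98):
    """Blend the integrated gyro rate (good short-term) with the
    inclinometer angle (good long-term, no drift)."""
    return alpha * (angle + gyro_rate * dt) + (1 - alpha) * incl_angle

# Stationary vehicle on a 10-degree slope: the gyro reports a small bias
# instead of a zero rate, while the inclinometer reads the true angle.
angle, dt = 0.0, 0.02
for _ in range(2000):
    angle = complementary_filter(angle, gyro_rate=0.01, incl_angle=10.0, dt=dt)
print(angle)
```

Integrating the biased gyro alone would drift without bound; the small inclinometer correction term pulls the estimate to within a fraction of a degree of the true tilt.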
Fusion of 3D laser scanner and depth images for obstacle recognition in mobile applications
NASA Astrophysics Data System (ADS)
Budzan, Sebastian; Kasprzyk, Jerzy
2016-02-01
The problem of obstacle detection and recognition or, generally, scene mapping is one of the most investigated problems in computer vision, especially in mobile applications. In this paper a fused optical system using depth information with color images gathered from the Microsoft Kinect sensor and 3D laser range scanner data is proposed for obstacle detection and ground estimation in real-time mobile systems. The algorithm consists of feature extraction in the laser range images, processing of the depth information from the Kinect sensor, fusion of the sensor information, and classification of the data into two separate categories: road and obstacle. Exemplary results are presented and it is shown that fusion of information gathered from different sources increases the effectiveness of the obstacle detection in different scenarios, and it can be used successfully for road surface mapping.
Decision-level fusion of SAR and IR sensor information for automatic target detection
NASA Astrophysics Data System (ADS)
Cho, Young-Rae; Yim, Sung-Hyuk; Cho, Hyun-Woong; Won, Jin-Ju; Song, Woo-Jin; Kim, So-Hyeon
2017-05-01
We propose a decision-level architecture that combines synthetic aperture radar (SAR) and an infrared (IR) sensor for automatic target detection. We present a new size-based feature, called the target silhouette, to reduce the number of false alarms produced by the conventional target-detection algorithm. Boolean map visual theory is used to combine a pair of SAR and IR images into a target-enhanced map, and a basic belief assignment is then used to transform this map into a belief map. The detection results of the sensors are combined to build the target-silhouette map. We integrate the fusion mass and the target-silhouette map at the decision level to exclude false alarms. The proposed algorithm is evaluated on a synthetic SAR and IR database generated by the SE-WORKBENCH simulator and compared with conventional algorithms. The proposed fusion scheme achieves a higher detection rate and a lower false-alarm rate than the conventional algorithms.
Utilization of extended bayesian networks in decision making under uncertainty
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Eeckhout, Edward M; Leishman, Deborah A; Gibson, William L
2009-01-01
A Bayesian network tool, called IKE (Integrated Knowledge Engine), has been developed to assess the probability of undesirable events. The tool allows indications and observables from sensors and/or intelligence to feed directly into hypotheses of interest, thus allowing one to quantify the probability and uncertainty of these events from very disparate evidence. For example, the probability that a facility is processing nuclear fuel or assembling a weapon can be assessed by examining the processes required, establishing the observables that should be present, and then assembling information from intelligence, sensors, and other information sources related to those observables. IKE also has the capability to determine tasking plans, that is, to prioritize which observable should be collected next to most quickly ascertain the 'true' state and drive the probability toward 'zero' or 'one.' This optimization capability is called 'evidence marshaling.' One example to be discussed is a denied-facility monitoring situation: there is concern that certain processes are being executed at the site (due to some intelligence or other data). We show how additional pieces of evidence then ascertain, with some degree of certainty, the likelihood of these processes as each piece of evidence is obtained. This example shows how both intelligence and sensor data can be incorporated into the analysis. A second example involves real-time perimeter security. For this demonstration we used seismic, acoustic, and optical sensors linked back to IKE. We show how these sensors identified and assessed the likelihood of 'intruder' versus friendly vehicles.
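The core inference step described above can be sketched as a sequential Bayesian update of a single hypothesis given observables assumed conditionally independent; the likelihood numbers are invented for illustration, and IKE's actual network structure and evidence-marshaling optimizer are not reproduced.

```python
def update(prior, p_obs_given_h, p_obs_given_not_h):
    """Posterior probability of hypothesis H after seeing one observable."""
    num = p_obs_given_h * prior
    return num / (num + p_obs_given_not_h * (1 - prior))

# Hypothesis H: "the facility is processing nuclear fuel" (prior 0.5).
# Each pair is (P(observable | H), P(observable | not H)); the first two
# observables support H, the third weakly argues against it.
p = 0.5
for likelihoods in [(0.9, 0.2), (0.9, 0.2), (0.3, 0.4)]:
    p = update(p, *likelihoods)
    print(round(p, 4))
```

Observables with a large likelihood ratio move the posterior fastest, which is the intuition behind prioritizing collection in evidence marshaling.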
Modeling Sensor Reliability in Fault Diagnosis Based on Evidence Theory
Yuan, Kaijuan; Xiao, Fuyuan; Fei, Liguo; Kang, Bingyi; Deng, Yong
2016-01-01
Sensor data fusion plays an important role in fault diagnosis. Dempster–Shafer (D-S) evidence theory is widely used in fault diagnosis, since it efficiently combines evidence from different sensors. However, in situations where the evidence highly conflicts, it may produce a counterintuitive result. To address this issue, a new method is proposed in this paper that takes into consideration not only the static sensor reliability but also the dynamic sensor reliability. The evidence distance function and the belief entropy are combined to obtain the dynamic reliability of each sensor report. A weighted-averaging method is adopted to modify the conflicting evidence by assigning different weights to the evidence according to sensor reliability. The proposed method performs better in conflict management and fault diagnosis because the information volume of each sensor report is taken into consideration. An application in fault diagnosis based on sensor fusion is presented to illustrate the efficiency of the proposed method. The results show that the proposed method improves the accuracy of fault diagnosis from 81.19% to 89.48% compared to existing methods. PMID:26797611
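Dempster's rule of combination, which the paper modifies with reliability-weighted averaging, can be sketched for two sensor reports over a small frame of discernment. The mass values below are toy numbers, and the reliability weighting itself is not reproduced.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions (dicts mapping frozensets of hypotheses
    to masses) with Dempster's rule, normalizing out the conflicting mass."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass that would fall on the empty set
    return {k: v / (1.0 - conflict) for k, v in combined.items()}, conflict

A, B = frozenset({"fault_A"}), frozenset({"fault_B"})
m1 = {A: 0.6, A | B: 0.4}          # sensor 1 leans toward fault A
m2 = {B: 0.4, A | B: 0.6}          # sensor 2 leans toward fault B
fused, k = dempster_combine(m1, m2)
print(fused, k)
```

The conflict mass k measures how much the two reports disagree; when k approaches 1, the normalization step produces the counterintuitive results that motivate the paper's reliability-based modification.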