NASA Astrophysics Data System (ADS)
Huang, Zhijiong; Hu, Yongtao; Zheng, Junyu; Zhai, Xinxin; Huang, Ran
2018-05-01
Lateral boundary conditions (LBCs) are essential for chemical transport models to simulate regional transport; however, they often contain large uncertainties. This study proposes an optimized data fusion approach to reduce the bias of LBCs by fusing gridded model outputs, from which the daughter domain's LBCs are derived, with ground-level measurements. The optimized data fusion approach follows the framework of a previous interpolation-based fusion method but improves it by using a bias kriging method to correct the spatial bias in gridded model outputs. Cross-validation shows that the optimized approach better estimates fused fields in areas with a large number of observations compared to the previous interpolation-based method. The optimized approach was applied to correct LBCs of PM2.5 concentrations for simulations in the Pearl River Delta (PRD) region as a case study. Evaluations show that the LBCs corrected by data fusion improve in-domain PM2.5 simulations in terms of magnitude and temporal variance. Correlation increases by 0.13-0.18 and fractional bias (FB) decreases by approximately 3%-15%. This study demonstrates the feasibility of applying data fusion to improve regional air quality modeling.
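The bias-correction step described here can be illustrated with a minimal sketch: station residuals (model minus observation) are interpolated to grid points by ordinary kriging and subtracted from the raw model field before the daughter-domain LBCs are extracted. This is a generic illustration, not the authors' code; the exponential variogram, its parameters, and all values below are assumed for the example.

```python
import numpy as np

def exp_cov(h, sill=1.0, rng=50.0):
    """Exponential covariance model for separation distance h (km)."""
    return sill * np.exp(-h / rng)

def krige_bias(obs_xy, bias, grid_xy, sill=1.0, rng=50.0):
    """Ordinary kriging of station biases (model minus observation) onto grid points."""
    n = len(obs_xy)
    d_oo = np.linalg.norm(obs_xy[:, None, :] - obs_xy[None, :, :], axis=-1)
    d_og = np.linalg.norm(obs_xy[:, None, :] - grid_xy[None, :, :], axis=-1)
    # Ordinary-kriging system with a Lagrange multiplier enforcing unbiasedness
    A = np.ones((n + 1, n + 1)); A[:n, :n] = exp_cov(d_oo, sill, rng); A[n, n] = 0.0
    b = np.ones((n + 1, d_og.shape[1])); b[:n, :] = exp_cov(d_og, sill, rng)
    w = np.linalg.solve(A, b)              # kriging weights (+ multiplier) per grid point
    return w[:n, :].T @ bias               # interpolated bias at each grid point

# Toy example: correct a model field at 3 grid points using 4 monitors
obs_xy = np.array([[0., 0.], [30., 10.], [10., 40.], [50., 50.]])
model_at_obs = np.array([42., 55., 61., 38.])    # model PM2.5 at the monitors
observed = np.array([35., 50., 52., 40.])        # measured PM2.5
grid_xy = np.array([[5., 5.], [25., 25.], [45., 45.]])
raw_grid = np.array([44., 58., 40.])             # uncorrected model field

bias_grid = krige_bias(obs_xy, model_at_obs - observed, grid_xy)
fused_grid = raw_grid - bias_grid                # bias-corrected field for LBC extraction
print(fused_grid)
```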
Design of a multisensor data fusion system for target detection
NASA Astrophysics Data System (ADS)
Thomopoulos, Stelios C.; Okello, Nickens N.; Kadar, Ivan; Lovas, Louis A.
1993-09-01
The objective of this paper is to discuss the issues that are involved in the design of a multisensor fusion system and provide a systematic analysis and synthesis methodology for the design of the fusion system. The system under consideration consists of multifrequency (similar) radar sensors. However, the fusion design must be flexible to accommodate additional dissimilar sensors such as IR, EO, ESM, and Ladar. The motivation for the system design is the proof of the fusion concept for enhancing the detectability of small targets in clutter. In the context of down-selecting the proper configuration for multisensor (similar and dissimilar, and centralized vs. distributed) data fusion, the issues of data modeling, fusion approaches, and fusion architectures need to be addressed for the particular application being considered. Although the study of different approaches may proceed in parallel, the interplay among them is crucial in selecting a fusion configuration for a given application. The natural sequence for addressing the three different issues is to begin from the data modeling, in order to determine the information content of the data. This information will dictate the appropriate fusion approach. This, in turn, will lead to a global fusion architecture. Both distributed and centralized fusion architectures are used to illustrate the design issues along with Monte-Carlo simulation performance comparison of a single sensor versus a multisensor centrally fused system.
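The single-sensor versus centrally fused comparison mentioned above can be illustrated with a toy Monte-Carlo experiment in which each sensor contributes a log-likelihood ratio and the fusion center sums them before thresholding. The Gaussian signal model, amplitude, and threshold below are assumptions for illustration only, not the paper's radar model.

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, snr = 100_000, 1.0          # illustrative signal amplitude in unit-variance noise

def detect(n_sensors, threshold):
    """Centralized fusion: sum per-sensor log-likelihood ratios, then threshold."""
    # Gaussian measurements under H1 (target present) and H0 (clutter only)
    h1 = rng.normal(snr, 1.0, (n_trials, n_sensors))
    h0 = rng.normal(0.0, 1.0, (n_trials, n_sensors))
    llr = lambda z: (snr * z - snr**2 / 2).sum(axis=1)   # LLR for a known signal in Gaussian noise
    pd = np.mean(llr(h1) > threshold)
    pfa = np.mean(llr(h0) > threshold)
    return pd, pfa

for k in (1, 2, 4):
    pd, pfa = detect(k, threshold=1.0)
    print(f"{k} sensor(s): Pd={pd:.3f}  Pfa={pfa:.3f}")
```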
Leaf area index uncertainty estimates for model-data fusion applications
Andrew D. Richardson; D. Bryan Dail; D.Y. Hollinger
2011-01-01
Estimates of data uncertainties are required to integrate different observational data streams as model constraints using model-data fusion. We describe an approach with which random and systematic uncertainties in optical measurements of leaf area index (LAI) can be quantified. We use data from a measurement campaign at the spruce-dominated Howland Forest AmeriFlux...
Formulating Spatially Varying Performance in the Statistical Fusion Framework
Landman, Bennett A.
2012-01-01
To date, label fusion methods have primarily relied either on global (e.g. STAPLE, globally weighted vote) or voxelwise (e.g. locally weighted vote) performance models. Optimality of the statistical fusion framework hinges upon the validity of the stochastic model of how a rater errs (i.e., the labeling process model). Hitherto, approaches have tended to focus on the extremes of potential models. Herein, we propose an extension to the STAPLE approach to seamlessly account for spatially varying performance by extending the performance level parameters to account for a smooth, voxelwise performance level field that is unique to each rater. This approach, Spatial STAPLE, provides significant improvements over state-of-the-art label fusion algorithms in both simulated and empirical data sets. PMID:22438513
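For context, a minimal sketch of the classic (global-performance) STAPLE expectation-maximization loop that Spatial STAPLE generalizes is shown below for binary labels; the spatially varying performance field of the paper is not implemented here, and the initial values and data are arbitrary.

```python
import numpy as np

def staple_binary(D, prior=0.5, n_iter=50):
    """Minimal binary STAPLE: D is (n_raters, n_voxels) of 0/1 labels.
    Returns voxelwise P(true label = 1) and per-rater (sensitivity, specificity)."""
    R, V = D.shape
    p = np.full(R, 0.9)   # initial sensitivities
    q = np.full(R, 0.9)   # initial specificities
    for _ in range(n_iter):
        # E-step: voxelwise posterior of the true label given all rater decisions
        a = prior * np.prod(np.where(D == 1, p[:, None], 1 - p[:, None]), axis=0)
        b = (1 - prior) * np.prod(np.where(D == 0, q[:, None], 1 - q[:, None]), axis=0)
        W = a / (a + b)
        # M-step: re-estimate each rater's global performance parameters
        p = (D * W).sum(axis=1) / W.sum()
        q = ((1 - D) * (1 - W)).sum(axis=1) / (1 - W).sum()
    return W, p, q

# Toy example: three raters labelling ten voxels
D = np.array([[1, 1, 1, 0, 0, 0, 1, 1, 0, 0],
              [1, 1, 0, 0, 0, 0, 1, 1, 0, 1],
              [1, 0, 1, 0, 1, 0, 1, 1, 0, 0]])
W, sens, spec = staple_binary(D)
print(np.round(W, 2), np.round(sens, 2), np.round(spec, 2))
```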
NASA Astrophysics Data System (ADS)
Wang, Jun; Wang, Yang; Zeng, Hui
2016-01-01
A key issue to address in synthesizing spatial data with variable support in spatial analysis and modeling is the change-of-support problem. We present an approach for solving the change-of-support and variable-support data fusion problems. This approach is based on geostatistical inverse modeling that explicitly accounts for differences in spatial support. The inverse model is applied here to produce both the best predictions of a target support and prediction uncertainties, based on one or more measurements, while honoring measurements. Spatial data covering large geographic areas often exhibit spatial nonstationarity and can lead to computational challenges due to the large data size. We developed a local-window geostatistical inverse modeling approach to accommodate spatial nonstationarity and to alleviate the computational burden. We conducted experiments using synthetic and real-world raster data. Synthetic data were generated, aggregated to multiple supports, and downscaled back to the original support to analyze the accuracy of spatial predictions and the correctness of prediction uncertainties. Similar experiments were conducted for real-world raster data. Real-world data with variable support were statistically fused to produce single-support predictions and associated uncertainties. The modeling results demonstrate that geostatistical inverse modeling can produce accurate predictions and associated prediction uncertainties. It is shown that the suggested local-window geostatistical inverse modeling approach offers a practical way to solve the well-known change-of-support problem and the variable-support data fusion problem in spatial analysis and modeling.
A novel framework for command and control of networked sensor systems
NASA Astrophysics Data System (ADS)
Chen, Genshe; Tian, Zhi; Shen, Dan; Blasch, Erik; Pham, Khanh
2007-04-01
In this paper, we have proposed a highly innovative advanced command and control framework for sensor networks used for future Integrated Fire Control (IFC). The primary goal is to enable and enhance target detection, validation, and mitigation for future military operations by graphical game theory and advanced knowledge information fusion infrastructures. The problem is approached by representing distributed sensor and weapon systems as generic warfare resources which must be optimized in order to achieve the operational benefits afforded by enabling a system of systems. This paper addresses the importance of achieving a Network Centric Warfare (NCW) foundation of information superiority: shared, accurate, and timely situational awareness upon which advanced automated management aids for IFC can be built. The approach uses the Data Fusion Information Group (DFIG) Fusion hierarchy of Level 0 through Level 4 to fuse the input data into assessments for the enemy target system threats in a battlespace to which military force is being applied. Compact graph models are employed across all levels of the fusion hierarchy to accomplish integrative data fusion and information flow control, as well as cross-layer sensor management. The functional block at each fusion level will have a set of innovative algorithms that not only exploit the corresponding graph model in a computationally efficient manner, but also permit combined functional experiments across levels by virtue of the unifying graphical model approach.
Márquez, Cristina; López, M Isabel; Ruisánchez, Itziar; Callao, M Pilar
2016-12-01
Two data fusion strategies (high- and mid-level) combined with a multivariate classification approach (Soft Independent Modelling of Class Analogy, SIMCA) have been applied to take advantage of the synergistic effect of the information obtained from two spectroscopic techniques: FT-Raman and NIR. Mid-level data fusion consists of merging some of the previously selected variables from the spectra obtained from each spectroscopic technique and then applying the classification technique. High-level data fusion combines the SIMCA classification results obtained individually from each spectroscopic technique. Of the possible ways to make the necessary combinations, we decided to use fuzzy aggregation connective operators. As a case study, we considered the possible adulteration of hazelnut paste with almond. Using the two-class SIMCA approach, class 1 consisted of unadulterated hazelnut samples and class 2 of samples adulterated with almond. Model performance was also studied with samples adulterated with chickpea. The results show that data fusion is an effective strategy since the performance parameters are better than the individual ones: sensitivity and specificity values between 75% and 100% for the individual techniques and between 96-100% and 88-100% for the mid- and high-level data fusion strategies, respectively.
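A minimal sketch of high-level fusion with fuzzy aggregation connectives: per-technique class-membership degrees (stand-ins for rescaled SIMCA distances) are combined with min, max, and averaging operators. The numbers and the choice of connectives are illustrative, not taken from the study.

```python
import numpy as np

# Illustrative per-technique class-membership degrees in [0, 1] for three samples
# (columns: class 1 = pure hazelnut, class 2 = adulterated); values are made up
raman = np.array([[0.92, 0.10],
                  [0.35, 0.80],
                  [0.55, 0.50]])
nir = np.array([[0.88, 0.15],
                [0.20, 0.90],
                [0.60, 0.45]])

# Common fuzzy aggregation connectives used for high-level fusion
t_norm = np.minimum(raman, nir)      # pessimistic (intersection-like)
t_conorm = np.maximum(raman, nir)    # optimistic (union-like)
averaging = 0.5 * (raman + nir)      # compromise operator

for name, fused in [("min", t_norm), ("max", t_conorm), ("mean", averaging)]:
    print(name, "->", fused.argmax(axis=1))   # fused class decision per sample
```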
Evaluation of a data fusion approach to estimate daily PM2.5 levels in North China
Liang, Fengchao; Gao, Meng; Xiao, Qingyang; Carmichael, Gregory R.; Pan, Xiaochuan; Liu, Yang
2017-01-01
PM2.5 air pollution has been a growing concern worldwide. Previous studies have applied several techniques to estimate PM2.5 exposure spatiotemporally in China, but all of these have limitations. This study aimed to develop a data fusion approach and compare it with kriging and with a chemistry transport model. Two techniques, kriging with an external drift (KED) and the Weather Research and Forecasting model with Chemistry (WRF-Chem), were applied to create daily spatial coverage of PM2.5 on a 10 km grid over North China in 2013. A data fusion technique was developed by fusing PM2.5 concentrations predicted by KED and WRF-Chem, accounting for the distance from the center of each grid cell to the nearest ground observation and for daily spatial correlations between WRF-Chem and observations. Model performance was evaluated by comparison with ground observations and by the spatial prediction errors. KED and data fusion performed better at monitoring sites, with daily model R2 of 0.95 and 0.94, respectively, whereas WRF-Chem overestimated PM2.5 (R2 = 0.51). KED and data fusion performed better around the ground monitors; WRF-Chem performed relatively worse, with high prediction errors in the center of the study domain. In our study, both KED and the data fusion technique provided highly accurate PM2.5 estimates. The current monitoring network in North China was dense enough to provide reliable PM2.5 predictions by interpolation. PMID:28599195
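A possible reading of the fusion rule, in which distance to the nearest monitor controls the weight given to the interpolated field and that day's model-observation correlation tempers the model weight, is sketched below. The exponential weighting function and all numbers are hypothetical; the abstract does not give the exact weighting used.

```python
import numpy as np

def fuse_pm25(ked, wrf, dist_to_monitor, daily_r, length_scale=50.0):
    """Blend kriged (KED) and WRF-Chem PM2.5 fields.

    Near monitors the interpolation is trusted; far away the weight shifts
    toward the model, scaled by that day's model-observation correlation.
    The weighting function is hypothetical, not the one used in the paper.
    """
    w_ked = np.exp(-dist_to_monitor / length_scale)   # close to 1 near a monitor
    w_wrf = (1.0 - w_ked) * max(daily_r, 0.0)         # model weight tempered by skill
    w_sum = w_ked + w_wrf
    return (w_ked * ked + w_wrf * wrf) / np.where(w_sum > 0, w_sum, 1.0)

ked = np.array([80.0, 95.0, 120.0])     # ug/m3, kriged field
wrf = np.array([70.0, 140.0, 160.0])    # ug/m3, WRF-Chem field
dist = np.array([5.0, 60.0, 150.0])     # km to nearest monitor
print(fuse_pm25(ked, wrf, dist, daily_r=0.7))
```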
Jing, Luyang; Wang, Taiyong; Zhao, Ming; Wang, Peng
2017-01-01
A fault diagnosis approach based on multi-sensor data fusion is a promising tool for dealing with complicated damage detection problems in mechanical systems. Nevertheless, this approach suffers from two challenges: (1) feature extraction from various types of sensory data and (2) selection of a suitable fusion level. It is usually difficult to choose an optimal feature or fusion level for a specific fault diagnosis task, and extensive domain expertise and human labor are required for these selections. To address these two challenges, we propose an adaptive multi-sensor data fusion method based on deep convolutional neural networks (DCNN) for fault diagnosis. The proposed method can learn features from raw data and optimize a combination of different fusion levels adaptively to satisfy the requirements of any fault diagnosis task. The proposed method is tested on a planetary gearbox test rig. Handcrafted features, manually selected fusion levels, single sensory data, and two traditional intelligent models, back-propagation neural networks (BPNN) and a support vector machine (SVM), are used as comparisons in the experiment. The results demonstrate that the proposed method is able to detect the conditions of the planetary gearbox effectively, with the best diagnosis accuracy among all methods compared in the experiment. PMID:28230767
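As a rough illustration of feature-level fusion in a DCNN, the sketch below builds one small 1-D convolutional branch per sensor and concatenates the learned features before classification. The architecture, layer sizes, and class count are invented for the example and are not the network described in the paper.

```python
import torch
import torch.nn as nn

class FeatureLevelFusionCNN(nn.Module):
    """Toy two-sensor 1-D CNN with feature-level fusion (not the paper's architecture)."""
    def __init__(self, n_classes=4):
        super().__init__()
        def branch():   # one small convolutional feature extractor per sensor
            return nn.Sequential(
                nn.Conv1d(1, 8, kernel_size=16, stride=4), nn.ReLU(),
                nn.Conv1d(8, 16, kernel_size=8, stride=4), nn.ReLU(),
                nn.AdaptiveAvgPool1d(8), nn.Flatten())
        self.vib_branch = branch()      # e.g. vibration channel
        self.aco_branch = branch()      # e.g. acoustic channel
        self.classifier = nn.Sequential(nn.Linear(2 * 16 * 8, 64), nn.ReLU(),
                                        nn.Linear(64, n_classes))

    def forward(self, vib, aco):
        feats = torch.cat([self.vib_branch(vib), self.aco_branch(aco)], dim=1)
        return self.classifier(feats)   # logits over gearbox health conditions

model = FeatureLevelFusionCNN()
vib = torch.randn(2, 1, 1024)           # batch of 2 raw vibration segments
aco = torch.randn(2, 1, 1024)           # matching acoustic segments
print(model(vib, aco).shape)            # torch.Size([2, 4])
```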
A Markov game theoretic data fusion approach for cyber situational awareness
NASA Astrophysics Data System (ADS)
Shen, Dan; Chen, Genshe; Cruz, Jose B., Jr.; Haynes, Leonard; Kruger, Martin; Blasch, Erik
2007-04-01
This paper proposes an innovative data-fusion/data-mining game theoretic situation awareness and impact assessment approach for cyber network defense. Alerts generated by Intrusion Detection Sensors (IDSs) or Intrusion Prevention Sensors (IPSs) are fed into the data refinement (Level 0) and object assessment (L1) data fusion components. High-level situation/threat assessment (L2/L3) data fusion based on a Markov game model and Hierarchical Entity Aggregation (HEA) is proposed to refine the primitive predictions generated by adaptive feature/pattern recognition and to capture new, unknown features. A Markov (stochastic) game method is used to estimate the belief of each possible cyber attack pattern. Game theory captures the nature of cyber conflicts: determination of the attacking-force strategies is tightly coupled to determination of the defense-force strategies, and vice versa. Markov game theory also deals with uncertainty and incompleteness of the available information. A software tool is developed to demonstrate the performance of the high-level information fusion for cyber network defense situations, and a simulation example shows the enhanced understanding of cyber-network defense.
Riniker, Sereina; Fechner, Nikolas; Landrum, Gregory A
2013-11-25
The concept of data fusion - the combination of information from different sources describing the same object with the expectation of generating a more accurate representation - has found application in a very broad range of disciplines. In the context of ligand-based virtual screening (VS), data fusion has been applied to combine knowledge from either different active molecules or different fingerprints to improve similarity search performance. Machine-learning (ML) methods based on fusion of multiple homogeneous classifiers, in particular random forests, have also been widely applied in the ML literature. The heterogeneous version of classifier fusion - fusing the predictions from different model types - has been less explored. Here, we investigate heterogeneous classifier fusion for ligand-based VS using three different ML methods, RF, naïve Bayes (NB), and logistic regression (LR), with four 2D fingerprints: atom pairs, topological torsions, the RDKit fingerprint, and a circular fingerprint. The methods are compared using a previously developed benchmarking platform for 2D fingerprints, which is extended to ML methods in this article. The original data sets are filtered for difficulty, and a new set of challenging data sets from ChEMBL is added. Data sets were also generated for a second use case: starting from a small set of related actives instead of diverse actives. The final fused model consistently outperforms the other approaches across the broad variety of targets studied, indicating that heterogeneous classifier fusion is a very promising approach for ligand-based VS. The new data sets together with the adapted source code for the ML methods are provided in the Supporting Information.
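A small sketch of heterogeneous classifier fusion in the spirit described above: three scikit-learn models are trained on synthetic binary "fingerprints" and their scores are fused by rank averaging. The data, the rank-averaging rule, and all hyperparameters are placeholders; the paper's benchmark uses real fingerprints and its own fusion rule.

```python
import numpy as np
from scipy.stats import rankdata
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import BernoulliNB

# Stand-in for binary fingerprints of actives/decoys (the paper uses real ChEMBL sets)
X, y = make_classification(n_samples=600, n_features=128, n_informative=20, random_state=0)
X = (X > 0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

models = [RandomForestClassifier(n_estimators=200, random_state=0),
          BernoulliNB(),
          LogisticRegression(max_iter=1000)]

# Heterogeneous fusion: rank each model's scores, then average the ranks
ranks = []
for m in models:
    scores = m.fit(X_tr, y_tr).predict_proba(X_te)[:, 1]
    ranks.append(rankdata(scores))
    print(type(m).__name__, "AUC:", round(roc_auc_score(y_te, scores), 3))
fused = np.mean(ranks, axis=0)
print("Fused (rank average) AUC:", round(roc_auc_score(y_te, fused), 3))
```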
A Bayesian trans-dimensional approach for the fusion of multiple geophysical datasets
NASA Astrophysics Data System (ADS)
JafarGandomi, Arash; Binley, Andrew
2013-09-01
We propose a Bayesian fusion approach to integrate multiple geophysical datasets with different coverage and sensitivity. The fusion strategy is based on the capability of various geophysical methods to provide enough resolution to identify either subsurface material parameters or subsurface structure, or both. We focus on electrical resistivity as the target material parameter and electrical resistivity tomography (ERT), electromagnetic induction (EMI), and ground penetrating radar (GPR) as the set of geophysical methods. However, extending the approach to different sets of geophysical parameters and methods is straightforward. Different geophysical datasets are entered into a trans-dimensional Markov chain Monte Carlo (McMC) search-based joint inversion algorithm. The trans-dimensional property of the McMC algorithm allows dynamic parameterisation of the model space, which in turn helps to avoid bias of the post-inversion results towards a particular model. Given that we are attempting to develop an approach that has practical potential, we discretize the subsurface into an array of one-dimensional earth models. Accordingly, the ERT data that are collected using a two-dimensional acquisition geometry are recast as a set of equivalent vertical electric soundings. Different data are inverted either individually or jointly to estimate one-dimensional subsurface models at discrete locations. We use Shannon's information measure to quantify the information obtained from the inversion of different combinations of geophysical datasets. Information from multiple methods is brought together by introducing a joint likelihood function and/or constraining the prior information. A Bayesian maximum entropy approach is used for spatial fusion of the spatially dispersed estimated one-dimensional models and mapping of the target parameter. We illustrate the approach with a synthetic dataset and then apply it to a field dataset. We show that the proposed fusion strategy is successful not only in enhancing the subsurface information but also as a survey design tool to identify the appropriate combination of geophysical tools and to show whether application of an individual method for further investigation of a specific site is beneficial.
NASA Technical Reports Server (NTRS)
Gopalan, Arun; Zubko, Viktor; Leptoukh, Gregory G.
2008-01-01
We look at issues, barriers and approaches for Data Fusion of satellite aerosol data as available from the GES DISC GIOVANNI Web Service. Daily Global Maps of AOT from a single satellite sensor alone contain gaps that arise due to various sources (sun glint regions, clouds, orbital swath gaps at low latitudes, bright underlying surfaces etc.). The goal is to develop a fast, accurate and efficient method to improve the spatial coverage of the Daily AOT data to facilitate comparisons with Global Models. Data Fusion may be supplemented by Optimal Interpolation (OI) as needed.
A Data Fusion Method in Wireless Sensor Networks
Izadi, Davood; Abawajy, Jemal H.; Ghanavati, Sara; Herawan, Tutut
2015-01-01
The success of a Wireless Sensor Network (WSN) deployment strongly depends on the quality of service (QoS) it provides regarding issues such as data accuracy, data aggregation delays and network lifetime maximisation. This is especially challenging in data fusion mechanisms, where a small fraction of low quality data in the fusion input may negatively impact the overall fusion result. In this paper, we present a fuzzy-based data fusion approach for WSNs with the aim of increasing the QoS whilst reducing the energy consumption of the sensor network. The proposed approach is able to distinguish and aggregate only the true values of the collected data, thus reducing the burden of processing the entire data at the base station (BS). It is also able to eliminate redundant data and consequently reduce energy consumption, thus increasing the network lifetime. We studied the effectiveness of the proposed data fusion approach experimentally and compared it with two baseline approaches in terms of data collection, number of transferred data packets and energy consumption. The results of the experiments show that the proposed approach achieves better results than the baseline approaches. PMID:25635417
Forecasting Chronic Diseases Using Data Fusion.
Acar, Evrim; Gürdeniz, Gözde; Savorani, Francesco; Hansen, Louise; Olsen, Anja; Tjønneland, Anne; Dragsted, Lars Ove; Bro, Rasmus
2017-07-07
Data fusion, that is, extracting information through the fusion of complementary data sets, is a topic of great interest in metabolomics because analytical platforms such as liquid chromatography-mass spectrometry (LC-MS) and nuclear magnetic resonance (NMR) spectroscopy commonly used for chemical profiling of biofluids provide complementary information. In this study, with a goal of forecasting acute coronary syndrome (ACS), breast cancer, and colon cancer, we jointly analyzed LC-MS, NMR measurements of plasma samples, and the metadata corresponding to the lifestyle of participants. We used supervised data fusion based on multiple kernel learning and exploited the linearity of the models to identify significant metabolites/features for the separation of healthy referents and the cases developing a disease. We demonstrated that (i) fusing LC-MS, NMR, and metadata provided better separation of ACS cases and referents compared with individual data sets, (ii) NMR data performed the best in terms of forecasting breast cancer, while fusion degraded the performance, and (iii) neither the individual data sets nor their fusion performed well for colon cancer. Furthermore, we showed the strengths and limitations of the fusion models by discussing their performance in terms of capturing known biomarkers for smoking and coffee. While fusion may improve performance in terms of separating certain conditions by jointly analyzing metabolomics and metadata sets, it is not necessarily always the best approach as in the case of breast cancer.
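Supervised data fusion with multiple kernels can be sketched as follows: one kernel per platform, combined with fixed weights, feeding an SVM with a precomputed kernel. True multiple kernel learning optimizes the weights; here they are fixed by hand, and the two synthetic blocks merely stand in for LC-MS and NMR data.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two synthetic "platforms" describing the same samples (stand-ins for LC-MS and NMR blocks)
X1, y = make_classification(n_samples=300, n_features=50, n_informative=8, random_state=1)
rng = np.random.default_rng(2)
X2 = X1[:, :30] + rng.normal(0, 0.5, (300, 30))   # second platform: correlated, noisier view
idx_tr, idx_te = train_test_split(np.arange(300), test_size=0.3, random_state=0)

def fused_kernel(rows, cols, weights=(0.6, 0.4), gamma=0.02):
    """Fixed-weight combination of per-platform RBF kernels (true MKL learns the weights)."""
    k1 = rbf_kernel(X1[rows], X1[cols], gamma=gamma)
    k2 = rbf_kernel(X2[rows], X2[cols], gamma=gamma)
    return weights[0] * k1 + weights[1] * k2

clf = SVC(kernel="precomputed")
clf.fit(fused_kernel(idx_tr, idx_tr), y[idx_tr])
pred = clf.predict(fused_kernel(idx_te, idx_tr))
print("Fused-kernel accuracy:", round(accuracy_score(y[idx_te], pred), 3))
```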
Measuring situational awareness and resolving inherent high-level fusion obstacles
NASA Astrophysics Data System (ADS)
Sudit, Moises; Stotz, Adam; Holender, Michael; Tagliaferri, William; Canarelli, Kathie
2006-04-01
Information Fusion Engine for Real-time Decision Making (INFERD) is a tool that was developed to supplement current graph matching techniques in Information Fusion models. Based on sensory data and a priori models, INFERD dynamically generates, evolves, and evaluates hypotheses on the current state of the environment. The a priori models developed are hierarchical in nature, lending themselves to a multi-level Information Fusion process whose primary output provides a situational awareness of the environment of interest in the context of the models running. In this paper we look at INFERD's multi-level fusion approach and provide insight into inherent problems in the approach, such as fragmentation, and the research being undertaken to mitigate those deficiencies. Due to the large variance of data in disparate environments, the awareness of situations in those environments can be drastically different. To accommodate this, the INFERD framework provides support for plug-and-play fusion modules which can be developed specifically for domains of interest. However, because the models running in INFERD are graph based, some default measurements can be provided and are discussed in the paper. Among these are a Depth measurement to determine how much danger is presented by the action taking place, a Breadth measurement to gain information regarding the scale of an attack that is currently happening, and finally a Reliability measure to tell the user the credibility of a particular hypothesis. All of these results will be demonstrated in the cyber domain, which recent research has shown to be well-defined and bounded, so that new models and algorithms can be developed and evaluated.
Kernel-Based Sensor Fusion With Application to Audio-Visual Voice Activity Detection
NASA Astrophysics Data System (ADS)
Dov, David; Talmon, Ronen; Cohen, Israel
2016-12-01
In this paper, we address the problem of multiple view data fusion in the presence of noise and interferences. Recent studies have approached this problem using kernel methods, by relying particularly on a product of kernels constructed separately for each view. From a graph theory point of view, we analyze this fusion approach in a discrete setting. More specifically, based on a statistical model for the connectivity between data points, we propose an algorithm for the selection of the kernel bandwidth, a parameter, which, as we show, has important implications on the robustness of this fusion approach to interferences. Then, we consider the fusion of audio-visual speech signals measured by a single microphone and by a video camera pointed to the face of the speaker. Specifically, we address the task of voice activity detection, i.e., the detection of speech and non-speech segments, in the presence of structured interferences such as keyboard taps and office noise. We propose an algorithm for voice activity detection based on the audio-visual signal. Simulation results show that the proposed algorithm outperforms competing fusion and voice activity detection approaches. In addition, we demonstrate that a proper selection of the kernel bandwidth indeed leads to improved performance.
Distributed service-based approach for sensor data fusion in IoT environments.
Rodríguez-Valenzuela, Sandra; Holgado-Terriza, Juan A; Gutiérrez-Guerrero, José M; Muros-Cobos, Jesús L
2014-10-15
The Internet of Things (IoT) enables communication among smart objects, promoting the pervasive presence around us of a variety of things or objects that are able to interact and cooperate to reach common goals. IoT objects can obtain data from their context, such as the home, office, industry or body. These data can be combined to obtain new and more complex information by applying data fusion processes. However, to apply data fusion algorithms in IoT environments, the full system must deal with distributed nodes and decentralized communication, and must support scalability and node dynamicity, among other restrictions. In this paper, a novel method to manage data acquisition and fusion based on a distributed service composition model is presented, improving data treatment in IoT pervasive environments.
NASA Astrophysics Data System (ADS)
Kruger, Scott; Shasharina, S.; Vadlamani, S.; McCune, D.; Holland, C.; Jenkins, T. G.; Candy, J.; Cary, J. R.; Hakim, A.; Miah, M.; Pletzer, A.
2010-11-01
As various efforts to integrate fusion codes proceed worldwide, standards for sharing data have emerged. In the U.S., the SWIM project has pioneered the development of the Plasma State, which has a flat hierarchy and is dominated by its use within 1.5D transport codes. The European Integrated Tokamak Modeling effort has developed a more ambitious data interoperability effort organized around the concept of Consistent Physical Objects (CPOs). CPOs have deep hierarchies, as needed by an effort that seeks to encompass all of fusion computing. Here, we discuss ideas for implementing data interoperability that are complementary to both the Plasma State and CPOs. By making use of attributes within the netCDF and HDF5 binary file formats, the goals of data interoperability can be achieved with a more informal approach. In addition, a file can be interoperable with several standards at once. As an illustration of this approach, we discuss its application to the development of synthetic diagnostics that can be used by multiple codes.
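The attribute-based interoperability idea can be illustrated with a few lines of h5py: a dataset carries its units plus attributes that map it onto more than one naming convention. The attribute names and paths below are hypothetical, not an established standard.

```python
import h5py
import numpy as np

# Write a dataset that carries interoperability metadata as plain HDF5 attributes,
# so the same file can satisfy more than one naming convention at once.
with h5py.File("equilibrium.h5", "w") as f:
    psi = f.create_dataset("psi", data=np.zeros((64, 128)))
    psi.attrs["units"] = "Wb"
    psi.attrs["long_name"] = "poloidal magnetic flux"
    # Hypothetical attribute names mapping the variable onto two different standards
    psi.attrs["plasma_state_name"] = "psipol"
    psi.attrs["cpo_path"] = "equilibrium/profiles_2d/psi"

with h5py.File("equilibrium.h5", "r") as f:
    print(dict(f["psi"].attrs.items()))
```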
Multi-model data fusion to improve an early warning system for hypo-/hyperglycemic events.
Botwey, Ransford Henry; Daskalaki, Elena; Diem, Peter; Mougiakakou, Stavroula G
2014-01-01
Correct predictions of future blood glucose levels in individuals with Type 1 Diabetes (T1D) can be used to provide early warning of upcoming hypo-/hyperglycemic events and thus to improve the patient's safety. To increase prediction accuracy and efficiency, various approaches have been proposed which combine multiple predictors to produce superior results compared to single predictors. Three methods for model fusion are presented and comparatively assessed. Data from 23 T1D subjects under sensor-augmented pump (SAP) therapy were used in two adaptive data-driven models (an autoregressive model with output correction - cARX, and a recurrent neural network - RNN). Data fusion techniques based on (i) Dempster-Shafer Evidential Theory (DST), (ii) Genetic Algorithms (GA), and (iii) Genetic Programming (GP) were used to merge the complementary performances of the prediction models. The fused output is used in a warning algorithm to issue alarms of upcoming hypo-/hyperglycemic events. The fusion schemes showed improved performance with lower root mean square errors, lower time lags, and higher correlation. In the warning algorithm, median daily false alarms (DFA) of 0.25% and 100% correct alarms (CA) were obtained for both event types. The detection times (DT) before occurrence of events were 13.0 and 12.1 min for hypoglycemic and hyperglycemic events, respectively. Compared to the cARX and RNN models, and a linear fusion of the two, the proposed fusion schemes represent a significant improvement.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, Zhiling; Wei, Wei; Turlapaty, Anish
2012-07-01
At the United States Army's test sites, fired penetrators made of Depleted Uranium (DU) have been buried under ground and have become hazardous waste. Previously, we developed techniques for detecting buried radioactive targets. We also developed approaches for locating buried paramagnetic metal objects by utilizing electromagnetic induction (EMI) sensor data. In this paper, we apply data fusion techniques to combine results from both the radiation detection and the EMI detection, so that we can further distinguish among DU penetrators, DU oxide, and non-DU metal debris. We develop a two-step fusion approach for the task and test it with survey data collected on simulation targets. In this work, we explored radiation and EMI data fusion for detecting DU, oxides, and non-DU metals. We developed a two-step fusion approach based on majority voting and a set of decision rules. With this approach, we fuse results from radiation detection based on the RX algorithm and EMI detection based on a 3-step analysis. Our fusion approach has been tested successfully with data collected on simulation targets. In the future, we will need to further verify the effectiveness of this fusion approach with field data.
Dynamical approach to fusion-fission process in superheavy mass region
NASA Astrophysics Data System (ADS)
Aritomo, Y.; Hinde, D. J.; Wakhle, A.; du Rietz, R.; Dasgupta, M.; Hagino, K.; Chiba, S.; Nishio, K.
2012-10-01
In order to describe heavy-ion fusion reactions around the Coulomb barrier with an actinide target nucleus, we propose a model which combines the coupled-channels approach and a fluctuation-dissipation model for dynamical calculations. This model takes into account couplings to the collective states of the interacting nuclei in the penetration of the Coulomb barrier and the subsequent dynamical evolution of a nuclear shape from the contact configuration. In the fluctuation-dissipation model with a Langevin equation, the effect of nuclear orientation at the initial impact on the prolately deformed target nucleus is considered. Fusion-fission, quasifission and deep quasifission are separated as different Langevin trajectories on the potential energy surface. Using this model, we analyze the experimental data for the mass distribution of fission fragments (MDFF) in the reaction of 36S+238U at several incident energies around the Coulomb barrier.
Peng, Changhui; Guiot, Joel; Wu, Haibin; Jiang, Hong; Luo, Yiqi
2011-05-01
It is increasingly being recognized that global ecological research requires novel methods and strategies in which to combine process-based ecological models and data in cohesive, systematic ways. Model-data fusion (MDF) is an emerging area of research in ecology and palaeoecology. It provides a new quantitative approach that offers a high level of empirical constraint over model predictions based on observations, using inverse modelling and data assimilation (DA) techniques. Increasing demands to integrate model and data methods over the past decade have led to MDF utilization in palaeoecology, ecology and earth system sciences. This paper reviews key features and principles of MDF and highlights different approaches with regard to DA. After providing a critical evaluation of the numerous benefits of MDF and its current applications in palaeoecology (i.e., palaeoclimatic reconstruction, palaeovegetation and palaeocarbon storage) and ecology (i.e., parameter and uncertainty estimation, model error identification, remote sensing and ecological forecasting), the paper discusses method limitations, current challenges and future research directions. In the ongoing data-rich era of today's world, MDF could become an important diagnostic and prognostic tool with which to improve our understanding of ecological processes while testing ecological theory and hypotheses and forecasting changes in ecosystem structure, function and services.
Systematic investigations of deep sub-barrier fusion reactions using an adiabatic approach
NASA Astrophysics Data System (ADS)
Ichikawa, Takatoshi
2015-12-01
Background: At extremely low incident energies, unexpected decreases in fusion cross sections, compared to the standard coupled-channels (CC) calculations, have been observed in a wide range of fusion reactions. These significant reductions of the fusion cross sections are often referred to as the fusion hindrance. However, the physical origin of the fusion hindrance is still unclear. Purpose: To describe the fusion hindrance based on an adiabatic approach, I propose a novel extension of the standard CC model by introducing a damping factor that describes a smooth transition from sudden to adiabatic processes, that is, the transition from the separated two-body to the united dinuclear system. I demonstrate the performance of this model by systematically investigating various deep sub-barrier fusion reactions. Method: I extend the standard CC model by introducing a damping factor into the coupling matrix elements in the standard CC model. This avoids double counting of the CC effects, when two colliding nuclei overlap one another. I adopt the Yukawa-plus-exponential (YPE) model as a basic heavy ion-ion potential, which is advantageous for a unified description of the one- and two-body potentials. For the purpose of these systematic investigations, I approximate the one-body potential with a third-order polynomial function based on the YPE model. Results: Calculated fusion cross sections for the medium-heavy mass systems of 64Ni+64Ni , 58Ni+58Ni , and 58Ni+54Fe , the medium-light mass systems of 40Ca+40Ca , 48Ca+48Ca , and 24Mg+30Si , and the mass-asymmetric systems of 48Ca+96Zr and 16O+208Pb are consistent with the experimental data. The astrophysical S factor and logarithmic derivative representations of these are also in good agreement with the experimental data. The values obtained for the individual radius and diffuseness parameters in the damping factor, which reproduce the fusion cross sections well, are nearly equal to the average value for all the systems. Conclusions: Since the results calculated with the damping factor are in excellent agreement with the experimental data in all systems, I conclude that a coordinate-dependent coupling strength is responsible for the fusion hindrance. In all systems, the potential energies at the touching point VTouch strongly correlate with the incident threshold energies for which the fusion hindrance starts to emerge, except for the medium-light mass systems.
Case Study: Organotypic human in vitro models of embryonic ...
Morphogenetic fusion of tissues is a common event in embryonic development, and disruption of fusion is associated with birth defects of the eye, heart, neural tube, phallus, palate, and other organ systems. Embryonic tissue fusion requires precise regulation of cell-cell and cell-matrix interactions that drive proliferation, differentiation, and morphogenesis. Low-dose chemical exposures can disrupt morphogenesis across space and time by interfering with key embryonic fusion events. The Morphogenetic Fusion Task uses computer and in vitro models to elucidate consequences of developmental exposures. The Morphogenetic Fusion Task integrates multiple approaches to model responses to chemicals that lead to birth defects, including integrative mining of ToxCast DB, ToxRefDB, and chemical structures; advanced computer agent-based models; and human cell-based cultures that model disruption of cellular and molecular behaviors, including mechanisms predicted from integrative data mining and agent-based models. The purpose of the poster is to indicate progress on the CSS 17.02 Virtual Tissue Models Morphogenesis Task 1 products for the Board of Scientific Counselors meeting on Nov 16-17.
A New Approach to Image Fusion Based on Cokriging
NASA Technical Reports Server (NTRS)
Memarsadeghi, Nargess; LeMoigne, Jacqueline; Mount, David M.; Morisette, Jeffrey T.
2005-01-01
We consider the image fusion problem involving remotely sensed data. We introduce cokriging as a method to perform fusion. We investigate the advantages of fusing Hyperion with ALI. The evaluation is performed by comparing the classification of the fused data with that of the input images and by calculating well-chosen quantitative fusion quality metrics. We consider the Invasive Species Forecasting System (ISFS) project as our fusion application. The fusion of ALI with Hyperion data is studied using PCA and wavelet-based fusion. We then propose utilizing a geostatistical interpolation method called cokriging as a new approach to image fusion.
Kalman filter-based EM-optical sensor fusion for needle deflection estimation.
Jiang, Baichuan; Gao, Wenpeng; Kacher, Daniel; Nevo, Erez; Fetics, Barry; Lee, Thomas C; Jayender, Jagadeesan
2018-04-01
In many clinical procedures, such as cryoablation, that involve needle insertion, accurate placement of the needle's tip at the desired target is the major issue for optimizing the treatment and minimizing damage to the neighboring anatomy. However, due to the interaction force between the needle and tissue, considerable error in intraoperative tracking of the needle tip can be observed as the needle deflects. In this paper, measurement data from an optical sensor at the needle base and a magnetic resonance (MR) gradient field-driven electromagnetic (EM) sensor placed 10 cm from the needle tip are used within a model-integrated Kalman filter-based sensor fusion scheme. Bending model-based estimations and EM-based direct estimation are used as the measurement vectors in the Kalman filter, thus establishing an online estimation approach. Static tip bending experiments show that the fusion method can reduce the mean error of the tip position estimation from 29.23 mm for the optical sensor-based approach to 3.15 mm for the fusion-based approach, and from 39.96 to 6.90 mm, at the MRI isocenter and the MRI entrance, respectively. This work establishes a novel sensor fusion scheme that incorporates model information, enabling real-time tracking of needle deflection with MRI compatibility in a free-hand operating setup.
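A minimal sketch of the sensor-fusion idea, a Kalman filter that sequentially updates a scalar tip-deflection state with a bending-model-based estimate and an EM-based estimate, is given below. The random-walk process model, noise variances, and data are illustrative; the paper's model-integrated filter is more elaborate.

```python
import numpy as np

def kalman_fuse(z_model, z_em, r_model=25.0, r_em=4.0, q=0.5):
    """Minimal 1-D Kalman filter fusing two tip-deflection estimates per time step.

    z_model : deflection predicted from the optical base pose + bending model (mm)
    z_em    : deflection inferred from the EM sensor near the tip (mm)
    r_*     : measurement noise variances; q : process noise (all values illustrative).
    """
    x, p = 0.0, 100.0                      # initial state estimate and variance
    track = []
    for zm, ze in zip(z_model, z_em):
        p += q                             # predict: deflection modelled as a random walk
        for z, r in ((zm, r_model), (ze, r_em)):   # sequentially update with each source
            k = p / (p + r)                # Kalman gain
            x += k * (z - x)
            p *= (1 - k)
        track.append(x)
    return np.array(track)

true_defl = np.linspace(0, 12, 20)                   # mm, slowly growing deflection
rng = np.random.default_rng(0)
z_model = true_defl + rng.normal(0, 5.0, 20)         # noisier model-based estimate
z_em = true_defl + rng.normal(0, 2.0, 20)            # more accurate EM-based estimate
print(np.round(kalman_fuse(z_model, z_em)[-5:], 2))
```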
The fusion of large scale classified side-scan sonar image mosaics.
Reed, Scott; Tena Ruiz, Ioseba; Capus, Chris; Petillot, Yvan
2006-07-01
This paper presents a unified framework for the creation of classified maps of the seafloor from sonar imagery. Significant challenges in photometric correction, classification, navigation and registration, and image fusion are addressed. The techniques described are directly applicable to a range of remote sensing problems. Recent advances in side-scan data correction are incorporated to compensate for the sonar beam pattern and motion of the acquisition platform. The corrected images are segmented using pixel-based textural features and standard classifiers. In parallel, the navigation of the sonar device is processed using Kalman filtering techniques. A simultaneous localization and mapping framework is adopted to improve the navigation accuracy and produce georeferenced mosaics of the segmented side-scan data. These are fused within a Markovian framework and two fusion models are presented. The first uses a voting scheme regularized by an isotropic Markov random field and is applicable when the reliability of each information source is unknown. The Markov model is also used to inpaint regions where no final classification decision can be reached using pixel level fusion. The second model formally introduces the reliability of each information source into a probabilistic model. Evaluation of the two models using both synthetic images and real data from a large scale survey shows significant quantitative and qualitative improvement using the fusion approach.
An epidemic model for biological data fusion in ad hoc sensor networks
NASA Astrophysics Data System (ADS)
Chang, K. C.; Kotari, Vikas
2009-05-01
Bioterrorism can be a refined and catastrophic approach to attacking a nation. Countering it requires the development of a complete architecture designed specifically for this purpose, which includes but is not limited to sensing/detection, tracking and fusion, communication, and other functions. In this paper we focus on one such architecture and evaluate its performance. Various sensors for this specific purpose have been studied. The emphasis has been on the use of distributed systems such as ad hoc networks and on the application of epidemic data fusion algorithms to better manage the biological threat data. A further focus has been on understanding the performance characteristics of these algorithms under diversified real-time scenarios, implemented through extensive Java-based simulations. Through comparative studies on communication and fusion, the performance of the channel filter algorithm for biological sensor data fusion is validated.
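The channel filter mentioned above is commonly written in information form, where the common information previously exchanged over a channel is subtracted so shared data are not double counted. The sketch below shows that standard fusion rule on a one-dimensional toy example; it is not the paper's specific epidemic implementation.

```python
import numpy as np

def channel_filter_fuse(x_a, P_a, x_b, P_b, x_c, P_c):
    """Channel-filter fusion of two Gaussian estimates in information form.

    (x_a, P_a), (x_b, P_b): local mean/covariance at the two nodes.
    (x_c, P_c): the common information previously exchanged over this channel,
    which is subtracted so shared data are not double counted.
    """
    Y = np.linalg.inv(P_a) + np.linalg.inv(P_b) - np.linalg.inv(P_c)
    y = (np.linalg.inv(P_a) @ x_a + np.linalg.inv(P_b) @ x_b
         - np.linalg.inv(P_c) @ x_c)
    P_f = np.linalg.inv(Y)
    return P_f @ y, P_f

# Toy 1-D example: two sensor nodes estimating the same concentration level
x_a, P_a = np.array([4.2]), np.array([[1.0]])
x_b, P_b = np.array([4.8]), np.array([[2.0]])
x_c, P_c = np.array([4.0]), np.array([[4.0]])   # loose common prior shared earlier
x_f, P_f = channel_filter_fuse(x_a, P_a, x_b, P_b, x_c, P_c)
print(x_f, P_f)
```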
Model-theoretic framework for sensor data fusion
NASA Astrophysics Data System (ADS)
Zavoleas, Kyriakos P.; Kokar, Mieczyslaw M.
1993-09-01
The main goal of our research in sensory data fusion (SDF) is the development of a systematic approach (a methodology) to designing systems for interpreting sensory information and for reasoning about the situation based upon this information and upon available data bases and knowledge bases. To achieve such a goal, two kinds of subgoals have been set: (1) develop a theoretical framework in which rational design/implementation decisions can be made, and (2) design a prototype SDF system along the lines of the framework. Our initial design of the framework has been described in our previous papers. In this paper we concentrate on the model-theoretic aspects of this framework. We postulate that data are embedded in data models, and information processing mechanisms are embedded in model operators. The paper is devoted to analyzing the classes of model operators and their significance in SDF. We investigate transformation, abstraction, and fusion operators. A prototype SDF system, fusing data from range and intensity sensors, is presented, exemplifying the structures introduced. Our framework is justified by the fact that it provides modularity, traceability of information flow, and a basis for a specification language for SDF.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kovalenko, V. N.; Vechernin, V. V.
2016-01-22
The ultrarelativistic collisions of heavy and light ions in the center-of-mass energy range from a few up to a hundred GeV per nucleon have been considered in the string fusion approach. A Monte Carlo model of proton-proton, proton-nucleus, and nucleus-nucleus collisions has been developed, which takes into account both string fusion and the finite rapidity length of strings, implementing hadronic scattering through the interaction of color dipoles. It describes proton-nucleus and nucleus-nucleus collisions well at the partonic level without using the Glauber model of nuclear collisions. All parameters are fixed using experimental data on the inelastic cross section and multiplicity. In the framework of the model, we performed a beam energy and system size scan and studied the behaviour of n-n, pt-n and pt-pt long-range correlation coefficients. The detailed event-by-event modeling of charged particle production allowed us to provide predictions under conditions close to the experimental ones, allowing a direct comparison with the data.
An Analytical Framework for Soft and Hard Data Fusion: A Dempster-Shafer Belief Theoretic Approach
2012-08-01
fusion. Therefore, we provide a detailed discussion on uncertain data types, their origins, and three uncertainty processing formalisms that are popular... suitable membership functions corresponding to the fuzzy sets. The DS belief theory, originally proposed by Dempster, can be thought of as... originated and various imperfections of the source. Uncertainty handling formalisms provide techniques for modeling and working with these uncertain data types.
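Since the framework builds on DS theory, a minimal sketch of Dempster's rule of combination may help: two mass functions over a small frame of discernment are combined and the conflicting mass renormalized. The frame and the masses are invented for the example.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions.

    Masses are dicts mapping frozenset hypotheses to belief mass; conflicting
    mass (empty intersections) is removed and the rest renormalised.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    return {h: w / (1.0 - conflict) for h, w in combined.items()}

# Two soft/hard sources reporting on the frame {hostile, neutral}
m_report = {frozenset({"hostile"}): 0.6, frozenset({"hostile", "neutral"}): 0.4}
m_sensor = {frozenset({"hostile"}): 0.5, frozenset({"neutral"}): 0.3,
            frozenset({"hostile", "neutral"}): 0.2}
print(dempster_combine(m_report, m_sensor))
```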
Compressive hyperspectral and multispectral imaging fusion
NASA Astrophysics Data System (ADS)
Espitia, Óscar; Castillo, Sergio; Arguello, Henry
2016-05-01
Image fusion is a valuable framework which combines two or more images of the same scene from one or multiple sensors, allowing the resolution of the images to be improved and the interpretable content to be increased. In remote sensing, a common fusion problem consists of merging hyperspectral (HS) and multispectral (MS) images, which involves a large amount of redundant data and ignores the highly correlated structure of the datacube along the spatial and spectral dimensions. Compressive HS and MS systems compress the spectral data in the acquisition step, allowing the data redundancy to be reduced by using different sampling patterns. This work presents a compressed HS and MS image fusion approach which uses a high dimensional joint sparse model. The joint sparse model is formulated by combining the HS and MS compressive acquisition models. The high spectral and spatial resolution image is reconstructed by using sparse optimization algorithms. Different fusion spectral image scenarios are used to explore the performance of the proposed scheme. Several simulations with synthetic and real datacubes show promising results, as reliable reconstruction of a high spectral and spatial resolution image can be achieved using as little as 50% of the datacube.
Low-energy fusion dynamics of weakly bound nuclei: A time dependent perspective
NASA Astrophysics Data System (ADS)
Diaz-Torres, A.; Boselli, M.
2016-05-01
Recent dynamical fusion models for weakly bound nuclei at low incident energies, based on a time-dependent perspective, are briefly presented. The main features of both the PLATYPUS model and a new quantum approach are highlighted. In contrast to existing time-dependent quantum models, the present quantum approach separates complete and incomplete fusion from total fusion. Calculations performed within a toy model for 6Li + 209Bi at near-barrier energies show that converged excitation functions for total, complete and incomplete fusion can be determined with time-dependent wavepacket dynamics.
NASA Astrophysics Data System (ADS)
Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen
2018-01-01
Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by fusing valued information selectively from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven industrial process parameter model from mechanical vibration and acoustic frequency spectra, based on selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-model based on each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine the outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing a selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
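The adaptive weighted fusion step can be sketched generically: each selected sub-model is weighted by the inverse of its validation error and the weighted predictions are combined. The inverse-RMSE weighting and the numbers below are placeholders, not necessarily the algorithm used in the paper.

```python
import numpy as np

def adaptive_weighted_fusion(preds, targets):
    """Weight each sub-model inversely to its validation error, then fuse.

    preds   : (n_models, n_samples) predictions of the selected sub-models
    targets : (n_samples,) reference values from the validation set
    """
    rmse = np.sqrt(((preds - targets) ** 2).mean(axis=1))
    w = (1.0 / rmse) / (1.0 / rmse).sum()        # normalised inverse-RMSE weights
    return w, w @ preds

# Toy example: three frequency-spectrum sub-models predicting a mill-load parameter
targets = np.array([0.42, 0.55, 0.61, 0.48, 0.52])
preds = np.array([[0.40, 0.57, 0.60, 0.50, 0.51],
                  [0.45, 0.50, 0.66, 0.44, 0.56],
                  [0.38, 0.58, 0.59, 0.49, 0.53]])
w, fused = adaptive_weighted_fusion(preds, targets)
print(np.round(w, 3), np.round(fused, 3))
```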
Modeling and analysis of tritium dynamics in a DT fusion fuel cycle
NASA Astrophysics Data System (ADS)
Kuan, William
1998-11-01
A number of crucial design issues have a profound effect on the dynamics of the tritium fuel cycle in a DT fusion reactor, where the development of appropriate solutions to these issues is of particular importance to the introduction of fusion as a commercial system. Such tritium-related issues can be classified according to their operational, safety, and economic impact on the operation of the reactor during its lifetime. Given such key design issues inherent in next-generation fusion devices using the DT fuel cycle, the development of appropriate models can then lead to optimized designs of the fusion fuel cycle for different types of DT fusion reactors. In this work, two different types of modeling approaches are developed and their application to solving key tritium issues presented. For the first approach, time-dependent inventories, concentrations, and flow rates characterizing the main subsystems of the fuel cycle are simulated with a new dynamic modular model of a fusion reactor's fuel cycle, named X-TRUFFLES (X-Windows TRitiUm Fusion Fuel cycLE dynamic Simulation). The complex dynamic behavior of the recycled fuel within each of the modeled subsystems is investigated using this new integrated model for different reactor scenarios and design approaches. Results for a proposed fuel cycle design taking into account current technologies are presented, including sensitivity studies. Ways to minimize the tritium inventory are also assessed by examining various design options that could be used to minimize local and global tritium inventories. The second modeling approach involves an analytical model to be used for the calculation of the required tritium breeding ratio, i.e., a primary design issue which relates directly to the feasibility and economics of DT fusion systems. A time-integrated global tritium balance scheme is developed and appropriate analytical expressions are derived for tritium self-sufficiency relevant parameters. The large parameter space of the fusion fuel cycle can thus be explored easily, in contrast to previous modeling approaches. Future guidance for R&D (research and development) in fusion nuclear technology is discussed in view of possible routes to take in reducing the tritium breeding requirements of DT fusion reactors.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-01
The Data Fusion Modeling (DFM) approach has been used to develop a groundwater flow and transport model of the Old Burial Grounds (OBG) at the US Department of Energy's Savannah River Site (SRS). The resulting DFM model was compared to an existing model that was calibrated via the typical trial-and-error method. The OBG was chosen because a substantial amount of hydrogeologic information is available, a FACT (derivative of VAM3DCG) flow and transport model of the site exists, and the calibration and numerics were challenging with standard approaches. The DFM flow model developed here is similar to the flow model by Flach et al. This allows comparison of the two flow models and validates the utility of DFM. The contaminant of interest for this study is tritium, because it is a geochemically conservative tracer that has been monitored along the seepline near the F-Area effluent and Fourmile Branch for several years.
Evaluation of alternative model-data fusion approaches in water balance estimation across Australia
NASA Astrophysics Data System (ADS)
van Dijk, A. I. J. M.; Renzullo, L. J.
2009-04-01
Australia's national agencies are developing a continental modelling system to provide a range of water information services. It will include rolling water balance estimation to underpin national water accounts, water resources assessments that interpret current water resources availability and trends in a historical context, and water resources predictions coupled to climate and weather forecasting. The nation-wide coverage, currency, accuracy, and consistency required mean that remote sensing will need to play an important role along with in-situ observations. Different approaches to blending models and observations can be considered. Integration of on-ground and remote sensing data into land surface models in atmospheric applications often involves state updating through model-data assimilation techniques. By comparison, retrospective water balance estimation and hydrological scenario modelling to date have mostly relied on static parameter fitting against observations and have made little use of earth observation. The model-data fusion approach most appropriate for a continental water balance estimation system will need to consider the trade-off between computational overhead and the accuracy gains achieved when using more sophisticated synthesis techniques and additional observations. This trade-off was investigated using a landscape hydrological model and satellite-based estimates of soil moisture and vegetation properties for several gauged test catchments in southeast Australia.
Using Geostatistical Data Fusion Techniques and MODIS Data to Upscale Simulated Wheat Yield
NASA Astrophysics Data System (ADS)
Castrignano, A.; Buttafuoco, G.; Matese, A.; Toscano, P.
2014-12-01
Population growth increases food demand. Assessing food demand and predicting the actual supply for a given location are critical components of strategic food security planning at the regional scale. Crop yield can be simulated using crop models because it is site-specific and determined by weather, management, length of growing season and soil properties. Crop models require reliable location-specific data that are not generally available. Obtaining these data at a large number of locations is time-consuming, costly and sometimes simply not feasible. An upscaling method to extend coverage of sparse estimates of crop yield to an appropriate extrapolation domain is therefore required. This work aims to investigate the applicability of a geostatistical data fusion approach for merging remote sensing data with the predictions of a simulation model of wheat growth and production based on ground data. The study area is the Capitanata plain (4000 km2) located in the Apulia Region, mostly cropped with durum wheat. The MODIS EVI/NDVI data products for the Capitanata plain were downloaded from the Land Processes Distributed Active Archive Center (LPDAAC) for the whole crop cycle of durum wheat. Phenological development, biomass growth and grain quantity of durum wheat were simulated by the Delphi system, based on a crop simulation model linked to a database including soil properties, agronomical and meteorological data. Multicollocated cokriging was used to integrate the secondary exhaustive information (multi-spectral MODIS data) with the primary variable (sparsely distributed biomass/yield model predictions of durum wheat). The model estimates were strongly spatially correlated with the radiance data (red and NIR bands), and the data fusion approach proved suitable and flexible for integrating data of different types and supports.
Statistical label fusion with hierarchical performance models
Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.
2014-01-01
Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
Matrix factorization-based data fusion for gene function prediction in baker's yeast and slime mold.
Zitnik, Marinka; Zupan, Blaž
2014-01-01
The development of effective methods for the characterization of gene functions that are able to combine diverse data sources in a sound and easily-extendible way is an important goal in computational biology. We have previously developed a general matrix factorization-based data fusion approach for gene function prediction. In this manuscript, we show that this data fusion approach can be applied to gene function prediction and that it can fuse various heterogeneous data sources, such as gene expression profiles, known protein annotations, interaction and literature data. The fusion is achieved by simultaneous matrix tri-factorization that shares matrix factors between sources. We demonstrate the effectiveness of the approach by evaluating its performance on predicting ontological annotations in slime mold D. discoideum and on recognizing proteins of baker's yeast S. cerevisiae that participate in the ribosome or are located in the cell membrane. Our approach achieves predictive performance comparable to that of the state-of-the-art kernel-based data fusion, but requires fewer data preprocessing steps.
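The shared-factor idea behind simultaneous matrix tri-factorization can be sketched on synthetic data; the multiplicative updates of the original method are replaced here by plain gradient descent, and all matrix sizes and ranks are assumptions.

```python
# Minimal sketch of data fusion by simultaneous matrix tri-factorization: two relations
# R12 (genes x annotations) and R13 (genes x expression profiles) are factorized as
# G1 S12 G2^T and G1 S13 G3^T with a *shared* gene factor G1.  Plain gradient descent
# stands in for the multiplicative update rules used in the paper.
import numpy as np

rng = np.random.default_rng(0)
n_genes, n_annot, n_expr = 100, 30, 20
R12 = rng.random((n_genes, n_annot))            # synthetic relation matrices
R13 = rng.random((n_genes, n_expr))

k1, k2, k3 = 8, 5, 4                            # factorization ranks (assumed)
G1 = 0.1 * rng.random((n_genes, k1))
G2 = 0.1 * rng.random((n_annot, k2))
G3 = 0.1 * rng.random((n_expr, k3))
S12 = 0.1 * rng.random((k1, k2))
S13 = 0.1 * rng.random((k1, k3))

lr = 5e-4
for step in range(5000):
    E12 = R12 - G1 @ S12 @ G2.T                 # residuals of both relations
    E13 = R13 - G1 @ S13 @ G3.T
    G1 += lr * 2 * (E12 @ G2 @ S12.T + E13 @ G3 @ S13.T)   # shared factor sees both sources
    G2 += lr * 2 * (E12.T @ G1 @ S12)
    G3 += lr * 2 * (E13.T @ G1 @ S13)
    S12 += lr * 2 * (G1.T @ E12 @ G2)
    S13 += lr * 2 * (G1.T @ E13 @ G3)

R12_hat = G1 @ S12 @ G2.T                       # completed relation: scores for unseen pairs
print("reconstruction RMSE:", np.sqrt(np.mean((R12 - R12_hat) ** 2)))
```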
Gene Fusion Markup Language: a prototype for exchanging gene fusion data.
Kalyana-Sundaram, Shanker; Shanmugam, Achiraman; Chinnaiyan, Arul M
2012-10-16
An avalanche of next generation sequencing (NGS) studies has generated an unprecedented amount of genomic structural variation data. These studies have also identified many novel gene fusion candidates with more detailed resolution than previously achieved. However, in the excitement and necessity of publishing the observations from this recently developed cutting-edge technology, no community standardization approach has arisen to organize and represent the data with the essential attributes in an interchangeable manner. As transcriptome studies have been widely used for gene fusion discoveries, the current non-standard mode of data representation could potentially impede data accessibility, critical analyses, and further discoveries in the near future. Here we propose a prototype, Gene Fusion Markup Language (GFML) as an initiative to provide a standard format for organizing and representing the significant features of gene fusion data. GFML will offer the advantage of representing the data in a machine-readable format to enable data exchange, automated analysis interpretation, and independent verification. As this database-independent exchange initiative evolves it will further facilitate the formation of related databases, repositories, and analysis tools. The GFML prototype is made available at http://code.google.com/p/gfml-prototype/. The Gene Fusion Markup Language (GFML) presented here could facilitate the development of a standard format for organizing, integrating and representing the significant features of gene fusion data in an inter-operable and query-able fashion that will enable biologically intuitive access to gene fusion findings and expedite functional characterization. A similar model is envisaged for other NGS data analyses.
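The flavor of such a machine-readable exchange record can be illustrated with a small sketch; the element and attribute names below are invented for illustration and are not the published GFML schema, and the breakpoint coordinates are placeholders.

```python
# Hypothetical sketch of a machine-readable gene-fusion record built with the standard library;
# tag and attribute names are illustrative only and do NOT reproduce the actual GFML schema.
import xml.etree.ElementTree as ET

fusion = ET.Element("geneFusion", id="example-1")
ET.SubElement(fusion, "sample", tissue="prostate", assay="RNA-seq")
ET.SubElement(fusion, "partner", role="5prime", gene="TMPRSS2", chromosome="chr21",
              breakpoint="41498119")          # coordinates are placeholders
ET.SubElement(fusion, "partner", role="3prime", gene="ERG", chromosome="chr21",
              breakpoint="38445621")
ET.SubElement(fusion, "evidence", junctionReads="17", spanningReads="42", caller="exampleCaller")

print(ET.tostring(fusion, encoding="unicode"))  # serialized record ready for exchange
```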
Autonomous Soil Assessment System: A Data-Driven Approach to Planetary Mobility Hazard Detection
NASA Astrophysics Data System (ADS)
Raimalwala, K.; Faragalli, M.; Reid, E.
2018-04-01
The Autonomous Soil Assessment System predicts mobility hazards for rovers. Its development and performance are presented, with focus on its data-driven models, machine learning algorithms, and real-time sensor data fusion for predictive analytics.
Qi, Shile; Calhoun, Vince D.; van Erp, Theo G. M.; Bustillo, Juan; Damaraju, Eswar; Turner, Jessica A.; Du, Yuhui; Chen, Jiayu; Yu, Qingbao; Mathalon, Daniel H.; Ford, Judith M.; Voyvodic, James; Mueller, Bryon A.; Belger, Aysenil; McEwen, Sarah; Potkin, Steven G.; Preda, Adrian; Jiang, Tianzi
2017-01-01
Multimodal fusion is an effective approach for taking advantage of cross-information among multiple imaging datasets to better understand brain diseases. However, most current fusion approaches are blind, without adopting any prior information. There is increasing interest in uncovering the neurocognitive mapping of specific behavioral measurements onto enriched brain imaging data; hence, a supervised, goal-directed model that uses a priori information as a reference to guide multimodal data fusion is needed and is a natural option. Here we propose a fusion-with-reference model, called "multi-site canonical correlation analysis with reference plus joint independent component analysis" (MCCAR+jICA), which can precisely identify co-varying multimodal imaging patterns closely related to reference information, such as cognitive scores. In a 3-way fusion simulation, the proposed method was compared with its alternatives on the estimation accuracy of both target component decomposition and modality linkage detection; MCCAR+jICA outperformed the others with higher precision. In human imaging data, working memory performance was used as a reference to investigate the covarying functional and structural brain patterns among 3 modalities and how they are impaired in schizophrenia. Two independent cohorts (294 and 83 subjects, respectively) were used. Interestingly, similar brain maps were identified between the two cohorts, with substantial overlap in the executive control networks in fMRI, the salience network in sMRI, and major white matter tracts in dMRI. These regions have been linked with working memory deficits in schizophrenia in multiple reports, and MCCAR+jICA further verified them in a repeatable, joint manner, demonstrating the potential of such results to serve as neuromarkers for mental disorders. PMID:28708547
Joint parameter and state estimation algorithms for real-time traffic monitoring.
DOT National Transportation Integrated Search
2013-12-01
A common approach to traffic monitoring is to combine a macroscopic traffic flow model with traffic sensor data in a process called state estimation, data fusion, or data assimilation. The main challenge of traffic state estimation is the integration...
Data Fusion Based on Optical Technology for Observation of Human Manipulation
NASA Astrophysics Data System (ADS)
Falco, Pietro; De Maria, Giuseppe; Natale, Ciro; Pirozzi, Salvatore
2012-01-01
The adoption of human observation is becoming increasingly common within imitation learning and programming by demonstration (PbD) approaches to robot programming. For robotic systems equipped with anthropomorphic hands, the observation phase is very challenging and no ultimate solution exists. This work proposes a novel mechatronic approach to the observation of human hand motion during manipulation tasks. The strategy is based on the combined use of an optical motion capture system and a low-cost data glove equipped with novel joint angle sensors based on optoelectronic technology. The combination of the two information sources is obtained through a sensor fusion algorithm based on the extended Kalman filter (EKF), suitably modified to tackle the problem of marker occlusions typical of optical motion capture systems. This approach requires a kinematic model of the human hand. Another key contribution of this work is a new method to calibrate this model.
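The occlusion-handling fusion scheme can be illustrated with a heavily simplified linear stand-in for the filter described above; the real system uses a full hand kinematic model, and the noise levels and occlusion rate below are assumptions.

```python
# Simplified stand-in for the paper's EKF fusion: a linear Kalman filter tracks one joint
# angle and its velocity, fusing a data-glove reading every step and a motion-capture reading
# only when the marker is visible (occluded samples are simply skipped).  Illustrative only.
import numpy as np

dt = 0.01
F = np.array([[1.0, dt], [0.0, 1.0]])      # constant-velocity state transition
Q = np.diag([1e-5, 1e-3])                  # process noise (assumed)
H = np.array([[1.0, 0.0]])                 # both sensors observe the angle directly
R_glove, R_mocap = 4e-2, 1e-3              # glove is noisier than the optical system

x = np.zeros(2)                            # state: [angle, angular velocity]
P = np.eye(2)

def update(x, P, z, r):
    """Standard Kalman measurement update."""
    S = H @ P @ H.T + r
    K = P @ H.T / S
    x = x + (K * (z - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

rng = np.random.default_rng(1)
for k in range(500):
    true_angle = np.sin(0.5 * k * dt)                        # synthetic motion
    x, P = F @ x, F @ P @ F.T + Q                            # prediction step
    x, P = update(x, P, true_angle + rng.normal(0, 0.2), R_glove)
    if rng.random() > 0.3:                                   # marker visible ~70% of the time
        x, P = update(x, P, true_angle + rng.normal(0, 0.03), R_mocap)

print("final angle estimate vs truth:", x[0], np.sin(0.5 * 499 * dt))
```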
Data fusion for CD metrology: heterogeneous hybridization of scatterometry, CDSEM, and AFM data
NASA Astrophysics Data System (ADS)
Hazart, J.; Chesneau, N.; Evin, G.; Largent, A.; Derville, A.; Thérèse, R.; Bos, S.; Bouyssou, R.; Dezauzier, C.; Foucher, J.
2014-04-01
The manufacturing of next generation semiconductor devices forces metrology tool providers to make an exceptional effort in order to meet the requirements for precision, accuracy and throughput stated in the ITRS. In past years hybrid metrology (based on data fusion theories) has been investigated as a new methodology for advanced metrology [1][2][3]. This paper provides a new point of view on data fusion for metrology through experiments and simulations. The techniques are presented concretely in terms of the equations to be solved. The first point of view is High Level Fusion, which post-processes the simple numbers, with their associated uncertainties, produced by the tools. In this paper, it is divided into two stages: one for calibration to reach accuracy, the second to reach precision thanks to Bayesian Fusion. From our perspective, the first stage is mandatory before applying the second stage, which is the one commonly presented [1]. However, a reference metrology system is necessary for this fusion. Precision can thus be improved if and only if the tools to be fused are perfectly matched, at least for some parameters. We provide a methodology, similar to a multidimensional TMU, able to perform this matching exercise. It is demonstrated on a 28 nm node backend lithography case. The second point of view is Deep Level Fusion, which instead works with raw data and their combination. In the approach presented here, the analysis of each set of raw data is based on a parametric model and on connections between the parameters of each tool. In order to allow OCD/SEM Deep Level Fusion, a SEM Compact Model derived from [4] has been developed and compared to AFM. As far as we know, this is the first time such techniques have been coupled at the deep level. A numerical study on the case of a simple stack for lithography is performed. We show strict equivalence of Deep Level Fusion and High Level Fusion when the tools are sensitive and the models are perfect. When one of the tools can be considered a reference and the second is biased, High Level Fusion is far superior to standard Deep Level Fusion. Otherwise, only the second stage of High Level Fusion is possible (Bayesian Fusion) and it does not provide a substantial advantage. Finally, when OCD is equipped with methods for bias detection [5], Deep Level Fusion outclasses the two-stage High Level Fusion and will benefit the industry for the most advanced production nodes.
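A minimal sketch of the two-stage High Level Fusion idea under Gaussian assumptions follows; the tool names, bias values and uncertainties are invented for illustration.

```python
# Sketch of two-stage High Level Fusion under Gaussian assumptions (numbers invented):
# stage 1 removes each tool's bias against a reference measurement, stage 2 fuses the
# calibrated CD values by inverse-variance (Bayesian) weighting.
import numpy as np

# Stage 1: per-tool bias estimated on a few reference features (e.g. measured by a trusted AFM)
reference    = np.array([32.1, 45.0, 27.8])              # nm, reference CDs
cdsem_on_ref = np.array([33.0, 46.2, 28.6])
ocd_on_ref   = np.array([31.7, 44.8, 27.3])
bias_cdsem = np.mean(cdsem_on_ref - reference)
bias_ocd   = np.mean(ocd_on_ref - reference)

# Stage 2: Bayesian fusion of one calibrated measurement per tool
cd_cdsem, sigma_cdsem = 30.4 - bias_cdsem, 0.8           # nm, calibrated value and 1-sigma precision
cd_ocd,   sigma_ocd   = 29.6 - bias_ocd,   0.3

w = np.array([1 / sigma_cdsem**2, 1 / sigma_ocd**2])     # inverse-variance weights
cd_fused    = np.dot(w, [cd_cdsem, cd_ocd]) / w.sum()
sigma_fused = np.sqrt(1 / w.sum())                       # fused uncertainty is smaller than either tool's
print(f"fused CD = {cd_fused:.2f} nm +/- {sigma_fused:.2f} nm")
```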
Joint Facial Action Unit Detection and Feature Fusion: A Multi-conditional Learning Approach.
Eleftheriadis, Stefanos; Rudovic, Ognjen; Pantic, Maja
2016-10-05
Automated analysis of facial expressions can benefit many domains, from marketing to clinical diagnosis of neurodevelopmental disorders. Facial expressions are typically encoded as a combination of facial muscle activations, i.e., action units. Depending on context, these action units co-occur in specific patterns, and rarely in isolation. Yet, most existing methods for automatic action unit detection fail to exploit dependencies among them and the corresponding facial features. To address this, we propose a novel multi-conditional latent variable model for simultaneous fusion of facial features and joint action unit detection. Specifically, the proposed model performs feature fusion in a generative fashion via a low-dimensional shared subspace, while simultaneously performing action unit detection using a discriminative classification approach. We show that by combining the merits of both approaches, the proposed methodology outperforms existing purely discriminative/generative methods for the target task. To reduce the number of parameters and avoid overfitting, a novel Bayesian learning approach based on Monte Carlo sampling is proposed to integrate out the shared subspace. We validate the proposed method on posed and spontaneous data from three publicly available datasets (CK+, DISFA and Shoulder-pain), and show that both feature fusion and joint learning of action units lead to improved performance compared to the state-of-the-art methods for the task.
NASA Astrophysics Data System (ADS)
Duhan, Sukhvinder S.; Singh, Manjeet; Kharab, Rajesh
2012-06-01
We have studied the effects of nuclear-induced breakup channel coupling on the fusion cross-section for the 6Li+12C and 6He+12C systems in the near-barrier energy regime using the dynamic polarization potential (DPP) approach. It is found that the fusion cross-section is enhanced with respect to the standard one-dimensional barrier penetration model in the below-barrier energy regime, while at energies above the barrier the fusion cross-section is suppressed with respect to the same model. The agreement between the data and the predictions for the 6Li+12C system improves significantly as a result of the inclusion of the nuclear-induced DPP.
Borràs, Eva; Ferré, Joan; Boqué, Ricard; Mestres, Montserrat; Aceña, Laura; Calvo, Angels; Busto, Olga
2016-08-01
Headspace-Mass Spectrometry (HS-MS), Fourier Transform Mid-Infrared spectroscopy (FT-MIR) and UV-Visible spectrophotometry (UV-vis) instrumental responses have been combined to predict virgin olive oil sensory descriptors. 343 olive oil samples analyzed during four consecutive harvests (2010-2014) were used to build multivariate calibration models using partial least squares (PLS) regression. The reference values of the sensory attributes were provided by expert assessors from an official taste panel. The instrumental data were modeled individually and also using data fusion approaches. The use of fused data, at both low and mid levels of abstraction, improved the PLS predictions for all the olive oil descriptors. The best PLS models were obtained for two positive attributes (fruity and bitter) and two defective descriptors (fusty and musty), all of them using data fusion of the MS and MIR spectral fingerprints. Although good predictions were not obtained for some sensory descriptors, the results are encouraging, especially considering that the legal categorization of virgin olive oils only requires the determination of the fruity and defective descriptors.
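A minimal sketch of the low-level fusion strategy described above follows, using synthetic stand-ins for the MS, MIR and UV-vis blocks; the matrix sizes, 0-5 attribute scale, and component count are assumptions.

```python
# Sketch of low-level data fusion: the MS, MIR and UV-vis responses of each sample are simply
# concatenated into one long vector before fitting a PLS regression against the panel score.
# Data are synthetic placeholders for the spectral blocks described in the abstract.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
n = 120
X_ms, X_mir, X_uv = rng.random((n, 80)), rng.random((n, 200)), rng.random((n, 60))
y = rng.random(n) * 5                      # stand-in for a sensory attribute on a 0-5 scale

X_fused = np.hstack([X_ms, X_mir, X_uv])   # low-level (measurement-level) data fusion
pls = PLSRegression(n_components=6)
y_hat = cross_val_predict(pls, X_fused, y, cv=10).ravel()
rmsep = np.sqrt(np.mean((y - y_hat) ** 2))
print(f"cross-validated RMSEP on fused data: {rmsep:.3f}")
```

Mid-level fusion would instead fit a model per instrumental block and concatenate the resulting scores (e.g. PLS or PCA scores) before the final regression.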
Sensor-agnostic photogrammetric image registration with applications to population modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Devin A; Moehl, Jessica J
2016-01-01
Photogrammetric registration of airborne and spaceborne imagery is a crucial prerequisite to many data fusion tasks. While embedded sensor models provide a rough geolocation estimate, these metadata may be incomplete or imprecise. Manual solutions are appropriate for small-scale projects, but for rapid streams of cross-modal, multi-sensor, multi-temporal imagery with varying metadata standards, an automated approach is required. We present a high-performance image registration workflow to address this need. This paper outlines the core development concepts and demonstrates its utility with respect to the 2016 data fusion contest imagery. In particular, Iris ultra-HD video is georeferenced to the Earth surface via registration to DEIMOS-2 imagery, which serves as a trusted control source. Geolocation provides opportunity to augment the video with spatial context, stereo-derived disparity, spectral sensitivity, change detection, and numerous ancillary geospatial layers. We conclude by leveraging these derivative data layers towards one such fusion application: population distribution modeling.
Fusion of GEDI, ICESAT2 & NISAR data for above ground biomass mapping in Sonoma County, California
NASA Astrophysics Data System (ADS)
Duncanson, L.; Simard, M.; Thomas, N. M.; Neuenschwander, A. L.; Hancock, S.; Armston, J.; Dubayah, R.; Hofton, M. A.; Huang, W.; Tang, H.; Marselis, S.; Fatoyinbo, T.
2017-12-01
Several upcoming NASA missions will collect data sensitive to forest structure (GEDI, ICESAT-2 & NISAR). The LiDAR and SAR data collected by these missions will be used in coming years to map forest aboveground biomass at various resolutions. This research focuses on developing and testing multi-sensor data fusion approaches in advance of these missions. Here, we present the first case study of a CMS-16 grant with results from Sonoma County, California. We simulate lidar and SAR datasets from GEDI, ICESAT-2 and NISAR using airborne discrete return lidar and UAVSAR data, respectively. GEDI and ICESAT-2 signals are simulated from high point density discrete return lidar that was acquired over the entire county in 2014 through a previous CMS project (Dubayah & Hurtt, CMS-13). NISAR is simulated from L-band UAVSAR data collected in 2014. These simulations are empirically related to 300 field plots of aboveground biomass as well as a 30 m biomass map produced from the 2014 airborne lidar data. We model biomass independently for each simulated mission dataset and then test two fusion methods for county-wide mapping: (1) a pixel-based approach and (2) an object-oriented approach. In the pixel-based approach, GEDI and ICESAT-2 biomass models are calibrated over field plots and applied in orbital simulations for a 2-year period of the GEDI and ICESAT-2 missions. These simulated samples are then used to calibrate UAVSAR data to produce a 0.25 ha map. In the object-oriented approach, the GEDI and ICESAT-2 data are identical to those in the pixel-based approach, but they calibrate image objects of similar L-band backscatter rather than uniform pixels. The results of this research demonstrate the estimated ability of each of these three missions to independently map biomass in a temperate, high-biomass system, as well as the potential improvement expected through combining mission datasets.
High Level Information Fusion (HLIF) with nested fusion loops
NASA Astrophysics Data System (ADS)
Woodley, Robert; Gosnell, Michael; Fischer, Amber
2013-05-01
Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.
Peralta, Emmanuel; Vargas, Héctor; Hermosilla, Gabriel
2018-01-01
Proximity sensors are broadly used in mobile robots for obstacle detection. The traditional calibration process for this kind of sensor can be time-consuming because it is usually carried out through manual, repetitive identification. The resulting obstacle detection models are usually nonlinear functions that can differ for each proximity sensor attached to the robot. In addition, the model is highly dependent on the type of sensor (e.g., ultrasonic or infrared), on changes in light intensity, and on the properties of the obstacle such as shape, colour, and surface texture, among others. That is why in some situations it can be useful to gather all the measurements provided by different kinds of sensors in order to build a unique model that estimates the distances to the obstacles around the robot. This paper presents a novel approach to obtaining an obstacle detection model based on the fusion of sensor data and automatic calibration using artificial neural networks. PMID:29495338
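The idea can be sketched with synthetic infrared and ultrasonic readings; the sensor response curves, noise levels and network size below are assumptions and do not reproduce the calibration procedure of the paper.

```python
# Sketch: instead of calibrating each proximity sensor separately, raw readings from several
# heterogeneous sensors are fed together to a small neural network that learns one distance model.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
true_distance = rng.uniform(0.1, 2.0, 2000)                              # metres
ir         = 1.0 / (true_distance + 0.05) + rng.normal(0, 0.05, 2000)    # nonlinear IR response
ultrasonic = true_distance + rng.normal(0, 0.03, 2000)                   # noisy but ~linear
X = np.column_stack([ir, ultrasonic])                                    # fused raw readings

X_tr, X_te, y_tr, y_te = train_test_split(X, true_distance, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("mean abs. error [m]:", np.mean(np.abs(model.predict(X_te) - y_te)))
```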
Assessment of Spatiotemporal Fusion Algorithms for Planet and Worldview Images.
Kwan, Chiman; Zhu, Xiaolin; Gao, Feng; Chou, Bryan; Perez, Daniel; Li, Jiang; Shen, Yuzhong; Koperski, Krzysztof; Marchisio, Giovanni
2018-03-31
Although Worldview-2 (WV) images (non-pansharpened) have 2-m resolution, the re-visit times for the same areas may be seven days or more. In contrast, Planet images are collected using small satellites that can cover the whole Earth almost daily. However, the resolution of Planet images is 3.125 m. It would be ideal to fuse these two satellites images to generate high spatial resolution (2 m) and high temporal resolution (1 or 2 days) images for applications such as damage assessment, border monitoring, etc. that require quick decisions. In this paper, we evaluate three approaches to fusing Worldview (WV) and Planet images. These approaches are known as Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM), Flexible Spatiotemporal Data Fusion (FSDAF), and Hybrid Color Mapping (HCM), which have been applied to the fusion of MODIS and Landsat images in recent years. Experimental results using actual Planet and Worldview images demonstrated that the three aforementioned approaches have comparable performance and can all generate high quality prediction images.
Chen, Baisheng; Wu, Huanan; Li, Sam Fong Yau
2014-03-01
To overcome the challenge of selecting an appropriate pathlength for accurate wastewater chemical oxygen demand (COD) monitoring by UV-vis spectroscopy in wastewater treatment processes, a variable pathlength approach combined with partial least squares regression (PLSR) was developed in this study. Two new strategies were proposed to extract the relevant information from the UV-vis spectral data of variable pathlength measurements. The first strategy was data fusion, with two fusion levels: low-level data fusion (LLDF) and mid-level data fusion (MLDF). Predictive accuracy was found to improve, as indicated by lower root-mean-square errors of prediction (RMSEP) compared with those obtained for single-pathlength measurements. Both fusion levels were found to deliver very robust PLSR models, with residual predictive deviations (RPD) greater than 3 (3.22 and 3.29, respectively). The second strategy involved calculating the slope of absorbance against pathlength at each wavelength to generate slope-derived spectra. Without the need to select an optimal pathlength, the predictive accuracy (RMSEP) was improved by 20-43% compared to single-pathlength spectroscopy. Compared to the nine-factor models from the fusion strategy, the PLSR model from slope-derived spectroscopy was more parsimonious, with only five factors, and more robust, with a residual predictive deviation (RPD) of 3.72. It also offered excellent correlation between predicted and measured COD values, with an R² of 0.936. In sum, variable pathlength spectroscopy with the two proposed data analysis strategies proved successful in enhancing the prediction performance for COD in wastewater and showed high potential for application in on-line water quality monitoring.
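The slope-derived spectra strategy can be sketched directly: regress absorbance on pathlength at every wavelength and keep the slopes as a pathlength-free spectrum. The pathlength set, data shapes and noise level below are assumptions.

```python
# Sketch of slope-derived spectra: for each wavelength, absorbance is regressed against the
# optical pathlength and the fitted slopes form a pathlength-free spectrum that can be fed to PLSR.
import numpy as np

n_samples, n_wavelengths = 50, 120
pathlengths_mm = np.array([1.0, 2.0, 5.0, 10.0])          # assumed variable pathlengths

rng = np.random.default_rng(0)
# absorbance cube: (sample, pathlength, wavelength), Beer-Lambert-like behaviour plus noise
base_spectra = rng.random((n_samples, n_wavelengths))
A = base_spectra[:, None, :] * pathlengths_mm[None, :, None] / 10.0
A += rng.normal(0, 0.005, A.shape)

# least-squares slope of absorbance vs pathlength at every (sample, wavelength) pair
L = pathlengths_mm - pathlengths_mm.mean()
slope_spectra = np.einsum('p,spw->sw', L, A - A.mean(axis=1, keepdims=True)) / np.sum(L**2)

print(slope_spectra.shape)   # (50, 120): one slope-derived spectrum per sample, ready for PLSR
```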
Multisource image fusion method using support value transform.
Zheng, Sheng; Shi, Wen-Zhong; Liu, Jian; Zhu, Guang-Xi; Tian, Jin-Wen
2007-07-01
With the development of numerous imaging sensors, many images of the same scene can be captured simultaneously by different sensors. However, there are many scenarios where no single sensor gives the complete picture. Image fusion is an important approach to solving this problem; it produces a single image which preserves all relevant information from a set of different sensors. In this paper, we propose a new image fusion method using the support value transform, which uses the support value to represent the salient features of the image. This is based on the fact that, in support vector machines (SVMs), the data with larger support values have a physical meaning in the sense that they reveal the relative importance of the data points for contributing to the SVM model. The mapped least squares SVM (mapped LS-SVM) is used to efficiently compute the support values of the image. The support value analysis is developed by using a series of multiscale support value filters, which are obtained by filling zeros in the basic support value filter deduced from the mapped LS-SVM to match the resolution of the desired level. Compared with the widely used image fusion methods, such as the Laplacian pyramid and discrete wavelet transform methods, the proposed method is an undecimated transform-based approach. The fusion experiments are undertaken on multisource images. The results demonstrate that the proposed approach is effective and is superior to the conventional image fusion methods in terms of the pertinent quantitative fusion evaluation indexes, such as the quality of visual information (Q(AB/F)), mutual information, etc.
Fusion and Gaussian mixture based classifiers for SONAR data
NASA Astrophysics Data System (ADS)
Kotari, Vikas; Chang, KC
2011-06-01
Underwater mines are inexpensive and highly effective weapons. They are difficult to detect and classify. Hence, the detection and classification of underwater mines is essential for the safety of naval vessels, and this necessitates the formulation of highly efficient classifiers and detection techniques. Current techniques primarily focus on signals from one source. Data fusion is known to increase the accuracy of detection and classification. In this paper, we formulate a fusion-based classifier and a Gaussian mixture model (GMM) based classifier for the classification of underwater mines. The emphasis has been on sound navigation and ranging (SONAR) signals due to their extensive use in current naval operations. The classifiers have been tested on real SONAR data obtained from the University of California Irvine (UCI) repository. The performance of both the GMM-based classifier and the fusion-based classifier clearly demonstrates their superior classification accuracy over conventional single-source cases and validates our approach.
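A minimal sketch of a GMM-based classifier of the kind described follows; the feature matrix here is random, standing in for the 60-feature UCI sonar returns, and the number of mixture components is an assumption.

```python
# Sketch of a GMM-based classifier: one Gaussian mixture is fitted per class and a test return
# is assigned to the class with the larger log-likelihood.  Data are synthetic placeholders.
import numpy as np
from sklearn.mixture import GaussianMixture
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (100, 60)), rng.normal(0.7, 1.0, (100, 60))])
y = np.array([0] * 100 + [1] * 100)                     # 0 = rock, 1 = mine (stand-in labels)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

gmms = {c: GaussianMixture(n_components=2, covariance_type='diag', random_state=0)
            .fit(X_tr[y_tr == c]) for c in (0, 1)}
scores = np.column_stack([gmms[c].score_samples(X_te) for c in (0, 1)])  # per-class log-likelihoods
y_pred = scores.argmax(axis=1)
print("accuracy:", np.mean(y_pred == y_te))
```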
NASA Astrophysics Data System (ADS)
Grova, C.; Jannin, P.; Biraben, A.; Buvat, I.; Benali, H.; Bernard, A. M.; Scarabin, J. M.; Gibaud, B.
2003-12-01
Quantitative evaluation of brain MRI/SPECT fusion methods for normal and in particular pathological datasets is difficult, due to the frequent lack of relevant ground truth. We propose a methodology to generate MRI and SPECT datasets dedicated to the evaluation of MRI/SPECT fusion methods and illustrate the method when dealing with ictal SPECT. The method consists in generating normal or pathological SPECT data perfectly aligned with a high-resolution 3D T1-weighted MRI using realistic Monte Carlo simulations that closely reproduce the response of a SPECT imaging system. Anatomical input data for the SPECT simulations are obtained from this 3D T1-weighted MRI, while functional input data result from an inter-individual analysis of anatomically standardized SPECT data. The method makes it possible to control the 'brain perfusion' function by proposing a theoretical model of brain perfusion from measurements performed on real SPECT images. Our method provides an absolute gold standard for assessing MRI/SPECT registration method accuracy since, by construction, the SPECT data are perfectly registered with the MRI data. The proposed methodology has been applied to create a theoretical model of normal brain perfusion and ictal brain perfusion characteristic of mesial temporal lobe epilepsy. To approach realistic and unbiased perfusion models, real SPECT data were corrected for uniform attenuation, scatter and partial volume effect. An anatomic standardization was used to account for anatomic variability between subjects. Realistic simulations of normal and ictal SPECT deduced from these perfusion models are presented. The comparison of real and simulated SPECT images showed relative differences in regional activity concentration of less than 20% in most anatomical structures, for both normal and ictal data, suggesting realistic models of perfusion distributions for evaluation purposes. Inter-hemispheric asymmetry coefficients measured on simulated data were found within the range of asymmetry coefficients measured on corresponding real data. The features of the proposed approach are compared with those of other methods previously described to obtain datasets appropriate for the assessment of fusion methods.
Revisions to the JDL data fusion model
NASA Astrophysics Data System (ADS)
Steinberg, Alan N.; Bowman, Christopher L.; White, Franklin E.
1999-03-01
The Data Fusion Model maintained by the Joint Directors of Laboratories (JDL) Data Fusion Group is the most widely-used method for categorizing data fusion-related functions. This paper discusses the current effort to revise and expand this model to facilitate the cost-effective development, acquisition, integration and operation of multi-sensor/multi-source systems. Data fusion involves combining information - in the broadest sense - to estimate or predict the state of some aspect of the universe. These states may be represented in terms of attributive and relational states. If the job is to estimate the state of people, it can be useful to include consideration of informational and perceptual states in addition to the physical state. Developing cost-effective multi-source information systems requires a method for specifying data fusion processing and control functions, interfaces, and associated databases. The lack of common engineering standards for data fusion systems has been a major impediment to integration and re-use of available technology: current developments do not lend themselves to objective evaluation, comparison or re-use. This paper reports on proposed revisions and expansions of the JDL Data Fusion model to remedy some of these deficiencies. These involve broadening the functional model and related taxonomy beyond the original military focus, and integrating the Data Fusion Tree Architecture model for system description, design and development.
a Comparative Analysis of Spatiotemporal Data Fusion Models for Landsat and Modis Data
NASA Astrophysics Data System (ADS)
Hazaymeh, K.; Almagbile, A.
2018-04-01
In this study, three documented spatiotemporal data fusion models were applied to Landsat-7 and MODIS surface reflectance and NDVI. The algorithms included the spatial and temporal adaptive reflectance fusion model (STARFM), the sparse-representation-based spatiotemporal reflectance fusion model (SPSTFM), and the spatiotemporal image-fusion model (STI-FM). The objectives of this study were to (i) compare the performance of these three fusion models using one Landsat-MODIS spectral reflectance image pair and time-series datasets from the Coleambally irrigation area in Australia, and (ii) quantitatively evaluate the accuracy of the synthetic images generated by each fusion model using statistical measurements. Results showed that the three fusion models predicted the synthetic Landsat-7 image with adequate agreement. STI-FM produced more accurate reconstructions of both the Landsat-7 spectral bands and NDVI. Furthermore, it produced surface reflectance images having the highest correlation with the actual Landsat-7 images. This study indicates that STI-FM would be more suitable for spatiotemporal data fusion applications such as vegetation monitoring, drought monitoring, and evapotranspiration.
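The quantitative evaluation mentioned above typically reduces to per-band agreement statistics between the synthetic and observed images; a minimal sketch with placeholder rasters follows (the actual metrics used in the study may differ).

```python
# Sketch of the kind of quantitative check used to compare a fused (synthetic) image with the
# actual acquisition: per-band correlation, RMSE and mean bias.  Arrays are random placeholders
# for reflectance rasters of identical geometry.
import numpy as np

rng = np.random.default_rng(0)
actual    = rng.random((400, 400))                       # observed Landsat band
predicted = actual + rng.normal(0, 0.02, actual.shape)   # synthetic band from a fusion model

a, p = actual.ravel(), predicted.ravel()
r    = np.corrcoef(a, p)[0, 1]
rmse = np.sqrt(np.mean((a - p) ** 2))
bias = np.mean(p - a)
print(f"r = {r:.3f}, RMSE = {rmse:.4f}, bias = {bias:+.4f}")
```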
Dynamical approach to heavy-ion induced fusion using actinide target
NASA Astrophysics Data System (ADS)
Aritomo, Y.; Hagino, K.; Chiba, S.; Nishio, K.
2012-10-01
To treat heavy-ion reactions on an actinide target nucleus, we propose a model which takes into account the coupling to the collective states of the interacting nuclei in the penetration of the Coulomb barrier and the dynamical evolution of the nuclear shape from the contact configuration. A fluctuation-dissipation model (Langevin equation) was applied in the dynamical calculation, in which the effect of the nuclear orientation at the initial impact on the prolately deformed target nucleus was considered. Using this model, we analyzed the experimental data for the mass distribution of fission fragments (MDFF) in the reaction 36S+238U at several incident energies. Fusion-fission, quasifission and deep quasifission are separated as different trajectories on the potential energy surface. We also estimated the fusion cross section of the reaction.
Begum, Shahina; Barua, Shaibal; Ahmed, Mobyen Uddin
2014-07-03
Today, clinicians often diagnose and classify diseases based on information collected from several physiological sensor signals. However, sensor signals can easily be corrupted by noise or interference, and because of large individual variations the sensitivity of different physiological sensors can also vary. Therefore, fusing multiple sensor signals is valuable for providing a more robust and reliable decision. This paper demonstrates a physiological sensor signal classification approach using sensor signal fusion and case-based reasoning. The proposed approach has been evaluated for classifying Stressed or Relaxed individuals using sensor data fusion. Physiological sensor signals, i.e., Heart Rate (HR), Finger Temperature (FT), Respiration Rate (RR), Carbon dioxide (CO2) and Oxygen Saturation (SpO2), were collected during the data collection phase. Here, sensor fusion was done in two different ways: (i) decision-level fusion using features extracted through traditional approaches; and (ii) data-level fusion using features extracted by means of Multivariate Multiscale Entropy (MMSE). Case-Based Reasoning (CBR) is applied for the classification of the signals. The experimental results show that the proposed system could classify Stressed or Relaxed individuals with 87.5% accuracy compared to an expert in the domain. It thus shows promising results in the psychophysiological domain, and it may be possible to adapt this approach to other relevant healthcare systems.
Condorcet and borda count fusion method for ligand-based virtual screening.
Ahmed, Ali; Saeed, Faisal; Salim, Naomie; Abdo, Ammar
2014-01-01
It is known that no individual similarity measure will always give the best recall of active molecular structures for all types of activity classes. Recently, it has been shown that the effectiveness of ligand-based virtual screening approaches can be enhanced by using data fusion. Data fusion can be implemented using two different approaches: group fusion and similarity fusion. Similarity fusion involves searching using multiple similarity measures; the similarity scores, or rankings, from each similarity measure are combined to obtain the final ranking of the compounds in the database. Here, the Condorcet fusion method was examined. This approach combines the outputs of similarity searches from eleven association and distance similarity coefficients, and the winning measure for each class of molecules, based on Condorcet fusion, was chosen as the best method of searching. The recall of retrieved active molecules at the top 5% and a significance test are used to evaluate the proposed method. The MDL Drug Data Report (MDDR), Maximum Unbiased Validation (MUV) and Directory of Useful Decoys (DUD) data sets were used for the experiments and were represented by 2D fingerprints. Simulated virtual screening experiments with these standard data sets show that the use of Condorcet fusion provides a very simple way of improving ligand-based virtual screening, especially when the active molecules being sought have the lowest degree of structural heterogeneity. However, the effectiveness of Condorcet fusion increased only slightly when structural sets of high-diversity activities were being sought.
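A toy sketch of the two rank-fusion rules named in the title follows; the compound identifiers and rankings are invented for illustration.

```python
# Sketch of similarity fusion by rank combination: Borda fusion sums each compound's rank
# positions over all similarity measures, while a Condorcet-style comparison counts, for every
# pair of compounds, how many measures rank one above the other.
from itertools import combinations

# rank lists (best first) produced by three different similarity coefficients (invented)
rankings = [
    ["c3", "c1", "c4", "c2"],
    ["c1", "c3", "c2", "c4"],
    ["c3", "c4", "c1", "c2"],
]
compounds = sorted(set(rankings[0]))

# Borda count: lower summed position = better fused rank
borda = {c: sum(r.index(c) for r in rankings) for c in compounds}
print("Borda order:", sorted(compounds, key=borda.get))

# Condorcet pairwise wins
wins = {c: 0 for c in compounds}
for a, b in combinations(compounds, 2):
    a_beats_b = sum(r.index(a) < r.index(b) for r in rankings)
    if a_beats_b > len(rankings) / 2:
        wins[a] += 1
    elif a_beats_b < len(rankings) / 2:
        wins[b] += 1
print("Condorcet pairwise wins:", wins)
```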
NASA Astrophysics Data System (ADS)
Bowman, Christopher; Haith, Gary; Steinberg, Alan; Morefield, Charles; Morefield, Michael
2013-05-01
This paper describes methods to affordably improve the robustness of distributed fusion systems by opportunistically leveraging non-traditional data sources. Adaptive methods help find relevant data, create models, and characterize the model quality. These methods can also measure the conformity of this non-traditional data with fusion system products, including situation modeling and mission impact prediction. Non-traditional data can improve the quantity, quality, availability, timeliness, and diversity of the baseline fusion system sources and can therefore improve prediction and estimation accuracy and robustness at all levels of fusion. Techniques are described that automatically learn to characterize and search non-traditional contextual data, enabling operators to integrate the data with high-level fusion systems and ontologies. These techniques apply the extension of the Data Fusion & Resource Management Dual Node Network (DNN) technical architecture at Level 4. The DNN architecture effectively supports the assessment and management of the expanded portfolio of data sources, entities of interest, models, and algorithms, including data pattern discovery and context conformity. Affordable model-driven and data-driven data mining methods that discover unknown models from non-traditional and 'big data' sources are used to automatically learn entity behaviors and correlations with fusion products [14, 15]. This paper describes our context assessment software development and demonstrates context assessment of non-traditional data for comparison with an intelligence, surveillance, and reconnaissance fusion product based on an IED points-of-interest workflow.
Chowdhury, Rasheda Arman; Zerouali, Younes; Hedrich, Tanguy; Heers, Marcel; Kobayashi, Eliane; Lina, Jean-Marc; Grova, Christophe
2015-11-01
The purpose of this study is to develop and quantitatively assess whether fusion of EEG and MEG (MEEG) data within the maximum entropy on the mean (MEM) framework increases the spatial accuracy of source localization, by yielding better recovery of the spatial extent and propagation pathway of the underlying generators of inter-ictal epileptic discharges (IEDs). The key element in this study is the integration of the complementary information from EEG and MEG data within the MEM framework. MEEG was compared with EEG and MEG when localizing single transient IEDs. The fusion approach was evaluated using realistic simulation models involving one or two spatially extended sources mimicking propagation patterns of IEDs. We also assessed the impact of the number of EEG electrodes required for an efficient EEG-MEG fusion. MEM was compared with minimum norm estimate, dynamic statistical parametric mapping, and standardized low-resolution electromagnetic tomography. The fusion approach was finally assessed on real epileptic data recorded from two patients showing IEDs simultaneously in EEG and MEG. Overall the localization of MEEG data using MEM provided better recovery of the source spatial extent, more sensitivity to the source depth and more accurate detection of the onset and propagation of IEDs than EEG or MEG alone. MEM was more accurate than the other methods. MEEG proved more robust than EEG and MEG for single IED localization in low signal-to-noise ratio conditions. We also showed that only few EEG electrodes are required to bring additional relevant information to MEG during MEM fusion.
A data fusion approach to indications and warnings of terrorist attacks
NASA Astrophysics Data System (ADS)
McDaniel, David; Schaefer, Gregory
2014-05-01
Indications and Warning (I&W) of terrorist attacks, particularly IED attacks, require detection of networks of agents and patterns of behavior. Social network analysis tries to detect a network; activity analysis tries to detect anomalous activities. This work builds on both to detect elements of an activity model of terrorist attack activity - the agents, resources, networks, and behaviors. The activity model is expressed as RDF triple statements, where the tuple positions are elements or subsets of a formal ontology for activity models. The advantage of a model is that its elements are interdependent and evidence for or against one will influence the others, so that there is a multiplier effect. The advantage of the formality is that detection can occur hierarchically, that is, at different levels of abstraction. The model matching is expressed as a likelihood ratio between input text and the model triples. The likelihood ratio is designed to be analogous to the track correlation likelihood ratios common at JDL fusion level 1. This required the development of a semantic distance metric for positive and null hypotheses, as well as for complex objects. The metric uses the Web 1T (one-terabyte) database of one- to five-gram frequencies for priors. This size requires the use of big data technologies, so a Hadoop cluster is used in conjunction with OpenNLP natural language and Mahout clustering software. Distributed data fusion MapReduce jobs distribute parts of the data fusion problem to the Hadoop nodes. For the purposes of this initial testing, open-source models and text inputs of similar complexity to terrorist events were used as surrogates for the intended counter-terrorism application.
Borràs, Eva; Ferré, Joan; Boqué, Ricard; Mestres, Montserrat; Aceña, Laura; Calvo, Angels; Busto, Olga
2016-07-15
Three instrumental techniques, headspace-mass spectrometry (HS-MS), mid-infrared spectroscopy (MIR) and UV-visible spectrophotometry (UV-vis), have been combined to classify virgin olive oil samples based on the presence or absence of sensory defects. The reference sensory values were provided by an official taste panel. Different data fusion strategies were studied to improve the discrimination capability compared to using each instrumental technique individually. A general model was applied to discriminate high-quality non-defective olive oils (extra-virgin) and the lowest-quality olive oils considered non-edible (lampante). A specific identification of key off-flavours, such as musty, winey, fusty and rancid, was also studied. The data fusion of the three techniques improved the classification results in most of the cases. Low-level data fusion was the best strategy to discriminate musty, winey and fusty defects, using HS-MS, MIR and UV-vis, and the rancid defect using only HS-MS and MIR. The mid-level data fusion approach using partial least squares-discriminant analysis (PLS-DA) scores was found to be the best strategy for defective vs non-defective and edible vs non-edible oil discrimination. However, the data fusion did not sufficiently improve the results obtained by a single technique (HS-MS) to classify non-defective classes. These results indicate that instrumental data fusion can be useful for the identification of sensory defects in virgin olive oils.
Multi-PSF fusion in image restoration of range-gated systems
NASA Astrophysics Data System (ADS)
Wang, Canjin; Sun, Tao; Wang, Tingfeng; Miao, Xikui; Wang, Rui
2018-07-01
For the task of image restoration, an accurate estimate of the degrading PSF/kernel is a prerequisite for recovering a visually superior image. The imaging process of a range-gated system in the atmosphere involves many factors, such as backscattering, background radiation, the diffraction limit and platform vibration. On one hand, due to the difficulty of constructing models for all of these factors, the kernels obtained from physical-model-based methods are neither strictly accurate nor practical. On the other hand, there are few strong edges in such images, which introduces significant errors into most image-feature-based methods. Since different methods focus on different formation factors of the kernel, their results often complement each other. Therefore, we propose an approach which combines a physical model with image features. With a fusion strategy based on the GCRF (Gaussian Conditional Random Fields) framework, we obtain a final kernel that is closer to the actual one. To address the problem that ground-truth images are difficult to obtain, we then propose a semi-data-driven fusion method in which different data sets are used to train the fusion parameters. Finally, a semi-blind restoration strategy based on the EM (Expectation Maximization) and RL (Richardson-Lucy) algorithms is proposed. Our method not only models how the laser propagates through the atmosphere and forms an image on the ICCD (Intensified CCD) plane, but also quantifies other unknown degradation factors using image-based methods, revealing how multiple kernel elements interact with each other. The experimental results demonstrate that our method achieves better performance than state-of-the-art restoration approaches.
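The RL stage of such a semi-blind scheme can be sketched in a few lines; the kernel below is a plain Gaussian placeholder rather than a fused multi-source estimate, and all image sizes and noise levels are assumptions.

```python
# Minimal Richardson-Lucy deconvolution loop of the kind used in a final restoration stage.
import numpy as np
from scipy.signal import convolve2d

def richardson_lucy(observed, psf, n_iter=30, eps=1e-12):
    estimate = np.full_like(observed, 0.5)
    psf_mirror = psf[::-1, ::-1]
    for _ in range(n_iter):
        blurred = convolve2d(estimate, psf, mode='same', boundary='symm')
        ratio = observed / (blurred + eps)
        estimate *= convolve2d(ratio, psf_mirror, mode='same', boundary='symm')
    return estimate

# Gaussian PSF placeholder, normalized to unit sum
x = np.arange(-3, 4)
g = np.exp(-x**2 / 2.0)
psf = np.outer(g, g)
psf /= psf.sum()

rng = np.random.default_rng(0)
truth = rng.random((64, 64))
observed = convolve2d(truth, psf, mode='same', boundary='symm') + rng.normal(0, 0.01, (64, 64))
restored = richardson_lucy(np.clip(observed, 0, None), psf)
print("RMSE blurred vs truth:  ", np.sqrt(np.mean((observed - truth) ** 2)))
print("RMSE restored vs truth: ", np.sqrt(np.mean((restored - truth) ** 2)))
```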
NASA Astrophysics Data System (ADS)
Weisenseel, Robert A.; Karl, William C.; Castanon, David A.; DiMarzio, Charles A.
1999-02-01
We present an analysis of statistical model based data-level fusion for near-IR polarimetric and thermal data, particularly for the detection of mines and mine-like targets. Typical detection-level data fusion methods, approaches that fuse detections from individual sensors rather than fusing at the level of the raw data, do not account rationally for the relative reliability of different sensors, nor the redundancy often inherent in multiple sensors. Representative examples of such detection-level techniques include logical AND/OR operations on detections from individual sensors and majority vote methods. In this work, we exploit a statistical data model for the detection of mines and mine-like targets to compare and fuse multiple sensor channels. Our purpose is to quantify the amount of knowledge that each polarimetric or thermal channel supplies to the detection process. With this information, we can make reasonable decisions about the usefulness of each channel. We can use this information to improve the detection process, or we can use it to reduce the number of required channels.
Investigations of image fusion
NASA Astrophysics Data System (ADS)
Zhang, Zhong
1999-12-01
The objective of image fusion is to combine information from multiple images of the same scene. The result of image fusion is a single image which is more suitable for the purpose of human visual perception or further image processing tasks. In this thesis, a region-based fusion algorithm using the wavelet transform is proposed. The identification of important features in each image, such as edges and regions of interest, is used to guide the fusion process. The idea of multiscale grouping is also introduced and a generic image fusion framework based on multiscale decomposition is studied. The framework includes all of the existing multiscale-decomposition-based fusion approaches we found in the literature which did not assume a statistical model for the source images. Comparisons indicate that our framework includes some new approaches which outperform the existing approaches for the cases we consider. Registration must precede our fusion algorithms, so we propose a hybrid scheme which uses both feature-based and intensity-based methods. The idea of robust estimation of optical flow from time-varying images is employed with a coarse-to-fine multi-resolution approach and feature-based registration to overcome some of the limitations of the intensity-based schemes. Experiments show that this approach is robust and efficient. Assessing image fusion performance in a real application is a complicated issue. In this dissertation, a mixture probability density function model is used in conjunction with the Expectation-Maximization algorithm to model histograms of edge intensity. Some new techniques are proposed for estimating the quality of a noisy image of a natural scene. Such quality measures can be used to guide the fusion. Finally, we study fusion of images obtained from several copies of a new type of camera developed for video surveillance. Our techniques increase the capability and reliability of the surveillance system and provide an easy way to obtain 3-D information of objects in the space monitored by the system.
Identifying transposon insertions and their effects from RNA-sequencing data.
de Ruiter, Julian R; Kas, Sjors M; Schut, Eva; Adams, David J; Koudijs, Marco J; Wessels, Lodewyk F A; Jonkers, Jos
2017-07-07
Insertional mutagenesis using engineered transposons is a potent forward genetic screening technique used to identify cancer genes in mouse model systems. In the analysis of these screens, transposon insertion sites are typically identified by targeted DNA-sequencing and subsequently assigned to predicted target genes using heuristics. As such, these approaches provide no direct evidence that insertions actually affect their predicted targets or how transcripts of these genes are affected. To address this, we developed IM-Fusion, an approach that identifies insertion sites from gene-transposon fusions in standard single- and paired-end RNA-sequencing data. We demonstrate IM-Fusion on two separate transposon screens of 123 mammary tumors and 20 B-cell acute lymphoblastic leukemias, respectively. We show that IM-Fusion accurately identifies transposon insertions and their true target genes. Furthermore, by combining the identified insertion sites with expression quantification, we show that we can determine the effect of a transposon insertion on its target gene(s) and prioritize insertions that have a significant effect on expression. We expect that IM-Fusion will significantly enhance the accuracy of cancer gene discovery in forward genetic screens and provide initial insight into the biological effects of insertions on candidate cancer genes. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Weirather, Jason L.; Afshar, Pegah Tootoonchi; Clark, Tyson A.; Tseng, Elizabeth; Powers, Linda S.; Underwood, Jason G.; Zabner, Joseph; Korlach, Jonas; Wong, Wing Hung; Au, Kin Fai
2015-01-01
We developed an innovative hybrid sequencing approach, IDP-fusion, to detect fusion genes, determine fusion sites and identify and quantify fusion isoforms. IDP-fusion is the first method to study gene fusion events by integrating Third Generation Sequencing long reads and Second Generation Sequencing short reads. We applied IDP-fusion to PacBio data and Illumina data from the MCF-7 breast cancer cells. Compared with the existing tools, IDP-fusion detects fusion genes at higher precision and a very low false positive rate. The results show that IDP-fusion will be useful for unraveling the complexity of multiple fusion splices and fusion isoforms within tumorigenesis-relevant fusion genes. PMID:26040699
Behavior Knowledge Space-Based Fusion for Copy-Move Forgery Detection.
Ferreira, Anselmo; Felipussi, Siovani C; Alfaro, Carlos; Fonseca, Pablo; Vargas-Munoz, John E; Dos Santos, Jefersson A; Rocha, Anderson
2016-07-20
The detection of copy-move image tampering is of paramount importance nowadays, mainly due to its potential use for misleading the opinion-forming process of the general public. In this paper, we go beyond traditional forgery detectors and aim at combining different properties of copy-move detection approaches by modeling the problem on a multiscale behavior knowledge space, which encodes the output combinations of different techniques as a priori probabilities considering multiple scales of the training data. Afterwards, the missing entries in the conditional probabilities are estimated through generative models applied to the existing training data. Finally, we propose different techniques that exploit the multi-directionality of the data to generate the final outcome detection map in a machine learning decision-making fashion. Experimental results on complex datasets, comparing the proposed techniques with a gamut of copy-move detection approaches and other fusion methodologies in the literature, show the effectiveness of the proposed method and its suitability for real-world applications.
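The core behavior-knowledge-space idea can be sketched in a few lines: the joint output pattern of several binary detectors indexes a table of tampering probabilities estimated from training data. This is a single-scale toy version, not the paper's multiscale method with generative gap-filling; the detectors, labels, and fallback prior are synthetic assumptions.

```python
# Hedged sketch of a behavior knowledge space (BKS) fusion table.
from collections import defaultdict
import numpy as np

def train_bks(detector_outputs, labels):
    """detector_outputs: (n_samples, n_detectors) binary array; labels: 0/1."""
    counts = defaultdict(lambda: [0, 0])          # pattern -> [pristine, tampered]
    for row, y in zip(detector_outputs, labels):
        counts[tuple(row)][int(y)] += 1
    return {p: c[1] / (c[0] + c[1]) for p, c in counts.items()}

def bks_fuse(table, row, prior=0.5):
    # Unseen patterns fall back to a prior; the paper instead fills missing
    # entries with generative models trained on the available data.
    return table.get(tuple(row), prior)

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 500)                        # synthetic ground truth
outputs = (rng.random((500, 3)) < (0.2 + 0.6 * y[:, None])).astype(int)
table = train_bks(outputs, y)
print(bks_fuse(table, [1, 1, 0]))                  # fused tampering probability
```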
A local approach for focussed Bayesian fusion
NASA Astrophysics Data System (ADS)
Sander, Jennifer; Heizmann, Michael; Goussev, Igor; Beyerer, Jürgen
2009-04-01
Local Bayesian fusion approaches aim to reduce the high storage and computational costs of Bayesian fusion while remaining free of fixed modeling assumptions. Using the small-world formalism, we argue why this procedure conforms to Bayesian theory. Then, we concentrate on the realization of local Bayesian fusion by focussing the fusion process solely on local regions that are task relevant with high probability. The resulting local models then correspond to restricted versions of the original one. In a previous publication, we used bounds for the probability of misleading evidence to show the validity of the pre-evaluation of task-specific knowledge and prior information that we perform to build local models. In this paper, we prove the validity of this procedure using information-theoretic arguments. For additional efficiency, local Bayesian fusion can be realized in a distributed manner, with several local Bayesian fusion tasks evaluated and unified after the actual fusion process. Software agents are well suited to the practical realization of distributed local Bayesian fusion. There is a natural analogy between the resulting agent-based architecture and criminal investigations in real life. We show how this analogy can be used to further improve the efficiency of distributed local Bayesian fusion. Using a landscape model, we present an experimental study of distributed local Bayesian fusion in the field of reconnaissance, which highlights its high potential.
Adaptive Multi-sensor Data Fusion Model for In-situ Exploration of Mars
NASA Astrophysics Data System (ADS)
Schneiderman, T.; Sobron, P.
2014-12-01
Laser Raman spectroscopy (LRS) and laser-induced breakdown spectroscopy (LIBS) can be used synergistically to characterize the geochemistry and mineralogy of potential microbial habitats and biosignatures. The value of LRS and LIBS has been recognized by the planetary science community: (i) NASA's Mars2020 mission features a combined LRS-LIBS instrument, SuperCam, and an LRS instrument, SHERLOC; (ii) an LRS instrument, RLS, will fly on ESA's 2018 ExoMars mission. The advantages of combining LRS and LIBS are evident: (1) LRS/LIBS can share hardware components; (2) LIBS reveals the relative concentration of major (and often trace) elements present in a sample; and (3) LRS yields information on the individual mineral species and their chemical/structural nature. Combining data from LRS and LIBS enables definitive mineral phase identification with precise chemical characterization of major, minor, and trace mineral species. New approaches to data processing are needed to analyze large amounts of LRS+LIBS data efficiently and maximize the scientific return of integrated measurements. Multi-sensor data fusion (MSDF) is a method that allows for robust sample identification through automated acquisition, processing, and combination of data. It optimizes information usage, yielding a more robust characterization of a target than could be acquired through single sensor use. We have developed a prototype fuzzy logic adaptive MSDF model aimed towards the unsupervised characterization of Martian habitats and their biosignatures using LRS and LIBS datasets. Our model also incorporates fusion of microimaging (MI) data - critical for placing analyses in geological and spatial context. Here, we discuss the performance of our novel MSDF model and demonstrate that automated quantification of the salt abundance in sulfate/clay/phyllosilicate mixtures is possible through data fusion of collocated LRS, LIBS, and MI data.
On the relativistic field theory model of the deuteron II
NASA Astrophysics Data System (ADS)
Ivanov, A. N.; Troitskaya, N. I.; Faber, M.; Oberhummer, H.
1997-02-01
The relativistic field theory model of the deuteron suggested previously is revised and applied to the calculation of the cross sections of the low-energy radiative neutron-proton capture n + p -> D + γ and the low-energy two-proton fusion p + p -> D + e+ + νe. For the low-energy radiative neutron-proton capture n + p -> D + γ our result agrees well with both experimental data and the potential model prediction. In the case of the two-proton fusion the cross section obtained is 2.9 times larger than that given by the potential approach. This result is discussed in connection with the solar neutrino problem.
Improvement of information fusion-based audio steganalysis
NASA Astrophysics Data System (ADS)
Kraetzer, Christian; Dittmann, Jana
2010-01-01
In this paper we extend an existing information-fusion-based audio steganalysis approach by three different kinds of evaluations. The first addresses the so-far neglected evaluation of sensor-level fusion. Our results show that this fusion removes content dependency while achieving classification rates similar to those of single classifiers (especially for the considered global features) on the three exemplarily tested audio data hiding algorithms. The second evaluation extends the observations on fusion from considering only segmental features to combinations of segmental and global features, resulting in a reduction of the required computational complexity for testing by about two orders of magnitude while maintaining the same degree of accuracy. The third evaluation tries to build a basis for estimating the plausibility of the introduced steganalysis approach by measuring the sensitivity of the models used in supervised classification of steganographic material against typical signal modification operations like de-noising or 128 kbit/s MP3 encoding. Our results show that for some of the tested classifiers the probability of false alarms rises dramatically after such modifications.
Beutler, William J; Peppelman, Walter C; DiMarco, Luciano A
2013-02-15
Technique development to use the da Vinci Robotic Surgical System for anterior lumbar interbody fusion at L5-S1 is detailed. A case report is also presented. To evaluate and develop the da Vinci robot-assisted laparoscopic anterior lumbar stand-alone interbody fusion procedure. Anterior lumbar interbody fusion is a common procedure associated with potential morbidity related to the surgical approach. The da Vinci robot provides intra-abdominal dissection and visualization advantages compared with the traditional open and laparoscopic approach. The surgical techniques for approach to the anterior lumbar spine using the da Vinci robot were developed and modified progressively, beginning with operative models followed by placement of an interbody fusion cage in the living porcine model. Development continued with placement of a fusion cage in a human cadaver, completed first in the laboratory setting and then in the operating room. Finally, the first patient with fusion completed using the da Vinci robot-assisted approach is presented. The anterior transperitoneal approach to the lumbar spine is accomplished with enhanced visualization and dissection capability, with maintenance of pneumoperitoneum, using the da Vinci robot. Blood loss is minimal. The visualization inside the disc space and surrounding structures was considered better than with current open and laparoscopic techniques. The da Vinci Surgical System technique continues to develop and is now described for the transperitoneal approach to the anterior lumbar spine. Level of evidence: 4.
Adali, Tülay; Levin-Schwartz, Yuri; Calhoun, Vince D.
2015-01-01
Fusion of information from multiple sets of data in order to extract a set of features that are most useful and relevant for the given task is inherent to many problems we deal with today. Since, usually, very little is known about the actual interaction among the datasets, it is highly desirable to minimize the underlying assumptions. This has been the main reason for the growing importance of data-driven methods, and in particular of independent component analysis (ICA), as it provides useful decompositions with a simple generative model and using only the assumption of statistical independence. A recent extension of ICA, independent vector analysis (IVA), generalizes ICA to multiple datasets by exploiting the statistical dependence across the datasets, and hence, as we discuss in this paper, provides an attractive solution to fusion of data from multiple datasets along with ICA. In this paper, we focus on two multivariate solutions for multi-modal data fusion that let multiple modalities fully interact for the estimation of underlying features that jointly report on all modalities. One solution is the Joint ICA model that has found wide application in medical imaging, and the second is the Transposed IVA model introduced here as a generalization of an approach based on multi-set canonical correlation analysis. In the discussion, we emphasize the role of diversity in the decompositions achieved by these two models, and present their properties and implementation details to enable the user to make informed decisions on the selection of a model along with its associated parameters. Discussions are supported by simulation results to help highlight the main issues in the implementation of these methods. PMID:26525830
Track classification within wireless sensor network
NASA Astrophysics Data System (ADS)
Doumerc, Robin; Pannetier, Benjamin; Moras, Julien; Dezert, Jean; Canevet, Loic
2017-05-01
In this paper, we present our study on track classification by taking into account environmental information and target estimated states. The tracker uses several motion models adapted to different target dynamics (pedestrian, ground vehicle and SUAV, i.e. small unmanned aerial vehicle) and works in a centralized architecture. The main idea is to explore both the classification given by heterogeneous sensors and the classification obtained with our fusion module. The fusion module, presented in this paper, assigns a class to each track according to track location, velocity and associated uncertainty. To model the likelihood on each class, a fuzzy approach is used considering constraints on the target's capability to move in the environment. Then the evidential reasoning approach based on Dempster-Shafer Theory (DST) is used to perform a time integration of this classifier output. The fusion rules are tested and compared on real data obtained with our wireless sensor network. In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deployed in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of this system is evaluated in a real exercise for intelligence operation ("hunter hunt" scenario).
Double Cluster Heads Model for Secure and Accurate Data Fusion in Wireless Sensor Networks
Fu, Jun-Song; Liu, Yun
2015-01-01
Secure and accurate data fusion is an important issue in wireless sensor networks (WSNs) and has been extensively researched in the literature. In this paper, by combining clustering techniques, reputation and trust systems, and data fusion algorithms, we propose a novel cluster-based data fusion model called Double Cluster Heads Model (DCHM) for secure and accurate data fusion in WSNs. Different from traditional clustering models in WSNs, two cluster heads are selected after clustering for each cluster based on the reputation and trust system and they perform data fusion independently of each other. Then, the results are sent to the base station where the dissimilarity coefficient is computed. If the dissimilarity coefficient of the two data fusion results exceeds the threshold preset by the users, the cluster heads will be added to blacklist, and the cluster heads must be reelected by the sensor nodes in a cluster. Meanwhile, feedback is sent from the base station to the reputation and trust system, which can help us to identify and delete the compromised sensor nodes in time. Through a series of extensive simulations, we found that the DCHM performed very well in data fusion security and accuracy. PMID:25608211
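The base-station consistency check at the heart of DCHM can be sketched as follows. This is not the paper's protocol: the simple mean fusion, the relative-difference dissimilarity coefficient, and the threshold value are all assumptions standing in for whatever the deployed WSN would actually use.

```python
# Hedged sketch of the double-cluster-head consistency check in DCHM.
import numpy as np

def fuse(readings):
    # Placeholder fusion rule for one cluster head; the real model uses its
    # own data fusion algorithm over trusted member reports.
    return float(np.mean(readings))

def base_station_check(fused_a, fused_b, threshold=0.1, blacklist=None):
    blacklist = blacklist if blacklist is not None else set()
    dissimilarity = abs(fused_a - fused_b) / max(abs(fused_a), abs(fused_b), 1e-9)
    if dissimilarity > threshold:
        blacklist.update({"head_A", "head_B"})     # trigger re-election of both heads
        accepted = None
    else:
        accepted = 0.5 * (fused_a + fused_b)       # accept the agreed value
    return accepted, dissimilarity, blacklist

readings = [20.1, 19.8, 20.3, 35.0]                # one possibly compromised report
a = fuse(readings)                                 # head A fuses all reports
b = fuse(readings[:-1])                            # head B happens to exclude the outlier
print(base_station_check(a, b))
```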
Heideklang, René; Shokouhi, Parisa
2016-01-01
This article focuses on the fusion of flaw indications from multi-sensor nondestructive materials testing. Because each testing method makes use of a different physical principle, a multi-method approach has the potential of effectively differentiating actual defect indications from the many false alarms, thus enhancing detection reliability. In this study, we propose a new technique for aggregating scattered two- or three-dimensional sensory data. Using a density-based approach, the proposed method explicitly addresses localization uncertainties such as registration errors. This feature marks one of the major advantages of this approach over pixel-based image fusion techniques. We provide guidelines on how to set all the key parameters and demonstrate the technique’s robustness. Finally, we apply our fusion approach to experimental data and demonstrate its capability to locate small defects by substantially reducing false alarms under conditions where no single-sensor method is adequate. PMID:26784200
Semantic Indexing of Multimedia Content Using Visual, Audio, and Text Cues
NASA Astrophysics Data System (ADS)
Adams, W. H.; Iyengar, Giridharan; Lin, Ching-Yung; Naphade, Milind Ramesh; Neti, Chalapathy; Nock, Harriet J.; Smith, John R.
2003-12-01
We present a learning-based approach to the semantic indexing of multimedia content using cues derived from audio, visual, and text features. We approach the problem by developing a set of statistical models for a predefined lexicon. Novel concepts are then mapped in terms of the concepts in the lexicon. To achieve robust detection of concepts, we exploit features from multiple modalities, namely, audio, video, and text. Concept representations are modeled using Gaussian mixture models (GMM), hidden Markov models (HMM), and support vector machines (SVM). Models such as Bayesian networks and SVMs are used in a late-fusion approach to model concepts that are not explicitly modeled in terms of features. Our experiments indicate promise in the proposed classification and fusion methodologies: our proposed fusion scheme achieves more than 10% relative improvement over the best unimodal concept detector.
Depth and thermal sensor fusion to enhance 3D thermographic reconstruction.
Cao, Yanpeng; Xu, Baobei; Ye, Zhangyu; Yang, Jiangxin; Cao, Yanlong; Tisse, Christel-Loic; Li, Xin
2018-04-02
Three-dimensional geometrical models with incorporated surface temperature data provide important information for various applications such as medical imaging, energy auditing, and intelligent robots. In this paper we present a robust method for mobile and real-time 3D thermographic reconstruction through depth and thermal sensor fusion. A multimodal imaging device consisting of a thermal camera and an RGB-D sensor is calibrated geometrically and used for data capturing. Based on the underlying principle that temperature information remains robust against illumination and viewpoint changes, we present a Thermal-guided Iterative Closest Point (T-ICP) methodology to facilitate reliable 3D thermal scanning applications. The pose of the sensing device is initially estimated using correspondences found through maximizing the thermal consistency between consecutive infrared images. The coarse pose estimate is further refined by finding the motion parameters that minimize a combined geometric and thermographic loss function. Experimental results demonstrate that complementary information captured by multimodal sensors can be utilized to improve the performance of 3D thermographic reconstruction. Through effective fusion of thermal and depth data, the proposed approach generates more accurate 3D thermal models using significantly less scanning data.
A 3D Scan Model and Thermal Image Data Fusion Algorithms for 3D Thermography in Medicine
Klima, Ondrej
2017-01-01
Objectives: At present, medical thermal imaging is still considered a merely qualitative tool, enabling us to distinguish between, but not to quantify, the physiological and nonphysiological states of the body. Such a capability would, however, facilitate solving the problem of medical quantification, whose presence currently manifests itself within the entire healthcare system. Methods: A generally applicable method to enhance captured 3D spatial data carrying temperature-related information is presented; in this context, all equations required for other data fusions are derived. The method can be utilized for high-density point clouds or detailed meshes at high resolution but is also conveniently usable for large objects with sparse points. Results: The benefits of the approach are experimentally demonstrated on 3D thermal scans of injured subjects. We obtained diagnostic information inaccessible via traditional methods. Conclusion: Using 3D model and thermal image data fusion allows the quantification of inflammation, facilitating more precise injury and illness diagnostics or monitoring. The technique offers wide application potential in medicine and multiple technological domains, including electrical and mechanical engineering. PMID:29250306
NASA Astrophysics Data System (ADS)
Grover, Neha; Sandhu, Kirandeep; Sharma, Manoj K.
2018-06-01
The dynamics of 17F + 58Ni reaction induced via a loosely bound projectile (17F) is examined using the collective clusterization approach of the dynamical cluster decay model (DCM) with respect to the recent experimental data available at beam energies Ebeam = 54.1 and 58.5 MeV. The calculations are done for quadrupole deformations of fragments using the optimum orientation approach. In view of the loosely bound nature of 17F, the main focus of the present work is on the comparison of complete and incomplete fusion. It is studied using various components such as fragmentation potential, mass distribution, and barrier modification. Different decay modes (ER, IMF, HMF, and fission) are also compared to determine the complete fusion and incomplete fusion paths. Additionally, the decay paths of the nucleus formed from loosely bound (17F) and tightly bound (16O) projectiles are compared. Furthermore, the role of temperature-dependent pairing strength is analyzed in terms of the binary fragmentation of the compound system formed.
Review of 3d GIS Data Fusion Methods and Progress
NASA Astrophysics Data System (ADS)
Hua, Wei; Hou, Miaole; Hu, Yungang
2018-04-01
3D data fusion is a research hotspot in the fields of computer vision and fine mapping, and plays an important role in fine measurement, risk monitoring, data display, and other processes. At present, research on 3D data fusion in the field of surveying and mapping focuses on the fusion of 3D models of terrain and ground objects. This paper summarizes the basic methods of 3D data fusion for terrain and ground objects developed in recent years, classifies the data structures and methods for establishing 3D models, and analyses and comments on some of the most widely used fusion methods.
Integration of language and sensor information
NASA Astrophysics Data System (ADS)
Perlovsky, Leonid I.; Weijers, Bertus
2003-04-01
The talk describes the development of basic technologies of intelligent systems fusing data from multiple domains and leading to automated computational techniques for understanding data contents. Understanding involves inferring appropriate decisions and recommending proper actions, which in turn requires fusion of data and knowledge about objects, situations, and actions. Data might include sensory data, verbal reports, intelligence intercepts, or public records, whereas knowledge ought to encompass the whole range of objects, situations, people and their behavior, and knowledge of languages. In the past, a fundamental difficulty in combining knowledge with data was the combinatorial complexity of computations: too many combinations of data and knowledge pieces had to be evaluated. Recent progress in the understanding of natural intelligent systems, including the human mind, has led to the development of neurophysiologically motivated architectures for solving these challenging problems, in particular through the role of emotional neural signals in overcoming the combinatorial complexity of old logic-based approaches. Whereas past approaches based on logic tended to identify logic with language and thinking, recent studies in cognitive linguistics have led to an appreciation of the more complicated nature of linguistic models. Little is known about the details of the brain mechanisms integrating language and thinking. Understanding and fusion of linguistic information with sensory data represent a novel challenging aspect of the development of integrated fusion systems. The presentation will describe a non-combinatorial approach to this problem and outline techniques that can be used for fusing diverse and uncertain knowledge with sensory and linguistic data.
Verma, Gyanendra K; Tiwary, Uma Shanker
2014-11-15
The purpose of this paper is twofold: (i) to investigate the emotion representation models and explore the possibility of a model with a minimum number of continuous dimensions and (ii) to recognize and predict emotion from the measured physiological signals using a multiresolution approach. The multimodal physiological signals are: Electroencephalogram (EEG) (32 channels) and peripheral (8 channels: Galvanic skin response (GSR), blood volume pressure, respiration pattern, skin temperature, electromyogram (EMG) and electrooculogram (EOG)) as given in the DEAP database. We have discussed the theories of emotion modeling based on (i) basic emotions, (ii) the cognitive appraisal and physiological response approach and (iii) the dimensional approach, and have proposed a three-continuous-dimension representation model for emotions. A clustering experiment on the given valence, arousal and dominance values of various emotions has been done to validate the proposed model. A novel approach for multimodal fusion of information from a large number of channels to classify and predict emotions has also been proposed. The Discrete Wavelet Transform, a classical transform for multiresolution analysis of signals, has been used in this study. Experiments are performed to classify the different emotions using four classifiers. The average accuracies are 81.45%, 74.37%, 57.74% and 75.94% for SVM, MLP, KNN and MMC classifiers respectively. The best accuracy is for 'Depressing' with 85.46% using SVM. The 32 EEG channels are considered as independent modes, and features from each channel are given equal importance. Some of the channel data may be correlated, but they may also contain supplementary information. In comparison with the results given by others, the high accuracy of 85% with 13 emotions and 32 subjects from our proposed method clearly proves the potential of our multimodal fusion approach. Copyright © 2013 Elsevier Inc. All rights reserved.
Discovering and understanding oncogenic gene fusions through data intensive computational approaches
Latysheva, Natasha S.; Babu, M. Madan
2016-01-01
Although gene fusions have been recognized as important drivers of cancer for decades, our understanding of the prevalence and function of gene fusions has been revolutionized by the rise of next-generation sequencing, advances in bioinformatics theory and an increasing capacity for large-scale computational biology. The computational work on gene fusions has been vastly diverse, and the present state of the literature is fragmented. It will be fruitful to merge three camps of gene fusion bioinformatics that appear to rarely cross over: (i) data-intensive computational work characterizing the molecular biology of gene fusions; (ii) development research on fusion detection tools, candidate fusion prioritization algorithms and dedicated fusion databases and (iii) clinical research that seeks to either therapeutically target fusion transcripts and proteins or leverages advances in detection tools to perform large-scale surveys of gene fusion landscapes in specific cancer types. In this review, we unify these different—yet highly complementary and symbiotic—approaches with the view that increased synergy will catalyze advancements in gene fusion identification, characterization and significance evaluation. PMID:27105842
Sensitivity of the fusion cross section to the density dependence of the symmetry energy
NASA Astrophysics Data System (ADS)
Reinhard, P.-G.; Umar, A. S.; Stevenson, P. D.; Piekarewicz, J.; Oberacker, V. E.; Maruhn, J. A.
2016-04-01
Background: The study of the nuclear equation of state (EOS) and the behavior of nuclear matter under extreme conditions is crucial to our understanding of many nuclear and astrophysical phenomena. Nuclear reactions serve as one of the means for studying the EOS. Purpose: It is the aim of this paper to discuss the impact of nuclear fusion on the EOS. This is a timely subject given the expected availability of increasingly exotic beams at rare isotope facilities [A. B. Balantekin et al., Mod. Phys. Lett. A 29, 1430010 (2014), 10.1142/S0217732314300109]. In practice, we focus on 48Ca+48Ca fusion. Method: We employ three different approaches to calculate fusion cross sections for a set of energy density functionals with systematically varying nuclear matter properties. Fusion calculations are performed using frozen densities, using a dynamic microscopic method based on the density-constrained time-dependent Hartree-Fock (DC-TDHF) approach, as well as a direct TDHF study of above-barrier cross sections. For these studies, we employ a family of Skyrme parametrizations with systematically varied nuclear matter properties. Results: The folding-potential model provides a reasonable first estimate of cross sections. DC-TDHF, which includes dynamical polarization, reduces the fusion barriers and delivers much better cross sections. Full TDHF near the barrier agrees nicely with DC-TDHF. Most of the Skyrme forces which we used deliver, on the average, fusion cross sections in good agreement with the data. Trying to read off a trend in the results, we find a slight preference for forces which deliver a slope of symmetry energy of L ≈ 50 MeV that corresponds to a neutron-skin thickness of 48Ca of Rskin = (0.180-0.210) fm. Conclusions: Fusion reactions in the barrier and sub-barrier region can be a tool to study the EOS and the neutron skin of nuclei. The success of the approach will depend on reduced experimental uncertainties of fusion data as well as the development of fusion theories that closely couple to the microscopic structure and dynamics.
Matrix approaches to assess terrestrial nitrogen scheme in CLM4.5
NASA Astrophysics Data System (ADS)
Du, Z.
2017-12-01
Terrestrial carbon (C) and nitrogen (N) cycles have been commonly represented by a series of balance equations to track their influxes into and effluxes out of individual pools in earth system models (ESMs). This representation matches our understanding of C and N cycle processes well but makes it difficult to track model behaviors. To overcome these challenges, we developed a matrix approach, which reorganizes the series of terrestrial C and N balance equations in the CLM4.5 into two matrix equations based on original representation of C and N cycle processes and mechanisms. The matrix approach would consequently help improve the comparability of models and data, evaluate impacts of additional model components, facilitate benchmark analyses, model intercomparisons, and data-model fusion, and improve model predictive power.
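As a rough, generic illustration of what such a matrix reorganization can look like (not the CLM4.5 equations themselves, whose pool structure and coefficients are not reproduced here), a pool-based carbon balance can be collected into a single matrix equation:

```latex
% Hedged sketch of a pool-based carbon balance rewritten in matrix form; the
% symbols are generic and do not reproduce the CLM4.5 implementation.
\begin{equation}
  \frac{d\mathbf{X}(t)}{dt}
    = \mathbf{B}\,u(t) + \mathbf{A}\,\boldsymbol{\xi}(t)\,\mathbf{K}\,\mathbf{X}(t),
\end{equation}
% X: vector of pool sizes; u: carbon input (e.g. NPP); B: allocation vector;
% K: diagonal matrix of baseline turnover rates; A: inter-pool transfer matrix;
% xi: diagonal matrix of environmental scalars.
```

Tracking model behavior then reduces to inspecting the matrices (e.g., transfer coefficients and effective turnover times) rather than tracing many separate balance equations, which is the comparability benefit the abstract points to.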
Temporal Data Fusion Approaches to Remote Sensing-Based Wetland Classification
NASA Astrophysics Data System (ADS)
Montgomery, Joshua S. M.
This thesis investigates the ecology and classification of wetlands in prairie and boreal environments of Alberta, Canada, using remote sensing technology to enhance classification of wetlands in the province. The objectives of the thesis are divided into two case studies: 1) examining how satellite-borne Synthetic Aperture Radar (SAR) and optical (RapidEye & SPOT) data can be used to evaluate surface water trends in a prairie pothole environment (Shepard Slough); and 2) investigating a data fusion methodology combining SAR, optical and Lidar data to characterize wetland vegetation and surface water attributes in a boreal environment (Utikuma Regional Study Area (URSA)). Surface water extent and hydroperiod products were derived from SAR data and validated using optical imagery with high accuracies (76-97% overall) for both case studies. High-resolution Lidar Digital Elevation Model (DEM), Digital Surface Model (DSM), and Canopy Height Model (CHM) products provided the means for data fusion to extract riparian vegetation communities and surface water, producing model accuracies of R2 = 0.90 for URSA and RMSEs of 0.2 m to 0.7 m at Shepard Slough when compared to field and optical validation data. Integrating the Alberta and Canadian wetland classification systems, used to classify wetlands and determine their economic value, into the methodology produced thematic maps relevant to policy and decision makers for potential wetland monitoring and policy development.
Forecasting influenza in Hong Kong with Google search queries and statistical model fusion.
Xu, Qinneng; Gel, Yulia R; Ramirez Ramirez, L Leticia; Nezafati, Kusha; Zhang, Qingpeng; Tsui, Kwok-Leung
2017-01-01
The objective of this study is to investigate the predictive utility of online social media and web search queries, particularly Google search data, to forecast new cases of influenza-like-illness (ILI) in general outpatient clinics (GOPC) in Hong Kong. To mitigate the impact of sensitivity to self-excitement (i.e., fickle media interest) and other artifacts of online social media data, in our approach we fuse multiple offline and online data sources. Four individual models, namely a generalized linear model (GLM), least absolute shrinkage and selection operator (LASSO), autoregressive integrated moving average (ARIMA), and deep learning (DL) with Feedforward Neural Networks (FNN), are employed to forecast ILI-GOPC both one week and two weeks in advance. The covariates include Google search queries, meteorological data, and previously recorded offline ILI. To our knowledge, this is the first study that introduces deep learning methodology into surveillance of infectious diseases and investigates its predictive utility. Furthermore, to exploit the strengths of each individual forecasting model, we use statistical model fusion via Bayesian model averaging (BMA), which allows a systematic integration of multiple forecast scenarios. For each model, an adaptive approach is used to capture the recent relationship between ILI and covariates. DL with FNN appears to deliver the most competitive predictive performance among the four considered individual models. Combining all four models in a comprehensive BMA framework further improves such predictive evaluation metrics as root mean squared error (RMSE) and mean absolute predictive error (MAPE). Nevertheless, DL with FNN remains the preferred method for predicting locations of influenza peaks. The proposed approach can be viewed as a feasible alternative for forecasting ILI in Hong Kong or other countries where ILI has no constant seasonal trend and influenza data resources are limited. The proposed methodology is easily tractable and computationally efficient.
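A minimal sketch of the model-fusion step is shown below. It approximates BMA weights from in-sample Gaussian likelihoods (a BIC-style shortcut) rather than reproducing the paper's full BMA implementation; the model names, observations, and predictions are placeholders.

```python
# Hedged sketch: weighting several ILI forecasts in a BMA-like way and forming
# a fused forecast. Not the paper's implementation.
import numpy as np

def bma_weights(in_sample_preds, observed):
    """in_sample_preds: dict name -> array of in-sample predictions."""
    logliks = {}
    for name, pred in in_sample_preds.items():
        resid = observed - pred
        sigma2 = max(resid.var(), 1e-12)
        logliks[name] = -0.5 * len(resid) * np.log(sigma2)   # Gaussian log-likelihood (up to consts)
    m = max(logliks.values())
    w = {k: np.exp(v - m) for k, v in logliks.items()}       # normalize stably
    z = sum(w.values())
    return {k: v / z for k, v in w.items()}

def bma_forecast(next_preds, weights):
    return sum(weights[k] * next_preds[k] for k in next_preds)

obs = np.array([12, 15, 14, 20, 22, 19], dtype=float)        # weekly ILI counts (toy)
in_sample = {"GLM": obs + 1.5, "ARIMA": obs - 0.5, "DL": obs + 0.2}
w = bma_weights(in_sample, obs)
print(w, bma_forecast({"GLM": 25.0, "ARIMA": 23.5, "DL": 24.2}, w))
```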
NASA Astrophysics Data System (ADS)
Hannachi, Ammar; Kohler, Sophie; Lallement, Alex; Hirsch, Ernest
2015-04-01
3D modeling of scene contents is of increasing importance for many computer vision based applications. In particular, industrial applications of computer vision require efficient tools for the computation of this 3D information. Stereo vision is routinely used as a powerful technique to obtain the 3D outline of imaged objects from the corresponding 2D images; as a consequence, however, this approach provides only a poor and partial description of the scene contents. On the other hand, with structured-light-based reconstruction techniques, the 3D surfaces of imaged objects can often be computed with high accuracy, but the resulting active range data fail to characterize the object edges. Thus, in order to benefit from the strengths of the various acquisition techniques, we introduce in this paper promising approaches enabling complete 3D reconstruction based on the cooperation of two complementary acquisition and processing techniques, in our case stereoscopic and structured-light-based methods, providing two 3D data sets describing respectively the outlines and surfaces of the imaged objects. We present, accordingly, the principles of three fusion techniques and their comparison based on evaluation criteria related to the nature of the workpiece and the type of the tackled application. The proposed fusion methods rely on geometric characteristics of the workpiece, which favour the quality of the registration. Further, the results obtained demonstrate that the developed approaches are well adapted for 3D modeling of manufactured parts including free-form surfaces and, consequently, for quality control applications using these 3D reconstructions.
Analyzing Human-Landscape Interactions: Tools That Integrate
NASA Astrophysics Data System (ADS)
Zvoleff, Alex; An, Li
2014-01-01
Humans have transformed much of Earth's land surface, giving rise to loss of biodiversity, climate change, and a host of other environmental issues that are affecting human and biophysical systems in unexpected ways. To confront these problems, environmental managers must consider human and landscape systems in integrated ways. This means making use of data obtained from a broad range of methods (e.g., sensors, surveys), while taking into account new findings from the social and biophysical science literatures. New integrative methods (including data fusion, simulation modeling, and participatory approaches) have emerged in recent years to address these challenges, and to allow analysts to provide information that links qualitative and quantitative elements for policymakers. This paper brings attention to these emergent tools while providing an overview of the tools currently in use for analysis of human-landscape interactions. Analysts are now faced with a staggering array of approaches in the human-landscape literature; in an attempt to bring increased clarity to the field, we identify the relative strengths of each tool and provide guidance to analysts on the areas to which each tool is best applied. We discuss four broad categories of tools: statistical methods (including survival analysis, multi-level modeling, and Bayesian approaches), GIS and spatial analysis methods, simulation approaches (including cellular automata, agent-based modeling, and participatory modeling), and mixed-method techniques (such as alternative futures modeling and integrated assessment). For each tool, we offer an example from the literature of its application in human-landscape research. Among these tools, participatory approaches are gaining prominence for analysts to make the broadest possible array of information available to researchers, environmental managers, and policymakers. Further development of new approaches to data fusion and integration across sites or disciplines poses an important challenge for future work in integrating human and landscape components.
Fusion yield: Guderley model and Tsallis statistics
NASA Astrophysics Data System (ADS)
Haubold, H. J.; Kumar, D.
2011-02-01
The reaction rate probability integral is extended from the Maxwell-Boltzmann approach to a more general approach by using the pathway model introduced by Mathai in 2005 (A pathway to matrix-variate gamma and normal densities. Linear Algebr. Appl. 396, 317-328). The extended thermonuclear reaction rate is obtained in closed form via a Meijer G-function, and the resulting G-function is represented as a solution of a homogeneous linear differential equation. A physical model for the hydrodynamical process in a fusion plasma compressed by a laser-driven spherical shock wave is used to evaluate the fusion energy integral by integrating the extended thermonuclear reaction rate over the temperature. The result obtained is compared with the standard fusion yield obtained by Haubold and John in 1981 (Analytical representation of the thermonuclear reaction rate and fusion energy production in a spherical plasma shock wave. Plasma Phys. 23, 399-411). An interpretation for the pathway parameter is also given.
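As a schematic illustration of the kind of extension described (with generic notation that does not necessarily match the paper's, and with the usual power-of-energy prefactors suppressed), the nonresonant Maxwell-Boltzmann rate integral and a pathway-type generalization may be written as:

```latex
% Hedged sketch: standard nonresonant thermonuclear rate integral and a
% pathway-type generalization; notation is generic, not the paper's.
\begin{align}
  I_{\mathrm{MB}}(z)
    &= \int_0^{\infty} e^{-y}\, e^{-z\,y^{-1/2}}\, dy, \\
  I_{\alpha}(z)
    &= \int_0^{\infty} \bigl[1 + (\alpha - 1)\,y\bigr]^{-\frac{1}{\alpha-1}}
       e^{-z\,y^{-1/2}}\, dy,
  \qquad \lim_{\alpha \to 1} I_{\alpha}(z) = I_{\mathrm{MB}}(z),
\end{align}
% y = E/kT is the reduced energy and z collects the Gamow (Coulomb-barrier)
% factor; the pathway parameter alpha deforms the Maxwell-Boltzmann weight.
```

The α → 1 limit recovers the Maxwell-Boltzmann case, which is how the pathway parameter interpolates between the standard and extended rates.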
Wu, Mingquan; Li, Hua; Huang, Wenjiang; Niu, Zheng; Wang, Changyao
2015-08-01
There is a shortage of daily high-spatial-resolution land surface temperature (LST) data for use in high spatial and temporal resolution environmental process monitoring. To address this shortage, this work used the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM), Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM), and the Spatial and Temporal Data Fusion Approach (STDFA) to estimate high spatial and temporal resolution LST by combining Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) LST and Moderate Resolution Imaging Spectroradiometer (MODIS) LST products. The actual ASTER LST products were used to evaluate the precision of the combined LST images using the correlation analysis method. This method was tested and validated in study areas located in Gansu Province, China. The results show that all the models can generate daily synthetic LST images with a high correlation coefficient (r) of 0.92 between the synthetic images and the actual ASTER LST observations. The ESTARFM has the best performance, followed by the STDFA and the STARFM. The models performed better in desert areas than in cropland. The STDFA had better noise immunity than the other two models.
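The evaluation step described here (correlation analysis between synthetic and actual ASTER LST) is straightforward to sketch; the images below are random stand-ins, not ASTER or MODIS data, and the masking rule is an assumption.

```python
# Hedged sketch: Pearson correlation between a fused (synthetic) LST image and
# the actual ASTER LST, as in the paper's correlation analysis.
import numpy as np

def evaluate_fusion(synthetic_lst, actual_lst):
    mask = np.isfinite(synthetic_lst) & np.isfinite(actual_lst)   # skip gaps/clouds
    r = np.corrcoef(synthetic_lst[mask].ravel(), actual_lst[mask].ravel())[0, 1]
    return r

actual = 290.0 + 5.0 * np.random.rand(100, 100)                   # placeholder ASTER LST (K)
synthetic = actual + np.random.normal(0.0, 1.0, actual.shape)     # placeholder fused LST
print(f"r = {evaluate_fusion(synthetic, actual):.3f}")
```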
Weirather, Jason L; Afshar, Pegah Tootoonchi; Clark, Tyson A; Tseng, Elizabeth; Powers, Linda S; Underwood, Jason G; Zabner, Joseph; Korlach, Jonas; Wong, Wing Hung; Au, Kin Fai
2015-10-15
We developed an innovative hybrid sequencing approach, IDP-fusion, to detect fusion genes, determine fusion sites and identify and quantify fusion isoforms. IDP-fusion is the first method to study gene fusion events by integrating Third Generation Sequencing long reads and Second Generation Sequencing short reads. We applied IDP-fusion to PacBio data and Illumina data from the MCF-7 breast cancer cells. Compared with the existing tools, IDP-fusion detects fusion genes at higher precision and a very low false positive rate. The results show that IDP-fusion will be useful for unraveling the complexity of multiple fusion splices and fusion isoforms within tumorigenesis-relevant fusion genes. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Interplay of Laser-Plasma Interactions and Inertial Fusion Hydrodynamics.
Strozzi, D J; Bailey, D S; Michel, P; Divol, L; Sepke, S M; Kerbel, G D; Thomas, C A; Ralph, J E; Moody, J D; Schneider, M B
2017-01-13
The effects of laser-plasma interactions (LPI) on the dynamics of inertial confinement fusion hohlraums are investigated via a new approach that self-consistently couples reduced LPI models into radiation-hydrodynamics numerical codes. The interplay between hydrodynamics and LPI, specifically stimulated Raman scatter and crossed-beam energy transfer (CBET), mostly occurs via momentum and energy deposition into Langmuir and ion acoustic waves. This spatially redistributes energy coupling to the target, which affects the background plasma conditions and thus modifies laser propagation. This model shows reduced CBET and significant laser energy depletion by Langmuir waves, which reduce the discrepancy between modeling and data from hohlraum experiments on wall x-ray emission and capsule implosion shape.
Evaluation of parallel reduction strategies for fusion of sensory information from a robot team
NASA Astrophysics Data System (ADS)
Lyons, Damian M.; Leroy, Joseph
2015-05-01
The advantage of using a team of robots to search or to map an area is that by navigating the robots to different parts of the area, searching or mapping can be completed more quickly. A crucial aspect of the problem is the combination, or fusion, of data from team members to generate an integrated model of the search/mapping area. In prior work we looked at the issue of removing mutual robot views from an integrated point cloud model built from laser and stereo sensors, leading to a cleaner and more accurate model. This paper addresses a further challenge: even with mutual views removed, the stereo data from a team of robots can quickly swamp a WiFi connection. This paper proposes and evaluates a communication and fusion approach based on the parallel reduction operation, where data are combined in a series of steps over increasingly large subsets of the team. Eight different strategies for selecting the subsets are evaluated for bandwidth requirements using three robot missions, each carried out with teams of four Pioneer 3-AT robots. Our results indicate that selecting groups to combine based on similar pose but distant location yields the best results.
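The parallel-reduction structure itself is simple to sketch: point clouds are merged pairwise in log2(N) rounds rather than streaming every cloud to a single node. The pairing-by-adjacent-index rule below is only one possible grouping strategy (the paper evaluates eight), and the concatenation-based merge is a placeholder for the full fusion that also removes mutual views and duplicates.

```python
# Hedged sketch of parallel-reduction fusion of robot point clouds.
import numpy as np

def merge(cloud_a, cloud_b):
    # Placeholder fusion: concatenate points. A real system would also remove
    # mutual-view points and duplicates at this step.
    return np.vstack([cloud_a, cloud_b])

def parallel_reduce(clouds):
    while len(clouds) > 1:
        nxt = []
        for i in range(0, len(clouds) - 1, 2):
            nxt.append(merge(clouds[i], clouds[i + 1]))   # each pair merges in parallel
        if len(clouds) % 2:
            nxt.append(clouds[-1])                        # odd cloud carries over
        clouds = nxt
    return clouds[0]

team_clouds = [np.random.rand(1000, 3) for _ in range(4)]  # e.g. four robots
print(parallel_reduce(team_clouds).shape)
```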
Feature and Score Fusion Based Multiple Classifier Selection for Iris Recognition
Islam, Md. Rabiul
2014-01-01
The aim of this work is to propose a new feature and score fusion based iris recognition approach where voting method on Multiple Classifier Selection technique has been applied. Four Discrete Hidden Markov Model classifiers output, that is, left iris based unimodal system, right iris based unimodal system, left-right iris feature fusion based multimodal system, and left-right iris likelihood ratio score fusion based multimodal system, is combined using voting method to achieve the final recognition result. CASIA-IrisV4 database has been used to measure the performance of the proposed system with various dimensions. Experimental results show the versatility of the proposed system of four different classifiers with various dimensions. Finally, recognition accuracy of the proposed system has been compared with existing N hamming distance score fusion approach proposed by Ma et al., log-likelihood ratio score fusion approach proposed by Schmid et al., and single level feature fusion approach proposed by Hollingsworth et al. PMID:25114676
Feature and score fusion based multiple classifier selection for iris recognition.
Islam, Md Rabiul
2014-01-01
The aim of this work is to propose a new feature and score fusion based iris recognition approach where voting method on Multiple Classifier Selection technique has been applied. Four Discrete Hidden Markov Model classifiers output, that is, left iris based unimodal system, right iris based unimodal system, left-right iris feature fusion based multimodal system, and left-right iris likelihood ratio score fusion based multimodal system, is combined using voting method to achieve the final recognition result. CASIA-IrisV4 database has been used to measure the performance of the proposed system with various dimensions. Experimental results show the versatility of the proposed system of four different classifiers with various dimensions. Finally, recognition accuracy of the proposed system has been compared with existing N hamming distance score fusion approach proposed by Ma et al., log-likelihood ratio score fusion approach proposed by Schmid et al., and single level feature fusion approach proposed by Hollingsworth et al.
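The voting step named in both records above can be sketched in a few lines: four classifier decisions (two unimodal systems, one feature-fusion system, one score-fusion system) are combined by majority vote over candidate identities. The identity labels here are placeholders, not CASIA-IrisV4 subjects.

```python
# Hedged sketch of majority-vote fusion over multiple classifier decisions.
from collections import Counter

def vote(decisions):
    """decisions: list of identity labels, one per classifier."""
    (winner, count), *_ = Counter(decisions).most_common(1)
    return winner, count

decisions = ["subject_17", "subject_17", "subject_05", "subject_17"]
print(vote(decisions))     # -> ('subject_17', 3)
```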
Application of the JDL data fusion process model for cyber security
NASA Astrophysics Data System (ADS)
Giacobe, Nicklaus A.
2010-04-01
A number of cyber security technologies have proposed the use of data fusion to enhance the defensive capabilities of the network and aid in the development of situational awareness for the security analyst. While there have been advances in fusion technologies and the application of fusion in intrusion detection systems (IDSs), in particular, additional progress can be made by gaining a better understanding of a variety of data fusion processes and applying them to the cyber security application domain. This research explores the underlying processes identified in the Joint Directors of Laboratories (JDL) data fusion process model and further describes them in a cyber security context.
NASA Astrophysics Data System (ADS)
Simard, M.; Denbina, M. W.
2017-12-01
Using data collected by NASA's Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR) and Land, Vegetation, and Ice Sensor (LVIS) lidar, we have estimated forest canopy height for a number of study areas in the country of Gabon using a new machine learning data fusion approach. Using multi-baseline polarimetric synthetic aperture radar interferometry (PolInSAR) data collected by UAVSAR, forest heights can be estimated using the random volume over ground model. In the case of multi-baseline UAVSAR data consisting of many repeat passes with spatially separated flight tracks, we can estimate different forest height values for each different image pair, or baseline. In order to choose the best forest height estimate for each pixel, the baselines must be selected or ranked, taking care to avoid baselines with unsuitable spatial separation, or severe temporal decorrelation effects. The current baseline selection algorithms in the literature use basic quality metrics derived from the PolInSAR data which are not necessarily indicative of the true height accuracy in all cases. We have developed a new data fusion technique which treats PolInSAR baseline selection as a supervised classification problem, where the classifier is trained using a sparse sampling of lidar data within the PolInSAR coverage area. The classifier uses a large variety of PolInSAR-derived features as input, including radar backscatter as well as features based on the PolInSAR coherence region shape and the PolInSAR complex coherences. The resulting data fusion method produces forest height estimates which are more accurate than a purely radar-based approach, while having a larger coverage area than the input lidar training data, combining some of the strengths of each sensor. The technique demonstrates the strong potential for forest canopy height and above-ground biomass mapping using fusion of PolInSAR with data from future spaceborne lidar missions such as the upcoming Global Ecosystems Dynamics Investigation (GEDI) lidar.
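A hedged sketch of the baseline-selection-as-classification idea follows. The feature set, the random-forest classifier, and the labeling rule (choose the baseline whose RVoG height best matches the lidar height) are assumptions standing in for the paper's actual method; all arrays are synthetic placeholders rather than UAVSAR or LVIS data.

```python
# Hedged sketch: PolInSAR baseline selection treated as supervised
# classification trained on sparse lidar heights.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

n_pixels, n_baselines, n_feats = 2000, 4, 6
features = np.random.rand(n_pixels, n_baselines * n_feats)   # coherence, backscatter, ... (toy)
rvog_heights = 10 + 20 * np.random.rand(n_pixels, n_baselines)  # per-baseline height estimates
lidar_height = 10 + 20 * np.random.rand(n_pixels)                # LVIS heights (toy)

# Training label: index of the baseline whose height estimate best matches lidar.
best_baseline = np.abs(rvog_heights - lidar_height[:, None]).argmin(axis=1)

train = np.arange(n_pixels) < 500          # lidar only covers part of the scene
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(features[train], best_baseline[train])

chosen = clf.predict(features)             # baseline choice for every pixel
fused_height = rvog_heights[np.arange(n_pixels), chosen]
print(fused_height[:5])
```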
NASA Astrophysics Data System (ADS)
Sabeur, Zoheir; Chakravarthy, Ajay; Bashevoy, Maxim; Modafferi, Stefano
2013-04-01
The rapid increase in environmental observations conducted by Small to Medium Enterprise communities and volunteers using affordable in situ sensors at various scales, in addition to the more established observatories set up by environmental and space agencies using airborne and space-borne sensing technologies, is generating large amounts of big data at ever increasing speeds. Furthermore, with the emergence of Future Internet technologies, the requirement to deploy specific enablers for delivering processed environmental knowledge to citizens in real time with advanced situation awareness has become of paramount importance. Specifically, it is now critical to build and provide services which automate the aggregation of data from various sources while surmounting the semantic gaps, conflicts and heterogeneity in data sources. Early-stage aggregation of data will enable the pre-processing of data from multiple sources while reconciling the temporal gaps in measurement time series and aligning their respective asynchronicities. This low-level data fusion process needs to be automated and chained to more advanced data fusion services specialising in observation forecasts for locations where sensing is not deployed, or for time slices where sensing has not yet taken place. As a result, multi-level fusion services are required among the families of specific enablers for monitoring environments and spaces in the Future Internet. These have been initially deployed and piloted in the ongoing ENVIROFI project of the FI-PPP programme [1]. Automated fusion and modelling of in situ and remote sensing data has been set up, and the experimentation successfully conducted using RBF networks for the spatial fusion of water quality parameter measurements from satellite and stationary buoys in the Irish Sea. The RBF network method scales for the spatial data fusion of multiple types of observation sources. This approach provides a strong basis for the delivery of environmental observations at desired spatial and temporal scales to multiple users with various needs in spatial and temporal resolution. It has also led to building robust Future Internet specific enablers for data fusion, which can be used for multiple usage areas above and beyond the environmental domains of the Future Internet. In this paper, data and processing workflow scenarios are described. The functionalities of the multi-level fusion services are demonstrated and made accessible to the wider communities of the Future Internet. [1] The Environmental Observation Web and its Service Applications within the Future Internet. ENVIROFI IP. FP7-2011-ICT-IF Pr.No: 284898 http://www.envirofi.eu/
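The spatial fusion step described above (interpolating in situ buoy measurements onto a satellite grid with an RBF network) can be illustrated as follows. This is a minimal sketch, not the ENVIROFI implementation: the coordinates, values, the multiquadric kernel, and the equal-weight blend at the end are all assumptions.

```python
# Hedged sketch: RBF-based spatial fusion of buoy measurements with a
# satellite-derived field on a common grid.
import numpy as np
from scipy.interpolate import Rbf

# Stationary buoys: (lon, lat, parameter value) -- illustrative values only
buoy_lon = np.array([-5.2, -4.8, -4.5, -5.0, -4.7])
buoy_lat = np.array([53.1, 53.4, 53.0, 53.6, 53.3])
buoy_val = np.array([1.2, 0.9, 1.5, 0.8, 1.1])

rbf = Rbf(buoy_lon, buoy_lat, buoy_val, function="multiquadric")

# Satellite pixel grid over the same area (toy extent)
grid_lon, grid_lat = np.meshgrid(np.linspace(-5.3, -4.4, 50),
                                 np.linspace(52.9, 53.7, 50))
buoy_field = rbf(grid_lon, grid_lat)            # in-situ field on the satellite grid

satellite_field = 1.0 + 0.3 * np.random.rand(50, 50)   # placeholder satellite product
fused = 0.5 * (buoy_field + satellite_field)            # naive blend; weighting is a design choice
print(fused.shape)
```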
Dynamic changes during acid-induced activation of influenza hemagglutinin
Garcia, Natalie K.; Guttman, Miklos; Ebner, Jamie L.; ...
2015-03-12
Influenza hemagglutinin (HA) mediates virus attachment to host cells and fusion of the viral and endosomal membranes during entry. While high-resolution structures are available for the pre-fusion HA ectodomain and the post-fusion HA2 subunit, the sequence of conformational changes during HA activation has eluded structural characterization. In this paper, we apply hydrogen-deuterium exchange with mass spectrometry to examine changes in structural dynamics of the HA ectodomain at various stages of activation, and compare the soluble ectodomain with intact HA on virions. At pH conditions approaching activation (pH 6.0–5.5) HA exhibits increased dynamics at the fusion peptide and neighboring regions, while the interface between receptor binding subunits (HA1) becomes stabilized. In contrast to many activation models, these data suggest that HA responds to endosomal acidification by releasing the fusion peptide prior to HA1 uncaging and the spring-loaded refolding of HA2. Finally, this staged process may facilitate efficient HA-mediated fusion.
Graph-based Data Modeling and Analysis for Data Fusion in Remote Sensing
NASA Astrophysics Data System (ADS)
Fan, Lei
Hyperspectral imaging provides the capability of increased sensitivity and discrimination over traditional imaging methods by combining standard digital imaging with spectroscopic methods. For each individual pixel in a hyperspectral image (HSI), a continuous spectrum is sampled as the spectral reflectance/radiance signature to facilitate identification of ground cover and surface material. The abundant spectrum knowledge allows all available information from the data to be mined. The superior qualities of hyperspectral imaging enable a wide range of applications such as mineral exploration, agriculture monitoring, and ecological surveillance. The processing of massive high-dimensional HSI datasets is a challenge since many data processing techniques have a computational complexity that grows exponentially with the dimension. In addition, an HSI dataset may contain a limited number of degrees of freedom due to the high correlations between data points and among the spectra. On the other hand, merely taking advantage of the sampled spectrum of an individual HSI data point may produce inaccurate results due to the mixed nature of raw HSI data, such as mixed pixels and optical interference. Fusion strategies are widely adopted in data processing to achieve better performance, especially in the field of classification and clustering. There are mainly three types of fusion strategies, namely low-level data fusion, intermediate-level feature fusion, and high-level decision fusion. Low-level data fusion combines multi-source data that is expected to be complementary or cooperative. Intermediate-level feature fusion aims at selection and combination of features to remove redundant information. Decision-level fusion exploits a set of classifiers to provide more accurate results. The fusion strategies have wide applications including HSI data processing. With the fast development of multiple remote sensing modalities, e.g. Very High Resolution (VHR) optical sensors, LiDAR, etc., fusion of multi-source data can in principle produce more detailed information than each single source. On the other hand, besides the abundant spectral information contained in HSI data, features such as texture and shape may be employed to represent data points from a spatial perspective. Furthermore, feature fusion also includes the strategy of removing redundant and noisy features in the dataset. One of the major problems in machine learning and pattern recognition is to develop appropriate representations for complex nonlinear data. In HSI processing, a particular data point is usually described as a vector with coordinates corresponding to the intensities measured in the spectral bands. This vector representation permits the application of linear and nonlinear transformations with linear algebra to find an alternative representation of the data. More generally, HSI is multi-dimensional in nature and the vector representation may lose the contextual correlations. Tensor representation provides a more sophisticated modeling technique and a higher-order generalization to linear subspace analysis. In graph theory, data points can be generalized as nodes with connectivities measured from the proximity of a local neighborhood. The graph-based framework efficiently characterizes the relationships among the data and allows for convenient mathematical manipulation in many applications, such as data clustering, feature extraction, feature selection and data alignment.
In this thesis, graph-based approaches to multi-source feature and data fusion in the remote sensing area are explored. We mainly investigate the fusion of spatial, spectral and LiDAR information with linear and multilinear algebra under a graph-based framework for data clustering and classification problems.
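The abstract above does not spell out a specific algorithm, so the following is only a minimal sketch of the graph-based, spectral-spatial fusion idea it describes: pixels become graph nodes, low-level fusion concatenates standardized spectral and (here, simply positional) spatial features, and clustering is performed on the resulting k-NN graph. All names and parameter values are illustrative.

```python
# Minimal sketch (not the thesis' actual algorithm): fuse spectral and spatial
# features of hyperspectral pixels into a k-NN graph and cluster its nodes.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import kneighbors_graph
from sklearn.cluster import SpectralClustering

def cluster_hsi(cube, n_clusters=5, k=10):
    """cube: (rows, cols, bands) hyperspectral image, assumed already calibrated."""
    rows, cols, bands = cube.shape
    spectra = cube.reshape(-1, bands)                      # spectral features
    rr, cc = np.meshgrid(np.arange(rows), np.arange(cols), indexing="ij")
    coords = np.stack([rr.ravel(), cc.ravel()], axis=1)    # simple spatial features
    # Low-level fusion: concatenate standardized spectral and spatial features.
    feats = np.hstack([StandardScaler().fit_transform(spectra),
                       StandardScaler().fit_transform(coords)])
    # Graph representation: nodes are pixels, edges connect k nearest neighbours.
    graph = kneighbors_graph(feats, n_neighbors=k, mode="connectivity",
                             include_self=False)
    graph = 0.5 * (graph + graph.T)                        # symmetrize the adjacency
    labels = SpectralClustering(n_clusters=n_clusters, affinity="precomputed",
                                assign_labels="discretize",
                                random_state=0).fit_predict(graph)
    return labels.reshape(rows, cols)

# Example on a small synthetic cube
labels = cluster_hsi(np.random.rand(20, 20, 30), n_clusters=3)
```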
NASA Astrophysics Data System (ADS)
McCullough, Claire L.; Novobilski, Andrew J.; Fesmire, Francis M.
2006-04-01
Faculty from the University of Tennessee at Chattanooga and the University of Tennessee College of Medicine, Chattanooga Unit, have used data mining techniques and neural networks to examine a set of fourteen features, data items, and HUMINT assessments for 2,148 emergency room patients with symptoms possibly indicative of Acute Coronary Syndrome. Specifically, the authors have generated Bayesian networks describing linkages and causality in the data, and have compared them with neural networks. The data includes objective information routinely collected during triage and the physician's initial case assessment, a HUMINT appraisal. Both the neural network and the Bayesian network were used to fuse the disparate types of information with the goal of forecasting thirty-day adverse patient outcome. This paper presents details of the methods of data fusion including both the data mining techniques and the neural network. Results are compared using Receiver Operating Characteristic curves describing the outcomes of both methods, both using only objective features and including the subjective physician's assessment. While preliminary, the results of this continuing study are significant both from the perspective of potential use of the intelligent fusion of biomedical informatics to aid the physician in prescribing treatment necessary to prevent serious adverse outcome from ACS and as a model of fusion of objective data with subjective HUMINT assessment. Possible future work includes extension of successfully demonstrated intelligent fusion methods to other medical applications, and use of decision level fusion to combine results from data mining and neural net approaches for even more accurate outcome prediction.
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1992-01-01
Various papers on control paradigms and data structures in sensor fusion are presented. The general topics addressed include: decision models and computational methods, sensor modeling and data representation, active sensing strategies, geometric planning and visualization, task-driven sensing, motion analysis, models motivated by biology and psychology, decentralized detection and distributed decision, data fusion architectures, robust estimation of shapes and features, application and implementation. Some of the individual subjects considered are: the Firefly experiment on neural networks for distributed sensor data fusion, manifold traversing as a model for learning control of autonomous robots, choice of coordinate systems for multiple sensor fusion, continuous motion using task-directed stereo vision, interactive and cooperative sensing and control for advanced teleoperation, knowledge-based imaging for terrain analysis, physical and digital simulations for IVA robotics.
Fusion of imaging and nonimaging data for surveillance aircraft
NASA Astrophysics Data System (ADS)
Shahbazian, Elisa; Gagnon, Langis; Duquet, Jean Remi; Macieszczak, Maciej; Valin, Pierre
1997-06-01
This paper describes a phased incremental integration approach for application of image analysis and data fusion technologies to provide automated intelligent target tracking and identification for airborne surveillance on board an Aurora Maritime Patrol Aircraft. The sensor suite of the Aurora consists of a radar, an identification friend or foe (IFF) system, an electronic support measures (ESM) system, a spotlight synthetic aperture radar (SSAR), a forward looking infra-red (FLIR) sensor and a Link-11 tactical datalink system. Lockheed Martin Canada (LMCan) is developing a testbed, which will be used to analyze and evaluate approaches for combining the data provided by the existing sensors, which were initially not designed to feed a fusion system. Three concurrent proof-of-concept research activities feed techniques, algorithms and methodology into three sequential phases of integration of this testbed. These activities are: (1) analysis of the fusion architecture (track/contact/hybrid) most appropriate for the type of data available, (2) extraction and fusion of simple features from the imaging data into the fusion system performing automatic target identification, and (3) development of a unique software architecture which will permit integration and independent evolution, enhancement and optimization of various decision aid capabilities, such as multi-sensor data fusion (MSDF), situation and threat assessment (STA) and resource management (RM).
NASA Astrophysics Data System (ADS)
Foucher, Johann; Labrosse, Aurelien; Dervillé, Alexandre; Zimmermann, Yann; Bernard, Guilhem; Martinez, Sergio; Grönqvist, Hanna; Baderot, Julien; Pinzan, Florian
2017-03-01
The development and integration of new materials and structures at the nanoscale require multiple parallel characterizations in order to control mostly physico-chemical properties as a function of the application. Among all properties, we can list physical properties such as size, shape, specific surface area, aspect ratio, agglomeration/aggregation state, size distribution, surface morphology/topography, structure (including crystallinity and defect structure) and solubility, and chemical properties such as structural formula/molecular structure, composition (including degree of purity, known impurities or additives), phase identity, surface chemistry (composition, charge, tension, reactive sites, physical structure, photocatalytic properties, zeta potential) and hydrophilicity/lipophilicity. Depending on the final material formulation (aerosol, powder, nanostructuration…) and the industrial application (semiconductor, cosmetics, chemistry, automotive…), a fleet of complementary characterization equipment must be used in synergy for accurate process tuning and high production yield. This synergy between equipment, so-called hybrid metrology, consists of using the strength of each technique in order to reduce the global uncertainty for better and faster process control. The only way to succeed in this exercise is to use a data fusion methodology. In this paper, we will introduce the work that has been done to create the first generic hybrid metrology software platform dedicated to nanotechnology process control. The first part will be dedicated to process flow modeling related to a fleet of metrology tools. The second part will introduce the concept of an entity model, which describes the various parameters that have to be extracted. The entity model is fed with data analysis as a function of the application (automatic analysis or semi-automated analysis). The final part will introduce two ways of doing data fusion on real data coming from imaging (SEM, TEM, AFM) and non-imaging techniques (SAXS). The first approach is dedicated to high-level fusion, which is the art of combining various populations of results from homogeneous or heterogeneous tools, taking into account the precision and repeatability of each of them, to obtain a new, more accurate result. The second approach is dedicated to deep-level fusion, which is the art of combining raw data from various tools in order to create new raw data. We will introduce a new concept of a virtual tool creator based on deep-level fusion. As a conclusion we will discuss the implementation of hybrid metrology in the semiconductor environment for advanced process control.
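As a rough illustration of the "high-level fusion" idea described above (combining per-tool result populations weighted by their precision and repeatability), one common choice is an inverse-variance weighted mean; the tool names and numbers below are hypothetical, not taken from the paper.

```python
# Illustrative sketch only (not the authors' platform): "high-level" fusion of a
# critical-dimension estimate from several metrology tools, weighting each tool's
# population of results by its precision (inverse variance).
import numpy as np

def high_level_fusion(populations):
    """populations: list of 1-D arrays, one array of repeated measurements per tool."""
    means = np.array([np.mean(p) for p in populations])
    # Variance of each tool's mean (repeatability divided by sample size)
    var_of_means = np.array([np.var(p, ddof=1) / len(p) for p in populations])
    weights = 1.0 / var_of_means
    fused = np.sum(weights * means) / np.sum(weights)
    fused_var = 1.0 / np.sum(weights)          # variance of the fused estimate
    return fused, np.sqrt(fused_var)

# Hypothetical CD measurements (nm) from SEM, AFM and SAXS analyses
sem  = np.array([24.1, 24.3, 23.9, 24.2])
afm  = np.array([24.6, 24.8, 24.5])
saxs = np.array([24.25, 24.30, 24.28, 24.27, 24.31])
print(high_level_fusion([sem, afm, saxs]))
```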
Estimating rice yield from MODIS-Landsat fusion data in Taiwan
NASA Astrophysics Data System (ADS)
Chen, C. R.; Chen, C. F.; Nguyen, S. T.
2017-12-01
Rice production monitoring with remote sensing is an important activity in Taiwan due to official initiatives. Yield estimation is a challenge in Taiwan because rice fields are small and fragmented. High-spatiotemporal-resolution satellite data providing phenological information on rice crops are thus required for this monitoring purpose. This research aims to develop data fusion approaches that integrate daily Moderate Resolution Imaging Spectroradiometer (MODIS) and Landsat data for rice yield estimation in Taiwan. In this study, the low-resolution MODIS LST and emissivity data are used as reference data sources to obtain the high-resolution LST from Landsat data using a mixed-pixel analysis technique, and the time-series EVI data were derived from the fusion of MODIS and Landsat spectral band data using the STARFM method. The simulated LST and EVI results showed close agreement between the values obtained by the proposed methods and the reference data. The rice-yield model was established using EVI and LST data based on information on rice crop phenology collected from 371 ground survey sites across the country in 2014. The results achieved from the fusion datasets, compared with the reference data, indicated a close relationship between the two datasets, with a correlation coefficient (R2) of 0.75 and a root mean square error (RMSE) of 338.7 kg, which was more accurate than the results using the coarse-resolution MODIS LST data (R2 = 0.71 and RMSE = 623.82 kg). For the comparison of total production, 64 towns located in the western part of Taiwan were used. The results also confirmed that the model using fusion datasets produced more accurate results (R2 = 0.95 and RMSE = 1,243 tons) than that using the coarse-resolution MODIS data (R2 = 0.91 and RMSE = 1,749 tons). This study demonstrates the application of MODIS-Landsat fusion data for rice yield estimation at the township level in Taiwan. The results obtained from the methods used in this study could be useful to policymakers, and the methods can be transferred to other regions of the world for rice yield estimation.
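The abstract does not give the functional form of the rice-yield model, so the sketch below only illustrates the general idea of regressing yield on fused EVI and LST predictors; all data are synthetic placeholders.

```python
# Minimal sketch, assuming a table of per-site fused EVI/LST predictors and
# observed yields (the paper's actual model form is not specified here).
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score, mean_squared_error

rng = np.random.default_rng(0)
n_sites = 371                                   # ground survey sites, as in the abstract
evi = rng.uniform(0.3, 0.8, n_sites)            # fused MODIS-Landsat EVI at heading stage
lst = rng.uniform(295.0, 310.0, n_sites)        # fused land surface temperature (K)
yield_kg = 4000 + 6000 * evi - 40 * (lst - 300) + rng.normal(0, 300, n_sites)

X = np.column_stack([evi, lst])
model = LinearRegression().fit(X, yield_kg)
pred = model.predict(X)
print("R2   =", round(r2_score(yield_kg, pred), 3))
print("RMSE =", round(mean_squared_error(yield_kg, pred) ** 0.5, 1), "kg")
```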
A Bio-Inspired Herbal Tea Flavour Assessment Technique
Zakaria, Nur Zawatil Isqi; Masnan, Maz Jamilah; Zakaria, Ammar; Shakaff, Ali Yeon Md
2014-01-01
Herbal-based products are becoming a widespread production trend among manufacturers for the domestic and international markets. As production increases to meet market demand, it is crucial for the manufacturer to ensure that their products meet specific criteria and fulfil the intended quality determined by the quality controller. One famous herbal-based product is herbal tea. This paper investigates bio-inspired flavour assessments in a data fusion framework involving an e-nose and an e-tongue. The objectives are to attain good classification of different types and brands of herbal tea, classification of different flavour masking effects and, finally, classification of different concentrations of herbal tea. Two data fusion levels were employed in this research: low-level data fusion and intermediate-level data fusion. Four classification approaches (LDA, SVM, KNN and PNN) were examined in search of the best classifier to achieve the research objectives. In order to evaluate the classifiers' performance, an error estimator based on k-fold cross-validation and leave-one-out was applied. Classification based on GC-MS TIC data was also included as a comparison to the classification performance of the fusion approaches. Generally, KNN outperformed the other classification techniques for the three flavour assessments in both low-level and intermediate-level data fusion. However, the classification results based on GC-MS TIC data varied. PMID:25010697
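A minimal sketch of the low-level fusion step described above, assuming two pre-extracted feature matrices (one per instrument) and using a KNN classifier with k-fold cross-validation; feature dimensions and labels are invented for illustration.

```python
# Sketch of low-level (data-level) fusion: standardize and concatenate the e-nose
# and e-tongue feature matrices, then evaluate a KNN classifier with 5-fold CV.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_samples = 120
e_nose   = rng.normal(size=(n_samples, 32))     # hypothetical e-nose sensor features
e_tongue = rng.normal(size=(n_samples, 7))      # hypothetical e-tongue features
labels   = rng.integers(0, 3, n_samples)        # e.g. three herbal tea brands

X_low_level = np.hstack([e_nose, e_tongue])     # low-level fusion by concatenation
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
scores = cross_val_score(clf, X_low_level, labels, cv=5)
print("5-fold accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))
```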
Lipid droplets fusion in adipocyte differentiated 3T3-L1 cells: A Monte Carlo simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boschi, Federico, E-mail: federico.boschi@univr.it; Department of Computer Science, University of Verona, Strada Le Grazie 15, 37134 Verona; Rizzatti, Vanni
Several widespread human diseases such as obesity, type 2 diabetes, hepatic steatosis, atherosclerosis and other metabolic pathologies are related to the excessive accumulation of lipids in cells. Lipids accumulate in spherical cellular inclusions called lipid droplets (LDs) whose sizes range from a fraction of a micrometer to one hundred micrometers in adipocytes. It has been suggested that LDs can grow in size due to a fusion process by which a larger LD is obtained with spherical shape and volume equal to the sum of the progenitors' ones. In this study, the size distribution of two populations of LDs was analyzed in immature and mature (5-days differentiated) 3T3-L1 adipocytes (first and second populations, respectively) after Oil Red O staining. A Monte Carlo simulation of interaction between LDs has been developed in order to quantify the size distribution and the number of fusion events needed to obtain the distribution of the second population size starting from the first one. Four models are presented here based on different kinds of interaction: a surface-weighted interaction (R2 Model), a volume-weighted interaction (R3 Model), a random interaction (Random Model) and an interaction related to the place where the LDs are born (Nearest Model). The last two models mimic quite well the behavior found in the experimental data. This work represents a first step in developing numerical simulations of the LD growth process. Due to the complex phenomena involving LDs (absorption, growth through additional neutral lipid deposition in existing droplets, de novo formation and catabolism), the study focuses on the fusion process. The results suggest that, to obtain the observed size distribution, a number of fusion events comparable with the number of LDs themselves is needed. Moreover, the MC approach proves to be a powerful tool for investigating the LD growth process. Highlights: • We evaluated the role of the fusion process in the synthesis of the lipid droplets. • We compared the size distribution of the lipid droplets in immature and mature cells. • We used the Monte Carlo simulation approach, simulating 10 thousand fusion events. • Four different interaction models between the lipid droplets were tested. • The best model which mimics the experimental measures was selected.
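A toy reimplementation of the random-interaction idea, assuming only that droplets fuse pairwise at random and that the fused droplet conserves the progenitors' total volume; it is not the authors' simulation code.

```python
# Toy version of the Random Model described above: droplets fuse pairwise at
# random, the product being a sphere whose volume is the sum of the progenitors'.
import numpy as np

rng = np.random.default_rng(42)

def simulate_fusions(radii, n_events):
    radii = list(radii)
    for _ in range(n_events):
        if len(radii) < 2:
            break
        i, j = rng.choice(len(radii), size=2, replace=False)
        r_new = (radii[i] ** 3 + radii[j] ** 3) ** (1.0 / 3.0)   # volume conservation
        for k in sorted((i, j), reverse=True):
            radii.pop(k)
        radii.append(r_new)
    return np.array(radii)

# First population: many small droplets (immature cells), sizes in micrometers
initial = rng.lognormal(mean=0.0, sigma=0.4, size=10_000)
final = simulate_fusions(initial, n_events=5_000)     # roughly one event per two droplets
print("mean radius before/after:", initial.mean().round(2), final.mean().round(2))
```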
A model for explaining fusion suppression using classical trajectory method
NASA Astrophysics Data System (ADS)
Phookan, C. K.; Kalita, K.
2015-01-01
We adopt a semi-classical approach to explain projectile breakup and above-barrier fusion suppression for the reactions 6Li+152Sm and 6Li+144Sm. The cut-off impact parameter for fusion is determined by employing quantum mechanical ideas. Within this cut-off impact parameter for fusion, the fraction of projectiles undergoing breakup is determined using the method of classical trajectories in two dimensions. To obtain the initial conditions of the equations of motion, a simplified model of the 6Li nucleus has been proposed. We introduce a simple formula for explaining fusion suppression. We find excellent agreement between the experimental and calculated fusion cross sections. A slight modification of the above formula for fusion suppression is also proposed for a three-dimensional model.
A fusion approach for coarse-to-fine target recognition
NASA Astrophysics Data System (ADS)
Folkesson, Martin; Grönwall, Christina; Jungert, Erland
2006-04-01
A fusion approach in a query-based information system is presented. The system is designed for querying multimedia databases and is here applied to target recognition using heterogeneous data sources. The recognition process is coarse-to-fine, with an initial attribute estimation step and a subsequent matching step. Several sensor types and algorithms are involved in each of these two steps. The matching results are observed to be independent of the origin of the estimation results. This allows data to be distributed between algorithms in an intermediate fusion step without risk of data incest, which increases the overall chance of recognising the target. An implementation of the system is described.
Coupled-channel analyses on 16O + 147,148,150,152,154Sm heavy-ion fusion reactions
NASA Astrophysics Data System (ADS)
Erol, Burcu; Yılmaz, Ahmet Hakan
2018-02-01
Heavy-ion collisions are typically characterized by the presence of many open reaction channels. At energies around the Coulomb barrier, the main processes are elastic scattering, inelastic excitations of low-lying modes and fusion of the two nuclei. The fusion process is generally described within a one-dimensional barrier penetration model, taking the scattering potential as the sum of the Coulomb and proximity potentials. We have modeled heavy-ion fusion reactions with coupled-channel (CC) calculations. The coupled-channel formalism is applied at energies around the barrier in heavy-ion fusion reactions. In this work fusion cross sections have been calculated and analyzed in detail for the five systems 16O + 147,148,150,152,154Sm in the framework of the coupled-channel approach (using the codes CCFUS and CCDEF) and the Wong formula. Calculated results are compared with experimental data, with CC calculations using the code CCFULL, and with cross-section data taken from `nrv'. CCDEF, CCFULL and the Wong formula explain the fusion reactions of heavy ions very well when the scattering potential is taken as a Woods-Saxon volume potential with Akyuz-Winther parameters. It was observed that the AW potential parameters are able to reproduce the experimentally observed fusion cross sections reasonably well for these systems. There is good agreement between the calculated results and the experimental and nrv[8] results.
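Since the abstract relies on the Wong formula, a small worked example may help; the barrier height, radius and curvature below are placeholder values, not the fitted parameters of this work.

```python
# Worked example of the Wong (1973) fusion cross-section formula cited above.
import numpy as np

def wong_cross_section(E_cm, V_b, R_b, hbar_omega):
    """E_cm, V_b, hbar_omega in MeV; R_b in fm; returns sigma in mb (1 fm^2 = 10 mb)."""
    sigma_fm2 = (hbar_omega * R_b**2 / (2.0 * E_cm)) * np.log1p(
        np.exp(2.0 * np.pi * (E_cm - V_b) / hbar_omega))
    return 10.0 * sigma_fm2

# Hypothetical barrier for a 16O + Sm system: V_b ~ 60 MeV, R_b ~ 10.8 fm, hw ~ 4.3 MeV
for E in (56.0, 60.0, 65.0, 70.0):
    print(E, "MeV ->", round(wong_cross_section(E, 60.0, 10.8, 4.3), 1), "mb")
```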
A data fusion approach for mapping daily evapotranspiration at field scale
USDA-ARS?s Scientific Manuscript database
The capability for mapping water consumption over cropped landscapes on a daily and seasonal basis is increasingly relevant given forecasted scenarios of reduced water availability. Prognostic modeling of water losses to the atmosphere, or evapotranspiration (ET), at field or finer scales in agricul...
NASA Astrophysics Data System (ADS)
Kumar, Raj; Sharma, Manoj K.; Gupta, Raj K.
2011-11-01
First, the nuclear proximity potential, obtained by using the semiclassical extended Thomas Fermi (ETF) approach in Skyrme energy density formalism (SEDF), is shown to give more realistic barriers in the frozen density approximation, as compared to the sudden approximation. Then, taking advantage of the fact that, in the ETF method, different Skyrme forces give different barriers (height, position and curvature), we use the ℓ-summed extended-Wong model of Gupta and collaborators (2009) [1] under the frozen density approximation for calculating the cross-sections, where the Skyrme force is chosen with proper barrier characteristics, not requiring additional "barrier modification" effects (lowering or narrowing, etc.), for a best fit to data at sub-barrier energies. The method is applied to capture cross-section data from 48Ca + 238U, 244Pu, and 248Cm reactions and to fusion-evaporation cross-sections from 58Ni + 58Ni, 64Ni + 64Ni, and 64Ni + 100Mo reactions, with effects of deformations and orientations of nuclei included, wherever required. Interestingly, whereas the capture cross-sections in Ca-induced reactions could be fitted to any force, such as SIII, SV and GSkI, by allowing a small change of a couple of units in the deduced ℓ-values at below-barrier energies, the near-barrier data point of the 48Ca + 248Cm reaction could not be fitted to ℓ-values deduced for below-barrier energies, calling for a check of the data. On the other hand, the fusion-evaporation cross-sections in Ni-induced reactions at sub-barrier energies required different Skyrme forces, representing "modifications of the barrier", for the best fit to data at all incident center-of-mass energies E's, displaying a kind of fusion hindrance at sub-barrier energies. This barrier modification effect is taken care of here by using different Skyrme forces for reactions belonging to different regions of the periodic table. Note that more than one Skyrme force (with identical barrier characteristics) could equally well fit the same data.
Remote Sensing Technologies and Geospatial Modelling Hierarchy for Smart City Support
NASA Astrophysics Data System (ADS)
Popov, M.; Fedorovsky, O.; Stankevich, S.; Filipovich, V.; Khyzhniak, A.; Piestova, I.; Lubskyi, M.; Svideniuk, M.
2017-12-01
An approach to implementing remote sensing technologies and geospatial modelling for smart city support is presented. The hierarchical structure and basic components of the smart city information support subsystem are considered. Some of the already available practical developments are described, including city land use planning, urban vegetation analysis, thermal condition forecasting, geohazard detection and flooding risk assessment. A remote sensing data fusion approach for comprehensive geospatial analysis is discussed. Long-term city development forecasting with the Forrester-Graham system dynamics model is provided over the Kiev urban area.
Fusion gene addiction: can tumours be forced to give up the habit?
Selfe, Joanna L; Shipley, Janet
2017-07-01
Fusion of genes in tumours can have oncogenic roles in reprogramming cells through overexpression of oncogenes or the production of novel fusion proteins. A fundamental question in cancer biology is what genetic events are critical for initiation and whether these are also required for cancer progression. In recent work published in The Journal of Pathology, dependency on a fusion protein was addressed using a model of alveolar rhabdomyosarcomas - a sarcoma subtype with frequent fusion of PAX3 and FOXO1 genes that is associated with poor outcome. PAX3-FOXO1 encodes a potent transcription factor that together with MYCN alters the transcriptional landscape of cells. Building on previous work, an inducible model in human myoblast cells was used to show that PAX3-FOXO1 and MYCN can initiate rhabdomyosarcoma development but, contrary to current thinking, tumour recurrences occasionally arose independent of the fusion protein. Further work needs to identify the molecular nature of this independence and assess any relevance in human tumours. Such functional approaches are required together with computational modeling of molecular data to unravel spatial and temporal dependencies on specific genetic events. This may support molecular prognostic markers and therapeutic targets. Copyright © 2017 Pathological Society of Great Britain and Ireland. Published by John Wiley & Sons, Ltd.
A data fusion framework for meta-evaluation of intelligent transportation system effectiveness
DOT National Transportation Integrated Search
This study presents a framework for the meta-evaluation of Intelligent Transportation System effectiveness. The framework is based on data fusion approaches that adjust for data biases and violations of other standard statistical assumptions. Operati...
Joint sparsity based heterogeneous data-level fusion for target detection and estimation
NASA Astrophysics Data System (ADS)
Niu, Ruixin; Zulch, Peter; Distasio, Marcello; Blasch, Erik; Shen, Dan; Chen, Genshe
2017-05-01
Typical surveillance systems employ decision- or feature-level fusion approaches to integrate heterogeneous sensor data, which are sub-optimal and incur information loss. In this paper, we investigate data-level heterogeneous sensor fusion. Since the sensors monitor the common targets of interest, whose states can be determined by only a few parameters, it is reasonable to assume that the measurement domain has a low intrinsic dimensionality. For heterogeneous sensor data, we develop a joint-sparse data-level fusion (JSDLF) approach based on the emerging joint sparse signal recovery techniques by discretizing the target state space. This approach is applied to fuse signals from multiple distributed radio frequency (RF) signal sensors and a video camera for joint target detection and state estimation. The JSDLF approach is data-driven and requires minimum prior information, since there is no need to know the time-varying RF signal amplitudes, or the image intensity of the targets. It can handle non-linearity in the sensor data due to state space discretization and the use of frequency/pixel selection matrices. Furthermore, for a multi-target case with J targets, the JSDLF approach only requires discretization in a single-target state space, instead of discretization in a J-target state space, as in the case of the generalized likelihood ratio test (GLRT) or the maximum likelihood estimator (MLE). Numerical examples are provided to demonstrate that the proposed JSDLF approach achieves excellent performance with near real-time accurate target position and velocity estimates.
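The exact JSDLF algorithm is not given in the abstract; the sketch below only illustrates the underlying joint-sparsity idea with a generic simultaneous-OMP style search over a shared discretized grid, using synthetic dictionaries for the two modalities.

```python
# Generic illustration of joint-sparse, data-level fusion (not the JSDLF algorithm
# itself): two sensors observe the same targets through different dictionaries on
# a common discretized position grid, and a greedy search selects grid cells
# supported by both modalities.
import numpy as np

def joint_somp(y1, A1, y2, A2, n_targets):
    """Return indices of grid cells jointly selected from both sensor models."""
    support, r1, r2 = [], y1.copy(), y2.copy()
    for _ in range(n_targets):
        # correlation of each atom with the current residual, per modality
        c1 = np.abs(A1.T @ r1) / np.linalg.norm(A1, axis=0)
        c2 = np.abs(A2.T @ r2) / np.linalg.norm(A2, axis=0)
        j = int(np.argmax(c1 + c2))             # joint (shared-support) selection
        support.append(j)
        # least-squares refit on the selected support, separately per modality
        x1, *_ = np.linalg.lstsq(A1[:, support], y1, rcond=None)
        x2, *_ = np.linalg.lstsq(A2[:, support], y2, rcond=None)
        r1 = y1 - A1[:, support] @ x1
        r2 = y2 - A2[:, support] @ x2
    return sorted(support)

rng = np.random.default_rng(0)
n_grid = 200                                    # discretized single-target state space
A_rf, A_cam = rng.normal(size=(64, n_grid)), rng.normal(size=(128, n_grid))
true_cells = [17, 102]                          # two targets on the grid
y_rf  = A_rf[:, true_cells]  @ np.array([1.0, 0.7]) + 0.01 * rng.normal(size=64)
y_cam = A_cam[:, true_cells] @ np.array([0.5, 1.2]) + 0.01 * rng.normal(size=128)
print(joint_somp(y_rf, A_rf, y_cam, A_cam, n_targets=2))   # expect [17, 102]
```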
Continuous Indoor Positioning Fusing WiFi, Smartphone Sensors and Landmarks
Deng, Zhi-An; Wang, Guofeng; Qin, Danyang; Na, Zhenyu; Cui, Yang; Chen, Juan
2016-01-01
To exploit the complementary strengths of WiFi positioning, pedestrian dead reckoning (PDR), and landmarks, we propose a novel fusion approach based on an extended Kalman filter (EKF). For WiFi positioning, unlike previous fusion approaches setting measurement noise parameters empirically, we deploy a kernel density estimation-based model to adaptively measure the related measurement noise statistics. Furthermore, a trusted area of WiFi positioning defined by fusion results of the previous step and WiFi signal outlier detection are exploited to reduce computational cost and improve WiFi positioning accuracy. For PDR, we integrate a gyroscope, an accelerometer, and a magnetometer to determine the user heading based on another EKF model. To reduce the accumulation error of PDR and enable continuous indoor positioning, not only the positioning results but also the heading estimations are recalibrated by indoor landmarks. Experimental results in a realistic indoor environment show that the proposed fusion approach achieves a substantial positioning accuracy improvement over individual positioning approaches, including PDR and WiFi positioning. PMID:27608019
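A minimal sketch of the prediction/update cycle described above, assuming a bare two-dimensional position state: PDR propagates the state and a WiFi fix is fused in a standard (here linear) Kalman update; the adaptive noise model and landmark recalibration of the actual system are omitted.

```python
# Minimal sketch of PDR prediction followed by a WiFi measurement update.
import numpy as np

x = np.array([0.0, 0.0])        # state: planar position (m)
P = np.eye(2) * 1.0             # state covariance

def pdr_predict(x, P, step_len, heading_rad, q=0.05):
    """Propagate position with a dead-reckoned step; q models PDR drift."""
    x_pred = x + step_len * np.array([np.cos(heading_rad), np.sin(heading_rad)])
    return x_pred, P + q * np.eye(2)

def wifi_update(x, P, z_wifi, r=4.0):
    """Fuse a WiFi position fix; r would come from the KDE noise model in practice."""
    H, R = np.eye(2), r * np.eye(2)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x_new = x + K @ (z_wifi - H @ x)
    P_new = (np.eye(2) - K @ H) @ P
    return x_new, P_new

x, P = pdr_predict(x, P, step_len=0.7, heading_rad=np.deg2rad(30))
x, P = wifi_update(x, P, z_wifi=np.array([0.9, 0.2]))
print(x, np.diag(P))
```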
Interplay of Laser-Plasma Interactions and Inertial Fusion Hydrodynamics
Strozzi, D. J.; Bailey, D. S.; Michel, P.; ...
2017-01-12
The effects of laser-plasma interactions (LPI) on the dynamics of inertial confinement fusion hohlraums are investigated in this work via a new approach that self-consistently couples reduced LPI models into radiation-hydrodynamics numerical codes. The interplay between hydrodynamics and LPI—specifically stimulated Raman scatter and crossed-beam energy transfer (CBET)—mostly occurs via momentum and energy deposition into Langmuir and ion acoustic waves. This spatially redistributes energy coupling to the target, which affects the background plasma conditions and thus modifies laser propagation. In conclusion, this model shows reduced CBET and significant laser energy depletion by Langmuir waves, which reduce the discrepancy between modeling and data from hohlraum experiments on wall x-ray emission and capsule implosion shape.
IMU-Based Gait Recognition Using Convolutional Neural Networks and Multi-Sensor Fusion.
Dehzangi, Omid; Taherisadr, Mojtaba; ChangalVala, Raghvendar
2017-11-27
The widespread use of wearable sensors, such as those in smart watches, has provided continuous access to valuable user-generated data such as human motion, which can be used to identify an individual based on his/her motion patterns, such as gait. Several methods have been suggested to extract various heuristic and high-level features from gait motion data to identify discriminative gait signatures and distinguish the target individual from others. However, manual and hand-crafted feature extraction is error prone and subjective. Furthermore, the motion data collected from inertial sensors have a complex structure, and the detachment between the manual feature extraction module and the predictive learning models might limit generalization capability. In this paper, we propose a novel approach for human gait identification using time-frequency (TF) expansion of human gait cycles in order to capture joint two-dimensional (2D) spectral and temporal patterns of gait cycles. Then, we design a deep convolutional neural network (DCNN) learning to extract discriminative features from the 2D expanded gait cycles and jointly optimize the identification model and the spectro-temporal features in a discriminative fashion. We collect raw motion data from five inertial sensors placed at the chest, lower back, right wrist, right knee, and right ankle of each human subject synchronously in order to investigate the impact of sensor location on the gait identification performance. We then present two methods for early (input level) and late (decision score level) multi-sensor fusion to improve the gait identification generalization performance. We specifically propose the minimum error score fusion (MESF) method that discriminatively learns the linear fusion weights of individual DCNN scores at the decision level by minimizing the error rate on the training data in an iterative manner. Ten subjects participated in this study and hence, the problem is a 10-class identification task. Based on our experimental results, 91% subject identification accuracy was achieved using the best individual IMU and 2DTF-DCNN. We then investigated our proposed early and late sensor fusion approaches, which improved the gait identification accuracy of the system to 93.36% and 97.06%, respectively.
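As a simplified stand-in for the decision-score fusion step (not the exact MESF procedure), the sketch below searches a single convex weight between two classifiers' score matrices so that the fused training error is minimized; scores and labels are synthetic.

```python
# Simplified late (decision-score level) fusion: grid-search one convex weight
# that minimizes the fused training error rate.
import numpy as np

def fuse_scores(scores_a, scores_b, labels, grid=101):
    """scores_*: (n_samples, n_classes) per-sensor class scores; returns best weight."""
    best_w, best_err = 0.5, 1.0
    for w in np.linspace(0.0, 1.0, grid):
        fused = w * scores_a + (1.0 - w) * scores_b
        err = np.mean(np.argmax(fused, axis=1) != labels)
        if err < best_err:
            best_w, best_err = w, err
    return best_w, best_err

rng = np.random.default_rng(3)
labels = rng.integers(0, 10, 200)                           # 10-subject identification task
onehot = np.eye(10)[labels]
scores_chest = onehot + 0.8 * rng.normal(size=(200, 10))    # noisier sensor
scores_ankle = onehot + 0.4 * rng.normal(size=(200, 10))    # more reliable sensor
w, err = fuse_scores(scores_chest, scores_ankle, labels)
print("weight on chest sensor:", round(w, 2), " fused training error:", round(err, 3))
```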
Calhoun, Vince D; Sui, Jing
2016-01-01
It is becoming increasingly clear that combining multi-modal brain imaging data is able to provide more information for individual subjects by exploiting the rich multimodal information that exists. However, the number of studies that do true multimodal fusion (i.e. capitalizing on joint information among modalities) is still remarkably small given the known benefits. In part, this is because multi-modal studies require broader expertise in collecting, analyzing, and interpreting the results than do unimodal studies. In this paper, we start by introducing the basic reasons why multimodal data fusion is important and what it can do, and importantly how it can help us avoid wrong conclusions and help compensate for imperfect brain imaging studies. We also discuss the challenges that need to be confronted for such approaches to be more widely applied by the community. We then provide a review of the diverse studies that have used multimodal data fusion (primarily focused on psychosis) as well as provide an introduction to some of the existing analytic approaches. Finally, we discuss some up-and-coming approaches to multi-modal fusion including deep learning and multimodal classification which show considerable promise. Our conclusion is that multimodal data fusion is rapidly growing, but it is still underutilized. The complexity of the human brain coupled with the incomplete measurement provided by existing imaging technology makes multimodal fusion essential in order to mitigate against misdirection and hopefully provide a key to finding the missing link(s) in complex mental illness. PMID:27347565
Depth-color fusion strategy for 3-D scene modeling with Kinect.
Camplani, Massimo; Mantecon, Tomas; Salgado, Luis
2013-12-01
Low-cost depth cameras, such as Microsoft Kinect, have completely changed the world of human-computer interaction through controller-free gaming applications. Depth data provided by the Kinect sensor presents several noise-related problems that have to be tackled to improve the accuracy of the depth data, thus obtaining more reliable game control platforms and broadening its applicability. In this paper, we present a depth-color fusion strategy for 3-D modeling of indoor scenes with Kinect. Accurate depth and color models of the background elements are iteratively built, and used to detect moving objects in the scene. Kinect depth data is processed with an innovative adaptive joint-bilateral filter that efficiently combines depth and color by analyzing an edge-uncertainty map and the detected foreground regions. Results show that the proposed approach efficiently tackles the main Kinect data problems: distance-dependent depth maps, spatial noise, and temporal random fluctuations are dramatically reduced; objects' depth boundaries are refined, and nonmeasured depth pixels are interpolated. Moreover, a robust depth and color background model and accurate moving object silhouettes are generated.
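A compact, brute-force joint bilateral filter conveys the core depth-color fusion idea, assuming color similarity and spatial distance as the only weights; the paper's filter additionally uses an edge-uncertainty map and foreground masks, which are omitted here.

```python
# Brute-force joint bilateral filter: depth is smoothed with weights from both
# spatial distance and color similarity, so depth edges follow color edges.
import numpy as np

def joint_bilateral_depth(depth, color, radius=3, sigma_s=2.0, sigma_r=20.0):
    """depth: (H, W) float, 0 = missing; color: (H, W, 3) guidance image."""
    H, W = depth.shape
    out = np.zeros_like(depth, dtype=float)
    color = color.astype(float)
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - radius), min(H, y + radius + 1)
            x0, x1 = max(0, x - radius), min(W, x + radius + 1)
            d_patch = depth[y0:y1, x0:x1]
            c_patch = color[y0:y1, x0:x1]
            yy, xx = np.mgrid[y0:y1, x0:x1]
            w_s = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma_s ** 2))
            w_r = np.exp(-np.sum((c_patch - color[y, x]) ** 2, axis=2) / (2 * sigma_r ** 2))
            w = w_s * w_r * (d_patch > 0)        # ignore non-measured depth pixels
            out[y, x] = np.sum(w * d_patch) / np.sum(w) if w.sum() > 0 else 0.0
    return out

depth = np.random.rand(40, 40) * 3.0
depth[10:15, 10:15] = 0.0                        # simulated Kinect holes
color = (np.random.rand(40, 40, 3) * 255).astype(np.uint8)
print(joint_bilateral_depth(depth, color)[12, 12])  # hole filled from valid neighbours
```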
A practical approach for active camera coordination based on a fusion-driven multi-agent system
NASA Astrophysics Data System (ADS)
Bustamante, Alvaro Luis; Molina, José M.; Patricio, Miguel A.
2014-04-01
In this paper, we propose a multi-agent system architecture to manage spatially distributed active (or pan-tilt-zoom) cameras. Traditional video surveillance algorithms are of no use for active cameras, and we have to look at different approaches. Such multi-sensor surveillance systems have to be designed to solve two related problems: data fusion and coordinated sensor-task management. Generally, architectures proposed for the coordinated operation of multiple cameras are based on the centralisation of management decisions at the fusion centre. However, the existence of intelligent sensors capable of decision making brings with it the possibility of conceiving alternative decentralised architectures. This problem is approached by means of a MAS, integrating data fusion as an integral part of the architecture for distributed coordination purposes. This paper presents the MAS architecture and system agents.
NASA Astrophysics Data System (ADS)
Lendzioch, Theodora; Langhammer, Jakub; Hartvich, Filip
2015-04-01
Fusion of remote sensing data is a common and rapidly developing discipline, which combines data from multiple sources with different spatial and spectral resolution, from satellite sensors, aircraft and ground platforms. The fused data contain more detailed information than each of the sources, enhance the interpretation performance and accuracy of the source data, and produce a high-quality visualisation of the final data. In fluvial geomorphology in particular, it is essential to obtain valuable images at sub-meter resolution in order to derive high-quality 2D and 3D information for a detailed identification, extraction and description of channel features of different river regimes and to perform rapid mapping of changes in river topography. In order to design, test and evaluate a new approach for detection of river morphology, we combine different research techniques, from remote sensing products to drone-based photogrammetry and LiDAR products (aerial LiDAR scanner and TLS). Topographic information (e.g. changes in river channel morphology, surface roughness, evaluation of floodplain inundation, mapping of gravel bars and slope characteristics) will be extracted either from a single layer or from combined layers in order to detect fluvial topographic changes before and after flood events. Besides statistical approaches for predictive geomorphological mapping and the determination of errors and uncertainties of the data, we will also provide 3D modelling of small fluvial features.
A hybrid sensing approach for pure and adulterated honey classification.
Subari, Norazian; Mohamad Saleh, Junita; Md Shakaff, Ali Yeon; Zakaria, Ammar
2012-10-17
This paper presents a comparison between data from single modality and fusion methods to classify Tualang honey as pure or adulterated using Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA) statistical classification approaches. Ten different brands of certified pure Tualang honey were obtained throughout peninsular Malaysia and Sumatera, Indonesia. Various concentrations of two types of sugar solution (beet and cane sugar) were used in this investigation to create honey samples of 20%, 40%, 60% and 80% adulteration concentrations. Honey data extracted from an electronic nose (e-nose) and Fourier Transform Infrared Spectroscopy (FTIR) were gathered, analyzed and compared based on fusion methods. Visual observation of classification plots revealed that the PCA approach was able to distinguish pure and adulterated honey samples better than the LDA technique. Overall, the validated classification results based on FTIR data (88.0%) gave higher classification accuracy than e-nose data (76.5%) using the LDA technique. Honey classification based on normalized low-level and intermediate-level FTIR and e-nose fusion data scored classification accuracies of 92.2% and 88.7%, respectively, using the stepwise LDA method. The results suggested that pure and adulterated honey samples were better classified using FTIR and e-nose fusion data than single modality data.
Fusion of laser and image sensory data for 3-D modeling of the free navigation space
NASA Technical Reports Server (NTRS)
Mass, M.; Moghaddamzadeh, A.; Bourbakis, N.
1994-01-01
A fusion technique which combines two different types of sensory data for 3-D modeling of a navigation space is presented. The sensory data is generated by a vision camera and a laser scanner. The problem of different resolutions for these sensory data was solved by reducing the image resolution, fusing the different data, and using a fuzzy image segmentation technique.
Integrated Data Analysis for Fusion: A Bayesian Tutorial for Fusion Diagnosticians
NASA Astrophysics Data System (ADS)
Dinklage, Andreas; Dreier, Heiko; Fischer, Rainer; Gori, Silvio; Preuss, Roland; Toussaint, Udo von
2008-03-01
Integrated Data Analysis (IDA) offers a unified way of combining information relevant to fusion experiments and thereby addresses typical issues arising in fusion data analysis. In IDA, all information is consistently formulated as probability density functions quantifying uncertainties in the analysis within Bayesian probability theory. For a single diagnostic, IDA allows the identification of faulty measurements and improvements in the setup. For a set of diagnostics, IDA gives joint error distributions, allowing the comparison and integration of different diagnostics' results. Validation of physics models can be performed by model comparison techniques. Typical data analysis applications benefit from IDA's capabilities of nonlinear error propagation, the inclusion of systematic effects and the comparison of different physics models. Applications range from outlier detection and background discrimination to model assessment and the design of diagnostics. In order to cope with next-step fusion device requirements, appropriate techniques are explored for fast analysis applications.
FuzzyFusion: an application architecture for multisource information fusion
NASA Astrophysics Data System (ADS)
Fox, Kevin L.; Henning, Ronda R.
2009-04-01
The correlation of information from disparate sources has long been an issue in data fusion research. Traditional data fusion addresses the correlation of information from sources as diverse as single-purpose sensors to all-source multi-media information. Information system vulnerability information is similar in its diversity of sources and content, and in the desire to draw a meaningful conclusion, namely, the security posture of the system under inspection. FuzzyFusionTM, a data fusion model being applied to the computer network operations domain, is presented. This model has been successfully prototyped in an applied research environment and represents a next-generation assurance tool for system and network security.
NASA Astrophysics Data System (ADS)
Oikawa, P. Y.; Jenerette, G. D.; Knox, S. H.; Sturtevant, C.; Verfaillie, J.; Dronova, I.; Poindexter, C. M.; Eichelmann, E.; Baldocchi, D. D.
2017-01-01
Wetlands and flooded peatlands can sequester large amounts of carbon (C) and have high greenhouse gas mitigation potential. There is growing interest in financing wetland restoration using C markets; however, this requires careful accounting of both CO2 and CH4 exchange at the ecosystem scale. Here we present a new model, the PEPRMT model (Peatland Ecosystem Photosynthesis Respiration and Methane Transport), which consists of a hierarchy of biogeochemical models designed to estimate CO2 and CH4 exchange in restored managed wetlands. Empirical models using temperature and/or photosynthesis to predict respiration and CH4 production were contrasted with a more process-based model that simulated substrate-limited respiration and CH4 production using multiple carbon pools. Models were parameterized by using a model-data fusion approach with multiple years of eddy covariance data collected in a recently restored wetland and a mature restored wetland. A third recently restored wetland site was used for model validation. During model validation, the process-based model explained 70% of the variance in net ecosystem exchange of CO2 (NEE) and 50% of the variance in CH4 exchange. Not accounting for high respiration following restoration led to empirical models overestimating annual NEE by 33-51%. By employing a model-data fusion approach we provide rigorous estimates of uncertainty in model predictions, accounting for uncertainty in data, model parameters, and model structure. The PEPRMT model is a valuable tool for understanding carbon cycling in restored wetlands and for application in carbon market-funded wetland restoration, thereby advancing opportunity to counteract the vast degradation of wetlands and flooded peatlands.
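The PEPRMT equations are not reproduced in the abstract, so the following is only a bare-bones illustration of the model-data fusion step: fitting the parameters of a simple temperature-driven respiration model to synthetic eddy-covariance data and reporting their uncertainty.

```python
# Bare-bones model-data fusion illustration (not PEPRMT itself): fit a Q10-style
# respiration model to eddy-covariance respiration estimates with curve_fit.
import numpy as np
from scipy.optimize import curve_fit

def respiration(T_soil, r_base, q10):
    """Ecosystem respiration as a function of soil temperature (deg C)."""
    return r_base * q10 ** ((T_soil - 10.0) / 10.0)

rng = np.random.default_rng(7)
T_obs = rng.uniform(2.0, 28.0, 300)                       # tower soil temperature
R_obs = respiration(T_obs, r_base=2.0, q10=2.1) + rng.normal(0, 0.3, 300)

popt, pcov = curve_fit(respiration, T_obs, R_obs, p0=[1.0, 1.5])
perr = np.sqrt(np.diag(pcov))                             # 1-sigma parameter uncertainty
print("r_base = %.2f +/- %.2f,  Q10 = %.2f +/- %.2f" % (popt[0], perr[0], popt[1], perr[1]))
```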
Zhao, Y J; Liu, Y; Sun, Y C; Wang, Y
2017-08-18
To explore a three-dimensional (3D) data fusion and integration method that combines optically scanned tooth crowns with cone beam CT (CBCT) reconstructed tooth roots to achieve a natural transition in the 3D profile. One case of mild dental crowding with full dentition was chosen from the orthodontics clinic. The CBCT data were acquired to reconstruct the dental model with tooth roots using Mimics 17.0 medical imaging software, and an optical impression was taken to obtain the dentition model with the high-precision physiological contour of the crowns using a Smart Optics dental scanner. The two models were registered in 3D based on the common part of the crowns' shape in Geomagic Studio 2012 reverse engineering software. The model coordinate system was established by defining the occlusal plane. The crown-gingiva boundary was extracted from the optical scanning model manually, and the crown-root boundary was then generated by offsetting and projecting the crown-gingiva boundary onto the root model. After trimming the crown and root models, the 3D fusion model with a physiological-contour crown and a natural root was finally formed by a curvature-continuity filling algorithm. In the study, 10 patients with mild dental crowding from the oral clinic were followed up with this method to obtain 3D crown and root fusion models, and 10 highly qualified doctors were invited to perform a subjective evaluation of these fusion models. This study, based on a commercial software platform, preliminarily realized the 3D data fusion and integration of optically scanned tooth crowns and CBCT tooth roots with a curvature-continuous shape transition. The 10 patients' 3D crown and root fusion models were constructed successfully by the method, and the average score of the doctors' subjective evaluation for these 10 models was 8.6 points (0-10 points), which meant that all the fusion models could basically meet the needs of the oral clinic and also showed that the method in this study is feasible and efficient for orthodontic research and clinics. The method of this study for 3D crown and root data fusion can obtain an integrated tooth or dental model closer to the natural shape. CBCT model calibration may probably improve the precision of the fusion model. The adaptation of this method to severe dental crowding and micromaxillary deformity needs further research.
RUBE: an XML-based architecture for 3D process modeling and model fusion
NASA Astrophysics Data System (ADS)
Fishwick, Paul A.
2002-07-01
Information fusion is a critical problem for science and engineering. There is a need to fuse information content specified as either data or model. We frame our work in terms of fusing dynamic and geometric models, to create an immersive environment where these models can be juxtaposed in 3D, within the same interface. The method by which this is accomplished fits well into other eXtensible Markup Language (XML) approaches to fusion in general. The task of modeling lies at the heart of the human-computer interface, joining the human to the system under study through a variety of sensory modalities. I overview modeling as a key concern for the Defense Department and the Air Force, and then follow with a discussion of past, current, and future work. Past work began with a package written in C and has progressed, in current work, to an implementation in XML. Our current work is defined within the RUBE architecture, which is detailed in subsequent papers devoted to key components. We have built RUBE as a next-generation modeling framework using our prior software, with research opportunities in immersive 3D and tangible user interfaces.
Importance of interpolation and coincidence errors in data fusion
NASA Astrophysics Data System (ADS)
Ceccherini, Simone; Carli, Bruno; Tirelli, Cecilia; Zoppetti, Nicola; Del Bianco, Samuele; Cortesi, Ugo; Kujanpää, Jukka; Dragani, Rossana
2018-02-01
The complete data fusion (CDF) method is applied to ozone profiles obtained from simulated measurements in the ultraviolet and in the thermal infrared in the framework of the Sentinel 4 mission of the Copernicus programme. We observe that the quality of the fused products is degraded when the fusing profiles are either retrieved on different vertical grids or referred to different true profiles. To address this shortcoming, a generalization of the complete data fusion method, which takes into account interpolation and coincidence errors, is presented. This upgrade overcomes the encountered problems and provides products of good quality when the fusing profiles are both retrieved on different vertical grids and referred to different true profiles. The impact of the interpolation and coincidence errors on number of degrees of freedom and errors of the fused profile is also analysed. The approach developed here to account for the interpolation and coincidence errors can also be followed to include other error components, such as forward model errors.
Li, Yun; Zhang, Jin-Yu; Wang, Yuan-Zhong
2018-01-01
Three data fusion strategies (low-level, mid-level, and high-level) combined with a multivariate classification algorithm (random forest, RF) were applied to authenticate the geographical origins of Panax notoginseng collected from five regions of Yunnan province in China. In low-level fusion, the original data from two spectra (Fourier transform mid-IR spectrum and near-IR spectrum) were directly concatenated into a new matrix, which was then used for the classification. Mid-level fusion was the strategy that inputted variables extracted from the spectral data into an RF classification model. The extracted variables were processed by iterative variable selection of the RF model and principal component analysis. The use of high-level fusion combined the decision making of each spectroscopic technique and resulted in an ensemble decision. The results showed that mid-level and high-level data fusion took advantage of the information synergy from the two spectroscopic techniques and had better classification performance than independent decision making. High-level data fusion is the most effective strategy since the classification results are better than those of the other fusion strategies: accuracy rates ranged between 93% and 96% for the low-level data fusion, between 95% and 98% for the mid-level data fusion, and between 98% and 100% for the high-level data fusion. In conclusion, the high-level data fusion strategy for Fourier transform mid-IR and near-IR spectra can be used as a reliable tool for correct geographical identification of P. notoginseng. Graphical abstract The analytical steps of Fourier transform mid-IR and near-IR spectral data fusion for the geographical traceability of Panax notoginseng.
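A sketch of the high-level (decision-level) strategy under stated assumptions: one random forest per spectral technique, with the final call taken from the averaged class probabilities; the feature matrices below are synthetic stand-ins for the FT-MIR and NIR data.

```python
# Decision-level fusion of two random forests by averaging class probabilities.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 150
origins = rng.integers(0, 5, n)                          # five geographical origins
mid_ir  = rng.normal(origins[:, None], 1.0, (n, 200))    # hypothetical FT-MIR features
near_ir = rng.normal(origins[:, None], 1.5, (n, 120))    # hypothetical NIR features

idx_tr, idx_te = train_test_split(np.arange(n), test_size=0.3, random_state=0)
rf_mir = RandomForestClassifier(n_estimators=200, random_state=0).fit(mid_ir[idx_tr], origins[idx_tr])
rf_nir = RandomForestClassifier(n_estimators=200, random_state=0).fit(near_ir[idx_tr], origins[idx_tr])

proba = 0.5 * (rf_mir.predict_proba(mid_ir[idx_te]) + rf_nir.predict_proba(near_ir[idx_te]))
fused_pred = rf_mir.classes_[np.argmax(proba, axis=1)]
print("decision-fused accuracy:", np.mean(fused_pred == origins[idx_te]).round(3))
```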
Forecasting influenza in Hong Kong with Google search queries and statistical model fusion
Ramirez Ramirez, L. Leticia; Nezafati, Kusha; Zhang, Qingpeng; Tsui, Kwok-Leung
2017-01-01
Background The objective of this study is to investigate the predictive utility of online social media and web search queries, particularly Google search data, to forecast new cases of influenza-like illness (ILI) in general outpatient clinics (GOPC) in Hong Kong. To mitigate the impact of sensitivity to self-excitement (i.e., fickle media interest) and other artifacts of online social media data, in our approach we fuse multiple offline and online data sources. Methods Four individual models: generalized linear model (GLM), least absolute shrinkage and selection operator (LASSO), autoregressive integrated moving average (ARIMA), and deep learning (DL) with Feedforward Neural Networks (FNN) are employed to forecast ILI-GOPC both one week and two weeks in advance. The covariates include Google search queries, meteorological data, and previously recorded offline ILI. To our knowledge, this is the first study that introduces deep learning methodology into surveillance of infectious diseases and investigates its predictive utility. Furthermore, to exploit the strength of each individual forecasting model, we use statistical model fusion, using Bayesian model averaging (BMA), which allows a systematic integration of multiple forecast scenarios. For each model, an adaptive approach is used to capture the recent relationship between ILI and covariates. Results DL with FNN appears to deliver the most competitive predictive performance among the four considered individual models. Combining all four models in a comprehensive BMA framework allows such predictive evaluation metrics as root mean squared error (RMSE) and mean absolute predictive error (MAPE) to be further improved. Nevertheless, DL with FNN remains the preferred method for predicting locations of influenza peaks. Conclusions The proposed approach can be viewed as a feasible alternative to forecast ILI in Hong Kong or other countries where ILI has no constant seasonal trend and influenza data resources are limited. The proposed methodology is easily tractable and computationally efficient. PMID:28464015
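A toy weighting scheme in the spirit of Bayesian model averaging (not the paper's exact BMA implementation): individual forecasts are combined with weights proportional to each model's Gaussian likelihood on a held-out window; all numbers are invented.

```python
# Toy BMA-style fusion: weight each model by its Gaussian likelihood on past data,
# then combine the models' new forecasts with those weights.
import numpy as np

def bma_weights(forecasts, observed, sigma):
    """forecasts: dict name -> array of past forecasts; observed: matching truth."""
    loglik = {m: -0.5 * np.sum(((f - observed) / sigma) ** 2)
              for m, f in forecasts.items()}
    shift = max(loglik.values())                           # for numerical stability
    w = {m: np.exp(v - shift) for m, v in loglik.items()}
    total = sum(w.values())
    return {m: v / total for m, v in w.items()}

past_truth = np.array([120., 150., 180., 220., 210.])
past = {"GLM":   np.array([110., 160., 170., 200., 215.]),
        "ARIMA": np.array([125., 140., 190., 230., 205.]),
        "DL":    np.array([121., 151., 178., 219., 212.])}
weights = bma_weights(past, past_truth, sigma=15.0)
next_week = {"GLM": 240., "ARIMA": 255., "DL": 250.}       # hypothetical new forecasts
fused = sum(weights[m] * next_week[m] for m in weights)
print(weights, round(fused, 1))
```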
Pansharpening via coupled triple factorization dictionary learning
Skau, Erik; Wohlberg, Brendt; Krim, Hamid; ...
2016-03-01
Data fusion is the operation of integrating data from different modalities to construct a single consistent representation. This paper proposes variations of coupled dictionary learning through an additional factorization. One variation of this model is applicable to the pansharpening data fusion problem. Real-world pansharpening data were used to train and test the proposed formulation. The results demonstrate that the data fusion model can successfully be applied to the pansharpening problem.
The Integration of SMOS Soil Moisture in a Consistent Soil Moisture Climate Record
NASA Astrophysics Data System (ADS)
de Jeu, Richard; Kerr, Yann; Wigneron, Jean Pierre; Rodriguez-Fernandez, Nemesio; Al-Yaari, Amen; van der Schalie, Robin; Dolman, Han; Drusch, Matthias; Mecklenburg, Susanne
2015-04-01
Recently, a study funded by the European Space Agency (ESA) was set up to provide guidelines for the development of a global soil moisture climate record with a special emphasis on the integration of SMOS. Three different data fusion approaches were designed and implemented on 10 years of passive microwave data (2003-2013) from two different satellite sensors: the ESA Soil Moisture and Ocean Salinity (SMOS) mission and the NASA/JAXA Advanced Scanning Microwave Radiometer (AMSR-E). The AMSR-E data covered the period from January 2003 until October 2011 and the SMOS data covered the period from June 2010 until the end of 2013. The fusion approaches included a neural network approach (Rodriguez-Fernandez et al., this conference session HS6.4), a regression approach (Wigneron et al., 2004), and an approach based on the baseline algorithm of ESA's current Climate Change Initiative soil moisture program, the Land Parameter Retrieval Model (Van der Schalie et al., this conference session HS6.4). With this presentation we will show the first results from this study, including a description of the different approaches and the validation activities using both globally covered modeled datasets and ground observations from the International Soil Moisture Network. The statistical validation analyses will give us information on the temporal and spatial performance of the three different approaches. Based on these results we will then discuss the next steps towards a seamless integration of SMOS in a consistent soil moisture climate record. References Wigneron J.-P., J.-C. Calvet, P. de Rosnay, Y. Kerr, P. Waldteufel, K. Saleh, M. J. Escorihuela, A. Kruszewski, 'Soil Moisture Retrievals from Bi-Angular L-band Passive Microwave Observations', IEEE Trans. Geosc. Remote Sens. Let., vol 1, no. 4, 277-281, 2004.
NASA Astrophysics Data System (ADS)
Ziemba, Alexander; El Serafy, Ghada
2016-04-01
Ecological modeling and water quality investigations are complex processes which can require a high level of parameterization and a multitude of varying data sets in order to properly execute the model in question. Since models are generally complex, their calibration and validation can benefit from the application of data and information fusion techniques. The data applied to ecological models come from a wide range of sources such as remote sensing, earth observation, and in-situ measurements, resulting in a high variability in the temporal and spatial resolution of the various data sets available to water quality investigators. It is proposed that effective fusion into a comprehensive singular set will provide a more complete and robust data resource with which models can be calibrated, validated, and driven. Each individual product contains a unique valuation of error resulting from the method of measurement and the application of pre-processing techniques. The uncertainty and error are further compounded when the data being fused are of varying temporal and spatial resolution. In order to have a reliable fusion-based model and data set, the uncertainty of the results and the confidence interval of the data being reported must be effectively communicated to those who would utilize the data product or model outputs in a decision-making process [2]. Here we review an array of data fusion techniques applied to various remote sensing, earth observation, and in-situ data sets whose domains vary in spatial and temporal resolution. The data sets examined are combined so that the various classifications of data (complementary, redundant, and cooperative) are all assessed to determine the classification's impact on the propagation and compounding of error. In order to assess the error of the fused data products, a comparison is conducted with data sets containing a known confidence interval and quality rating. We conclude with a quantification of the performance of the data fusion techniques and a recommendation on the feasibility of applying the fused products in operational forecast systems and modeling scenarios. The error bands and confidence intervals derived can be used to clarify the error and confidence of water quality variables produced by prediction and forecasting models. References [1] F. Castanedo, "A Review of Data Fusion Techniques", The Scientific World Journal, vol. 2013, pp. 1-19, 2013. [2] T. Keenan, M. Carbone, M. Reichstein and A. Richardson, "The model-data fusion pitfall: assuming certainty in an uncertain world", Oecologia, vol. 167, no. 3, pp. 587-597, 2011.
NASA Astrophysics Data System (ADS)
Li, Chenguang; Yang, Xianjun
2016-10-01
The Magnetized Plasma Fusion Reactor concept is proposed as a magneto-inertial fusion approach based on a target plasma created through the collision merging of two oppositely translating field-reversed configuration plasmas, which is then compressed by an imploding liner driven by a pulsed-power driver. The target creation process is described by a two-dimensional magnetohydrodynamic model, yielding typical target parameters. The implosion process and the fusion reaction are modeled with a simple zero-dimensional model that takes into account alpha-particle heating and bremsstrahlung radiation loss. The compression of the target can be 2D cylindrical or 2.4D when additional axial contraction is taken into account. The dynamics of the liner compression and fusion burning are simulated, and the optimum fusion gain and the associated target parameters are predicted. Scientific breakeven could be achieved under optimized conditions.
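To make the zero-dimensional burn stage concrete, a minimal power-balance sketch is given below; it assumes a 50/50 DT fuel at fixed ion density, a crude reactivity approximation, and illustrative constants, none of which are taken from the paper.

import numpy as np

# Minimal 0-D DT burn sketch: fixed ion density, alpha heating versus bremsstrahlung.
# All parameter values below are illustrative assumptions, not taken from the paper.
E_ALPHA = 3.5e6 * 1.602e-19      # alpha-particle energy per DT reaction [J]
C_BREMS = 5.35e-37               # bremsstrahlung loss coefficient [W m^3 keV^-0.5]
KEV = 1.602e-16                  # 1 keV in joules

def sigma_v_dt(T_keV):
    # Crude DT reactivity approximation, roughly valid near 10-20 keV [m^3/s].
    return 1.1e-24 * T_keV**2

def burn(T0_keV=5.0, n_i=1.0e28, f_alpha=1.0, dt=1e-9, t_end=1e-6):
    """Integrate the temperature of a 50/50 DT plasma under alpha heating and
    bremsstrahlung loss; returns the (time, temperature) history."""
    T, t, hist = T0_keV, 0.0, []
    n_d = n_t = 0.5 * n_i            # 50/50 fuel mix [m^-3]
    n_e = n_i                        # quasi-neutrality
    while t < t_end and T < 200.0:   # stop once runaway (ignition-like) heating sets in
        rate = n_d * n_t * sigma_v_dt(T)             # reactions per m^3 per second
        p_alpha = f_alpha * rate * E_ALPHA           # alpha heating power [W/m^3]
        p_brems = C_BREMS * n_e**2 * np.sqrt(T)      # radiation loss [W/m^3]
        dT = (p_alpha - p_brems) * dt / (1.5 * (n_i + n_e) * KEV)
        T = max(T + dT, 0.1)
        t += dt
        hist.append((t, T))
    return hist

if __name__ == "__main__":
    trace = burn()
    print("steps:", len(trace), "final T [keV]:", round(trace[-1][1], 2))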
Multimodal Neuroimaging: Basic Concepts and Classification of Neuropsychiatric Diseases.
Tulay, Emine Elif; Metin, Barış; Tarhan, Nevzat; Arıkan, Mehmet Kemal
2018-06-01
Neuroimaging techniques are widely used in neuroscience to visualize neural activity, to improve our understanding of brain mechanisms, and to identify biomarkers, especially for psychiatric diseases; however, each neuroimaging technique has several limitations. These limitations led to the development of multimodal neuroimaging (MN), which combines data obtained from multiple neuroimaging techniques, such as electroencephalography and functional magnetic resonance imaging, and yields more detailed information about brain dynamics. There are several types of MN, including visual inspection, data integration, and data fusion. This literature review aimed to provide a brief summary and basic information about MN techniques (data fusion approaches in particular) and classification approaches. Data fusion approaches are generally categorized as asymmetric and symmetric. The present review focused exclusively on studies based on symmetric data fusion methods (data-driven methods), such as independent component analysis and principal component analysis. Machine learning techniques have recently been introduced for use in identifying diseases and biomarkers of disease. The machine learning technique most widely used by neuroscientists is classification, especially support vector machine classification. Several studies differentiated between patients with psychiatric diseases and healthy controls using combined datasets. The common conclusion among these studies is that the prediction of diseases improves when data are combined via MN techniques; however, there remain a few challenges associated with MN, such as sample size. Perhaps in the future N-way fusion can be used to combine multiple neuroimaging techniques or nonimaging predictors (e.g., cognitive ability) to overcome the limitations of MN.
A CCA+ICA based model for multi-task brain imaging data fusion and its application to schizophrenia.
Sui, Jing; Adali, Tülay; Pearlson, Godfrey; Yang, Honghui; Sponheim, Scott R; White, Tonya; Calhoun, Vince D
2010-05-15
Collection of multiple-task brain imaging data from the same subject has now become common practice in medical imaging studies. In this paper, we propose a simple yet effective model, "CCA+ICA", as a powerful tool for multi-task data fusion. This joint blind source separation (BSS) model takes advantage of two multivariate methods, canonical correlation analysis and independent component analysis, to achieve high estimation accuracy and to provide the correct connection between two datasets in which sources can have either common or distinct between-dataset correlation. In both simulated and real fMRI applications, we compare the proposed scheme with other joint BSS models and examine the different modeling assumptions. The contrast images of two tasks, sensorimotor (SM) and Sternberg working memory (SB), derived from a general linear model (GLM), were chosen to provide real multi-task fMRI data, both of which were collected from 50 schizophrenia patients and 50 healthy controls. When examining the relationship with duration of illness, CCA+ICA revealed a significant negative correlation with temporal lobe activation. Furthermore, CCA+ICA located sensorimotor cortex as the group-discriminative region for both tasks and identified the superior temporal gyrus in SM and prefrontal cortex in SB as task-specific group-discriminative brain networks. In summary, we compared the new approach to several competitive methods with different assumptions, and found consistent results regarding each of their hypotheses on connecting the two tasks. Such an approach fills a gap in existing multivariate methods for identifying biomarkers from brain imaging data.
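A schematic of the CCA+ICA idea, written against standard library routines and random placeholder data rather than the authors' implementation, might look like this:

import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
X_task1 = rng.standard_normal((100, 500))   # subjects x voxels, task 1 (placeholder data)
X_task2 = rng.standard_normal((100, 500))   # subjects x voxels, task 2 (placeholder data)

# Step 1: CCA links the two datasets through correlated canonical variates.
n_comp = 8
cca = CCA(n_components=n_comp)
A1, A2 = cca.fit_transform(X_task1, X_task2)        # subject-level variates per task

# Step 2: ICA on the stacked canonical variates sharpens source independence.
ica = FastICA(n_components=n_comp, random_state=0)
S_joint = ica.fit_transform(np.vstack([A1, A2]))    # joint independent sources

# Back-project to spatial maps for each task (least squares, schematic only).
maps1, *_ = np.linalg.lstsq(S_joint[:100], X_task1, rcond=None)
maps2, *_ = np.linalg.lstsq(S_joint[100:], X_task2, rcond=None)
print(maps1.shape, maps2.shape)                     # (components, voxels) per task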
Lappala, Anna; Nishima, Wataru; Miner, Jacob; Fenimore, Paul; Fischer, Will; Hraber, Peter; Zhang, Ming; McMahon, Benjamin; Tung, Chang-Shung
2018-05-10
Membrane fusion proteins are responsible for viral entry into host cells—a crucial first step in viral infection. These proteins undergo large conformational changes from pre-fusion to fusion-initiation structures, and, despite differences in viral genomes and disease etiology, many fusion proteins are arranged as trimers. Structural information for both pre-fusion and fusion-initiation states is critical for understanding virus neutralization by the host immune system. In the case of Ebola virus glycoprotein (EBOV GP) and Zika virus envelope protein (ZIKV E), pre-fusion state structures have been identified experimentally, but only partial structures of fusion-initiation states have been described. While the fusion-initiation structure is in an energetically unfavorable state that is difficult to solve experimentally, the existing structural information combined with computational approaches enabled the modeling of fusion-initiation state structures of both proteins. These structural models provide an improved understanding of four different neutralizing antibodies in the prevention of viral host entry.
Towards a Unified Approach to Information Integration - A review paper on data/information fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Whitney, Paul D.; Posse, Christian; Lei, Xingye C.
2005-10-14
Fusion of information or data from different sources is ubiquitous in many applications, from epidemiology, medical, biological, political, and intelligence to military applications. Data fusion involves integration of spectral, imaging, text, and many other sensor data. For example, in epidemiology, information is often obtained based on many studies conducted by different researchers in different regions with different protocols. In the medical field, the diagnosis of a disease is often based on imaging (MRI, X-Ray, CT), clinical examination, and lab results. In the biological field, information is obtained based on studies conducted on many different species. In the military field, information is obtained based on data from radar sensors, text messages, chemical-biological sensors, acoustic sensors, optical warning systems, and many other sources. Many methodologies are used in the data integration process, from classical and Bayesian to evidence-based expert systems. The implementation of the data integration ranges from pure software design to a mixture of software and hardware. In this review we summarize the methodologies and implementations of the data fusion process, and illustrate in more detail the methodologies involved in three examples. We propose a unified multi-stage and multi-path mapping approach to the data fusion process, and point out future prospects and challenges.
Spatial resolution enhancement of satellite image data using fusion approach
NASA Astrophysics Data System (ADS)
Lestiana, H.; Sukristiyanti
2018-02-01
Object identification using remote sensing data is problematic when the spatial resolution does not match the object of interest. The fusion approach is one method to solve this problem, improving object recognition and enriching object information by combining data from multiple sensors. Image fusion can be used to estimate environmental components that need to be monitored from multiple views, such as evapotranspiration estimation, 3D ground-based characterisation, smart city applications, urban environments, terrestrial mapping, and water vegetation. In reported fusion applications, visible objects on land are easily recognized, and the variety of object information on land has increased the range of environmental components that can be estimated. The difficulty of recognizing invisible objects such as Submarine Groundwater Discharge (SGD), especially in tropical areas, might be reduced by the fusion method; the small variation of such objects in sea surface temperature remains a challenge to be solved.
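As an illustration of fusion-based resolution enhancement, the sketch below applies a simple Brovey-style intensity substitution to synthetic arrays; the band count, array sizes, and inputs are placeholders, and an operational SGD study would require real thermal and high-resolution imagery.

import numpy as np

def brovey_sharpen(ms, pan):
    """Brovey-style fusion: rescale each low-resolution band by a high-resolution
    panchromatic image; `ms` is (bands, H, W) already resampled to the pan grid."""
    intensity = ms.mean(axis=0) + 1e-6               # avoid division by zero
    return ms * (pan / intensity)[None, :, :]

# Placeholder data: a 3-band multispectral image upsampled to a 64x64 pan grid.
rng = np.random.default_rng(1)
ms_up = rng.random((3, 64, 64))
pan = rng.random((64, 64))
sharp = brovey_sharpen(ms_up, pan)
print(sharp.shape)                                   # fused product at pan resolution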
NASA Astrophysics Data System (ADS)
McMullen, Sonya A. H.; Henderson, Troy; Ison, David
2017-05-01
The miniaturization of unmanned systems and spacecraft, as well as computing and sensor technologies, has opened new opportunities in the areas of remote sensing and multi-sensor data fusion for a variety of applications. Remote sensing and data fusion historically have been the purview of large government organizations, such as the Department of Defense (DoD), National Aeronautics and Space Administration (NASA), and National Geospatial-Intelligence Agency (NGA), due to the high cost and complexity of developing, fielding, and operating such systems. However, miniaturized computers with high-capacity processing capabilities, small and affordable sensors, and emerging, commercially available platforms such as UAS and CubeSats to carry such sensors have allowed for a vast range of novel applications. In order to leverage these developments, Embry-Riddle Aeronautical University (ERAU) has developed an advanced sensor and data fusion laboratory to research component capabilities and their employment on a wide range of autonomous, robotic, and transportation systems. This lab is unique in several ways; for example, it provides a traditional campus laboratory for students and faculty to model and test sensors in a range of scenarios, process multi-sensor data sets (both simulated and experimental), and analyze results. Moreover, it provides a "virtual" modeling, testing, and teaching capability reaching beyond the physical confines of the facility for use among ERAU Worldwide students and faculty located around the globe. Although other institutions such as Georgia Institute of Technology, Lockheed Martin, University of Dayton, and University of Central Florida have optical sensor laboratories, the ERAU virtual concept is the first such lab to expand to multispectral sensors and data fusion while focusing on data collection and data products rather than on manufacturing. Further, the initiative is a unique effort among Embry-Riddle faculty to develop multi-disciplinary, cross-campus collaboration that facilitates faculty- and student-driven research. Specifically, the ERAU Worldwide Campus, with locations across the globe and delivering curricula online, will be leveraged to provide novel approaches to remote sensor experimentation and simulation. The purpose of this paper and presentation is to present this new laboratory and its research, education, and collaboration process.
Joint Data Management for MOVINT Data-to-Decision Making
2011-07-01
flux tensor, aligned motion history images, and related approaches have been shown to be versatile approaches [12, 16, 17, 18]. Scaling these ... methods include voting, neural networks, fuzzy logic, neuro-dynamic programming, support vector machines, Bayesian and Dempster-Shafer methods. One way ... Information Fusion, 2010. [16] F. Bunyak, K. Palaniappan, S. K. Nath, G. Seetharaman, "Flux tensor constrained geodesic active contours with sensor fusion
NASA Astrophysics Data System (ADS)
Li, Yun; Zhang, Ji; Li, Tao; Liu, Honggao; Li, Jieqing; Wang, Yuanzhong
2017-04-01
In this work, the data fusion strategy of Fourier transform mid-infrared (FT-MIR) spectroscopy and inductively coupled plasma-atomic emission spectrometry (ICP-AES) was used in combination with Support Vector Machine (SVM) classification to determine the geographic origin of Boletus edulis collected from nine regions of Yunnan Province in China. Firstly, competitive adaptive reweighted sampling (CARS) was used to select an optimal combination of key wavenumbers of the second-derivative FT-MIR spectra, and thirteen elements were ranked by variable importance in projection (VIP) scores. Secondly, thirteen multi-element subsets with the best VIP scores were generated and each subset was fused with FT-MIR. Finally, classification models were established by SVM, and the combination of the parameters C and γ (gamma) of the SVM models was determined by grid search (GS) and genetic algorithm (GA) approaches. The results showed that both the GS-SVM and GA-SVM models achieved good performance based on the #9 subset, and the prediction accuracies of the two models in the calibration and validation sets were 81.40% and 90.91%, respectively. In conclusion, the results indicate that the data fusion strategy of FT-MIR and ICP-AES coupled with the SVM algorithm can be used as a reliable tool for accurate identification of B. edulis and can provide a useful approach for the quality control of edible mushrooms.
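A minimal sketch of the low-level fusion and grid-search steps is given below, assuming the CARS-selected FT-MIR variables and ICP-AES element concentrations are already available as arrays; all data and parameter grids here are placeholders rather than the authors' dataset or settings.

import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X_mir = rng.random((180, 60))      # CARS-selected FT-MIR wavenumbers (placeholder)
X_icp = rng.random((180, 9))       # multi-element subset from ICP-AES (placeholder)
y = rng.integers(0, 9, 180)        # geographic-origin labels for nine regions

# Low-level data fusion: concatenate the two blocks feature-wise.
X_fused = np.hstack([X_mir, X_icp])
X_tr, X_te, y_tr, y_te = train_test_split(X_fused, y, test_size=0.3, random_state=0)

# Grid search over C and gamma of an RBF SVM (the GS-SVM step of the workflow).
grid = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    param_grid={"svc__C": [1, 10, 100], "svc__gamma": [1e-3, 1e-2, 1e-1]},
    cv=5,
)
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_, "test accuracy:", grid.score(X_te, y_te))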
Comparing fusion techniques for the ImageCLEF 2013 medical case retrieval task.
G Seco de Herrera, Alba; Schaer, Roger; Markonis, Dimitrios; Müller, Henning
2015-01-01
Retrieval systems can supply similar cases with a proven diagnosis to a new example case under observation to help clinicians during their work. The ImageCLEFmed evaluation campaign proposes a framework where research groups can compare case-based retrieval approaches. This paper focuses on the case-based task and adds results of the compound figure separation and modality classification tasks. Several fusion approaches are compared to identify the approaches best adapted to the heterogeneous data of the task. Fusion of visual and textual features is analyzed, demonstrating that the selection of the fusion strategy can improve the best performance on the case-based retrieval task. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Kou, Leilei; Wang, Zhuihui; Xu, Fen
2018-03-01
The spaceborne precipitation radar onboard the Tropical Rainfall Measuring Mission satellite (TRMM PR) provides good measurement of the vertical structure of reflectivity, while ground radar (GR) has relatively high horizontal resolution and greater sensitivity. Fusion of TRMM PR and GR reflectivity data may maximize the advantages of both instruments. In this paper, TRMM PR and GR reflectivity data are fused using a neural network (NN)-based approach. The main steps are: quality control of TRMM PR and GR reflectivity data; spatiotemporal matchup; GR calibration bias correction; conversion of TRMM PR data from Ku to S band; fusion of TRMM PR and GR reflectivity data with an NN method; interpolation of reflectivity data that are below PR's sensitivity; blind-area compensation with a distance-weighting-based merging approach; and combination of the three resulting data types: NN-fused data, data below PR's sensitivity, and data within the compensated blind areas. During the NN fusion step, the TRMM PR data are taken as targets for training the NNs, and gridded GR data after horizontal downsampling at different heights are used as the input. The trained NNs are then used to obtain 3D high-resolution reflectivity from the original gridded GR data. After 3D fusion of the TRMM PR and GR reflectivity data, a more complete and finer-scale 3D radar reflectivity dataset incorporating characteristics from both the TRMM PR and GR observations can be obtained. The fused reflectivity data are evaluated on a convective precipitation event through comparison with high-resolution TRMM PR and GR data obtained with an interpolation algorithm.
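The NN fusion step can be pictured as a small regression network that maps downsampled ground-radar columns to spaceborne-radar reflectivity; the sketch below uses synthetic dBZ values, an arbitrary network size, and invented grid dimensions purely for illustration.

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Placeholder training pairs: GR reflectivity sampled at several heights (inputs)
# versus matched TRMM PR reflectivity for the same column (target), both in dBZ.
X_gr = 20 + 15 * rng.random((2000, 8))                  # 8 height levels per column
y_pr = X_gr.mean(axis=1) + rng.normal(0, 1, 2000)       # synthetic "PR" target

net = MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=1000, random_state=0)
net.fit(X_gr, y_pr)

# Apply the trained network to the full-resolution GR grid to get fused reflectivity.
X_full = 20 + 15 * rng.random((50000, 8))
z_fused = net.predict(X_full)
print(z_fused[:5])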
Classification Accuracy Increase Using Multisensor Data Fusion
NASA Astrophysics Data System (ADS)
Makarau, A.; Palubinskas, G.; Reinartz, P.
2011-09-01
The practical use of very high resolution visible and near-infrared (VNIR) data is still growing (IKONOS, Quickbird, GeoEye-1, etc.), but for classification purposes the number of bands is limited in comparison to full spectral imaging. These limitations may lead to confusion between materials such as different roofs, pavements, and roads, and therefore to misinterpretation and misuse of classification products. Employment of hyperspectral data is another solution, but their low spatial resolution (compared to multispectral data) restricts their usage in many applications. Another improvement can be achieved by fusion of multisensor data, since this may increase the quality of scene classification. Integration of Synthetic Aperture Radar (SAR) and optical data is widely performed for automatic classification, interpretation, and change detection. In this paper we present an approach for fusing very high resolution SAR and multispectral data for automatic classification in urban areas. Single-polarization TerraSAR-X (SpotLight mode) and multispectral data are integrated using the INFOFUSE framework, consisting of feature extraction (information fission), unsupervised clustering (data representation on a finite domain and dimensionality reduction), and data aggregation (Bayesian or neural network). This framework allows multisource data to be combined in a manner consistent with consensus theory. The classification is not influenced by the limitations of dimensionality, and the calculation complexity primarily depends on the dimensionality reduction step. Fusion of single-polarization TerraSAR-X, WorldView-2 (VNIR or full set), and Digital Surface Model (DSM) data allows different types of urban objects to be classified into predefined classes of interest with increased accuracy. A comparison with classification results for WorldView-2 multispectral data (8 spectral bands) is provided, and numerical evaluation of the method against other established methods illustrates the advantage in classification accuracy for many classes such as buildings, low vegetation, sports facilities, forest, roads, railroads, etc.
Remote Sensing Data Visualization, Fusion and Analysis via Giovanni
NASA Technical Reports Server (NTRS)
Leptoukh, G.; Zubko, V.; Gopalan, A.; Khayat, M.
2007-01-01
We describe Giovanni, the NASA Goddard-developed online visualization and analysis tool that allows users to explore various phenomena without learning remote sensing data formats or downloading voluminous data. Using MODIS aerosol data as an example, we formulate a data fusion approach for Giovanni to further enrich online multi-sensor remote sensing data comparison and analysis.
2013-09-01
model and the BRDF in the SRP model are not consistent with each other, then the resulting estimated albedo-areas and mass are inaccurate and biased ... This work studies the use of physically consistent BRDF-SRP models for mass estimation. Simulation studies are used to provide an indication of the ... benefits of using these new models. An unscented Kalman filter approach that includes BRDF and mass parameters in the state vector is used. The
Collaborative classification of hyperspectral and visible images with convolutional neural network
NASA Astrophysics Data System (ADS)
Zhang, Mengmeng; Li, Wei; Du, Qian
2017-10-01
Recent advances in remote sensing technology have made multisensor data available for the same area, and it is well-known that remote sensing data processing and analysis often benefit from multisource data fusion. Specifically, low spatial resolution of hyperspectral imagery (HSI) degrades the quality of the subsequent classification task while using visible (VIS) images with high spatial resolution enables high-fidelity spatial analysis. A collaborative classification framework is proposed to fuse HSI and VIS images for finer classification. First, the convolutional neural network model is employed to extract deep spectral features for HSI classification. Second, effective binarized statistical image features are learned as contextual basis vectors for the high-resolution VIS image, followed by a classifier. The proposed approach employs diversified data in a decision fusion, leading to an integration of the rich spectral information, spatial information, and statistical representation information. In particular, the proposed approach eliminates the potential problems of the curse of dimensionality and excessive computation time. The experiments evaluated on two standard data sets demonstrate better classification performance offered by this framework.
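The decision-fusion stage of such a framework can be sketched as a weighted combination of class-probability outputs from the two branches; the simple classifiers, weights, and synthetic features below stand in for the CNN and BSIF-based models described in the paper.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
y = rng.integers(0, 4, 300)                            # 4 land-cover classes (placeholder)
X_hsi = rng.random((300, 30)) + 0.05 * y[:, None]      # stand-in for deep spectral features
X_vis = rng.random((300, 10)) + 0.05 * y[:, None]      # stand-in for BSIF spatial features

clf_hsi = LogisticRegression(max_iter=1000).fit(X_hsi, y)
clf_vis = LogisticRegression(max_iter=1000).fit(X_vis, y)

# Decision-level fusion: weight and sum the per-class probabilities of both branches.
w_hsi, w_vis = 0.6, 0.4
proba = w_hsi * clf_hsi.predict_proba(X_hsi) + w_vis * clf_vis.predict_proba(X_vis)
y_fused = proba.argmax(axis=1)
print("fused training accuracy:", (y_fused == y).mean())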
An Improved Evidential-IOWA Sensor Data Fusion Approach in Fault Diagnosis
Zhou, Deyun; Zhuang, Miaoyan; Fang, Xueyi; Xie, Chunhe
2017-01-01
As an important tool of information fusion, Dempster–Shafer evidence theory is widely applied in handling uncertain information in fault diagnosis. However, an incorrect result may be obtained if the combined evidence is highly conflicting, which may lead to failure in locating the fault. To deal with this problem, an improved evidential-Induced Ordered Weighted Averaging (IOWA) sensor data fusion approach is proposed within the framework of Dempster–Shafer evidence theory. In the new method, the IOWA operator is used to determine the weight of each sensor data source; when determining the parameters of the IOWA, both the distance of evidence and the belief entropy are taken into consideration. First, based on the global distance of evidence and the global belief entropy, the α value of the IOWA is obtained. Simultaneously, a weight vector is given based on the maximum entropy method model. Then, according to the IOWA operator, the evidence is modified before applying Dempster's combination rule. The proposed method performs better in conflict management and fault diagnosis because the information volume of each body of evidence is taken into consideration. A numerical example and a case study in fault diagnosis are presented to show the rationality and efficiency of the proposed method. PMID:28927017
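For reference, Dempster's combination rule itself is compact; the sketch below combines two mass functions over a small frame of discernment (the fault labels and mass values are illustrative, and the IOWA-based evidence modification proposed in the paper is not reproduced here).

from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dictionaries and
    return the normalized combined masses plus the conflict coefficient."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb
    k = 1.0 - conflict
    return {s: v / k for s, v in combined.items()}, conflict

F1, F2 = frozenset({"fault_1"}), frozenset({"fault_2"})
m_sensor1 = {F1: 0.7, F2: 0.2, F1 | F2: 0.1}   # illustrative sensor reports
m_sensor2 = {F1: 0.6, F2: 0.3, F1 | F2: 0.1}
fused, conflict = dempster_combine(m_sensor1, m_sensor2)
print(fused, "conflict:", conflict)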
Base of the Measles Virus Fusion Trimer Head Receives the Signal That Triggers Membrane Fusion*
Apte-Sengupta, Swapna; Negi, Surendra; Leonard, Vincent H. J.; Oezguen, Numan; Navaratnarajah, Chanakha K.; Braun, Werner; Cattaneo, Roberto
2012-01-01
The measles virus (MV) fusion (F) protein trimer executes membrane fusion after receiving a signal elicited by receptor binding to the hemagglutinin (H) tetramer. Where and how this signal is received is understood neither for MV nor for other paramyxoviruses. Because only the prefusion structure of the parainfluenza virus 5 (PIV5) F-trimer is available, to study signal receipt by the MV F-trimer, we generated and energy-refined a homology model. We used two approaches to predict surface residues of the model interacting with other proteins. Both approaches measured interface propensity values for patches of residues. The second approach identified, in addition, individual residues based on the conservation of physical chemical properties among F-proteins. Altogether, about 50 candidate interactive residues were identified. Through iterative cycles of mutagenesis and functional analysis, we characterized six residues that are required specifically for signal transmission; their mutation interferes with fusion, although still allowing efficient F-protein processing and cell surface transport. One residue is located adjacent to the fusion peptide, four line a cavity in the base of the F-trimer head, while the sixth residue is located near this cavity. Hydrophobic interactions in the cavity sustain the fusion process and contacts with H. The cavity is flanked by two different subunits of the F-trimer. Tetrameric H-stalks may be lodged in apposed cavities of two F-trimers. Because these insights are based on a PIV5 homology model, the signal receipt mechanism may be conserved among paramyxoviruses. PMID:22859308
NASA Astrophysics Data System (ADS)
Peigney, B. E.; Larroche, O.; Tikhonchuk, V.
2014-12-01
In this article, we study the hydrodynamics and burn of the thermonuclear fuel in inertial confinement fusion pellets at the ion kinetic level. The analysis is based on a two-velocity-scale Vlasov-Fokker-Planck kinetic model that is specially tailored to treat fusion products (suprathermal α-particles) in a self-consistent manner with the thermal bulk. The model assumes spherical symmetry in configuration space and axial symmetry in velocity space around the mean flow velocity. A typical hot-spot ignition design is considered. Compared with fluid simulations where a multi-group diffusion scheme is applied to model α transport, the full ion-kinetic approach reveals significant non-local effects on the transport of energetic α-particles. This has a direct impact on hydrodynamic spatial profiles during combustion: the hot spot reactivity is reduced, while the inner dense fuel layers are pre-heated by the escaping α-suprathermal particles, which are transported farther out of the hot spot. We show how the kinetic transport enhancement of fusion products leads to a significant reduction of the fusion yield.
Automotive Radar and Lidar Systems for Next Generation Driver Assistance Functions
NASA Astrophysics Data System (ADS)
Rasshofer, R. H.; Gresser, K.
2005-05-01
Automotive radar and lidar sensors represent key components for next generation driver assistance functions (Jones, 2001). Today, their use is limited to comfort applications in premium segment vehicles, although an evolution towards more safety-oriented functions is taking place. Radar sensors available on the market today suffer from low angular resolution and poor target detection in medium ranges (30 to 60 m) over azimuth angles larger than ±30°. In contrast, lidar sensors show high sensitivity to environmental influences (e.g. snow, fog, dirt). Both sensor technologies today have a rather high cost level, preventing their widespread use in mass markets. A common approach to overcoming individual sensor drawbacks is the employment of data fusion techniques (Bar-Shalom, 2001). Raw data fusion requires a common, standardized data interface to easily integrate a variety of asynchronous sensor data into a fusion network. Moreover, next generation sensors should be able to dynamically adapt to new situations and should have the ability to work in cooperative sensor environments. As vehicular function development today is being shifted more and more towards virtual prototyping, mathematical sensor models should be available. These models should take into account the sensor's functional principle as well as all typical measurement errors generated by the sensor.
Predicting individual fusional range from optometric data
NASA Astrophysics Data System (ADS)
Endrikhovski, Serguei; Jin, Elaine; Miller, Michael E.; Ford, Robert W.
2005-03-01
A model was developed to predict the range of disparities that can be fused by an individual user from optometric measurements. This model uses parameters, such as dissociated phoria and fusional reserves, to calculate an individual user's fusional range (i.e., the disparities that can be fused on stereoscopic displays) when the user views a stereoscopic stimulus from various distances. This model is validated by comparing its output with data from a study in which the individual fusional range of a group of users was quantified while they viewed a stereoscopic display from distances of 0.5, 1.0, and 2.0 meters. Overall, the model provides good data predictions for the majority of the subjects and can be generalized for other viewing conditions. The model may, therefore, be used within a customized stereoscopic system, which would render stereoscopic information in a way that accounts for the individual differences in fusional range. Because the comfort of an individual user also depends on the user's ability to fuse stereo images, such a system may, consequently, improve the comfort level and viewing experience for people with different stereoscopic fusional capabilities.
NASA Technical Reports Server (NTRS)
Foyle, David C.
1993-01-01
Based on existing integration models in the psychological literature, an evaluation framework is developed to assess sensor fusion displays as might be implemented in an enhanced/synthetic vision system. The proposed evaluation framework for evaluating the operator's ability to use such systems is a normative approach: The pilot's performance with the sensor fusion image is compared to models' predictions based on the pilot's performance when viewing the original component sensor images prior to fusion. This allows for the determination as to when a sensor fusion system leads to: poorer performance than one of the original sensor displays, clearly an undesirable system in which the fused sensor system causes some distortion or interference; better performance than with either single sensor system alone, but at a sub-optimal level compared to model predictions; optimal performance compared to model predictions; or, super-optimal performance, which may occur if the operator were able to use some highly diagnostic 'emergent features' in the sensor fusion display, which were unavailable in the original sensor displays.
A phase field approach for multicellular aggregate fusion in biofabrication.
Yang, Xiaofeng; Sun, Yi; Wang, Qi
2013-07-01
We present a modeling and computational approach to study fusion of multicellular aggregates during tissue and organ fabrication, which forms the foundation for the scaffold-less biofabrication of tissues and organs known as bioprinting. In this phase field method, multicellular aggregates are modeled as mixtures of multiphase complex fluids whose phase mixing or separation is governed by interphase force interactions, mimicking cell-cell interaction within the multicellular aggregates and intermediate-range interaction mediated by the surrounding hydrogel. Material transport in the mixture is dictated by hydrodynamics as well as by forces due to the interphase interactions. The current model considers a multicellular aggregate system with a fixed number of cells and a fixed amount of hydrogel medium; the effects of cell differentiation, proliferation, and death are neglected (although they can readily be included), and the interaction between components is dictated by the interaction energies between cell and cell and between cell and medium particles, respectively. The modeling approach is applicable to transient simulations of fusion of cellular aggregate systems at the time and length scales appropriate to biofabrication. Numerical experiments are presented to demonstrate fusion and cell sorting during tissue and organ maturation processes in biofabrication.
Screening effects on 12C+12C fusion reaction
NASA Astrophysics Data System (ADS)
Koyuncu, F.; Soylu, A.
2018-05-01
One of the important reactions for nucleosynthesis in the carbon burning phase in high-mass stars is the 12C+12C fusion reaction. In this study, we investigate the influences of the nuclear potentials and screening effect on astrophysically interesting 12C+12C fusion reaction observables at sub-barrier energies by using the microscopic α–α double folding cluster (DFC) potential and the proximity potential. In order to model the screening effects on the experimental data, a more general exponential cosine screened Coulomb (MGECSC) potential including Debye and quantum plasma cases has been considered in the calculations for the 12C+12C fusion reaction. In the calculations of the reaction observables, the semi-classical Wentzel-Kramers-Brillouin (WKB) approach and coupled channel (CC) formalism have been used. Moreover, in order to investigate how the potentials between 12C nuclei produce molecular cluster states of 24Mg, the normalized resonant energy states of 24Mg cluster bands have been calculated for the DFC potential. By analyzing the results produced from the fusion of 12C+12C, it is found that taking into account the screening effects in terms of MGECSC is important for explaining the 12C+12C fusion data, and the microscopic DFC potential is better than the proximity potential in explaining the experimental data, also considering that clustering is dominant for the structure of the 24Mg nucleus. Supported by the Turkish Science and Research Council (TÜBİTAK) with (117R015)
Lu, Mengxiao; Gantz, Donald L.; Herscovitz, Haya; Gursky, Olga
2012-01-01
Fusion of modified LDL in the arterial wall promotes atherogenesis. Earlier we showed that thermal denaturation mimics LDL remodeling and fusion, and revealed kinetic origin of LDL stability. Here we report the first quantitative analysis of LDL thermal stability. Turbidity data show sigmoidal kinetics of LDL heat denaturation, which is unique among lipoproteins, suggesting that fusion is preceded by other structural changes. High activation energy of denaturation, Ea = 100 ± 8 kcal/mol, indicates disruption of extensive packing interactions in LDL. Size-exclusion chromatography, nondenaturing gel electrophoresis, and negative-stain electron microscopy suggest that LDL dimerization is an early step in thermally induced fusion. Monoclonal antibody binding suggests possible involvement of apoB N-terminal domain in early stages of LDL fusion. LDL fusion accelerates at pH < 7, which may contribute to LDL retention in acidic atherosclerotic lesions. Fusion also accelerates upon increasing LDL concentration in near-physiologic range, which likely contributes to atherogenesis. Thermal stability of LDL decreases with increasing particle size, indicating that the pro-atherogenic properties of small dense LDL do not result from their enhanced fusion. Our work provides the first kinetic approach to measuring LDL stability and suggests that lipid-lowering therapies that reduce LDL concentration but increase the particle size may have opposite effects on LDL fusion. PMID:22855737
A Hybrid Sensing Approach for Pure and Adulterated Honey Classification
Subari, Norazian; Saleh, Junita Mohamad; Shakaff, Ali Yeon Md; Zakaria, Ammar
2012-01-01
This paper presents a comparison between single-modality data and fusion methods to classify Tualang honey as pure or adulterated using Linear Discriminant Analysis (LDA) and Principal Component Analysis (PCA) statistical classification approaches. Ten different brands of certified pure Tualang honey were obtained throughout peninsular Malaysia and Sumatera, Indonesia. Various concentrations of two types of sugar solution (beet and cane sugar) were used in this investigation to create honey samples of 20%, 40%, 60% and 80% adulteration concentrations. Honey data extracted from an electronic nose (e-nose) and Fourier Transform Infrared Spectroscopy (FTIR) were gathered, analyzed and compared based on fusion methods. Visual observation of classification plots revealed that the PCA approach was able to distinguish pure from adulterated honey samples better than the LDA technique. Overall, the validated classification results based on FTIR data (88.0%) gave higher classification accuracy than e-nose data (76.5%) using the LDA technique. Honey classification based on normalized low-level and intermediate-level FTIR and e-nose fusion data scored classification accuracies of 92.2% and 88.7%, respectively, using the Stepwise LDA method. The results suggest that pure and adulterated honey samples are better classified using FTIR and e-nose fusion data than single-modality data. PMID:23202033
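A minimal sketch of low-level fusion followed by PCA and LDA, using synthetic stand-ins for the e-nose and FTIR measurements, could look as follows; the normalization choice and array sizes are assumptions for illustration.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 120)                             # 0 = pure, 1 = adulterated (placeholder)
X_enose = rng.random((120, 12)) + 0.3 * y[:, None]      # synthetic e-nose responses
X_ftir = rng.random((120, 200)) + 0.3 * y[:, None]      # synthetic FTIR absorbances

# Low-level fusion: normalize each modality, then concatenate feature-wise.
X_fused = np.hstack([StandardScaler().fit_transform(X_enose),
                     StandardScaler().fit_transform(X_ftir)])

# Unsupervised view (PCA scores) and supervised classification (LDA accuracy).
scores = PCA(n_components=2).fit_transform(X_fused)
acc = cross_val_score(LinearDiscriminantAnalysis(), X_fused, y, cv=5).mean()
print("PCA score shape:", scores.shape, "LDA cross-validated accuracy:", round(acc, 3))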
Data Fusion Analysis for Range Test Validation System
2010-07-14
simulants were released during the RTVS '08 test series: triethyl phosphate (TEP), methyl salicylate (MeS), and acetic acid (AA). A total of 29 release ... the combination of a grid of point sensors at ground level and a standoff FTIR system monitoring above ground areas proved effective in detecting the ... presence of simulants over the test grid. A Dempster-Shafer approach for data fusion was selected as the most effective strategy for RTVS data fusion
Causal inference and the data-fusion problem
Bareinboim, Elias; Pearl, Judea
2016-01-01
We review concepts, principles, and tools that unify current approaches to causal analysis and attend to new challenges presented by big data. In particular, we address the problem of data fusion—piecing together multiple datasets collected under heterogeneous conditions (i.e., different populations, regimes, and sampling methods) to obtain valid answers to queries of interest. The availability of multiple heterogeneous datasets presents new opportunities to big data analysts, because the knowledge that can be acquired from combined data would not be possible from any individual source alone. However, the biases that emerge in heterogeneous environments require new analytical tools. Some of these biases, including confounding, sampling selection, and cross-population biases, have been addressed in isolation, largely in restricted parametric models. We here present a general, nonparametric framework for handling these biases and, ultimately, a theoretical solution to the problem of data fusion in causal inference tasks. PMID:27382148
Data fusion in cyber security: first order entity extraction from common cyber data
NASA Astrophysics Data System (ADS)
Giacobe, Nicklaus A.
2012-06-01
The Joint Directors of Labs Data Fusion Process Model (JDL Model) provides a framework for how to handle sensor data to develop higher levels of inference in a complex environment. Beginning from a call to leverage data fusion techniques in intrusion detection, there have been a number of advances in the use of data fusion algorithms in this subdomain of cyber security. While it is tempting to jump directly to situation-level or threat-level refinement (levels 2 and 3) for more exciting inferences, a proper fusion process starts with lower levels of fusion in order to provide a basis for the higher fusion levels. The process begins with first order entity extraction, or the identification of important entities represented in the sensor data stream. Current cyber security operational tools and their associated data are explored for potential exploitation, identifying the first order entities that exist in the data and the properties of these entities that are described by the data. Cyber events that are represented in the data stream are added to the first order entities as their properties. This work explores typical cyber security data and the inferences that can be made at the lower fusion levels (0 and 1) with simple metrics. Depending on the types of events that are expected by the analyst, these relatively simple metrics can provide insight on their own, or could be used in fusion algorithms as a basis for higher levels of inference.
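As a concrete example of the low-level processing described, first-order entities such as hosts and accounts can be pulled from a raw log stream with simple patterns and summarized with basic counts; the log format and field names below are invented for illustration.

import re
from collections import Counter

LOG_LINES = [
    "2012-06-01T12:00:01 sshd failed login user=alice src=10.0.0.5",
    "2012-06-01T12:00:03 sshd failed login user=alice src=10.0.0.5",
    "2012-06-01T12:00:09 sshd accepted login user=bob src=192.168.1.7",
]

IP_RE = re.compile(r"src=(\d{1,3}(?:\.\d{1,3}){3})")
USER_RE = re.compile(r"user=(\w+)")

entities = []
for line in LOG_LINES:
    ip, user = IP_RE.search(line), USER_RE.search(line)
    if ip and user:
        # First-order entities (host, account) with the observed event as a property.
        entities.append({"host": ip.group(1), "account": user.group(1),
                         "event": "failed" if "failed" in line else "accepted"})

# A simple level-1 metric: failed-login count per host.
failed_per_host = Counter(e["host"] for e in entities if e["event"] == "failed")
print(failed_per_host)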
Near-real-time data fusion, phase 2
DOT National Transportation Integrated Search
1999-10-01
Report developed under SBIR contract for topic N93-084. In Phase I of this project, we explored several different approaches to Near-Real-Time Data Fusion (NRTDF), and in Phase II we developed the most promising architecture into a prototype NRTDF sy...
Spatial Statistical Data Fusion (SSDF)
NASA Technical Reports Server (NTRS)
Braverman, Amy J.; Nguyen, Hai M.; Cressie, Noel
2013-01-01
As remote sensing for scientific purposes has transitioned from an experimental technology to an operational one, the selection of instruments has become more coordinated, so that the scientific community can exploit complementary measurements. However, technological and scientific heterogeneity across devices means that the statistical characteristics of the data they collect are different. The challenge addressed here is how to combine heterogeneous remote sensing data sets in a way that yields optimal statistical estimates of the underlying geophysical field, and provides rigorous uncertainty measures for those estimates. Different remote sensing data sets may have different spatial resolutions, different measurement error biases and variances, and other disparate characteristics. A state-of-the-art spatial statistical model was used to relate the true, but not directly observed, geophysical field to noisy, spatial aggregates observed by remote sensing instruments. The spatial covariances of the true field and the covariances of the true field with the observations were modeled. The observations are spatial averages of the true field values, over pixels, with different measurement noise superimposed. A kriging framework is used to infer optimal (minimum mean squared error and unbiased) estimates of the true field at point locations from pixel-level, noisy observations. A key feature of the spatial statistical model is the spatial mixed effects model that underlies it. The approach models the spatial covariance function of the underlying field using linear combinations of basis functions of fixed size. Approaches based on kriging require the inversion of very large spatial covariance matrices, and this is usually done by making simplifying assumptions about spatial covariance structure that simply do not hold for geophysical variables. In contrast, this method does not require these assumptions, and is also computationally much faster. This method is fundamentally different from other approaches to data fusion for remote sensing data because it is inferential rather than merely descriptive. All approaches combine data in a way that minimizes some specified loss function. Most of these are more or less ad hoc criteria based on what looks good to the eye, or some criteria that relate only to the data at hand.
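A toy one-dimensional version of a basis-function (low-rank) kriging predictor fusing two simulated instruments with different footprints and noise levels is sketched below; the basis functions, covariance parameters, and instrument characteristics are illustrative assumptions, not the model used in the cited work.

import numpy as np

rng = np.random.default_rng(0)
s_grid = np.linspace(0, 1, 200)                      # prediction locations

# Low-rank spatial model: field(s) = Phi(s) @ eta, with eta ~ N(0, K).
centers = np.linspace(0, 1, 12)
def basis(s):                                        # Gaussian bumps as basis functions
    return np.exp(-0.5 * ((np.atleast_1d(s)[:, None] - centers) / 0.08) ** 2)
K = np.eye(len(centers))
eta = rng.multivariate_normal(np.zeros(len(centers)), K)
truth = basis(s_grid) @ eta

# Two "instruments": coarse pixels with large noise, fine pixels with small noise.
def pixel_operator(n_pix):                           # averages the basis over each pixel
    edges = np.linspace(0, 1, n_pix + 1)
    return np.vstack([basis(np.linspace(a, b, 20)).mean(axis=0)
                      for a, b in zip(edges[:-1], edges[1:])])
A1, sig1 = pixel_operator(10), 0.30
A2, sig2 = pixel_operator(40), 0.10
y = np.concatenate([A1 @ eta + rng.normal(0, sig1, 10),
                    A2 @ eta + rng.normal(0, sig2, 40)])
A = np.vstack([A1, A2])
R = np.diag([sig1**2] * 10 + [sig2**2] * 40)

# Kriging (minimum mean squared error) prediction of the field at point locations.
G = A @ K @ A.T + R
pred = basis(s_grid) @ K @ A.T @ np.linalg.solve(G, y)
print("RMSE of fused estimate:", np.sqrt(np.mean((pred - truth) ** 2)))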
The role of data fusion in predictive maintenance using digital twin
NASA Astrophysics Data System (ADS)
Liu, Zheng; Meyendorf, Norbert; Mrad, Nezih
2018-04-01
Modern aerospace industry is migrating from reactive to proactive and predictive maintenance to increase platform operational availability and efficiency, extend its useful life cycle and reduce its life cycle cost. Multiphysics modeling together with data-driven analytics generate a new paradigm called "Digital Twin." The digital twin is actually a living model of the physical asset or system, which continually adapts to operational changes based on the collected online data and information, and can forecast the future of the corresponding physical counterpart. This paper reviews the overall framework to develop a digital twin coupled with the industrial Internet of Things technology to advance aerospace platforms autonomy. Data fusion techniques particularly play a significant role in the digital twin framework. The flow of information from raw data to high-level decision making is propelled by sensor-to-sensor, sensor-to-model, and model-to-model fusion. This paper further discusses and identifies the role of data fusion in the digital twin framework for aircraft predictive maintenance.
Unsupervised Metric Fusion Over Multiview Data by Graph Random Walk-Based Cross-View Diffusion.
Wang, Yang; Zhang, Wenjie; Wu, Lin; Lin, Xuemin; Zhao, Xiang
2017-01-01
Learning an ideal metric is crucial to many tasks in computer vision. Diverse feature representations can address this problem from different aspects, as visual data objects described by multiple features can be decomposed into multiple views that often provide complementary information. In this paper, we propose a cross-view fusion algorithm that leads to a similarity metric for multiview data by systematically fusing multiple similarity measures. Unlike existing paradigms, we focus on learning a distance measure by exploiting the graph structure of data samples, where an input similarity matrix can be improved through the propagation of a graph random walk. In particular, we construct multiple graphs, with each one corresponding to an individual view, and a cross-view fusion approach based on graph random walk is presented to derive an optimal distance measure by fusing multiple metrics. Our method is scalable to a large amount of data by enforcing sparsity through an anchor graph representation. To adaptively control the effects of different views, we dynamically learn view-specific coefficients, which are leveraged into the graph random walk to balance the multiple views. However, such a strategy may lead to an over-smooth similarity metric in which affinities between dissimilar samples are enlarged by excessive cross-view fusion. Thus, we adopt a heuristic approach to controlling the number of iterations in the fusion process in order to avoid over-smoothing. Extensive experiments conducted on real-world data sets validate the effectiveness and efficiency of our approach.
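The cross-view diffusion idea can be illustrated on two small similarity matrices: each view's similarity is propagated through the other view's sparsified, row-normalized transition matrix for a limited number of iterations, and the results are averaged. The kernel, sparsification, and iteration count below are illustrative choices, not the paper's exact settings.

import numpy as np

def knn_transition(S, k=5):
    """Row-normalized transition matrix keeping only each sample's k strongest links."""
    P = np.zeros_like(S)
    for i, row in enumerate(S):
        idx = np.argsort(row)[-k:]
        P[i, idx] = row[idx]
    return P / P.sum(axis=1, keepdims=True)

def cross_diffusion(S1, S2, k=5, n_iter=10):
    P1, P2 = knn_transition(S1, k), knn_transition(S2, k)
    A1, A2 = S1.copy(), S2.copy()
    for _ in range(n_iter):                   # limited iterations avoid over-smoothing
        A1, A2 = P1 @ A2 @ P1.T, P2 @ A1 @ P2.T
    return 0.5 * (A1 + A2)                    # fused similarity metric

rng = np.random.default_rng(0)
X1, X2 = rng.random((60, 20)), rng.random((60, 35))      # two views of 60 objects
S1 = np.exp(-np.square(X1[:, None] - X1[None, :]).sum(-1))
S2 = np.exp(-np.square(X2[:, None] - X2[None, :]).sum(-1))
fused = cross_diffusion(S1, S2)
print(fused.shape)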
NASA Astrophysics Data System (ADS)
Zein-Sabatto, Saleh; Mikhail, Maged; Bodruzzaman, Mohammad; DeSimio, Martin; Derriso, Mark; Behbahani, Alireza
2012-06-01
It has been widely accepted that data fusion and information fusion methods can improve the accuracy and robustness of decision-making in structural health monitoring systems. Nonetheless, decision-level fusion is arguably just as beneficial when applied to integrated health monitoring systems. Several decisions at low levels of abstraction may be produced by different decision-makers; however, decision-level fusion is required at the final stage of the process to provide an accurate assessment of the health of the monitored system as a whole. An example of such integrated systems with complex decision-making scenarios is the integrated health monitoring of aircraft. Thorough understanding of the characteristics of the decision-fusion methodologies is a crucial step for successful implementation of such decision-fusion systems. In this paper, we first present the major information fusion methodologies reported in the literature, i.e., probabilistic, evidential, and artificial-intelligence-based methods. The theoretical basis and characteristics of these methodologies are explained and their performances are analyzed. Second, candidate methods from the above fusion methodologies, i.e., Bayesian, Dempster-Shafer, and fuzzy logic algorithms, are selected and their applications are extended to decision fusion. Finally, fusion algorithms are developed based on the selected fusion methods and their performance is tested on decisions generated from synthetic data and from experimental data. Also in this paper, a modeling methodology, i.e., the cloud model, for generating synthetic decisions is presented and used. Using the cloud model, both types of uncertainty involved in real decision-making, randomness and fuzziness, are modeled. Synthetic decisions are generated with an unbiased process and varying interaction complexities among decisions to provide for a fair performance comparison of the selected decision-fusion algorithms. For verification purposes, implementation results of the developed fusion algorithms on structural health monitoring data collected from experimental tests are reported in this paper.
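The normal cloud model used to generate synthetic decisions can be sketched in a few lines: each decision value is drawn with both randomness (through En) and fuzziness (through He) and carries a membership degree. The parameter values below are illustrative, not those of the reported experiments.

import numpy as np

def cloud_drops(Ex, En, He, n=1000, rng=None):
    """Generate n 'cloud drops' (value, membership) from a normal cloud model
    with expectation Ex, entropy En, and hyper-entropy He."""
    if rng is None:
        rng = np.random.default_rng()
    En_prime = rng.normal(En, He, n)                # fuzziness: randomized dispersion
    x = rng.normal(Ex, np.abs(En_prime))            # randomness: decision values
    mu = np.exp(-(x - Ex) ** 2 / (2 * En_prime ** 2 + 1e-12))
    return x, mu

# Synthetic "damage confidence" decisions from two hypothetical decision-makers.
x1, mu1 = cloud_drops(Ex=0.70, En=0.05, He=0.01, rng=np.random.default_rng(1))
x2, mu2 = cloud_drops(Ex=0.65, En=0.08, He=0.02, rng=np.random.default_rng(2))
print("mean decisions:", round(x1.mean(), 3), round(x2.mean(), 3))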
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shabbir, A., E-mail: aqsa.shabbir@ugent.be; Noterdaeme, J. M.; Max-Planck-Institut für Plasmaphysik, Garching D-85748
2014-11-15
Information visualization aimed at facilitating human perception is an important tool for the interpretation of experiments on the basis of complex multidimensional data characterizing the operational space of fusion devices. This work describes a method for visualizing the operational space on a two-dimensional map and applies it to the discrimination of type I and type III edge-localized modes (ELMs) from a series of carbon-wall ELMy discharges at JET. The approach accounts for stochastic uncertainties that play an important role in fusion data sets, by modeling measurements with probability distributions in a metric space. The method is aimed at contributing to physical understanding of ELMs as well as their control. Furthermore, it is a general method that can be applied to the modeling of various other plasma phenomena as well.
A data fusion approach for track monitoring from multiple in-service trains
NASA Astrophysics Data System (ADS)
Lederman, George; Chen, Siheng; Garrett, James H.; Kovačević, Jelena; Noh, Hae Young; Bielak, Jacobo
2017-10-01
We present a data fusion approach for enabling data-driven rail-infrastructure monitoring from multiple in-service trains. A number of researchers have proposed using vibration data collected from in-service trains as a low-cost method to monitor track geometry. The majority of this work has focused on developing novel features to extract information about the tracks from data produced by individual sensors on individual trains. We extend this work by presenting a technique to combine extracted features from multiple passes over the tracks from multiple sensors aboard multiple vehicles. There are a number of challenges in combining multiple data sources, like different relative position coordinates depending on the location of the sensor within the train. Furthermore, as the number of sensors increases, the likelihood that some will malfunction also increases. We use a two-step approach that first minimizes position offset errors through data alignment, then fuses the data with a novel adaptive Kalman filter that weights data according to its estimated reliability. We show the efficacy of this approach both through simulations and on a data-set collected from two instrumented trains operating over a one-year period. Combining data from numerous in-service trains allows for more continuous and more reliable data-driven monitoring than analyzing data from any one train alone; as the number of instrumented trains increases, the proposed fusion approach could facilitate track monitoring of entire rail-networks.
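The reliability-weighted fusion step can be illustrated with a scalar version of the idea: each pass's estimate of a track feature is weighted inversely to its estimated noise variance, which is the steady-state weighting an adaptive Kalman filter converges to for a static quantity. The pass values and variances below are synthetic placeholders.

import numpy as np

rng = np.random.default_rng(0)
true_roughness = 2.0                               # hypothetical track-geometry feature
# Estimates of the same track segment from several passes/sensors, with different noise.
noise_var = np.array([0.10, 0.40, 0.05, 0.80, 0.15])
passes = true_roughness + rng.normal(0, np.sqrt(noise_var))

# Reliability weighting: inverse-variance (minimum-variance) fusion of the passes.
w = (1.0 / noise_var) / np.sum(1.0 / noise_var)
fused = np.sum(w * passes)
fused_var = 1.0 / np.sum(1.0 / noise_var)
print("fused estimate:", round(fused, 3), "+/-", round(np.sqrt(fused_var), 3))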
NASA Astrophysics Data System (ADS)
Rababaah, Haroun; Shirkhodaie, Amir
2009-04-01
Rapidly advancing hardware technology, smart sensors, and sensor networks are advancing environmental sensing. One major application of this technology is Large-Scale Surveillance Systems (LS3), especially for homeland security, battlefield intelligence, facility guarding, and other civilian applications. Efficient and effective deployment of LS3 requires addressing a number of aspects impacting the scalability of such systems. The scalability factors are related to: computation and memory utilization efficiency; communication bandwidth utilization; network topology (e.g., centralized, ad-hoc, hierarchical, or hybrid); network communication protocols and data routing schemes; and local and global data/information fusion schemes for situational awareness. Although many models have been proposed to address one aspect or another of these issues, few have addressed the need for multi-modality multi-agent data/information fusion with characteristics satisfying the requirements of current and future intelligent sensors and sensor networks. In this paper, we present a novel scalable fusion engine for multi-modality multi-agent information fusion for LS3. The new fusion engine is based on a concept we call Energy Logic. Experimental results, compared to a fuzzy logic model, strongly support the validity of the new model and suggest future directions for different levels of fusion and different applications.
Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter
Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Gu, Chengfan
2018-01-01
This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as local filters to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate globally optimal state estimation by fusion of local estimations. The proposed methodology effectively refrains from the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation. PMID:29415509
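The top-level fusion rule based on the linear minimum variance principle can be written compactly for N local estimates with covariances P_i; the sketch below neglects cross-covariances between local filters for brevity, and the numbers are illustrative.

import numpy as np

def lmv_fuse(estimates, covariances):
    """Fuse N local state estimates x_i with covariances P_i by the linear
    minimum-variance rule (cross-covariances between local filters ignored)."""
    P_inv = [np.linalg.inv(P) for P in covariances]
    P_fused = np.linalg.inv(sum(P_inv))
    x_fused = P_fused @ sum(Pi @ x for Pi, x in zip(P_inv, estimates))
    return x_fused, P_fused

# Three local filters (e.g., INS, GNSS, CNS branches) each reporting a 2-state estimate.
x_locals = [np.array([1.02, 0.48]), np.array([0.97, 0.52]), np.array([1.05, 0.45])]
P_locals = [np.diag([0.04, 0.04]), np.diag([0.01, 0.09]), np.diag([0.09, 0.01])]
x, P = lmv_fuse(x_locals, P_locals)
print("fused state:", x, "\nfused covariance:\n", P)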
Multi-Sensor Optimal Data Fusion Based on the Adaptive Fading Unscented Kalman Filter.
Gao, Bingbing; Hu, Gaoge; Gao, Shesheng; Zhong, Yongmin; Gu, Chengfan
2018-02-06
This paper presents a new optimal data fusion methodology based on the adaptive fading unscented Kalman filter for multi-sensor nonlinear stochastic systems. This methodology has a two-level fusion structure: at the bottom level, an adaptive fading unscented Kalman filter based on the Mahalanobis distance is developed and serves as local filters to improve the adaptability and robustness of local state estimations against process-modeling error; at the top level, an unscented transformation-based multi-sensor optimal data fusion for the case of N local filters is established according to the principle of linear minimum variance to calculate globally optimal state estimation by fusion of local estimations. The proposed methodology effectively refrains from the influence of process-modeling error on the fusion solution, leading to improved adaptability and robustness of data fusion for multi-sensor nonlinear stochastic systems. It also achieves globally optimal fusion results based on the principle of linear minimum variance. Simulation and experimental results demonstrate the efficacy of the proposed methodology for INS/GNSS/CNS (inertial navigation system/global navigation satellite system/celestial navigation system) integrated navigation.
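The top-level fusion step described above combines N local estimates under the principle of linear minimum variance. The sketch below shows the simplest form of that rule, inverse-covariance (information) weighting, under the additional assumption that the local filters' errors are uncorrelated; the paper's unscented-transformation treatment of cross-correlations is not reproduced, and the toy numbers are hypothetical.

```python
import numpy as np

def lmv_fuse(estimates, covariances):
    """Fuse N local state estimates under the linear-minimum-variance rule,
    assuming cross-covariances between local filters are negligible."""
    infos = [np.linalg.inv(P) for P in covariances]      # information matrices
    P_fused = np.linalg.inv(sum(infos))                  # fused covariance
    x_fused = P_fused @ sum(I @ x for I, x in zip(infos, estimates))
    return x_fused, P_fused

# toy usage: three local filters estimating a 2-state vector
x_locals = [np.array([1.0, 0.5]), np.array([1.2, 0.4]), np.array([0.9, 0.6])]
P_locals = [np.diag([0.5, 0.2]), np.diag([0.1, 0.3]), np.diag([0.4, 0.4])]
x, P = lmv_fuse(x_locals, P_locals)
```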
Z-Pinch Pulsed Plasma Propulsion Technology Development
NASA Technical Reports Server (NTRS)
Polsgrove, Tara; Adams, Robert B.; Fabisinski, Leo; Fincher, Sharon; Maples, C. Dauphne; Miernik, Janie; Percy, Tom; Statham, Geoff; Turner, Matt; Cassibry, Jason;
2010-01-01
Fusion-based propulsion can enable fast interplanetary transportation. Magneto-inertial fusion (MIF) is an approach which has been shown to potentially lead to a low cost, small reactor for fusion break-even. The Z-Pinch/dense plasma focus method is an MIF concept in which a column of gas is compressed to thermonuclear conditions by an axial current (I ≈ 100 MA). Recent advancements in experiments and the theoretical understanding of this concept suggest favorable scaling of fusion power output yield as I^4. This document presents a conceptual design of a Z-Pinch fusion propulsion system and a vehicle for human exploration. The purpose of this study is to apply Z-Pinch fusion principles to the design of a propulsion system for an interplanetary spacecraft. This study took four steps in service of that objective; these steps are identified below. 1. Z-Pinch Modeling and Analysis: There is a wealth of literature characterizing Z-Pinch physics and existing Z-Pinch physics models. In order to be useful in engineering analysis, simplified Z-Pinch fusion thermodynamic models are required to give propulsion engineers the quantity of plasma, plasma temperature, rate of expansion, etc. The study team developed these models in this study. 2. Propulsion Modeling and Analysis: While the Z-Pinch models characterize the fusion process itself, propulsion models calculate the parameters that characterize the propulsion system (thrust, specific impulse, etc.). The study team developed a Z-Pinch propulsion model and used it to determine the best values for pulse rate, amount of propellant per pulse, and mixture ratio of the D-T and liner materials, as well as the resulting thrust and specific impulse of the system. 3. Mission Analysis: Several potential missions were studied. Trajectory analysis using data from the propulsion model was used to determine the duration of the propulsion burns and the amount of propellant expended to complete each mission considered. 4. Vehicle Design: To understand the applicability of Z-Pinch propulsion to interplanetary travel, it is necessary to design a concept vehicle that uses it -- the propulsion system significantly impacts the design of the electrical, thermal control, avionics, and structural subsystems of a vehicle. The study team developed a conceptual design of an interplanetary vehicle that transports crew and cargo to Mars and back and can be reused for other missions. Several aspects of this vehicle are based on a previous crewed fusion vehicle study -- the Human Outer Planet Exploration (HOPE) Magnetized Target Fusion (MTF) vehicle. Portions of the vehicle design were used outright and others were modified from the MTF design in order to maintain comparability.
Efficient sensor network vehicle classification using peak harmonics of acoustic emissions
NASA Astrophysics Data System (ADS)
William, Peter E.; Hoffman, Michael W.
2008-04-01
An application is proposed for detection and classification of battlefield ground vehicles using the emitted acoustic signal captured at individual sensor nodes of an ad hoc Wireless Sensor Network (WSN). We make use of the harmonic characteristics of the acoustic emissions of battlefield vehicles to reduce both the computations carried out on the sensor node and the data transmitted to the fusion center for reliable and efficient classification of targets. Previous approaches focus on the lower frequency band of the acoustic emissions, up to 500 Hz; however, we show in the proposed application how efficient discrimination between battlefield vehicles is performed using features extracted from higher frequency bands (50-1500 Hz). The application shows that selective time-domain acoustic features surpass equivalent spectral features. Collaborative signal processing is utilized, such that estimation of certain signal model parameters is carried out by the sensor node, in order to reduce the communication between the sensor node and the fusion center, while the remaining model parameters are estimated at the fusion center. The data transmitted from the sensor node to the fusion center amount to 1-5% of the sampled acoustic signal at the node. A variety of classification schemes were examined, such as maximum likelihood, vector quantization, and artificial neural networks. Evaluation of the proposed application, through processing of an acoustic data set and comparison to previous results, shows improvement not only in the number of computations but also in the detection and false alarm rates.
Olson, Mark A; Lee, Michael S; Yeh, In-Chul
2017-06-15
This work presents replica-exchange molecular dynamics simulations of inserting a 16-residue Ebola virus fusion peptide into a membrane bilayer. A computational approach is applied for modeling the peptide at the explicit all-atom level and the membrane-aqueous bilayer by a generalized Born continuum model with a smoothed switching function (GBSW). We provide an assessment of the model calculations in terms of three metrics: (1) the ability to reproduce the NMR structure of the peptide determined in the presence of SDS micelles and comparable structural data on other fusion peptides; (2) determination of the effects of the mutation Trp-8 to Ala and sequence discrimination of the homologous Marburg virus; and (3) calculation of potentials of mean force for estimating the partitioning free energy and their comparison to predictions from the Wimley-White interfacial hydrophobicity scale. We found the GBSW implicit membrane model to produce results of limited accuracy in conformational properties of the peptide when compared to the NMR structure, yet the model resolution is sufficient to determine the effect of sequence differentiation on peptide-membrane integration. © 2016 Wiley Periodicals, Inc.
Jiang, Quansheng; Shen, Yehu; Li, Hua; Xu, Fengyu
2018-01-24
Feature recognition and fault diagnosis play an important role in equipment safety and the stable operation of rotating machinery. To cope with the complexity of vibration signals from rotating machinery, a feature fusion model based on information entropy and a probabilistic neural network is proposed in this paper. The new method first uses information entropy theory to extract three kinds of characteristic entropy from the vibration signals, namely, singular spectrum entropy, power spectrum entropy, and approximate entropy. The feature fusion model is then constructed to classify and diagnose the fault signals. The proposed approach can combine comprehensive information from different aspects and is more sensitive to the fault features. Experimental results on simulated fault signals verified the improved performance of the proposed approach. On real two-span rotor data, the fault detection accuracy of the new method is more than 10% higher than that of methods using each of the three kinds of information entropy separately. The new approach is shown to be an effective fault recognition method for rotating machinery.
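Two of the three entropy features named above can be written in a few lines. The sketch below is a minimal, assumption-laden illustration (approximate entropy and the probabilistic-neural-network classifier are omitted, and the embedding dimension and toy signals are hypothetical); it is not the authors' implementation.

```python
import numpy as np

def power_spectrum_entropy(x):
    """Shannon entropy of the normalized power spectrum of a vibration signal."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def singular_spectrum_entropy(x, embed_dim=20):
    """Shannon entropy of the normalized singular values of the trajectory matrix."""
    traj = np.lib.stride_tricks.sliding_window_view(x, embed_dim)
    s = np.linalg.svd(traj, compute_uv=False)
    p = s / s.sum()
    p = p[p > 0]
    return -np.sum(p * np.log(p))

# toy usage: a healthy-like signal vs. one with an added fault harmonic
t = np.linspace(0, 1, 2048)
healthy = np.sin(2 * np.pi * 50 * t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
faulty = healthy + 0.5 * np.sin(2 * np.pi * 173 * t)
features = [[power_spectrum_entropy(s), singular_spectrum_entropy(s)] for s in (healthy, faulty)]
```

The resulting feature vectors would then be fed to a classifier (a probabilistic neural network in the paper) for fault recognition.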
Gunawardena, Harsha P; O'Brien, Jonathon; Wrobel, John A; Xie, Ling; Davies, Sherri R; Li, Shunqiang; Ellis, Matthew J; Qaqish, Bahjat F; Chen, Xian
2016-02-01
Single quantitative platforms such as label-based or label-free quantitation (LFQ) present compromises in accuracy, precision, protein sequence coverage, and speed of quantifiable proteomic measurements. To maximize the quantitative precision and the number of quantifiable proteins, or the quantifiable coverage of tissue proteomes, we have developed a unified approach, termed QuantFusion, that combines the quantitative ratios of all peptides measured by both LFQ and label-based methodologies. Here, we demonstrate the use of QuantFusion in determining the proteins differentially expressed in a pair of patient-derived tumor xenografts (PDXs) representing two major breast cancer (BC) subtypes, basal and luminal. Label-based in-spectra quantitative peptides derived from amino acid-coded tagging (AACT, also known as SILAC) of a non-malignant mammary cell line were uniformly added to each xenograft at a constant predefined ratio, from which Ratio-of-Ratio estimates were obtained for the label-free peptides paired with AACT peptides in each PDX tumor. A mixed model statistical analysis was used to determine global differential protein expression by combining complementary quantifiable peptide ratios measured by LFQ and Ratio-of-Ratios, respectively. With the minimum number of replicates required to obtain statistically significant ratios, QuantFusion uses distinct mechanisms to "rescue" the missing data inherent to both LFQ and label-based quantitation. Combined quantifiable peptide data from both quantitative schemes increased the overall number of peptide-level measurements and protein-level estimates. In our analysis of the PDX tumor proteomes, QuantFusion increased the number of distinct peptide ratios by 65%, representing differentially expressed proteins between the BC subtypes. This improvement in quantifiable coverage, in turn, not only increased the number of measurable protein fold-changes by 8% but also increased the average precision of the quantitative estimates by 181%, so that some BC subtype-specifically expressed proteins were rescued by QuantFusion. Thus, incorporating data from multiple quantitative approaches while accounting for measurement variability at both the peptide and global protein levels makes QuantFusion unique for obtaining increased coverage and quantitative precision for tissue proteomes. © 2016 by The American Society for Biochemistry and Molecular Biology, Inc.
Buj, Raquel; Iglesias, Noa; Planas, Anna M; Santalucía, Tomàs
2013-08-20
Valuable clone collections encoding the complete ORFeomes for some model organisms have been constructed following the completion of their genome sequencing projects. These libraries are based on Gateway cloning technology, which facilitates the study of protein function by simplifying the subcloning of open reading frames (ORF) into any suitable destination vector. The expression of proteins of interest as fusions with functional modules is a frequent approach in their initial functional characterization. A limited number of Gateway destination expression vectors allow the construction of fusion proteins from ORFeome-derived sequences, but they are restricted to the possibilities offered by their inbuilt functional modules and their pre-defined model organism-specificity. Thus, the availability of cloning systems that overcome these limitations would be highly advantageous. We present a versatile cloning toolkit for constructing fully-customizable three-part fusion proteins based on the MultiSite Gateway cloning system. The fusion protein components are encoded in the three plasmids integral to the kit. These can recombine with any purposely-engineered destination vector that uses a heterologous promoter external to the Gateway cassette, leading to the in-frame cloning of an ORF of interest flanked by two functional modules. In contrast to previous systems, a third part becomes available for peptide-encoding as it no longer needs to contain a promoter, resulting in an increased number of possible fusion combinations. We have constructed the kit's component plasmids and demonstrate its functionality by providing proof-of-principle data on the expression of prototype fluorescent fusions in transiently-transfected cells. We have developed a toolkit for creating fusion proteins with customized N- and C-term modules from Gateway entry clones encoding ORFs of interest. Importantly, our method allows entry clones obtained from ORFeome collections to be used without prior modifications. Using this technology, any existing Gateway destination expression vector with its model-specific properties could be easily adapted for expressing fusion proteins.
NASA Astrophysics Data System (ADS)
Naderi, D.; Pahlavani, M. R.; Alavi, S. A.
2013-05-01
Using the Langevin dynamical approach, the neutron multiplicity and the anisotropy of the angular distribution of fission fragments in heavy-ion fusion-fission reactions were calculated. We applied one- and two-dimensional Langevin equations to study the decay of a hot excited compound nucleus. The influence of the level-density parameter on the neutron multiplicity and the anisotropy of the angular distribution of fission fragments was investigated. We used level-density parameters based on the liquid drop model in two different forms, following the Bartel approach and the Pomorska approach. Our calculations show that the anisotropy and neutron multiplicity are affected by the level-density parameter and the neck thickness. The calculations were performed for the 16O+208Pb and 20Ne+209Bi reactions. The results obtained with the two-dimensional Langevin equations and a level-density parameter based on the approach of Bartel and co-workers are in better agreement with experimental data.
NASA Astrophysics Data System (ADS)
Maack, Joachim; Lingenfelder, Marcus; Weinacker, Holger; Koch, Barbara
2016-07-01
Remote sensing-based timber volume estimation is key for modelling the regional potential, accessibility and price of lignocellulosic raw material for an emerging bioeconomy. We used a unique wall-to-wall airborne LiDAR dataset and Landsat 7 satellite images in combination with terrestrial inventory data derived from the National Forest Inventory (NFI), and applied generalized additive models (GAM) to estimate spatially explicit timber distribution and volume in forested areas. Since the NFI data showed an underlying structure regarding size and ownership, we additionally constructed a socio-economic predictor to enhance the accuracy of the analysis. Furthermore, we balanced the training dataset with a bootstrap method to achieve unbiased regression weights for interpolating timber volume. Finally, we compared and discussed the model performance of the original approach (r2 = 0.56, NRMSE = 9.65%), the approach with balanced training data (r2 = 0.69, NRMSE = 12.43%) and the final approach with balanced training data and the additional socio-economic predictor (r2 = 0.72, NRMSE = 12.17%). The results demonstrate the usefulness of remote sensing techniques for mapping timber volume for a future lignocellulose-based bioeconomy.
Operational data fusion framework for building frequent Landsat-like imagery in a cloudy region
USDA-ARS?s Scientific Manuscript database
An operational data fusion framework is built to generate dense time-series Landsat-like images for a cloudy region by fusing Moderate Resolution Imaging Spectroradiometer (MODIS) data products and Landsat imagery. The Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) is integrated in ...
Viscoelastic modeling of the fusion of multicellular tumor spheroids in growth phase.
Dechristé, Guillaume; Fehrenbach, Jérôme; Griseti, Elena; Lobjois, Valérie; Poignard, Clair
2018-06-08
For several decades, experiments have highlighted the analogy between fusing cell aggregates and liquid droplets. The physical macroscopic models have been derived under incompressibility assumptions. The aim of this paper is to provide a 3D model of growing spheroids, which is more relevant for embryo cell aggregates or tumor cell spheroids. We extend the previous approach to a compressible 3D framework in order to account for tumor spheroid growth. We exhibit the crucial importance of the effective surface tension and of the inner pressure of the spheroid in describing the fusion precisely. The experimental data were obtained on spheroids of colon carcinoma human cells (HCT116 cell line). After 3 or 6 days of culture, two identical spheroids were transferred into one well and their fusion was monitored by live videomicroscopy, with acquisition every 2 h for 72 h. From these images, the neck radius and the diameter of the assembly of fusing spheroids were extracted. The numerical model is fitted to the experiments. It is worth noting that the time evolution of both the neck radius and the spheroid diameter is obtained quantitatively. The interesting feature is that such measurements characterise the macroscopic rheological properties of the tumor spheroids. The experimental determination of the kinetics of the neck radius and overall diameter during spheroid fusion characterises the rheological properties of the spheroids. The consistency of the model is shown by fitting the model to two different experiments, highlighting the importance of both surface tension and cell proliferation. The paper sheds new light on the macroscopic rheological properties of tumor spheroids. It emphasizes the role of the surface tension and the inner pressure in the fusion of growing spheroids. Under geometrical assumptions, the model reduces to a two-parameter differential equation fitted to experimental measurements. The 3D partial differential system makes it possible to study the fusion of spheroids in non-symmetrical or more general frameworks. Copyright © 2018 Elsevier Ltd. All rights reserved.
Data fusion approach to threat assessment for radar resources management
NASA Astrophysics Data System (ADS)
Komorniczak, Wojciech; Pietrasinski, Jerzy; Solaiman, Basel
2002-03-01
The paper deals with the problem of multifunction radar resources management. The problem consists of target/task ranking and task scheduling. The paper focuses on target ranking using a data fusion approach. Data from the radar (the object's velocity, range, altitude, direction, etc.), the IFF system (Identification Friend or Foe), and the ESM system (Electronic Support Measures - information concerning the threat's electromagnetic activities) are used to decide the importance assigned to each detected target. The main problem lies in the multiplicity of types of input information. The information from the radar is of a probabilistic or ambiguous imperfection type, while the IFF information is of an evidential type. To take advantage of these information sources, an advanced data fusion system is necessary. The system should deal with the following situations: fusion of evidential and fuzzy information, and fusion of evidential information with a priori information. The paper describes a system which fuses fuzzy and evidential information without first converting them to the same type of information. A proposal for using dynamic fuzzy qualifiers is also described. The paper presents the results of preliminary tests of the system.
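For readers unfamiliar with evidential combination, the sketch below shows the standard Dempster's rule for fusing two basic mass assignments, which underlies evidential fusion of the kind mentioned above. It is a generic illustration, not the paper's specific fuzzy-evidential mechanism, and the "radar" and "IFF" masses are invented toy values.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two basic mass assignments over the same
    frame of discernment. Masses are dicts mapping frozenset hypotheses to mass."""
    combined, conflict = {}, 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc                 # mass assigned to contradictory pairs
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

# toy usage: radar evidence vs. IFF evidence about a detected target
radar = {frozenset({"hostile"}): 0.6, frozenset({"hostile", "friendly"}): 0.4}
iff = {frozenset({"friendly"}): 0.3, frozenset({"hostile", "friendly"}): 0.7}
print(dempster_combine(radar, iff))
```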
Optimizing the Four-Index Integral Transform Using Data Movement Lower Bounds Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rajbhandari, Samyam; Rastello, Fabrice; Kowalski, Karol
The four-index integral transform is a fundamental and computationally demanding calculation used in many computational chemistry suites such as NWChem. It transforms a four-dimensional tensor from an atomic basis to a molecular basis. This transformation is most efficiently implemented as a sequence of four tensor contractions that each contract a four-dimensional tensor with a two-dimensional transformation matrix. Differing degrees of permutation symmetry in the intermediate and final tensors in the sequence of contractions cause intermediate tensors to be much larger than the final tensor and limit the number of electronic states in the modeled systems. Loop fusion, in conjunction with tiling, can be very effective in reducing the total space requirement, as well as data movement. However, the large number of possible choices for loop fusion and tiling, and data/computation distribution across a parallel system, make it challenging to develop an optimized parallel implementation for the four-index integral transform. We develop a novel approach to address this problem, using lower bounds modeling of data movement complexity. We establish relationships between available aggregate physical memory in a parallel computer system and ineffective fusion configurations, enabling their pruning and consequent identification of effective choices and a characterization of optimality criteria. This work has resulted in the development of a significantly improved implementation of the four-index transform that enables higher performance and the ability to model larger electronic systems than the current implementation in the NWChem quantum chemistry software suite.
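The "sequence of four tensor contractions" can be stated compactly with einsum. The sketch below shows only the baseline O(N^5) contraction sequence; the paper's actual contributions (loop fusion, tiling, and parallel data distribution guided by data-movement lower bounds) are not represented, and the tensor sizes are toy values.

```python
import numpy as np

def four_index_transform(g_ao, C):
    """Transform a 4D two-electron tensor from the atomic-orbital (AO) basis to the
    molecular-orbital (MO) basis as four successive contractions with C, each an
    O(N^5) step, instead of a single O(N^8) contraction."""
    t1 = np.einsum('pi,pqrs->iqrs', C, g_ao)
    t2 = np.einsum('qj,iqrs->ijrs', C, t1)
    t3 = np.einsum('rk,ijrs->ijks', C, t2)
    return np.einsum('sl,ijks->ijkl', C, t3)

# toy usage with a random AO tensor and transformation matrix
n = 8
rng = np.random.default_rng(2)
g_ao = rng.normal(size=(n, n, n, n))
C = rng.normal(size=(n, n))
g_mo = four_index_transform(g_ao, C)
```

Note how the intermediates t1-t3 are as large as the inputs; controlling their storage and movement is exactly what the fusion and tiling choices discussed in the abstract address.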
Real-time sensor validation and fusion for distributed autonomous sensors
NASA Astrophysics Data System (ADS)
Yuan, Xiaojing; Li, Xiangshang; Buckles, Bill P.
2004-04-01
Multi-sensor data fusion has found widespread applications in industrial and research sectors. The purpose of real-time multi-sensor data fusion is to dynamically estimate an improved system model from a set of different data sources, i.e., sensors. This paper presents a systematic and unified real-time sensor validation and fusion framework (RTSVFF) based on distributed autonomous sensors. The RTSVFF is an open architecture which consists of four layers - the transaction layer, the process fusion layer, the control layer, and the planning layer. This paradigm facilitates distribution of intelligence to the sensor level and sharing of information among sensors, controllers, and other devices in the system. The openness of the architecture also provides a platform to test different sensor validation and fusion algorithms and thus facilitates the selection of near-optimal algorithms for a specific sensor fusion application. In the version of the model presented in this paper, confidence-weighted averaging is employed to address the dynamic system state estimation. The state is computed using an adaptive estimator and a dynamic validation curve for numeric data fusion, and a robust diagnostic map for decision-level qualitative fusion. The framework is then applied to automatic monitoring of a gas-turbine engine, including a performance comparison of the proposed real-time sensor fusion algorithms and a traditional numerical weighted average.
A comparison of synthesis and integrative approaches for meaning making and information fusion
NASA Astrophysics Data System (ADS)
Eggleston, Robert G.; Fenstermacher, Laurie
2017-05-01
Traditionally, information fusion approaches to meaning making have been integrative or aggregative in nature, creating meaning "containers" in which to put content (e.g., attributes) about object classes. In a large part, this was due to the limits in technology/tools for supporting information fusion (e.g., computers). A different synthesis based approach for meaning making is described which takes advantage of computing advances. The approach is not focused on the events/behaviors being observed/sensed; instead, it is human work centric. The former director of the Defense Intelligence Agency once wrote, "Context is king. Achieving an understanding of what is happening - or will happen - comes from a truly integrated picture of an area, the situation and the various personalities in it…a layered approach over time that builds depth of understanding."1 The synthesis based meaning making framework enables this understanding. It is holistic (both the sum and the parts, the proverbial forest and the trees), multi-perspective and emulative (as opposed to representational). The two approaches are complementary, with the synthesis based meaning making framework as a wrapper. The integrative approach would be dominant at level 0,1 fusion: data fusion, track formation and the synthesis based meaning making becomes dominant at higher fusion levels (levels 2 and 3), although both may be in play. A synthesis based approach to information fusion is thus well suited for "gray zone" challenges in which there is aggression and ambiguity and which are inherently perspective dependent (e.g., recent events in Ukraine).
A Survey of Methods for Computing Best Estimates of Endoatmospheric and Exoatmospheric Trajectories
NASA Technical Reports Server (NTRS)
Bernard, William P.
2018-01-01
Beginning with the mathematical prediction of planetary orbits in the early seventeenth century up through the most recent developments in sensor fusion methods, many techniques have emerged that can be employed on the problem of endo and exoatmospheric trajectory estimation. Although early methods were ad hoc, the twentieth century saw the emergence of many systematic approaches to estimation theory that produced a wealth of useful techniques. The broad genesis of estimation theory has resulted in an equally broad array of mathematical principles, methods and vocabulary. Among the fundamental ideas and methods that are briefly touched on are batch and sequential processing, smoothing, estimation, and prediction, sensor fusion, sensor fusion architectures, data association, Bayesian and non Bayesian filtering, the family of Kalman filters, models of the dynamics of the phases of a rocket's flight, and asynchronous, delayed, and asequent data. Along the way, a few trajectory estimation issues are addressed and much of the vocabulary is defined.
Information Fusion - Methods and Aggregation Operators
NASA Astrophysics Data System (ADS)
Torra, Vicenç
Information fusion techniques are commonly applied in Data Mining and Knowledge Discovery. In this chapter, we give an overview of such applications considering their three main uses. That is, we consider fusion methods for data preprocessing, model building and information extraction. Some aggregation operators (i.e. particular fusion methods) and their properties are briefly described as well.
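As a hedged illustration of two aggregation operators of the kind surveyed in the chapter, the sketch below contrasts a source-weighted mean with an Ordered Weighted Averaging (OWA) operator; the weights and scores are arbitrary examples, not values from the chapter.

```python
import numpy as np

def weighted_mean(values, weights):
    """Weighted arithmetic mean: weights express the importance of each source."""
    w = np.asarray(weights, dtype=float)
    return float(np.dot(values, w / w.sum()))

def owa(values, weights):
    """Ordered Weighted Averaging: weights attach importance to the rank of a value
    (largest first), not to the source that produced it."""
    w = np.asarray(weights, dtype=float)
    return float(np.dot(np.sort(values)[::-1], w / w.sum()))

scores = [0.9, 0.4, 0.7]
print(weighted_mean(scores, [0.5, 0.3, 0.2]))   # source-weighted fusion
print(owa(scores, [0.5, 0.3, 0.2]))             # rank-weighted (optimistic) fusion
```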
Hensman, James; Lawrence, Neil D; Rattray, Magnus
2013-08-20
Time course data from microarrays and high-throughput sequencing experiments require simple, computationally efficient and powerful statistical models to extract meaningful biological signal, and for tasks such as data fusion and clustering. Existing methodologies fail to capture either the temporal or replicated nature of the experiments, and often impose constraints on the data collection process, such as regularly spaced samples, or similar sampling schema across replications. We propose hierarchical Gaussian processes as a general model of gene expression time-series, with application to a variety of problems. In particular, we illustrate the method's capacity for missing data imputation, data fusion and clustering.The method can impute data which is missing both systematically and at random: in a hold-out test on real data, performance is significantly better than commonly used imputation methods. The method's ability to model inter- and intra-cluster variance leads to more biologically meaningful clusters. The approach removes the necessity for evenly spaced samples, an advantage illustrated on a developmental Drosophila dataset with irregular replications. The hierarchical Gaussian process model provides an excellent statistical basis for several gene-expression time-series tasks. It has only a few additional parameters over a regular GP, has negligible additional complexity, is easily implemented and can be integrated into several existing algorithms. Our experiments were implemented in python, and are available from the authors' website: http://staffwww.dcs.shef.ac.uk/people/J.Hensman/.
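The hierarchical structure described above can be made concrete as a covariance function: a shared gene-level process plus an independent replicate-level deviation. The following numpy sketch builds such a covariance and draws one sample from it; kernel choices, hyperparameter values, and the irregular sampling times are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def rbf(t1, t2, variance, lengthscale):
    """Squared-exponential covariance between two sets of time points."""
    d = t1[:, None] - t2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def hierarchical_cov(times, replicate_ids, gene_var=1.0, gene_ls=5.0,
                     rep_var=0.3, rep_ls=5.0, noise=0.05):
    """Covariance of a two-level hierarchical GP: a shared gene-level function plus
    an independent replicate-level deviation, plus observation noise."""
    K = rbf(times, times, gene_var, gene_ls)                 # shared across replicates
    same_rep = replicate_ids[:, None] == replicate_ids[None, :]
    K = K + same_rep * rbf(times, times, rep_var, rep_ls)    # replicate-specific term
    return K + noise * np.eye(len(times))

# toy usage: two replicates sampled at irregular, non-matching times
t = np.array([0.0, 1.0, 2.5, 4.0, 0.5, 1.5, 3.0])
rep = np.array([0, 0, 0, 0, 1, 1, 1])
K = hierarchical_cov(t, rep)
sample = np.random.default_rng(3).multivariate_normal(np.zeros(len(t)), K)
```

Because the covariance is defined for arbitrary time points, replicates need not share a sampling schedule, which is the property the abstract highlights for the irregular Drosophila data.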
Textual and visual content-based anti-phishing: a Bayesian approach.
Zhang, Haijun; Liu, Gang; Chow, Tommy W S; Liu, Wenyin
2011-10-01
A novel framework using a Bayesian approach for content-based phishing web page detection is presented. Our model takes into account textual and visual contents to measure the similarity between the protected web page and suspicious web pages. A text classifier, an image classifier, and an algorithm fusing the results from the classifiers are introduced. An outstanding feature of this paper is the exploration of a Bayesian model to estimate the matching threshold. This is required in the classifier for determining the class of the web page and identifying whether the web page is phishing or not. In the text classifier, the naive Bayes rule is used to calculate the probability that a web page is phishing. In the image classifier, the earth mover's distance is employed to measure the visual similarity, and our Bayesian model is designed to determine the threshold. In the data fusion algorithm, Bayes' theorem is used to synthesize the classification results from textual and visual content. The effectiveness of our proposed approach was examined on a large-scale dataset collected from real phishing cases. Experimental results demonstrated that the text classifier and the image classifier we designed deliver promising results, the fusion algorithm outperforms either of the individual classifiers, and our model can be adapted to different phishing cases. © 2011 IEEE
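A minimal sketch of Bayes-rule fusion of two classifier outputs is shown below, under the assumption that textual and visual evidence are conditionally independent given the class and that both classifiers were calibrated against the same prior; it is a generic naive-Bayes style combination, not the paper's EMD-based threshold model, and the probabilities are made up.

```python
def fuse_posteriors(p_text, p_image, prior=0.5):
    """Combine per-classifier phishing probabilities with Bayes' theorem, assuming
    the textual and visual evidence are conditionally independent given the class."""
    prior_odds = prior / (1 - prior)
    # convert each classifier's posterior back into a likelihood ratio against the prior
    lr_text = (p_text / (1 - p_text)) / prior_odds
    lr_image = (p_image / (1 - p_image)) / prior_odds
    odds = prior_odds * lr_text * lr_image       # fused posterior odds
    return odds / (1 + odds)

# toy usage: text classifier is fairly confident, image classifier is borderline
print(fuse_posteriors(p_text=0.92, p_image=0.60))
```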
Deep learning decision fusion for the classification of urban remote sensing data
NASA Astrophysics Data System (ADS)
Abdi, Ghasem; Samadzadegan, Farhad; Reinartz, Peter
2018-01-01
Multisensor data fusion is one of the most common and popular remote sensing data classification topics, as it provides a robust and complete description of the objects of interest. Furthermore, deep feature extraction has recently attracted significant interest and has become a hot research topic in the geoscience and remote sensing research community. A deep learning decision fusion approach is presented to perform multisensor urban remote sensing data classification. After deep features are extracted by utilizing joint spectral-spatial information, a soft-decision classifier is applied to train high-level feature representations and to fine-tune the deep learning framework. Next, a decision-level fusion classifies objects of interest by the joint use of sensors. Finally, a context-aware object-based postprocessing is used to enhance the classification results. A series of comparative experiments are conducted on the widely used dataset of the 2014 IEEE GRSS data fusion contest. The obtained results illustrate the considerable advantages of the proposed deep learning decision fusion over traditional classifiers.
Data fusion for target tracking and classification with wireless sensor network
NASA Astrophysics Data System (ADS)
Pannetier, Benjamin; Doumerc, Robin; Moras, Julien; Dezert, Jean; Canevet, Loic
2016-10-01
In this paper, we address the problem of multiple ground target tracking and classification with information obtained from an unattended wireless sensor network. A multiple target tracking (MTT) algorithm, taking into account road and vegetation information, is proposed based on a centralized architecture. One of the key issues is how to adapt the classical MTT approach to satisfy embedded processing constraints. Based on track statistics, the classification algorithm uses estimated location, velocity, and acceleration to help classify targets. The algorithm enables tracking of humans and vehicles driving both on and off road. We integrate road or trail width and vegetation cover as constraints in the target motion models to improve the performance of tracking under constraints with classification fusion. Our algorithm also uses different dynamic models to cope with target maneuvers. The tracking and classification algorithms are integrated into an operational platform (the fusion node). In order to handle realistic ground target tracking scenarios, we use an autonomous smart computer deposited in the surveillance area. After the calibration step of the heterogeneous sensor network, our system is able to handle real data from a wireless ground sensor network. The performance of the system is evaluated in a real exercise for an intelligence operation (a "hunter hunt" scenario).
An Approach for Reducing the Error Rate in Automated Lung Segmentation
Gill, Gurman; Beichel, Reinhard R.
2016-01-01
Robust lung segmentation is challenging, especially when tens of thousands of lung CT scans need to be processed, as required by large multi-center studies. The goal of this work was to develop and assess a method for the fusion of segmentation results from two different methods to generate lung segmentations that have a lower failure rate than individual input segmentations. As basis for the fusion approach, lung segmentations generated with a region growing and model-based approach were utilized. The fusion result was generated by comparing input segmentations and selectively combining them using a trained classification system. The method was evaluated on a diverse set of 204 CT scans of normal and diseased lungs. The fusion approach resulted in a Dice coefficient of 0.9855 ± 0.0106 and showed a statistically significant improvement compared to both input segmentation methods. In addition, the failure rate at different segmentation accuracy levels was assessed. For example, when requiring that lung segmentations must have a Dice coefficient of better than 0.97, the fusion approach had a failure rate of 6.13%. In contrast, the failure rate for region growing and model-based methods was 18.14% and 15.69%, respectively. Therefore, the proposed method improves the quality of the lung segmentations, which is important for subsequent quantitative analysis of lungs. Also, to enable a comparison with other methods, results on the LOLA11 challenge test set are reported. PMID:27447897
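The Dice coefficient used above to assess segmentation quality is straightforward to compute. The sketch below defines it together with a deliberately simplified agreement-based fusion rule; the paper's actual fusion uses a trained classification system, so the rule here is only a stand-in, and the threshold and toy masks are invented.

```python
import numpy as np

def dice(a, b):
    """Dice overlap between two binary segmentation masks."""
    a, b = a.astype(bool), b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

def fuse_segmentations(seg_region_growing, seg_model_based, agreement_threshold=0.97):
    """Toy rule: if the two input segmentations agree well, keep their union;
    otherwise flag the case for the trained classifier or manual review."""
    if dice(seg_region_growing, seg_model_based) >= agreement_threshold:
        return np.logical_or(seg_region_growing, seg_model_based), True
    return None, False

# toy usage on small synthetic masks
a = np.zeros((4, 4), dtype=bool); a[1:3, 1:3] = True
b = np.zeros((4, 4), dtype=bool); b[1:3, 1:4] = True
print(dice(a, b))   # 0.8
```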
Sensor fusion for synthetic vision
NASA Technical Reports Server (NTRS)
Pavel, M.; Larimer, J.; Ahumada, A.
1991-01-01
Display methodologies are explored for fusing images gathered by millimeter wave sensors with images rendered from an on-board terrain data base to facilitate visually guided flight and ground operations in low visibility conditions. An approach to fusion based on multiresolution image representation and processing is described which facilitates fusion of images differing in resolution within and between images. To investigate possible fusion methods, a workstation-based simulation environment is being developed.
Generating High-Temporal and Spatial Resolution TIR Image Data
NASA Astrophysics Data System (ADS)
Herrero-Huerta, M.; Lagüela, S.; Alfieri, S. M.; Menenti, M.
2017-09-01
Remote sensing imagery to monitor global biophysical dynamics requires the availability of thermal infrared (TIR) data at high temporal and spatial resolution, because of the rapid development of crops during the growing season and the fragmentation of most agricultural landscapes. Conversely, no single sensor meets these combined requirements. Data fusion approaches offer an alternative that exploits observations from multiple sensors, providing data sets with better properties. A novel spatio-temporal data fusion model based on constrained algorithms, denoted the multisensor multiresolution technique (MMT), was developed and applied to generate synthetic TIR image data at both high temporal and high spatial resolution. First, an adaptive radiance model based on spectral unmixing analysis is applied. TIR radiance data at the top of atmosphere (TOA) collected by MODIS (daily, 1 km) and Landsat TIRS (16-day, sampled at 30 m resolution) are used to generate synthetic daily radiance images at TOA at 30 m spatial resolution. The next step consists of unmixing the 30 m (now lower resolution) images using information about their pixel land-cover composition from co-registered images at higher spatial resolution. In our case study, the synthesized TIR data were unmixed to the Sentinel-2 MSI resolution of 10 m. The constrained unmixing preserves all the available radiometric information of the 30 m images and involves optimization of the number of land-cover classes and the size of the moving window for spatial unmixing. Results are still being evaluated, with particular attention to the quality of the data streams required to apply our approach.
Castrignanò, Annamaria; Buttafuoco, Gabriele; Quarto, Ruggiero; Vitti, Carolina; Langella, Giuliano; Terribile, Fabio; Venezia, Accursio
2017-12-03
To assess spatial variability at the very fine scale required by Precision Agriculture, different proximal and remote sensors have been used. They provide large amounts and different types of data which need to be combined. An integrated approach, using multivariate geostatistical data-fusion techniques and multi-source geophysical sensor data to determine simple summary scale-dependent indices, is described here. These indices can be used to delineate management zones to be submitted to differential management. Such a data fusion approach with geophysical sensors was applied in a soil of an agronomic field cropped with tomato. The synthetic regionalized factors determined, contributed to split the 3D edaphic environment into two main horizontal structures with different hydraulic properties and to disclose two main horizons in the 0-1.0-m depth with a discontinuity probably occurring between 0.40 m and 0.70 m. Comparing this partition with the soil properties measured with a shallow sampling, it was possible to verify the coherence in the topsoil between the dielectric properties and other properties more directly related to agronomic management. These results confirm the advantages of using proximal sensing as a preliminary step in the application of site-specific management. Combining disparate spatial data (data fusion) is not at all a naive problem and novel and powerful methods need to be developed.
Stalk model of membrane fusion: solution of energy crisis.
Kozlovsky, Yonathan; Kozlov, Michael M
2002-01-01
Membrane fusion proceeds via formation of intermediate nonbilayer structures. The stalk model of fusion intermediate is commonly recognized to account for the major phenomenology of the fusion process. However, in its current form, the stalk model poses a challenge. On one hand, it is able to describe qualitatively the modulation of the fusion reaction by the lipid composition of the membranes. On the other, it predicts very large values of the stalk energy, so that the related energy barrier for fusion cannot be overcome by membranes within a biologically reasonable span of time. We suggest a new structure for the fusion stalk, which resolves the energy crisis of the model. Our approach is based on a combined deformation of the stalk membrane including bending of the membrane surface and tilt of the hydrocarbon chains of lipid molecules. We demonstrate that the energy of the fusion stalk is a few times smaller than those predicted previously and the stalks are feasible in real systems. We account quantitatively for the experimental results on dependence of the fusion reaction on the lipid composition of different membrane monolayers. We analyze the dependence of the stalk energy on the distance between the fusing membranes and provide the experimentally testable predictions for the structural features of the stalk intermediates. PMID:11806930
Spatial heterogeneity of tungsten transmutation in a fusion device
NASA Astrophysics Data System (ADS)
Gilbert, M. R.; Sublet, J.-Ch.; Dudarev, S. L.
2017-04-01
Accurately quantifying the transmutation rate of tungsten (W) under neutron irradiation is a necessary requirement in the assessment of its performance as an armour material in a fusion power plant. The usual approach of calculating average responses, assuming large, homogenised material volumes, is insufficient to capture the full complexity of the transmutation picture in the context of a realistic fusion power plant design, particularly for rhenium (Re) production from W. Combined neutron transport and inventory simulations for representative spatially heterogeneous high-resolution models of a fusion power plant show that the production rate of Re is strongly influenced by the surrounding local spatial environment. Localised variation in neutron moderation (slowing down) due to structural steel and coolant, particularly water, can dramatically increase Re production because of the huge cross sections of giant resolved resonances in the neutron-capture reaction of 186W at low neutron energies. Calculations using cross section data corrected for temperature (Doppler) effects suggest that temperature may have a relatively lesser influence on transmutation rates.
Value Driven Information Processing and Fusion
2016-03-01
The objective of the project is to develop a general framework for value driven decentralized information processing and fusion, including: optimal data reduction in a network setting for decentralized inference with quantization constraints; interactive fusion that allows queries; and a consensus approach that allows a decentralized approach to achieve the optimal error exponent of the centralized counterpart, a conclusion that is significant ...
Seismic data fusion anomaly detection
NASA Astrophysics Data System (ADS)
Harrity, Kyle; Blasch, Erik; Alford, Mark; Ezekiel, Soundararajan; Ferris, David
2014-06-01
Detecting anomalies in non-stationary signals has valuable applications in many fields, including medicine and meteorology. Examples include identifying possible heart conditions from electrocardiography (ECG) signals or predicting earthquakes from seismographic data. Given the many available anomaly detection algorithms, it is important to compare possible methods. In this paper, we examine and compare two approaches to anomaly detection and see how data fusion methods may improve performance. The first approach uses an artificial neural network (ANN) to detect anomalies in a wavelet de-noised signal. The other method uses a perspective neural network (PNN) to analyze an arbitrary number of "perspectives", or transformations, of the observed signal for anomalies. Possible perspectives include wavelet de-noising, the Fourier transform, peak-filtering, etc. In order to evaluate these techniques via signal fusion metrics, we apply signal preprocessing techniques such as de-noising to the original signal and then use a neural network to find anomalies in the generated signal. From this secondary result it is possible to use data fusion techniques that can be evaluated via existing data fusion metrics for single and multiple perspectives. The result shows which anomaly detection method, according to the metrics, is better suited overall for anomaly detection applications. The method used in this study could be applied to compare other signal processing algorithms.
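The wavelet de-noising preprocessing step mentioned above can be sketched as follows, assuming the PyWavelets package is available; the neural-network detectors (ANN/PNN) are replaced here by a crude rolling z-score, so this is only an illustrative stand-in for the paper's pipeline, with hypothetical wavelet and window settings.

```python
import numpy as np
import pywt  # PyWavelets, assumed installed

def wavelet_denoise(signal, wavelet="db4", level=4):
    """Soft-threshold the detail coefficients (universal threshold) and reconstruct."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # noise estimate from finest scale
    thresh = sigma * np.sqrt(2 * np.log(len(signal)))
    coeffs[1:] = [pywt.threshold(c, thresh, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

def anomaly_score(signal, window=50):
    """Rolling z-score of the de-noised signal as a crude stand-in for the ANN detector."""
    clean = wavelet_denoise(signal)
    scores = np.zeros_like(clean)
    for i in range(window, len(clean)):
        seg = clean[i - window:i]
        scores[i] = np.abs(clean[i] - seg.mean()) / (seg.std() + 1e-9)
    return scores

# toy usage: a sinusoid with noise and one injected spike
t = np.linspace(0, 10, 2000)
sig = np.sin(2 * np.pi * t) + 0.3 * np.random.default_rng(5).normal(size=t.size)
sig[1200] += 4.0
scores = anomaly_score(sig)
```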
Data Field Modeling and Spectral-Spatial Feature Fusion for Hyperspectral Data Classification.
Liu, Da; Li, Jianxun
2016-12-16
Classification is a significant subject in hyperspectral remote sensing image processing. This study proposes a spectral-spatial feature fusion algorithm for the classification of hyperspectral images (HSI). Unlike existing spectral-spatial classification methods, the influences and interactions of the surroundings on each measured pixel were taken into consideration in this paper. Data field theory was employed as the mathematical realization of the field theory concept in physics, and both the spectral and spatial domains of HSI were considered as data fields. Therefore, the inherent dependency of interacting pixels was modeled. Using data field modeling, spatial and spectral features were transformed into a unified radiation form and further fused into a new feature by using a linear model. In contrast to the current spectral-spatial classification methods, which usually simply stack spectral and spatial features together, the proposed method builds the inner connection between the spectral and spatial features, and explores the hidden information that contributed to classification. Therefore, new information is included for classification. The final classification result was obtained using a random forest (RF) classifier. The proposed method was tested with the University of Pavia and Indian Pines, two well-known standard hyperspectral datasets. The experimental results demonstrate that the proposed method has higher classification accuracies than those obtained by the traditional approaches.
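The final fusion-and-classification step described above (fusing two feature fields with a linear model and classifying with a random forest) can be sketched generically as follows. The data-field radiation transform itself is not reproduced; random arrays stand in for the spectral and spatial features, the mixing weight alpha is arbitrary, and scikit-learn is assumed available.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def fuse_features(spectral, spatial, alpha=0.5):
    """Linear fusion of per-pixel spectral and spatial feature vectors
    (both assumed already scaled to comparable ranges)."""
    return alpha * spectral + (1.0 - alpha) * spatial

# toy usage with random stand-ins for the two feature fields
rng = np.random.default_rng(6)
n_pixels, n_features = 1000, 30
spectral = rng.normal(size=(n_pixels, n_features))
spatial = rng.normal(size=(n_pixels, n_features))
labels = rng.integers(0, 5, size=n_pixels)

X = fuse_features(spectral, spatial)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:800], labels[:800])
accuracy = clf.score(X[800:], labels[800:])
```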
Modeling the uncertainty of estimating forest carbon stocks in China
NASA Astrophysics Data System (ADS)
Yue, T. X.; Wang, Y. F.; Du, Z. P.; Zhao, M. W.; Zhang, L. L.; Zhao, N.; Lu, M.; Larocque, G. R.; Wilson, J. P.
2015-12-01
Earth surface systems are controlled by a combination of global and local factors, which cannot be understood without accounting for both the local and global components. The system dynamics cannot be recovered from the global or local controls alone. Ground forest inventory is able to accurately estimate forest carbon stocks at sample plots, but these sample plots are too sparse to support the spatial simulation of carbon stocks with required accuracy. Satellite observation is an important source of global information for the simulation of carbon stocks. Satellite remote-sensing can supply spatially continuous information about the surface of forest carbon stocks, which is impossible from ground-based investigations, but their description has considerable uncertainty. In this paper, we validated the Lund-Potsdam-Jena dynamic global vegetation model (LPJ), the Kriging method for spatial interpolation of ground sample plots and a satellite-observation-based approach as well as an approach for fusing the ground sample plots with satellite observations and an assimilation method for incorporating the ground sample plots into LPJ. The validation results indicated that both the data fusion and data assimilation approaches reduced the uncertainty of estimating carbon stocks. The data fusion had the lowest uncertainty by using an existing method for high accuracy surface modeling to fuse the ground sample plots with the satellite observations (HASM-SOA). The estimates produced with HASM-SOA were 26.1 and 28.4 % more accurate than the satellite-based approach and spatial interpolation of the sample plots, respectively. Forest carbon stocks of 7.08 Pg were estimated for China during the period from 2004 to 2008, an increase of 2.24 Pg from 1984 to 2008, using the preferred HASM-SOA method.
Modeling and Classifying Six-Dimensional Trajectories for Teleoperation Under a Time Delay
NASA Technical Reports Server (NTRS)
SunSpiral, Vytas; Wheeler, Kevin R.; Allan, Mark B.; Martin, Rodney
2006-01-01
Within the context of teleoperating the JSC Robonaut humanoid robot under 2-10 second time delays, this paper explores the technical problem of modeling and classifying human motions represented as six-dimensional (position and orientation) trajectories. A dual path research agenda is reviewed which explored both deterministic approaches and stochastic approaches using Hidden Markov Models. Finally, recent results are shown from a new model which represents the fusion of these two research paths. Questions are also raised about the possibility of automatically generating autonomous actions by reusing the same predictive models of human behavior to be the source of autonomous control. This approach changes the role of teleoperation from being a stand-in for autonomy into the first data collection step for developing generative models capable of autonomous control of the robot.
Frequency domain surface EMG sensor fusion for estimating finger forces.
Potluri, Chandrasekhar; Kumar, Parmod; Anugolu, Madhavi; Urfer, Alex; Chiu, Steve; Naidu, D; Schoen, Marco P
2010-01-01
Extracting or estimating skeletal hand/finger forces from surface electromyographic (sEMG) signals poses many challenges due to cross-talk, noise, and temporally and spatially modulated signal characteristics. Conventional sEMG measurements are based on single-sensor data. In this paper, array sensors are used along with a proposed sensor fusion scheme that results in a simple Multi-Input-Single-Output (MISO) transfer function. Experimental data are used along with system identification to find this MISO system. A Genetic Algorithm (GA) approach is employed to optimize the characteristics of the MISO system. The proposed fusion-based approach is tested experimentally and indicates improvement in finger/hand force estimation.
Distributed multimodal data fusion for large scale wireless sensor networks
NASA Astrophysics Data System (ADS)
Ertin, Emre
2006-05-01
Sensor network technology has enabled new surveillance systems in which sensor nodes equipped with processing and communication capabilities can collaboratively detect, classify, and track targets of interest over a large surveillance area. In this paper we study distributed fusion of multimodal sensor data for extracting target information from a large-scale sensor network. Optimal tracking, classification, and reporting of threat events require joint consideration of multiple sensor modalities. Multiple sensor modalities improve tracking by reducing the uncertainty in the track estimates as well as resolving track-sensor data association problems. Our approach to solving the fusion problem with a large number of multimodal sensors is the construction of likelihood maps. The likelihood maps provide summary data for the solution of the detection, tracking, and classification problems. The likelihood map presents the sensory information in a format that is easy for decision makers to interpret and is suitable for fusion with spatial prior information such as maps and imaging data from stand-off imaging sensors. We follow a statistical approach to combine sensor data at different levels of uncertainty and resolution. The likelihood map transforms each sensor data stream into a spatio-temporal likelihood map ideally suited for fusion with imaging sensor outputs and prior geographic information about the scene. We also discuss distributed computation of the likelihood map using a gossip-based algorithm and present simulation results.
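The gossip-based distributed computation mentioned at the end of the abstract can be illustrated with the simplest randomized pairwise-averaging scheme, shown below. It assumes any two nodes may exchange values (a fully connected network), which is a simplification of realistic sensor-network topologies, and the per-node values are invented.

```python
import numpy as np

def gossip_average(local_values, n_rounds=2000, seed=0):
    """Randomized pairwise gossip: in each round two random nodes replace their
    values with their pairwise mean; all nodes converge to the network-wide average."""
    x = np.array(local_values, dtype=float)
    rng = np.random.default_rng(seed)
    for _ in range(n_rounds):
        i, j = rng.choice(len(x), size=2, replace=False)
        x[i] = x[j] = 0.5 * (x[i] + x[j])
    return x

# toy usage: each node holds a local log-likelihood contribution for one map cell
local_loglik = [0.2, 1.4, -0.3, 0.9, 0.7]
print(gossip_average(local_loglik))   # every entry approaches the mean (about 0.58)
```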
Competitive-Cooperative Automated Reasoning from Distributed and Multiple Source of Data
NASA Astrophysics Data System (ADS)
Fard, Amin Milani
Knowledge extraction from distributed database systems has been investigated during the past decade in order to analyze billions of information records. In this work, a competitive deduction approach for a heterogeneous data grid environment is proposed using classic data mining and statistical methods. By applying a game theory concept in a multi-agent model, we design a policy for hierarchical knowledge discovery and inference fusion. To demonstrate the system in operation, a sample multi-expert system has also been developed.
Bergamini, Elena; Ligorio, Gabriele; Summa, Aurora; Vannozzi, Giuseppe; Cappozzo, Aurelio; Sabatini, Angelo Maria
2014-10-09
Magnetic and inertial measurement units are an emerging technology for obtaining the 3D orientation of body segments in human movement analysis. In this respect, sensor fusion is used to limit the drift errors resulting from gyroscope data integration by exploiting accelerometer and magnetic aiding sensors. The present study aims at investigating the effectiveness of sensor fusion methods under different experimental conditions. Manual and locomotion tasks, differing in time duration, measurement volume, presence/absence of static phases, and out-of-plane movements, were performed by six subjects, and recorded by one unit located on the forearm or the lower trunk, respectively. Two sensor fusion methods, representative of stochastic (Extended Kalman Filter) and complementary (non-linear observer) filtering, were selected, and their accuracy was assessed in terms of attitude (pitch and roll angles) and heading (yaw angle) errors, using stereophotogrammetric data as a reference. The sensor fusion approaches provided significantly more accurate results than gyroscope data integration. Accuracy improved mostly for heading and when the movement exhibited stationary phases and evenly distributed 3D rotations, occurred in a small volume, and lasted longer than approximately 20 s. These results were independent of the specific sensor fusion method used. Practice guidelines for improving the outcome accuracy are provided.
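The complementary-filtering idea contrasted with the EKF above can be illustrated with the classic single-angle blend of integrated gyroscope rate and accelerometer-derived tilt. This is a generic first-order complementary filter, not the specific non-linear observer evaluated in the study; the blending factor, sampling rate, and toy signals are assumptions.

```python
import numpy as np

def complementary_filter(gyro_rate, accel_angle, dt, alpha=0.98):
    """Estimate a tilt angle by blending integrated gyroscope rate (good short-term)
    with the accelerometer-derived angle (drift-free but noisy)."""
    angle = accel_angle[0]
    out = np.empty(len(gyro_rate))
    for k in range(len(gyro_rate)):
        angle = alpha * (angle + gyro_rate[k] * dt) + (1.0 - alpha) * accel_angle[k]
        out[k] = angle
    return out

# toy usage: 10 s of samples at 100 Hz with a gyroscope bias and accelerometer noise
dt, n = 0.01, 1000
true_angle = 10.0 * np.sin(2 * np.pi * 0.2 * np.arange(n) * dt)
gyro = np.gradient(true_angle, dt) + 0.5        # rate measurement with a constant bias
accel = true_angle + np.random.default_rng(7).normal(0, 2.0, n)
est = complementary_filter(gyro, accel, dt)
```

The high-pass/low-pass split controlled by alpha is what suppresses both the gyroscope drift and the accelerometer noise, which is the trade-off the abstract's comparison quantifies.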
One decade of the Data Fusion Information Group (DFIG) model
NASA Astrophysics Data System (ADS)
Blasch, Erik
2015-05-01
The revision of the Joint Directors of the Laboratories (JDL) Information Fusion model in 2004 discussed information processing, incorporated the analyst, and was coined the Data Fusion Information Group (DFIG) model. Since that time, developments in information technology (e.g., cloud computing, applications, and multimedia) have altered the role of the analyst. Data production has outpaced the analyst; however, the analyst still has the role of data refinement and information reporting. In this paper, we highlight three examples being addressed by the DFIG model. The first example is the role of the analyst in providing semantic queries (through an ontology) so that the vast amount of available data can be indexed, accessed, retrieved, and processed. The second example is reporting, which requires the analyst to condense the data into a meaningful form through information management. The last example is the interpretation of the resolved information, which must include contextual information not inherent in the data itself. Through a literature review, the DFIG developments in the last decade demonstrate the usability of the DFIG model in bringing together the user (analyst or operator) and the machine (information fusion or manager) in a systems design.
Global Atmosphere Watch Workshop on Measurement-Model ...
The World Meteorological Organization’s (WMO) Global Atmosphere Watch (GAW) Programme coordinates high-quality observations of atmospheric composition from global to local scales with the aim to drive high-quality and high-impact science while co-producing a new generation of products and services. In line with this vision, GAW’s Scientific Advisory Group for Total Atmospheric Deposition (SAG-TAD) has a mandate to produce global maps of wet, dry and total atmospheric deposition for important atmospheric chemicals to enable research into biogeochemical cycles and assessments of ecosystem and human health effects. The most suitable scientific approach for this activity is the emerging technique of measurement-model fusion for total atmospheric deposition. This technique requires global-scale measurements of atmospheric trace gases, particles, precipitation composition and precipitation depth, as well as predictions of the same from global/regional chemical transport models. The fusion of measurement and model results requires data assimilation and mapping techniques. The objective of the GAW Workshop on Measurement-Model Fusion for Global Total Atmospheric Deposition (MMF-GTAD), an initiative of the SAG-TAD, was to review the state-of-the-science and explore the feasibility and methodology of producing, on a routine retrospective basis, global maps of atmospheric gas and aerosol concentrations as well as wet, dry and total deposition via measurement-model
Long-range dismount activity classification: LODAC
NASA Astrophysics Data System (ADS)
Garagic, Denis; Peskoe, Jacob; Liu, Fang; Cuevas, Manuel; Freeman, Andrew M.; Rhodes, Bradley J.
2014-06-01
Continuous classification of dismount types (including gender, age, ethnicity) and their activities (such as walking, running) evolving over space and time is challenging. Limited sensor resolution (often exacerbated as a function of platform standoff distance) and clutter from shadows in dense target environments, unfavorable environmental conditions, and the normal properties of real data all contribute to the challenge. The unique and innovative aspect of our approach is a synthesis of multimodal signal processing with incremental non-parametric, hierarchical Bayesian machine learning methods to create a new kind of target classification architecture. This architecture is designed from the ground up to optimally exploit correlations among the multiple sensing modalities (multimodal data fusion) and rapidly and continuously learns (online self-tuning) patterns of distinct classes of dismounts given little a priori information. This increases classification performance in the presence of challenges posed by anti-access/area denial (A2/AD) sensing. To fuse multimodal features, Long-range Dismount Activity Classification (LODAC) develops a novel statistical information theoretic approach for multimodal data fusion that jointly models multimodal data (i.e., a probabilistic model for cross-modal signal generation) and discovers the critical cross-modal correlations by identifying components (features) with maximal mutual information (MI) which is efficiently estimated using non-parametric entropy models. LODAC develops a generic probabilistic pattern learning and classification framework based on a new class of hierarchical Bayesian learning algorithms for efficiently discovering recurring patterns (classes of dismounts) in multiple simultaneous time series (sensor modalities) at multiple levels of feature granularity.
NASA Astrophysics Data System (ADS)
Park, Joong Yong; Tuell, Grady
2010-04-01
The Data Processing System (DPS) of the Coastal Zone Mapping and Imaging Lidar (CZMIL) has been designed to automatically produce a number of novel environmental products through the fusion of Lidar, spectrometer, and camera data in a single software package. These new products significantly transcend use of the system as a bathymeter, and support use of CZMIL as a complete coastal and benthic mapping tool. The DPS provides a spinning globe capability for accessing data files; automated generation of combined topographic and bathymetric point clouds; a fully-integrated manual editor and data analysis tool; automated generation of orthophoto mosaics; automated generation of reflectance data cubes from the imaging spectrometer; a coupled air-ocean spectral optimization model producing images of chlorophyll and CDOM concentrations; and a fusion based capability to produce images and classifications of the shallow water seafloor. Adopting a multitasking approach, we expect to achieve computation of the point clouds, DEMs, and reflectance images at a 1:1 processing to acquisition ratio.
Sensor fusion approaches for EMI and GPR-based subsurface threat identification
NASA Astrophysics Data System (ADS)
Torrione, Peter; Morton, Kenneth, Jr.; Besaw, Lance E.
2011-06-01
Despite advances in both electromagnetic induction (EMI) and ground penetrating radar (GPR) sensing and related signal processing, neither sensor alone provides a perfect tool for detecting the myriad of possible buried objects that threaten the lives of Soldiers and civilians. However, while neither GPR nor EMI sensing alone can provide optimal detection across all target types, the two approaches are highly complementary. As a result, many landmine systems seek to make use of both sensing modalities simultaneously and fuse the results from both sensors to improve detection performance for targets with widely varying metal content and GPR responses. Despite this, little work has focused on large-scale comparisons of different approaches to sensor fusion and machine learning for combining data from these highly orthogonal phenomenologies. In this work we explore a wide array of pattern recognition techniques for algorithm development and sensor fusion. Results with the ARA Nemesis landmine detection system suggest that nonlinear and non-parametric classification algorithms provide significant performance benefits for single-sensor algorithm development, and that fusion of multiple algorithms can be performed satisfactorily using basic parametric approaches, such as logistic discriminant classification, for the targets under consideration in our data sets.
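The parametric fusion mentioned above can be sketched as a logistic discriminant over per-sensor confidence scores; the synthetic EMI and GPR confidences below are stand-ins rather than outputs of the ARA Nemesis system.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(2)
n = 1000
y = rng.integers(0, 2, n)                       # 1 = target, 0 = clutter (synthetic)

# Synthetic per-sensor confidences: the two sensors carry complementary evidence.
emi = y * rng.normal(1.0, 1.0, n) + rng.normal(0, 1.0, n)
gpr = y * rng.normal(0.8, 1.0, n) + rng.normal(0, 1.0, n)
X = np.column_stack([emi, gpr])

# Logistic discriminant fusion of the two confidence scores.
fuser = LogisticRegression().fit(X[:700], y[:700])
fused = fuser.predict_proba(X[700:])[:, 1]

print("AUC, EMI alone:", roc_auc_score(y[700:], emi[700:]))
print("AUC, GPR alone:", roc_auc_score(y[700:], gpr[700:]))
print("AUC, fused    :", roc_auc_score(y[700:], fused))
```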
A hybrid model for computing nonthermal ion distributions in a long mean-free-path plasma
NASA Astrophysics Data System (ADS)
Tang, Xianzhu; McDevitt, Chris; Guo, Zehua; Berk, Herb
2014-10-01
Non-thermal ions, especially the suprathermal ones, are known to make a dominant contribution to a number of important physical quantities, such as the fusion reactivity in controlled fusion, the ion heat flux, and, in the case of a tokamak, the ion bootstrap current. Evaluating the deviation from a local Maxwellian distribution of these non-thermal ions can be a challenging task in the context of a global plasma fluid model that evolves the plasma density, flow, and temperature. Here we describe a hybrid model for coupling such a constrained kinetic calculation to global plasma fluid models. The key ingredient is a non-perturbative treatment of the tail ions, where the ion Knudsen number approaches or surpasses order unity. This can be sharply contrasted with the standard Chapman-Enskog approach, which relies on a perturbative treatment that is frequently invalidated. The accuracy of our coupling scheme is controlled by the precise criteria for matching the non-perturbative kinetic model to perturbative solutions in both configuration space and velocity space. Although our specific application examples will be drawn from laboratory controlled fusion experiments, the general approach is applicable to space and astrophysical plasmas as well. Work supported by DOE.
Toward an optimisation technique for dynamically monitored environment
NASA Astrophysics Data System (ADS)
Shurrab, Orabi M.
2016-10-01
The data fusion community has introduced multiple procedures for situational assessment in order to facilitate timely responses to emerging situations. In particular, the process refinement level of the Joint Directors of Laboratories (JDL) model is a meta-process for assessing and improving the data fusion task during real-time operation. In other words, it is an optimisation technique to verify overall data fusion performance and enhance it toward the top-level goals of the decision-making resources. This paper discusses the theoretical concept of prioritisation, in which the analyst team is required to keep up to date with a dynamically changing environment spanning domains such as air, sea, land, space and cyberspace. Furthermore, it provides an illustrative example of how various tracking activities are ranked simultaneously into a predetermined order. Specifically, it presents a modelling scheme for a case-study scenario in which the real-time system reports different classes of prioritised events, followed by a performance metric for evaluating the prioritisation process in the situational awareness (SWA) domain. The proposed performance metric has been designed and evaluated using an analytical approach. The modelling scheme represents the situational awareness system outputs mathematically in the form of a list of activities. This allowed the evaluation process to conduct a rigorous analysis of the prioritisation process, despite any constraints related to a domain-specific configuration. After three levels of assessment over three separate scenarios, the Prioritisation Capability Score (PCS) provided an appropriate scoring scheme for different ranking instances. Indeed, from the data fusion perspective, the proposed metric assessed real-time system performance adequately and is capable of supporting a verification process that directs the operator's attention to any issue concerning the prioritisation capability of the situational awareness domain.
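The abstract does not give the PCS formula, so purely as a hypothetical illustration of scoring a produced priority ranking against a reference order, the sketch below uses a standard rank-correlation measure (Kendall's tau); both the activity names and the metric are stand-ins, not the Prioritisation Capability Score itself.

```python
from scipy.stats import kendalltau

# Hypothetical reference priority order (most to least urgent) and a system-produced ranking.
reference = ["missile_track", "fast_boat", "unknown_aircraft", "fishing_vessel", "commercial_flight"]
produced = ["fast_boat", "missile_track", "unknown_aircraft", "fishing_vessel", "commercial_flight"]

ref_rank = {name: i for i, name in enumerate(reference)}
tau, _ = kendalltau([ref_rank[n] for n in reference],
                    [ref_rank[n] for n in produced])
print("rank agreement (Kendall's tau):", round(tau, 3))   # 1.0 means a perfect ranking
```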
Modeling Cyber Situational Awareness Through Data Fusion
2013-03-01
[Record excerpt only: a fragment of Table 3.10, "Example Vulnerable Hosts for Criticality Assessment Experiment", and partial references to Blasch et al. on issues and challenges of knowledge representation and of higher-level fusion (threat/impact assessment and intent modeling).]
NASA Astrophysics Data System (ADS)
Jenkins, Thomas G.; Held, Eric D.
2015-09-01
Neoclassical tearing modes are macroscopic (L ∼ 1 m) instabilities in magnetic fusion experiments; if unchecked, these modes degrade plasma performance and may catastrophically destroy plasma confinement by inducing a disruption. Fortunately, the use of properly tuned and directed radiofrequency waves (λ ∼ 1 mm) can eliminate these modes. Numerical modeling of this difficult multiscale problem requires the integration of separate mathematical models for each length and time scale (Jenkins and Kruger, 2012 [21]); the extended MHD model captures macroscopic plasma evolution while the RF model tracks the flow and deposition of injected RF power through the evolving plasma profiles. The scale separation enables use of the eikonal (ray-tracing) approximation to model the RF wave propagation. In this work we demonstrate a technique, based on methods of computational geometry, for mapping the ensuing RF data (associated with discrete ray trajectories) onto the finite-element/pseudospectral grid that is used to model the extended MHD physics. In the new representation, the RF data can then be used to construct source terms in the equations of the extended MHD model, enabling quantitative modeling of RF-induced tearing mode stabilization. Though our specific implementation uses the NIMROD extended MHD (Sovinec et al., 2004 [22]) and GENRAY RF (Smirnov et al., 1994 [23]) codes, the approach presented can be applied more generally to any code coupling requiring the mapping of ray tracing data onto Eulerian grids.
Inverse bootstrapping conformal field theories
NASA Astrophysics Data System (ADS)
Li, Wenliang
2018-01-01
We propose a novel approach to study conformal field theories (CFTs) in general dimensions. In the conformal bootstrap program, one usually searches for consistent CFT data that satisfy crossing symmetry. In the new method, we reverse the logic and interpret manifestly crossing-symmetric functions as generating functions of conformal data. Physical CFTs can be obtained by scanning the space of crossing-symmetric functions. By truncating the fusion rules, we are able to concentrate on the low-lying operators and derive some approximate relations for their conformal data. It turns out that the free scalar theory, the 2d minimal model CFTs, the ϕ^4 Wilson-Fisher CFT, the Lee-Yang CFTs and the Ising CFTs are consistent with the universal relations from the minimal fusion rule ϕ_1 × ϕ_1 = I + ϕ_2 + T, where ϕ_1, ϕ_2 are scalar operators, I is the identity operator and T is the stress tensor.
Feature-based fusion of medical imaging data.
Calhoun, Vince D; Adali, Tülay
2009-09-01
The acquisition of multiple brain imaging types for a given study is a very common practice. There have been a number of approaches proposed for combining or fusing multitask or multimodal information. These can be roughly divided into those that attempt to study convergence of multimodal imaging, for example, how function and structure are related in the same region of the brain, and those that attempt to study the complementary nature of modalities, for example, utilizing temporal EEG information and spatial functional magnetic resonance imaging information. Within each of these categories, one can attempt data integration (the use of one imaging modality to improve the results of another) or true data fusion (in which multiple modalities are utilized to inform one another). We review both approaches and present a recent computational approach that first preprocesses the data to compute features of interest. The features are then analyzed in a multivariate manner using independent component analysis. We describe the approach in detail and provide examples of how it has been used for different fusion tasks. We also propose a method for selecting which combination of modalities provides the greatest value in discriminating groups. Finally, we summarize and describe future research topics.
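A minimal sketch of the feature-level fusion idea, under assumed data: per-subject features from two modalities are normalized, concatenated, and decomposed jointly with ICA so that each component carries a loading pattern spanning both modalities. The synthetic matrices and component count are illustrative, not the authors' pipeline.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n_subjects, n_feat_a, n_feat_b = 40, 200, 120

# Stand-in feature matrices (e.g., fMRI contrast maps and EEG time-course features).
modality_a = rng.normal(size=(n_subjects, n_feat_a))
modality_b = rng.normal(size=(n_subjects, n_feat_b))

def zscore(a):
    """Normalize each feature across subjects before fusion."""
    return (a - a.mean(axis=0)) / a.std(axis=0)

# Concatenate the two feature sets per subject (feature-level fusion).
joint = np.hstack([zscore(modality_a), zscore(modality_b)])

ica = FastICA(n_components=5, max_iter=1000, random_state=0)
subject_loadings = ica.fit_transform(joint)   # shared subject weights across modalities
components = ica.components_                  # each row spans both modalities

part_a = components[:, :n_feat_a]
part_b = components[:, n_feat_a:]
print(subject_loadings.shape, part_a.shape, part_b.shape)
```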
Minimally invasive surgery: lateral approach interbody fusion: results and review.
Youssef, Jim A; McAfee, Paul C; Patty, Catherine A; Raley, Erin; DeBauche, Spencer; Shucosky, Erin; Chotikul, Liana
2010-12-15
A retrospective review of patients treated at 2 institutions with anterior lumbar interbody fusion using a minimally invasive lateral retroperitoneal approach, and review of literature. To analyze the outcomes from historical literature and from a retrospectively compiled database of patients having undergone anterior interbody fusions performed through a lateral approach. A paucity of published literature exists describing outcomes following lateral approach fusion surgery. Patients treated with extreme lateral interbody fusion (XLIF) were identified through retrospective chart review. Treatment variables included operating room (OR) time, estimated blood loss (EBL), length of hospital stay (LOS), complications, and fusion rate. A literature review, using the National Center for Biotechnology Information databases PubMed/MEDLINE and Google Scholar, yielded 14 peer-reviewed articles reporting outcomes scoring, complications, fusion status, long-term follow-up, and radiographic assessments related to XLIF. Published XLIF results were summarized and evaluated with current study data. A total of 84 XLIF patients were included in the current cohort analysis. OR time, EBL, and length of hospital stay averaged 199 minutes, 155 mL, and 2.6 days, respectively, and perioperative and postoperative complication rates were 2.4% and 6.1%. Mean follow-up was 15.7 months. Sixty-eight patients showed evidence of solid arthrodesis and no subsidence on computed tomography and flexion/extension radiographs. Results were within the ranges of those in the literature. Literature review identified reports of significant improvements in clinical outcomes scores, radiographic measures, and cost effectiveness. Current data corroborates and contributes to the existing body of literature describing XLIF outcomes. Procedures are generally performed with short OR times, minimal EBL, and few complications. Patients recover quickly, requiring minimal hospital stay, although transient hip/thigh pain and/or weakness is common. Long-term outcomes are generally favorable, with maintained improvements in patient-reported pain and function scores as well as radiographic parameters, including high rates of fusion.
Yao, Sen; Li, Tao; Liu, HongGao; Li, JieQing; Wang, YuanZhong
2018-04-01
Boletaceae mushrooms are wild-grown edible mushrooms distributed in Yunnan Province, China, that have high nutritional value, delicious flavor and large economic value. Traceability is important for the authentication and quality assessment of Boletaceae mushrooms. In this study, UV-visible and Fourier transform infrared (FTIR) spectroscopies were applied, in combination with chemometrics, for traceability of 247 Boletaceae mushroom samples. Compared with a single spectroscopy technique, the data fusion strategy markedly improves the classification performance of partial least squares discriminant analysis (PLS-DA) and grid-search support vector machine (GS-SVM) models, for both species and geographical origin traceability. In addition, PLS-DA and GS-SVM models provide 100.00% accuracy for species traceability and have reliable evaluation parameters. For geographical origin traceability, the prediction accuracy of the PLS-DA model with data fusion was only 64.63%, whereas that of the GS-SVM model based on data fusion was 100.00%. The results demonstrated that the data fusion strategy of UV-visible and FTIR combined with GS-SVM could provide a higher synergic effect for traceability of Boletaceae mushrooms and has good generalization ability for the comprehensive quality control and evaluation of similar foods. © 2017 Society of Chemical Industry.
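The low-level data fusion strategy can be sketched by concatenating the two preprocessed spectra for each sample and tuning an SVM classifier with a grid search; spectrum lengths, class labels, and the parameter grid below are illustrative assumptions.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(4)
n_samples, n_uv, n_ftir = 247, 300, 800          # sizes are assumptions
origin = rng.integers(0, 4, n_samples)           # 4 hypothetical geographical origins

# Stand-in spectra with a weak class-dependent offset.
uv = rng.normal(size=(n_samples, n_uv)) + origin[:, None] * 0.05
ftir = rng.normal(size=(n_samples, n_ftir)) + origin[:, None] * 0.05

fused = np.hstack([uv, ftir])                    # low-level data fusion per sample

X_tr, X_te, y_tr, y_te = train_test_split(fused, origin, test_size=0.3, random_state=0)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
grid = GridSearchCV(model,
                    {"svc__C": [1, 10, 100], "svc__gamma": ["scale", 0.001, 0.01]},
                    cv=5)
grid.fit(X_tr, y_tr)
print("best params:", grid.best_params_, "test accuracy:", grid.score(X_te, y_te))
```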
Quality dependent fusion of intramodal and multimodal biometric experts
NASA Astrophysics Data System (ADS)
Kittler, J.; Poh, N.; Fatukasi, O.; Messer, K.; Kryszczuk, K.; Richiardi, J.; Drygajlo, A.
2007-04-01
We address the problem of score-level fusion of intramodal and multimodal experts in the context of biometric identity verification. We investigate the merits of confidence-based weighting of component experts. In contrast to the conventional approach, where confidence values are derived from scores, we instead use raw measures of biometric data quality to control the influence of each expert on the final fused score. We show that quality-based fusion gives better performance than quality-free fusion. The use of quality-weighted scores as features in the definition of the fusion functions leads to further improvements. We demonstrate that the achievable performance gain is also affected by the choice of fusion architecture. The evaluation of the proposed methodology involves six face experts and one speech verification expert. It is carried out on the XM2VTS database.
Enhancing vector shoreline data using a data fusion approach
NASA Astrophysics Data System (ADS)
Carlotto, Mark; Nebrich, Mark; DeMichele, David
2017-05-01
Vector shoreline (VSL) data is potentially useful in ATR systems that distinguish between objects on land and water. Unfortunately, available data such as the NOAA 1:250,000 World Vector Shoreline and NGA Prototype Global Shoreline data cannot be used by themselves to make a land/water determination because of the manner in which the data are compiled. We describe a data fusion approach for creating labeled VSL data that uses test points from Global 30 Arc-Second Elevation (GTOPO30) data to determine the direction of vector segments, i.e., whether they are in clockwise or counterclockwise order. We show that consistently labeled VSL data can be used to easily determine whether a point is on land or water using a vector cross product test.
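A minimal sketch of the cross-product test: with shoreline vertices ordered consistently (here assuming land lies to the left of the direction of travel), the sign of the 2D cross product between a segment and the vector to a query point indicates the land or water side, and the signed (shoelace) area gives the clockwise/counterclockwise order. The ordering convention and coordinates are assumptions for illustration.

```python
def cross2d(o, a, b):
    """z-component of (a - o) x (b - o); > 0 means b lies left of the segment o->a."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def polygon_is_counterclockwise(vertices):
    """Signed (shoelace) area is positive for counterclockwise vertex order."""
    area2 = sum(x0 * y1 - x1 * y0
                for (x0, y0), (x1, y1) in zip(vertices, vertices[1:] + vertices[:1]))
    return area2 > 0

def on_land(segment_start, segment_end, point, land_on_left=True):
    """Classify a point against a shoreline segment using the cross product sign."""
    side = cross2d(segment_start, segment_end, point)
    return side > 0 if land_on_left else side < 0

island = [(0, 0), (4, 0), (4, 3), (0, 3)]        # counterclockwise ring
print(polygon_is_counterclockwise(island))       # True
print(on_land((0, 0), (4, 0), (2, 1)))           # left of the eastward segment -> land
print(on_land((0, 0), (4, 0), (2, -1)))          # right of the segment -> water
```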
Comparison of interbody fusion approaches for disabling low back pain.
Hacker, R J
1997-03-15
This is a study comparing two groups of patients surgically treated for disabling low back pain. One group was treated with lumbar anteroposterior fusion (360 degrees fusion), the other with posterior lumbar interbody fusion and an interbody fixation device. To determine which approach provided the best and most cost-effective outcome using similar patient selection criteria. Others have shown that certain patients with disabling low back pain benefit from lumbar fusion. Although rarely reported, the costs of different surgical treatments appear to vary significantly, whereas the patient outcome may vary little. Since 1991, 75 patients have been treated. Starting in 1993, posterior lumbar interbody fusion with BAK was offered to patients as an alternative to 360 degrees fusion. The treating surgeon reviewed the cases. The interbody fixation device used (BAK; Spine-Tech, Inc., Minneapolis, MN) was part of a Food and Drug Administration study. Patient selection criteria included examination, response to conservative therapy, imaging, psychological profile, and discography. North American Spine Society outcome questionnaires, BAK investigation data, radiographs, chart entries, billing records and patient interviews were the basis for assessment. Age, sex, compensable injury history and history of previous surgery were similar. Operative time, blood loss, hospitalization time, and total costs were significantly different. There was a quicker return to work and closure of workers' compensation claims for the posterior lumbar interbody fusion-BAK group. Patient satisfaction was comparable at last follow-up. Posterior lumbar interbody fusion-BAK achieves equal patient satisfaction but fiscally surpasses the 360 degrees fusion approach. Today's environment of regulated medical practice requires the surgeon to consider cost effectiveness when performing fusion for low back pain.
Influence of incomplete fusion on complete fusion at energies above the Coulomb barrier
NASA Astrophysics Data System (ADS)
Shuaib, Mohd; Sharma, Vijay R.; Yadav, Abhishek; Sharma, Manoj Kumar; Singh, Pushpendra P.; Singh, Devendra P.; Kumar, R.; Singh, R. P.; Muralithar, S.; Singh, B. P.; Prasad, R.
2017-10-01
In the present work, excitation functions of several reaction residues in the system 19F+169Tm, populated via the complete and incomplete fusion processes, have been measured using off-line γ-ray spectroscopy. The analysis of excitation functions has been done within the framework of the statistical model code pace4. The excitation functions of residues populated via xn and pxn channels are found to be in good agreement with those estimated by the theoretical model code, which confirms the production of these residues solely via the complete fusion process. However, a significant enhancement has been observed in the cross-sections of residues involving α-emitting channels as compared to the theoretical predictions. The observed enhancement in the cross-sections has been attributed to the incomplete fusion processes. In order to have a better insight into the onset and strength of incomplete fusion, the incomplete fusion strength function has been deduced. At present, there is no theoretical model available which can satisfactorily explain the incomplete fusion reaction data at energies ≈4-6 MeV/nucleon. In the present work, the influence of incomplete fusion on complete fusion in the 19F+169Tm system has also been studied. The measured cross-section data may be important for the development of reactor technology as well. It has been found that the incomplete fusion strength function strongly depends on the α-Q value of the projectile, which is found to be in good agreement with the existing literature data. The analysis strongly supports the projectile-dependent mass-asymmetry systematics. In order to study the influence of the Coulomb effect (Z_PZ_T) on incomplete fusion, the deduced strength function for the present work is compared with those of nearby projectile-target combinations. The incomplete fusion strength function is found to increase linearly with Z_PZ_T, indicating a strong influence of the Coulomb effect in incomplete fusion reactions.
Arkenbout, Ewout A.; de Winter, Joost C. F.; Breedveld, Paul
2015-01-01
Vision based interfaces for human computer interaction have gained increasing attention over the past decade. This study presents a data fusion approach of the Nimble VR vision based system, using the Kinect camera, with the contact based 5DT Data Glove. Data fusion was achieved through a Kalman filter. The Nimble VR and filter output were compared using measurements performed on (1) a wooden hand model placed in various static postures and orientations; and (2) three differently sized human hands during active finger flexions. Precision and accuracy of joint angle estimates as a function of hand posture and orientation were determined. Moreover, in light of possible self-occlusions of the fingers in the Kinect camera images, data completeness was assessed. Results showed that the integration of the Data Glove through the Kalman filter provided for the proximal interphalangeal (PIP) joints of the fingers a substantial improvement of 79% in precision, from 2.2 deg to 0.9 deg. Moreover, a moderate improvement of 31% in accuracy (being the mean angular deviation from the true joint angle) was established, from 24 deg to 17 deg. The metacarpophalangeal (MCP) joint was relatively unaffected by the Kalman filter. Moreover, the Data Glove increased data completeness, thus providing a substantial advantage over the sole use of the Nimble VR system. PMID:26694395
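A one-state Kalman-filter sketch of the fusion idea, using simulated joint angles: a noisier vision-like measurement and a more precise glove-like measurement are combined sequentially at each time step. The random-walk model, noise variances, and signals are assumptions rather than the filter parameters used in the study.

```python
import numpy as np

rng = np.random.default_rng(5)
n, dt = 500, 0.02
t = np.arange(n) * dt
true_angle = 40 + 30 * np.sin(1.5 * t)            # PIP flexion angle, deg (synthetic)

z_vision = true_angle + rng.normal(0, 8.0, n)     # vision-like: noisy measurement
z_glove = true_angle + rng.normal(0, 2.0, n)      # glove-like: more precise measurement

# One-state random-walk Kalman filter fusing both measurements each step.
q, r_vision, r_glove = 4.0, 64.0, 9.0             # process/measurement variances (assumed)
x, p = z_vision[0], 100.0
estimate = np.zeros(n)
for k in range(n):
    p += q                                        # predict (random-walk process model)
    for z, r in ((z_vision[k], r_vision), (z_glove[k], r_glove)):
        gain = p / (p + r)                        # sequential scalar measurement updates
        x += gain * (z - x)
        p *= (1 - gain)
    estimate[k] = x

print("RMS error, vision only:", np.sqrt(np.mean((z_vision - true_angle) ** 2)))
print("RMS error, fused      :", np.sqrt(np.mean((estimate - true_angle) ** 2)))
```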
Warner, Guy C; Blum, Jesse M; Jones, Simon B; Lambert, Paul S; Turner, Kenneth J; Tan, Larry; Dawson, Alison S F; Bell, David N F
2010-08-28
The last two decades have seen substantially increased potential for quantitative social science research. This has been made possible by the significant expansion of publicly available social science datasets, the development of new analytical methodologies, such as microsimulation, and increases in computing power. These rich resources do, however, bring with them substantial challenges associated with organizing and using data. These processes are often referred to as 'data management'. The Data Management through e-Social Science (DAMES) project is working to support activities of data management for social science research. This paper describes the DAMES infrastructure, focusing on the data-fusion process that is central to the project approach. It covers: the background and requirements for provision of resources by DAMES; the use of grid technologies to provide easy-to-use tools and user front-ends for several common social science data-management tasks such as data fusion; the approach taken to solve problems related to data resources and metadata relevant to social science applications; and the implementation of the architecture that has been designed to achieve this infrastructure.
Facility Monitoring: A Qualitative Theory for Sensor Fusion
NASA Technical Reports Server (NTRS)
Figueroa, Fernando
2001-01-01
Data fusion and sensor management approaches have largely been implemented with centralized and hierarchical architectures. Numerical and statistical methods are the most common data fusion methods found in these systems. Given the proliferation and low cost of processing power, there is now an emphasis on designing distributed and decentralized systems. These systems use analytical/quantitative techniques or qualitative reasoning methods for data fusion. Based on other work by the author, a sensor may be treated as a highly autonomous (decentralized) unit. Each highly autonomous sensor (HAS) is capable of extracting qualitative behaviours from its data. For example, it detects spikes, disturbances, noise levels, off-limit excursions, step changes, drift, and other typical measured trends. In this context, this paper describes a distributed sensor fusion paradigm and theory where each sensor in the system is a HAS. Hence, given the rich qualitative information from each HAS, a paradigm and formal definitions are given so that sensors and processes can reason and make decisions at the qualitative level. This approach to sensor fusion makes possible the implementation of intuitive (effective) methods to monitor, diagnose, and compensate for processes/systems and their sensors. This paradigm facilitates a balanced distribution of intelligence (code and/or hardware) to the sensor level, the process/system level, and a higher controller level. The primary application of interest is in intelligent health management of rocket engine test stands.
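As a sketch of the kind of qualitative behaviours a highly autonomous sensor might extract, the code below flags spikes and step changes in a raw stream using simple robust thresholds; the thresholds and window sizes are illustrative assumptions, not the paper's algorithms.

```python
import numpy as np

rng = np.random.default_rng(6)
signal = rng.normal(0, 0.2, 600)
signal[150] += 4.0            # injected spike
signal[400:] += 2.5           # injected step change

def qualitative_events(x, spike_k=10.0, step_k=4.0, win=25):
    """Return indices of spike-like and step-like behaviours in a 1-D stream."""
    med = np.median(x)
    mad = np.median(np.abs(x - med)) + 1e-9       # robust noise scale
    diffs = np.abs(np.diff(x))
    spikes = np.where(diffs > spike_k * mad)[0]   # abrupt sample-to-sample jumps

    steps = []
    for i in range(win, len(x) - win):
        before, after = x[i - win:i], x[i:i + win]
        if abs(after.mean() - before.mean()) > step_k * mad:
            steps.append(i)                       # sustained level change
    return spikes, steps

spikes, steps = qualitative_events(signal)
print("spike candidates near:", spikes[:4])
print("step-change candidates span roughly:", (min(steps), max(steps)) if steps else None)
```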
Driver fatigue detection through multiple entropy fusion analysis in an EEG-based system.
Min, Jianliang; Wang, Ping; Hu, Jianfeng
2017-01-01
Driver fatigue is an important contributor to road accidents, and fatigue detection has major implications for transportation safety. The aim of this research is to analyze the multiple entropy fusion method and evaluate several channel regions to effectively detect a driver's fatigue state based on electroencephalogram (EEG) records. First, we fused multiple entropies, i.e., spectral entropy, approximate entropy, sample entropy and fuzzy entropy, as features compared with autoregressive (AR) modeling by four classifiers. Second, we captured four significant channel regions according to weight-based electrodes via a simplified channel selection method. Finally, the evaluation model for detecting driver fatigue was established with four classifiers based on the EEG data from four channel regions. Twelve healthy subjects performed continuous simulated driving for 1-2 hours with EEG monitoring on a static simulator. The leave-one-out cross-validation approach obtained an accuracy of 98.3%, a sensitivity of 98.3% and a specificity of 98.2%. The experimental results verified the effectiveness of the proposed method, indicating that the multiple entropy fusion features are significant factors for inferring the fatigue state of a driver.
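To make the entropy-fusion idea concrete, the sketch below computes two of the listed measures (sample entropy and spectral entropy) for an EEG-like epoch and stacks them into a feature vector; the parameters (m, r, sampling rate, epoch length) are common defaults assumed for illustration, not necessarily those of the study.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy: -ln(A/B), where B and A count template matches of length m and m+1."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(dim):
        templates = np.array([x[i:i + dim] for i in range(len(x) - dim)])
        count = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
            count += np.sum(dist <= r)
        return count

    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def spectral_entropy(x):
    """Shannon entropy of the normalized power spectrum."""
    psd = np.abs(np.fft.rfft(x)) ** 2
    p = psd / psd.sum()
    return -np.sum(p * np.log(p + 1e-12))

rng = np.random.default_rng(7)
fs, seconds = 128, 4
epoch = (np.sin(2 * np.pi * 10 * np.arange(fs * seconds) / fs)
         + rng.normal(0, 0.5, fs * seconds))        # alpha-band-like synthetic epoch

features = np.array([sample_entropy(epoch), spectral_entropy(epoch)])
print("fused entropy feature vector:", features)
```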
Data fusion algorithm for rapid multi-mode dust concentration measurement system based on MEMS
NASA Astrophysics Data System (ADS)
Liao, Maohao; Lou, Wenzhong; Wang, Jinkui; Zhang, Yan
2018-03-01
As a single measurement method cannot fully meet the technical requirements of dust concentration measurement, a multi-mode detection method is put forward, which brings new requirements for data processing. This paper presents a new dust concentration measurement system containing a MEMS ultrasonic sensor and a MEMS capacitance sensor, and presents a new data fusion algorithm for this multi-mode dust concentration measurement system. After analyzing the relation between the data from the composite measurement method, a data fusion algorithm based on Kalman filtering is established, which effectively improves the measurement accuracy and ultimately forms a rapid data fusion model for dust concentration measurement. Test results show that the data fusion algorithm is able to achieve rapid and accurate concentration detection.
Yong, Mostyn R N O; Saifzadeh, Siamak; Askin, Geoffrey N; Labrom, Robert D; Hutmacher, Dietmar W; Adam, Clayton J
2014-01-01
A large animal model is required for the assessment of minimally invasive, tissue-engineering-based approaches to thoracic spine fusion, with relevance to deformity correction surgery for human adolescent idiopathic scoliosis. Here, we develop a novel open mini-thoracotomy approach in an ovine model of thoracic interbody fusion that allows the assessment of various fusion constructs, with a focus on novel, tissue-engineering-based interventions. The open mini-thoracotomy surgical approach was developed through a series of mock surgeries, and then applied in a live sheep study. Customized scaffolds were manufactured to conform to the intervertebral disc space clearances required by the study. Six male Merino sheep aged 4-6 years and weighing 35-45 kg underwent the procedure mentioned earlier and were allotted a survival timeline of 6 months. Each sheep underwent a three-level discectomy (T6/7, T8/9, and T10/11) with a randomly allocated implantation of a different graft substitute at each of the following three levels: (1) polycaprolactone (PCL)-based scaffold plus 0.54 μg recombinant human bone morphogenetic protein-2 (rhBMP-2); (2) PCL-based scaffold alone; or (3) autograft. The sheep were closely monitored postoperatively for signs of pain (i.e., gait abnormalities/teeth gnawing/social isolation). Fusion assessments were conducted postsacrifice using computed tomography and hard-tissue histology. All scientific work was undertaken in accordance with the study protocol that was approved by the Institute's committee on animal research. All six sheep were successfully operated on and reached the allotted survival timeline, thereby demonstrating the feasibility of the surgical procedure and postoperative care. There were no significant complications, and during the postoperative period the animals did not exhibit marked signs of distress according to the previously described assessment criteria. Computed tomographic scanning demonstrated higher fusion grades in the rhBMP-2 plus PCL-based scaffold group in comparison to either PCL-based scaffold alone or autograft. These results were supported by a histological evaluation of the respective groups. This novel open mini-thoracotomy surgical approach to the ovine thoracic spine represents a safe surgical method that can reproducibly form the platform for research into various spine-tissue-engineered constructs and their fusion-promoting properties.
A label field fusion bayesian model and its penalized maximum rand estimator for image segmentation.
Mignotte, Max
2010-06-01
This paper presents a novel segmentation approach based on a Markov random field (MRF) fusion model which aims at combining several segmentation results associated with simpler clustering models in order to achieve a more reliable and accurate segmentation result. The proposed fusion model is derived from the recently introduced probabilistic Rand measure for comparing one segmentation result to one or more manual segmentations of the same image. This non-parametric measure allows us to easily derive an appealing fusion model of label fields, easily expressed as a Gibbs distribution, or as a nonstationary MRF model defined on a complete graph. Concretely, this Gibbs energy model encodes the set of binary constraints, in terms of pairs of pixel labels, provided by each segmentation results to be fused. Combined with a prior distribution, this energy-based Gibbs model also allows for definition of an interesting penalized maximum probabilistic rand estimator with which the fusion of simple, quickly estimated, segmentation results appears as an interesting alternative to complex segmentation models existing in the literature. This fusion framework has been successfully applied on the Berkeley image database. The experiments reported in this paper demonstrate that the proposed method is efficient in terms of visual evaluation and quantitative performance measures and performs well compared to the best existing state-of-the-art segmentation methods recently proposed in the literature.
Metadata Creation Tool Content Template For Data Stewards
A space-time Bayesian fusion model (McMillan, Holland, Morara, and Feng, 2009) is used to provide daily, gridded predictive PM2.5 (daily average) and O3 (daily 8-hr maximum) surfaces for 2001-2005. The fusion model uses both air quality monitoring data from ...
Statistical algorithms improve accuracy of gene fusion detection
Hsieh, Gillian; Bierman, Rob; Szabo, Linda; Lee, Alex Gia; Freeman, Donald E.; Watson, Nathaniel; Sweet-Cordero, E. Alejandro
2017-01-01
Gene fusions are known to play critical roles in tumor pathogenesis. Yet, sensitive and specific algorithms to detect gene fusions in cancer do not currently exist. In this paper, we present a new statistical algorithm, MACHETE (Mismatched Alignment CHimEra Tracking Engine), which achieves highly sensitive and specific detection of gene fusions from RNA-Seq data, including the highest Positive Predictive Value (PPV) compared to the current state-of-the-art, as assessed in simulated data. We show that the best performing published algorithms either find large numbers of fusions in negative control data or suffer from low sensitivity detecting known driving fusions in gold standard settings, such as EWSR1-FLI1. As proof of principle that MACHETE discovers novel gene fusions with high accuracy in vivo, we mined public data to discover and subsequently PCR validate novel gene fusions missed by other algorithms in the ovarian cancer cell line OVCAR3. These results highlight the gains in accuracy achieved by introducing statistical models into fusion detection, and pave the way for unbiased discovery of potentially driving and druggable gene fusions in primary tumors. PMID:28541529
Shifting from Stewardship to Analytics of Massive Science Data
NASA Astrophysics Data System (ADS)
Crichton, D. J.; Doyle, R.; Law, E.; Hughes, S.; Huang, T.; Mahabal, A.
2015-12-01
Currently, the analysis of large data collections is executed through traditional computational and data analysis approaches, which require users to bring data to their desktops and perform local data analysis. Data collection, archiving and analysis from future remote sensing missions, be it from earth science satellites, planetary robotic missions, or massive radio observatories, may not scale as more capable instruments stress existing architectural approaches and systems due to more continuous data streams, data from multiple observational platforms, and measurements and models from different agencies. A new paradigm is needed in order to increase the productivity and effectiveness of scientific data analysis. This paradigm must recognize that architectural choices, data processing, management, analysis, etc. are interrelated, and must be carefully coordinated in any system that aims to allow efficient, interactive scientific exploration and discovery to exploit massive data collections. Future observational systems, including satellite and airborne experiments, and research in climate modeling will significantly increase the size of the data, requiring new methodological approaches toward data analytics where users can more effectively interact with the data and apply automated mechanisms for data reduction and fusion across these massive data repositories. This presentation will discuss architecture, use cases, and approaches for developing a big data analytics strategy across multiple science disciplines.
Klose, Diana; Saunders, Ute; Barth, Stefan; Fischer, Rainer; Jacobi, Annett Marita; Nachreiner, Thomas
2016-02-17
In an earlier study we developed a unique strategy allowing us to specifically eliminate antigen-specific murine B cells via their distinct B cell receptors using a new class of fusion proteins. In the present work we elaborated our idea to demonstrate the feasibility of specifically addressing and eliminating human memory B cells. The present study reveals efficient adaptation of the general approach to selectively target and eradicate human memory B cells. In order to demonstrate the feasibility we engineered a fusion protein following the principle of recombinant immunotoxins by combining a model antigen (tetanus toxoid fragment C, TTC) for B cell receptor targeting and a truncated version of Pseudomonas aeruginosa exotoxin A (ETA') to induce apoptosis after cellular uptake. The TTC-ETA' fusion protein not only selectively bound to a TTC-reactive murine B cell hybridoma cell line in vitro but also to freshly isolated human memory B cells from immunized donors ex vivo. Specific toxicity was confirmed on an antigen-specific population of human CD27(+) memory B cells. This protein engineering strategy can be used as a generalized platform approach for the construction of therapeutic fusion proteins with disease-relevant antigens as B cell receptor-binding domains, offering a promising approach for the specific depletion of autoreactive B-lymphocytes in B cell-driven autoimmune diseases.
Fusion Simulation Project Workshop Report
NASA Astrophysics Data System (ADS)
Kritz, Arnold; Keyes, David
2009-03-01
The mission of the Fusion Simulation Project is to develop a predictive capability for the integrated modeling of magnetically confined plasmas. This FSP report adds to the previous activities that defined an approach to integrated modeling in magnetic fusion. These previous activities included a Fusion Energy Sciences Advisory Committee panel that was charged to study integrated simulation in 2002. The report of that panel [Journal of Fusion Energy 20, 135 (2001)] recommended the prompt initiation of a Fusion Simulation Project. In 2003, the Office of Fusion Energy Sciences formed a steering committee that developed a project vision, roadmap, and governance concepts [Journal of Fusion Energy 23, 1 (2004)]. The current FSP planning effort involved 46 physicists, applied mathematicians and computer scientists, from 21 institutions, formed into four panels and a coordinating committee. These panels were constituted to consider: Status of Physics Components, Required Computational and Applied Mathematics Tools, Integration and Management of Code Components, and Project Structure and Management. The ideas, reported here, are the products of these panels, working together over several months and culminating in a 3-day workshop in May 2007.
Semiotic foundation for multisensor-multilook fusion
NASA Astrophysics Data System (ADS)
Myler, Harley R.
1998-07-01
This paper explores the application of semiotic principles to the design of a multisensor-multilook fusion system. Semiotics is an approach to analysis that attempts to process media in a unified way using qualitative rather than quantitative methods. The term semiotic refers to signs, or signatory data that encapsulate information. Semiotic analysis involves the extraction of signs from information sources and the subsequent processing of the signs into meaningful interpretations of the information content of the source. We examine the multisensor fusion problem predicated on a semiotic system structure and incorporating semiotic analysis techniques, and present the design of a multisensor system as an information fusion system. Semiotic analysis opens the possibility of using non-traditional sensor sources and modalities in the fusion process, such as verbal and textual intelligence derived from human observers. Examples of how multisensor/multimodality data might be analyzed semiotically are shown, and how a semiotic system for multisensor fusion could be realized is outlined. The architecture of a semiotic multisensor fusion processor that can accept situational awareness data is described, although an implementation has not yet been constructed.
Supervised classification of aerial imagery and multi-source data fusion for flood assessment
NASA Astrophysics Data System (ADS)
Sava, E.; Harding, L.; Cervone, G.
2015-12-01
Floods are among the most devastating natural hazards and the ability to produce an accurate and timely flood assessment before, during, and after an event is critical for their mitigation and response. Remote sensing technologies have become the de-facto approach for observing the Earth and its environment. However, satellite remote sensing data are not always available. For these reasons, it is crucial to develop new techniques in order to produce flood assessments during and after an event. Recent advancements in data fusion techniques of remote sensing with near real time heterogeneous datasets have allowed emergency responders to more efficiently extract increasingly precise and relevant knowledge from the available information. This research presents a fusion technique using satellite remote sensing imagery coupled with non-authoritative data such as Civil Air Patrol (CAP) and tweets. A new computational methodology is proposed based on machine learning algorithms to automatically identify water pixels in CAP imagery. Specifically, wavelet transformations are paired with multiple classifiers, run in parallel, to build models discriminating water and non-water regions. The learned classification models are first tested against a set of control cases, and then used to automatically classify each image separately. A measure of uncertainty is computed for each pixel in an image proportional to the number of models classifying the pixel as water. Geo-tagged tweets are continuously harvested and stored on a MongoDB and queried in real time. They are fused with CAP classified data, and with satellite remote sensing derived flood extent results to produce comprehensive flood assessment maps. The final maps are then compared with FEMA generated flood extents to assess their accuracy. The proposed methodology is applied on two test cases, relative to the 2013 floods in Boulder CO, and the 2015 floods in Texas.
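The per-pixel uncertainty described above, proportional to the number of models classifying a pixel as water, can be sketched with a small ensemble of classifiers applied to pixel features; the features, classifier choices, and synthetic imagery are illustrative assumptions rather than the wavelet-based models of the study.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(8)

# Synthetic "pixels": 3 band-like features; water pixels assumed darker in band 2.
n_train = 2000
is_water = rng.integers(0, 2, n_train)
X_train = rng.normal(size=(n_train, 3)) - is_water[:, None] * np.array([0.2, 1.5, 0.4])

models = [LogisticRegression(max_iter=500),
          RandomForestClassifier(n_estimators=50, random_state=0),
          GaussianNB()]
for m in models:
    m.fit(X_train, is_water)

# Classify a 50x60 image; uncertainty is the fraction of models voting "water".
h, w = 50, 60
image_pixels = rng.normal(size=(h * w, 3))
votes = np.stack([m.predict(image_pixels) for m in models])   # (n_models, n_pixels)
water_fraction = votes.mean(axis=0).reshape(h, w)

water_mask = water_fraction >= 0.5                 # majority vote
confidence = np.abs(water_fraction - 0.5) * 2      # 1 = unanimous, 0 = split decision
print(water_mask.mean(), confidence.min(), confidence.max())
```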
Object-oriented structures supporting remote sensing databases
NASA Technical Reports Server (NTRS)
Wichmann, Keith; Cromp, Robert F.
1995-01-01
Object-oriented databases show promise for modeling the complex interrelationships pervasive in scientific domains. To examine the utility of this approach, we have developed an Intelligent Information Fusion System based on this technology, and applied it to the problem of managing an active repository of remotely-sensed satellite scenes. The design and implementation of the system is compared and contrasted with conventional relational database techniques, followed by a presentation of the underlying object-oriented data structures used to enable fast indexing into the data holdings.
NASA Astrophysics Data System (ADS)
Benaskeur, Abder R.; Roy, Jean
2001-08-01
Sensor Management (SM) has to do with how to best manage, coordinate and organize the use of sensing resources in a manner that synergistically improves the process of data fusion. Based on the contextual information, SM develops options for collecting further information, allocates and directs the sensors towards the achievement of the mission goals and/or tunes the parameters for the realtime improvement of the effectiveness of the sensing process. Conscious of the important role that SM has to play in modern data fusion systems, we are currently studying advanced SM Concepts that would help increase the survivability of the current Halifax and Iroquois Class ships, as well as their possible future upgrades. For this purpose, a hierarchical scheme has been proposed for data fusion and resource management adaptation, based on the control theory and within the process refinement paradigm of the JDL data fusion model, and taking into account the multi-agent model put forward by the SASS Group for the situation analysis process. The novelty of this work lies in the unified framework that has been defined for tackling the adaptation of both the fusion process and the sensor/weapon management.
Multisource data fusion for documenting archaeological sites
NASA Astrophysics Data System (ADS)
Knyaz, Vladimir; Chibunichev, Alexander; Zhuravlev, Denis
2017-10-01
The quality of archaeological sites documenting is of great importance for cultural heritage preserving and investigating. The progress in developing new techniques and systems for data acquisition and processing creates an excellent basis for achieving a new quality of archaeological sites documenting and visualization. archaeological data has some specific features which have to be taken into account when acquiring, processing and managing. First of all, it is a needed to gather as full as possible information about findings providing no loss of information and no damage to artifacts. Remote sensing technologies are the most adequate and powerful means which satisfy this requirement. An approach to archaeological data acquiring and fusion based on remote sensing is proposed. It combines a set of photogrammetric techniques for obtaining geometrical and visual information at different scales and detailing and a pipeline for archaeological data documenting, structuring, fusion, and analysis. The proposed approach is applied for documenting of Bosporus archaeological expedition of Russian State Historical Museum.
Differences in 3D vs. 2D analysis in lumbar spinal fusion simulations.
Hsu, Hung-Wei; Bashkuev, Maxim; Pumberger, Matthias; Schmidt, Hendrik
2018-04-27
Lumbar interbody fusion is currently the gold standard in treating patients with disc degeneration or segmental instability. Despite it having been used for several decades, the non-union rate remains high. A failed fusion is frequently attributed to an inadequate mechanical environment after instrumentation. Finite element (FE) models can provide insights into the mechanics of the fusion process. Previous fusion simulations using FE models showed that the geometries and material of the cage can greatly influence the fusion outcome. However, these studies used axisymmetric models which lacked realistic spinal geometries. Therefore, different modeling approaches were evaluated to understand the bone-formation process. Three FE models of the lumbar motion segment (L4-L5) were developed: 2D, Sym-3D and Nonsym-3D. The fusion process based on existing mechano-regulation algorithms using the FE simulations to evaluate the mechanical environment was then integrated into these models. In addition, the influence of different lordotic angles (5, 10 and 15°) was investigated. The volume of newly formed bone, the axial stiffness of the whole segment and bone distribution inside and surrounding the cage were evaluated. In contrast to the Nonsym-3D, the 2D and Sym-3D models predicted excessive bone formation prior to bridging (peak values with 36 and 9% higher than in equilibrium, respectively). The 3D models predicted a more uniform bone distribution compared to the 2D model. The current results demonstrate the crucial role of the realistic 3D geometry of the lumbar motion segment in predicting bone formation after lumbar spinal fusion. Copyright © 2018 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Blasch, Erik; Salerno, John; Kadar, Ivan; Yang, Shanchieh J.; Fenstermacher, Laurie; Endsley, Mica; Grewe, Lynne
2013-05-01
During the SPIE 2012 conference, panelists convened to discuss "Real world issues and challenges in Human Social/Cultural/Behavioral modeling with Applications to Information Fusion." Each panelist presented their current trends and issues. The panel had agreement on advanced situation modeling, working with users for situation awareness and sense-making, and HSCB context modeling in focusing research activities. Each panelist added different perspectives based on the domain of interest such as physical, cyber, and social attacks from which estimates and projections can be forecasted. Also, additional techniques were addressed such as interest graphs, network modeling, and variable length Markov Models. This paper summarizes the panelists discussions to highlight the common themes and the related contrasting approaches to the domains in which HSCB applies to information fusion applications.
NASA Astrophysics Data System (ADS)
Maimaitijiang, Maitiniyazi; Ghulam, Abduwasit; Sidike, Paheding; Hartling, Sean; Maimaitiyiming, Matthew; Peterson, Kyle; Shavers, Ethan; Fishman, Jack; Peterson, Jim; Kadam, Suhas; Burken, Joel; Fritschi, Felix
2017-12-01
Estimating crop biophysical and biochemical parameters with high accuracy at low-cost is imperative for high-throughput phenotyping in precision agriculture. Although fusion of data from multiple sensors is a common application in remote sensing, less is known on the contribution of low-cost RGB, multispectral and thermal sensors to rapid crop phenotyping. This is due to the fact that (1) simultaneous collection of multi-sensor data using satellites are rare and (2) multi-sensor data collected during a single flight have not been accessible until recent developments in Unmanned Aerial Systems (UASs) and UAS-friendly sensors that allow efficient information fusion. The objective of this study was to evaluate the power of high spatial resolution RGB, multispectral and thermal data fusion to estimate soybean (Glycine max) biochemical parameters including chlorophyll content and nitrogen concentration, and biophysical parameters including Leaf Area Index (LAI), above ground fresh and dry biomass. Multiple low-cost sensors integrated on UASs were used to collect RGB, multispectral, and thermal images throughout the growing season at a site established near Columbia, Missouri, USA. From these images, vegetation indices were extracted, a Crop Surface Model (CSM) was advanced, and a model to extract the vegetation fraction was developed. Then, spectral indices/features were combined to model and predict crop biophysical and biochemical parameters using Partial Least Squares Regression (PLSR), Support Vector Regression (SVR), and Extreme Learning Machine based Regression (ELR) techniques. Results showed that: (1) For biochemical variable estimation, multispectral and thermal data fusion provided the best estimate for nitrogen concentration and chlorophyll (Chl) a content (RMSE of 9.9% and 17.1%, respectively) and RGB color information based indices and multispectral data fusion exhibited the largest RMSE 22.6%; the highest accuracy for Chl a + b content estimation was obtained by fusion of information from all three sensors with an RMSE of 11.6%. (2) Among the plant biophysical variables, LAI was best predicted by RGB and thermal data fusion while multispectral and thermal data fusion was found to be best for biomass estimation. (3) For estimation of the above mentioned plant traits of soybean from multi-sensor data fusion, ELR yields promising results compared to PLSR and SVR in this study. This research indicates that fusion of low-cost multiple sensor data within a machine learning framework can provide relatively accurate estimation of plant traits and provide valuable insight for high spatial precision in agriculture and plant stress assessment.
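The multi-sensor regression step can be sketched by concatenating per-plot features from RGB, multispectral, and thermal sources and fitting a partial least squares regression; the feature counts, plot count, and synthetic trait below are assumptions for illustration, not the study's data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(11)
n_plots = 120
rgb_idx = rng.normal(size=(n_plots, 5))       # stand-ins for RGB color indices
ms_idx = rng.normal(size=(n_plots, 8))        # stand-ins for multispectral vegetation indices
thermal = rng.normal(size=(n_plots, 2))       # stand-ins for canopy temperature features

# Synthetic trait (e.g., LAI) loosely driven by a few of the features.
lai = 2.0 + 0.8 * ms_idx[:, 0] - 0.5 * thermal[:, 0] + rng.normal(0, 0.3, n_plots)

fused = np.hstack([rgb_idx, ms_idx, thermal])         # feature-level sensor data fusion
pls = PLSRegression(n_components=4)
pred = cross_val_predict(pls, fused, lai, cv=5).ravel()

rmse = np.sqrt(np.mean((pred - lai) ** 2))
print("cross-validated RMSE:", round(rmse, 3), "relative:", round(rmse / lai.mean(), 3))
```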
Embedded security system for multi-modal surveillance in a railway carriage
NASA Astrophysics Data System (ADS)
Zouaoui, Rhalem; Audigier, Romaric; Ambellouis, Sébastien; Capman, François; Benhadda, Hamid; Joudrier, Stéphanie; Sodoyer, David; Lamarque, Thierry
2015-10-01
Public transport security is one of the main priorities of the public authorities when fighting against crime and terrorism. In this context, there is a great demand for autonomous systems able to detect abnormal events such as violent acts aboard passenger cars and intrusions when the train is parked at the depot. To this end, we present an innovative approach which aims at providing efficient automatic event detection by fusing video and audio analytics and reducing the false alarm rate compared to classical stand-alone video detection. The multi-modal system is composed of two microphones and one camera and integrates onboard video and audio analytics and fusion capabilities. On the one hand, for detecting intrusion, the system relies on the fusion of "unusual" audio events detection with intrusion detections from video processing. The audio analysis consists in modeling the normal ambience and detecting deviation from the trained models during testing. This unsupervised approach is based on clustering of automatically extracted segments of acoustic features and statistical Gaussian Mixture Model (GMM) modeling of each cluster. The intrusion detection is based on the three-dimensional (3D) detection and tracking of individuals in the videos. On the other hand, for violent events detection, the system fuses unsupervised and supervised audio algorithms with video event detection. The supervised audio technique detects specific events such as shouts. A GMM is used to catch the formant structure of a shout signal. Video analytics use an original approach for detecting aggressive motion by focusing on erratic motion patterns specific to violent events. As data with violent events is not easily available, a normality model with structured motions from non-violent videos is learned for one-class classification. A fusion algorithm based on Dempster-Shafer's theory analyses the asynchronous detection outputs and computes the degree of belief of each probable event.
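The final fusion step above relies on Dempster-Shafer theory. The following minimal Python sketch shows Dempster's rule of combination over a simple two-hypothesis frame; the mass values assigned to the audio and video detectors are illustrative assumptions, not values from the paper.

# Minimal sketch of Dempster-Shafer combination over the frame {"violent", "normal"};
# "theta" denotes total ignorance (the whole frame).
def ds_combine(m1, m2):
    hyps = ["violent", "normal", "theta"]

    def intersect(a, b):
        if a == "theta":
            return b
        if b == "theta":
            return a
        return a if a == b else None  # None = empty intersection (conflict)

    combined = {h: 0.0 for h in hyps}
    conflict = 0.0
    for a in hyps:
        for b in hyps:
            inter = intersect(a, b)
            if inter is None:
                conflict += m1[a] * m2[b]
            else:
                combined[inter] += m1[a] * m2[b]
    # Dempster's normalization by (1 - conflict)
    return {h: v / (1.0 - conflict) for h, v in combined.items()}

audio = {"violent": 0.6, "normal": 0.1, "theta": 0.3}   # e.g., shout detector
video = {"violent": 0.5, "normal": 0.2, "theta": 0.3}   # e.g., erratic-motion detector
print(ds_combine(audio, video))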
Neyman-Pearson biometric score fusion as an extension of the sum rule
NASA Astrophysics Data System (ADS)
Hube, Jens Peter
2007-04-01
We define the biometric performance invariance under strictly monotonic functions on match scores as normalization symmetry. We use this symmetry to clarify the essential difference between the standard score-level fusion approaches of sum rule and Neyman-Pearson. We then express Neyman-Pearson fusion assuming match scores defined using false acceptance rates on a logarithmic scale. We show that by stating Neyman-Pearson in this form, it reduces to sum rule fusion for ROC curves with logarithmic slope. We also introduce a one parameter model of biometric performance and use it to express Neyman-Pearson fusion as a weighted sum rule.
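As a toy numerical illustration of the form described above (not the paper's derivation), the sketch below maps each matcher's raw score to an empirical -log10 false-acceptance-rate and fuses the two matchers by the sum rule; the impostor score distributions and probe scores are synthetic assumptions.

# Minimal illustration: scores expressed as -log10(FAR) and fused by the sum rule,
# the form in which Neyman-Pearson fusion coincides with sum-rule fusion for
# ROC curves with logarithmic slope.
import numpy as np

def score_to_log_far(score, impostor_scores):
    """Empirical FAR at a threshold equal to `score`, on a log scale."""
    far = (np.sum(impostor_scores >= score) + 1) / (len(impostor_scores) + 1)
    return -np.log10(far)

rng = np.random.default_rng(1)
impostors_a = rng.normal(0.0, 1.0, 10_000)   # matcher A impostor scores
impostors_b = rng.normal(0.0, 1.5, 10_000)   # matcher B impostor scores

probe_a, probe_b = 2.3, 3.1                  # scores for one probe/gallery pair
fused = score_to_log_far(probe_a, impostors_a) + score_to_log_far(probe_b, impostors_b)
print(f"fused log-FAR score: {fused:.2f}")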
Proposed evaluation framework for assessing operator performance with multisensor displays
NASA Technical Reports Server (NTRS)
Foyle, David C.
1992-01-01
Despite aggressive work on the development of sensor fusion algorithms and techniques, no formal evaluation procedures have been proposed. Based on existing integration models in the literature, an evaluation framework is developed to assess an operator's ability to use multisensor, or sensor fusion, displays. The proposed evaluation framework for evaluating the operator's ability to use such systems is a normative approach: The operator's performance with the sensor fusion display can be compared to the models' predictions based on the operator's performance when viewing the original sensor displays prior to fusion. This allows for the determination as to when a sensor fusion system leads to: 1) poorer performance than one of the original sensor displays (clearly an undesirable system in which the fused sensor system causes some distortion or interference); 2) better performance than with either single sensor system alone, but at a sub-optimal (compared to the model predictions) level; 3) optimal performance (compared to model predictions); or, 4) super-optimal performance, which may occur if the operator were able to use some highly diagnostic 'emergent features' in the sensor fusion display, which were unavailable in the original sensor displays. An experiment demonstrating the usefulness of the proposed evaluation framework is discussed.
Discriminative confidence estimation for probabilistic multi-atlas label fusion.
Benkarim, Oualid M; Piella, Gemma; González Ballester, Miguel Angel; Sanroma, Gerard
2017-12-01
Quantitative neuroimaging analyses often rely on the accurate segmentation of anatomical brain structures. In contrast to manual segmentation, automatic methods offer reproducible outputs and provide scalability to study large databases. Among existing approaches, multi-atlas segmentation has recently been shown to yield state-of-the-art performance in automatic segmentation of brain images. It consists in propagating the labelmaps from a set of atlases to the anatomy of a target image using image registration, and then fusing these multiple warped labelmaps into a consensus segmentation on the target image. Accurately estimating the contribution of each atlas labelmap to the final segmentation is a critical step for the success of multi-atlas segmentation. Common approaches to label fusion either rely on local patch similarity, probabilistic statistical frameworks or a combination of both. In this work, we propose a probabilistic label fusion framework based on atlas label confidences computed at each voxel of the structure of interest. Maximum likelihood atlas confidences are estimated using a supervised approach, explicitly modeling the relationship between local image appearances and segmentation errors produced by each of the atlases. We evaluate different spatial pooling strategies for modeling local segmentation errors. We also present a novel type of label-dependent appearance features based on atlas labelmaps that are used during confidence estimation to increase the accuracy of our label fusion. Our approach is evaluated on the segmentation of seven subcortical brain structures from the MICCAI 2013 SATA Challenge dataset and the hippocampi from the ADNI dataset. Overall, our results indicate that the proposed label fusion framework achieves superior performance to state-of-the-art approaches in the majority of the evaluated brain structures and shows more robustness to registration errors. Copyright © 2017 Elsevier B.V. All rights reserved.
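To make the fusion step concrete, the following minimal Python sketch performs voxel-wise confidence-weighted voting over warped atlas labelmaps. The arrays are synthetic and the per-voxel confidences are simply drawn at random, so this stands in for, but does not reproduce, the supervised confidence estimation described above.

# Minimal sketch of voxel-wise weighted label fusion on a toy 2D "volume".
import numpy as np

rng = np.random.default_rng(2)
n_atlases, shape = 5, (4, 4)
labels = rng.integers(0, 2, size=(n_atlases, *shape))         # warped atlas labelmaps (0 = background, 1 = structure)
confidence = rng.uniform(0.2, 1.0, size=(n_atlases, *shape))  # per-voxel atlas confidences (placeholder values)

votes_fg = np.sum(confidence * (labels == 1), axis=0)          # weighted votes for the structure
votes_bg = np.sum(confidence * (labels == 0), axis=0)          # weighted votes for background
consensus = (votes_fg > votes_bg).astype(int)                  # fused segmentation
print(consensus)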
Subcellular localization of transiently expressed fluorescent fusion proteins.
Collings, David A
2013-01-01
The recent and massive expansion in plant genomics data has generated a large number of gene sequences for which two seemingly simple questions need to be answered: where do the proteins encoded by these genes localize in cells, and what do they do? One widespread approach to answering the localization question has been to use particle bombardment to transiently express unknown proteins tagged with green fluorescent protein (GFP) or its numerous derivatives. Confocal fluorescence microscopy is then used to monitor the localization of the fluorescent protein as it hitches a ride through the cell. The subcellular localization of the fusion protein, if not immediately apparent, can then be determined by comparison to localizations generated by fluorescent protein fusions to known signalling sequences and proteins, or by direct comparison with fluorescent dyes. This review aims to be a tour guide for researchers wanting to travel this hitch-hiker's path, and for reviewers and readers who wish to understand their travel reports. It will describe some of the technology available for visualizing protein localizations, and some of the experimental approaches for optimizing and confirming localizations generated by particle bombardment in onion epidermal cells, the most commonly used experimental system. As the non-conservation of signal sequences in heterologous expression systems such as onion, and consequent mis-targeting of fusion proteins, is always a potential problem, the epidermal cells of the Argenteum mutant of pea are proposed as a model system.
Zhang, Xinzheng; Rad, Ahmad B; Wong, Yiu-Kwong
2012-01-01
This paper presents a sensor fusion strategy for Simultaneous Localization and Mapping (SLAM) in dynamic environments. The designed approach has two features: (i) the first is a fusion module which synthesizes line segments obtained from the laser rangefinder with line features extracted from the monocular camera. This policy eliminates pseudo segments that appear in the laser data from momentary pauses of dynamic objects. (ii) The second is a modified multi-sensor point estimation fusion SLAM (MPEF-SLAM) that incorporates two individual Extended Kalman Filter (EKF) based SLAM algorithms: monocular and laser SLAM. The localization error of the fused SLAM is reduced compared with that of each individual SLAM algorithm. Additionally, a new data association technique based on the homography transformation matrix is developed for monocular SLAM; this method reduces redundant computation. The experimental results validate the performance of the proposed sensor fusion and data association methods.
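As a minimal illustration of point-estimate fusion between two EKF-based SLAM solutions, the sketch below combines two independent position estimates by inverse-covariance weighting; the numbers are illustrative assumptions, and the full MPEF-SLAM formulation is not reproduced.

# Minimal sketch: fusing two pose estimates (e.g., from laser SLAM and monocular SLAM)
# by inverse-covariance (information-form) weighting.
import numpy as np

x_laser = np.array([1.02, 0.48])            # position estimate from laser SLAM
P_laser = np.diag([0.04, 0.09])             # its covariance
x_mono = np.array([0.95, 0.55])             # position estimate from monocular SLAM
P_mono = np.diag([0.16, 0.04])

info = np.linalg.inv(P_laser) + np.linalg.inv(P_mono)
P_fused = np.linalg.inv(info)
x_fused = P_fused @ (np.linalg.inv(P_laser) @ x_laser + np.linalg.inv(P_mono) @ x_mono)
print(x_fused, np.diag(P_fused))            # fused estimate has smaller variance than either input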
Empirical Data Fusion for Convective Weather Hazard Nowcasting
NASA Astrophysics Data System (ADS)
Williams, J.; Ahijevych, D.; Steiner, M.; Dettling, S.
2009-09-01
This paper describes a statistical analysis approach to developing an automated convective weather hazard nowcast system suitable for use by aviation users in strategic route planning and air traffic management. The analysis makes use of numerical weather prediction model fields and radar, satellite, and lightning observations and derived features along with observed thunderstorm evolution data, which are aligned using radar-derived motion vectors. Using a dataset collected during the summers of 2007 and 2008 over the eastern U.S., the predictive contributions of the various potential predictor fields are analyzed for various spatial scales, lead-times and scenarios using a technique called random forests (RFs). A minimal, skillful set of predictors is selected for each scenario requiring distinct forecast logic, and RFs are used to construct an empirical probabilistic model for each. The resulting data fusion system, which ran in real-time at the National Center for Atmospheric Research during the summer of 2009, produces probabilistic and deterministic nowcasts of the convective weather hazard and assessments of the prediction uncertainty. The nowcasts' performance and results for several case studies are presented to demonstrate the value of this approach. This research has been funded by the U.S. Federal Aviation Administration to support the development of the Consolidated Storm Prediction for Aviation (CoSPA) system, which is intended to provide convective hazard nowcasts and forecasts for the U.S. Next Generation Air Transportation System (NextGen).
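The sketch below illustrates the random-forest step in Python on synthetic data: a classifier trained on stand-in predictors emits a probabilistic hazard nowcast and per-predictor importances. The predictor names and data are assumptions, not the CoSPA predictor set.

# Minimal sketch: a random forest producing probabilistic convective-hazard nowcasts
# from fused NWP, radar, satellite and lightning predictors (all synthetic here).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
n = 5000
# Stand-ins for predictors such as CAPE, radar reflectivity, IR brightness
# temperature and lightning flash density, aligned by motion vectors.
X = rng.normal(size=(n, 4))
hazard = (0.8 * X[:, 0] + 0.6 * X[:, 1] + rng.normal(scale=0.5, size=n)) > 1.0

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, hazard)
prob = rf.predict_proba(X[:5])[:, 1]        # probabilistic nowcast for five grid points
importance = rf.feature_importances_        # crude view of predictor contributions
print(prob, importance)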
Fusion of MultiSpectral and Panchromatic Images Based on Morphological Operators.
Restaino, Rocco; Vivone, Gemine; Dalla Mura, Mauro; Chanussot, Jocelyn
2016-04-20
Nonlinear decomposition schemes constitute an alternative to classical approaches to the problem of data fusion. In this paper we discuss the application of this methodology to a popular remote sensing application called pansharpening, which consists in the fusion of a low-resolution multispectral image and a high-resolution panchromatic image. We design a complete pansharpening scheme based on morphological half-gradient operators and demonstrate the suitability of this algorithm through comparison with state-of-the-art approaches. Four datasets acquired by the Pleiades, WorldView-2, Ikonos and GeoEye-1 satellites are employed for the performance assessment, attesting to the effectiveness of the proposed approach in producing top-class images with a setting independent of the specific sensor.
Protein fold recognition using geometric kernel data fusion.
Zakeri, Pooya; Jeuris, Ben; Vandebril, Raf; Moreau, Yves
2014-07-01
Various approaches based on features extracted from protein sequences, often combined with machine learning methods, have been used for the prediction of protein folds. Finding an efficient technique for integrating these different protein features has received increasing attention. In particular, kernel methods are an interesting class of techniques for integrating heterogeneous data. Various methods have been proposed to fuse multiple kernels. Most techniques for multiple kernel learning focus on learning a convex linear combination of base kernels. In addition to the limitation of linear combinations, working with such approaches could cause a loss of potentially useful information. We design several techniques to combine kernel matrices by taking more involved, geometry-inspired means of these matrices instead of convex linear combinations. We consider various sequence-based protein features including information extracted directly from position-specific scoring matrices and local sequence alignment. We evaluate our methods for classification on the SCOP PDB-40D benchmark dataset for protein fold recognition. The best overall accuracy on the protein fold recognition test set obtained by our methods is ∼ 86.7%. This is an improvement over the results of the best existing approach. Moreover, our computational model has been developed by incorporating the functional domain composition of proteins through a hybridization model. It is observed that by using our proposed hybridization model, the protein fold recognition accuracy is further improved to 89.30%. Furthermore, we investigate the performance of our approach on the protein remote homology detection problem by fusing multiple string kernels. The MATLAB code used for our proposed geometric kernel fusion frameworks is publicly available at http://people.cs.kuleuven.be/∼raf.vandebril/homepage/software/geomean.php?menu=5/. © The Author 2014. Published by Oxford University Press.
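One geometry-inspired combination of the kind mentioned above is the matrix geometric mean of two base kernels, A # B = A^{1/2}(A^{-1/2} B A^{-1/2})^{1/2} A^{1/2}. The Python sketch below (using NumPy/SciPy rather than the authors' MATLAB code) computes it for two toy kernels built from random features; the features are assumptions for illustration only.

# Minimal sketch: matrix geometric mean of two strictly positive-definite kernels.
import numpy as np
from scipy.linalg import sqrtm, inv

rng = np.random.default_rng(4)
F1 = rng.normal(size=(20, 6))               # features from, e.g., PSSM-derived descriptors
F2 = rng.normal(size=(20, 8))               # features from, e.g., local sequence alignment
A = F1 @ F1.T + 1e-3 * np.eye(20)           # base kernel 1 (ridge keeps it strictly PD)
B = F2 @ F2.T + 1e-3 * np.eye(20)           # base kernel 2

A_half = sqrtm(A)
A_half_inv = inv(A_half)
G = A_half @ sqrtm(A_half_inv @ B @ A_half_inv) @ A_half
G = np.real((G + G.T) / 2)                  # clean up numerical asymmetry / tiny imaginary parts
print(np.allclose(G, G.T), np.all(np.linalg.eigvalsh(G) > 0))   # fused kernel is symmetric PD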
Wu, Lingfei; Wu, Kesheng; Sim, Alex; ...
2016-06-01
A novel algorithm and implementation of real-time identification and tracking of blob-filaments in fusion reactor data is presented. Similar spatio-temporal features are important in many other applications, for example, ignition kernels in combustion and tumor cells in a medical image. This work presents an approach for extracting these features by dividing the overall task into three steps: local identification of feature cells, grouping feature cells into extended features, and tracking the movement of features through overlapping in space. Through our extensive work in parallelization, we demonstrate that this approach can effectively make use of a large number of compute nodes to detect and track blob-filaments in real time in fusion plasma. On a set of 30 GB of fusion simulation data, we observed linear speedup on 1024 processes and completed blob detection in less than three milliseconds using Edison, a Cray XC30 system at NERSC.
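The three-step decomposition described above can be sketched in serial Python (without the parallelization that is the paper's main contribution) as follows; the toy frames and threshold are assumptions.

# Minimal sketch: threshold cells, group them into connected blobs, track blobs
# between frames by spatial overlap.
import numpy as np
from scipy import ndimage

def detect_blobs(frame, threshold=0.7):
    mask = frame > threshold                      # step 1: local identification of feature cells
    labeled, n = ndimage.label(mask)              # step 2: group feature cells into blobs
    return labeled, n

def track_by_overlap(labeled_prev, labeled_curr):
    """Step 3: match blobs whose footprints overlap between consecutive frames."""
    matches = {}
    for blob_id in range(1, labeled_curr.max() + 1):
        overlap = labeled_prev[labeled_curr == blob_id]
        overlap = overlap[overlap > 0]
        matches[blob_id] = int(np.bincount(overlap).argmax()) if overlap.size else None
    return matches

rng = np.random.default_rng(5)
frame0 = rng.random((64, 64))
frame1 = np.roll(frame0, 2, axis=1)               # a drifted version of the same field
prev, _ = detect_blobs(frame0)
curr, _ = detect_blobs(frame1)
print(track_by_overlap(prev, curr))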
Raman/LIBS Data Fusion via Two-Way Variational Autoencoders
NASA Astrophysics Data System (ADS)
Parente, M.; Gemp, I.
2018-04-01
We propose an original solution to extracting mineral abundances from Raman spectra by combining Raman data with LIBS using a novel deep learning model based on variational autoencoders and data fusion, which outperforms the current state of the art.
Functional human antibody CDR fusions as long-acting therapeutic endocrine agonists.
Liu, Tao; Zhang, Yong; Liu, Yan; Wang, Ying; Jia, Haiqun; Kang, Mingchao; Luo, Xiaozhou; Caballero, Dawna; Gonzalez, Jose; Sherwood, Lance; Nunez, Vanessa; Wang, Danling; Woods, Ashley; Schultz, Peter G; Wang, Feng
2015-02-03
On the basis of the 3D structure of a bovine antibody with a well-folded, ultralong complementarity-determining region (CDR), we have developed a versatile approach for generating human or humanized antibody agonists with excellent pharmacological properties. Using human growth hormone (hGH) and human leptin (hLeptin) as model proteins, we have demonstrated that functional human antibody CDR fusions can be efficiently engineered by grafting the native hormones into different CDRs of the humanized antibody Herceptin. The resulting Herceptin CDR fusion proteins were expressed in good yields in mammalian cells and retain comparable in vitro biological activity to the native hormones. Pharmacological studies in rodents indicated a 20- to 100-fold increase in plasma circulating half-life for these antibody agonists and significantly extended in vivo activities in the GH-deficient rat model and leptin-deficient obese mouse model for the hGH and hLeptin antibody fusions, respectively. These results illustrate the utility of antibody CDR fusions as a general and versatile strategy for generating long-acting protein therapeutics.
Chun, Danielle S; Cook, Ralph W; Weiner, Joseph A; Schallmo, Michael S; Barth, Kathryn A; Singh, Sameer K; Freshman, Ryan D; Patel, Alpesh A; Hsu, Wellington K
2018-03-01
Retrospective cohort. Determine whether surgeon demographic factors influence postoperative complication rates after elective spine fusion procedures. Surgeon demographic factors have been shown to impact decision making in the management of degenerative disease of the lumbar spine. Complication rates are frequently reported outcome measurements used to evaluate surgical treatments and quality of care and to determine health care reimbursements. However, there are few studies investigating the association between surgeon demographic factors and complication outcomes after elective spine fusions. A database of US spine surgeons with corresponding postoperative complications data after elective spine fusions was compiled utilizing public data provided by the Centers for Medicare and Medicaid Services (2011-2013) and ProPublica Surgeon Scorecard (2009-2013). Demographic data for each surgeon were collected and consisted of: surgical specialty (orthopedic vs. neurosurgery), years in practice, practice setting (private vs. academic), type of medical degree (MD vs. DO), medical school location (United States vs. foreign), sex, and geographic region of practice. General linear mixed models using a Beta distribution with a logit link and pairwise comparison with post hoc Tukey-Kramer were used to assess the relationship between surgeon demographics and complication rates. A total of 2110 US-practicing spine surgeons who performed spine fusions on 125,787 Medicare patients from 2011 to 2013 met inclusion criteria for this study. None of the surgeon demographic factors analyzed were found to significantly affect overall complication rates in lumbar (posterior approach) or cervical spine fusion. Publicly available complication rates for individual spine surgeons are being utilized by hospital systems and patients to assess aptitude and gauge expectations. The increasing demand for transparency will likely lead to emphasis of these statistics to improve outcomes. We conclude that none of the surgeon demographic factors analyzed in this study are associated with differences in overall complication rates in patients undergoing elective spine fusion as published by the ProPublica Surgeon Scorecard. Level 3.
Model-based diagnostics of gas turbine engine lubrication systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Byington, C.S.
1998-09-01
The objective of the current research was to develop improved methodology for diagnosing anomalies and maintaining oil lubrication systems for gas turbine engines. The effort focused on the development of reasoning modules that utilize the existing, inexpensive sensors and are applicable to on-line monitoring within the full-authority digital engine controller (FADEC) of the engine. The target application is the Enhanced TF-40B gas turbine engine that powers the Landing Craft Air Cushion (LCAC) platform. To accomplish the development of the requisite data fusion algorithms and automated reasoning for the diagnostic modules, Penn State ARL produced a generic Turbine Engine Lubrication System Simulator (TELSS) and Data Fusion Workbench (DFW). TELSS is a portable simulator code that calculates lubrication system parameters based upon one-dimensional fluid flow resistance network equations. Validation of the TF-40B modules was performed using engineering and limited test data. The simulation model was used to analyze operational data from the LCAC fleet. The TELSS, as an integral portion of the DFW, provides the capability to experiment with combinations of variables and feature vectors that characterize normal and abnormal operation of the engine lubrication system. The model-based diagnostics approach is applicable to all gas turbine engines and mechanical transmissions with similar pressure-fed lubrication systems.
Wu, Mingquan; Yang, Chenghai; Song, Xiaoyu; Hoffmann, Wesley Clint; Huang, Wenjiang; Niu, Zheng; Wang, Changyao; Li, Wang; Yu, Bo
2018-01-31
To better understand the progression of cotton root rot within the season, time series monitoring is required. In this study, an improved spatial and temporal data fusion approach (ISTDFA) was employed to combine 250-m Moderate Resolution Imaging Spectroradiometer (MODIS) Normalized Difference Vegetation Index (NDVI) and 10-m Sentinel-2 NDVI data to generate a synthetic Sentinel-2 NDVI time series for monitoring this disease. Then, the phenology of healthy cotton and infected cotton was modeled using a logistic model. Finally, several phenology parameters, including the onset day of greenness minimum (OGM), growing season length (GSL), onset of greenness increase (OGI), max NDVI value, and integral area of the phenology curve, were calculated. The results showed that ISTDFA could be used to combine time series MODIS and Sentinel-2 NDVI data with a correlation coefficient of 0.893. The logistic model could describe the phenology curves with R-squared values from 0.791 to 0.969. Moreover, the phenology curve of infected cotton showed a significant difference from that of healthy cotton. The max NDVI value, OGM, GSL and the integral area of the phenology curve for infected cotton were reduced by 0.045, 30 days, 22 days, and 18.54%, respectively, compared with those for healthy cotton.
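As an illustration of the phenology-fitting step, the Python sketch below fits a simple logistic green-up curve to a synthetic NDVI time series and extracts a few of the metrics named above (max NDVI, an approximate onset of greenness increase, and the integral area). The functional form, parameter values and the OGI approximation are assumptions; the study's full model, which also captures senescence and OGM, is not reproduced.

# Minimal sketch: logistic fit to a synthetic NDVI series and simple phenology metrics.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, base, amp, rate, t_mid):
    # simple green-up logistic; the study's model may differ in form
    return base + amp / (1.0 + np.exp(-rate * (t - t_mid)))

doy = np.arange(60, 200, 8)                                   # day of year
rng = np.random.default_rng(6)
ndvi = logistic(doy, 0.2, 0.6, 0.12, 120) + rng.normal(0, 0.02, doy.size)

popt, _ = curve_fit(logistic, doy, ndvi, p0=[0.2, 0.5, 0.1, 130])
base, amp, rate, t_mid = popt
max_ndvi = base + amp
ogi = t_mid - 2.0 / rate                                      # rough onset of greenness increase (assumed definition)
fitted = logistic(doy, *popt)
integral = np.sum((fitted[:-1] + fitted[1:]) / 2.0 * np.diff(doy))   # trapezoidal area under the curve
print(f"max NDVI = {max_ndvi:.2f}, OGI ~ day {ogi:.0f}, integral area = {integral:.1f}")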
Khodabandeloo, Babak; Melvin, Dyan; Jo, Hongki
2017-01-01
Direct measurements of external forces acting on a structure are infeasible in many cases. The Augmented Kalman Filter (AKF) has several attractive features that can be utilized to solve the inverse problem of identifying applied forces, as it requires the dynamic model and the measured responses of structure at only a few locations. But, the AKF intrinsically suffers from numerical instabilities when accelerations, which are the most common response measurements in structural dynamics, are the only measured responses. Although displacement measurements can be used to overcome the instability issue, the absolute displacement measurements are challenging and expensive for full-scale dynamic structures. In this paper, a reliable model-based data fusion approach to reconstruct dynamic forces applied to structures using heterogeneous structural measurements (i.e., strains and accelerations) in combination with AKF is investigated. The way of incorporating multi-sensor measurements in the AKF is formulated. Then the formulation is implemented and validated through numerical examples considering possible uncertainties in numerical modeling and sensor measurement. A planar truss example was chosen to clearly explain the formulation, while the method and formulation are applicable to other structures as well. PMID:29149088
An approach to 3D model fusion in GIS systems and its application in a future ECDIS
NASA Astrophysics Data System (ADS)
Liu, Tao; Zhao, Depeng; Pan, Mingyang
2016-04-01
Three-dimensional (3D) computer graphics technology is widely used in various areas and causes profound changes. As an information carrier, 3D models are becoming increasingly important. The use of 3D models greatly helps to improve the cartographic expression and design. 3D models are more visually efficient, quicker and easier to understand and they can express more detailed geographical information. However, it is hard to efficiently and precisely fuse 3D models in local systems. The purpose of this study is to propose an automatic and precise approach to fuse 3D models in geographic information systems (GIS). It is the basic premise for subsequent uses of 3D models in local systems, such as attribute searching, spatial analysis, and so on. The basic steps of our research are: (1) pose adjustment by principal component analysis (PCA); (2) silhouette extraction by simple mesh silhouette extraction and silhouette merger; (3) size adjustment; (4) position matching. Finally, we implement the above methods in our system Automotive Intelligent Chart (AIC) 3D Electronic Chart Display and Information Systems (ECDIS). The fusion approach we propose is a common method and each calculation step is carefully designed. This approach solves the problem of cross-platform model fusion. 3D models can be from any source. They may be stored in the local cache or retrieved from Internet, or may be manually created by different tools or automatically generated by different programs. The system can be any kind of 3D GIS system.
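The pose-adjustment step (1) above can be sketched in Python as aligning a model's principal axes with the coordinate axes via PCA of its vertex cloud; the synthetic vertices and rotation are assumptions, and the silhouette, size and position steps are not shown.

# Minimal sketch: PCA-based pose adjustment of a 3D vertex cloud.
import numpy as np

rng = np.random.default_rng(7)
vertices = rng.normal(size=(500, 3)) * np.array([5.0, 2.0, 0.5])   # elongated toy model
theta = np.deg2rad(30)
rot = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                [np.sin(theta),  np.cos(theta), 0.0],
                [0.0, 0.0, 1.0]])
vertices = vertices @ rot.T + np.array([10.0, -3.0, 2.0])          # arbitrary pose and offset

centered = vertices - vertices.mean(axis=0)
_, _, vt = np.linalg.svd(centered, full_matrices=False)             # rows of vt are the principal axes
aligned = centered @ vt.T                                           # axes now match x, y, z
print(np.round(np.cov(aligned, rowvar=False), 2))                   # near-diagonal covariance after alignment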
Assessing tropical rainforest growth traits: Data - Model fusion in the Congo basin and beyond
NASA Astrophysics Data System (ADS)
Pietsch, Stephan
2017-04-01
Virgin forest ecosystems represent the key reference for natural tree growth dynamics. The mosaic cycle concept describes such dynamics as local disequilibria driven by patch-level succession cycles of breakdown, regeneration, juvenescence and old growth. These cycles, however, may involve different traits of light-demanding and shade-tolerant species assemblies. In this work a data-model fusion concept is introduced to assess the differences in growth dynamics of the mosaic cycle of the Western Congolian Lowland Rainforest ecosystem. Field data from 34 forest patches located in an ice age forest refuge, recently pinpointed to the ground and still devoid of direct human impact up to today, constitute the database. A 3D error assessment procedure versus BGC model simulations for the 34 patches revealed two different growth dynamics, consistent with observed growth traits of pioneer and late-succession species assemblies of the Western Congolian Lowland Rainforest. An application of the same procedure to Central American Pacific rainforests confirms the strength of the 3D error field data-model fusion concept for assessing different growth traits of the mosaic cycle of natural forest dynamics.
Yoo, Jejoong; Jackson, Meyer B.; Cui, Qiang
2013-01-01
To establish the validity of continuum mechanics models quantitatively for the analysis of membrane remodeling processes, we compare the shape and energies of the membrane fusion pore predicted by coarse-grained (MARTINI) and continuum mechanics models. The results at these distinct levels of resolution give surprisingly consistent descriptions for the shape of the fusion pore, and the deviation between the continuum and coarse-grained models becomes notable only when the radius of curvature approaches the thickness of a monolayer. Although slow relaxation beyond microseconds is observed in different perturbative simulations, the key structural features (e.g., dimension and shape of the fusion pore near the pore center) are consistent among independent simulations. These observations provide solid support for the use of coarse-grained and continuum models in the analysis of membrane remodeling. The combined coarse-grained and continuum analysis confirms the recent prediction of continuum models that the fusion pore is a metastable structure and that its optimal shape is neither toroidal nor catenoidal. Moreover, our results help reveal a new, to our knowledge, bowing feature in which the bilayers close to the pore axis separate more from one another than those at greater distances from the pore axis; bowing helps reduce the curvature and therefore stabilizes the fusion pore structure. The spread of the bilayer deformations over distances of hundreds of nanometers and the substantial reduction in energy of fusion pore formation provided by this spread indicate that membrane fusion can be enhanced by allowing a larger area of membrane to participate and be deformed. PMID:23442963
Xia, Youshen; Kamel, Mohamed S
2007-06-01
Identification of a general nonlinear noisy system viewed as an estimation of a predictor function is studied in this article. A measurement fusion method for the predictor function estimate is proposed. In the proposed scheme, observed data are first fused by using an optimal fusion technique, and then the optimal fused data are incorporated in a nonlinear function estimator based on a robust least squares support vector machine (LS-SVM). A cooperative learning algorithm is proposed to implement the proposed measurement fusion method. Compared with related identification methods, the proposed method can minimize both the approximation error and the noise error. The performance analysis shows that the proposed optimal measurement fusion function estimate has a smaller mean square error than the LS-SVM function estimate. Moreover, the proposed cooperative learning algorithm can converge globally to the optimal measurement fusion function estimate. Finally, the proposed measurement fusion method is applied to ARMA signal and spatial temporal signal modeling. Experimental results show that the proposed measurement fusion method can provide a more accurate model.
Multimodal biometric system using rank-level fusion approach.
Monwar, Md Maruf; Gavrilova, Marina L
2009-08-01
In many real-world applications, unimodal biometric systems often face significant limitations due to sensitivity to noise, intraclass variability, data quality, nonuniversality, and other factors. Attempting to improve the performance of individual matchers in such situations may not prove to be highly effective. Multibiometric systems seek to alleviate some of these problems by providing multiple pieces of evidence of the same identity. These systems help achieve an increase in performance that may not be possible using a single-biometric indicator. This paper presents an effective fusion scheme that combines information presented by multiple domain experts based on the rank-level fusion integration method. The developed multimodal biometric system possesses a number of unique qualities, starting from utilizing principal component analysis and Fisher's linear discriminant methods for individual matchers (face, ear, and signature) identity authentication and utilizing the novel rank-level fusion method in order to consolidate the results obtained from different biometric matchers. The ranks of individual matchers are combined using the highest rank, Borda count, and logistic regression approaches. The results indicate that fusion of individual modalities can improve the overall performance of the biometric system, even in the presence of low quality data. Insights on multibiometric design using rank-level fusion and its performance on a variety of biometric databases are discussed in the concluding section.
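As a small illustration of rank-level fusion, the Python sketch below combines three matchers' ranked candidate lists with the Borda count; the candidate IDs and rankings are invented for illustration, and the highest-rank and logistic-regression variants are not shown.

# Minimal sketch: Borda-count fusion of ranked candidate lists from three matchers.
def borda_fusion(rank_lists):
    """Each rank list orders candidate IDs from best to worst."""
    scores = {}
    for ranks in rank_lists:
        n = len(ranks)
        for position, candidate in enumerate(ranks):
            scores[candidate] = scores.get(candidate, 0) + (n - position)
    return sorted(scores, key=scores.get, reverse=True)

face = ["id3", "id1", "id7", "id2"]          # face matcher ranking
ear = ["id1", "id3", "id2", "id7"]           # ear matcher ranking
signature = ["id3", "id2", "id1", "id7"]     # signature matcher ranking
print(borda_fusion([face, ear, signature]))  # consensus ranking, best candidate first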
Sensor Fusion Based Model for Collision Free Mobile Robot Navigation
Almasri, Marwah; Elleithy, Khaled; Alajlan, Abrar
2015-01-01
Autonomous mobile robots have become a very popular and interesting topic in the last decade. Each of them is equipped with various types of sensors such as GPS, camera, infrared and ultrasonic sensors. These sensors are used to observe the surrounding environment. However, these sensors sometimes fail and have inaccurate readings. Therefore, the integration of sensor fusion will help to solve this dilemma and enhance the overall performance. This paper presents a collision-free mobile robot navigation approach based on the fuzzy logic fusion model. Eight distance sensors and a range finder camera are used for the collision avoidance approach, while three ground sensors are used for the line or path following approach. The fuzzy system is composed of nine inputs, which are the eight distance sensors and the camera, two outputs, which are the left and right velocities of the mobile robot’s wheels, and 24 fuzzy rules for the robot’s movement. The Webots Pro simulator is used for modeling the environment and the robot. The proposed methodology, which includes the collision avoidance based on the fuzzy logic fusion model and line following robot, has been implemented and tested through simulation and real-time experiments. Various scenarios have been presented with static and dynamic obstacles using one robot and two robots while avoiding obstacles of different shapes and sizes. PMID:26712766
Goz, Vadim; Weinreb, Jeffrey H; Schwab, Frank; Lafage, Virginie; Errico, Thomas J
2014-09-01
Lumbar interbody fusion (LIF) techniques have been used for years to treat a number of pathologies of the lower back. These procedures may use an anterior, posterior, or combined surgical approach. Each approach is associated with a unique set of complications, but the exact prevalence of complications associated with each approach remains unclear. To investigate the rates of perioperative complications of anterior lumbar interbody fusion (ALIF), posterior/transforaminal lumbar interbody fusion (P/TLIF), and LIF with a combined anterior-posterior interbody fusion (APF). Retrospective review of national data from a large administrative database. Patients undergoing ALIF, P/TLIF, or APF. Perioperative complications, length of stay (LOS), total costs, and mortality. The Nationwide Inpatient Sample database was queried for patients undergoing ALIF, P/TLIF, or APF between 2001 and 2010 as identified via International Classification of Diseases, ninth revision codes. Univariate analyses were carried out comparing the three cohorts in terms of the outcomes of interest. Multivariate analysis for primary outcomes was carried out adjusting for overall comorbidity burden, race, gender, age, and length of fusion. National estimates of annual total number of procedures were calculated based on the provided discharge weights. Geographic distribution of the three cohorts was also investigated. An estimated total of 923,038 LIFs were performed between 2001 and 2010 in the United States. Posterior/transforaminal lumbar interbody fusions accounted for 79% to 86% of total LIFs between 2001 and 2010, ALIFs for 10% to 15%, and APF decreased from 10% in 2002 to less than 1% in 2010. On average, P/TLIF patients were oldest (54.55 years), followed by combined approach (47.23 years) and ALIF (46.94 years) patients (p<.0001). Anterior lumbar interbody fusion, P/TLIF, and combined surgical costs were $75,872, $65,894, and $92,249, respectively (p<.0001). Patients in the P/TLIF cohort had the greatest number of comorbidities, having the highest prevalence for 10 of 17 comorbidities investigated. Anterior-posterior interbody fusion group was associated with the greatest number of complications, having the highest incidence of 12 of the 16 complications investigated. These data help to define the perioperative risks for several LIF approaches. Comparison of outcomes showed that a combined approach is more expensive and associated with greater LOS, whereas ALIF is associated with the highest postoperative mortality. These trends should be taken into consideration during surgical planning to improve clinical outcomes. Copyright © 2014 Elsevier Inc. All rights reserved.
Statistical rice yield modeling using blended MODIS-Landsat based crop phenology metrics in Taiwan
NASA Astrophysics Data System (ADS)
Chen, C. R.; Chen, C. F.; Nguyen, S. T.; Lau, K. V.
2015-12-01
Taiwan is a populated island with a majority of residents settled in the western plains where soils are suitable for rice cultivation. Rice is not only the most important commodity, but also plays a critical role in agricultural and food marketing. Information on rice production is thus important for policymakers to devise timely plans for ensuring sustainable socioeconomic development. Because rice fields in Taiwan are generally small and yet crop monitoring requires information on crop phenology associated with the spatiotemporal resolution of satellite data, this study used Landsat-MODIS fusion data for rice yield modeling in Taiwan. We processed the data for the first crop (Feb-Mar to Jun-Jul) and the second (Aug-Sep to Nov-Dec) in 2014 through five main steps: (1) data pre-processing to account for geometric and radiometric errors of Landsat data, (2) Landsat-MODIS data fusion using the spatial-temporal adaptive reflectance fusion model, (3) construction of the smooth time-series enhanced vegetation index 2 (EVI2), (4) rice yield modeling using EVI2-based crop phenology metrics, and (5) error verification. A comparison between EVI2 derived from the fusion image and that from the reference Landsat image indicated close agreement between the two datasets (R2 > 0.8). We analysed smooth EVI2 curves to extract phenology metrics or phenological variables for establishment of rice yield models. The results indicated that the established yield models significantly explained more than 70% of the variability in the data (p-value < 0.001). The comparison results between the estimated yields and the government's yield statistics for the first and second crops indicated a close significant relationship between the two datasets (R2 > 0.8), in both cases. The root mean square error (RMSE) and mean absolute error (MAE) used to measure the model accuracy revealed the consistency between the estimated yields and the government's yield statistics. This study demonstrates advantages of using EVI2-based phenology metrics (derived from Landsat-MODIS fusion data) for rice yield estimation in Taiwan prior to the harvest period.
NASA Astrophysics Data System (ADS)
Pournamdari, M.; Hashim, M.
2014-02-01
Chromite ore deposits occur in association with ophiolite complexes, which form part of the oceanic crust, and provide a good opportunity for lithological mapping using remote sensing data. The main contribution of this paper is a novel approach for discriminating different rock units associated with an ophiolite complex using the Feature Level Fusion technique on ASTER and Landsat TM satellite data at regional scale. In addition, this study applied a spectral transform approach, the Spectral Angle Mapper (SAM), to distinguish high-potential areas of chromite concentration and to determine the boundaries between different rock units. Results indicated that both approaches show superior outputs compared to other methods and can produce a geological map of ophiolite complex rock units in arid and semi-arid regions. The novel technique, combining feature level fusion and the Spectral Angle Mapper (SAM), discriminated ophiolitic rock units and produced detailed geological maps of the study area. As a case study, the Sikhoran ophiolite complex located in SE Iran was selected for the image processing techniques. In conclusion, a suitable approach for lithological mapping of ophiolite complexes is demonstrated; this technique contributes meaningfully towards economic geology in terms of identifying new prospects.
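The Spectral Angle Mapper used above reduces to computing the angle between each pixel spectrum and a reference spectrum. A minimal Python sketch follows; the six-band spectra are synthetic stand-ins, not values from the study.

# Minimal sketch: spectral angle between a pixel spectrum and a reference (endmember) spectrum;
# small angles indicate spectrally similar material.
import numpy as np

def spectral_angle(pixel, reference):
    cos_theta = np.dot(pixel, reference) / (np.linalg.norm(pixel) * np.linalg.norm(reference))
    return np.arccos(np.clip(cos_theta, -1.0, 1.0))    # angle in radians

chromite_ref = np.array([0.12, 0.10, 0.09, 0.15, 0.22, 0.18])   # assumed reference spectrum
pixel_a = np.array([0.13, 0.11, 0.08, 0.16, 0.21, 0.19])        # spectrally similar pixel
pixel_b = np.array([0.40, 0.35, 0.30, 0.28, 0.25, 0.22])        # dissimilar pixel
print(spectral_angle(pixel_a, chromite_ref), spectral_angle(pixel_b, chromite_ref))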
Trust Model of Wireless Sensor Networks and Its Application in Data Fusion
Chen, Zhenguo; Tian, Liqin; Lin, Chuang
2017-01-01
In order to ensure the reliability and credibility of the data in wireless sensor networks (WSNs), this paper proposes a trust evaluation model and data fusion mechanism based on trust. First of all, it gives the model structure. Then, the calculation rules of trust are given. In the trust evaluation model, comprehensive trust consists of three parts: behavior trust, data trust, and historical trust. Data trust can be calculated by processing the sensor data. Based on the behavior of nodes in sensing and forwarding, the behavior trust is obtained. The initial value of historical trust is set to the maximum and updated with comprehensive trust. Comprehensive trust can be obtained by weighted calculation, and then the model is used to construct the trust list and guide the process of data fusion. Using the trust model, simulation results indicate that energy consumption can be reduced by an average of 15%. The detection rate of abnormal nodes is at least 10% higher than that of the lightweight and dependable trust system (LDTS) model. Therefore, this model has good performance in ensuring the reliability and credibility of the data. Moreover, the energy consumption of transmitting was greatly reduced. PMID:28350347
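A minimal Python sketch of the weighted-combination idea described above is given below; the weights, the per-round behavior and data trust values, and the update rule details are illustrative assumptions rather than the paper's exact formulas.

# Minimal sketch: comprehensive trust as a weighted sum of behavior, data and
# historical trust, with historical trust updated from the result.
def comprehensive_trust(behavior, data, history, w=(0.4, 0.4, 0.2)):
    return w[0] * behavior + w[1] * data + w[2] * history

history = 1.0                                    # historical trust starts at its maximum
for behavior, data in [(0.9, 0.8), (0.85, 0.4), (0.3, 0.2)]:   # per-round observed trust values
    trust = comprehensive_trust(behavior, data, history)
    history = trust                              # historical trust updated with comprehensive trust
    print(f"comprehensive trust: {trust:.2f}")
# Nodes whose comprehensive trust falls below a threshold would be excluded from data fusion.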
The optimal algorithm for Multi-source RS image fusion.
Fu, Wei; Huang, Shui-Guang; Li, Zeng-Shun; Shen, Hao; Li, Jun-Shuai; Wang, Peng-Yuan
2016-01-01
In order to solve the issue that fusion rules cannot be adaptively adjusted by available fusion methods according to the subsequent processing requirements of Remote Sensing (RS) images, this paper puts forward GSDA (genetic-iterative self-organizing data analysis algorithm), which integrates the merits of the genetic algorithm with those of the iterative self-organizing data analysis algorithm for multi-source RS image fusion. The proposed algorithm takes the translation-invariant wavelet transform as the model operator and the contrast pyramid conversion as the observation operator. The algorithm then designs the objective function as a weighted sum of evaluation indices and optimizes it with GSDA so as to obtain a higher-resolution RS image. The main points are summarized as follows.
•The contribution proposes the iterative self-organizing data analysis algorithm for multi-source RS image fusion.
•This article presents the GSDA algorithm for self-adaptive adjustment of the fusion rules.
•This text proposes the model operator and the observation operator as the fusion scheme for RS images based on GSDA.
The proposed algorithm opens up a novel algorithmic pathway for multi-source RS image fusion by means of GSDA.
2015-06-09
anomaly detection, which is generally considered part of high-level information fusion (HLIF) involving temporal-geospatial data as well as meta-data... Anomaly detection in the Maritime defence and security domain typically focusses on trying to identify vessels that are behaving in an unusual... manner compared with lawful vessels operating in the area – an applied case of target detection among distractors. Anomaly detection is a complex problem
Orhan, U.; Erdogmus, D.; Roark, B.; Oken, B.; Purwar, S.; Hild, K. E.; Fowler, A.; Fried-Oken, M.
2013-01-01
RSVP Keyboard™ is an electroencephalography (EEG) based brain-computer interface (BCI) typing system, designed as an assistive technology for the communication needs of people with locked-in syndrome (LIS). It relies on rapid serial visual presentation (RSVP) and does not require precise eye gaze control. Existing BCI typing systems which use event-related potentials (ERPs) in EEG suffer from low accuracy due to low signal-to-noise ratio. Hence, RSVP Keyboard™ utilizes context-based decision making by incorporating a language model to improve the accuracy of letter decisions. To further improve the contributions of the language model, we propose recursive Bayesian estimation, which relies on non-committing string decisions, and conduct an offline analysis which compares it with the existing naïve Bayesian fusion approach. The results indicate the superiority of the recursive Bayesian fusion, and in the next generation of RSVP Keyboard™ we plan to incorporate this new approach. PMID:23366432
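The recursive Bayesian idea can be illustrated with a minimal Python sketch: a language-model prior over symbols is repeatedly updated with synthetic EEG-evidence likelihoods from successive RSVP sequences, without committing to a letter in between. The symbol set, prior, and likelihood values are assumptions, not the system's actual classifier outputs.

# Minimal sketch: recursive Bayesian fusion of a language-model prior with
# per-sequence EEG evidence likelihoods.
import numpy as np

symbols = np.array(list("abc_"))
lm_prior = np.array([0.5, 0.2, 0.2, 0.1])          # context-based language model prior (assumed)

posterior = lm_prior.copy()
rng = np.random.default_rng(8)
for _ in range(3):                                  # three RSVP sequences
    # ERP-classifier likelihoods p(EEG evidence | flashed symbol); synthetic here.
    likelihood = np.array([2.0, 0.8, 0.6, 0.5]) * rng.uniform(0.9, 1.1, size=4)
    posterior = posterior * likelihood
    posterior /= posterior.sum()                    # recursive Bayesian update, no letter committed yet
print(dict(zip(symbols, np.round(posterior, 3))))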
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zylstra, A. B.; Frenje, J. A.; Gatu Johnson, M.
Few-body nuclear physics often relies upon phenomenological models, with new efforts at the ab initio theory reported recently; both need high-quality benchmark data, particularly at low center-of-mass energies. We use high-energy-density plasmas to measure the proton spectra from 3He + T and 3He + 3He fusion. The data disagree with R-matrix predictions constrained by neutron spectra from T + T fusion. Here, we present a new analysis of the 3He + 3He proton spectrum; these benchmarked spectral shapes should be used for interpreting low-resolution data, such as solar fusion cross-section measurements.
NASA Astrophysics Data System (ADS)
Zylstra, A. B.; Frenje, J. A.; Gatu Johnson, M.; Hale, G. M.; Brune, C. R.; Bacher, A.; Casey, D. T.; Li, C. K.; McNabb, D.; Paris, M.; Petrasso, R. D.; Sangster, T. C.; Sayre, D. B.; Séguin, F. H.
2017-12-01
Few-body nuclear physics often relies upon phenomenological models, with new efforts at the ab initio theory reported recently; both need high-quality benchmark data, particularly at low center-of-mass energies. We use high-energy-density plasmas to measure the proton spectra from 3He + T and 3He + 3He fusion. The data disagree with R-matrix predictions constrained by neutron spectra from T + T fusion. We present a new analysis of the 3He + 3He proton spectrum; these benchmarked spectral shapes should be used for interpreting low-resolution data, such as solar fusion cross-section measurements.
Zylstra, A. B.; Frenje, J. A.; Gatu Johnson, M.; ...
2017-11-29
Few-body nuclear physics often relies upon phenomenological models, with new efforts at the ab initio theory reported recently; both need high-quality benchmark data, particularly at low center-of-mass energies. We use high-energy-density plasmas to measure the proton spectra from 3He + T and 3He + 3He fusion. The data disagree with R-matrix predictions constrained by neutron spectra from T + T fusion. Here, we present a new analysis of the 3He + 3He proton spectrum; these benchmarked spectral shapes should be used for interpreting low-resolution data, such as solar fusion cross-section measurements.
NASA Astrophysics Data System (ADS)
Renschler, Chris S.; Wang, Zhihao
2017-10-01
In light of climate and land use change, stakeholders around the world are interested in assessing historic and likely future flood dynamics and flood extents for decision-making in watersheds with dams as well as limited availability of stream gages and costly technical resources. This research evaluates an assessment and communication approach that combines GIS and hydraulic modeling based on the latest remote sensing and topographic imagery by comparing the results to an actual flood event and available stream gages. On August 28, 2011, floods caused by Hurricane Irene swept through a large rural area in New York State, leaving thousands of people homeless and devastating towns and cities. Damage was widespread, though the estimated and actual flood inundation and the associated return period remained unclear because the flooding was artificially increased by flood water released out of fear of a dam break. This research uses the stream section immediately below the dam, between the North Blenheim and Breakabeen stream gages along Schoharie Creek, as a case study site to validate the approach. The data fusion approach uses a GIS, commonly available data sources, the hydraulic model HEC-RAS, and airborne LiDAR data that were collected two days after the flood event (Aug 30, 2011). The aerial imagery of the airborne survey depicts a low-flow event as well as evidence of the record flood, such as debris and other signs of damage, used to validate the hydrologic simulation results against the available stream gages. Model results were also compared to the official Federal Emergency Management Agency (FEMA) flood scenarios to determine the actual flood return period of the event. The dynamics of the flood levels were then used to visualize the flood and the actual loss of the Old Blenheim Bridge using Google SketchUp. Integration of multi-source data, cross-validation and visualization provides new ways to utilize pre- and post-event remote sensing imagery and hydrologic models to better understand and communicate the complex spatial-temporal dynamics, return periods and potential/actual consequences to decision-makers and the local population.
Analysis of membrane fusion as a two-state sequential process: evaluation of the stalk model.
Weinreb, Gabriel; Lentz, Barry R
2007-06-01
We propose a model that accounts for the time courses of PEG-induced fusion of membrane vesicles of varying lipid compositions and sizes. The model assumes that fusion proceeds from an initial, aggregated vesicle state ((A) membrane contact) through two sequential intermediate states (I(1) and I(2)) and then on to a fusion pore state (FP). Using this model, we interpreted data on the fusion of seven different vesicle systems. We found that the initial aggregated state involved no lipid or content mixing but did produce leakage. The final state (FP) was not leaky. Lipid mixing normally dominated the first intermediate state (I(1)), but content mixing signal was also observed in this state for most systems. The second intermediate state (I(2)) exhibited both lipid and content mixing signals and leakage, and was sometimes the only leaky state. In some systems, the first and second intermediates were indistinguishable and converted directly to the FP state. Having also tested a parallel, two-intermediate model subject to different assumptions about the nature of the intermediates, we conclude that a sequential, two-intermediate model is the simplest model sufficient to describe PEG-mediated fusion in all vesicle systems studied. We conclude as well that a fusion intermediate "state" should not be thought of as a fixed structure (e.g., "stalk" or "transmembrane contact") of uniform properties. Rather, a fusion "state" describes an ensemble of similar structures that can have different mechanical properties. Thus, a "state" can have varying probabilities of having a given functional property such as content mixing, lipid mixing, or leakage. Our data show that the content mixing signal may occur through two processes, one correlated and one not correlated with leakage. Finally, we consider the implications of our results in terms of the "modified stalk" hypothesis for the mechanism of lipid pore formation. We conclude that our results not only support this hypothesis but also provide a means of analyzing fusion time courses so as to test it and gauge the mechanism of action of fusion proteins in the context of the lipidic hypothesis of fusion.
Three-Dimensional Road Network by Fusion of Polarimetric and Interferometric SAR Data
NASA Technical Reports Server (NTRS)
Gamba, P.; Houshmand, B.
1998-01-01
In this paper a fuzzy classification procedure is applied to polarimetric radar measurements, and street pixels are detected. These data are successively grouped into consistent roads by means of a dynamic programming approach based on the fuzzy membership function values. Further fusion of the 2D road network extracted and 3D TOPSAR measurements provides a powerful way to analyze urban infrastructures.
Fusion method of SAR and optical images for urban object extraction
NASA Astrophysics Data System (ADS)
Jia, Yonghong; Blum, Rick S.; Li, Fangfang
2007-11-01
A new image fusion method for SAR, Panchromatic (Pan) and multispectral (MS) data is proposed. First, SAR texture is extracted by ratioing the despeckled SAR image to its low-pass approximation, and is used to modulate high-pass details extracted from the available Pan image by means of the à trous wavelet decomposition. Then, the high-pass details modulated by the texture are applied to obtain the fusion product with the HPFM (High-Pass Filter-based Modulation) fusion method. A set of image data including co-registered Landsat TM, ENVISAT SAR and SPOT Pan is used for the experiment. The results demonstrate accurate spectral preservation on vegetated regions, bare soil, and also on textured areas (buildings and road network) where SAR texture information enhances the fusion product, and the proposed approach is effective for image interpretation and classification.
NASA Astrophysics Data System (ADS)
Sun, Liang; Anderson, Martha C.; Gao, Feng; Hain, Christopher; Alfieri, Joseph G.; Sharifi, Amirreza; McCarty, Gregory W.; Yang, Yun; Yang, Yang; Kustas, William P.; McKee, Lynn
2017-07-01
The health of the Chesapeake Bay ecosystem has been declining for several decades due to high levels of nutrients and sediments largely tied to agricultural production systems. Therefore, monitoring of agricultural water use and hydrologic connections between crop lands and Bay tributaries has received increasing attention. Remote sensing retrievals of actual evapotranspiration (ET) can provide valuable information in support of these hydrologic modeling efforts, spatially and temporally describing consumptive water use by crops and natural vegetation and quantifying response to the expansion of irrigated area occurring within the Bay watershed. In this study, a multisensor satellite data fusion methodology, combined with a multiscale ET retrieval algorithm, was applied over the Choptank River watershed located within the Lower Chesapeake Bay region on the Eastern Shore of Maryland, USA, to produce daily 30 m resolution ET maps. ET estimates directly retrieved on Landsat satellite overpass dates have high accuracy, with a relative error (RE) of 9%, as evaluated using flux tower measurements. The fused daily ET time series have reasonable errors of 18% at the daily time step - an improvement from 27% errors using standard Landsat-only interpolation techniques. Annual water consumption by different land cover types was assessed, showing reasonable distributions of water use with cover class. Seasonal patterns in modeled crop transpiration and soil evaporation for dominant crop types were analyzed, and agree well with crop phenology at field scale. Additionally, effects of irrigation occurring during a period of rainfall shortage were captured by the fusion program. These results suggest that the ET fusion system will have utility for water management at field and regional scales over the Eastern Shore. Further efforts are underway to integrate these detailed water use data sets into watershed-scale hydrologic models to improve assessments of water quality and inform best management practices to reduce nutrient and sediment loads to the Chesapeake Bay.
NASA Astrophysics Data System (ADS)
Tang, Xian-Zhu; McDevitt, C. J.; Guo, Zehua; Berk, H. L.
2014-03-01
Inertial confinement fusion requires an imploded target in which a central hot spot is surrounded by a cold and dense pusher. The hot spot/pusher interface can take complicated shape in three dimensions due to hydrodynamic mix. It is also a transition region where the Knudsen and inverse Knudsen layer effect can significantly modify the fusion reactivity in comparison with the commonly used value evaluated with background Maxwellians. Here, we describe a hybrid model that couples the kinetic correction of fusion reactivity to global hydrodynamic implosion simulations. The key ingredient is a non-perturbative treatment of the tail ions in the interface region where the Gamow ion Knudsen number approaches or surpasses order unity. The accuracy of the coupling scheme is controlled by the precise criteria for matching the non-perturbative kinetic model to perturbative solutions in both configuration space and velocity space.
Open data models for smart health interconnected applications: the example of openEHR.
Demski, Hans; Garde, Sebastian; Hildebrand, Claudia
2016-10-22
Smart Health is known as a concept that enhances networking, intelligent data processing and combining patient data with other parameters. Open data models can play an important role in creating a framework for providing interoperable data services that support the development of innovative Smart Health applications profiting from data fusion and sharing. This article describes a model-driven engineering approach based on standardized clinical information models and explores its application for the development of interoperable electronic health record systems. The following possible model-driven procedures were considered: provision of data schemes for data exchange, automated generation of artefacts for application development and native platforms that directly execute the models. The applicability of the approach in practice was examined using the openEHR framework as an example. A comprehensive infrastructure for model-driven engineering of electronic health records is presented using the example of the openEHR framework. It is shown that data schema definitions to be used in common practice software development processes can be derived from domain models. The capabilities for automatic creation of implementation artefacts (e.g., data entry forms) are demonstrated. Complementary programming libraries and frameworks that foster the use of open data models are introduced. Several compatible health data platforms are listed. They provide standard based interfaces for interconnecting with further applications. Open data models help build a framework for interoperable data services that support the development of innovative Smart Health applications. Related tools for model-driven application development foster semantic interoperability and interconnected innovative applications.
Label fusion based brain MR image segmentation via a latent selective model
NASA Astrophysics Data System (ADS)
Liu, Gang; Guo, Xiantang; Zhu, Kai; Liao, Hengxu
2018-04-01
Multi-atlas segmentation is an effective and increasingly popular approach for automatically labeling objects of interest in medical images. Recently, segmentation methods based on generative models and patch-based techniques have become the two principal branches of label fusion. However, these generative models and patch-based techniques are only loosely related, and the requirement for higher accuracy, faster segmentation, and robustness is always a great challenge. In this paper, we propose a novel algorithm that combines the two branches using a global weighted fusion strategy based on a patch-based latent selective model to perform segmentation of specific anatomical structures in human brain magnetic resonance (MR) images. In establishing this probabilistic model of label fusion between the target patch and the patch dictionary, we explored the Kronecker delta function in the label prior, which is more suitable than other models, and designed a latent selective model as a membership prior to determine from which training patch the intensity and label of the target patch are generated at each spatial location. Because the image background is an equally important factor for segmentation, it is analyzed in the label fusion procedure, and we regard it as an isolated label so that the background and the regions of interest are treated on an equal footing. During label fusion with the global weighted fusion scheme, we use Bayesian inference and the expectation-maximization algorithm to estimate the labels of the target scan and produce the segmentation map. Experimental results indicate that the proposed algorithm is more accurate and robust than the other segmentation methods.
Zipper model for the melting of thin films
NASA Astrophysics Data System (ADS)
Abdullah, Mikrajuddin; Khairunnisa, Shafira; Akbar, Fathan
2016-01-01
We propose an alternative to Lindemann's melting criterion that explains the melting of thin films on the basis of a molecular zipper-like mechanism. Using this model, a unique criterion for melting is obtained. We compared the predictions of the proposed model with experimental data on melting points and heats of fusion for many materials and obtained interesting results. A noteworthy aspect of this work is that complex physics problems can sometimes be modeled with simple everyday objects that at first appear to be unrelated. This kind of approach is valuable in physics education and should be taught to undergraduate and graduate students.
Multi-atlas and label fusion approach for patient-specific MRI based skull estimation.
Torrado-Carvajal, Angel; Herraiz, Joaquin L; Hernandez-Tamames, Juan A; San Jose-Estepar, Raul; Eryaman, Yigitcan; Rozenholc, Yves; Adalsteinsson, Elfar; Wald, Lawrence L; Malpica, Norberto
2016-04-01
MRI-based skull segmentation is a useful procedure for many imaging applications. This study describes a methodology for automatic segmentation of the complete skull from a single T1-weighted volume. The skull is estimated using a multi-atlas segmentation approach. Using a whole head computed tomography (CT) scan database, the skull in a new MRI volume is detected by nonrigid image registration of the volume to every CT, and combination of the individual segmentations by label-fusion. We have compared Majority Voting, Simultaneous Truth and Performance Level Estimation (STAPLE), Shape Based Averaging (SBA), and the Selective and Iterative Method for Performance Level Estimation (SIMPLE) algorithms. The pipeline has been evaluated quantitatively using images from the Retrospective Image Registration Evaluation database (reaching an overlap of 72.46 ± 6.99%), a clinical CT-MR dataset (maximum overlap of 78.31 ± 6.97%), and a whole head CT-MRI pair (maximum overlap 78.68%). A qualitative evaluation has also been performed on MRI acquisitions of volunteers. It is possible to automatically segment the complete skull from MRI data using a multi-atlas and label fusion approach. This will allow the creation of complete MRI-based tissue models that can be used in electromagnetic dosimetry applications and attenuation correction in PET/MR. © 2015 Wiley Periodicals, Inc.
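Of the label-fusion strategies compared, Majority Voting is the simplest and illustrates the overall pipeline well. The sketch below assumes the per-atlas skull masks have already been warped into the target MRI space by nonrigid registration (not shown), and uses the Dice coefficient as a stand-in for the reported overlap measure; names and defaults are illustrative only.

import numpy as np

def majority_vote(warped_atlas_masks):
    """warped_atlas_masks: iterable of binary arrays, one per CT atlas,
    all resampled to the target grid. Returns the fused binary mask."""
    stack = np.stack([m.astype(np.uint8) for m in warped_atlas_masks], axis=0)
    return stack.sum(axis=0) > (stack.shape[0] / 2.0)

def dice_overlap(a, b):
    """Dice coefficient between a fused mask and a reference mask."""
    a, b = a.astype(bool), b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())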
Multi-scale pixel-based image fusion using multivariate empirical mode decomposition.
Rehman, Naveed ur; Ehsan, Shoaib; Abdullah, Syed Muhammad Umer; Akhtar, Muhammad Jehanzaib; Mandic, Danilo P; McDonald-Maier, Klaus D
2015-05-08
A novel scheme to perform the fusion of multiple images using the multivariate empirical mode decomposition (MEMD) algorithm is proposed. Standard multi-scale fusion techniques make a priori assumptions regarding input data, whereas standard univariate empirical mode decomposition (EMD)-based fusion techniques suffer from inherent mode mixing and mode misalignment issues, characterized respectively by either a single intrinsic mode function (IMF) containing multiple scales or the same indexed IMFs corresponding to multiple input images carrying different frequency information. We show that MEMD overcomes these problems by being fully data adaptive and by aligning common frequency scales from multiple channels, thus enabling their comparison at a pixel level and subsequent fusion at multiple data scales. We then demonstrate the potential of the proposed scheme on a large dataset of real-world multi-exposure and multi-focus images and compare the results against those obtained from standard fusion algorithms, including the principal component analysis (PCA), discrete wavelet transform (DWT) and non-subsampled contourlet transform (NCT). A variety of image fusion quality measures are employed for the objective evaluation of the proposed method. We also report the results of a hypothesis testing approach on our large image dataset to identify statistically-significant performance differences.
A modification of the fusion model for log polar coordinates
NASA Technical Reports Server (NTRS)
Griswold, N. C.; Weiman, Carl F. R.
1990-01-01
The original fusion mechanism for stereo analysis of range restricted the depth of field and therefore required a shift-variant mechanism in the peripheral area to find disparity. Misregistration was prevented by restricting the disparity detection range to a neighborhood spanned by the directional edge-detection filters. This transformation was essentially accomplished by a nonuniform resampling of the original image in the horizontal direction. While this is easily implemented for digital processing, the approach does not (in the peripheral vision area) model the log-conformal mapping that is known to occur in the human visual system. This paper therefore modifies the original fusion concept in the peripheral area to include the polar exponential grid-to-log-conformal tessellation. Examples of the fusion process resulting in accurate disparity values are given.
NASA Technical Reports Server (NTRS)
Schenker, Paul S. (Editor)
1991-01-01
The volume on data fusion from multiple sources discusses fusing multiple views, temporal analysis and 3D motion interpretation, sensor fusion and eye-to-hand coordination, and integration in human shape perception. Attention is given to surface reconstruction, statistical methods in sensor fusion, fusing sensor data with environmental knowledge, computational models for sensor fusion, and evaluation and selection of sensor fusion techniques. Topics addressed include the structure of a scene from two and three projections, optical flow techniques for moving target detection, tactical sensor-based exploration in a robotic environment, and the fusion of human and machine skills for remote robotic operations. Also discussed are K-nearest-neighbor concepts for sensor fusion, surface reconstruction with discontinuities, a sensor-knowledge-command fusion paradigm for man-machine systems, coordinating sensing and local navigation, and terrain map matching using multisensing techniques for applications to autonomous vehicle navigation.
Integrating multiple data sources in species distribution modeling: A framework for data fusion
Pacifici, Krishna; Reich, Brian J.; Miller, David A.W.; Gardner, Beth; Stauffer, Glenn E.; Singh, Susheela; McKerrow, Alexa; Collazo, Jaime A.
2017-01-01
The last decade has seen a dramatic increase in the use of species distribution models (SDMs) to characterize patterns of species’ occurrence and abundance. Efforts to parameterize SDMs often create a tension between the quality and quantity of data available to fit models. Estimation methods that integrate both standardized and non-standardized data types offer a potential solution to the tradeoff between data quality and quantity. Recently, several authors have developed approaches for jointly modeling two sources of data (one of high quality and one of lesser quality). We extend their work by allowing for explicit spatial autocorrelation in occurrence and detection error using a Multivariate Conditional Autoregressive (MVCAR) model and develop three models that share information in a less direct manner, resulting in more robust performance when the auxiliary data is of lesser quality. We describe these three new approaches (“Shared,” “Correlation,” “Covariates”) for combining data sources and show their use in a case study of the Brown-headed Nuthatch in the Southeastern U.S. and through simulations. All three of the approaches which used the second data source improved out-of-sample predictions relative to a single data source (“Single”). When the information in the second data source is of high quality, the Shared model performs best, but the Correlation and Covariates models also perform well. When the information in the second data source is of lesser quality, the Correlation and Covariates models performed better, suggesting that they are robust alternatives when little is known about auxiliary data collected opportunistically or through citizen scientists. Methods that allow for both data types to be used will maximize the useful information available for estimating species distributions.
Skyrme forces and the fusion-fission dynamics of the 132Sn+64Ni→196Pt* reaction
NASA Astrophysics Data System (ADS)
Jain, Deepika; Kumar, Raj; Sharma, Manoj K.; Gupta, Raj K.
2012-02-01
The dependence of the fusion-fission process on Skyrme forces is studied by using the dynamical cluster-decay model (DCM) and the ℓ-summed extended-Wong model in the 132Sn+64Ni→196Pt* reaction, where the nuclear proximity potential is obtained by using the semiclassical extended Thomas-Fermi (ETF) approach in the Skyrme energy density formalism (SEDF) under the frozen density approximation. The DCM gives an excellent fit to the measured fusion evaporation residue (ER) and the fission cross sections below and above barrier energies, with ER data needing “barrier lowering” at below-barrier energies for each Skyrme force (an in-built property of the DCM). The fission cross sections show a contribution of quasifission (qf) at the above-barrier two or three highest energies, depending on the Skyrme force. Calculations are illustrated for three Skyrme forces, GSkI, SSk, and SIII. Another interesting result is that there is a change of fission mass distribution from a predominantly asymmetric one to a symmetric one with a decrease in the N/Z ratio of the compound nucleus, independent of the choice of nuclear interaction potential, which gives an opportunity to address the isospin effects in the Pt* nucleus. Within the ℓ-summed extended-Wong model we find that the GSkI and SSk forces fit the total fusion cross-section data exactly, whereas the SIII force needs “barrier modification” in order to fit the data at below-barrier energies. This happens because the isospin and neutron-proton asymmetry nature of GSkI and SSk forces is different from that of the SIII force, and because the center-of-mass energy Ec.m. dependence of the barrier height for the SIII force and that of Blocki [Ann. Phys. (NY)10.1016/0003-4916(77)90249-4 105, 427 (1977)] differs strongly (by a constant amount of ˜7 MeV) from those for GSKI and SSk forces. Note that, because of the associated preformation factor with each fragment, the DCM has the advantage of treating various decay processes separately, whereas the Wong model describes only the total fusion cross section, a sum of cross sections due to all contributing processes.
US EPA 2012 Air Quality Fused Surface for the Conterminous U.S. Map Service
This web service contains a polygon layer that depicts fused air quality predictions for 2012 for census tracts in the conterminous United States. Fused air quality predictions (for ozone and PM2.5) are modeled using a Bayesian space-time downscaling fusion model approach described in a series of three published journal papers: 1) Berrocal, V., Gelfand, A. E. and Holland, D. M. (2012). Space-time fusion under error in computer model output: an application to modeling air quality. Biometrics 68, 837-848; 2) Berrocal, V., Gelfand, A. E. and Holland, D. M. (2010). A bivariate space-time downscaler under space and time misalignment. The Annals of Applied Statistics 4, 1942-1975; and 3) Berrocal, V., Gelfand, A. E. and Holland, D. M. (2010). A spatio-temporal downscaler for output from numerical models. J. of Agricultural, Biological, and Environmental Statistics 15, 176-197. This approach is used to provide daily, predictive PM2.5 (daily average) and O3 (daily 8-hr maximum) surfaces for 2012. Summer (O3) and annual (PM2.5) means are calculated and published. The downscaling fusion model uses both air quality monitoring data from the National Air Monitoring Stations/State and Local Air Monitoring Stations (NAMS/SLAMS) and numerical output from the Models-3/Community Multiscale Air Quality (CMAQ) modeling system. Currently, predictions at the US census tract centroid locations within the 12 km CMAQ domain are archived. Predictions at the CMAQ grid cell centroids, or any desired set of locations co
Joint independent component analysis for simultaneous EEG-fMRI: principle and simulation.
Moosmann, Matthias; Eichele, Tom; Nordby, Helge; Hugdahl, Kenneth; Calhoun, Vince D
2008-03-01
An optimized scheme for the fusion of electroencephalography and event related potentials with functional magnetic resonance imaging (BOLD-fMRI) data should simultaneously assess all available electrophysiologic and hemodynamic information in a common data space. In doing so, it should be possible to identify features of latent neural sources whose trial-to-trial dynamics are jointly reflected in both modalities. We present a joint independent component analysis (jICA) model for analysis of simultaneous single trial EEG-fMRI measurements from multiple subjects. We outline the general idea underlying the jICA approach and present results from simulated data under realistic noise conditions. Our results indicate that this approach is a feasible and physiologically plausible data-driven way to achieve spatiotemporal mapping of event related responses in the human brain.
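A compact way to see the joint ICA idea is to concatenate each subject's EEG-derived and fMRI-derived feature vectors into one row and run a single ICA over the stacked matrix, so that each estimated component carries a linked EEG part and fMRI part. The sketch below uses scikit-learn's FastICA as a stand-in for the Infomax-type ICA typically used in jICA; array shapes and names are assumptions.

import numpy as np
from sklearn.decomposition import FastICA

def joint_ica(eeg_features, fmri_features, n_components=5, seed=0):
    """eeg_features: (subjects, p_eeg), fmri_features: (subjects, p_fmri).
    Returns per-subject loadings and the EEG/fMRI parts of the joint sources."""
    X = np.hstack([eeg_features, fmri_features])        # joint data space
    ica = FastICA(n_components=n_components, random_state=seed)
    # sources are independent along the feature (voxel/time-point) axis
    sources = ica.fit_transform(X.T)                    # (features, components)
    loadings = ica.mixing_                               # (subjects, components)
    eeg_part = sources[:eeg_features.shape[1], :].T      # component-wise EEG maps
    fmri_part = sources[eeg_features.shape[1]:, :].T     # component-wise fMRI maps
    return loadings, eeg_part, fmri_part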
NASA Astrophysics Data System (ADS)
Anwer, Rao Muhammad; Khan, Fahad Shahbaz; van de Weijer, Joost; Molinier, Matthieu; Laaksonen, Jorma
2018-04-01
Designing discriminative, powerful texture features robust to realistic imaging conditions is a challenging computer vision problem with many applications, including material recognition and analysis of satellite or aerial imagery. In the past, most texture description approaches were based on dense, orderless statistical distributions of local features. However, most recent approaches to texture recognition and remote sensing scene classification are based on Convolutional Neural Networks (CNNs). The de facto practice when learning these CNN models is to use RGB patches as input with training performed on large amounts of labeled data (ImageNet). In this paper, we show that Local Binary Patterns (LBP) encoded CNN models, codenamed TEX-Nets, trained using mapped coded images with explicit LBP-based texture information provide complementary information to the standard RGB deep models. Additionally, two deep architectures, namely early and late fusion, are investigated to combine the texture and color information. To the best of our knowledge, we are the first to investigate Binary Patterns encoded CNNs and different deep network fusion architectures for texture recognition and remote sensing scene classification. We perform comprehensive experiments on four texture recognition datasets and four remote sensing scene classification benchmarks: UC-Merced with 21 scene categories, WHU-RS19 with 19 scene classes, RSSCN7 with 7 categories and the recently introduced large-scale aerial image dataset (AID) with 30 aerial scene types. We demonstrate that TEX-Nets provide complementary information to the standard RGB deep model of the same network architecture. Our late fusion TEX-Net architecture always improves the overall performance compared to the standard RGB network on both recognition problems. Furthermore, our final combination leads to consistent improvement over the state-of-the-art for remote sensing scene classification.
[Time consumption and quality of an automated fusion tool for SPECT and MRI images of the brain].
Fiedler, E; Platsch, G; Schwarz, A; Schmiedehausen, K; Tomandl, B; Huk, W; Rupprecht, Th; Rahn, N; Kuwert, T
2003-10-01
Although the fusion of images from different modalities may improve diagnostic accuracy, it is rarely used in clinical routine work due to logistic problems. Therefore we evaluated the performance and time needed for fusing MRI and SPECT images using dedicated semiautomated software. PATIENTS, MATERIAL AND METHOD: In 32 patients regional cerebral blood flow was measured using (99m)Tc ethylcystein dimer (ECD) and the three-headed SPECT camera MultiSPECT 3. MRI scans of the brain were performed using either a 0.2 T Open or a 1.5 T Sonata. Twelve of the MRI data sets were acquired using a 3D-T1w MPRAGE sequence, 20 with a 2D acquisition technique and different echo sequences. Image fusion was performed on a Syngo workstation using an entropy-minimizing algorithm by an experienced user of the software. The fusion results were classified. We measured the time needed for the automated fusion procedure and, where necessary, the time for manual realignment after automated but insufficient fusion. The mean time of the automated fusion procedure was 123 s. It was significantly shorter for the 2D than for the 3D MRI data sets. For four of the 2D data sets and two of the 3D data sets an optimal fit was reached using the automated approach. The remaining 26 data sets required manual correction. The sum of the time required for automated fusion and that needed for manual correction averaged 320 s (50-886 s). The fusion of 3D MRI data sets lasted significantly longer than that of the 2D MRI data. The automated fusion tool delivered an optimal fit in 20% of the cases; in the remaining 80%, manual correction was necessary. Nevertheless, each of the 32 SPECT data sets could be merged in less than 15 min with the corresponding MRI data, which seems acceptable for clinical routine use.
A Multi-Objective Decision Making Approach for Solving the Image Segmentation Fusion Problem.
Khelifi, Lazhar; Mignotte, Max
2017-08-01
Image segmentation fusion is defined as the set of methods which aim at merging several image segmentations, in a manner that takes full advantage of the complementarity of each one. Previous relevant research in this field has been impeded by the difficulty in identifying an appropriate single segmentation fusion criterion, providing the best possible, i.e., the most informative, fusion result. In this paper, we propose a new model of image segmentation fusion based on multi-objective optimization which can mitigate this problem, to obtain a final improved result of segmentation. Our fusion framework incorporates the dominance concept in order to efficiently combine and optimize two complementary segmentation criteria, namely, the global consistency error and the F-measure (precision-recall) criterion. To this end, we present a hierarchical and efficient way to optimize the multi-objective consensus energy function related to this fusion model, which exploits a simple and deterministic iterative relaxation strategy combining the different image segments. This step is followed by a decision-making task based on the "technique for order preference by similarity to ideal solution" (TOPSIS). Results obtained on two publicly available databases with manual ground truth segmentations clearly show that our multi-objective energy-based model gives better results than the classical mono-objective one.
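The decision-making step named above (TOPSIS) can be written compactly: candidate fused segmentations are scored on the two criteria, and the one closest to the ideal point and farthest from the anti-ideal point is selected. The sketch below is a generic TOPSIS implementation with placeholder scores and equal weights, not the paper's configuration.

import numpy as np

def topsis(scores, weights, benefit):
    """scores: (alternatives, criteria); benefit[j] is True if larger is better."""
    norm = scores / np.linalg.norm(scores, axis=0)           # vector-normalize columns
    v = norm * weights                                        # weighted normalized matrix
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    closeness = d_worst / (d_best + d_worst)
    return np.argsort(-closeness), closeness                  # best alternative first

# Example: two criteria per candidate fusion, F-measure (benefit) and
# global consistency error (cost); the numbers are invented.
ranking, c = topsis(np.array([[0.71, 0.12], [0.68, 0.09], [0.74, 0.15]]),
                    weights=np.array([0.5, 0.5]),
                    benefit=np.array([True, False]))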
Multisensor Fusion for Change Detection
NASA Astrophysics Data System (ADS)
Schenk, T.; Csatho, B.
2005-12-01
Combining sensors that record different properties of a 3-D scene leads to complementary and redundant information. If fused properly, a more robust and complete scene description becomes available. Moreover, fusion facilitates automatic procedures for object reconstruction and modeling. For example, aerial imaging sensors, hyperspectral scanning systems, and airborne laser scanning systems generate complementary data. We describe how data from these sensors can be fused for such diverse applications as mapping surface erosion and landslides, reconstructing urban scenes, monitoring urban land use and urban sprawl, and deriving velocities and surface changes of glaciers and ice sheets. An absolute prerequisite for successful fusion is a rigorous co-registration of the sensors involved. We establish a common 3-D reference frame by using sensor invariant features. Such features are caused by the same object space phenomena and are extracted in multiple steps from the individual sensors. After extracting, segmenting and grouping the features into more abstract entities, we discuss ways to automatically establish correspondences. This is followed by a brief description of rigorous mathematical models suitable for dealing with linear and areal features. In contrast to traditional, point-based registration methods, linear and areal features lend themselves to a more robust and more accurate registration. More importantly, the chances of automating the registration process increase significantly. The result of the co-registration of the sensors is a unique transformation between the individual sensors and the object space. This makes spatial reasoning of extracted information more versatile; reasoning can be performed in sensor space or in 3-D space where domain knowledge about features and objects constrains reasoning processes, reduces the search space, and helps to make the problem well-posed. We demonstrate the feasibility of the proposed multisensor fusion approach by detecting surface elevation changes on the Byrd Glacier, Antarctica, using aerial imagery from the 1980s and ICESat laser altimetry data from 2003-05. Change detection from such disparate data sets is an intricate fusion problem, beginning with sensor alignment, and on to reasoning with spatial information as to where changes occurred and to what extent.
Role of the supersymmetric semiclassical approach in barrier penetration and heavy-ion fusion
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sil, T.; Dutt, R.; Varshni, Y.P.
1994-11-01
The problem of heavy-ion fusion reactions in the one-dimensional barrier penetration model (BPM) has been reexamined in light of the supersymmetry-inspired WKB (SWKB) method. Motivated by our recent work [Phys. Lett. A 184, 209 (1994)] describing the SWKB method for the computation of the transmission coefficient T(E), we have performed similar calculations for a potential barrier that mimics the proximity potential obtained by fitting the experimentally measured fusion cross section σF
Fusion of spectral models for dynamic modeling of sEMG and skeletal muscle force.
Potluri, Chandrasekhar; Anugolu, Madhavi; Chiu, Steve; Urfer, Alex; Schoen, Marco P; Naidu, D Subbaram
2012-01-01
In this paper, we present a method of combining spectral models using a Kullback Information Criterion (KIC) data fusion algorithm. Surface electromyographic (sEMG) signals and their corresponding skeletal muscle force signals are acquired from three sensors and pre-processed using a Half-Gaussian filter and a Chebyshev Type-II filter, respectively. Spectral models - Spectral Analysis (SPA), Empirical Transfer Function Estimate (ETFE), and Spectral Analysis with Frequency Dependent Resolution (SPFRD) - are extracted with the sEMG signals as input and the skeletal muscle force as the output signal. These signals are then employed in a System Identification (SI) routine to establish the dynamic models relating the input and output. After the individual models are extracted, the models are fused by a probability-based KIC fusion algorithm. The results show that the SPFRD spectral models perform better than the SPA and ETFE models in modeling the frequency content of the sEMG/skeletal muscle force data.
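The fusion step can be pictured as information-criterion-based weighting of the per-sensor model outputs. The sketch below uses Akaike-style weights computed from per-model criterion values as a hedged stand-in for the KIC-derived probabilities in the paper; criterion values and signal names are placeholders.

import numpy as np

def criterion_weights(criterion_values):
    """Convert per-model information-criterion values (lower is better) into
    normalized fusion weights, in the spirit of Akaike-type weighting."""
    c = np.asarray(criterion_values, dtype=float)
    w = np.exp(-0.5 * (c - c.min()))
    return w / w.sum()

def fuse_force_estimates(force_predictions, criterion_values):
    """force_predictions: (n_models, n_samples) estimated force signals."""
    w = criterion_weights(criterion_values)
    return (w[:, None] * np.asarray(force_predictions, dtype=float)).sum(axis=0)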
V S, Unni; Mishra, Deepak; Subrahmanyam, G R K S
2016-12-01
The need for image fusion in current image processing systems is increasing mainly due to the increased number and variety of image acquisition techniques. Image fusion is the process of combining substantial information from several sensors using mathematical techniques in order to create a single composite image that will be more comprehensive and thus more useful for a human operator or other computer vision tasks. This paper presents a new approach to multifocus image fusion based on sparse signal representation. Block-based compressive sensing integrated with a projection-driven compressive sensing (CS) recovery that encourages sparsity in the wavelet domain is used as a method to obtain the focused image from a set of out-of-focus images. Compression is achieved during the image acquisition process using a block compressive sensing method. An adaptive thresholding technique within the smoothed projected Landweber recovery process reconstructs high-resolution focused images from low-dimensional CS measurements of out-of-focus images. Discrete wavelet transform and dual-tree complex wavelet transform are used as the sparsifying basis for the proposed fusion. The main finding lies in the fact that sparsification enables a better selection of the fusion coefficients and hence better fusion. A Laplacian mixture model fit is done in the wavelet domain, and estimation of the probability density function (pdf) parameters by expectation maximization leads us to the proper selection of the coefficients of the fused image. Compared with a fusion scheme that does not employ the projected Landweber (PL) recovery and with other existing CS-based fusion approaches, the proposed method is observed to outperform them even when fewer samples are used.
Data Fusion in Large Arrays of Microsensors (SensorWeb): A Comprehensive Approach to...
2000-08-01
Project Icarus: Nuclear Fusion Propulsion Concept Comparison
NASA Astrophysics Data System (ADS)
Stanic, M.
Project Icarus will use nuclear fusion as its primary propulsion, since achieving breakeven is expected within the next decade. Therefore, fusion technology provides confidence in further development and fairly high technological maturity by the time the Icarus mission would be plausible. Currently there are numerous (over two dozen) different fusion approaches being developed simultaneously around the world, and it is difficult to predict which of the concepts will be the most successful. This study tried to estimate the current technological maturity and possible technological extrapolation of fusion approaches for which appropriate data could be found. Figures of merit that were assessed include: current technological state, mass and volume estimates, possible gain values, main advantages and disadvantages of each concept, and an attempt to extrapolate the current technological state for the next decade or two. Analysis suggests that Magnetic Confinement Fusion (MCF) concepts are not likely to deliver sufficient performance due to size, mass, gain, and large technological barriers of the concept. However, ICF and PJMIF did show potential for delivering the necessary performance, assuming appropriate technological advances. This paper is a submission of the Project Icarus Study Group.
National Fusion Collaboratory: Grid Computing for Simulations and Experiments
NASA Astrophysics Data System (ADS)
Greenwald, Martin
2004-05-01
The National Fusion Collaboratory Project is creating a computational grid designed to advance scientific understanding and innovation in magnetic fusion research by facilitating collaborations, enabling more effective integration of experiments, theory and modeling and allowing more efficient use of experimental facilities. The philosophy of FusionGrid is that data, codes, analysis routines, visualization tools, and communication tools should be thought of as network available services, easily used by the fusion scientist. In such an environment, access to services is stressed rather than portability. By building on a foundation of established computer science toolkits, deployment time can be minimized. These services all share the same basic infrastructure that allows for secure authentication and resource authorization which allows stakeholders to control their own resources such as computers, data and experiments. Code developers can control intellectual property, and fair use of shared resources can be demonstrated and controlled. A key goal is to shield scientific users from the implementation details such that transparency and ease-of-use are maximized. The first FusionGrid service deployed was the TRANSP code, a widely used tool for transport analysis. Tools for run preparation, submission, monitoring and management have been developed and shared among a wide user base. This approach saves user sites from the laborious effort of maintaining such a large and complex code while at the same time reducing the burden on the development team by avoiding the need to support a large number of heterogeneous installations. Shared visualization and A/V tools are being developed and deployed to enhance long-distance collaborations. These include desktop versions of the Access Grid, a highly capable multi-point remote conferencing tool and capabilities for sharing displays and analysis tools over local and wide-area networks.
Driver fatigue detection through multiple entropy fusion analysis in an EEG-based system
Min, Jianliang; Wang, Ping
2017-01-01
Driver fatigue is an important contributor to road accidents, and fatigue detection has major implications for transportation safety. The aim of this research is to analyze the multiple entropy fusion method and evaluate several channel regions to effectively detect a driver's fatigue state based on electroencephalogram (EEG) records. First, we fused multiple entropies, i.e., spectral entropy, approximate entropy, sample entropy and fuzzy entropy, as features compared with autoregressive (AR) modeling by four classifiers. Second, we captured four significant channel regions according to weight-based electrodes via a simplified channel selection method. Finally, the evaluation model for detecting driver fatigue was established with four classifiers based on the EEG data from four channel regions. Twelve healthy subjects performed continuous simulated driving for 1–2 hours with EEG monitoring on a static simulator. The leave-one-out cross-validation approach obtained an accuracy of 98.3%, a sensitivity of 98.3% and a specificity of 98.2%. The experimental results verified the effectiveness of the proposed method, indicating that the multiple entropy fusion features are significant factors for inferring the fatigue state of a driver. PMID:29220351
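As a flavour of the feature side of this pipeline, the sketch below computes one of the fused entropy measures (spectral entropy) per EEG channel and concatenates the per-channel values into a single feature vector for a classifier; the other entropies (approximate, sample, fuzzy) would be appended in the same way. Parameters and epoch handling are illustrative, not the paper's settings.

import numpy as np

def spectral_entropy(x):
    """Normalized Shannon entropy of the power spectrum of one EEG epoch."""
    psd = np.abs(np.fft.rfft(np.asarray(x, dtype=float))) ** 2
    p = psd / psd.sum()
    return float(-(p * np.log2(p + 1e-12)).sum() / np.log2(len(p)))

def entropy_feature_vector(epoch):
    """epoch: (n_channels, n_samples) EEG segment from the selected channel
    region. Returns one spectral-entropy value per channel."""
    return np.array([spectral_entropy(ch) for ch in epoch])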
2008-03-01
amount of arriving data, extract actionable information, and integrate it with prior knowledge. Add to that the pressures of today’s fusion center climate and it becomes clear that analysts, police... fusion centers, including specifics about how these problems manifest at the Illinois State Police (ISP) Statewide Terrorism and Intelligence Center
Nonlinear information fusion algorithms for data-efficient multi-fidelity modelling.
Perdikaris, P; Raissi, M; Damianou, A; Lawrence, N D; Karniadakis, G E
2017-02-01
Multi-fidelity modelling enables accurate inference of quantities of interest by synergistically combining realizations of low-cost/low-fidelity models with a small set of high-fidelity observations. This is particularly effective when the low- and high-fidelity models exhibit strong correlations, and can lead to significant computational gains over approaches that solely rely on high-fidelity models. However, in many cases of practical interest, low-fidelity models can only be well correlated to their high-fidelity counterparts for a specific range of input parameters, and potentially return wrong trends and erroneous predictions if probed outside of their validity regime. Here we put forth a probabilistic framework based on Gaussian process regression and nonlinear autoregressive schemes that is capable of learning complex nonlinear and space-dependent cross-correlations between models of variable fidelity, and can effectively safeguard against low-fidelity models that provide wrong trends. This introduces a new class of multi-fidelity information fusion algorithms that provide a fundamental extension to the existing linear autoregressive methodologies, while still maintaining the same algorithmic complexity and overall computational cost. The performance of the proposed methods is tested in several benchmark problems involving both synthetic and real multi-fidelity datasets from computational fluid dynamics simulations.
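A minimal two-fidelity sketch in the spirit of the nonlinear autoregressive scheme is shown below: one GP is fit to plentiful low-fidelity data, and a second GP learns the map from (x, f_low(x)) to the scarce high-fidelity observations. scikit-learn GPs and the toy test functions are stand-ins for the authors' formulation, not a reproduction of it.

import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def fit_two_fidelity(x_lo, y_lo, x_hi, y_hi):
    gp_lo = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                                     normalize_y=True).fit(x_lo, y_lo)
    # augment high-fidelity inputs with the low-fidelity prediction at x_hi
    aug_hi = np.hstack([x_hi, gp_lo.predict(x_hi).reshape(-1, 1)])
    gp_hi = GaussianProcessRegressor(kernel=ConstantKernel() * RBF(),
                                     normalize_y=True).fit(aug_hi, y_hi)

    def predict(x_new):
        aug = np.hstack([x_new, gp_lo.predict(x_new).reshape(-1, 1)])
        return gp_hi.predict(aug, return_std=True)
    return predict

# toy usage: f_lo is a cheap, biased surrogate of f_hi
x_lo = np.linspace(0, 1, 40).reshape(-1, 1)
x_hi = np.linspace(0, 1, 6).reshape(-1, 1)
f_hi = lambda x: (6 * x - 2) ** 2 * np.sin(12 * x - 4)
f_lo = lambda x: 0.5 * f_hi(x) + 10 * (x - 0.5)
predict = fit_two_fidelity(x_lo, f_lo(x_lo).ravel(), x_hi, f_hi(x_hi).ravel())
mean, std = predict(np.array([[0.3], [0.7]]))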
Zhang, Wenyu; Zhang, Zhenjiang
2015-01-01
Decision fusion in sensor networks enables sensors to improve classification accuracy while reducing the energy consumption and bandwidth demand for data transmission. In this paper, we focus on the decentralized multi-class classification fusion problem in wireless sensor networks (WSNs) and propose a new, simple but effective decision fusion rule based on belief function theory. Unlike existing belief-function-based decision fusion schemes, the proposed approach is compatible with any type of classifier because the basic belief assignments (BBAs) of each sensor are constructed on the basis of the classifier’s training output confusion matrix and real-time observations. We also derive an explicit global BBA in the fusion center under Dempster’s combination rule, which greatly simplifies the decision-making operation in the fusion center. In addition, sending the whole BBA structure to the fusion center is avoided. Experimental results demonstrate that the proposed fusion rule achieves better fusion accuracy than the naïve Bayes rule and the weighted majority voting rule. PMID:26295399
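The combination step at the fusion center follows Dempster's rule, sketched below for BBAs represented as dictionaries from focal sets to masses. How each sensor derives its BBA from its training confusion matrix and the current observation is omitted, and the example masses are invented.

def dempster_combine(m1, m2):
    """Combine two BBAs (dict: frozenset -> mass) with Dempster's rule."""
    combined, conflict = {}, 0.0
    for a, ma in m1.items():
        for b, mb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + ma * mb
            else:
                conflict += ma * mb
    if conflict >= 1.0:
        raise ValueError("totally conflicting evidence")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Example: two sensors reporting over classes {A, B, C}
A, B, C = frozenset("A"), frozenset("B"), frozenset("C")
theta = A | B | C                                  # total ignorance
s1 = {A: 0.6, B: 0.1, theta: 0.3}
s2 = {A: 0.5, C: 0.2, theta: 0.3}
fused = dempster_combine(s1, s2)
decision = max((k for k in fused if len(k) == 1), key=fused.get)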
Goode, Adam P; Richardson, William J; Schectman, Robin M; Carey, Timothy S
2014-09-01
Nationwide estimates examining bone morphogenetic protein (BMP) use with cervical spine fusions have been limited to perioperative outcomes. To determine the 1-year risk of complications, cervical revision fusions, hospital readmissions, and health care services utilization. A retrospective cohort study from 2002 to 2009 using a nationwide claims database. There were 61,937 primary cervical spine fusions of which 1,677 received BMP. Complications, revision fusions, 30-day hospital readmission, and health care utilization. Data for these analyses come from the Thomson Reuters MarketScan Commercial Claims and Encounters Database 2010. Patients were aged 18 to 64 years, receiving and not receiving BMP with a primary (C2-C7) cervical spine fusion. All outcomes were defined by International Classification of Diseases, 9th edition Clinical Modification and Current Procedural and Terminology, 4th edition codes. Complications were analyzed as any complication and stratified by nervous system, wound, and dysphagia or hoarseness. Cervical revision fusions were determined in the 1-year follow-up. Hospital readmission discharge records defined 30-day hospital readmission and reason for the readmission. The utilization of at least one health care service of cervical spine imaging, epidural usage or rehabilitation service was examined. Poisson regression models were used to estimate the relative risk and 95% confidence interval (CI). Linear regression was used to determine the time to hospital readmission. Results were stratified by anterior or posterior and circumferential approaches. Patients receiving BMP were 29% more likely to have a complication (adjusted relative risk [aRR]=1.29 [95% CI, 1.14-1.46]) and a nervous system complication (aRR=1.42 [95% CI, 1.10-1.83]). Cervical revision fusions were more likely among patients receiving BMP (aRR=1.69 [95% CI, 1.35-2.13]). The risk of 30-day readmission was greater with BMP use (aRR=1.37 [95% CI, 1.07-1.73]) and readmission occurred 27.4% sooner on an average. Patients receiving BMP were more likely to receive computed tomography scans (aRR=1.34 [95% CI, 1.06-1.70]) and epidurals with anterior surgical approaches (aRR=1.29 [95% CI, 1.00-1.65]). These findings question both the safety and effectiveness of off-label BMP use in primary cervical spine fusions. Copyright © 2014 Elsevier Inc. All rights reserved.
Data fusion of multi-scale representations for structural damage detection
NASA Astrophysics Data System (ADS)
Guo, Tian; Xu, Zili
2018-01-01
Despite extensive research into structural health monitoring (SHM) over the past decades, few methods can detect multiple instances of slight damage in noisy environments. Here, we introduce a new hybrid method that combines multi-scale space theory with a data fusion approach for multiple-damage detection in beams and plates. A cascade filtering approach provides a multi-scale space for noisy mode shapes and filters the fluctuations caused by measurement noise. In multi-scale space, a series of amplification and data fusion algorithms is utilized to search for the damage features across all possible scales. We verify the effectiveness of the method by numerical simulation using damaged beams and plates with various types of boundary conditions. Monte Carlo simulations are conducted to illustrate the effectiveness and noise immunity of the proposed method. The applicability is further validated via laboratory case studies focusing on different damage scenarios. Both sets of results demonstrate that the proposed method has superior noise tolerance, as well as damage sensitivity, without requiring knowledge of material properties or boundary conditions.
Hybrid Arrays for Chemical Sensing
NASA Astrophysics Data System (ADS)
Kramer, Kirsten E.; Rose-Pehrsson, Susan L.; Johnson, Kevin J.; Minor, Christian P.
In recent years, multisensory approaches to environment monitoring for chemical detection as well as other forms of situational awareness have become increasingly popular. A hybrid sensor is a multimodal system that incorporates several sensing elements and thus produces data that are multivariate in nature and may be significantly increased in complexity compared to data provided by single-sensor systems. Though a hybrid sensor is itself an array, hybrid sensors are often organized into more complex sensing systems through an assortment of network topologies. Part of the reason for the shift to hybrid sensors is due to advancements in sensor technology and computational power available for processing larger amounts of data. There is also ample evidence to support the claim that a multivariate analytical approach is generally superior to univariate measurements because it provides additional redundant and complementary information (Hall, D. L.; Linas, J., Eds., Handbook of Multisensor Data Fusion, CRC, Boca Raton, FL, 2001). However, the benefits of a multisensory approach are not automatically achieved. Interpretation of data from hybrid arrays of sensors requires the analyst to develop an application-specific methodology to optimally fuse the disparate sources of data generated by the hybrid array into useful information characterizing the sample or environment being observed. Consequently, multivariate data analysis techniques such as those employed in the field of chemometrics have become more important in analyzing sensor array data. Depending on the nature of the acquired data, a number of chemometric algorithms may prove useful in the analysis and interpretation of data from hybrid sensor arrays. It is important to note, however, that the challenges posed by the analysis of hybrid sensor array data are not unique to the field of chemical sensing. Applications in electrical and process engineering, remote sensing, medicine, and of course, artificial intelligence and robotics, all share the same essential data fusion challenges. The design of a hybrid sensor array should draw on this extended body of knowledge. In this chapter, various techniques for data preprocessing, feature extraction, feature selection, and modeling of sensor data will be introduced and illustrated with data fusion approaches that have been implemented in applications involving data from hybrid arrays. The example systems discussed in this chapter involve the development of prototype sensor networks for damage control event detection aboard US Navy vessels and the development of analysis algorithms to combine multiple sensing techniques for enhanced remote detection of unexploded ordnance (UXO) in both ground surveys and wide area assessments.
NASA Astrophysics Data System (ADS)
Krysta, Monika; Kushida, Noriyuki; Kotselko, Yuriy; Carter, Jerry
2016-04-01
The possibilities of associating information from the four pillars constituting the CTBT monitoring and verification regime, namely the seismic, infrasound, hydroacoustic and radionuclide networks, have long been explored by the International Data Centre (IDC) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). Based on the concept of overlaying waveform events with the geographical regions constituting possible sources of the detected radionuclides, interactive and non-interactive tools were built in the past. Based on the same concept, a design for a prototype Fused Event Bulletin was proposed recently. One of the key design elements of the proposed approach is the ability to access fusion results from either the radionuclide or the waveform technology products, which are available on different time scales and through various automatic and interactive products. To accommodate these various time scales, a dynamic product is envisioned that evolves as the results of the different technologies are processed and compiled. The product would be available through the Secure Web Portal (SWP). In this presentation, we describe the implementation of the data fusion functionality in the test framework of the SWP. In addition, we address possible refinements to the already implemented concepts.
NASA Astrophysics Data System (ADS)
Sliva, Amy L.; Gorman, Joe; Voshell, Martin; Tittle, James; Bowman, Christopher
2016-05-01
The Dual Node Decision Wheels (DNDW) architecture concept was previously described as a novel approach toward integrating analytic and decision-making processes in joint human/automation systems in highly complex sociotechnical settings. In this paper, we extend the DNDW construct with a description of components in this framework, combining structures of the Dual Node Network (DNN) for Information Fusion and Resource Management with extensions on Rasmussen's Decision Ladder (DL) to provide guidance on constructing information systems that better serve decision-making support requirements. The DNN takes a component-centered approach to system design, decomposing each asset in terms of data inputs and outputs according to their roles and interactions in a fusion network. However, to ensure relevance to and organizational fit within command and control (C2) processes, principles from cognitive systems engineering emphasize that system design must take a human-centered systems view, integrating information needs and decision-making requirements to drive the architecture design and capabilities of network assets. In the current work, we present an approach for structuring and assessing DNDW systems that uses a unique hybrid of DNN top-down system design with a human-centered process design, combining DNN node decomposition with artifacts from cognitive analysis (i.e., system abstraction decomposition models, decision ladders) to provide work domain and task-level insights at different levels in an example intelligence, surveillance, and reconnaissance (ISR) system setting. This DNDW structure will ensure not only that the information fusion technologies and processes are structured effectively, but also that the resulting information products will align with the requirements of human decision makers and be adaptable to different work settings.
Takahashi, Shinji; Buser, Zorica; Cohen, Jeremiah R; Roe, Allison; Myhre, Sue L; Meisel, Hans-Joerg; Brodke, Darrel S; Yoon, S Tim; Park, Jong-Beom; Wang, Jeffrey C; Youssef, Jim A
2017-11-01
A retrospective cohort study. To compare the complications between posterior cervical fusions with and without recombinant human bone morphogenetic protein 2 (rhBMP2). Use of rhBMP2 in anterior cervical spinal fusion procedures can lead to potential complications such as neck edema, resulting in airway complications or neurological compression. However, there are no data on the complications associated with the "off-label" use of rhBMP2 in upper and lower posterior cervical fusion approaches. Patients from the PearlDiver database who had a posterior cervical fusion between 2005 and 2011 were identified. We evaluated complications within 90 days after fusion and data was divided in 2 groups: (1) posterior cervical fusion including upper cervical spine O-C2 (upper group) and (2) posterior cervical fusion including lower cervical spine C3-C7 (lower group). Complications were divided into: any complication, neck-related complications, wound-related complications, and other complications. Of the 352 patients in the upper group, 73 patients (20.7%) received rhBMP2, and 279 patients (79.3%) did not. Likewise, in the lower group of 2372 patients, 378 patients (15.9%) had surgery with rhBMP2 and 1994 patients (84.1%) without. In the upper group, complications were observed in 7 patients (9.6%) with and 34 patients (12%) without rhBMP2. In the lower group, complications were observed in 42 patients (11%) with and 276 patients (14%) without rhBMP2. Furthermore, in the lower group the wound-related complications were significantly higher in the rhBMP2 group (23 patients, 6.1%) compared with the non-rhBMP2 group (75 patients, 3.8%). Our data showed that the use of rhBMP2 does not increase the risk of complications in upper cervical spine fusion procedures. However, in the lower cervical spine, rhBMP2 may elevate the risk of wound-related complications. Overall, there were no major complications associated with the use of rhBMP2 for posterior cervical fusion approaches. Level III.
Lesion classification using clinical and visual data fusion by multiple kernel learning
NASA Astrophysics Data System (ADS)
Kisilev, Pavel; Hashoul, Sharbell; Walach, Eugene; Tzadok, Asaf
2014-03-01
To overcome operator dependency and to increase diagnosis accuracy in breast ultrasound (US), a lot of effort has been devoted to developing computer-aided diagnosis (CAD) systems for breast cancer detection and classification. Unfortunately, the efficacy of such CAD systems is limited since they rely on correct automatic lesion detection and localization, and on the robustness of features computed from the detected areas. In this paper we propose a new approach to boost the performance of a machine-learning-based CAD system by combining visual and clinical data from patient files. We compute a set of visual features from breast ultrasound images, and construct the textual descriptor of patients by extracting relevant keywords from patients' clinical data files. We then use the Multiple Kernel Learning (MKL) framework to train an SVM-based classifier to discriminate between benign and malignant cases. We investigate different types of data fusion methods, namely, early, late, and intermediate (MKL-based) fusion. Our database consists of 408 patient cases, each containing US images, textual descriptions of complaints and symptoms filled in by physicians, and confirmed diagnoses. We show experimentally that the proposed MKL-based approach is superior to other classification methods. Even though the clinical data is very sparse and noisy, its MKL-based fusion with visual features yields a significant improvement of the classification accuracy, as compared to the classifier based on image features alone.
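A fixed-weight variant of the intermediate fusion can be sketched with precomputed kernels: one kernel on the visual features, one on the textual descriptor, combined convexly before training an SVM. True MKL learns the combination weight jointly with the classifier; here it is hand-set, and the feature extraction itself is not shown, so this is only an illustrative simplification.

import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

def combined_kernel(vis_a, txt_a, vis_b=None, txt_b=None, beta=0.7):
    """Convex combination of an RGB/visual-feature kernel and a textual kernel.
    vis_*: (n, p_img) image features; txt_*: (n, p_txt) keyword features."""
    vis_b = vis_a if vis_b is None else vis_b
    txt_b = txt_a if txt_b is None else txt_b
    return beta * rbf_kernel(vis_a, vis_b) + (1.0 - beta) * linear_kernel(txt_a, txt_b)

def train(vis_tr, txt_tr, labels, beta=0.7):
    K = combined_kernel(vis_tr, txt_tr, beta=beta)
    return SVC(kernel="precomputed", C=1.0).fit(K, labels)

def predict(clf, vis_tr, txt_tr, vis_te, txt_te, beta=0.7):
    # prediction needs the cross-kernel between test and training samples
    K_te = combined_kernel(vis_te, txt_te, vis_tr, txt_tr, beta=beta)
    return clf.predict(K_te)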
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chandrasekhar Potluri,; Madhavi Anugolu; Marco P. Schoen
2013-08-01
In this work, an array of three surface electromyography (sEMG) sensors is used to acquire muscle extension and contraction signals from 18 healthy test subjects. The skeletal muscle force is estimated using the acquired sEMG signals and a non-linear Wiener-Hammerstein model relating the two signals in a dynamic fashion. The model is obtained using a System Identification (SI) algorithm. The force models obtained for each sensor are fused using a proposed fuzzy logic concept with the intent of improving the force estimation accuracy and the resilience to sensor failure or misalignment. For the fuzzy logic inference system, the sEMG entropy, the relative error, and the correlation of the force signals are considered for defining the membership functions. The proposed fusion algorithm yields an average of 92.49% correlation between the actual force and the overall estimated force output. In addition, the proposed fusion-based approach is implemented on a test platform. Experiments indicate an improvement in finger/hand force estimation.
DUCTILE-PHASE TOUGHENED TUNGSTEN FOR PLASMA-FACING MATERIALS IN FUSION REACTORS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henager, Charles H.; Setyawan, Wahyu; Roosendaal, Timothy J.
2017-05-01
Tungsten (W) and W-alloys are the leading candidates for plasma-facing components in nuclear fusion reactor designs because of their high melting point, strength retention at high temperatures, high thermal conductivity, and low sputtering yield. However, tungsten is brittle and does not exhibit the required fracture toughness for licensing in nuclear applications. A promising approach to increasing the fracture toughness of W-alloys is ductile-phase toughening (DPT). In this method, a ductile phase is included in a brittle matrix to prevent or inhibit crack propagation by crack blunting, crack bridging, crack deflection, and crack branching. Model examples of DPT tungsten are explored in this study, including W-Cu and W-Ni-Fe powder product composites. Three-point and four-point notched and/or pre-cracked bend samples were tested at several strain rates and temperatures to help understand deformation, cracking, and toughening in these materials. Data from these tests are used for developing and calibrating crack-bridging models. Finite element damage mechanics models are introduced as a modeling method that appears to capture the complexity of crack growth in these materials.
Wang, Qian; Song, Enmin; Jin, Renchao; Han, Ping; Wang, Xiaotong; Zhou, Yanying; Zeng, Jianchao
2009-06-01
The aim of this study was to develop a novel algorithm for segmenting lung nodules on three-dimensional (3D) computed tomographic images to improve the performance of computer-aided diagnosis (CAD) systems. The database used in this study consists of two data sets obtained from the Lung Imaging Database Consortium. The first data set, containing 23 nodules (22% irregular nodules, 13% nonsolid nodules, 17% nodules attached to other structures), was used for training. The second data set, containing 64 nodules (37% irregular nodules, 40% nonsolid nodules, 62% nodules attached to other structures), was used for testing. Two key techniques were developed in the segmentation algorithm: (1) a 3D extended dynamic programming model, with a newly defined internal cost function based on the information between adjacent slices, allowing parameters to be adapted to each slice, and (2) a multidirection fusion technique, which makes use of the complementary relationships among different directions to improve the final segmentation accuracy. The performance of this approach was evaluated by the overlap criterion, complemented by the true-positive fraction and the false-positive fraction criteria. The mean values of the overlap, true-positive fraction, and false-positive fraction for the first data set achieved using the segmentation scheme were 66%, 75%, and 15%, respectively, and the corresponding values for the second data set were 58%, 71%, and 22%, respectively. The experimental results indicate that this segmentation scheme can achieve better performance for nodule segmentation than two existing algorithms reported in the literature. The proposed 3D extended dynamic programming model is an effective way to segment sequential images of lung nodules. The proposed multidirection fusion technique is capable of reducing segmentation errors especially for no-nodule and near-end slices, thus resulting in better overall performance.
Beyond RGB: Very high resolution urban remote sensing with multimodal deep networks
NASA Astrophysics Data System (ADS)
Audebert, Nicolas; Le Saux, Bertrand; Lefèvre, Sébastien
2018-06-01
In this work, we investigate various methods to deal with semantic labeling of very high resolution multi-modal remote sensing data. In particular, we study how deep fully convolutional networks can be adapted to deal with multi-modal and multi-scale remote sensing data for semantic labeling. Our contributions are threefold: (a) we present an efficient multi-scale approach to leverage both a large spatial context and the high resolution data, (b) we investigate early and late fusion of Lidar and multispectral data, (c) we validate our methods on two public datasets with state-of-the-art results. Our results indicate that late fusion makes it possible to recover from errors stemming from ambiguous data, while early fusion allows for better joint-feature learning, but at the cost of higher sensitivity to missing data.
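As a concrete illustration of the early/late fusion distinction discussed above, here is a minimal sketch, not the paper's implementation: the probability maps are random placeholders for real network outputs, and the simple average used for late fusion is an assumed combination rule.

```python
# Sketch: late fusion averages per-pixel class probabilities from two
# modality-specific branches; early fusion stacks modalities into one input.
import numpy as np

rng = np.random.default_rng(1)
H, W, C = 64, 64, 6                                  # image size, number of classes

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

probs_rgb = softmax(rng.normal(size=(H, W, C)))      # stand-in for optical branch
probs_lidar = softmax(rng.normal(size=(H, W, C)))    # stand-in for Lidar branch

# Late fusion: combine the two predictions (here a simple average).
late = 0.5 * probs_rgb + 0.5 * probs_lidar
late_labels = late.argmax(axis=-1)

# Early fusion: stack modalities channel-wise and feed a single network.
rgb = rng.random((H, W, 3))
ndsm = rng.random((H, W, 1))                         # Lidar-derived height channel
early_input = np.concatenate([rgb, ndsm], axis=-1)   # shape (H, W, 4)

print(late_labels.shape, early_input.shape)
```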
NASA Astrophysics Data System (ADS)
Prasad, S.; Bruce, L. M.
2007-04-01
There is a growing interest in using multiple sources for automatic target recognition (ATR) applications. One approach is to take multiple, independent observations of a phenomenon and perform a feature level or a decision level fusion for ATR. This paper proposes a method to utilize these types of multi-source fusion techniques to exploit hyperspectral data when only a small number of training pixels are available. Conventional hyperspectral image based ATR techniques project the high dimensional reflectance signature onto a lower dimensional subspace using techniques such as Principal Components Analysis (PCA), Fisher's linear discriminant analysis (LDA), subspace LDA and stepwise LDA. While some of these techniques attempt to solve the curse of dimensionality, or small sample size problem, these are not necessarily optimal projections. In this paper, we present a divide and conquer approach to address the small sample size problem. The hyperspectral space is partitioned into contiguous subspaces such that the discriminative information within each subspace is maximized, and the statistical dependence between subspaces is minimized. We then treat each subspace as a separate source in a multi-source multi-classifier setup and test various decision fusion schemes to determine their efficacy. Unlike previous approaches which use correlation between variables for band grouping, we study the efficacy of higher order statistical information (using average mutual information) for a bottom up band grouping. We also propose a confidence measure based decision fusion technique, where the weights associated with various classifiers are based on their confidence in recognizing the training data. To this end, training accuracies of all classifiers are used for weight assignment in the fusion process of test pixels. The proposed methods are tested using hyperspectral data with known ground truth, such that the efficacy can be quantitatively measured in terms of target recognition accuracies.
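A minimal sketch of the confidence-weighted decision fusion idea above, under assumptions: the hyperspectral data are synthetic, the bands are split into equal contiguous groups rather than by the mutual-information grouping the paper proposes, and one LDA classifier per group is weighted by its training accuracy.

```python
# Sketch: per-band-group LDA classifiers fused by training-accuracy weights.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(2)
n_train, n_test, n_bands, n_classes = 60, 200, 120, 3
means = rng.normal(size=(n_classes, n_bands))

def sample(n):
    y = rng.integers(0, n_classes, n)
    X = means[y] + rng.normal(scale=1.5, size=(n, n_bands))
    return X, y

X_tr, y_tr = sample(n_train)
X_te, y_te = sample(n_test)

groups = np.array_split(np.arange(n_bands), 6)       # contiguous band groups
clfs, weights = [], []
for g in groups:
    clf = LinearDiscriminantAnalysis().fit(X_tr[:, g], y_tr)
    clfs.append(clf)
    weights.append(clf.score(X_tr[:, g], y_tr))      # confidence = training accuracy

weights = np.array(weights) / np.sum(weights)
fused = sum(w * clf.predict_proba(X_te[:, g])
            for w, clf, g in zip(weights, clfs, groups))
print("fused test accuracy:", (fused.argmax(axis=1) == y_te).mean())
```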
A Review of Multivariate Methods for Multimodal Fusion of Brain Imaging Data
Adali, Tülay; Yu, Qingbao; Calhoun, Vince D.
2011-01-01
The development of various neuroimaging techniques is rapidly improving the measurements of brain function/structure. However, despite improvements in individual modalities, it is becoming increasingly clear that the most effective research approaches will utilize multi-modal fusion, which takes advantage of the fact that each modality provides a limited view of the brain. The goal of multimodal fusion is to capitalize on the strength of each modality in a joint analysis, rather than a separate analysis of each. This is a more complicated endeavor that must be approached more carefully and efficient methods should be developed to draw generalized and valid conclusions from high dimensional data with a limited number of subjects. Numerous research efforts have been reported in the field based on various statistical approaches, e.g. independent component analysis (ICA), canonical correlation analysis (CCA) and partial least squares (PLS). In this review paper, we survey a number of multivariate methods appearing in previous reports, which are performed with or without prior information and may have utility for identifying potential brain illness biomarkers. We also discuss the possible strengths and limitations of each method, and review their applications to brain imaging data. PMID:22108139
Probing the mechanism of fusion in a two-dimensional computer simulation.
Chanturiya, Alexandr; Scaria, Puthurapamil; Kuksenok, Oleksandr; Woodle, Martin C
2002-01-01
A two-dimensional (2D) model of lipid bilayers was developed and used to investigate a possible role of membrane lateral tension in membrane fusion. We found that an increase of lateral tension in contacting monolayers of 2D analogs of liposomes and planar membranes could cause not only hemifusion, but also complete fusion when internal pressure is introduced in the model. With a certain set of model parameters it was possible to induce hemifusion-like structural changes by a tension increase in only one of the two contacting bilayers. The effect of lysolipids was modeled as an insertion of a small number of extra molecules into the cis or trans side of the interacting bilayers at different stages of simulation. It was found that cis insertion arrests fusion and trans insertion has no inhibitory effect on fusion. The possibility of protein participation in tension-driven fusion was tested in simulation, with one of two model liposomes containing a number of structures capable of reducing the area occupied by them in the outer monolayer. It was found that condensation of these structures was sufficient to produce membrane reorganization similar to that observed in simulations with "protein-free" bilayers. These data support the hypothesis that changes in membrane lateral tension may be responsible for fusion in both model phospholipid membranes and in biological protein-mediated fusion. PMID:12023230
Ontology-aided Data Fusion (Invited)
NASA Astrophysics Data System (ADS)
Raskin, R.
2009-12-01
An ontology provides semantic descriptions that are analogous to those in a dictionary, but are readable by both computers and humans. A dataset or service is semantically annotated when it is formally associated with elements of an ontology. The ESIP Federation Semantic Web Cluster has developed a set of ontologies to describe datatypes and data services that can be used to support automated data fusion. The service ontology includes descriptors of the service function, its inputs/outputs, and its invocation method. The datatype descriptors resemble typical metadata fields (data format, data model, data structure, originator, etc.) augmented with descriptions of the meaning of the data. These ontologies, in combination with the SWEET science ontology, enable registered data fusion services to be chained together and implemented in a scientifically meaningful way, based on machine understanding of the associated data and services. This presentation describes initial results and experiences in automated data fusion.
Overview of FAR-TECH's magnetic fusion energy research
NASA Astrophysics Data System (ADS)
Kim, Jin-Soo; Bogatu, I. N.; Galkin, S. A.; Spencer, J. Andrew; Svidzinski, V. A.; Zhao, L.
2017-10-01
FAR-TECH, Inc. has been working on magnetic fusion energy research for over two decades. Over the years, we have developed unique approaches to help understand the physics and resolve issues in magnetic fusion energy. The specific areas of work have been modeling RF waves in plasmas, MHD modeling and mode identification, and nano-particle plasma jets and their application to disruption mitigation. Our research highlights in recent years will be presented with examples, specifically the development of FullWave (Full Wave RF code), PMARS (Parallelized MARS code), and HEM (Hybrid ElectroMagnetic code). In addition, the nano-particle plasma jet (NPPJ) and its application to disruption mitigation will be presented. Work is supported by the U.S. DOE SBIR program.
Immunological Approach to the Identification and Development of Vaccines to Various Toxins
1991-03-30
discussed. II. RESULTS A. SAXITOXIN Within the last year, fusions of spleen cells from mice immunized with SXT conjugated to keyhole limpet hemocyanin...as described in previous reports (also see reference 1). A total of approximately 1200 hybrids have been screened from two fusions of spleen cells ...from mice immunized with SXT-formaldehyde-KLH and three fusions of spleen cells from mice immunized with SXT-SPDP-KLH (data not shown). Up to date
Facchinello, Yann; Brailovski, Vladimir; Petit, Yvan; Mac-Thiong, Jean-Marc
2014-11-01
The concept of a monolithic Ti-Ni spinal rod with variable flexural stiffness is proposed to reduce the risks associated with spinal fusion. The variable stiffness is conferred to the rod using the Joule-heating local annealing technique. The annealing temperature and the mechanical property distributions resulting from this thermal treatment are numerically modeled and experimentally measured. To illustrate the possible applications of such a modeling approach, two case studies are presented: (a) optimization of the Joule-heating strategy to reduce annealing time, and (b) modulation of the rod's overall flexural stiffness using partial annealing. A numerical model of a human spine coupled with the model of the variable flexural stiffness spinal rod developed in this work can ultimately be used to maximize the stabilization capability of spinal instrumentation, while simultaneously decreasing the risks associated with spinal fusion. Copyright © 2014 IPEM. Published by Elsevier Ltd. All rights reserved.
Human-centric predictive model of task difficulty for human-in-the-loop control tasks
Majewicz Fey, Ann
2018-01-01
Quantitatively measuring the difficulty of a manipulation task in human-in-the-loop control systems is ill-defined. Currently, systems are typically evaluated through task-specific performance measures and post-experiment user surveys; however, these methods do not capture the real-time experience of human users. In this study, we propose to analyze and predict the difficulty of a bivariate pointing task, with a haptic device interface, using human-centric measurement data in terms of cognition, physical effort, and motion kinematics. Noninvasive sensors were used to record the multimodal response of human user for 14 subjects performing the task. A data-driven approach for predicting task difficulty was implemented based on several task-independent metrics. We compare four possible models for predicting task difficulty to evaluated the roles of the various types of metrics, including: (I) a movement time model, (II) a fusion model using both physiological and kinematic metrics, (III) a model only with kinematic metrics, and (IV) a model only with physiological metrics. The results show significant correlation between task difficulty and the user sensorimotor response. The fusion model, integrating user physiology and motion kinematics, provided the best estimate of task difficulty (R2 = 0.927), followed by a model using only kinematic metrics (R2 = 0.921). Both models were better predictors of task difficulty than the movement time model (R2 = 0.847), derived from Fitt’s law, a well studied difficulty model for human psychomotor control. PMID:29621301
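A minimal sketch of the model comparison described above, using synthetic data and ordinary linear regression as a stand-in for the study's data-driven models; the metric names and coefficients are illustrative assumptions only.

```python
# Sketch: compare a movement-time model, a kinematics-only model, and a fusion
# model that also uses physiological metrics, scoring each by held-out R^2.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(3)
n = 300
move_time = rng.uniform(0.5, 3.0, n)
kinematics = rng.normal(size=(n, 3))                 # e.g. path length, jerk, speed
physiology = rng.normal(size=(n, 2))                 # e.g. heart rate, skin response
difficulty = (1.2 * move_time + kinematics @ [0.8, -0.5, 0.3]
              + physiology @ [0.6, 0.4] + rng.normal(0, 0.3, n))

def r2(X):
    X_tr, X_te, y_tr, y_te = train_test_split(X, difficulty, random_state=0)
    return LinearRegression().fit(X_tr, y_tr).score(X_te, y_te)

print("movement time only:", round(r2(move_time.reshape(-1, 1)), 3))
print("kinematics only:  ", round(r2(kinematics), 3))
print("fusion:           ", round(r2(np.hstack([kinematics, physiology])), 3))
```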
NASA Astrophysics Data System (ADS)
Cantelli, A.; D'Orta, F.; Cattini, A.; Sebastianelli, F.; Cedola, L.
2015-08-01
A computational model is developed for retrieving the positions and the emission rates of unknown pollution sources, under steady state conditions, starting from measurements of the concentration of the pollutants. The approach is based on the minimization of a fitness function employing a genetic algorithm paradigm. The model is tested considering both pollutant concentrations generated through a Gaussian model at 25 points in a 3-D test case domain (1000 m × 1000 m × 50 m) and experimental data such as the Prairie Grass field experiment data, in which about 600 receptors were located along five concentric semicircle arcs, and the Fusion Field Trials 2007. The results show that the computational model is capable of efficiently retrieving up to three different unknown sources.
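A minimal sketch of this source-term inversion idea, not the paper's code: a deliberately simplified ground-level Gaussian plume serves as the forward model, and SciPy's differential evolution stands in for the genetic algorithm; receptor layout, dispersion coefficients, and the single-source setup are all assumptions.

```python
# Sketch: recover an unknown source position and emission rate by minimizing a
# least-squares fitness between observed and modeled receptor concentrations.
import numpy as np
from scipy.optimize import differential_evolution

U = 3.0                                               # wind speed along +x (m/s)

def plume(params, rx, ry):
    """Ground-level concentration at receptors for a source at (x0, y0) with rate Q."""
    x0, y0, Q = params
    dx, dy = rx - x0, ry - y0
    with np.errstate(invalid="ignore", divide="ignore"):
        dx = np.where(dx > 1.0, dx, np.nan)           # only downwind receptors see the plume
        sy, sz = 0.08 * dx**0.9, 0.06 * dx**0.9       # crude dispersion coefficients
        c = Q / (np.pi * U * sy * sz) * np.exp(-dy**2 / (2 * sy**2))
    return np.nan_to_num(c)

rng = np.random.default_rng(4)
rx, ry = rng.uniform(0, 1000, 25), rng.uniform(0, 1000, 25)   # 25 receptors
truth = (120.0, 480.0, 50.0)                          # hidden source: x0, y0, Q
obs = plume(truth, rx, ry) * (1 + rng.normal(0, 0.05, rx.size))

fitness = lambda p: np.sum((plume(p, rx, ry) - obs) ** 2)
res = differential_evolution(fitness, bounds=[(0, 1000), (0, 1000), (1, 200)],
                             seed=0, tol=1e-10)
print("estimated (x0, y0, Q):", np.round(res.x, 1))
```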
Context-Aware Fusion of RGB and Thermal Imagery for Traffic Monitoring
Alldieck, Thiemo; Bahnsen, Chris H.; Moeslund, Thomas B.
2016-01-01
In order to enable a robust 24-h monitoring of traffic under changing environmental conditions, it is beneficial to observe the traffic scene using several sensors, preferably from different modalities. To fully benefit from multi-modal sensor output, however, one must fuse the data. This paper introduces a new approach for fusing color RGB and thermal video streams by using not only the information from the videos themselves, but also the available contextual information of a scene. The contextual information is used to judge the quality of a particular modality and guides the fusion of two parallel segmentation pipelines of the RGB and thermal video streams. The potential of the proposed context-aware fusion is demonstrated by extensive tests of quantitative and qualitative characteristics on existing and novel video datasets and benchmarked against competing approaches to multi-modal fusion. PMID:27869730
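A minimal sketch of the context-guided fusion idea above, under assumptions: a single scalar "daylight" score stands in for the paper's contextual information, the foreground probability maps are random placeholders for the two segmentation pipelines, and the linear blend is an assumed combination rule.

```python
# Sketch: blend RGB and thermal foreground probabilities with a weight driven
# by a contextual quality score for the RGB modality.
import numpy as np

rng = np.random.default_rng(5)
H, W = 48, 64
p_rgb = rng.random((H, W))        # stand-in foreground prob. from the RGB pipeline
p_thermal = rng.random((H, W))    # stand-in foreground prob. from the thermal pipeline

def fuse(p_rgb, p_thermal, daylight):
    """Trust RGB more in daylight, thermal more at night (daylight in [0, 1])."""
    w_rgb = np.clip(daylight, 0.0, 1.0)
    return w_rgb * p_rgb + (1.0 - w_rgb) * p_thermal

day_mask = fuse(p_rgb, p_thermal, daylight=0.9) > 0.5
night_mask = fuse(p_rgb, p_thermal, daylight=0.1) > 0.5
print(day_mask.mean(), night_mask.mean())
```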
Adaptive Modeling of the International Space Station Electrical Power System
NASA Technical Reports Server (NTRS)
Thomas, Justin Ray
2007-01-01
Software simulations provide NASA engineers the ability to experiment with spacecraft systems in a computer-imitated environment. Engineers currently develop software models that encapsulate spacecraft system behavior. These models can be inaccurate due to invalid assumptions, erroneous operation, or system evolution. Increasing accuracy requires manual calibration and domain-specific knowledge. This thesis presents a method for automatically learning system models without any assumptions regarding system behavior. Data stream mining techniques are applied to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). We also explore a knowledge fusion approach that uses traditional engineered EPS models to supplement the learned models. We observed that these engineered EPS models provide useful background knowledge to reduce predictive error spikes when confronted with making predictions in situations that are quite different from the training scenarios used when learning the model. Evaluations using ISS sensor data and existing EPS models demonstrate the success of the adaptive approach. Our experimental results show that adaptive modeling provides reductions in model error anywhere from 80% to 96% over these existing models. Final discussions include impending use of adaptive modeling technology for ISS mission operations and the need for adaptive modeling in future NASA lunar and Martian exploration.
Anterior surgical management of single-level cervical disc disease: a cost-effectiveness analysis.
Lewis, Daniel J; Attiah, Mark A; Malhotra, Neil R; Burnett, Mark G; Stein, Sherman C
2014-12-01
Cost-effectiveness analysis with decision analysis and meta-analysis. To determine the relative cost-effectiveness of anterior cervical discectomy with fusion (with autograft, allograft, or spacers), anterior cervical discectomy without fusion (ACD), and cervical disc replacement (CDR) for the treatment of 1-level cervical disc disease. There is debate as to the optimal anterior surgical strategy to treat single-level cervical disc disease. Surgical strategies include 3 techniques of anterior cervical discectomy with fusion (autograft, allograft, or spacer-assisted fusion), ACD, and CDR. Several controlled trials have compared these treatments but have yielded mixed results. Decision analysis provides a structure for making a quantitative comparison of the costs and outcomes of each treatment. A literature search was performed and yielded 156 case series that fulfilled our search criteria describing nearly 17,000 cases. Data were abstracted from these publications and pooled meta-analytically to estimate the incidence of various outcomes, including index-level and adjacent-level reoperation. A decision analytic model calculated the expected costs in US dollars and outcomes in quality-adjusted life years for a typical adult patient with 1-level cervical radiculopathy subjected to each of the 5 approaches. At 5 years postoperatively, patients who had undergone ACD alone had significantly (P < 0.001) more quality-adjusted life years (4.885 ± 0.041) than those receiving other treatments. Patients with ACD also exhibited highly significant (P < 0.001) differences in costs, incurring the lowest societal costs ($16,558 ± $539). Follow-up data were inadequate for comparison beyond 5 years. The results of our decision analytic model indicate advantages for ACD, both in effectiveness and costs, over other strategies. Thus, ACD is a cost-effective alternative to anterior cervical discectomy with fusion and CDR in patients with single-level cervical disc disease. Definitive conclusions about degenerative changes after ACD and adjacent-level disease after CDR await longer follow-up. 4.
NASA Astrophysics Data System (ADS)
Jourde, K.; Gibert, D.; Marteau, J.
2015-04-01
This paper examines how the resolution of small-scale geological density models is improved through the fusion of information provided by gravity measurements and density muon radiographies. Muon radiography aims at determining the density of geological bodies by measuring their screening effect on the natural flux of cosmic muons. Muon radiography essentially works like a medical X-ray scan and integrates density information along elongated narrow conical volumes. Gravity measurements are linked to density by a 3-D integration encompassing the whole studied domain. We establish the mathematical expressions of these integration formulas (called acquisition kernels) and derive the resolving kernels, spatial filters that relate the true unknown density structure to the density distribution actually recovered from the available data. The resolving kernel approach allows one to quantitatively describe the improvement in the resolution of the density models achieved by merging gravity data and muon radiographies. The method developed in this paper may be used to optimally design the geometry of the field measurements to be performed in order to obtain a given spatial resolution pattern of the density model to be constructed. The resolving kernels derived in the joint muon/gravimetry case indicate that gravity data are almost useless for constraining the density structure in regions sampled by more than two muon tomography acquisitions. Interestingly, the resolution in deeper regions not sampled by muon tomography is significantly improved by joining the two techniques. The method is illustrated with examples for La Soufrière of Guadeloupe volcano.
Physics-based and human-derived information fusion for analysts
NASA Astrophysics Data System (ADS)
Blasch, Erik; Nagy, James; Scott, Steve; Okoth, Joshua; Hinman, Michael
2017-05-01
Recent trends in physics-based and human-derived information fusion (PHIF) have amplified the capabilities of analysts; however, with the big data opportunities there is a need for open architecture designs, methods of distributed team collaboration, and visualizations. In this paper, we explore recent trends in information fusion to support user interaction and machine analytics. Challenging scenarios requiring PHIF include combining physics-based video data with human-derived text data for enhanced simultaneous tracking and identification. A driving effort would be to provide analysts with applications, tools, and interfaces that afford effective and affordable solutions for timely decision making. Fusion at scale should be developed to allow analysts to access data, call analytics routines, enter solutions, update models, and store results for distributed decision making.
The High Field Path to Practical Fusion Energy
NASA Astrophysics Data System (ADS)
Mumgaard, Robert; Whyte, D.; Greenwald, M.; Hartwig, Z.; Brunner, D.; Sorbom, B.; Marmar, E.; Minervini, J.; Bonoli, P.; Irby, J.; Labombard, B.; Terry, J.; Vieira, R.; Wukitch, S.
2017-10-01
We propose a faster, lower cost development path for fusion energy enabled by high temperature superconductors, devices at high magnetic field, innovative technologies and modern approaches to technology development. Timeliness, scale, and economic-viability are the drivers for fusion energy to combat climate change and aid economic development. The opportunities provided by high-temperature superconductors, innovative engineering and physics, and new organizational structures identified over the last few years open new possibilities for realizing practical fusion energy that could meet mid-century de-carbonization needs. We discuss re-factoring the fusion energy development path with an emphasis on concrete risk retirement strategies utilizing a modular approach based on the high-field tokamak that leverages the broader tokamak physics understanding of confinement, stability, and operational limits. Elements of this plan include development of high-temperature superconductor magnets, simplified immersion blankets, advanced long-leg divertors, a compact divertor test tokamak, efficient current drive, modular construction, and demountable magnet joints. An R&D plan culminating in the construction of an integrated pilot plant and test facility modeled on the ARC concept is presented.
Performance evaluation of an asynchronous multisensor track fusion filter
NASA Astrophysics Data System (ADS)
Alouani, Ali T.; Gray, John E.; McCabe, D. H.
2003-08-01
Recently the authors developed a new filter that uses data generated by asynchronous sensors to produce a state estimate that is optimal in the minimum mean square sense. The solution accounts for communications delays between the sensor platforms and the fusion center. It also deals with out-of-sequence data as well as latent data by processing the information in a batch-like manner. This paper compares, using simulated targets and Monte Carlo simulations, the performance of the filter to the optimal sequential processing approach. It was found that the performance of the new asynchronous multisensor track fusion filter (AMSTFF) is identical to that of the extended sequential Kalman filter (SEKF), while the new filter updates its track at a much lower rate than the SEKF.
NASA Astrophysics Data System (ADS)
Shahini Shamsabadi, Salar
A web-based PAVEment MONitoring system, PAVEMON, is a GIS-oriented platform for accommodating, representing, and leveraging data from a multi-modal mobile sensor system. This sensor system consists of acoustic, optical, electromagnetic, and GPS sensors and is capable of producing as much as 1 terabyte of data per day. Multi-channel raw sensor data (microphone, accelerometer, tire pressure sensor, video) and processed results (road profile, crack density, international roughness index, micro texture depth, etc.) are outputs of this sensor system. By correlating the sensor measurements and positioning data collected in tight time synchronization, PAVEMON attaches a spatial component to all the datasets. These spatially indexed outputs are placed into an Oracle database which integrates seamlessly with PAVEMON's web-based system. The web-based system of PAVEMON consists of two major modules: 1) a GIS module for visualizing and spatial analysis of pavement condition information layers, and 2) a decision-support module for managing maintenance and repair (M&R) activities and predicting future budget needs. PAVEMON weaves together sensor data with third-party climate and traffic information from the National Oceanic and Atmospheric Administration (NOAA) and Long Term Pavement Performance (LTPP) databases for an organized, data-driven approach to pavement management activities. PAVEMON deals with heterogeneous and redundant observations by fusing them for jointly derived, higher-confidence results. A prominent example of the fusion algorithms developed within PAVEMON is a data fusion algorithm used for estimating overall pavement conditions in terms of ASTM's Pavement Condition Index (PCI). PAVEMON predicts PCI by undertaking a statistical fusion approach and selecting a subset of all the sensor measurements. Other fusion algorithms include noise-removal algorithms to remove false negatives in the sensor data, in addition to fusion algorithms developed for identifying features on the road. PAVEMON offers an ideal research and monitoring platform for rapid, intelligent and comprehensive evaluation of tomorrow's transportation infrastructure based on up-to-date data from heterogeneous sensor systems.
Bradley, W Daniel; Hisey, Michael S; Verma-Kurvari, Sunita; Ohnmeiss, Donna D
2012-01-01
Lumbar interbody fusion has long been used for the treatment of painful degenerative spinal conditions. The anterior approach is not feasible in some patients, and the posterior approach is associated with a risk of neural complications and possibly muscle injury. A trans-sacral technique was developed that allows access to the L5-S1 disc space. The purposes of this study were to investigate the clinical outcome of trans-sacral interbody fusion in a consecutive series of patients from 1 center and to perform a comprehensive review of the literature on this procedure. A literature search using PubMed was performed to identify articles published on trans-sacral axial lumbar interbody fusion (AxiaLIF). Articles reviewed included biomechanical testing, feasibility of the technique, and clinical results. The data from our center were collected retrospectively from charts for the consecutive series, beginning with the first case, of all patients undergoing fusion using the AxiaLIF technique. In most cases, posterior instrumentation was also used. A total of 41 patients with at least 6 months' follow-up were included (mean follow-up, 22.2 months). The primary clinical outcome measures were visual analog scales separately assessing back and leg pain and the Oswestry Disability Index. Radiographic assessment of fusion was also performed. In the group of 28 patients undergoing single-level AxiaLIF combined with posterior fusion, the visual analog scale scores assessing back and leg pain and mean Oswestry Disability Index scores improved significantly (P < .01). In the remaining 13 patients, back pain improved significantly with a trend for improvement in leg pain. Reoperation occurred in 19.5% of patients; in half of these, reoperation was not related to the anterior procedure. A review of the literature found that the AxiaLIF technique was similar to other fusion techniques with respect to biomechanical properties and produced acceptable clinical outcomes, although results varied among studies. The AxiaLIF approach allows access to the L5-S1 interspace without violating the annulus or longitudinal ligaments and with minimal risk to dorsal neural elements. It may be a viable alternative to other approaches to interbody fusion at the L5-S1 level. It is important that the patients be selected carefully and surgeons are familiar with the presacral anatomy and the surgical approach.
Wang, Yen-Ling
2014-01-01
Checkpoint kinase 2 (Chk2) has a great effect on DNA-damage and plays an important role in response to DNA double-strand breaks and related lesions. In this study, we will concentrate on Chk2 and the purpose is to find the potential inhibitors by the pharmacophore hypotheses (PhModels), combinatorial fusion, and virtual screening techniques. Applying combinatorial fusion into PhModels and virtual screening techniques is a novel design strategy for drug design. We used combinatorial fusion to analyze the prediction results and then obtained the best correlation coefficient of the testing set (r test) with the value 0.816 by combining the BesttrainBesttest and FasttrainFasttest prediction results. The potential inhibitors were selected from NCI database by screening according to BesttrainBesttest + FasttrainFasttest prediction results and molecular docking with CDOCKER docking program. Finally, the selected compounds have high interaction energy between a ligand and a receptor. Through these approaches, 23 potential inhibitors for Chk2 are retrieved for further study. PMID:24864236
Mohamed, Yehia S; Dunnion, Debbie; Teobald, Iryna; Walewska, Renata; Browning, Michael J
2012-10-12
Fusions of dendritic cells (DCs) and tumour cells have been shown to induce protective immunity to tumour challenge in animal models, and to represent a promising approach to cancer immunotherapy. The broader clinical application of this approach, however, is potentially constrained by the lack of replicative capacity and limited standardisation of fusion cell preparations. We show here that fusion of ex vivo tumour cells isolated from patients with a range of haematological malignancies with the human B-lymphoblastoid cell line (LCL), HMy2, followed by chemical selection of the hybridomas, generated stable, self-replicating human hybrid cell lines that grew continuously in tissue culture, and survived freeze/thawing cycles. The hybrid cell lines expressed HLA class I and class II molecules, and the major T-cell costimulatory molecules, CD80 and CD86. All but two of 14 hybrid cell lines generated expressed tumour-associated antigens that were not expressed by HMy2 cells, and were therefore derived from the parent tumour cells. The hybrid cell lines stimulated allogeneic T-cell proliferative responses and interferon-gamma release in vitro to a considerably greater degree than their respective parent tumour cells. The enhanced T-cell stimulation was inhibited by CTLA4-Ig fusion protein, and by blocking antibodies to MHC class I and class II molecules. Finally, all of five LCL/tumour hybrid cell lines tested induced tumour antigen-specific cytotoxic T-cell responses in vitro in PBL from healthy, HLA-A2+ individuals, as detected by HLA-A2-peptide pentamer staining and cellular cytotoxicity. These data show that stable hybrid cell lines, with enhanced immunostimulatory properties and potential for therapeutic vaccination, can be generated by in vitro fusion and chemical selection of B-LCL and ex vivo haematological tumour cells. Copyright © 2012 Elsevier Ltd. All rights reserved.
Potluri, Chandrasekhar; Anugolu, Madhavi; Schoen, Marco P; Subbaram Naidu, D; Urfer, Alex; Chiu, Steve
2013-11-01
Estimating skeletal muscle (finger) forces using surface Electromyography (sEMG) signals poses many challenges. In general, the sEMG measurements are based on single sensor data. In this paper, two novel hybrid fusion techniques for estimating the skeletal muscle force from the sEMG array sensors are proposed. The sEMG signals are pre-processed using five different filters: Butterworth, Chebychev Type II, Exponential, Half-Gaussian and Wavelet transforms. Dynamic models are extracted from the acquired data using Nonlinear Wiener Hammerstein (NLWH) models and Spectral Analysis Frequency Dependent Resolution (SPAFDR) models based system identification techniques. A detailed comparison is provided for the proposed filters and models using 18 healthy subjects. Wavelet transforms give higher mean correlation of 72.6 ± 1.7 (mean ± SD) and 70.4 ± 1.5 (mean ± SD) for NLWH and SPAFDR models, respectively, when compared to the other filters used in this work. Experimental verification of the fusion based hybrid models with wavelet transform shows a 96% mean correlation and 3.9% mean relative error with a standard deviation of ± 1.3 and ± 0.9 respectively between the overall hybrid fusion algorithm estimated and the actual force for 18 test subjects' k-fold cross validation data. © 2013 Elsevier Ltd. All rights reserved.
HEDP and new directions for fusion energy
NASA Astrophysics Data System (ADS)
Kirkpatrick, Ronald C.
2010-06-01
Magnetic-confinement fusion energy and inertial-confinement fusion energy (IFE) represent two extreme approaches to the quest for the application of thermonuclear fusion to electrical energy generation. Blind pursuit of these extreme approaches has long delayed the achievement of their common goal. We point out the possibility of an intermediate approach that promises cheaper, and consequently more rapid, development of fusion energy. For example, magneto-inertial fusion appears to be possible over a broad range of parameter space. It is further argued that imposition of artificial constraints impedes the discovery of physics solutions for the fusion energy problem.
Interactive Plasma Physics Education Using Data from Fusion Experiments
NASA Astrophysics Data System (ADS)
Calderon, Brisa; Davis, Bill; Zwicker, Andrew
2010-11-01
The Internet Plasma Physics Education Experience (IPPEX) website was created in 1996 to give users access to data from plasma and fusion experiments. Interactive material on electricity, magnetism, matter, and energy was presented to generate interest and prepare users to understand data from a fusion experiment. Initially, users were allowed to analyze real-time and archival data from the Tokamak Fusion Test Reactor (TFTR) experiment. IPPEX won numerous awards for its novel approach of allowing users to participate in ongoing research. However, the latest revisions of IPPEX were made in 2001, and the interactive material is no longer functional on modern browsers. Also, access to real-time data was lost when TFTR was shut down. The interactive material on IPPEX is being rewritten in ActionScript 3.0, and real-time and archival data from the National Spherical Torus Experiment (NSTX) will be made available to users. New tools like EFIT animations, fast cameras, and plots of important plasma parameters will be included along with an existing Java-based ``virtual tokamak.'' Screenshots from the upgraded website and future directions will be presented.
Automatic tissue segmentation of head and neck MR images for hyperthermia treatment planning
NASA Astrophysics Data System (ADS)
Fortunati, Valerio; Verhaart, René F.; Niessen, Wiro J.; Veenland, Jifke F.; Paulides, Margarethus M.; van Walsum, Theo
2015-08-01
A hyperthermia treatment requires accurate, patient-specific treatment planning. This planning is based on 3D anatomical models which are generally derived from computed tomography. Because of its superior soft tissue contrast, magnetic resonance imaging (MRI) information can be introduced to improve the quality of these 3D patient models and therefore the treatment planning itself. Thus, we present here an automatic atlas-based segmentation algorithm for MR images of the head and neck. Our method combines multi-atlas local weighting fusion with intensity modelling. The accuracy of the method was evaluated using a leave-one-out cross validation experiment over a set of 11 patients for which manual delineations were available. The accuracy of the proposed method was high both in terms of the Dice similarity coefficient (DSC) and the 95th percentile Hausdorff surface distance (HSD), with median DSC higher than 0.8 for all tissues except sclera. For all tissues, except the spine tissues, the accuracy approached the interobserver agreement/variability both in terms of DSC and HSD. The positive effect of adding the intensity modelling to the multi-atlas fusion decreased when a more accurate atlas fusion method was used. Using the proposed approach we improved the performance of the approach previously presented for H&N hyperthermia treatment planning, making the method suitable for clinical application.
Formation and decay analysis of
NASA Astrophysics Data System (ADS)
Gautam, Manjeet Singh; Kaur, Amandeep; Sharma, Manoj K.
2015-11-01
We have analyzed the fusion dynamics of Ca40
Predicting breast cancer using an expression values weighted clinical classifier.
Thomas, Minta; De Brabanter, Kris; Suykens, Johan A K; De Moor, Bart
2014-12-31
Clinical data, such as patient history, laboratory analysis, and ultrasound parameters, which are the basis of day-to-day clinical decision support, are often used to guide the clinical management of cancer in the presence of microarray data. Several data fusion techniques are available to integrate genomics or proteomics data, but only a few studies have created a single prediction model using both gene expression and clinical data. These studies often remain inconclusive regarding any improvement in prediction performance obtained. To improve clinical management, these data should be fully exploited. This requires efficient algorithms to integrate these data sets and design a final classifier. LS-SVM classifiers and generalized eigenvalue/singular value decompositions are successfully used in many bioinformatics applications for prediction tasks. While bringing up the benefits of these two techniques, we propose a machine learning approach, a weighted LS-SVM classifier, to integrate two data sources: microarray and clinical parameters. We compared and evaluated the proposed methods on five breast cancer case studies. Compared to an LS-SVM classifier on the individual data sets, generalized eigenvalue decomposition (GEVD) and kernel GEVD, the proposed weighted LS-SVM classifier offers good prediction performance, in terms of test area under the ROC curve (AUC), on all breast cancer case studies. Thus a clinical classifier weighted with the microarray data set results in significantly improved diagnosis, prognosis and prediction of responses to therapy. The proposed model has been shown to be a promising mathematical framework for both data fusion and non-linear classification problems.
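A minimal sketch of the kernel-level fusion idea above, under assumptions: synthetic stand-ins replace the clinical and microarray data, a standard SVM on a precomputed combined kernel substitutes for the weighted LS-SVM, and the weight mu and RBF bandwidths are illustrative choices only.

```python
# Sketch: combine a clinical kernel and a microarray kernel with weight mu,
# then classify with an SVM on the precomputed combined kernel.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
n = 200
y = rng.integers(0, 2, n)
clinical = rng.normal(size=(n, 8)) + y[:, None] * 0.8       # few clinical variables
microarray = rng.normal(size=(n, 500)) + y[:, None] * 0.15  # many gene expressions

idx_tr, idx_te = train_test_split(np.arange(n), random_state=0)

def combined_kernel(rows, cols, mu=0.6):
    """K = mu * K_clinical + (1 - mu) * K_microarray on the given index sets."""
    k_c = rbf_kernel(clinical[rows], clinical[cols], gamma=0.1)
    k_m = rbf_kernel(microarray[rows], microarray[cols], gamma=0.002)
    return mu * k_c + (1 - mu) * k_m

svm = SVC(kernel="precomputed").fit(combined_kernel(idx_tr, idx_tr), y[idx_tr])
acc = svm.score(combined_kernel(idx_te, idx_tr), y[idx_te])
print("test accuracy with the weighted combined kernel:", round(acc, 3))
```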
Mixed H2/H∞-Based Fusion Estimation for Energy-Limited Multi-Sensors in Wearable Body Networks
Li, Chao; Zhang, Zhenjiang; Chao, Han-Chieh
2017-01-01
In wireless sensor networks, sensor nodes collect plenty of data in each time period. If all of the data are transmitted to a Fusion Center (FC), the power of the sensor nodes would run out rapidly. On the other hand, the data also need a filter to remove the noise. Therefore, an efficient fusion estimation model, which can save the energy of the sensor nodes while maintaining higher accuracy, is needed. This paper proposes a novel mixed H2/H∞-based energy-efficient fusion estimation model (MHEEFE) for energy-limited Wearable Body Networks. In the proposed model, the communication cost is first reduced efficiently while keeping the estimation accuracy. Then, the parameters in the quantization method are discussed, and we confirm them by an optimization method with some prior knowledge. Besides, some calculation methods for important parameters are researched, which make the final estimates more stable. Finally, an iteration-based weight calculation algorithm is presented, which can improve the fault tolerance of the final estimate. In the simulation, the impacts of some pivotal parameters are discussed. Meanwhile, compared with other related models, the MHEEFE shows better performance in accuracy, energy-efficiency and fault tolerance. PMID:29280950
Ouyang, Qin; Zhao, Jiewen; Chen, Quansheng
2014-09-02
Instrumental testing of food quality using perception sensors instead of human panel tests has attracted considerable attention recently. A novel cross-perception multi-sensor data fusion imitating multiple mammalian perceptions was proposed for instrumental testing in this work. First, three mimic sensors, an electronic eye, an electronic nose and an electronic tongue, were used in sequence for data acquisition of rice wine samples. Then all data from the three different sensors were preprocessed and merged. Next, three cross-perception variables, i.e., color, aroma and taste, were constructed using principal components analysis (PCA) and multiple linear regression (MLR) and used as the input of models. MLR, back-propagation artificial neural network (BPANN) and support vector machine (SVM) models were comparatively used for modeling, and instrumental testing was achieved for the comprehensive quality of the samples. Results showed the proposed cross-perception multi-sensor data fusion presented obvious superiority over the traditional data fusion methodologies, and also achieved a high correlation coefficient (>90%) with the human panel test results. This work demonstrated that instrumental testing based on cross-perception multi-sensor data fusion can actually mimic human test behavior, and is therefore of great significance to ensure the quality of products and decrease the losses of manufacturers. Copyright © 2014 Elsevier B.V. All rights reserved.
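A minimal sketch of the cross-perception construction above, not the paper's code: each instrument's measurements are reduced to a single PCA score standing in for the "color", "aroma" and "taste" variables, and the panel score is regressed on those three scores; all data, dimensions, and coefficients are synthetic assumptions.

```python
# Sketch: one PCA score per sensor block, then multiple linear regression of the
# panel quality score on the three cross-perception variables.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 80                                                # rice wine samples
quality = rng.uniform(60, 100, n)                     # human panel score
eye = np.outer(quality, rng.normal(size=12)) + rng.normal(size=(n, 12))
nose = np.outer(quality, rng.normal(size=10)) + rng.normal(size=(n, 10))
tongue = np.outer(quality, rng.normal(size=7)) + rng.normal(size=(n, 7))

# One PCA score per instrument -> the "color", "aroma" and "taste" variables.
scores = np.column_stack([PCA(n_components=1).fit_transform(X).ravel()
                          for X in (eye, nose, tongue)])

model = LinearRegression().fit(scores, quality)
pred = model.predict(scores)
print("correlation with panel score:", round(np.corrcoef(pred, quality)[0, 1], 3))
```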
NASA Astrophysics Data System (ADS)
Chopra, Sahila; Kaur, Arshdeep; Gupta, Raj K.
2015-01-01
The earlier study of excitation functions of 105Ag*, formed in the 12C + 93Nb reaction, based on the dynamical cluster-decay model (DCM), using the pocket formula for the nuclear proximity potential, is extended to the use of other nuclear interaction potentials derived from the Skyrme energy density functional (SEDF) based on the semiclassical extended Thomas Fermi (ETF) approach and to the use of the extended-Wong model of Gupta and collaborators. The Skyrme forces used are the old SIII and SIV and the new SSk, GSkI, and KDE0(v1), given for both normal and isospin-rich nuclei, with densities added in the frozen-density approximation. Taking advantage of the fact that different Skyrme forces provide different barrier characteristics, we look for "barrier modification" effects in terms of choosing an appropriate force and hence for the existence or nonexistence of noncompound nucleus (nCN) effects in this reaction. Interestingly, independent of the choice of Skyrme or proximity force, the extended-Wong model fits the experimental data nicely, without any barrier modification and hence no nCN component in the measured fusion cross section, which consists of light-particle evaporation residues (ER) and intermediate-mass fragments (IMFs) up to mass 13, i.e., σ_fusion^Expt = σ_ER + σ_IMFs. However, the predicted fusion cross section due to the extended-Wong model is much larger, possibly because of the so-far missing fusion-fission (ff) component in the data. On the other hand, in agreement with the earlier work using the pocket proximity potential, the DCM fits only some data (mainly IMFs) for only some Skyrme forces, and hence it presents the chosen reaction as a case of a large nCN component, whose empirically estimated content is fitted for use of the DCM with a fragment preformation factor taken equal to one, i.e., using DCM (P0 = 1), by introducing "barrier modification" through changing the neck-length parameter ΔR for a best fit to the empirical nCN data in each (ER and IMF) decay channel. Also, the ff component of the DCM is predicted to lie around the symmetric mass A/2 ± 16. All calculations are made for deformed and oriented coplanar nuclei.
NASA Technical Reports Server (NTRS)
Haldemann, Albert F. C.; Johnson, Jerome B.; Elphic, Richard C.; Boynton, William V.; Wetzel, John
2006-01-01
CRUX is a modular suite of geophysical and borehole instruments combined with display and decision support system (MapperDSS) tools to characterize regolith resources, surface conditions, and geotechnical properties. CRUX is a NASA-funded Technology Maturation Program effort to provide enabling technology for Lunar and Planetary Surface Operations (LPSO). The MapperDSS uses data fusion methods with CRUX instruments, and other available data and models, to provide regolith properties information needed for LPSO that cannot be determined otherwise. We demonstrate the data fusion method by showing how it might be applied to characterize the distribution and form of hydrogen using a selection of CRUX instruments: Borehole Neutron Probe and Thermal Evolved Gas Analyzer data as a function of depth help interpret Surface Neutron Probe data to generate 3D information. Secondary information from other instruments along with physical models improves the hydrogen distribution characterization, enabling information products for operational decision-making.
Funding for the 2ND IAEA technical meeting on fusion data processing, validation and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Greenwald, Martin
The International Atomic Energy Agency (IAEA) will organize the second Technical Meeting on Fusion Data Processing, Validation and Analysis from 30 May to 02 June, 2017, in Cambridge, MA, USA. The meeting will be hosted by the MIT Plasma Science and Fusion Center (PSFC). The objective of the meeting is to provide a platform where a set of topics relevant to fusion data processing, validation and analysis are discussed with a view to the extrapolation needs of next-step fusion devices such as ITER. The validation and analysis of experimental data obtained from diagnostics used to characterize fusion plasmas are crucial for a knowledge-based understanding of the physical processes governing the dynamics of these plasmas. The meeting will aim at fostering, in particular, discussions of research and development results that set out or underline trends observed in the current major fusion confinement devices. General information on the IAEA, including its mission and organization, can be found at the IAEA website. Topics include: uncertainty quantification (UQ); model selection, validation, and verification (V&V); probability theory and statistical analysis; inverse problems and equilibrium reconstruction; integrated data analysis; real-time data analysis; machine learning; signal/image processing and pattern recognition; experimental design and synthetic diagnostics; and data management.
CONFERENCE REPORT: Summary of the 8th IAEA Technical Meeting on Fusion Power Plant Safety
NASA Astrophysics Data System (ADS)
Girard, J. Ph.; Gulden, W.; Kolbasov, B.; Louzeiro-Malaquias, A.-J.; Petti, D.; Rodriguez-Rodrigo, L.
2008-01-01
Reports were presented covering a selection of topics on the safety of fusion power plants. These included a review of licensing studies developed for ITER site preparation, surveying common and non-common (i.e. site-dependent) issues as lessons for a broader approach to fusion power plant safety. Several fusion power plant models, spanning from accessible technology to concepts based on more advanced materials, were discussed. On the topic of fusion-specific technology, safety studies were reported on different concepts of breeding blanket modules, tritium handling and auxiliary systems under normal and accident scenario operation. The testing of power-plant-relevant technology in ITER was also assessed in terms of normal operation and accident scenarios, and occupational doses and radioactive releases under these tests have been determined. Other safety issues specific to fusion were also discussed, such as the availability and reliability of fusion power plants, dust and tritium inventories, and component failure databases. This study reveals that the environmental impact of fusion power plants can be minimized through a proper selection of low-activation materials and the use of recycling technology, helping to reduce waste volume and potentially opening the route for its reutilization in the nuclear sector or even its clearance into the commercial circuit. Computational codes for fusion safety were presented in support of the many studies reported. The ongoing work on establishing validation approaches aiming at improving the prediction capability of fusion codes has been supported by experimental results, and new directions for development have been identified. Fusion standards are not available, and fission experience is mostly used as the framework basis for licensing and target design for safe operation and for occupational and environmental constraints. It has been argued that fusion can benefit if a fusion-specific approach is implemented, in particular for materials selection, which will have a large impact on waste disposal and recycling, and in setting realistic limits on radiation releases indexed to the real impact on individuals and the environment, given the differences in the types of radiation emitted by tritium when compared with fission products. Round table sessions resulted in some common recommendations. The discussions also created awareness of the need for a larger involvement of the IAEA in support of fusion safety standards development.
Bayesian Approaches for Model and Multi-mission Satellites Data Fusion
NASA Astrophysics Data System (ADS)
Khaki, M., , Dr; Forootan, E.; Awange, J.; Kuhn, M.
2017-12-01
Traditionally, data assimilation is formulated as a Bayesian approach that allows one to update model simulations using new incoming observations. This integration is necessary due to the uncertainty in model outputs, which mainly results from several drawbacks, e.g., limitations in accounting for the complexity of real-world processes, uncertainties in (unknown) empirical model parameters, and the absence of high resolution (both spatially and temporally) data. Data assimilation, however, requires knowledge of the physical process of a model, which may be either poorly described or entirely unavailable. Therefore, an alternative method is required to avoid this dependency. In this study we present a novel approach which can be used in hydrological applications. A non-parametric framework based on the Kalman filtering technique is proposed to improve hydrological model estimates without using model dynamics. In particular, we assess Kalman-Takens formulations that take advantage of the delay coordinate method to reconstruct nonlinear dynamics in the absence of the physical process. This empirical relationship is then used instead of model equations to integrate satellite products with model outputs. We use water storage variables from World-Wide Water Resources Assessment (W3RA) simulations and update them using Gravity Recovery And Climate Experiment (GRACE) terrestrial water storage (TWS) data and surface soil moisture data from the Advanced Microwave Scanning Radiometer for the Earth Observing System (AMSR-E) over Australia for the period 2003 to 2011. The performance of the proposed integration method is compared with data obtained from the more traditional assimilation scheme using the Ensemble Square-Root Filter (EnSRF) filtering technique (Khaki et al., 2017), as well as by evaluating them against ground-based soil moisture and groundwater observations within the Murray-Darling Basin.
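A heavily simplified sketch of the delay-coordinate idea above, not the study's implementation: the storage series is synthetic, the non-parametric forecast is a single nearest-neighbor analog in delay coordinates rather than the paper's Kalman-Takens formulation, and a scalar gain replaces the full filter update.

```python
# Sketch: model-free forecast from delay-coordinate analogs, blended with a
# noisy "observation" through a scalar Kalman-style update.
import numpy as np

rng = np.random.default_rng(8)
t = np.arange(600)
storage = 10 + 3 * np.sin(2 * np.pi * t / 50) + rng.normal(0, 0.2, t.size)

d = 4                                                  # embedding dimension
def embed(series):
    """Delay-coordinate vectors [x(k), ..., x(k+d-1)] for each usable k."""
    return np.column_stack([series[i:len(series) - d + i + 1] for i in range(d)])

history = storage[:500]                                # library of past behaviour
library = embed(history)                               # analog vectors
targets = history[d:]                                  # value following each analog

P, R = 1.0, 0.25                                       # forecast and obs. error variances
state = history[-d:].copy()                            # most recent delay vector
estimates = []
for k in range(500, 600):
    # Non-parametric forecast: the value that followed the closest analog.
    j = np.argmin(np.linalg.norm(library[:-1] - state, axis=1))
    forecast = targets[j]
    obs = storage[k] + rng.normal(0, np.sqrt(R))       # noisy satellite-like observation
    gain = P / (P + R)                                 # scalar Kalman gain
    analysis = forecast + gain * (obs - forecast)
    estimates.append(analysis)
    state = np.append(state[1:], analysis)             # roll the delay vector forward

rmse = np.sqrt(np.mean((np.array(estimates) - storage[500:600]) ** 2))
print("RMSE of analyses:", round(rmse, 3))
```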
A data fusion-based methodology for optimal redesign of groundwater monitoring networks
NASA Astrophysics Data System (ADS)
Hosseini, Marjan; Kerachian, Reza
2017-09-01
In this paper, a new data fusion-based methodology is presented for spatio-temporal (S-T) redesigning of Groundwater Level Monitoring Networks (GLMNs). The kriged maps of three different criteria (i.e. marginal entropy of water table levels, estimation error variances of mean values of water table levels, and estimation values of long-term changes in water level) are combined for determining monitoring sub-areas of high and low priorities in order to consider different spatial patterns for each sub-area. The best spatial sampling scheme is selected by applying a new method, in which a regular hexagonal gridding pattern and the Thiessen polygon approach are respectively utilized in sub-areas of high and low monitoring priorities. An Artificial Neural Network (ANN) and a S-T kriging models are used to simulate water level fluctuations. To improve the accuracy of the predictions, results of the ANN and S-T kriging models are combined using a data fusion technique. The concept of Value of Information (VOI) is utilized to determine two stations with maximum information values in both sub-areas with high and low monitoring priorities. The observed groundwater level data of these two stations are considered for the power of trend detection, estimating periodic fluctuations and mean values of the stationary components, which are used for determining non-uniform sampling frequencies for sub-areas. The proposed methodology is applied to the Dehgolan plain in northwestern Iran. The results show that a new sampling configuration with 35 and 7 monitoring stations and sampling intervals of 20 and 32 days, respectively in sub-areas with high and low monitoring priorities, leads to a more efficient monitoring network than the existing one containing 52 monitoring stations and monthly temporal sampling.
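As a small illustration of combining two water-level predictors such as the ANN and space-time kriging models mentioned above, here is a sketch under assumptions: the series are synthetic, and inverse-error-variance weighting is an assumed fusion rule, not necessarily the paper's technique.

```python
# Sketch: fuse two predictors by weighting each inversely to its error variance
# estimated on a validation window, then compare RMSE on the remaining days.
import numpy as np

rng = np.random.default_rng(9)
truth = 100 + np.cumsum(rng.normal(0, 0.1, 365))       # daily water table level
pred_ann = truth + rng.normal(0, 0.4, truth.size)      # stand-in ANN prediction
pred_krig = truth + rng.normal(0, 0.7, truth.size)     # stand-in kriging prediction

val = slice(0, 180)                                    # validation window
var_ann = np.var(pred_ann[val] - truth[val])
var_krig = np.var(pred_krig[val] - truth[val])
w_ann = (1 / var_ann) / (1 / var_ann + 1 / var_krig)   # inverse-variance weight

fused = w_ann * pred_ann + (1 - w_ann) * pred_krig
rmse = lambda p: np.sqrt(np.mean((p[180:] - truth[180:]) ** 2))
print("RMSE ann / kriging / fused:",
      round(rmse(pred_ann), 3), round(rmse(pred_krig), 3), round(rmse(fused), 3))
```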
A DNA-based semantic fusion model for remote sensing data.
Sun, Heng; Weng, Jian; Yu, Guangchuang; Massawe, Richard H
2013-01-01
Semantic technology plays a key role in various domains, from conversation understanding to algorithm analysis. As the most efficient semantic tool, ontology can represent, process and manage the widespread knowledge. Nowadays, many researchers use ontology to collect and organize data's semantic information in order to maximize research productivity. In this paper, we firstly describe our work on the development of a remote sensing data ontology, with a primary focus on semantic fusion-driven research for big data. Our ontology is made up of 1,264 concepts and 2,030 semantic relationships. However, the growth of big data is straining the capacities of current semantic fusion and reasoning practices. Considering the massive parallelism of DNA strands, we propose a novel DNA-based semantic fusion model. In this model, a parallel strategy is developed to encode the semantic information in DNA for a large volume of remote sensing data. The semantic information is read in a parallel and bit-wise manner and an individual bit is converted to a base. By doing so, a considerable amount of conversion time can be saved, i.e., the cluster-based multi-processes program can reduce the conversion time from 81,536 seconds to 4,937 seconds for 4.34 GB source data files. Moreover, the size of result file recording DNA sequences is 54.51 GB for parallel C program compared with 57.89 GB for sequential Perl. This shows that our parallel method can also reduce the DNA synthesis cost. In addition, data types are encoded in our model, which is a basis for building type system in our future DNA computer. Finally, we describe theoretically an algorithm for DNA-based semantic fusion. This algorithm enables the process of integration of the knowledge from disparate remote sensing data sources into a consistent, accurate, and complete representation. This process depends solely on ligation reaction and screening operations instead of the ontology.
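A minimal sketch of the bit-wise, parallel encoding described above, not the paper's encoder: each individual bit is converted to one DNA base as the abstract states, but the concrete 0 -> 'A', 1 -> 'T' mapping, the chunk size, and the use of a process pool are illustrative assumptions.

```python
# Sketch: read bytes bit-wise, map each bit to a DNA base, encode chunks in
# parallel, and verify by decoding back to the original payload.
from multiprocessing import Pool

BASE_FOR_BIT = {0: "A", 1: "T"}
BIT_FOR_BASE = {"A": 0, "T": 1}

def encode_chunk(data: bytes) -> str:
    """Convert every bit of the chunk (MSB first) into one DNA base."""
    return "".join(BASE_FOR_BIT[(byte >> shift) & 1]
                   for byte in data for shift in range(7, -1, -1))

def decode(dna: str) -> bytes:
    bits = [BIT_FOR_BASE[b] for b in dna]
    return bytes(sum(bit << shift for bit, shift in zip(bits[i:i + 8], range(7, -1, -1)))
                 for i in range(0, len(bits), 8))

if __name__ == "__main__":
    payload = b"remote sensing ontology " * 1000
    chunks = [payload[i:i + 4096] for i in range(0, len(payload), 4096)]
    with Pool(4) as pool:                              # parallel, chunk-wise encoding
        dna = "".join(pool.map(encode_chunk, chunks))
    assert decode(dna) == payload
    print(len(payload), "bytes ->", len(dna), "bases")
```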
Comparative Safety of Simultaneous and Staged Anterior and Posterior Spinal Surgery
Passias, Peter G.; Ma, Yan; Chiu, Ya Lin; Mazumdar, Madhu; Girardi, Federico P.; Memtsoudis, Stavros G.
2011-01-01
Study Design Analysis of population-based national hospital discharge data collected for the Nationwide Inpatient Sample. Objective To study perioperative outcomes of circumferential spine surgery performed on either the same or different days of the same hospitalization. Summary of Background Data Circumferential spine fusion surgery has been linked to an increased adjusted risk of perioperative morbidity and mortality compared to procedures involving only one site. In order to minimize these risks, some surgeons elect to perform the two components of this procedure in separate sessions during the same hospitalization. The value of this approach is uncertain. Methods Data collected between 1998 and 2006 for the Nationwide Inpatient Sample were analyzed. Hospitalizations during which a circumferential non-cervical spine fusion was performed were identified. Patients were divided into those who had the anterior and posterior portions performed on the same day and those who had them performed on different days of the same hospitalization. The prevalence of patient- and health-care-system-related demographics was evaluated. Frequencies of procedure-related complications and mortality were determined. Multivariate regression models were created to identify whether the timing of procedures was associated with an independent increase in risk for adverse events. Results We identified a total of 11,265 entries for circumferential spine fusion. Of those, 71.2% (8022) were operated on in one session. Complications were more frequent among staged versus same-day surgery patients (28.4% vs. 21.7%, P<0.0001). The incidence of venous thrombosis and ARDS was also increased among staged candidates, while the trend toward higher mortality (0.5% vs. 0.4%) did not reach significance. In the regression model, staged circumferential spine fusions were associated with a 29% increase in the odds of morbidity and mortality compared to same-day procedures. Conclusion Staging circumferential spine surgery procedures during the same hospitalization offers no mortality benefit, and may even expose patients to increased morbidity. PMID:21301391
Sensor data fusion for textured reconstruction and virtual representation of alpine scenes
NASA Astrophysics Data System (ADS)
Häufel, Gisela; Bulatov, Dimitri; Solbrig, Peter
2017-10-01
The concept of remote sensing is to provide information about a wide area without making physical contact with it. If, in addition to satellite imagery, images and videos taken by drones provide more up-to-date data at a higher resolution, or accurate vector data can be downloaded from the Internet, one speaks of sensor data fusion. The concept of sensor data fusion is relevant for many applications, such as virtual tourism, automatic navigation, hazard assessment, etc. In this work, we describe sensor data fusion aiming to create a semantic 3D model of an extremely interesting yet challenging dataset: an alpine region in Southern Germany. A particular challenge of this work is that rock faces including overhangs are present in the input airborne laser point cloud. The proposed procedure for identification and reconstruction of overhangs from point clouds comprises four steps: point cloud preparation, filtering out vegetation, mesh generation and texturing. Further object types are extracted in several interesting subsections of the dataset: building models with textures from UAV (Unmanned Aerial Vehicle) videos, hills reconstructed as generic surfaces and textured by the orthophoto, individual trees detected by the watershed algorithm, as well as vector data for roads retrieved from openly available shapefiles and GPS-device tracks. We pursue geo-specific reconstruction by assigning texture and width to roads of several pre-determined types and modeling isolated trees and rocks using commercial software. For visualization and simulation of the area, we have chosen the simulation system Virtual Battlespace 3 (VBS3). It becomes clear that the proposed concept of sensor data fusion allows a coarse reconstruction of a large scene and, at the same time, an accurate and up-to-date representation of its relevant subsections, in which simulation can take place.
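One of the extraction steps named above, individual tree detection with the watershed algorithm, can be illustrated with a common marker-controlled variant operating on a canopy height model; the sketch below is not necessarily the authors' implementation, and the array, threshold and window values are illustrative.

```python
# Minimal sketch of individual tree detection on a canopy height model (CHM)
# with a marker-controlled watershed; thresholds and data are placeholders.
import numpy as np
from scipy import ndimage as ndi
from skimage.feature import peak_local_max
from skimage.segmentation import watershed

def detect_trees(chm: np.ndarray, min_height: float = 2.0, min_distance: int = 3):
    """Segment individual tree crowns from a CHM (heights in metres above ground)."""
    canopy_mask = chm > min_height                       # ignore ground and low vegetation
    labeled_mask, _ = ndi.label(canopy_mask)
    peaks = peak_local_max(chm, min_distance=min_distance, labels=labeled_mask)
    markers = np.zeros(chm.shape, dtype=int)
    markers[tuple(peaks.T)] = np.arange(1, len(peaks) + 1)
    # Flood the inverted CHM so each crown grows downhill from its treetop marker.
    crowns = watershed(-chm, markers, mask=canopy_mask)
    return crowns, peaks

if __name__ == "__main__":
    chm = np.random.rand(200, 200) * 20.0                # placeholder CHM
    crowns, tops = detect_trees(chm)
    print(f"{tops.shape[0]} candidate tree tops, {crowns.max()} crown segments")
```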
A Smartphone-Based Driver Safety Monitoring System Using Data Fusion
Lee, Boon-Giin; Chung, Wan-Young
2012-01-01
This paper proposes a method for monitoring driver safety levels using a data fusion approach based on several discrete data types: eye features, bio-signal variation, in-vehicle temperature, and vehicle speed. The driver safety monitoring system was developed in practice in the form of an application for an Android-based smartphone device, where measuring safety-related data requires no extra monetary expenditure or equipment. Moreover, the system provides high resolution and flexibility. The safety monitoring process involves the fusion of attributes gathered from different sensors, including video, electrocardiography, photoplethysmography, temperature, and a three-axis accelerometer, that are assigned as input variables to an inference analysis framework. A Fuzzy Bayesian framework is designed to indicate the driver’s capability level and is updated continuously in real-time. The sensory data are transmitted via Bluetooth communication to the smartphone device. A fake incoming call warning service alerts the driver if his or her safety level is suspiciously compromised. Realistic testing of the system demonstrates the practical benefits of multiple features and their fusion in providing a more authentic and effective driver safety monitoring. PMID:23247416
Product development using process monitoring and NDE data fusion
NASA Astrophysics Data System (ADS)
Peterson, Todd; Bossi, Richard H.
1998-03-01
Composite process/product development relies on both process monitoring information and nondestructive evaluation measurements for determining application suitability. In the past, these activities have been performed and analyzed independently. Our present approach is to present the process monitoring and NDE data together in a data fusion workstation. This methodology leads to final product acceptance based on combined process monitoring and NDE criteria. The data fusion workstation combines process parameter and NDE data in a single workspace, enabling all the data to be used in the acceptance/rejection decision process. An example application is the induction welding process, a unique joining method for assembling primary composite structure, which offers significant cost and weight advantages over traditionally fastened structure. The determination of the required time, temperature and pressure conditions used in the process to achieve a complete weld is being aided by the use of ultrasonic inspection techniques. Full-waveform ultrasonic inspection data are employed to evaluate the quality of the spar cap to skin fit, an essential element of the welding process, and are processed to find a parameter that can be used for weld acceptance. Certification of the completed weld incorporates the data fusion methodology.
Generalized information fusion and visualization using spatial voting and data modeling
NASA Astrophysics Data System (ADS)
Jaenisch, Holger M.; Handley, James W.
2013-05-01
We present a novel and innovative information fusion and visualization framework for multi-source intelligence (multiINT) data using Spatial Voting (SV) and Data Modeling. We describe how different sources of information can be converted into numerical form for further processing downstream, followed by a short description of how this information can be fused using the SV grid. As an illustrative example, we show the modeling of cyberspace as cyber layers for the purpose of tracking cyber personas. Finally we describe a path ahead for creating interactive agile networks through defender customized Cyber-cubes for network configuration and attack visualization.
A Software Tool for Integrated Optical Design Analysis
NASA Technical Reports Server (NTRS)
Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)
2001-01-01
Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of meeting the performance requirements. The interactions between the disciplines become stronger as systems are designed to be lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.
A Decision Fusion Framework for Treatment Recommendation Systems.
Mei, Jing; Liu, Haifeng; Li, Xiang; Xie, Guotong; Yu, Yiqin
2015-01-01
Treatment recommendation is a nontrivial task--it requires not only domain knowledge from evidence-based medicine, but also data insights from descriptive, predictive and prescriptive analysis. A single treatment recommendation system is usually trained or modeled with a limited (in size or quality) source. This paper proposes a decision fusion framework, combining both knowledge-driven and data-driven decision engines for treatment recommendation. End users (e.g. using the clinician workstation or mobile apps) can have a comprehensive view of the various engines' opinions, as well as the final decision after fusion. For the implementation, we leverage several well-known fusion algorithms, such as decision templates and meta classifiers (e.g., logistic regression and SVM). Using an outcome-driven evaluation metric, we compare the fusion engine with the base engines, and our experimental results show that decision fusion is a promising way towards more valuable treatment recommendation.
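The meta-classifier style of fusion mentioned above can be illustrated with a small stacking sketch: two placeholder base engines emit class probabilities and a logistic-regression meta classifier fuses them. This is a generic illustration, not the authors' implementation; the engines, data and labels are synthetic.

```python
# Minimal decision-fusion sketch: a meta classifier (stacking) over the
# probability outputs of two base treatment-recommendation engines.
# Both engines and the data are synthetic placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 12))                    # patient features (placeholder)
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)     # treatment label (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Base engines: a simple linear engine and a data-driven ensemble engine.
engine_a = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
engine_b = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

def engine_opinions(X):
    """Stack each engine's class-probability vector side by side."""
    return np.hstack([engine_a.predict_proba(X), engine_b.predict_proba(X)])

# Meta classifier fuses the engines' opinions into a final recommendation.
# (Real stacking would fit the meta model on out-of-fold predictions to avoid leakage.)
meta = LogisticRegression(max_iter=1000).fit(engine_opinions(X_tr), y_tr)
print(f"fused decision accuracy: {meta.score(engine_opinions(X_te), y_te):.3f}")
```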
A Technical Analysis Information Fusion Approach for Stock Price Analysis and Modeling
NASA Astrophysics Data System (ADS)
Lahmiri, Salim
In this paper, we address the problem of technical analysis information fusion for improving stock market index-level prediction. We present an approach for analyzing stock market price behavior based on different categories of technical analysis metrics and a multiple-predictor system. Each category of technical analysis measures is used to characterize stock market price movements. The presented predictive system is based on an ensemble of neural networks (NN) coupled with particle swarm intelligence for parameter optimization, where each single neural network is trained with a specific category of technical analysis measures. The experimental evaluation on three international stock market indices and three individual stocks shows that the presented ensemble-based technical indicator fusion system significantly improves forecasting accuracy in comparison with a single NN. It also outperforms the classical neural network trained with index-level lagged values and the NN trained with stationary wavelet transform details and approximation coefficients. As a result, technical information fusion in an NN ensemble architecture helps improve prediction accuracy.
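A minimal sketch of the ensemble idea follows: one neural network per category of technical indicators, with the per-category forecasts averaged. The particle-swarm parameter optimization used in the paper is omitted, and the indicator blocks and target are synthetic placeholders.

```python
# Minimal sketch: one neural network per category of technical-analysis
# indicators, with the category forecasts averaged into an ensemble output.
# The PSO parameter optimization is omitted; all data are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
n_days = 400
categories = {                                        # placeholder indicator blocks
    "trend":      rng.normal(size=(n_days, 5)),
    "momentum":   rng.normal(size=(n_days, 4)),
    "volatility": rng.normal(size=(n_days, 3)),
}
next_day_return = rng.normal(size=n_days)             # placeholder target

train, test = slice(0, 300), slice(300, n_days)
models = {
    name: MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                       random_state=0).fit(block[train], next_day_return[train])
    for name, block in categories.items()
}

# Ensemble forecast: simple average of the per-category network outputs.
forecast = np.mean([m.predict(categories[name][test])
                    for name, m in models.items()], axis=0)
print("ensemble forecast (first 5 days):", np.round(forecast[:5], 3))
```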
Segmentation by fusion of histogram-based k-means clusters in different color spaces.
Mignotte, Max
2008-05-01
This paper presents a new, simple, and efficient segmentation approach, based on a fusion procedure which aims at combining several segmentation maps associated with simpler partition models in order to finally obtain a more reliable and accurate segmentation result. The different label fields to be fused in our application are given by the same simple (K-means based) clustering technique applied to an input image expressed in different color spaces. Our fusion strategy aims at combining these segmentation maps with a final clustering procedure using as input features the local histograms of the class labels, previously estimated and associated with each site for all these initial partitions. This fusion framework remains simple to implement, fast, and general enough to be applied to various computer vision applications (e.g., motion detection and segmentation), and has been successfully applied to the Berkeley image database. The experiments reported in this paper illustrate the potential of this approach compared to the state-of-the-art segmentation methods recently proposed in the literature.
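The fusion recipe can be sketched directly from the description above: k-means segmentations computed in several color spaces are fused by clustering the local histograms of their class labels. The color spaces, K and window size below are illustrative choices rather than the paper's settings.

```python
# Minimal sketch of the fusion recipe: k-means label maps from several color
# spaces are fused by a final k-means on local histograms of class labels.
import numpy as np
from scipy.ndimage import uniform_filter
from skimage import color
from sklearn.cluster import KMeans

def kmeans_labels(features: np.ndarray, k: int, shape) -> np.ndarray:
    return KMeans(n_clusters=k, n_init=4, random_state=0).fit_predict(features).reshape(shape)

def local_label_histograms(labels: np.ndarray, k: int, window: int = 7) -> np.ndarray:
    """Per-pixel fraction of each label inside a (window x window) neighbourhood."""
    return np.stack([uniform_filter((labels == c).astype(float), size=window)
                     for c in range(k)], axis=-1)

def fuse_segmentations(rgb: np.ndarray, k: int = 6, window: int = 7) -> np.ndarray:
    h, w, _ = rgb.shape
    spaces = [rgb, color.rgb2hsv(rgb), color.rgb2lab(rgb)]   # different color spaces
    features = []
    for img in spaces:
        labels = kmeans_labels(img.reshape(-1, 3), k, (h, w))
        features.append(local_label_histograms(labels, k, window))
    fused_features = np.concatenate(features, axis=-1).reshape(h * w, -1)
    # Final clustering on the concatenated label histograms gives the fused map.
    return kmeans_labels(fused_features, k, (h, w))

if __name__ == "__main__":
    rgb = np.random.rand(64, 64, 3)              # placeholder image in [0, 1]
    segmentation = fuse_segmentations(rgb)
    print("fused segment ids:", np.unique(segmentation))
```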
High-Energy Space Propulsion Based on Magnetized Target Fusion
NASA Technical Reports Server (NTRS)
Thio, Y. C. F.; Landrum, D. B.; Freeze, B.; Kirkpatrick, R. C.; Gerrish, H.; Schmidt, G. R.
1999-01-01
Magnetized target fusion is an approach in which a magnetized target plasma is compressed inertially by an imploding material wall. A high-energy plasma liner may be used to produce the required implosion. The plasma liner is formed by the merging of a number of high-momentum plasma jets converging towards the center of a sphere where two compact toroids have been introduced. Preliminary 3-D hydrodynamics modeling results using the SPHINX code of Los Alamos National Laboratory have been very encouraging and confirm earlier theoretical expectations. The concept appears ready for experimental exploration, and plans for doing so are being pursued. In this talk, we explore conceptually how this innovative fusion approach could be packaged for space propulsion for interplanetary travel. We discuss the generic components of a baseline propulsion concept, including the fusion engine, high-velocity plasma accelerators, generators of compact toroids using conical theta pinches, magnetic nozzle, neutron absorption blanket, tritium reprocessing system, shock absorber, magnetohydrodynamic generator, capacitor pulsed power system, thermal management system, and micrometeorite shields.
Minimum energy information fusion in sensor networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chapline, G
1999-05-11
In this paper we consider how to organize the sharing of information in a distributed network of sensors and data processors so as to provide explanations for sensor readings with minimal expenditure of energy. We point out that the Minimum Description Length principle provides an approach to information fusion that is more naturally suited to energy minimization than traditional Bayesian approaches. In addition, we show that for networks consisting of a large number of identical sensors, Kohonen self-organization provides an exact solution to the problem of combining the sensor outputs into minimal description length explanations.
Adaptive multisensor fusion for planetary exploration rovers
NASA Technical Reports Server (NTRS)
Collin, Marie-France; Kumar, Krishen; Pampagnin, Luc-Henri
1992-01-01
The purpose of the adaptive multisensor fusion system currently being designed at NASA/Johnson Space Center is to provide a robotic rover with assured vision and safe navigation capabilities during robotic missions on planetary surfaces. Our approach consists of using multispectral sensing devices ranging from visible to microwave wavelengths to fulfill the needs of perception for space robotics. Based on the illumination conditions and knowledge of the sensors' capabilities, the designed perception system should automatically select the best subset of sensors and the sensing modalities that will allow the perception and interpretation of the environment. Then, based on theoretical reflectance and emittance models, the sensor data are fused to extract the physical and geometrical surface properties of the environment: surface slope, dielectric constant, temperature and roughness. The theoretical concepts, the design and first results of the multisensor perception system are presented.
NASA Astrophysics Data System (ADS)
Pal, S. K.; Majumdar, T. J.; Bhattacharya, Amit K.
Fusion of optical and synthetic aperture radar data has been attempted in the present study for mapping of various lithologic units over a part of the Singhbhum Shear Zone (SSZ) and its surroundings. ERS-2 SAR data over the study area have been enhanced using a Fast Fourier Transformation (FFT) based filtering approach, and also using the Frost filtering technique. Both enhanced SAR images have then been separately fused with a histogram-equalized IRS-1C LISS III image using the Principal Component Analysis (PCA) technique. Later, the Feature-oriented Principal Components Selection (FPCS) technique has been applied to generate False Color Composite (FCC) images, from which corresponding geological maps have been prepared. Finally, GIS techniques have been successfully used for change detection analysis of the lithological interpretation between the published geological map and the fusion-based geological maps. In general, there is good agreement between these maps over a large portion of the study area. Based on the change detection studies, a few areas could be identified which need attention for further detailed ground-based geological studies.
Towards a Near Real-Time Satellite-Based Flux Monitoring System for the MENA Region
NASA Astrophysics Data System (ADS)
Ershadi, A.; Houborg, R.; McCabe, M. F.; Anderson, M. C.; Hain, C.
2013-12-01
Satellite remote sensing has the potential to offer spatially and temporally distributed information on land surface characteristics, which may be used as inputs and constraints for estimating land surface fluxes of carbon, water and energy. Enhanced satellite-based monitoring systems for aiding local water resource assessments and agricultural management activities are particularly needed for the Middle East and North Africa (MENA) region. The MENA region is an area characterized by limited fresh water resources, an often inefficient use of these, and relatively poor in-situ monitoring as a result of sparse meteorological observations. To address these issues, an integrated modeling approach for near real-time monitoring of land surface states and fluxes at fine spatio-temporal scales over the MENA region is presented. This approach is based on synergistic application of multiple sensors and wavebands in the visible to shortwave infrared and thermal infrared (TIR) domain. The multi-scale flux mapping and monitoring system uses the Atmosphere-Land Exchange Inverse (ALEXI) model and associated flux disaggregation scheme (DisALEXI), and the Spatial and Temporal Adaptive Reflectance Fusion Model (STARFM) in conjunction with model reanalysis data and multi-sensor remotely sensed data from polar orbiting (e.g. Landsat and MODerate resolution Imaging Spectroradiometer (MODIS)) and geostationary (MSG; Meteosat Second Generation) satellite platforms to facilitate time-continuous (i.e. daily) estimates of field-scale water, energy and carbon fluxes. Within this modeling system, TIR satellite data provide information about the sub-surface moisture status and plant stress, obviating the need for precipitation input and a detailed soil surface characterization (i.e. for prognostic modeling of soil transport processes). The STARFM fusion methodology blends aspects of high frequency (spatially coarse) and spatially fine resolution sensors and is applied directly to flux output fields to facilitate daily mapping of fluxes at sub-field scales. A complete processing infrastructure to automatically ingest and pre-process all required input data and to execute the integrated modeling system for near real-time agricultural monitoring purposes over targeted MENA sites is being developed, and initial results from this concerted effort will be discussed.
Research on precise modeling of buildings based on multi-source data fusion of air to ground
NASA Astrophysics Data System (ADS)
Li, Yongqiang; Niu, Lubiao; Yang, Shasha; Li, Lixue; Zhang, Xitong
2016-03-01
Aiming at the accuracy problem of precise modeling of buildings, a test study was conducted based on multi-source data for buildings in the same test area, including rooftop data from airborne LiDAR, aerial orthophotos, and façade data from vehicle-borne LiDAR. After accurately extracting the top and bottom outlines of building clusters, a series of qualitative and quantitative analyses was carried out on the 2D interval between the outlines. The results provide reliable accuracy support for precise modeling of buildings from air-to-ground multi-source data fusion and, at the same time, suggest solutions for key technical problems.
Propagation of nuclear data uncertainties for fusion power measurements
NASA Astrophysics Data System (ADS)
Sjöstrand, Henrik; Conroy, Sean; Helgesson, Petter; Hernandez, Solis Augusto; Koning, Arjan; Pomp, Stephan; Rochman, Dimitri
2017-09-01
Neutron measurements using neutron activation systems are an essential part of the diagnostic system at large fusion machines such as JET and ITER. Nuclear data is used to infer the neutron yield. Consequently, high-quality nuclear data is essential for the proper determination of the neutron yield and fusion power. However, uncertainties due to nuclear data are not fully taken into account in uncertainty analysis for neutron yield calibrations using activation foils. This paper investigates the neutron yield uncertainty due to nuclear data using the so-called Total Monte Carlo Method. The work is performed using a detailed MCNP model of the JET fusion machine; the uncertainties due to the cross-sections and angular distributions in JET structural materials, as well as the activation cross-sections in the activation foils, are analysed. It is found that a significant contribution to the neutron yield uncertainty can come from uncertainties in the nuclear data.
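The Total Monte Carlo idea can be sketched as repeating the same yield inference many times with nuclear data sampled from an assumed uncertainty distribution, and taking the spread of the results as the nuclear-data uncertainty; the toy activation formula and the uncertainty values below stand in for the detailed MCNP model and random nuclear data files used in the paper.

```python
# Minimal sketch of the Total Monte Carlo idea: repeat the same neutron-yield
# inference with randomly sampled nuclear data and take the spread of the
# results as the nuclear-data uncertainty. The activation model below is a
# toy stand-in for the detailed MCNP model; all values are placeholders.
import numpy as np

rng = np.random.default_rng(42)

measured_activity = 1.0e4      # counts/s from the activation foil (placeholder)
sigma_nominal = 0.11           # nominal activation cross section, barns (placeholder)
sigma_rel_unc = 0.05           # assumed 5% relative uncertainty on the cross section
transport_factor = 3.0e-10     # foil response per source neutron (placeholder)
transport_rel_unc = 0.03       # assumed uncertainty from structural-material data

def inferred_yield(sigma, transport):
    """Neutron yield that would reproduce the measured foil activity (toy model)."""
    return measured_activity / (sigma * transport)

samples = np.array([
    inferred_yield(rng.normal(sigma_nominal, sigma_rel_unc * sigma_nominal),
                   rng.normal(transport_factor, transport_rel_unc * transport_factor))
    for _ in range(5000)
])

print(f"neutron yield: {samples.mean():.3e} +/- {samples.std():.3e} "
      f"({100 * samples.std() / samples.mean():.1f}% from nuclear data)")
```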
Parkinson, Bonny; Goodall, Stephen; Thavaneswaran, Prema
2013-09-01
Lower back pain is a common and costly condition in Australia. This paper aims to conduct an economic evaluation of lumbar artificial intervertebral disc replacement (AIDR) compared with lumbar fusion for the treatment of patients suffering from significant axial back pain and/or radicular (nerve root) pain, secondary to disc degeneration or prolapse, who have failed conservative treatment. A cost-effectiveness approach was used to compare costs and benefits of AIDR to five fusion approaches. Resource use was based on Medicare Benefits Schedule claims data and expert opinion. Effectiveness and re-operation rates were based on published randomized controlled trials. The key clinical outcomes considered were narcotic medication discontinuation, achievement of overall clinical success, achievement of Oswestry Disability Index success and quality-adjusted life-years gained. AIDR was estimated to be cost-saving compared with fusion overall ($1600/patient); however, anterior lumbar interbody fusion and posterolateral fusion were less costly by $2155 and $807, respectively. The incremental cost-effectiveness depends on the outcome considered and the comparator. AIDR is potentially a cost-saving treatment for lumbar disc degeneration, although longer-term follow-up data are required to substantiate this claim. The incremental cost-effectiveness depends on the outcome considered and the comparator, and further research is required before any firm conclusions can be drawn. © 2012 The Authors. ANZ Journal of Surgery © 2012 Royal Australasian College of Surgeons.
TANDI: threat assessment of network data and information
NASA Astrophysics Data System (ADS)
Holsopple, Jared; Yang, Shanchieh Jay; Sudit, Moises
2006-04-01
Current practice for combating cyber attacks typically uses Intrusion Detection Sensors (IDSs) to passively detect and block multi-stage attacks. This work leverages Level-2 fusion that correlates IDS alerts belonging to the same attacker, and proposes a threat assessment algorithm to predict potential future attacker actions. The algorithm, TANDI, reduces the problem complexity by separating the models of the attacker's capability and opportunity, and fuses the two to determine the attacker's intent. Unlike traditional Bayesian-based approaches, which require assigning a large number of edge probabilities, the proposed Level-3 fusion procedure uses only 4 parameters. TANDI has been implemented and tested with randomly created attack sequences. The results demonstrate that TANDI predicts future attack actions accurately as long as the attack is not part of a coordinated attack and contains no insider threats. In the presence of abnormal attack events, TANDI will alert the network analyst for further analysis. The attempt to evaluate a threat assessment algorithm via simulation is the first in the literature, and should open up a new avenue in the area of high-level fusion.
NASA Astrophysics Data System (ADS)
Sah, Shagan
An increasingly important application of remote sensing is to provide decision support during emergency response and disaster management efforts. Land cover maps constitute one such useful application product during disaster events; if generated rapidly after any disaster, such map products can contribute to the efficacy of the response effort. In light of recent nuclear incidents, e.g., after the earthquake/tsunami in Japan (2011), our research focuses on constructing rapid and accurate land cover maps of the impacted area in case of an accidental nuclear release. The methodology involves integration of results from two different approaches, namely coarse spatial resolution multi-temporal and fine spatial resolution imagery, to increase classification accuracy. Although advanced methods have been developed for classification using high spatial or temporal resolution imagery, only a limited amount of work has been done on fusion of these two remote sensing approaches. The presented methodology thus involves integration of classification results from two different remote sensing modalities in order to improve classification accuracy. The data used included RapidEye and MODIS scenes over the Nine Mile Point Nuclear Power Station in Oswego (New York, USA). The first step in the process was the construction of land cover maps from freely available, high temporal resolution, low spatial resolution MODIS imagery using a time-series approach. We used the variability in the temporal signatures among different land cover classes for classification. The time series-specific features were defined by various physical properties of a pixel, such as variation in vegetation cover and water content over time. The pixels were classified into four land cover classes - forest, urban, water, and vegetation - using Euclidean and Mahalanobis distance metrics. On the other hand, a high spatial resolution commercial satellite, such as RapidEye, can be tasked to capture images over the affected area in the case of a nuclear event. This imagery served as a second source of data to augment results from the time series approach. The classifications from the two approaches were integrated using an a posteriori probability-based fusion approach. This was done by establishing a relationship between the classes, obtained after classification of the two data sources. Despite the coarse spatial resolution of MODIS pixels, acceptable accuracies were obtained using time series features. The overall accuracies using the fusion-based approach were in the neighborhood of 80%, when compared with GIS data sets from New York State. This fusion thus contributed to classification accuracy refinement, with a few additional advantages, such as correction for cloud cover and providing for an approach that is robust against point-in-time seasonal anomalies, due to the inclusion of multi-temporal data. We concluded that this approach is capable of generating land cover maps of acceptable accuracy and rapid turnaround, which in turn can yield reliable estimates of crop acreage of a region. The final algorithm is part of an automated software tool, which can be used by emergency response personnel to generate a nuclear ingestion pathway information product within a few hours of data collection.
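The a posteriori probability-based fusion step can be sketched with a simple product rule under a conditional-independence assumption and a shared class legend (the work itself establishes a more general relationship between the two class sets); the posteriors below are random placeholders.

```python
# Minimal sketch of a posteriori probability fusion of two per-pixel
# classifications (e.g. a coarse time-series classifier and a fine-resolution
# classifier), assuming a shared class legend and conditional independence.
import numpy as np

rng = np.random.default_rng(7)
n_pixels, n_classes = 6, 4                       # forest, urban, water, vegetation

def random_posteriors(n, k):
    p = rng.random((n, k))
    return p / p.sum(axis=1, keepdims=True)

post_timeseries = random_posteriors(n_pixels, n_classes)   # coarse-resolution source
post_highres = random_posteriors(n_pixels, n_classes)      # fine-resolution source
prior = np.full(n_classes, 1.0 / n_classes)                # uniform class prior

# Product rule: p(c|x1,x2) is proportional to p(c|x1) p(c|x2) / p(c) under independence.
fused = post_timeseries * post_highres / prior
fused /= fused.sum(axis=1, keepdims=True)

print("fused class per pixel:", fused.argmax(axis=1))
```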
Data Fusion of Gridded Snow Products Enhanced with Terrain Covariates and a Simple Snow Model
NASA Astrophysics Data System (ADS)
Snauffer, A. M.; Hsieh, W. W.; Cannon, A. J.
2017-12-01
Hydrologic planning requires accurate estimates of regional snow water equivalent (SWE), particularly in areas with hydrologic regimes dominated by spring melt. While numerous gridded data products provide such estimates, accurate representations are particularly challenging under conditions of mountainous terrain, heavy forest cover and large snow accumulations, contexts which in many ways define the province of British Columbia (BC), Canada. One promising avenue of improving SWE estimates is a data fusion approach which combines field observations with gridded SWE products and relevant covariates. A base artificial neural network (ANN) was constructed using three of the best performing gridded SWE products over BC (ERA-Interim/Land, MERRA and GLDAS-2) and simple location and time covariates. This base ANN was then enhanced to include terrain covariates (slope, aspect and Terrain Roughness Index, TRI) as well as a simple 1-layer energy balance snow model driven by gridded bias-corrected ANUSPLIN temperature and precipitation values. The ANN enhanced with all aforementioned covariates performed better than the base ANN, but most of the skill improvement was attributable to the snow model with very little contribution from the terrain covariates. The enhanced ANN improved station mean absolute error (MAE) by an average of 53% relative to the constituent gridded products over the province. The interannual peak SWE correlation coefficient was found to be 0.78, an improvement of 0.05 to 0.18 over the constituent products. This nonlinear approach outperformed a comparable multiple linear regression (MLR) model by 22% in MAE and 0.04 in interannual correlation. The enhanced ANN has also been shown to produce better estimates than the Variable Infiltration Capacity (VIC) hydrologic model calibrated and run for four BC watersheds, improving MAE by 22% and correlation by 0.05. The performance improvements of the enhanced ANN are statistically significant at the 5% level across the province and in four out of five physiographic regions.
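A minimal sketch of the base fusion ANN follows: station SWE is regressed on the three gridded products plus simple covariates with a multilayer perceptron; the terrain and snow-model covariates described above would simply enter as additional columns. All data here are synthetic placeholders.

```python
# Minimal sketch of the data-fusion ANN: gridded SWE products plus simple
# location/time covariates predict station SWE. Data are placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(3)
n_obs = 2000
X = np.column_stack([
    rng.gamma(2.0, 100.0, n_obs),     # ERA-Interim/Land SWE at the station (mm)
    rng.gamma(2.0, 100.0, n_obs),     # MERRA SWE (mm)
    rng.gamma(2.0, 100.0, n_obs),     # GLDAS-2 SWE (mm)
    rng.uniform(48, 60, n_obs),       # latitude
    rng.uniform(0, 3000, n_obs),      # elevation (m)
    rng.integers(1, 366, n_obs),      # day of year
])
y = 0.5 * X[:, 0] + 0.3 * X[:, 1] + 0.2 * X[:, 2] + rng.normal(0, 30, n_obs)  # station SWE

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
ann = make_pipeline(StandardScaler(),
                    MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=3000, random_state=0))
ann.fit(X_tr, y_tr)

print(f"fusion ANN MAE: {mean_absolute_error(y_te, ann.predict(X_te)):.1f} mm SWE")
```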
Characterizing the astrophysical S factor for 12C+12C fusion with wave-packet dynamics
NASA Astrophysics Data System (ADS)
Diaz-Torres, Alexis; Wiescher, Michael
2018-05-01
A quantitative study of the astrophysically important subbarrier fusion of 12C+12C is presented. Low-energy collisions are described in the body-fixed reference frame using wave-packet dynamics within a nuclear molecular picture. A collective Hamiltonian drives the time propagation of the wave packet through the collective potential-energy landscape. The fusion imaginary potential for specific dinuclear configurations is crucial for understanding the appearance of resonances in the fusion cross section. The theoretical subbarrier fusion cross sections explain some observed resonant structures in the astrophysical S factor. These cross sections monotonically decline towards stellar energies. The structures in the data that are not explained are possibly due to cluster effects in the nuclear molecule, which need to be included in the present approach.
Bougrini, Madiha; Tahri, Khalid; Haddi, Zouhair; El Bari, Nezha; Llobet, Eduard; Jaffrezic-Renault, Nicole; Bouchikhi, Benachir
2014-12-01
A combined approach based on a multisensor system to obtain additional chemical information from liquid samples through the analysis of the solution and its headspace is illustrated and discussed. In the present work, innovative analytical techniques, namely a hybrid e-nose and a voltammetric e-tongue, were developed to differentiate between different pasteurized milk brands and for the exact recognition of their storage days through data fusion of the combined system. Principal Component Analysis (PCA) showed an acceptable discrimination of the pasteurized milk brands on the first day of storage when the two instruments were used independently. In contrast, PCA indicated that no clear discrimination of storage days can be drawn when the two instruments are applied separately. A mid-level-of-abstraction data fusion approach demonstrated that the fused results outperformed the classification results of the e-nose and e-tongue taken individually. Furthermore, the Support Vector Machine (SVM) supervised method was applied to the new subset and confirmed that all storage days were correctly identified. This study can be generalized to several beverage and food products whose quality is based on the perception of odor and flavor. Copyright © 2014 Elsevier B.V. All rights reserved.
Tang, Yongchuan; Zhou, Deyun; Chan, Felix T S
2018-06-11
Quantification of the degree of uncertainty in the Dempster-Shafer evidence theory (DST) framework with belief entropy is still an open issue, and remains largely unexplored under the open-world assumption. Currently, the existing uncertainty measures in the DST framework are limited to the closed world, where the frame of discernment (FOD) is assumed to be complete. To address this issue, this paper focuses on extending a belief entropy to the open world by simultaneously considering the uncertain information represented by the FOD and the nonzero mass function of the empty set. An extension of Deng entropy to the open-world assumption (EDEOW) is proposed as a generalization of the Deng entropy, and it degenerates to the Deng entropy in the closed world whenever necessary. In order to test the reasonableness and effectiveness of the extended belief entropy, an EDEOW-based information fusion approach is proposed and applied to sensor data fusion under uncertainty. The experimental results verify the usefulness and applicability of the extended measure as well as the modified sensor data fusion method. In addition, a few open issues still remain in the current work: the necessary properties for a belief entropy under the open-world assumption, whether there exists a belief entropy that satisfies all the existing properties, and what the most appropriate fusion frame is for sensor data fusion under uncertainty.
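For orientation, the closed-world Deng entropy that EDEOW generalizes can be computed as below; the open-world extension additionally involves the mass of the empty set and the FOD size, and its exact form is given in the paper, so only the closed-world measure is sketched here.

```python
# Minimal sketch of the closed-world Deng entropy that EDEOW generalizes:
# E_d(m) = -sum_A m(A) * log2( m(A) / (2^|A| - 1) ), summed over focal
# elements A. The open-world extension in the paper also uses m(empty set).
from math import log2

def deng_entropy(mass: dict) -> float:
    """mass maps focal elements (frozensets of hypotheses) to their basic probability assignment."""
    total = 0.0
    for focal, m in mass.items():
        if m > 0 and len(focal) > 0:
            total -= m * log2(m / (2 ** len(focal) - 1))
    return total

if __name__ == "__main__":
    # Example BPA over the frame {a, b, c}: mass on singletons and on the full set.
    bpa = {
        frozenset({"a"}): 0.4,
        frozenset({"b"}): 0.3,
        frozenset({"a", "b", "c"}): 0.3,
    }
    print(f"Deng entropy: {deng_entropy(bpa):.4f} bits")
```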
Ran, Changyan; Cheng, Xianghong
2016-01-01
This paper presents a direct and non-singular approach based on an unscented Kalman filter (UKF) for the integration of strapdown inertial navigation systems (SINSs) with the aid of velocity. The state vector includes velocity and Euler angles, and the system model contains the Euler angle kinematics equations. The measured velocity in the body frame is used as the filter measurement. The quaternion nonlinear equality constraint is eliminated, and the cross-noise problem is overcome. The filter model is simple and easy to apply without linearization. Data fusion is performed by a UKF, which directly estimates and outputs the navigation information. There is no need to process navigation computation and error correction separately, because the navigation computation is completed synchronously during the filter time update. In addition, the singularities are avoided with the help of the dual-Euler method. The performance of the proposed approach is verified by road test data from a land vehicle equipped with an odometer-aided SINS, and a singularity turntable test is conducted using three-axis turntable test data. The results show that the proposed approach can achieve higher navigation accuracy than the commonly used indirect approach, and the singularities can be efficiently removed as a result of the dual-Euler method. PMID:27598169
Debus, Bruno; Orio, Maylis; Rehault, Julien; Burdzinski, Gotard; Ruckebusch, Cyril; Sliwa, Michel
2017-08-03
Ultrafast photoisomerization reactions generally start at a higher excited state with excess of internal vibrational energy and occur via conical intersections. This leads to ultrafast dynamics which are difficult to investigate with a single transient absorption spectroscopy technique, be it in the ultraviolet-visible (UV-vis) or infrared (IR) domain. On one hand, the information available in the UV-vis domain is limited as only slight spectral changes are observed for different isomers. On the other hand, the interpretation of vibrational spectra is strongly hindered by intramolecular relaxation and vibrational cooling. These limitations can be circumvented by fusing UV-vis and IR transient absorption spectroscopy data in a multiset multivariate curve resolution analysis. We apply this approach to describe the spectrodynamics of the ultrafast cis-trans photoisomerization around the C-N double bond observed for aromatic Schiff bases. Twisted intermediate states could be elucidated, and isomerization was shown to occur through a continuous complete rotation. More broadly, data fusion can be used to rationalize a vast range of ultrafast photoisomerization processes of interest in photochemistry.
Joint modality fusion and temporal context exploitation for semantic video analysis
NASA Astrophysics Data System (ADS)
Papadopoulos, Georgios Th; Mezaris, Vasileios; Kompatsiaris, Ioannis; Strintzis, Michael G.
2011-12-01
In this paper, a multi-modal context-aware approach to semantic video analysis is presented. Overall, the examined video sequence is initially segmented into shots and for every resulting shot appropriate color, motion and audio features are extracted. Then, Hidden Markov Models (HMMs) are employed for performing an initial association of each shot with the semantic classes that are of interest separately for each modality. Subsequently, a graphical modeling-based approach is proposed for jointly performing modality fusion and temporal context exploitation. Novelties of this work include the combined use of contextual information and multi-modal fusion, and the development of a new representation for providing motion distribution information to HMMs. Specifically, an integrated Bayesian Network is introduced for simultaneously performing information fusion of the individual modality analysis results and exploitation of temporal context, contrary to the usual practice of performing each task separately. Contextual information is in the form of temporal relations among the supported classes. Additionally, a new computationally efficient method for providing motion energy distribution-related information to HMMs, which supports the incorporation of motion characteristics from previous frames to the currently examined one, is presented. The final outcome of this overall video analysis framework is the association of a semantic class with every shot. Experimental results as well as comparative evaluation from the application of the proposed approach to four datasets belonging to the domains of tennis, news and volleyball broadcast video are presented.
NASA Astrophysics Data System (ADS)
Benhalouche, Fatima Zohra; Karoui, Moussa Sofiane; Deville, Yannick; Ouamri, Abdelaziz
2017-04-01
This paper proposes three multisharpening approaches to enhance the spatial resolution of urban hyperspectral remote sensing images. These approaches, related to linear-quadratic spectral unmixing techniques, use a linear-quadratic nonnegative matrix factorization (NMF) multiplicative algorithm. These methods begin by unmixing the observable high-spectral/low-spatial resolution hyperspectral and high-spatial/low-spectral resolution multispectral images. The obtained high-spectral/high-spatial resolution features are then recombined, according to the linear-quadratic mixing model, to obtain an unobservable multisharpened high-spectral/high-spatial resolution hyperspectral image. In the first designed approach, hyperspectral and multispectral variables are independently optimized, once they have been coherently initialized. These variables are alternately updated in the second designed approach. In the third approach, the considered hyperspectral and multispectral variables are jointly updated. Experiments, using synthetic and real data, are conducted to assess the efficiency, in spatial and spectral domains, of the designed approaches and of linear NMF-based approaches from the literature. Experimental results show that the designed methods globally yield very satisfactory spectral and spatial fidelities for the multisharpened hyperspectral data. They also prove that these methods significantly outperform the used literature approaches.
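The designed algorithms are linear-quadratic NMF methods; as background, the following is a minimal sketch of the standard multiplicative updates for plain linear NMF that such methods build on, with placeholder data.

```python
# Minimal sketch of multiplicative NMF updates for the plain linear mixing
# model X ~ W H (not the paper's linear-quadratic extension); data are placeholders.
import numpy as np

def nmf_multiplicative(X, rank, n_iter=200, eps=1e-9, seed=0):
    """Factor a nonnegative matrix X (pixels x bands) into W (abundances) and H (endmembers)."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)      # update endmember spectra
        W *= (X @ H.T) / (W @ H @ H.T + eps)      # update abundances
    return W, H

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.random((500, 50))                     # placeholder hyperspectral pixels x bands
    W, H = nmf_multiplicative(X, rank=4)
    print("relative reconstruction error:", np.linalg.norm(X - W @ H) / np.linalg.norm(X))
```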
Nunes, Karen M; Andrade, Marcus Vinícius O; Santos Filho, Antônio M P; Lasmar, Marcelo C; Sena, Marcelo M
2016-08-15
Concerns about meat authenticity have increased recently, due to major fraud scandals. This paper analysed real samples (43 adulterated and 12 controls) originating from criminal networks dismantled by the Brazilian Police. This fraud consisted of injecting solutions of non-meat ingredients (NaCl, phosphates, carrageenan, maltodextrin) into bovine meat, aiming to increase its water holding capacity. Five physico-chemical variables were determined: protein, ash, chloride, sodium, and phosphate. Additionally, infrared spectra were recorded. Supervised classification PLS-DA models were built with each data set individually, but the best model was obtained with data fusion, correctly detecting 91% of the adulterated samples. From this model, a variable selection based on the highest VIP scores was performed and a new data fusion model was built with only one chemical variable, providing slightly lower predictions but a good cost/performance ratio. Finally, some of the selected infrared bands were specifically associated with the presence of the adulterants NaCl, tripolyphosphate and carrageenan. Copyright © 2016 Elsevier Ltd. All rights reserved.
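The low-level fusion step can be sketched as autoscaling and concatenating the physico-chemical block and the spectral block before fitting a PLS-DA model (PLS regression against a 0/1 class code); the VIP-score variable selection reported above is omitted, and all data are synthetic placeholders.

```python
# Minimal sketch of low-level data fusion with PLS-DA: the physico-chemical
# block and the IR-spectral block are autoscaled, concatenated, and modelled
# with PLS regression against a 0/1 class code (control / adulterated).
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n_samples = 55
chem = rng.normal(size=(n_samples, 5))            # protein, ash, chloride, sodium, phosphate
spectra = rng.normal(size=(n_samples, 300))       # placeholder IR absorbances
y = (rng.random(n_samples) < 0.8).astype(float)   # 1 = adulterated, 0 = control

# Low-level fusion: autoscale each block, then concatenate.
fused = np.hstack([StandardScaler().fit_transform(chem),
                   StandardScaler().fit_transform(spectra)])

X_tr, X_te, y_tr, y_te = train_test_split(fused, y, test_size=0.3, random_state=0)
plsda = PLSRegression(n_components=5).fit(X_tr, y_tr)
y_hat = (plsda.predict(X_te).ravel() > 0.5).astype(float)   # class by 0.5 threshold
print(f"correctly classified: {np.mean(y_hat == y_te):.2%}")
```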
Zwolak, Pawel; Farei-Campagna, Jan; Jentzsch, Thorsten; von Rechenberg, Brigitte; Werner, Clément M
2018-01-01
Posterolateral spinal fusion is a common orthopaedic surgery performed to treat degenerative and traumatic deformities of the spinal column. In posterolateral spinal fusion, different osteoinductive demineralized bone matrix products have been previously investigated. We evaluated the effect of locally applied zoledronic acid in combination with commercially available demineralized bone matrix putty on new bone formation in posterolateral spinal fusion in a murine in vivo model. A posterolateral sacral spine fusion in a murine model was used to evaluate new bone formation. We used the sacral spine fusion model to represent the clinical situation in which a bone graft or demineralized bone matrix is applied after dorsal instrumentation of the spine. In our study, group 1 received decortication only (n = 10), group 2 received decortication and an absorbable collagen sponge carrier, group 3 received decortication and an absorbable collagen sponge carrier with zoledronic acid at a dose of 10 µg, group 4 received demineralized bone matrix putty (DBM putty) plus decortication (n = 10), and group 5 received DBM putty, decortication and locally applied zoledronic acid at a dose of 10 µg. Imaging was performed using MicroCT to assess new bone formation. In addition, murine spines were harvested for histopathological analysis 10 weeks after surgery. The surgery, performed through a midline posterior approach, was reproducible. In the group with decortication alone there was no new bone formation. Application of demineralized bone matrix putty alone produced new bone formation that bridged the S1-S4 laminae. Local application of zoledronic acid with demineralized bone matrix putty resulted in a significant increase in new bone formation compared with the demineralized bone matrix putty group alone. A single local application of zoledronic acid with DBM putty during posterolateral fusion in a sacral murine spine model significantly increased new bone formation in situ in our model. Therefore, our results justify further investigations into the local application of zoledronic acid in future clinical studies.
NASA Astrophysics Data System (ADS)
Yao, Sen; Li, Tao; Li, JieQing; Liu, HongGao; Wang, YuanZhong
2018-06-01
Boletus griseus and Boletus edulis are two well-known wild-grown edible mushrooms distributed in Yunnan Province, which have high nutritional value, delicious flavor and high economic value. In this study, a rapid method using Fourier transform infrared (FT-IR) and ultraviolet (UV) spectroscopies coupled with data fusion was established for the discrimination of Boletus mushrooms from seven different geographical origins with pattern recognition methods. Initially, the spectra of 332 mushroom samples obtained from the two spectroscopic techniques were analyzed individually, and then the classification performance based on a data fusion strategy was investigated. Meanwhile, the latent variables (LVs) of the FT-IR and UV spectra were extracted by partial least squares discriminant analysis (PLS-DA) and the two datasets were concatenated into a new matrix for data fusion. Then, the fused matrix was further analyzed by a support vector machine (SVM). Compared with a single spectroscopic technique, the data fusion strategy can improve the classification performance effectively. In particular, the accuracies of correct classification of the SVM model in the training and test sets were 99.10% and 100.00%, respectively. The results demonstrated that data fusion of FT-IR and UV spectra can provide a stronger synergistic effect for the discrimination of different geographical origins of Boletus mushrooms, which may be beneficial for further authentication and quality assessment of edible mushrooms.
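The stated pipeline, per-block latent variables from PLS-DA concatenated into a fused matrix and classified by SVM, can be sketched as follows; the data, the number of latent variables and the SVM settings are placeholders, and a proper evaluation would fit the PLS extraction on training samples only.

```python
# Minimal sketch of mid-level fusion: PLS latent variables are extracted from
# the FT-IR and UV blocks, concatenated, and classified with an SVM.
# All data and settings are placeholders.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)
n_samples, n_origins = 332, 7
ftir = rng.normal(size=(n_samples, 600))              # placeholder FT-IR spectra
uv = rng.normal(size=(n_samples, 200))                # placeholder UV spectra
origin = rng.integers(0, n_origins, n_samples)        # geographical origin labels

def latent_variables(block, y, n_lv=8):
    """PLS-DA latent variables (X scores) of one spectral block."""
    Xs = StandardScaler().fit_transform(block)
    Y = np.eye(n_origins)[y]                          # dummy-coded classes for PLS-DA
    pls = PLSRegression(n_components=n_lv).fit(Xs, Y)
    return pls.transform(Xs)

# Concatenate the per-block latent variables into the fused matrix.
fused = np.hstack([latent_variables(ftir, origin), latent_variables(uv, origin)])
X_tr, X_te, y_tr, y_te = train_test_split(fused, origin, test_size=0.3, random_state=0)
svm = SVC(kernel="rbf", C=10.0).fit(X_tr, y_tr)
print(f"test-set accuracy: {svm.score(X_te, y_te):.2%}")
```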
NASA Astrophysics Data System (ADS)
Panagiotopoulou, Antigoni; Bratsolis, Emmanuel; Charou, Eleni; Perantonis, Stavros
2017-10-01
The detailed three-dimensional modeling of buildings utilizing elevation data, such as those provided by light detection and ranging (LiDAR) airborne scanners, is increasingly demanded today. There are certain application requirements and available datasets to which any research effort has to be adapted. Our dataset includes aerial orthophotos, with a spatial resolution of 20 cm, and a digital surface model generated from LiDAR, with a spatial resolution of 1 m and an elevation resolution of 20 cm, from an area of Athens, Greece. The aerial images are fused with LiDAR, and we classify these data with a multilayer feedforward neural network for building block extraction. The innovation of our approach lies in the preprocessing step, in which the original LiDAR data are super-resolution (SR) reconstructed by means of a stochastic regularized technique before their fusion with the aerial images takes place. The Lorentzian estimator combined with bilateral total variation regularization performs the SR reconstruction. We evaluate the performance of our approach against that of fusing unprocessed LiDAR data with aerial images. We present the classified images and the statistical measures: confusion matrix, kappa coefficient, and overall accuracy. The results demonstrate that our approach outperforms that of fusing unprocessed LiDAR data with aerial images.
Dichromatic State Sum Models for Four-Manifolds from Pivotal Functors
NASA Astrophysics Data System (ADS)
Bärenz, Manuel; Barrett, John
2017-11-01
A family of invariants of smooth, oriented four-dimensional manifolds is defined via handle decompositions and the Kirby calculus of framed link diagrams. The invariants are parametrised by a pivotal functor from a spherical fusion category into a ribbon fusion category. A state sum formula for the invariant is constructed via the chain-mail procedure, so a large class of topological state sum models can be expressed as link invariants. Most prominently, the Crane-Yetter state sum over an arbitrary ribbon fusion category is recovered, including the nonmodular case. It is shown that the Crane-Yetter invariant for nonmodular categories is stronger than signature and Euler invariant. A special case is the four-dimensional untwisted Dijkgraaf-Witten model. Derivations of state space dimensions of TQFTs arising from the state sum model agree with recent calculations of ground state degeneracies in Walker-Wang models. Relations to different approaches to quantum gravity such as Cartan geometry and teleparallel gravity are also discussed.
Experimental plasma research project summaries
NASA Astrophysics Data System (ADS)
1992-06-01
This is the latest in a series of Project Summary books that date back to 1976. It is the first after a hiatus of several years. They are published to provide a short description of each project supported by the Experimental Plasma Research Branch of the Division of Applied Plasma Physics in the Office of Fusion Energy. The Experimental Plasma Research Branch seeks to provide a broad range of experimental data, physics understanding, and new experimental techniques that contribute to operation, interpretation, and improvement of high temperature plasma as a source of fusion energy. In pursuit of these objectives, the branch supports research at universities, DOE laboratories, other federal laboratories, and industry. About 70 percent of the funds expended are spent at universities and a significant function of this program is the training of students in fusion physics. The branch supports small- and medium-scale experimental studies directly related to specific critical plasma issues of the magnetic fusion program. Plasma physics experiments are conducted on transport of particles and energy within plasma. Additionally, innovative approaches for operating, controlling, and heating plasma are evaluated for application to the larger confinement devices of the magnetic fusion program. New diagnostic approaches to measuring the properties of high temperature plasmas are developed to the point where they can be applied with confidence on the large-scale confinement experiments. Atomic data necessary for impurity control, interpretation of diagnostic data, development of heating devices, and analysis of cooling by impurity ion radiation are obtained. The project summaries are grouped into the three categories of plasma physics, diagnostic development, and atomic physics.
NASA Astrophysics Data System (ADS)
Bonne, François; Alamir, Mazen; Bonnay, Patrick
2014-01-01
In this paper, a physical method to obtain control-oriented dynamical models of large scale cryogenic refrigerators is proposed, in order to synthesize model-based advanced control schemes. These schemes aim to replace classical approaches designed from user experience, usually based on many independent PI controllers. This is particularly useful in the case where cryoplants are subjected to large pulsed thermal loads, expected to take place in the cryogenic cooling systems of future fusion reactors such as the International Thermonuclear Experimental Reactor (ITER) or the Japan Torus-60 Super Advanced Fusion Experiment (JT-60SA). Advanced control schemes lead to better perturbation immunity and rejection, offering safer utilization of cryoplants. The paper gives details on how basic components used in the field of large scale helium refrigeration (especially those present on the 400W @1.8K helium test facility at CEA-Grenoble) are modeled and assembled to obtain the complete dynamic description of controllable subsystems of the refrigerator (the controllable subsystems are namely the Joule-Thompson Cycle, the Brayton Cycle, the Liquid Nitrogen Precooling Unit and the Warm Compression Station). The complete 400W @1.8K (in the 400W @4.4K configuration) helium test facility model is then validated against experimental data, and the optimal control of both the Joule-Thompson valve and the turbine valve is proposed, to stabilize the plant under highly variable thermal loads. This work is partially supported through the European Fusion Development Agreement (EFDA) Goal Oriented Training Program, task agreement WP10-GOT-GIRO.
A Self-Synthesis Approach to Perceptual Learning for Multisensory Fusion in Robotics
Axenie, Cristian; Richter, Christoph; Conradt, Jörg
2016-01-01
Biological and technical systems operate in a rich multimodal environment. Due to the diversity of incoming sensory streams a system perceives and the variety of motor capabilities a system exhibits, there is no single representation and no singular unambiguous interpretation of such a complex scene. In this work we propose a novel sensory processing architecture, inspired by the distributed macro-architecture of the mammalian cortex. The underlying computation is performed by a network of computational maps, each representing a different sensory quantity. All the different sensory streams enter the system through multiple parallel channels. The system autonomously associates and combines them into a coherent representation, given incoming observations. These processes are adaptive and involve learning. The proposed framework introduces mechanisms for self-creation and learning of the functional relations between the computational maps, encoding sensorimotor streams, directly from the data. Its intrinsic scalability, parallelisation, and automatic adaptation to unforeseen sensory perturbations make our approach a promising candidate for robust multisensory fusion in robotic systems. We demonstrate this by applying our model to 3D motion estimation on a quadrotor. PMID:27775621
NASA Astrophysics Data System (ADS)
Jourde, K.; Gibert, D.; Marteau, J.
2015-08-01
This paper examines how the resolution of small-scale geological density models is improved through the fusion of information provided by gravity measurements and density muon radiographies. Muon radiography aims at determining the density of geological bodies by measuring their screening effect on the natural flux of cosmic muons. Muon radiography essentially works like a medical X-ray scan and integrates density information along elongated narrow conical volumes. Gravity measurements are linked to density by a 3-D integration encompassing the whole studied domain. We establish the mathematical expressions of these integration formulas - called acquisition kernels - and derive the resolving kernels that are spatial filters relating the true unknown density structure to the density distribution actually recovered from the available data. The resolving kernel approach allows one to quantitatively describe the improvement of the resolution of the density models achieved by merging gravity data and muon radiographies. The method developed in this paper may be used to optimally design the geometry of the field measurements to be performed in order to obtain a given spatial resolution pattern of the density model to be constructed. The resolving kernels derived in the joint muon-gravimetry case indicate that gravity data are almost useless for constraining the density structure in regions sampled by more than two muon tomography acquisitions. Interestingly, the resolution in deeper regions not sampled by muon tomography is significantly improved by joining the two techniques. The method is illustrated with examples for the La Soufrière volcano of Guadeloupe.
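For reference, the resolving-kernel relation described above can be written in generic linear inverse-theory form, here with a damped least-squares estimator; this is a standard formulation and does not reproduce the paper's specific acquisition kernels.

```latex
% Generic notation: data d (gravity values and muon attenuation line integrals
% stacked) relate to the density model m through acquisition kernels G.
\[
  d = G\,m + n, \qquad
  \hat{m} = A\,d = \underbrace{A\,G}_{R}\,m + A\,n, \qquad
  A = \left(G^{\mathsf T}G + \epsilon I\right)^{-1}G^{\mathsf T}.
\]
% R = AG is the resolving kernel: each row of R is the spatial filter through
% which the recovered density sees the true density. Joint inversion stacks the
% gravity and muon rows in G, which sharpens R where the datasets complement
% each other.
```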
NASA Technical Reports Server (NTRS)
Montesano, P. M.; Cook, B. D.; Sun, G.; Simard, M.; Zhang, Z.; Nelson, R. F.; Ranson, K. J.; Lutchke, S.; Blair, J. B.
2012-01-01
The synergistic use of active and passive remote sensing (i.e., data fusion) demonstrates the ability of spaceborne light detection and ranging (LiDAR), synthetic aperture radar (SAR) and multispectral imagery to achieve the accuracy requirements of a global forest biomass mapping mission. This data fusion approach also provides a means to extend 3D information from discrete spaceborne LiDAR measurements of forest structure across scales much larger than that of the LiDAR footprint. For estimating biomass, these measurements mix a number of errors, including those associated with LiDAR footprint sampling over regional to global extents. A general framework for mapping above ground live forest biomass (AGB) with a data fusion approach is presented and verified using data from NASA field campaigns near Howland, ME, USA, to assess AGB and LiDAR sampling errors across a regionally representative landscape. We combined SAR and Landsat-derived optical (passive optical) image data to identify forest patches, and used image and simulated spaceborne LiDAR data to compute AGB and estimate LiDAR sampling error for forest patches and 100m, 250m, 500m, and 1km grid cells. Forest patches were delineated with Landsat-derived data and airborne SAR imagery, and simulated spaceborne LiDAR (SSL) data were derived from orbit and cloud cover simulations and airborne data from NASA's Laser Vegetation Imaging Sensor (LVIS). At both the patch and grid scales, we evaluated differences in AGB estimation and sampling error from the combined use of LiDAR with both SAR and passive optical and with either SAR or passive optical alone. This data fusion approach demonstrates that incorporating forest patches into the AGB mapping framework can provide sub-grid forest information for coarser grid-level AGB reporting, and that combining simulated spaceborne LiDAR with SAR and passive optical data is most useful for estimating AGB when measurements from LiDAR are limited, because it reduced forest AGB sampling errors by 15-38%. Furthermore, spaceborne global scale accuracy requirements were achieved. At least 80% of the grid cells at the 100m, 250m, 500m, and 1km grid levels met AGB density accuracy requirements using a combination of passive optical and SAR along with machine learning methods to predict vegetation structure metrics for forested areas without LiDAR samples. Finally, using either passive optical or SAR alone, accuracy requirements were met at the 500m and 250m grid levels, respectively.
A Coalition Approach to Higher-Level Fusion
2009-07-01
environment and as a simulation which operates without any human interaction. In a typical wargame, WISE might portray a complex warfighting... the environment, while higher-level fusion is about establishing the story behind the data. Thus, achieving situational awareness for the users of... It is important to manage the relationship between the users and the virtual adviser so that context can be conveyed without disrupting the users
Forest height Mapping using the fusion of Lidar and MULTI-ANGLE spectral data
NASA Astrophysics Data System (ADS)
Pang, Y.; Li, Z.
2016-12-01
Characterizing forest ecosystem structure over large areas is highly complex. Light detection and ranging (LiDAR) approaches have demonstrated a high capacity to accurately estimate forest structural parameters. A number of satellite mission concepts have been proposed to fuse LiDAR with other optical imagery, allowing multi-angle spectral observations to be captured using the bidirectional reflectance distribution function (BRDF) characteristics of forests. China is developing the concept of a Terrestrial Carbon Mapping Satellite, with a multi-beam waveform LiDAR as the main sensor and a multi-angle imaging system as the spatial mapping sensor. In this study, we explore the potential of fusing LiDAR and multi-angle spectral data to estimate forest height across different scales. We acquired intensive airborne LiDAR and multi-angle hyperspectral data over the Genhe Forest Ecological Research Station, Northeast China, and then extended the spatial scale with several long transect flights to cover more forest structures. Forest height derived from the airborne LiDAR data was used as reference data, and the multi-angle hyperspectral data were used as model inputs. Our results demonstrate that the multi-angle spectral data can be used to estimate forest height with an RMSE of 1.1 m and an R2 of approximately 0.8.
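To make the regression step concrete, here is a minimal sketch of an empirical height model of this kind, assuming the multi-angle band values have already been extracted per pixel and LiDAR-derived heights serve as the reference; the file names and the choice of regressor are illustrative, not the authors' actual workflow.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, r2_score

# X: per-pixel multi-angle spectral features (n_samples x n_features), hypothetical file
# y: LiDAR-derived canopy height in meters, used as reference data
X = np.load("multiangle_features.npy")
y = np.load("lidar_heights.npy")

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_tr, y_tr)

pred = model.predict(X_te)
rmse = mean_squared_error(y_te, pred) ** 0.5
print(f"RMSE = {rmse:.2f} m, R^2 = {r2_score(y_te, pred):.2f}")
```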
Lin, Chun-Yuan; Wang, Yen-Ling
2014-01-01
Checkpoint kinase 2 (Chk2) plays an important role in the DNA-damage response, particularly to DNA double-strand breaks and related lesions. In this study, we concentrate on Chk2, with the aim of finding potential inhibitors using pharmacophore hypotheses (PhModels), combinatorial fusion, and virtual screening techniques. Applying combinatorial fusion to PhModels and virtual screening is a novel strategy for drug design. We used combinatorial fusion to analyze the prediction results and obtained the best correlation coefficient for the testing set (r_test = 0.816) by combining the Best(train)Best(test) and Fast(train)Fast(test) prediction results. Potential inhibitors were selected from the NCI database by screening according to the Best(train)Best(test) + Fast(train)Fast(test) prediction results and molecular docking with the CDOCKER docking program. The selected compounds show high interaction energies between ligand and receptor. Through these approaches, 23 potential inhibitors of Chk2 were retrieved for further study.
Fusion of magnetometer and gradiometer sensors of MEG in the presence of multiplicative error.
Mohseni, Hamid R; Woolrich, Mark W; Kringelbach, Morten L; Luckhoo, Henry; Smith, Penny Probert; Aziz, Tipu Z
2012-07-01
Novel neuroimaging techniques have provided unprecedented information on the structure and function of the living human brain. Multimodal fusion of data from different sensors promises to radically improve this understanding, yet optimal methods have not been developed. Here, we demonstrate a novel method for combining multichannel signals. We show how this method can be used to fuse signals from the magnetometer and gradiometer sensors used in magnetoencephalography (MEG), and through extensive experiments using simulation, head phantom and real MEG data, show that it is both robust and accurate. This new approach works by assuming that the lead fields have multiplicative error. The criterion to estimate the error is given within a spatial filter framework such that the estimated power is minimized in the worst case scenario. The method is compared to, and found better than, existing approaches. The closed-form solution and the conditions under which the multiplicative error can be optimally estimated are provided. This novel approach can also be employed for multimodal fusion of other multichannel signals such as MEG and EEG. Although the multiplicative error is estimated based on beamforming, other methods for source analysis can equally be used after the lead-field modification.
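For orientation, a minimal sketch of the spatial-filter (beamformer) framework that the method above builds on is given below: a standard linearly constrained minimum-variance filter, without the paper's multiplicative-error correction (which would rescale the lead field before this step). Array shapes and the regularization choice are illustrative.

```python
import numpy as np

def lcmv_weights(C, leadfield, reg=1e-10):
    """Standard LCMV beamformer weights for one source location.

    C         : (n_channels, n_channels) sensor covariance, with magnetometer and
                gradiometer channels stacked after appropriate scaling.
    leadfield : (n_channels,) forward field of the source; the paper's method would
                first correct this vector for the assumed multiplicative error.
    """
    n = C.shape[0]
    Ci = np.linalg.pinv(C + reg * np.trace(C) / n * np.eye(n))
    return Ci @ leadfield / (leadfield @ Ci @ leadfield)

# Estimated source power is w @ C @ w; the full method chooses the error scaling
# so that this power is minimized in the worst case.
```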
NASA Astrophysics Data System (ADS)
Saadeddin, Kamal; Abdel-Hafez, Mamoun F.; Jaradat, Mohammad A.; Jarrah, Mohammad Amin
2013-12-01
In this paper, a low-cost navigation system that fuses the measurements of the inertial navigation system (INS) and the global positioning system (GPS) receiver is developed. First, the system's dynamics are obtained based on a vehicle's kinematic model. Second, the INS and GPS measurements are fused using an extended Kalman filter (EKF) approach. Subsequently, an artificial intelligence based approach for the fusion of INS/GPS measurements is developed based on an Input-Delayed Adaptive Neuro-Fuzzy Inference System (IDANFIS). Experimental tests are conducted to demonstrate the performance of the two sensor fusion approaches. It is found that the use of the proposed IDANFIS approach achieves a reduction in the integration development time and an improvement in the estimation accuracy of the vehicle's position and velocity compared to the EKF based approach.
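As an illustration of the EKF fusion step mentioned above, here is a minimal, generic predict/update cycle; the process and measurement models are left as function arguments because the paper's specific vehicle kinematic model is not reproduced here.

```python
import numpy as np

def ekf_step(x, P, u, z, f, F_jac, h, H_jac, Q, R):
    """One generic extended Kalman filter cycle.

    x, P : current state estimate and covariance
    u    : control/IMU input used by the process model f
    z    : GPS measurement vector
    f, h : nonlinear process and measurement models; F_jac, H_jac their Jacobians
    Q, R : process and measurement noise covariances
    """
    # Predict with the (nonlinear) kinematic/INS model.
    x_pred = f(x, u)
    F = F_jac(x, u)
    P_pred = F @ P @ F.T + Q

    # Update with the GPS measurement.
    H = H_jac(x_pred)
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - h(x_pred))
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new
```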
Chowdhury, Amor; Sarjaš, Andrej
2016-01-01
The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation. PMID:27649197
Chowdhury, Amor; Sarjaš, Andrej
2016-09-15
The presented paper describes accurate distance measurement for a field-sensed magnetic suspension system. The proximity measurement is based on a Hall effect sensor. The proximity sensor is installed directly on the lower surface of the electro-magnet, which means that it is very sensitive to external magnetic influences and disturbances. External disturbances interfere with the information signal and reduce the usability and reliability of the proximity measurements and, consequently, the whole application operation. A sensor fusion algorithm is deployed for the aforementioned reasons. The sensor fusion algorithm is based on the Unscented Kalman Filter, where a nonlinear dynamic model was derived with the Finite Element Modelling approach. The advantage of such modelling is a more accurate dynamic model parameter estimation, especially in the case when the real structure, materials and dimensions of the real-time application are known. The novelty of the paper is the design of a compact electro-magnetic actuator with a built-in low cost proximity sensor for accurate proximity measurement of the magnetic object. The paper successively presents a modelling procedure with the finite element method, design and parameter settings of a sensor fusion algorithm with Unscented Kalman Filter and, finally, the implementation procedure and results of real-time operation.
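The core of the Unscented Kalman Filter used in the work above is the sigma-point (unscented) transform; a minimal sketch is given below. The finite-element-derived suspension model itself is not reproduced, and the scaling parameters are the usual textbook defaults.

```python
import numpy as np

def unscented_transform(x, P, func, alpha=1e-3, beta=2.0, kappa=0.0):
    """Propagate a Gaussian (x, P) through a nonlinear function via sigma points."""
    n = len(x)
    lam = alpha**2 * (n + kappa) - n
    S = np.linalg.cholesky((n + lam) * P)

    sigma = np.vstack([x, x + S.T, x - S.T])          # 2n+1 sigma points
    Wm = np.full(2 * n + 1, 1.0 / (2 * (n + lam)))
    Wc = Wm.copy()
    Wm[0] = lam / (n + lam)
    Wc[0] = lam / (n + lam) + (1 - alpha**2 + beta)

    Y = np.array([func(s) for s in sigma])            # propagated sigma points
    y_mean = Wm @ Y
    y_cov = sum(w * np.outer(yi - y_mean, yi - y_mean) for w, yi in zip(Wc, Y))
    return y_mean, y_cov
```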
Effective behavioral modeling and prediction even when few exemplars are available
NASA Astrophysics Data System (ADS)
Goan, Terrance; Kartha, Neelakantan; Kaneshiro, Ryan
2006-05-01
While great progress has been made in the lowest levels of data fusion, practical advances in behavior modeling and prediction remain elusive. The most critical limitation of existing approaches is their inability to support the required knowledge modeling and continuing refinement under realistic constraints (e.g., few historic exemplars, the lack of knowledge engineering support, and the need for rapid system deployment). This paper reports on our ongoing efforts to develop Propheteer, a system which will address these shortcomings through two primary techniques. First, with Propheteer we abandon the typical consensus-driven modeling approaches that involve infrequent group decision making sessions in favor of an approach that solicits asynchronous knowledge contributions (in the form of alternative future scenarios and indicators) without burdening the user with endless certainty or probability estimates. Second, we enable knowledge contributions by personnel beyond the typical core decision making group, thereby casting light on blind spots, mitigating human biases, and helping maintain the currency of the developed behavior models. We conclude with a discussion of the many lessons learned in the development of our prototype Propheteer system.
A multi-data stream assimilation framework for the assessment of volcanic unrest
NASA Astrophysics Data System (ADS)
Gregg, Patricia M.; Pettijohn, J. Cory
2016-01-01
Active volcanoes pose a constant risk to populations living in their vicinity. Significant effort has been spent to increase monitoring and data collection campaigns to mitigate potential volcano disasters. To utilize these datasets to their fullest extent, a new generation of model-data fusion techniques is required that combine multiple, disparate observations of volcanic activity with cutting-edge modeling techniques to provide efficient assessment of volcanic unrest. The purpose of this paper is to develop a data assimilation framework for volcano applications. Specifically, the Ensemble Kalman Filter (EnKF) is adapted to assimilate GPS and InSAR data into viscoelastic, time-forward, finite element models of an evolving magma system to provide model forecasts and error estimations. Since the goal of this investigation is to provide a methodological framework, our efforts are focused on theoretical development and synthetic tests to illustrate the effectiveness of the EnKF and its applicability in physical volcanology. The synthetic tests provide two critical results: (1) a proof of concept for using the EnKF for multi dataset assimilation in investigations of volcanic activity; and (2) the comparison of spatially limited, but temporally dense, GPS data with temporally limited InSAR observations for evaluating magma chamber dynamics during periods of volcanic unrest. Results indicate that the temporally dense information provided by GPS observations results in faster convergence and more accurate model predictions. However, most importantly, the synthetic tests illustrate that the EnKF is able to swiftly respond to data updates by changing the model forecast trajectory to match incoming observations. The synthetic results demonstrate a great potential for utilizing the EnKF model-data fusion method to assess volcanic unrest and provide model forecasts. The development of these new techniques provides: (1) a framework for future applications of rapid data assimilation and model development during volcanic crises; (2) a method for hind-casting to investigate previous volcanic eruptions, including potential eruption triggering mechanisms and precursors; and (3) an approach for optimizing survey designs for future data collection campaigns at active volcanic systems.
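To illustrate the assimilation step described above, here is a minimal stochastic EnKF analysis step, assuming a linear(ized) observation operator mapping model states to GPS/InSAR displacements; this is a generic sketch, not the paper's finite element implementation.

```python
import numpy as np

def enkf_analysis(X, y, H, R, rng=np.random.default_rng(0)):
    """Stochastic ensemble Kalman filter analysis step.

    X : (n_state, n_ens) forecast ensemble of model states
    y : (n_obs,) observation vector (e.g. GPS and/or unwrapped InSAR displacements)
    H : (n_obs, n_state) observation operator
    R : (n_obs, n_obs) observation error covariance
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)          # ensemble anomalies
    Pf_Ht = A @ (H @ A).T / (n_ens - 1)            # sample P_f H^T
    K = Pf_Ht @ np.linalg.inv(H @ Pf_Ht + R)       # Kalman gain

    # Perturb the observations (stochastic EnKF) and update every member.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, n_ens).T
    return X + K @ (Y - H @ X)
```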
Adu-Gyamfi, Emmanuel; Kim, Lori S; Jardetzky, Theodore S; Lamb, Robert A
2016-10-15
The Paramyxoviridae comprise a large family of enveloped, negative-sense, single-stranded RNA viruses with significant economic and public health implications. For nearly all paramyxoviruses, infection is initiated by fusion of the viral and host cell plasma membranes in a pH-independent fashion. Fusion is orchestrated by the receptor binding protein hemagglutinin-neuraminidase (HN; also called H or G depending on the virus type) protein and a fusion (F) protein, the latter undergoing a major refolding process to merge the two membranes. Mechanistic details regarding the coupling of receptor binding to F activation are not fully understood. Here, we have identified the flexible loop region connecting the bulky enzymatically active head and the four-helix bundle stalk to be essential for fusion promotion. Proline substitution in this region of HN of parainfluenza virus 5 (PIV5) and Newcastle disease virus HN abolishes cell-cell fusion, whereas HN retains receptor binding and neuraminidase activity. By using reverse genetics, we engineered recombinant PIV5-EGFP viruses with mutations in the head-stalk linker region of HN. Mutations in this region abolished virus recovery and infectivity. In sum, our data suggest that the loop region acts as a "hinge" around which the bulky head of HN swings to-and-fro to facilitate timely HN-mediated F-triggering, a notion consistent with the stalk-mediated activation model of paramyxovirus fusion. Paramyxovirus fusion with the host cell plasma membrane is essential for virus infection. Membrane fusion is orchestrated via interaction of the receptor binding protein (HN, H, or G) with the viral fusion glycoprotein (F). Two distinct models have been suggested to describe the mechanism of fusion: these include "the clamp" and the "provocateur" model of activation. By using biochemical and reverse genetics tools, we have obtained strong evidence in favor of the HN stalk-mediated activation of paramyxovirus fusion. Specifically, our data strongly support the notion that the short linker between the head and stalk plays a role in "conformational switching" of the head group to facilitate F-HN interaction and triggering. Copyright © 2016, American Society for Microbiology. All Rights Reserved.
Adu-Gyamfi, Emmanuel; Kim, Lori S.; Jardetzky, Theodore S.
2016-01-01
The Paramyxoviridae comprise a large family of enveloped, negative-sense, single-stranded RNA viruses with significant economic and public health implications. For nearly all paramyxoviruses, infection is initiated by fusion of the viral and host cell plasma membranes in a pH-independent fashion. Fusion is orchestrated by the receptor binding protein hemagglutinin-neuraminidase (HN; also called H or G depending on the virus type) protein and a fusion (F) protein, the latter undergoing a major refolding process to merge the two membranes. Mechanistic details regarding the coupling of receptor binding to F activation are not fully understood. Here, we have identified the flexible loop region connecting the bulky enzymatically active head and the four-helix bundle stalk to be essential for fusion promotion. Proline substitution in this region of HN of parainfluenza virus 5 (PIV5) and Newcastle disease virus HN abolishes cell-cell fusion, whereas HN retains receptor binding and neuraminidase activity. By using reverse genetics, we engineered recombinant PIV5-EGFP viruses with mutations in the head-stalk linker region of HN. Mutations in this region abolished virus recovery and infectivity. In sum, our data suggest that the loop region acts as a “hinge” around which the bulky head of HN swings to-and-fro to facilitate timely HN-mediated F-triggering, a notion consistent with the stalk-mediated activation model of paramyxovirus fusion. IMPORTANCE Paramyxovirus fusion with the host cell plasma membrane is essential for virus infection. Membrane fusion is orchestrated via interaction of the receptor binding protein (HN, H, or G) with the viral fusion glycoprotein (F). Two distinct models have been suggested to describe the mechanism of fusion: these include “the clamp” and the “provocateur” model of activation. By using biochemical and reverse genetics tools, we have obtained strong evidence in favor of the HN stalk-mediated activation of paramyxovirus fusion. Specifically, our data strongly support the notion that the short linker between the head and stalk plays a role in “conformational switching” of the head group to facilitate F-HN interaction and triggering. PMID:27489276
Bhateja, Vikrant; Moin, Aisha; Srivastava, Anuja; Bao, Le Nguyen; Lay-Ekuakille, Aimé; Le, Dac-Nhuong
2016-07-01
Computer based diagnosis of Alzheimer's disease can be performed by dint of the analysis of the functional and structural changes in the brain. Multispectral image fusion deliberates upon fusion of the complementary information while discarding the surplus information to achieve a solitary image which encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer's disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT followed by dimensionality reduction using modified Principal Component Analysis algorithm on the low frequency coefficients. Further, the high frequency coefficients are enhanced using non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: Phase congruency is applied to low frequency coefficients and a combination of directive contrast and normalized Shannon entropy is applied to high frequency coefficients. The superiority of the fusion response is depicted by the comparisons made with the other state-of-the-art fusion approaches (in terms of various fusion metrics).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bhateja, Vikrant, E-mail: bhateja.vikrant@gmail.com, E-mail: nhuongld@hus.edu.vn; Moin, Aisha; Srivastava, Anuja
Computer based diagnosis of Alzheimer’s disease can be performed by dint of the analysis of the functional and structural changes in the brain. Multispectral image fusion deliberates upon fusion of the complementary information while discarding the surplus information to achieve a solitary image which encloses both spatial and spectral details. This paper presents a Non-Sub-sampled Contourlet Transform (NSCT) based multispectral image fusion model for computer-aided diagnosis of Alzheimer’s disease. The proposed fusion methodology involves color transformation of the input multispectral image. The multispectral image in YIQ color space is decomposed using NSCT followed by dimensionality reduction using modified Principal Component Analysis algorithm on the low frequency coefficients. Further, the high frequency coefficients are enhanced using non-linear enhancement function. Two different fusion rules are then applied to the low-pass and high-pass sub-bands: Phase congruency is applied to low frequency coefficients and a combination of directive contrast and normalized Shannon entropy is applied to high frequency coefficients. The superiority of the fusion response is depicted by the comparisons made with the other state-of-the-art fusion approaches (in terms of various fusion metrics).
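As a schematic illustration of coefficient-domain fusion rules of the kind described above, the toy sketch below averages the low-frequency sub-bands and applies an absolute-maximum rule to the high-frequency sub-bands; these simple rules stand in for the phase-congruency and directive-contrast/entropy rules of the actual method, and the NSCT decomposition itself is assumed to have been computed elsewhere.

```python
import numpy as np

def fuse_coefficients(low_a, low_b, highs_a, highs_b):
    """Toy coefficient-domain fusion of two decomposed images.

    low_a, low_b   : low-frequency sub-bands of the two source images
    highs_a/highs_b: lists of matching high-frequency sub-bands
    """
    fused_low = 0.5 * (low_a + low_b)                       # stand-in for phase congruency rule
    fused_highs = [np.where(np.abs(a) >= np.abs(b), a, b)   # stand-in for directive contrast rule
                   for a, b in zip(highs_a, highs_b)]
    return fused_low, fused_highs
```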
DOE Office of Scientific and Technical Information (OSTI.GOV)
Strozzi, D. J.; Bailey, D. S.; Michel, P.
The effects of laser-plasma interactions (LPI) on the dynamics of inertial confinement fusion hohlraums are investigated in this work via a new approach that self-consistently couples reduced LPI models into radiation-hydrodynamics numerical codes. The interplay between hydrodynamics and LPI, specifically stimulated Raman scatter and crossed-beam energy transfer (CBET), mostly occurs via momentum and energy deposition into Langmuir and ion acoustic waves. This spatially redistributes energy coupling to the target, which affects the background plasma conditions and thus modifies laser propagation. In conclusion, this model shows reduced CBET and significant laser energy depletion by Langmuir waves, which reduce the discrepancy between modeling and data from hohlraum experiments on wall x-ray emission and capsule implosion shape.
Fusion and direct reactions around the barrier for the systems ^{7,9}Be, ^{7}Li + ^{238}U
DOE Office of Scientific and Technical Information (OSTI.GOV)
Raabe, R.; Angulo, C.; Charvet, J. L.
2006-10-15
We present new cross section data for the complete fusion of the weakly bound systems ^{7,9}Be and ^{7}Li on ^{238}U at energies around the Coulomb barrier. In the same measurement, yields for direct processes and incomplete fusion are detected. For all systems, a suppression of the complete fusion cross section around and above the barrier is observed. At energies below the barrier, the fusion of the ^{7}Be + ^{238}U system shows no enhancement with respect to simple model predictions.
Single-Scale Fusion: An Effective Approach to Merging Images.
Ancuti, Codruta O; Ancuti, Cosmin; De Vleeschouwer, Christophe; Bovik, Alan C
2017-01-01
Due to its robustness and effectiveness, multi-scale fusion (MSF) based on the Laplacian pyramid decomposition has emerged as a popular technique that has shown utility in many applications. Guided by several intuitive measures (weight maps), the MSF process is versatile and straightforward to implement. However, the number of pyramid levels increases with the image size, which implies sophisticated data management and memory accesses, as well as additional computations. Here, we introduce a simplified formulation that reduces MSF to only a single-level process. Starting from the MSF decomposition, we explain both mathematically and intuitively (visually) a way to simplify the classical MSF approach with minimal loss of information. The resulting single-scale fusion (SSF) solution is a close approximation of the MSF process that eliminates important redundant computations. It also provides insights regarding why MSF is so effective. While our simplified expression is derived in the context of high dynamic range imaging, we show its generality on several well-known fusion-based applications, such as image compositing, extended depth of field, medical imaging, and blending thermal (infrared) images with visible light. Besides visual validation, quantitative evaluations demonstrate that our SSF strategy is able to yield results that are highly competitive with traditional MSF approaches.
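For context, below is a minimal sketch of the classical multi-scale fusion baseline that the paper simplifies: Laplacian pyramids of the inputs are blended with Gaussian pyramids of the weight maps and the result is collapsed back. Grayscale float images and precomputed weight maps are assumed; this is not the authors' single-scale formulation.

```python
import cv2
import numpy as np

def gaussian_pyramid(img, levels):
    pyr = [img.astype(np.float32)]
    for _ in range(levels - 1):
        pyr.append(cv2.pyrDown(pyr[-1]))
    return pyr

def laplacian_pyramid(img, levels):
    gp = gaussian_pyramid(img, levels)
    lp = []
    for i in range(levels - 1):
        up = cv2.pyrUp(gp[i + 1], dstsize=(gp[i].shape[1], gp[i].shape[0]))
        lp.append(gp[i] - up)
    lp.append(gp[-1])
    return lp

def fuse_multiscale(images, weights, levels=4):
    """Classical Laplacian-pyramid fusion of grayscale images with weight maps."""
    w = np.stack(weights).astype(np.float32)
    w /= w.sum(axis=0, keepdims=True) + 1e-12      # normalize weights per pixel
    fused_pyr = None
    for img, wi in zip(images, w):
        lp = laplacian_pyramid(img, levels)
        wp = gaussian_pyramid(wi, levels)
        contrib = [l * g for l, g in zip(lp, wp)]
        fused_pyr = contrib if fused_pyr is None else [f + c for f, c in zip(fused_pyr, contrib)]
    out = fused_pyr[-1]                            # collapse from coarse to fine
    for lvl in range(levels - 2, -1, -1):
        out = cv2.pyrUp(out, dstsize=(fused_pyr[lvl].shape[1], fused_pyr[lvl].shape[0])) + fused_pyr[lvl]
    return out
```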
Taxonomy of multi-focal nematode image stacks by a CNN based image fusion approach.
Liu, Min; Wang, Xueping; Zhang, Hongzhong
2018-03-01
In the biomedical field, digital multi-focal images are very important for documentation and communication of specimen data, because the morphological information for a transparent specimen can be captured in the form of a stack of high-quality images. Given biomedical image stacks containing multi-focal images, how to efficiently extract effective features from all layers to classify the image stacks is still an open question. We present a deep convolutional neural network (CNN) image fusion based multilinear approach for the taxonomy of multi-focal image stacks. A deep CNN based image fusion technique is used to combine relevant information of multi-focal images within a given image stack into a single image, which is more informative and complete than any single image in the given stack. In addition, multi-focal images within a stack are fused along 3 orthogonal directions, and multiple features extracted from the fused images along different directions are combined by canonical correlation analysis (CCA). Because multi-focal image stacks represent the effect of different factors (texture, shape, different instances within the same class, and different classes of objects), we embed the deep CNN based image fusion method within a multilinear framework to propose an image fusion based multilinear classifier. The experimental results on nematode multi-focal image stacks demonstrated that the deep CNN image fusion based multilinear classifier can reach a higher classification rate (95.7%) than the previous multilinear based approach (88.7%), even though we only use the texture feature instead of the combination of texture and shape features as in the previous work. The proposed deep CNN image fusion based multilinear approach shows great potential in building an automated nematode taxonomy system for nematologists, and it is effective for classifying multi-focal image stacks. Copyright © 2018 Elsevier B.V. All rights reserved.
An empirical InSAR-optical fusion approach to mapping vegetation canopy height
Wayne S. Walker; Josef M. Kellndorfer; Elizabeth LaPoint; Michael Hoppus; James Westfall
2007-01-01
Exploiting synergies afforded by a host of recently available national-scale data sets derived from interferometric synthetic aperture radar (InSAR) and passive optical remote sensing, this paper describes the development of a novel empirical approach for the provision of regional- to continental-scale estimates of vegetation canopy height. Supported by data from the...
Huda, Shamsul; Yearwood, John; Togneri, Roberto
2009-02-01
This paper attempts to overcome the tendency of the expectation-maximization (EM) algorithm to locate a local rather than global maximum when applied to estimate the hidden Markov model (HMM) parameters in speech signal modeling. We propose a hybrid algorithm for estimation of the HMM in automatic speech recognition (ASR) using a constraint-based evolutionary algorithm (EA) and EM, the CEL-EM. The novelty of our hybrid algorithm (CEL-EM) is that it is applicable to the estimation of constraint-based models that rely on EM and have many constraints and large numbers of parameters, such as the HMM. Two constraint-based versions of the CEL-EM with different fusion strategies have been proposed using a constraint-based EA and EM for better estimation of the HMM in ASR. The first one uses a traditional constraint-handling mechanism of EA. The other version transforms the constrained optimization problem into an unconstrained problem using Lagrange multipliers. Fusion strategies for the CEL-EM use a staged-fusion approach in which EM is plugged into the EA periodically, after the EA has executed for a specific period of time, to maintain the global sampling capabilities of the EA in the hybrid algorithm. A variable initialization approach (VIA) has been proposed using variable segmentation to provide a better initialization for the EA in the CEL-EM. Experimental results on the TIMIT speech corpus show that CEL-EM obtains higher recognition accuracies than the traditional EM algorithm as well as a top-standard EM baseline (VIA-EM, constructed by applying the VIA to EM).
Reis, Yara; Wolf, Thomas; Brors, Benedikt; Hamacher-Brady, Anne; Eils, Roland; Brady, Nathan R.
2012-01-01
Mitochondria exist as a network of interconnected organelles undergoing constant fission and fusion. Current approaches to study mitochondrial morphology are limited by low data sampling coupled with manual identification and classification of complex morphological phenotypes. Here we propose an integrated mechanistic and data-driven modeling approach to analyze heterogeneous, quantified datasets and infer relations between mitochondrial morphology and apoptotic events. We initially performed high-content, multi-parametric measurements of mitochondrial morphological, apoptotic, and energetic states by high-resolution imaging of human breast carcinoma MCF-7 cells. Subsequently, decision tree-based analysis was used to automatically classify networked, fragmented, and swollen mitochondrial subpopulations, at the single-cell level and within cell populations. Our results revealed subtle but significant differences in morphology class distributions in response to various apoptotic stimuli. Furthermore, key mitochondrial functional parameters including mitochondrial membrane potential and Bax activation, were measured under matched conditions. Data-driven fuzzy logic modeling was used to explore the non-linear relationships between mitochondrial morphology and apoptotic signaling, combining morphological and functional data as a single model. Modeling results are in accordance with previous studies, where Bax regulates mitochondrial fragmentation, and mitochondrial morphology influences mitochondrial membrane potential. In summary, we established and validated a platform for mitochondrial morphological and functional analysis that can be readily extended with additional datasets. We further discuss the benefits of a flexible systematic approach for elucidating specific and general relationships between mitochondrial morphology and apoptosis. PMID:22272225
Assessing tropical rainforest growth traits: Data - Model fusion in the Congo basin and beyond.
NASA Astrophysics Data System (ADS)
Pietsch, S.
2016-12-01
Virgin forest ecosystems represent the key reference level for natural tree growth dynamics. The mosaic cycle concept describes such dynamics as local disequilibria driven by patch-level succession cycles of breakdown, regeneration, juvenescence and old growth. These cycles, however, may involve different traits of light-demanding and shade-tolerant species assemblies. In this work a data-model fusion concept is introduced to assess the differences in growth dynamics within the mosaic cycle of the Western Congolian Lowland Rainforest ecosystem. Field data from 34 forest patches located in an ice-age forest refuge, recently pinpointed on the ground and still devoid of direct human impact today, constitute the database. A 3D error assessment procedure versus BGC model simulations for the 34 patches revealed two different growth dynamics, consistent with observed growth traits of pioneer and late-succession species assemblies of the Western Congolian Lowland rainforest. An application of the same procedure to Central American Pacific rainforests confirms the strength of the 3D-error field-data model fusion concept for assessing different growth traits within the mosaic cycle of natural forest dynamics.
Mitochondrial fusion through membrane automata.
Giannakis, Konstantinos; Andronikos, Theodore
2015-01-01
Studies have shown that malfunctions in mitochondrial processes are implicated in disease. However, the mechanisms behind these operations are not yet sufficiently clear. In this work we present a novel approach to describing a biomolecular model of mitochondrial fusion using notions from membrane computing. We use a case study defined in the BioAmbient calculus and show how to translate it into a variant of P automata. We combine brane calculi with (mem)brane automata to produce a new scheme capable of describing simple, realistic models. We propose the further use of similar methods and the testing of other biomolecular models with the same behaviour.
Activity recognition using Video Event Segmentation with Text (VEST)
NASA Astrophysics Data System (ADS)
Holloway, Hillary; Jones, Eric K.; Kaluzniacki, Andrew; Blasch, Erik; Tierno, Jorge
2014-06-01
Multi-Intelligence (multi-INT) data includes video, text, and signals that require analysis by operators. Analysis methods include information fusion approaches such as filtering, correlation, and association. In this paper, we discuss the Video Event Segmentation with Text (VEST) method, which provides event boundaries of an activity to compile related message and video clips for future interest. VEST infers meaningful activities by clustering multiple streams of time-sequenced multi-INT intelligence data and derived fusion products. We discuss exemplar results that segment raw full-motion video (FMV) data by using extracted commentary message timestamps, FMV metadata, and user-defined queries.
NASA Astrophysics Data System (ADS)
Liu, Chunhui; Zhang, Duona; Zhao, Xintao
2018-03-01
Saliency detection in synthetic aperture radar (SAR) images is a difficult problem. This paper proposes a multitask saliency detection (MSD) model for SAR images. We extract four features of the SAR image, namely intensity, orientation, uniqueness, and global contrast, as the input of the MSD model. The saliency map is generated by multitask sparsity pursuit, which integrates the multiple features collaboratively. Detection of features at different scales is also taken into consideration. Subjective and objective evaluations of the MSD model verify its effectiveness. Based on the saliency maps obtained by the MSD model, we apply the saliency map of the SAR image to SAR and color optical image fusion. The experimental results on real data show that the saliency map obtained by the MSD model helps to improve the fusion effect, and the salient areas in the SAR image can be highlighted in the fusion results.
Systematic identification and analysis of frequent gene fusion events in metabolic pathways
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, Christopher S.; Lerma-Ortiz, Claudia; Gerdes, Svetlana Y.
Here, gene fusions are the most powerful type of in silico-derived functional associations. However, many fusion compilations were made when <100 genomes were available, and algorithms for identifying fusions need updating to handle the current avalanche of sequenced genomes. The availability of a large fusion dataset would help probe functional associations and enable systematic analysis of where and why fusion events occur. As a result, here we present a systematic analysis of fusions in prokaryotes. We manually generated two training sets: (i) 121 fusions in the model organism Escherichia coli; (ii) 131 fusions found in B vitamin metabolism. These sets were used to develop a fusion prediction algorithm that captured the training set fusions with only 7% false negatives and 50% false positives, a substantial improvement over existing approaches. This algorithm was then applied to identify 3.8 million potential fusions across 11,473 genomes. The results of the analysis are available in a searchable database. A functional analysis identified 3,000 reactions associated with frequent fusion events and revealed areas of metabolism where fusions are particularly prevalent. In conclusion, customary definitions of fusions were shown to be ambiguous, and a stricter one was proposed. Exploring the genes participating in fusion events showed that they most commonly encode transporters, regulators, and metabolic enzymes. The major rationales for fusions between metabolic genes appear to be overcoming pathway bottlenecks, avoiding toxicity, controlling competing pathways, and facilitating expression and assembly of protein complexes. Finally, our fusion dataset provides powerful clues to decipher the biological activities of domains of unknown function.
Systematic identification and analysis of frequent gene fusion events in metabolic pathways
Henry, Christopher S.; Lerma-Ortiz, Claudia; Gerdes, Svetlana Y.; ...
2016-06-24
Here, gene fusions are the most powerful type of in silico-derived functional associations. However, many fusion compilations were made when <100 genomes were available, and algorithms for identifying fusions need updating to handle the current avalanche of sequenced genomes. The availability of a large fusion dataset would help probe functional associations and enable systematic analysis of where and why fusion events occur. As a result, here we present a systematic analysis of fusions in prokaryotes. We manually generated two training sets: (i) 121 fusions in the model organism Escherichia coli; (ii) 131 fusions found in B vitamin metabolism. These sets were used to develop a fusion prediction algorithm that captured the training set fusions with only 7% false negatives and 50% false positives, a substantial improvement over existing approaches. This algorithm was then applied to identify 3.8 million potential fusions across 11,473 genomes. The results of the analysis are available in a searchable database. A functional analysis identified 3,000 reactions associated with frequent fusion events and revealed areas of metabolism where fusions are particularly prevalent. In conclusion, customary definitions of fusions were shown to be ambiguous, and a stricter one was proposed. Exploring the genes participating in fusion events showed that they most commonly encode transporters, regulators, and metabolic enzymes. The major rationales for fusions between metabolic genes appear to be overcoming pathway bottlenecks, avoiding toxicity, controlling competing pathways, and facilitating expression and assembly of protein complexes. Finally, our fusion dataset provides powerful clues to decipher the biological activities of domains of unknown function.
Wang, Shiyao; Deng, Zhidong; Yin, Gang
2016-01-01
A high-performance differential global positioning system (GPS) receiver with real time kinematics provides absolute localization for driverless cars. However, it is not only susceptible to the multipath effect but also unable to effectively fulfill precise error correction in a wide range of driving areas. This paper proposes an accurate GPS–inertial measurement unit (IMU)/dead reckoning (DR) data fusion method based on a set of predictive models and occupancy grid constraints. First, we employ a set of autoregressive and moving average (ARMA) equations that have different structural parameters to build maximum likelihood models of raw navigation. Second, both grid constraints and spatial consensus checks on all predictive results and current measurements are required to remove outliers. Navigation data that satisfy a stationary stochastic process are further fused to achieve accurate localization results. Third, the standard deviation of multimodal data fusion can be pre-specified by grid size. Finally, we performed extensive field tests in a variety of real urban scenarios. The experimental results demonstrate that the method can significantly smooth small jumps in bias and considerably reduce accumulated position errors due to DR. With low computational complexity, the position accuracy of our method surpasses existing state-of-the-art methods on the same dataset, and the new data fusion method has been deployed in our driverless car. PMID:26927108
Wang, Shiyao; Deng, Zhidong; Yin, Gang
2016-02-24
A high-performance differential global positioning system (GPS) receiver with real time kinematics provides absolute localization for driverless cars. However, it is not only susceptible to the multipath effect but also unable to effectively fulfill precise error correction in a wide range of driving areas. This paper proposes an accurate GPS-inertial measurement unit (IMU)/dead reckoning (DR) data fusion method based on a set of predictive models and occupancy grid constraints. First, we employ a set of autoregressive and moving average (ARMA) equations that have different structural parameters to build maximum likelihood models of raw navigation. Second, both grid constraints and spatial consensus checks on all predictive results and current measurements are required to remove outliers. Navigation data that satisfy a stationary stochastic process are further fused to achieve accurate localization results. Third, the standard deviation of multimodal data fusion can be pre-specified by grid size. Finally, we performed extensive field tests in a variety of real urban scenarios. The experimental results demonstrate that the method can significantly smooth small jumps in bias and considerably reduce accumulated position errors due to DR. With low computational complexity, the position accuracy of our method surpasses existing state-of-the-art methods on the same dataset, and the new data fusion method has been deployed in our driverless car.
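A minimal sketch of the predictive-model-plus-consistency-check idea described above is given below, using a simple least-squares AR(p) fit in place of the full ARMA/occupancy-grid machinery; the gating threshold and the final averaging rule are illustrative choices, not the authors' implementation.

```python
import numpy as np

def fit_ar(series, p=4):
    """Least-squares fit of an AR(p) model to a 1-D navigation time series."""
    series = np.asarray(series, dtype=float)
    X = np.column_stack([series[i:len(series) - p + i] for i in range(p)])
    y = series[p:]
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid_std = np.std(y - X @ coeffs)
    return coeffs, resid_std

def gated_fusion(history, gps_meas, dr_est, p=4, k=3.0):
    """Predict the next position component from history, reject the GPS fix if it
    disagrees by more than k sigma, otherwise fuse it with the DR estimate."""
    coeffs, sigma = fit_ar(history, p)
    pred = np.asarray(history, dtype=float)[-p:] @ coeffs
    if abs(gps_meas - pred) > k * sigma:
        return dr_est                      # GPS fix treated as a multipath outlier
    return 0.5 * (gps_meas + dr_est)       # simple average stands in for the real fusion rule
```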
Quantitative image fusion in infrared radiometry
NASA Astrophysics Data System (ADS)
Romm, Iliya; Cukurel, Beni
2018-05-01
Towards high-accuracy infrared radiance estimates, measurement practices and processing techniques aimed to achieve quantitative image fusion using a set of multi-exposure images of a static scene are reviewed. The conventional non-uniformity correction technique is extended, as the original is incompatible with quantitative fusion. Recognizing the inherent limitations of even the extended non-uniformity correction, an alternative measurement methodology, which relies on estimates of the detector bias using self-calibration, is developed. Combining data from multi-exposure images, two novel image fusion techniques that ultimately provide high tonal fidelity of a photoquantity are considered: ‘subtract-then-fuse’, which conducts image subtraction in the camera output domain and partially negates the bias frame contribution common to both the dark and scene frames; and ‘fuse-then-subtract’, which reconstructs the bias frame explicitly and conducts image fusion independently for the dark and the scene frames, followed by subtraction in the photoquantity domain. The performances of the different techniques are evaluated for various synthetic and experimental data, identifying the factors contributing to potential degradation of the image quality. The findings reflect the superiority of the ‘fuse-then-subtract’ approach, conducting image fusion via per-pixel nonlinear weighted least squares optimization.
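To make the per-pixel fusion idea concrete, here is a linearized sketch: under a simple detector model counts ≈ bias + q·t, the photoquantity q and the bias are estimated jointly at every pixel by weighted least squares over the exposure stack. This is only an illustration of the weighted least-squares step; the paper's actual method uses a nonlinear formulation and its own weighting and bias-estimation strategy.

```python
import numpy as np

def fuse_exposures(frames, times, weights):
    """Per-pixel weighted least squares fit of counts = bias + q * t.

    frames  : (n_exp, H, W) stack of frames at different integration times
    times   : (n_exp,) integration times
    weights : (n_exp, H, W) per-pixel weights (e.g. down-weighting saturated pixels)
    Returns the estimated photoquantity q and bias maps.
    """
    t = times[:, None, None]
    w = weights
    # Weighted normal equations for the two unknowns (bias, q) at every pixel.
    S0, St, Stt = w.sum(0), (w * t).sum(0), (w * t * t).sum(0)
    Sy, Sty = (w * frames).sum(0), (w * t * frames).sum(0)
    det = S0 * Stt - St ** 2          # assumed nonzero wherever weights are usable
    q = (S0 * Sty - St * Sy) / det
    bias = (Stt * Sy - St * Sty) / det
    return q, bias
```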
Phan, Kevin; Malham, Greg; Seex, Kevin; Rao, Prashanth J.
2015-01-01
Degenerative disc and facet joint disease of the lumbar spine is common in the ageing population, and is one of the most frequent causes of disability. Lumbar spondylosis may result in mechanical back pain, radicular and claudicant symptoms, reduced mobility and poor quality of life. Surgical interbody fusion of degenerative levels is an effective treatment option to stabilize the painful motion segment, and may provide indirect decompression of the neural elements, restore lordosis and correct deformity. The surgical options for interbody fusion of the lumbar spine include: posterior lumbar interbody fusion (PLIF), transforaminal lumbar interbody fusion (TLIF), minimally invasive transforaminal lumbar interbody fusion (MI-TLIF), oblique lumbar interbody fusion/anterior to psoas (OLIF/ATP), lateral lumbar interbody fusion (LLIF) and anterior lumbar interbody fusion (ALIF). The indications may include: discogenic/facetogenic low back pain, neurogenic claudication, radiculopathy due to foraminal stenosis, lumbar degenerative spinal deformity including symptomatic spondylolisthesis and degenerative scoliosis. In general, traditional posterior approaches are frequently used with acceptable fusion rates and low complication rates, however they are limited by thecal sac and nerve root retraction, along with iatrogenic injury to the paraspinal musculature and disruption of the posterior tension band. Minimally invasive (MIS) posterior approaches have evolved in an attempt to reduce approach related complications. Anterior approaches avoid the spinal canal, cauda equina and nerve roots, however have issues with approach related abdominal and vascular complications. In addition, lateral and OLIF techniques have potential risks to the lumbar plexus and psoas muscle. The present study aims firstly to comprehensively review the available literature and evidence for different lumbar interbody fusion (LIF) techniques. Secondly, we propose a set of recommendations and guidelines for the indications for interbody fusion options. Thirdly, this article provides a description of each approach, and illustrates the potential benefits and disadvantages of each technique with reference to indication and spine level performed. PMID:27683674
NASA Astrophysics Data System (ADS)
Lv, Zheng; Sui, Haigang; Zhang, Xilin; Huang, Xianfeng
2007-11-01
As one of the most important geo-spatial objects and military establishments, the airport is always a key target in the fields of transportation and military affairs. Therefore, automatic recognition and extraction of airports from remote sensing images is important and urgent for civil aviation updating and military applications. In this paper, a new multi-source data fusion approach to automatic airport information extraction, updating and 3D modeling is presented. The corresponding key technologies are discussed in detail, including feature extraction of airport information based on a modified Otsu algorithm, automatic change detection based on a new parallel-lines-based buffer detection algorithm, 3D modeling based on a gradual elimination of non-building points algorithm, 3D change detection between the old airport model and LiDAR data, and the import of typical CAD models. Finally, based on these technologies, we developed a prototype system; the results show that our method performs well.
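For reference, the standard Otsu thresholding step on which the modified algorithm above builds (the modification itself is not described in the abstract) can be run with OpenCV as follows; the input file name is hypothetical.

```python
import cv2

img = cv2.imread("scene.tif", cv2.IMREAD_GRAYSCALE)   # hypothetical grayscale input

# Otsu's method picks the global threshold that minimizes intra-class variance.
thresh_val, mask = cv2.threshold(img, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# The binary mask would then feed parallel-line/runway candidate detection.
print("Otsu threshold:", thresh_val)
```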
Effective World Modeling: Multisensor Data Fusion Methodology for Automated Driving
Elfring, Jos; Appeldoorn, Rein; van den Dries, Sjoerd; Kwakkernaat, Maurice
2016-01-01
The number of perception sensors on automated vehicles increases due to the increasing number of advanced driver assistance system functions and their increasing complexity. Furthermore, fail-safe systems require redundancy, thereby increasing the number of sensors even further. A one-size-fits-all multisensor data fusion architecture is not realistic due to the enormous diversity in vehicles, sensors and applications. As an alternative, this work presents a methodology that can be used to effectively come up with an implementation to build a consistent model of a vehicle’s surroundings. The methodology is accompanied by a software architecture. This combination minimizes the effort required to update the multisensor data fusion system whenever sensors or applications are added or replaced. A series of real-world experiments involving different sensors and algorithms demonstrates the methodology and the software architecture. PMID:27727171
Gene network inference by fusing data from diverse distributions
Žitnik, Marinka; Zupan, Blaž
2015-01-01
Motivation: Markov networks are undirected graphical models that are widely used to infer relations between genes from experimental data. Their state-of-the-art inference procedures assume the data arise from a Gaussian distribution. High-throughput omics data, such as that from next generation sequencing, often violates this assumption. Furthermore, when collected data arise from multiple related but otherwise nonidentical distributions, their underlying networks are likely to have common features. New principled statistical approaches are needed that can deal with different data distributions and jointly consider collections of datasets. Results: We present FuseNet, a Markov network formulation that infers networks from a collection of nonidentically distributed datasets. Our approach is computationally efficient and general: given any number of distributions from an exponential family, FuseNet represents model parameters through shared latent factors that define neighborhoods of network nodes. In a simulation study, we demonstrate good predictive performance of FuseNet in comparison to several popular graphical models. We show its effectiveness in an application to breast cancer RNA-sequencing and somatic mutation data, a novel application of graphical models. Fusion of datasets offers substantial gains relative to inference of separate networks for each dataset. Our results demonstrate that network inference methods for non-Gaussian data can help in accurate modeling of the data generated by emergent high-throughput technologies. Availability and implementation: Source code is at https://github.com/marinkaz/fusenet. Contact: blaz.zupan@fri.uni-lj.si Supplementary information: Supplementary information is available at Bioinformatics online. PMID:26072487
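For contrast with the non-Gaussian formulation above, a minimal Gaussian Markov network baseline can be obtained with the graphical lasso in scikit-learn, as sketched below; the expression file, the log transform and the edge threshold are illustrative assumptions, and this is a standard baseline rather than FuseNet itself.

```python
import numpy as np
from sklearn.covariance import GraphicalLassoCV

# X: samples x genes expression matrix (approximately Gaussian after transformation), hypothetical file
X = np.log1p(np.load("expression_counts.npy"))
X = (X - X.mean(axis=0)) / X.std(axis=0)

model = GraphicalLassoCV().fit(X)
precision = model.precision_

# Nonzero off-diagonal entries of the precision matrix define edges of the
# inferred (Gaussian) Markov network between genes.
edges = np.argwhere(np.triu(np.abs(precision) > 1e-6, k=1))
print(f"{len(edges)} inferred edges")
```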
DOE Office of Scientific and Technical Information (OSTI.GOV)
Post, Wilfred M; King, Anthony Wayne; Dragoni, Danilo
Many parameters in terrestrial biogeochemical models are inherently uncertain, leading to uncertainty in predictions of key carbon cycle variables. At observation sites, this uncertainty can be quantified by applying model-data fusion techniques to estimate model parameters using eddy covariance observations and associated biometric data sets as constraints. Uncertainty is reduced as data records become longer and different types of observations are added. We estimate parametric and associated predictive uncertainty at the Morgan Monroe State Forest in Indiana, USA. Parameters in the Local Terrestrial Ecosystem Carbon (LoTEC) model are estimated using both synthetic and actual constraints. These model parameters and uncertainties are then used to make predictions of carbon flux for up to 20 years. We find a strong dependence of both parametric and prediction uncertainty on the length of the data record used in the model-data fusion. In this model framework, this dependence is strongly reduced as the data record length increases beyond 5 years. If synthetic initial biomass pool constraints with realistic uncertainties are included in the model-data fusion, prediction uncertainty is reduced by more than 25% when constraining flux records are less than 3 years. If synthetic annual aboveground woody biomass increment constraints are also included, uncertainty is similarly reduced by an additional 25%. When actual observed eddy covariance data are used as constraints, there is still a strong dependence of parameter and prediction uncertainty on data record length, but the results are harder to interpret because of the inability of LoTEC to reproduce observed interannual variations and the confounding effects of model structural error.
Semiclassical treatment of fusion and breakup processes of ^{6,8}He halo nuclei
NASA Astrophysics Data System (ADS)
Majeed, Fouad A.; Abdul-Hussien, Yousif A.
2016-06-01
A semiclassical approach has been used to study the effect of channel coupling on calculations of the total fusion reaction cross section σ_{fus} and the fusion barrier distribution D_{fus} for the systems ^{6}He + ^{238}U and ^{8}He + ^{197}Au. Since these systems involve light exotic nuclei, breakup channels play an important role and should be considered in the calculations. In the semiclassical treatment, the relative motion between the projectile and target nuclei is approximated by a classical trajectory while the intrinsic dynamics is handled by time-dependent quantum mechanics. The calculated total fusion cross section σ_{fus} and fusion barrier distribution D_{fus} are compared with full quantum mechanical coupled-channels calculations with all-order coupling, obtained using the computer code, and with the available experimental data.
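For reference, the fusion barrier distribution referred to above is conventionally extracted from the fusion excitation function with the standard definition below, evaluated numerically with a point-difference formula; this is the usual textbook relation, not a result specific to this work.

```latex
\[
  D_{\mathrm{fus}}(E) \;=\; \frac{d^{2}\!\left[E\,\sigma_{\mathrm{fus}}(E)\right]}{dE^{2}},
\]
% evaluated in practice with a three-point difference formula on the measured or
% calculated excitation function at energies E - \Delta E, E and E + \Delta E.
```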
Wu, Mingquan; Huang, Wenjiang; Niu, Zheng; Wang, Changyao
2015-08-20
The limitations of satellite data acquisition mean that there is a lack of satellite data with high spatial and temporal resolutions for environmental process monitoring. In this study, we address this problem by applying the Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model (ESTARFM) and the Spatial and Temporal Data Fusion Approach (STDFA) to combine Huanjing satellite charge coupled device (HJ CCD), Gaofen satellite no. 1 wide field of view camera (GF-1 WFV) and Moderate Resolution Imaging Spectroradiometer (MODIS) data to generate daily high spatial resolution synthetic data for land surface process monitoring. Actual HJ CCD and GF-1 WFV data were used to evaluate the precision of the synthetic images using the correlation analysis method. Our method was tested and validated for two study areas in Xinjiang Province, China. The results show that both the ESTARFM and STDFA can be applied to combine HJ CCD and MODIS reflectance data, and GF-1 WFV and MODIS reflectance data, to generate synthetic HJ CCD data and synthetic GF-1 WFV data that closely match actual data with correlation coefficients (r) greater than 0.8989 and 0.8643, respectively. Synthetic red- and near infrared (NIR)-band data generated by ESTARFM are more suitable for the calculation of the Normalized Difference Vegetation Index (NDVI) than the data generated by STDFA.
Data fusion for QRS complex detection in multi-lead electrocardiogram recordings
NASA Astrophysics Data System (ADS)
Ledezma, Carlos A.; Perpiñan, Gilberto; Severeyn, Erika; Altuve, Miguel
2015-12-01
Heart diseases are the main cause of death worldwide. The first step in the diagnosis of these diseases is the analysis of the electrocardiographic (ECG) signal. In turn, the ECG analysis begins with the detection of the QRS complex, which is the waveform with the most energy in the cardiac cycle. Numerous methods have been proposed in the literature for QRS complex detection, but few authors have analyzed the possibility of taking advantage of the information redundancy present in multiple ECG leads (simultaneously acquired) to produce accurate QRS detection. In our previous work we presented such an approach, proposing various data fusion techniques to combine the detections made by an algorithm on multiple ECG leads. In this paper we present further studies that show the advantages of this multi-lead detection approach, analyzing how many leads are necessary in order to observe an improvement in the detection performance. A well-known QRS detection algorithm was used to test the fusion techniques on the St. Petersburg Institute of Cardiological Technics database. Results show improvement in the detection performance with as few as three leads, but the reliability of these results becomes interesting only after using seven or more leads. Results were evaluated using the detection error rate (DER). The multi-lead detection approach allows an improvement from DER = 3.04% to DER = 1.88%. Further work is needed to improve the detection performance by implementing further fusion steps.
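The abstract does not spell out the fusion rule, but one simple multi-lead scheme in this spirit is temporal grouping plus voting: each lead's detector emits candidate QRS times, nearby detections are clustered, and a beat is accepted only when enough leads agree. The sketch below is illustrative only; the detector outputs, tolerance and vote threshold are assumptions, not values from the paper.

```python
import numpy as np

def fuse_qrs_detections(per_lead_times, tol=0.08, min_votes=3):
    """Fuse per-lead QRS detections by temporal grouping and voting.

    per_lead_times : list of 1-D arrays of detection times (seconds), one per lead.
    tol            : detections closer than this (s) are treated as the same beat.
    min_votes      : minimum cluster size (used here as a simple proxy for the
                     number of agreeing leads) needed to accept a beat.
    """
    events = np.sort(np.concatenate(per_lead_times))
    fused, cluster = [], [events[0]] if events.size else []
    for t in events[1:]:
        if t - cluster[-1] <= tol:
            cluster.append(t)
        else:
            if len(cluster) >= min_votes:
                fused.append(np.mean(cluster))
            cluster = [t]
    if cluster and len(cluster) >= min_votes:
        fused.append(np.mean(cluster))
    return np.array(fused)

# Example: three leads agree on beats near 0.80 s and 1.62 s; one spurious detection is rejected.
leads = [np.array([0.80, 1.61]), np.array([0.81, 1.63]), np.array([0.79, 1.62, 2.10])]
print(fuse_qrs_detections(leads, min_votes=3))
```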
Anomalous anisotropies of fission fragments in near- and sub-barrier fusion-fission reactions
NASA Astrophysics Data System (ADS)
Huanqiao, Zhang; Zuhua, Liu; Jincheng, Xu; Jun, Lu; Ming, Ruan; Kan, Xu
1992-03-01
Fission cross sections and angular distributions have been measured for the reactions of 16O + 232Th and 238U, and 19F + 208Pb and 232Th at near- and sub-barrier energies. The fission excitation functions are rather well reproduced on the basis of the Wong model or coupled-channels theory. However, the models which reproduce the sub-barrier fusion cross sections fail to account for the experimental anisotropies of fission fragments. It is found that the observed anisotropies are much larger than expected. For the first time it has been observed that the anisotropies as a function of the center-of-mass energy show a peak centered near 4.5 MeV below the fusion barrier for several reaction systems. The present approaches fail to explain these anomalies. For the 19F + 208Pb system, our results confirm the prediction of an approximately constant value for the mean square spin of the compound nucleus produced in far sub-barrier fusion reactions.
Residue-level resolution of alphavirus envelope protein interactions in pH-dependent fusion.
Zeng, Xiancheng; Mukhopadhyay, Suchetana; Brooks, Charles L
2015-02-17
Alphavirus envelope proteins, organized as trimers of E2-E1 heterodimers on the surface of the pathogenic alphavirus, mediate the low pH-triggered fusion of viral and endosomal membranes in human cells. The lack of specific treatment for alphaviral infections motivates our exploration of potential antiviral approaches by inhibiting one or more fusion steps in the common endocytic viral entry pathway. In this work, we performed constant pH molecular dynamics based on an atomic model of the alphavirus envelope with icosahedral symmetry. We have identified pH-sensitive residues that cause the largest shifts in thermodynamic driving forces under neutral and acidic pH conditions for various fusion steps. A series of conserved interdomain His residues is identified to be responsible for the pH-dependent conformational changes in the fusion process, and ligand binding sites in their vicinity are anticipated to be potential drug targets aimed at inhibiting viral infections.
3D reconstruction from multi-view VHR-satellite images in MicMac
NASA Astrophysics Data System (ADS)
Rupnik, Ewelina; Pierrot-Deseilligny, Marc; Delorme, Arthur
2018-05-01
This work addresses the generation of high quality digital surface models by fusing multiple depth maps calculated with the dense image matching method. The algorithm is adapted to very high resolution multi-view satellite images, and the main contributions of this work are in the multi-view fusion. The algorithm is insensitive to outliers, takes into account the matching quality indicators, handles non-correlated zones (e.g. occlusions), and is solved with a multi-directional dynamic programming approach. No geometric constraints (e.g. surface planarity) or auxiliary data in the form of ground control points are required for its operation. Prior to the fusion procedures, the RPC geolocation parameters of all images are improved in a bundle block adjustment routine. The performance of the algorithm is evaluated on two VHR (Very High Resolution) satellite image datasets (Pléiades, WorldView-3), revealing its good performance in reconstructing non-textured areas, repetitive patterns, and surface discontinuities.
Cell fusion in the liver, revisited
Lizier, Michela; Castelli, Alessandra; Montagna, Cristina; Lucchini, Franco; Vezzoni, Paolo; Faggioli, Francesca
2018-01-01
There is wide agreement that cell fusion is a physiological process in cells in mammalian bone, muscle and placenta. In other organs, such as the cerebellum, cell fusion is controversial. The liver contains a considerable number of polyploid cells: They are commonly believed to originate by genome endoreplication, although the contribution of cell fusion to polyploidization has not been excluded. Here, we address the topic of cell fusion in the liver from a historical point of view. We discuss experimental evidence clearly supporting the hypothesis that cell fusion occurs in the liver, specifically when bone marrow cells were injected into mice and shown to rescue genetic hepatic degenerative defects. Those experiments-carried out in the latter half of the last century-were initially interpreted to show “transdifferentiation”, but are now believed to demonstrate fusion between donor macrophages and host hepatocytes, raising the possibility that physiologically polyploid cells, such as hepatocytes, could originate, at least partially, through homotypic cell fusion. In support of the homotypic cell fusion hypothesis, we present new data generated using a chimera-based model, a much simpler model than those previously used. Cell fusion as a road to polyploidization in the liver has not been extensively investigated, and its contribution to a variety of conditions, such as viral infections, carcinogenesis and aging, remains unclear. PMID:29527257
Sensor Data Fusion with Z-Numbers and Its Application in Fault Diagnosis
Jiang, Wen; Xie, Chunhe; Zhuang, Miaoyan; Shou, Yehang; Tang, Yongchuan
2016-01-01
Sensor data fusion technology is widely employed in fault diagnosis. The information in a sensor data fusion system is characterized by not only fuzziness, but also partial reliability. Uncertain information of sensors, including randomness, fuzziness, etc., has been extensively studied recently. However, the reliability of a sensor is often overlooked or cannot be analyzed adequately. A Z-number, Z = (A, B), can represent the fuzziness and the reliability of information simultaneously, where the first component A represents a fuzzy restriction on the values of uncertain variables and the second component B is a measure of the reliability of A. In order to model and process the uncertainties in a sensor data fusion system reasonably, in this paper, a novel method combining the Z-number and Dempster–Shafer (D-S) evidence theory is proposed, where the Z-number is used to model the fuzziness and reliability of the sensor data and the D-S evidence theory is used to fuse the uncertain information of Z-numbers. The main advantages of the proposed method are that it provides a more robust measure of reliability to the sensor data, and the complementary information of multi-sensors reduces the uncertainty of the fault recognition, thus enhancing the reliability of fault detection. PMID:27649193
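The Z-number modelling step is specific to the paper, but the evidence-combination half rests on Dempster's rule, which can be sketched generically. The fault hypotheses and mass values below are illustrative assumptions, not figures from the study.

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two basic probability assignments with Dempster's rule.

    m1, m2 : dict mapping frozenset (focal element) -> mass, each summing to 1.
    """
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass assigned to contradictory intersections
    if conflict >= 1.0:
        raise ValueError("Total conflict: evidence cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sensors reporting on fault hypotheses {F1, F2, F3}; masses are illustrative only.
F1, F2, F3 = "F1", "F2", "F3"
s1 = {frozenset([F1]): 0.6, frozenset([F1, F2]): 0.3, frozenset([F1, F2, F3]): 0.1}
s2 = {frozenset([F1]): 0.5, frozenset([F2]): 0.3, frozenset([F1, F2, F3]): 0.2}
print(dempster_combine(s1, s2))   # belief concentrates on F1 after combination
```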
Accurate nonlinear mapping between MNI volumetric and FreeSurfer surface coordinate systems.
Wu, Jianxiao; Ngo, Gia H; Greve, Douglas; Li, Jingwei; He, Tong; Fischl, Bruce; Eickhoff, Simon B; Yeo, B T Thomas
2018-05-16
The results of most neuroimaging studies are reported in volumetric (e.g., MNI152) or surface (e.g., fsaverage) coordinate systems. Accurate mappings between volumetric and surface coordinate systems can facilitate many applications, such as projecting fMRI group analyses from MNI152/Colin27 to fsaverage for visualization or projecting resting-state fMRI parcellations from fsaverage to MNI152/Colin27 for volumetric analysis of new data. However, there has been surprisingly little research on this topic. Here, we evaluated three approaches for mapping data between MNI152/Colin27 and fsaverage coordinate systems by simulating the above applications: projection of group-average data from MNI152/Colin27 to fsaverage and projection of fsaverage parcellations to MNI152/Colin27. Two of the approaches are currently widely used. A third approach (registration fusion) was previously proposed, but not widely adopted. Two implementations of the registration fusion (RF) approach were considered, with one implementation utilizing the Advanced Normalization Tools (ANTs). We found that RF-ANTs performed the best for mapping between fsaverage and MNI152/Colin27, even for new subjects registered to MNI152/Colin27 using a different software tool (FSL FNIRT). This suggests that RF-ANTs would be useful even for researchers not using ANTs. Finally, it is worth emphasizing that the most optimal approach for mapping data to a coordinate system (e.g., fsaverage) is to register individual subjects directly to the coordinate system, rather than via another coordinate system. Only in scenarios where the optimal approach is not possible (e.g., mapping previously published results from MNI152 to fsaverage), should the approaches evaluated in this manuscript be considered. In these scenarios, we recommend RF-ANTs (https://github.com/ThomasYeoLab/CBIG/tree/master/stable_projects/registration/Wu2017_RegistrationFusion). © 2018 Wiley Periodicals, Inc.
Advanced algorithms for distributed fusion
NASA Astrophysics Data System (ADS)
Gelfand, A.; Smith, C.; Colony, M.; Bowman, C.; Pei, R.; Huynh, T.; Brown, C.
2008-03-01
The US Military has been undergoing a radical transition from a traditional "platform-centric" force to one capable of performing in a "Network-Centric" environment. This transformation will place all of the data needed to efficiently meet tactical and strategic goals at the warfighter's fingertips. With access to this information, the challenge of fusing data from across the battlespace into an operational picture for real-time Situational Awareness emerges. In such an environment, centralized fusion approaches will have limited application due to the constraints of real-time communications networks and computational resources. To overcome these limitations, we are developing a formalized architecture for fusion and track adjudication that allows the distribution of fusion processes over a dynamically created and managed information network. This network will support the incorporation and utilization of low level tracking information within the Army Distributed Common Ground System (DCGS-A) or Future Combat System (FCS). The framework is based on Bowman's Dual Node Network (DNN) architecture that utilizes a distributed network of interlaced fusion and track adjudication nodes to build and maintain a globally consistent picture across all assets.
Energy-resolved neutron imaging for inertial confinement fusion
NASA Astrophysics Data System (ADS)
Moran, M. J.; Haan, S. W.; Hatchett, S. P.; Izumi, N.; Koch, J. A.; Lerche, R. A.; Phillips, T. W.
2003-03-01
The success of the National Ignition Facility program will depend on diagnostic measurements which study the performance of inertial confinement fusion (ICF) experiments. Neutron yield, fusion-burn time history, and images are examples of important diagnostics. Neutron and x-ray images will record the geometries of compressed targets during the fusion-burn process. Such images provide a critical test of the accuracy of numerical modeling of ICF experiments. They also can provide valuable information in cases where experiments produce unexpected results. Although x-ray and neutron images provide similar data, they do have significant differences. X-ray images represent the distribution of high-temperature regions where fusion occurs, while neutron images directly reveal the spatial distribution of fusion-neutron emission. X-ray imaging has the advantage of a relatively straightforward path to the imaging system design. Neutron imaging, by using energy-resolved detection, offers the intriguing advantage of being able to provide independent images of burning and nonburning regions of the nuclear fuel. The usefulness of energy-resolved neutron imaging depends on both the information content of the data and on the quality of the data that can be recorded. The information content will relate to the characteristic neutron spectra that are associated with emission from different regions of the source. Numerical modeling of ICF fusion burn will be required to interpret the corresponding energy-dependent images. The exercise will be useful only if the images can be recorded with sufficient definition to reveal the spatial and energy-dependent features of interest. Several options are being evaluated with respect to the feasibility of providing the desired simultaneous spatial and energy resolution.
Multisensor data fusion for IED threat detection
NASA Astrophysics Data System (ADS)
Mees, Wim; Heremans, Roel
2012-10-01
In this paper we present the multi-sensor registration and fusion algorithms that were developed for a force protection research project in order to detect threats against military patrol vehicles. The fusion is performed at object level, using a hierarchical evidence aggregation approach. It first uses expert domain knowledge about the features used to characterize the detected threats, which is implemented in the form of a fuzzy expert system. The next level consists in fusing intra-sensor and inter-sensor information. Here an ordered weighted averaging operator is used. The object level fusion between candidate threats that are detected asynchronously on a moving vehicle by sensors with different imaging geometries requires an accurate sensor-to-world coordinate transformation. This image registration will also be discussed in this paper.
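As a rough illustration of the inter-sensor aggregation step, an ordered weighted averaging (OWA) operator applies its weights to the sorted scores rather than to particular sensors; the scores and weights below are assumptions for demonstration, not values from the project.

```python
import numpy as np

def owa(scores, weights):
    """Ordered weighted averaging: weights are applied to the sorted scores,
    not to particular sensors, so the operator ranges between min and max."""
    scores = np.sort(np.asarray(scores, dtype=float))[::-1]   # descending order
    weights = np.asarray(weights, dtype=float)
    assert np.isclose(weights.sum(), 1.0) and len(weights) == len(scores)
    return float(np.dot(scores, weights))

# Per-sensor threat scores for one candidate object (values are illustrative).
sensor_scores = [0.9, 0.4, 0.7]
print(owa(sensor_scores, [0.5, 0.3, 0.2]))   # "optimistic" weighting: 0.5*0.9 + 0.3*0.7 + 0.2*0.4 = 0.74
```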
NASA Astrophysics Data System (ADS)
Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew
2007-04-01
One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, a concrete experimental comparison of the hybrid framework with traditional fusion methods, demonstrating and quantifying this benefit, has been lacking. The goal of this research, therefore, is to provide a statistical comparison of the accuracy and performance of the hybrid network approach with pure Bayesian and Fuzzy systems, and with an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference over other fusion tools.
Modeling the evolution space of breakage fusion bridge cycles with a stochastic folding process.
Greenman, C D; Cooke, S L; Marshall, J; Stratton, M R; Campbell, P J
2016-01-01
Breakage-fusion-bridge cycles in cancer arise when a broken segment of DNA is duplicated and an end from each copy joined together. This structure then 'unfolds' into a new piece of palindromic DNA. This is one mechanism responsible for the localised amplicons observed in cancer genome data. Here we study the evolution space of breakage-fusion-bridge structures in detail. We firstly consider discrete representations of this space with 2-d trees to demonstrate that there are [Formula: see text] qualitatively distinct evolutions involving [Formula: see text] breakage-fusion-bridge cycles. Secondly we consider the stochastic nature of the process to show these evolutions are not equally likely, and also describe how amplicons become localized. Finally we highlight these methods by inferring the evolution of breakage-fusion-bridge cycles with data from primary tissue cancer samples.
NASA Astrophysics Data System (ADS)
Ricciuto, Daniel M.; King, Anthony W.; Dragoni, D.; Post, Wilfred M.
2011-03-01
Many parameters in terrestrial biogeochemical models are inherently uncertain, leading to uncertainty in predictions of key carbon cycle variables. At observation sites, this uncertainty can be quantified by applying model-data fusion techniques to estimate model parameters using eddy covariance observations and associated biometric data sets as constraints. Uncertainty is reduced as data records become longer and different types of observations are added. We estimate parametric and associated predictive uncertainty at the Morgan Monroe State Forest in Indiana, USA. Parameters in the Local Terrestrial Ecosystem Carbon (LoTEC) are estimated using both synthetic and actual constraints. These model parameters and uncertainties are then used to make predictions of carbon flux for up to 20 years. We find a strong dependence of both parametric and prediction uncertainty on the length of the data record used in the model-data fusion. In this model framework, this dependence is strongly reduced as the data record length increases beyond 5 years. If synthetic initial biomass pool constraints with realistic uncertainties are included in the model-data fusion, prediction uncertainty is reduced by more than 25% when constraining flux records are less than 3 years. If synthetic annual aboveground woody biomass increment constraints are also included, uncertainty is similarly reduced by an additional 25%. When actual observed eddy covariance data are used as constraints, there is still a strong dependence of parameter and prediction uncertainty on data record length, but the results are harder to interpret because of the inability of LoTEC to reproduce observed interannual variations and the confounding effects of model structural error.
Design and Evaluation of Fusion Approach for Combining Brain and Gaze Inputs for Target Selection
Évain, Andéol; Argelaguet, Ferran; Casiez, Géry; Roussel, Nicolas; Lécuyer, Anatole
2016-01-01
Gaze-based interfaces and Brain-Computer Interfaces (BCIs) allow for hands-free human–computer interaction. In this paper, we investigate the combination of gaze and BCIs. We propose a novel selection technique for 2D target acquisition based on input fusion. This new approach combines the probabilistic models for each input in order to better estimate the intent of the user. We evaluated its performance against the existing gaze and brain–computer interaction techniques. Twelve participants took part in our study, in which they had to search for and select 2D targets with each of the evaluated techniques. Our fusion-based hybrid interaction technique was found to be more reliable than the previous gaze and BCI hybrid interaction techniques for 10 of the 12 participants, while being 29% faster on average. However, similarly to what has been observed in hybrid gaze-and-speech interaction, the gaze-only interaction technique still provides the best performance. Our results should encourage the use of input fusion, as opposed to sequential interaction, in order to design better hybrid interfaces. PMID:27774048
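A minimal sketch of one way such an input fusion could work, assuming each modality yields an independent per-target probability estimate (the probabilities below are invented for illustration): multiply the two estimates and renormalize, so agreement sharpens the decision and disagreement flattens it.

```python
import numpy as np

def fuse_target_probabilities(p_gaze, p_bci):
    """Naive-Bayes-style fusion of two per-target probability vectors:
    multiply the (assumed independent) estimates and renormalize."""
    p = np.asarray(p_gaze, float) * np.asarray(p_bci, float)
    return p / p.sum()

# Four on-screen targets; gaze strongly favours the first target, BCI mildly favours the second.
p_gaze = [0.70, 0.20, 0.05, 0.05]
p_bci  = [0.30, 0.40, 0.15, 0.15]
print(fuse_target_probabilities(p_gaze, p_bci))   # the first target still wins, with a reduced margin
```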
NASA Astrophysics Data System (ADS)
Bagheri, H.; Schmitt, M.; Zhu, X. X.
2017-05-01
Recently, with InSAR data provided by the German TanDEM-X mission, a new global, high-resolution Digital Elevation Model (DEM) has been produced by the German Aerospace Center (DLR) with unprecedented height accuracy. However, due to SAR-inherent sensor specifics, its quality decreases over urban areas, making additional improvement necessary. On the other hand, DEMs derived from optical remote sensing imagery, such as Cartosat-1 data, have an apparently greater resolution in urban areas, making their fusion with TanDEM-X elevation data a promising perspective. The objective of this paper is two-fold: First, the height accuracies of TanDEM-X and Cartosat-1 elevation data over different land types are empirically evaluated in order to analyze the potential of TanDEM-X/Cartosat-1 DEM data fusion. After the quality assessment, urban DEM fusion using weighted averaging is investigated. In this experiment, both weight maps derived from the height error maps delivered with the DEM data, as well as more sophisticated weight maps predicted by a procedure based on artificial neural networks (ANNs), are compared. The ANN framework employs several features that can describe the height residual performance to predict the weights used in the subsequent fusion step. The results demonstrate that especially the ANN-based framework is able to improve the quality of the final DEM through data fusion.
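The weighted-averaging baseline can be sketched directly: with per-pixel height standard errors taken from the delivered error maps, inverse-variance weights give the fused height. The toy arrays below are assumptions for illustration; the ANN-predicted weights of the paper would simply replace the inverse-variance weights.

```python
import numpy as np

def fuse_dems(dem_a, dem_b, sigma_a, sigma_b):
    """Per-pixel weighted average of two co-registered DEMs.
    Weights are inverse error variances taken from the height error maps."""
    w_a, w_b = 1.0 / sigma_a**2, 1.0 / sigma_b**2
    return (w_a * dem_a + w_b * dem_b) / (w_a + w_b)

# Toy 2x2 tiles: TanDEM-X-like and Cartosat-1-like heights with per-pixel standard errors (metres).
dem_x = np.array([[101.0, 102.0], [99.5, 100.5]])
dem_c = np.array([[103.0, 101.0], [100.5, 101.5]])
sig_x = np.array([[1.0, 2.0], [1.0, 1.0]])
sig_c = np.array([[2.0, 1.0], [1.0, 2.0]])
print(fuse_dems(dem_x, dem_c, sig_x, sig_c))
```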
NASA Astrophysics Data System (ADS)
Sukawattanavijit, Chanika; Srestasathiern, Panu
2017-10-01
Land Use and Land Cover (LULC) information is important for observing and evaluating environmental change. LULC classification using remotely sensed data is a technique widely employed at global and local scales, particularly in urban areas, which have diverse land cover types and are essential components of the urban terrain and ecosystem. At present, object-based image analysis (OBIA) is becoming widely popular for land cover classification of high-resolution imagery. COSMO-SkyMed SAR data were fused with THAICHOTE (namely, THEOS: Thailand Earth Observation Satellite) optical data for object-based land cover classification. This paper presents a comparison between object-based and pixel-based approaches to image fusion. The per-pixel method, support vector machines (SVM), was applied to the fused image based on Principal Component Analysis (PCA). The object-based classification was applied to the fused images to separate land cover classes using the nearest neighbor (NN) classifier. Finally, accuracy was assessed by comparing the land cover maps generated from the fused image datasets and from the THAICHOTE image. Object-based classification of the fused COSMO-SkyMed and THAICHOTE images demonstrated the best accuracies, well over 85%. These results show that object-based data fusion provides higher land cover classification accuracy than per-pixel data fusion.
NASA Astrophysics Data System (ADS)
Hahn, Markus; Barrois, Björn; Krüger, Lars; Wöhler, Christian; Sagerer, Gerhard; Kummert, Franz
2010-09-01
This study introduces an approach to model-based 3D pose estimation and instantaneous motion analysis of the human hand-forearm limb in the application context of safe human-robot interaction. 3D pose estimation is performed using two approaches: The Multiocular Contracting Curve Density (MOCCD) algorithm is a top-down technique based on pixel statistics around a contour model projected into the images from several cameras. The Iterative Closest Point (ICP) algorithm is a bottom-up approach which uses a motion-attributed 3D point cloud to estimate the object pose. Due to their orthogonal properties, a fusion of these algorithms is shown to be favorable. The fusion is performed by a weighted combination of the extracted pose parameters in an iterative manner. The analysis of object motion is based on the pose estimation result and the motion-attributed 3D points belonging to the hand-forearm limb using an extended constraint-line approach which does not rely on any temporal filtering. A further refinement is obtained using the Shape Flow algorithm, a temporal extension of the MOCCD approach, which estimates the temporal pose derivative based on the current and the two preceding images, corresponding to temporal filtering with a short response time of two or at most three frames. Combining the results of the two motion estimation stages provides information about the instantaneous motion properties of the object. Experimental investigations are performed on real-world image sequences displaying several test persons performing different working actions typically occurring in an industrial production scenario. In all example scenes, the background is cluttered, and the test persons wear various kinds of clothes. For evaluation, independently obtained ground truth data are used.
Development of emergent processing loops as a system of systems concept
NASA Astrophysics Data System (ADS)
Gainey, James C., Jr.; Blasch, Erik P.
1999-03-01
This paper describes an engineering approach toward implementing the current neuroscientific understanding of how the primate brain fuses, or integrates, 'information' in the decision-making process. We describe a System of Systems (SoS) design for improving the overall performance, capabilities, operational robustness, and user confidence in Identification (ID) systems and show how it could be applied to biometrics security. We use the Physio-associative temporal sensor integration algorithm (PATSIA), which is motivated by observed functions and interactions of the thalamus, hippocampus, and cortical structures in the brain. PATSIA utilizes signal theory mathematics to model how the human efficiently perceives and uses information from the environment. The hybrid architecture implements a possible SoS-level description of the Joint Directors of US Laboratories for Fusion Working Group's functional description involving 5 levels of fusion and their associated definitions. This SoS architecture proposes dynamic sensor and knowledge-source integration by implementing multiple Emergent Processing Loops for prediction, feature extraction, matching, and searching of both static and dynamic databases, like MSTAR's PEMS loops. Biologically, this effort demonstrates these objectives by modeling similar processes from the eyes, ears, and somatosensory channels, through the thalamus, and to the cortices as appropriate, while using the hippocampus for short-term memory search and storage as necessary. The particular approach demonstrated incorporates commercially available speaker verification and face recognition software and hardware to collect data and extract features for the PATSIA. The PATSIA maximizes the confidence levels for target identification or verification in dynamic situations using a belief filter. The proof of concept described here is easily adaptable and scalable to other military and nonmilitary sensor fusion applications.
A Regularized Volumetric Fusion Framework for Large-Scale 3D Reconstruction
NASA Astrophysics Data System (ADS)
Rajput, Asif; Funk, Eugen; Börner, Anko; Hellwich, Olaf
2018-07-01
Modern computational resources combined with low-cost depth sensing systems have enabled mobile robots to reconstruct 3D models of surrounding environments in real-time. Unfortunately, low-cost depth sensors are prone to produce undesirable estimation noise in depth measurements, which either produces depth outliers or introduces surface deformations in the reconstructed model. Conventional 3D fusion frameworks integrate multiple error-prone depth measurements over time to reduce noise effects; therefore, additional constraints such as steady sensor movement and high frame-rates are required for high quality 3D models. In this paper we propose a generic 3D fusion framework with a controlled regularization parameter which inherently reduces noise at the time of data fusion. This allows the proposed framework to generate high quality 3D models without enforcing additional constraints. Evaluation of the reconstructed 3D models shows that the proposed framework outperforms state-of-the-art techniques in terms of both absolute reconstruction error and processing time.
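For orientation, the conventional volumetric (TSDF-style) update that such a framework builds on is a per-voxel weighted running average; the regularization proposed in the paper would act on top of this step. The sketch below uses invented voxel values and a simple weight cap.

```python
import numpy as np

def fuse_depth_into_tsdf(tsdf, weight, new_tsdf, new_weight, max_weight=64.0):
    """Running weighted average used in volumetric fusion: each voxel's signed
    distance is updated by the new observation, with weights capped so that
    old measurements do not dominate forever."""
    w_sum = weight + new_weight
    fused = (tsdf * weight + new_tsdf * new_weight) / np.maximum(w_sum, 1e-9)
    return fused, np.minimum(w_sum, max_weight)

# One voxel row observed twice (values illustrative): the second, noisier frame gets a lower weight.
tsdf = np.array([0.10, -0.05, 0.00])
weight = np.array([4.0, 4.0, 4.0])
tsdf, weight = fuse_depth_into_tsdf(tsdf, weight, np.array([0.30, -0.20, 0.05]), 1.0)
print(tsdf, weight)
```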
Entity Recognition Via Multimodal Sensor Fusion With Smart Phones
2015-03-26
P[X_{s,t} | E_t = 1] / P[X_{s,t} | E_t = 0] ≥ τ. However, an event such as an earthquake, due to its rarity, does not have sufficient data to obtain good estimates. Faulkner et al. develop a methodology to estimate the distribution of normal observations over time, L̂_0(X_{s,t}) = P̂[X_{s,t} | E_t = 0], for non-events. This is done by using a parametric approach: P[X_{s,t} | E_t = 0] = φ(X_{s,t}, θ). This model improves when the time span of sensing increases and thus the availability of …
2016-05-31
and included explosives such as TATP, HMTD, RDX, ammonium nitrate, potassium perchlorate, potassium nitrate, sugar, and TNT. The approach … [Final Report: Technical Topic 3.2.2.d, Bayesian and Non-parametric Statistics: Integration of Neural …]
Histogram equalization with Bayesian estimation for noise robust speech recognition.
Suh, Youngjoo; Kim, Hoirin
2018-02-01
The histogram equalization approach is an efficient feature normalization technique for noise robust automatic speech recognition. However, it suffers from performance degradation when some fundamental conditions are not satisfied in the test environment. To remedy these limitations of the original histogram equalization methods, class-based histogram equalization approach has been proposed. Although this approach showed substantial performance improvement under noise environments, it still suffers from performance degradation due to the overfitting problem when test data are insufficient. To address this issue, the proposed histogram equalization technique employs the Bayesian estimation method in the test cumulative distribution function estimation. It was reported in a previous study conducted on the Aurora-4 task that the proposed approach provided substantial performance gains in speech recognition systems based on the acoustic modeling of the Gaussian mixture model-hidden Markov model. In this work, the proposed approach was examined in speech recognition systems with deep neural network-hidden Markov model (DNN-HMM), the current mainstream speech recognition approach where it also showed meaningful performance improvement over the conventional maximum likelihood estimation-based method. The fusion of the proposed features with the mel-frequency cepstral coefficients provided additional performance gains in DNN-HMM systems, which otherwise suffer from performance degradation in the clean test condition.
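The core histogram equalization step (before the class-based and Bayesian refinements discussed in the abstract) is quantile matching of test features onto the training distribution. A minimal sketch, with synthetic Gaussian features standing in for real cepstral coefficients:

```python
import numpy as np

def histogram_equalize(test_feat, ref_feat):
    """Map test features onto the reference (training) distribution by quantile matching:
    x -> F_ref^{-1}(F_test(x)), both CDFs estimated empirically from the samples."""
    test_feat = np.asarray(test_feat, float)
    ranks = np.argsort(np.argsort(test_feat))          # empirical rank of each test value
    quantiles = (ranks + 0.5) / len(test_feat)         # empirical CDF values in (0, 1)
    return np.quantile(np.asarray(ref_feat, float), quantiles)

# Noisy test features (shifted and scaled) are mapped back toward the clean training statistics.
rng = np.random.default_rng(0)
clean = rng.normal(0.0, 1.0, 5000)
noisy = rng.normal(2.0, 3.0, 400)
equalized = histogram_equalize(noisy, clean)
print(equalized.mean(), equalized.std())               # close to the clean mean 0 and std 1
```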
NASA Astrophysics Data System (ADS)
Gautam, Manjeet Singh
2015-01-01
In the present work, the fusion of symmetric and asymmetric projectile-target combinations is analyzed in depth within the framework of the energy dependent Woods-Saxon potential model (EDWSP model) in conjunction with the one-dimensional Wong formula and the coupled-channels code CCFULL. The neutron transfer channels and the inelastic surface excitations of the collision partners are the dominant coupling modes, and the coupling of the relative motion of the colliding nuclei to such relevant internal degrees of freedom produces a significant fusion enhancement at sub-barrier energies. It is quite interesting that the effects of dominant intrinsic degrees of freedom such as multi-phonon vibrational states, neutron transfer channels and proton transfer channels can be simulated by introducing an energy dependence in the nucleus-nucleus potential (EDWSP model). In the EDWSP model calculations, a wide range of diffuseness parameters, from a = 0.85 fm to a = 0.97 fm, much larger than the value (a = 0.65 fm) extracted from elastic scattering data, is needed to reproduce the sub-barrier fusion data. However, this diffuseness anomaly, which might be an artifact of some dynamical effects, has been resolved by the trajectory fluctuation dissipation (TFD) model, wherein the resulting nucleus-nucleus potential possesses a normal diffuseness parameter.
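The one-dimensional Wong formula referred to above has a simple closed form, sigma(E) = (hbar*omega * R_B^2 / 2E) * ln{1 + exp[2*pi*(E - V_B)/(hbar*omega)]}. A small sketch of it follows; the barrier height, curvature and radius used here are illustrative placeholders, not the fitted EDWSP values.

```python
import numpy as np

def wong_cross_section(E, V_b, hbar_omega, R_b):
    """One-dimensional Wong formula for the total fusion cross section.

    E           : centre-of-mass energy (MeV)
    V_b         : barrier height (MeV)
    hbar_omega  : barrier curvature (MeV)
    R_b         : barrier radius (fm)
    Returns sigma in millibarn (1 fm^2 = 10 mb).
    """
    sigma_fm2 = (hbar_omega * R_b**2) / (2.0 * E) * np.log1p(np.exp(2.0 * np.pi * (E - V_b) / hbar_omega))
    return 10.0 * sigma_fm2

# Illustrative (not fitted) barrier parameters for a light-exotic-projectile system.
for E in [18.0, 20.0, 22.0, 25.0]:
    print(E, wong_cross_section(E, V_b=20.0, hbar_omega=4.0, R_b=10.5))
```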
The Fusion Model of Intelligent Transportation Systems Based on the Urban Traffic Ontology
NASA Astrophysics Data System (ADS)
Yang, Wang-Dong; Wang, Tao
To address these issues, urban transport information is represented uniformly with an urban traffic ontology, and the rules and algebraic operations of semantic fusion are defined at the ontology level so that urban traffic information can be fused with semantic completeness and consistency. The paper thus exploits the semantic completeness of the ontology to build an urban traffic ontology model that resolves problems such as ontology merging and equivalence verification in the semantic fusion of integrated traffic information. Adding semantic fusion to urban transport information integration reduces the amount of data to be integrated and enhances the efficiency and integrity of traffic information queries. Through the practical application of an intelligent traffic information integration platform in Changde city, the paper shows that ontology-based semantic fusion improves the effectiveness and efficiency of urban traffic information integration, reduces the storage quantity, and improves query efficiency and information completeness.
Development of a fusion approach selection tool
NASA Astrophysics Data System (ADS)
Pohl, C.; Zeng, Y.
2015-06-01
During the last decades, the number and quality of available remote sensing satellite sensors for Earth observation has grown significantly. The amount of available multi-sensor imagery, along with its increased spatial and spectral resolution, presents new challenges to Earth scientists. With a Fusion Approach Selection Tool (FAST), the remote sensing community would obtain access to an optimized and improved image processing technology. Remote sensing image fusion is a means to produce images containing information that is not inherent in any single image alone. Meanwhile, the user has access to sophisticated commercial image fusion techniques, plus the option to tune the parameters of each individual technique to match the anticipated application. This leaves the operator with an uncountable number of options for combining remote sensing images, not to mention the selection of the appropriate images, resolutions and bands. Image fusion can be a machine- and time-consuming endeavour. In addition, it requires knowledge about remote sensing, image fusion, digital image processing and the application. FAST shall provide the user with a quick overview of processing flows to choose from to reach the target. FAST will ask for the available images, application parameters and desired information, and process this input to produce a workflow that quickly obtains the best results. It will optimize data and image fusion techniques. It provides an overview of the possible results from which the user can choose the best. FAST will enable even inexperienced users to apply advanced processing methods to maximize the benefit of multi-sensor image exploitation.
Wang, Yanran; Xiao, Gang; Dai, Zhouyun
2017-11-13
Automatic Dependent Surveillance-Broadcast (ADS-B) is the direction of airspace surveillance development. Research analyzing the benefits of Traffic Collision Avoidance System (TCAS) and ADS-B data fusion is almost absent. This paper proposes an ADS-B minimum system comprising ADS-B In and ADS-B Out. In ADS-B In, a fusion model with a variable sampling Variational Bayesian-Interacting Multiple Model (VSVB-IMM) algorithm is proposed for integrated display, and an airspace traffic situation display is developed using ADS-B information. ADS-B Out includes ADS-B Out transmission based on a simulator platform and an Unmanned Aerial Vehicle (UAV) platform. This paper describes the overall implementation of the ADS-B minimum system, including theoretical model design, experimental simulation verification, engineering implementation, and results analysis. Simulation and implementation results show that the fused system has better performance than each independent subsystem and that it can work well in engineering applications.
Joint interpretation of geophysical data using Image Fusion techniques
NASA Astrophysics Data System (ADS)
Karamitrou, A.; Tsokas, G.; Petrou, M.
2013-12-01
Joint interpretation of geophysical data produced from different methods is a challenging area of research in a wide range of applications. In this work we apply several image fusion approaches to combine maps of electrical resistivity, electromagnetic conductivity, vertical gradient of the magnetic field, magnetic susceptibility, and ground penetrating radar reflections, in order to detect archaeological relics. We utilize data gathered from Arkansas University, with the support of the U.S. Department of Defense, through the Strategic Environmental Research and Development Program (SERDP-CS1263). The area of investigation is the Army City, situated in Riley County, Kansas, USA. The depth of the relics is estimated at about 30 cm below the surface, yet surface indications of their existence are limited. We initially register the images from the different methods to correct for random offsets due to the use of hand-held devices during the measurement procedure. Next, we apply four different image fusion approaches to create combined images, using fusion with mean values, wavelet decomposition, the curvelet transform, and the curvelet transform enhancing the images along specific angles. We create seven combinations of pairs between the available geophysical datasets. The combinations are such that for every pair at least one high-resolution method (resistivity or magnetic gradiometry) is included. Our results indicate that in almost every case the method of mean values produces satisfactory fused images that incorporate the majority of the features of the initial images. However, the contrast of the final image is reduced, and in some cases the averaging process nearly eliminated features that are faint in the original images. Wavelet-based fusion also produces good results, providing additional control in selecting the feature wavelength. Curvelet-based fusion proved to be the most effective method in most of the cases. The ability of the curvelet domain to unfold the image in terms of space, wavenumber, and orientation provides important advantages compared with the rest of the methods by allowing the incorporation of a-priori information about the orientation of the potential targets.
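Two of the simpler fusion rules mentioned above, mean-value fusion and single-level wavelet fusion, can be sketched as follows (assuming co-registered, comparably scaled maps and the PyWavelets package; the toy "survey maps" below are synthetic):

```python
import numpy as np
import pywt   # PyWavelets; assumed available

def fuse_mean(img_a, img_b):
    """Pixel-wise mean of two co-registered, comparably scaled geophysical maps."""
    return 0.5 * (img_a + img_b)

def fuse_wavelet(img_a, img_b, wavelet="db2"):
    """Single-level wavelet fusion: average the approximation bands,
    keep the detail coefficient with the larger magnitude at each location."""
    a_lo, (a_h, a_v, a_d) = pywt.dwt2(img_a, wavelet)
    b_lo, (b_h, b_v, b_d) = pywt.dwt2(img_b, wavelet)
    pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
    fused = (0.5 * (a_lo + b_lo), (pick(a_h, b_h), pick(a_v, b_v), pick(a_d, b_d)))
    return pywt.idwt2(fused, wavelet)

# Toy example: two 64x64 "survey maps" with anomalies in different places (values illustrative).
rng = np.random.default_rng(1)
resistivity = rng.normal(size=(64, 64)); resistivity[20:30, 20:30] += 3.0
magnetics = rng.normal(size=(64, 64)); magnetics[40:50, 10:20] += 3.0
print(fuse_mean(resistivity, magnetics).shape, fuse_wavelet(resistivity, magnetics).shape)
```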
Processing LiDAR Data to Predict Natural Hazards
NASA Technical Reports Server (NTRS)
Fairweather, Ian; Crabtree, Robert; Hager, Stacey
2008-01-01
ELF-Base and ELF-Hazards (wherein 'ELF' signifies 'Extract LiDAR Features' and 'LiDAR' signifies 'light detection and ranging') are developmental software modules for processing remote-sensing LiDAR data to identify past natural hazards (principally, landslides) and predict future ones. ELF-Base processes raw LiDAR data, including LiDAR intensity data that are often ignored in other software, to create digital terrain models (DTMs) and digital feature models (DFMs) with sub-meter accuracy. ELF-Hazards fuses raw LiDAR data, data from multispectral and hyperspectral optical images, and DTMs and DFMs generated by ELF-Base to generate hazard risk maps. Advanced algorithms in these software modules include line-enhancement and edge-detection algorithms, surface-characterization algorithms, and algorithms that implement innovative data-fusion techniques. The line-extraction and edge-detection algorithms enable users to locate such features as faults and landslide headwall scarps. Also implemented in this software are improved methodologies for identification and mapping of past landslide events by use of (1) accurate, ELF-derived surface characterizations and (2) three LiDAR/optical-data-fusion techniques: post-classification data fusion, maximum-likelihood estimation modeling, and hierarchical within-class discrimination. This software is expected to enable faster, more accurate forecasting of natural hazards than has previously been possible.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nguyen, Ba Nghiep; Henager, Jr., Charles H.; Overman, Nicole R.
2018-05-23
Increasing fracture toughness and modifying the ductile-brittle transition temperature of a tungsten alloy relative to pure tungsten has been shown to be feasible by ductile-phase toughening (DPT) of tungsten for future plasma-facing materials for fusion energy. In DPT, a ductile phase is included in a brittle tungsten matrix to increase the overall work of fracture for the material. This research models the deformation behavior of DPT tungsten materials, such as tungsten-copper composites, using a multiscale modeling approach that involves a microstructural dual-phase (copper-tungsten) region of interest where the constituent phases are finely discretized and are described by a continuum damage mechanics model. Large deformation, damage, and fracture are allowed to occur and are modeled in a region that is connected to adjacent homogenized elastic regions to form a macroscopic structure, such as a test specimen. The present paper illustrates this multiscale modeling approach to analyze unnotched and single-edge notched (SENB) tungsten-copper composite specimens subjected to three-point bending. The predicted load-displacement responses and crack propagation patterns are compared to the corresponding experimental results to validate the model. Furthermore, such models may help design future DPT composite configurations for fusion materials, including volume fractions of ductile phase and microstructural optimization.
Statistical modeling for visualization evaluation through data fusion.
Chen, Xiaoyu; Jin, Ran
2017-11-01
There is a high demand for data visualization that provides insights to users in various applications. However, a consistent, online visualization evaluation method to quantify mental workload or user preference is lacking, which leads to an inefficient visualization and user interface design process. Recently, the advancement of interactive and sensing technologies has made electroencephalogram (EEG) signals, eye movements, and visualization logs available for user-centered evaluation. This paper proposes a data fusion model and an application procedure for quantitative and online visualization evaluation. Fifteen participants joined the study, which was based on three different visualization designs. The results provide a regularized regression model which can accurately predict the user's evaluation of task complexity, and indicate the significance of all three types of sensing data for visualization evaluation. This model can be widely applied to data visualization evaluation, as well as to other user-centered design evaluation and data analysis in human factors and ergonomics. Copyright © 2016 Elsevier Ltd. All rights reserved.
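A minimal sketch of the kind of regularized regression the paper reports, fusing hypothetical EEG, eye-movement and interaction-log features into one design matrix and fitting a cross-validated ridge model (all data below are synthetic; the actual feature definitions and model details come from the study itself):

```python
import numpy as np
from sklearn.linear_model import RidgeCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical per-trial feature blocks (e.g. 15 participants x 3 visualization designs = 45 trials):
# EEG band powers, eye-movement statistics, and interaction-log counts, concatenated column-wise.
rng = np.random.default_rng(0)
n_trials = 45
X_eeg, X_eye, X_log = rng.normal(size=(n_trials, 8)), rng.normal(size=(n_trials, 5)), rng.normal(size=(n_trials, 3))
X = np.hstack([X_eeg, X_eye, X_log])
y = X @ rng.normal(size=X.shape[1]) + rng.normal(scale=0.5, size=n_trials)  # reported task complexity

# Standardize features, then fit ridge regression with the penalty chosen by cross-validation.
model = make_pipeline(StandardScaler(), RidgeCV(alphas=np.logspace(-3, 3, 13)))
model.fit(X, y)
print("R^2 on training data:", round(model.score(X, y), 3))
```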
A Distributed Artificial Intelligence Approach To Object Identification And Classification
NASA Astrophysics Data System (ADS)
Sikka, Digvijay I.; Varshney, Pramod K.; Vannicola, Vincent C.
1989-09-01
This paper presents an application of Distributed Artificial Intelligence (DAI) tools to the data fusion and classification problem. Our approach is to use a blackboard for information management and hypothesis formulation. The blackboard is used by the knowledge sources (KSs) for sharing information and posting their hypotheses, just as experts sitting around a round table would do. The present simulation performs classification of an aircraft (AC), after identifying it by its features, into disjoint sets (object classes) comprising five commercial ACs: Boeing 747, Boeing 707, DC10, Concorde and Boeing 727. A situation data base is characterized by experimental data available from the three levels of expert reasoning. The Ohio State University ElectroScience Laboratory provided this experimental data. To validate the architecture presented, we employ two KSs for modeling the sensors: the aspect-angle polarization feature and the ellipticity data. The system has been implemented on a Symbolics 3645, under Genera 7.1, in Common LISP.
Haralalka, Shruti; Shelton, Claude; Cartwright, Heather N.; Katzfey, Erin; Janzen, Evan; Abmayr, Susan M.
2011-01-01
Myoblast fusion is an intricate process that is initiated by cell recognition and adhesion, and culminates in cell membrane breakdown and formation of multinucleate syncytia. In the Drosophila embryo, this process occurs asymmetrically between founder cells that pattern the musculature and fusion-competent myoblasts (FCMs) that account for the bulk of the myoblasts. The present studies clarify and amplify current models of myoblast fusion in several important ways. We demonstrate that the non-conventional guanine nucleotide exchange factor (GEF) Mbc plays a fundamental role in the FCMs, where it functions to activate Rac1, but is not required in the founder cells for fusion. Mbc, active Rac1 and F-actin foci are highly enriched in the FCMs, where they localize to the Sns:Kirre junction. Furthermore, Mbc is crucial for the integrity of the F-actin foci and the FCM cytoskeleton, presumably via its activation of Rac1 in these cells. Finally, the local asymmetric distribution of these proteins at adhesion sites is reminiscent of invasive podosomes and, consistent with this model, they are enriched at sites of membrane deformation, where the FCM protrudes into the founder cell/myotube. These data are consistent with models promoting actin polymerization as the driving force for myoblast fusion. PMID:21389053
2005-09-01
appropriate use and dissemination. When information begins to flow in both directions, national and local entities can benefit from the developing … • Linc Radios • Cell Phones • Laptops … The various systems, both traditional and "high-tech," used by GISAC to disseminate terrorism … [Master's thesis, September 2005]
Multi-look fusion identification: a paradigm shift from quality to quantity in data samples
NASA Astrophysics Data System (ADS)
Wong, S.
2009-05-01
A multi-look identification method known as score-level fusion is found to be capable of achieving very high identification accuracy, even when low quality target signatures are used. Analysis using measured ground vehicle radar signatures has shown that a 97% correct identification rate can be achieved using this multi-look fusion method; in contrast, only a 37% accuracy rate is obtained when a single target signature is used as input. The results suggest that quantity can be used to replace quality of the target data in improving identification accuracy. With advances in sensor technology, a large number of target signatures of marginal quality can be captured routinely. This quantity-over-quality approach allows maximum exploitation of the available data to improve target identification performance, and it could have the potential of being developed into a disruptive technology.
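Score-level fusion itself is simple to sketch: average the per-class classifier scores over all available looks and take the argmax. The class scores below are invented to show how individually ambiguous looks can fuse into a confident decision.

```python
import numpy as np

def score_level_fusion(look_scores):
    """Average per-class classifier scores over all available looks,
    then declare the class with the highest fused score."""
    fused = np.mean(np.asarray(look_scores, float), axis=0)
    return int(np.argmax(fused)), fused

# Five low-quality looks at one vehicle; each row is a score vector over 3 vehicle classes.
looks = [[0.40, 0.35, 0.25],
         [0.30, 0.45, 0.25],
         [0.50, 0.30, 0.20],
         [0.45, 0.35, 0.20],
         [0.38, 0.33, 0.29]]
label, fused = score_level_fusion(looks)
print(label, fused)   # individually ambiguous looks fuse into a clear class-0 decision
```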
Y fuse? Sex chromosome fusions in fishes and reptiles.
Pennell, Matthew W; Kirkpatrick, Mark; Otto, Sarah P; Vamosi, Jana C; Peichel, Catherine L; Valenzuela, Nicole; Kitano, Jun
2015-05-01
Chromosomal fusion plays a recurring role in the evolution of adaptations and reproductive isolation among species, yet little is known of the evolutionary drivers of chromosomal fusions. Because sex chromosomes (X and Y in male heterogametic systems, Z and W in female heterogametic systems) differ in their selective, mutational, and demographic environments, those differences provide a unique opportunity to dissect the evolutionary forces that drive chromosomal fusions. We estimate the rate at which fusions between sex chromosomes and autosomes become established across the phylogenies of both fishes and squamate reptiles. Both the incidence among extant species and the establishment rate of Y-autosome fusions is much higher than for X-autosome, Z-autosome, or W-autosome fusions. Using population genetic models, we show that this pattern cannot be reconciled with many standard explanations for the spread of fusions. In particular, direct selection acting on fusions or sexually antagonistic selection cannot, on their own, account for the predominance of Y-autosome fusions. The most plausible explanation for the observed data seems to be (a) that fusions are slightly deleterious, and (b) that the mutation rate is male-biased or the reproductive sex ratio is female-biased. We identify other combinations of evolutionary forces that might in principle account for the data although they appear less likely. Our results shed light on the processes that drive structural changes throughout the genome.
Fusion-based multi-target tracking and localization for intelligent surveillance systems
NASA Astrophysics Data System (ADS)
Rababaah, Haroun; Shirkhodaie, Amir
2008-04-01
In this paper, we present two approaches addressing visual target tracking and localization in a complex urban environment: fusion-based multi-target visual tracking, and multi-target localization via camera calibration. For multi-target tracking, the data fusion concepts of hypothesis generation/evaluation/selection, target-to-target registration, and association are employed. An association matrix is implemented using RGB histograms for associated tracking of multiple targets of interest. Motion segmentation of targets of interest (TOI) from the background was achieved by a Gaussian Mixture Model. Foreground segmentation, on the other hand, was achieved by the Connected Components Analysis (CCA) technique. The tracking of individual targets was estimated by fusing two sources of information: the centroid with spatial gating, and the RGB histogram association matrix. The localization problem is addressed through an effective camera calibration technique using edge modeling for grid mapping (EMGM). A two-stage image pixel to world coordinates mapping technique is introduced that performs coarse and fine location estimation of moving TOIs. In coarse estimation, an approximate neighborhood of the target position is estimated based on the nearest 4-neighbor method, and in fine estimation, we use Euclidean interpolation to localize the position within the estimated four neighbors. Both techniques were tested and showed reliable results for tracking and localization of targets of interest in a complex urban environment.
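A rough sketch of the RGB-histogram association matrix (the patch sizes, bin count and similarity measure are assumptions; the paper's exact choices may differ): each track and detection is summarized by a normalized per-channel histogram, and histogram intersection fills the association matrix handed to the assignment step.

```python
import numpy as np

def rgb_histogram(patch, bins=8):
    """Concatenated, normalized per-channel histogram used as an appearance signature."""
    h = [np.histogram(patch[..., c], bins=bins, range=(0, 256))[0] for c in range(3)]
    h = np.concatenate(h).astype(float)
    return h / h.sum()

def association_matrix(track_patches, detection_patches):
    """Histogram-intersection similarity between every existing track and every new detection;
    the final assignment step (e.g. greedy or Hungarian) is omitted here."""
    A = np.zeros((len(track_patches), len(detection_patches)))
    for i, tp in enumerate(track_patches):
        for j, dp in enumerate(detection_patches):
            A[i, j] = np.minimum(rgb_histogram(tp), rgb_histogram(dp)).sum()
    return A

# Two existing tracks and two new detections; the second detection re-uses a track patch,
# so its similarity to that track is exactly 1.0.
rng = np.random.default_rng(2)
tracks = [rng.integers(0, 256, (32, 32, 3)) for _ in range(2)]
detections = [rng.integers(0, 256, (32, 32, 3)), tracks[1].copy()]
print(association_matrix(tracks, detections))
```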
Uncertainty estimation and multi sensor fusion for kinematic laser tracker measurements
NASA Astrophysics Data System (ADS)
Ulrich, Thomas
2013-08-01
Laser trackers are widely used to measure kinematic tasks such as tracking robot movements. Common methods to evaluate the uncertainty in kinematic measurements include approximations specified by the manufacturers, various analytical adjustment methods, and the Kalman filter. In this paper a new, real-time technique is proposed, which estimates the 4D-path (3D position + time) uncertainty of an arbitrary path in space. Here a hybrid system estimator is applied in conjunction with the kinematic measurement model. This method can be applied to processes that include various types of kinematic behaviour, such as constant velocity, variable acceleration, or variable turn rates. The new approach is compared with the Kalman filter and a manufacturer's approximations. The comparison was made using data obtained by tracking an industrial robot's tool centre point with a Leica laser tracker AT901 and a Leica laser tracker LTD500. It shows that the new approach is more appropriate for analysing kinematic processes than the Kalman filter, as it reduces overshoots and decreases the estimated variance. In comparison with the manufacturer's approximations, the new approach takes account of kinematic behaviour with an improved description of the real measurement process and a reduction in estimated variance. This approach is therefore well suited to the analysis of kinematic processes with unknown changes in kinematic behaviour, as well as to the fusion of measurements from multiple laser trackers.
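The paper's hybrid system estimator is not specified in this abstract, so the sketch below only illustrates the constant-velocity Kalman filter that serves as the baseline for comparison, applied to 3D laser-tracker position fixes. The state layout, noise levels, and sampling rate are assumed values chosen for illustration.

```python
import numpy as np

def cv_kalman_step(x, P, z, dt, q, r):
    """One predict/update step of a constant-velocity Kalman filter for a
    3D position measurement z; state x = [px, py, pz, vx, vy, vz]."""
    F = np.eye(6)
    F[:3, 3:] = dt * np.eye(3)                  # position integrates velocity
    Q = q * np.eye(6)                           # simplistic process noise
    H = np.hstack([np.eye(3), np.zeros((3, 3))])
    R = r * np.eye(3)                           # measurement noise (tracker spec)

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P

def truth(t):
    """Assumed constant-velocity ground-truth trajectory [m]."""
    return np.array([0.5 * t, 0.2 * t, 0.0])

# Toy usage: track a point moving at constant velocity with noisy 3D fixes.
rng = np.random.default_rng(1)
x, P = np.zeros(6), np.eye(6)
for k in range(50):
    z = truth(0.01 * k) + rng.normal(0, 0.001, 3)   # ~1 mm noise at 100 Hz
    x, P = cv_kalman_step(x, P, z, dt=0.01, q=1e-6, r=1e-6)
print("position estimate:", x[:3], "1-sigma:", np.sqrt(np.diag(P)[:3]))
```

A filter of this form tends to overshoot when the true motion changes regime (e.g. from constant velocity to acceleration), which is the behaviour the proposed hybrid estimator is reported to reduce.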
NASA Astrophysics Data System (ADS)
Couture, Jean; Boily, Edouard; Simard, Marc-Alain
1996-05-01
The research and development group at Loral Canada is now at the second phase of the development of a data fusion demonstration model (DFDM) for naval anti-air warfare, to be used as a workbench tool to perform exploratory research. This project has emphatically addressed how the concepts related to fusion could be implemented within the Canadian Patrol Frigate (CPF) software environment. The project has been designed to read data passively on the CPF bus without any modification to the CPF software. This has brought to light important time alignment issues, since the CPF sensors and the CPF command and control system were not originally designed to support a track management function which fuses information. The fusion of data from non-organic sensors with the tactical Link-11 data has produced stimulating spatial alignment problems, which have been overcome by the use of a geodetic referencing coordinate system. Some benchmark scenarios have been selected to quantitatively demonstrate the capabilities of this fusion implementation. This paper describes the implementation design of DFDM (version 2) and summarizes the results obtained so far when fusing the scenarios' simulated data.
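The abstract does not detail the geodetic referencing coordinate system used, but spatial alignment problems of this kind are commonly handled by expressing every report in a shared Earth-fixed frame. The sketch below, with assumed positions and function names, converts geodetic fixes to WGS-84 ECEF coordinates so that reports from own-ship sensors and Link-11 participants become directly comparable.

```python
import numpy as np

# WGS-84 ellipsoid constants
A = 6378137.0                 # semi-major axis [m]
F = 1.0 / 298.257223563       # flattening
E2 = F * (2.0 - F)            # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    """Convert geodetic latitude/longitude [deg] and height [m] to
    Earth-centred Earth-fixed (ECEF) coordinates [m]."""
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)   # prime vertical radius
    x = (n + h) * np.cos(lat) * np.cos(lon)
    y = (n + h) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - E2) + h) * np.sin(lat)
    return np.array([x, y, z])

# Toy usage (assumed positions): reference points of the own ship and a
# Link-11 participant, expressed in the common ECEF frame.
own_ship = geodetic_to_ecef(44.65, -63.57, 0.0)
link11_unit = geodetic_to_ecef(44.70, -63.50, 0.0)
print("baseline between reference points [m]:",
      np.linalg.norm(own_ship - link11_unit))
```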
Rouiller, Yolande; Solacroup, Thomas; Deparis, Véronique; Barbafieri, Marco; Gleixner, Ralf; Broly, Hervé; Eon-Duval, Alex
2012-06-01
The production bioreactor step of an Fc-Fusion protein manufacturing cell culture process was characterized following Quality by Design principles. Using scientific knowledge derived from the literature and process knowledge gathered during development studies and manufacturing to support clinical trials, potential critical and key process parameters with a possible impact on product quality and process performance, respectively, were determined during a risk assessment exercise. The identified process parameters were evaluated using a design of experiment approach. The regression models generated from the data allowed characterizing the impact of the identified process parameters on quality attributes. The main parameters having an impact on product titer were pH and dissolved oxygen, while those having the highest impact on process- and product-related impurities and variants were pH and culture duration. The models derived from characterization studies were used to define the cell culture process design space. The design space limits were set in such a way as to ensure that the drug substance material would consistently have the desired quality. Copyright © 2012 Elsevier B.V. All rights reserved.
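As an illustration of the design-of-experiments workflow described above, the sketch below fits a quadratic response-surface model for titer as a function of coded pH and dissolved-oxygen levels and screens a grid of settings against an acceptance limit. The data, coefficients, factor levels, and the titer target are invented for illustration and are not taken from the characterization studies.

```python
import numpy as np

# Hypothetical design-of-experiments data: coded levels (-1, 0, +1) for pH and
# dissolved oxygen (DO); titer values are invented purely for illustration.
ph = np.array([-1.0, -1.0, 1.0, 1.0, 0.0, 0.0, 0.0, -1.0, 1.0, 0.0])
do = np.array([-1.0,  1.0, -1.0, 1.0, 0.0, 0.0, 0.0,  0.0, 0.0, 1.0])
titer = np.array([2.1, 2.4, 2.6, 3.0, 3.2, 3.1, 3.3, 2.8, 3.0, 3.2])

# Quadratic response-surface model:
# titer ~ b0 + b1*pH + b2*DO + b3*pH^2 + b4*DO^2 + b5*pH*DO
X = np.column_stack([np.ones_like(ph), ph, do, ph**2, do**2, ph * do])
beta, *_ = np.linalg.lstsq(X, titer, rcond=None)

def predicted_titer(p, d):
    """Model prediction at coded pH level p and DO level d."""
    return beta @ np.array([1.0, p, d, p**2, d**2, p * d])

# Design-space check: flag coded (pH, DO) settings predicted to stay above an
# assumed minimum acceptable titer.
grid = np.linspace(-1.0, 1.0, 5)
acceptable = [(p, d) for p in grid for d in grid if predicted_titer(p, d) >= 3.0]
print(f"{len(acceptable)} of {grid.size**2} grid settings meet the titer target")
```

In a real characterization study the same kind of fitted model would be built for each quality attribute, and the design space would be set where all models simultaneously predict acceptable quality.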