Mashup Scheme Design of Map Tiles Using Lightweight Open Source Webgis Platform
NASA Astrophysics Data System (ADS)
Hu, T.; Fan, J.; He, H.; Qin, L.; Li, G.
2018-04-01
To address the difficulty of integrating and fusing multi-source image data on existing commercial Geographic Information System platforms, this research proposes loading multi-source local tile data based on CesiumJS and examines the tile data organization mechanisms and spatial references of the CesiumJS platform, as well as various tile data sources such as Google Maps, Map World, and Bing Maps. Two types of tile loading schemes were designed for the mashup of tiles: a single-data-source scheme and a multi-data-source scheme. The digital map tiles used in this paper cover two mainstream spatial references, the WGS84 coordinate system and the Web Mercator coordinate system. According to the experimental results, both the single-data-source scheme and the multi-data-source scheme with a common spatial reference showed favorable visualization effects; however, the multi-data-source scheme was prone to tile image deformation when loading multi-source tile data with different spatial references. The resulting method provides a low-cost and highly flexible solution for small- and medium-scale GIS programs and has potential for practical application. The deformation that occurs during the transition between different spatial references is an important topic for further research.
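As an aside on the deformation issue noted above: the two spatial references imply different tiling schemes, so the same tile indices cover different ground extents. The following minimal Python sketch is an editorial illustration, not the paper's code; the geographic layout assumed is Cesium's default two-by-one level-0 scheme.

```python
import math

def wgs84_tile_bounds(x, y, z):
    """Geographic (WGS84) tiling: at level z the world is split into
    2^(z+1) columns x 2^z rows of equal-degree tiles (Cesium's default
    GeographicTilingScheme layout is assumed here)."""
    tile_deg = 180.0 / (1 << z)          # tile size in degrees
    west = -180.0 + x * tile_deg
    north = 90.0 - y * tile_deg
    return west, north - tile_deg, west + tile_deg, north  # W, S, E, N

def web_mercator_tile_bounds(x, y, z):
    """Web Mercator tiling: 2^z x 2^z tiles; equal in projected metres,
    NOT in degrees of latitude, which is why mixing the two schemes
    without reprojection deforms tiles."""
    n = 1 << z
    def lat(yy):
        return math.degrees(math.atan(math.sinh(math.pi * (1 - 2 * yy / n))))
    west = x / n * 360.0 - 180.0
    east = (x + 1) / n * 360.0 - 180.0
    return west, lat(y + 1), east, lat(y)

print(wgs84_tile_bounds(1, 0, 1))        # (-90.0, 0.0, 0.0, 90.0)
print(web_mercator_tile_bounds(0, 0, 1)) # (-180.0, 0.0, 0.0, ~85.05)
```

Loading Web Mercator tiles into a geographic tiling scheme (or vice versa) without reprojection therefore stretches tiles in latitude, which matches the deformation reported in the abstract.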
SU-D-210-03: Limited-View Multi-Source Quantitative Photoacoustic Tomography
DOE Office of Scientific and Technical Information (OSTI.GOV)
Feng, J; Gao, H
2015-06-15
Purpose: This work investigates a novel limited-view multi-source acquisition scheme for the direct and simultaneous reconstruction of optical coefficients in quantitative photoacoustic tomography (QPAT), with potentially improved signal-to-noise ratio and reduced data acquisition time. Methods: Conventional QPAT is often performed in two steps: first, reconstruct the initial acoustic pressure from the full-view ultrasonic data after each optical illumination; then, quantitatively reconstruct the optical coefficients (e.g., absorption and scattering coefficients) from the initial acoustic pressure using a multi-source or multi-wavelength scheme. In the novel limited-view multi-source scheme proposed here, the optical coefficients must be reconstructed directly from the ultrasonic data, since the initial acoustic pressure can no longer be reconstructed as an intermediate variable from the incomplete acoustic data. In this work, based on a coupled photoacoustic forward model combining the diffusion approximation and the wave equation, we develop a limited-memory quasi-Newton method (LBFGS) for image reconstruction that uses the adjoint forward problem for fast computation of gradients. Furthermore, tensor framelet sparsity is utilized to improve the reconstruction, which is solved by the Alternating Direction Method of Multipliers (ADMM). Results: Simulations on a modified Shepp-Logan phantom validated the feasibility of the proposed limited-view scheme and its corresponding reconstruction algorithms. Conclusion: A limited-view multi-source QPAT scheme is proposed, i.e., partial-view acoustic data acquisition accompanies each optical illumination, and both the optical sources and the ultrasonic detectors are then rotated simultaneously for the next illumination. Moreover, LBFGS and ADMM algorithms are developed for the direct reconstruction of optical coefficients from the acoustic data. Jing Feng and Hao Gao were partially supported by the NSFC (#11405105), the 973 Program (#2015CB856000) and the Shanghai Pujiang Talent Program (#14PJ1404500)
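As a minimal illustration of the LBFGS-style reconstruction step described above, the sketch below solves a toy linearized inverse problem with scipy's L-BFGS-B; the linear operator, data, and Tikhonov penalty are stand-ins (assumptions), not the authors' coupled diffusion-wave model or tensor-framelet term.

```python
import numpy as np
from scipy.optimize import minimize

# Toy stand-in for the coupled photoacoustic forward model: a linear
# operator A mapping an optical-coefficient image mu to limited-view
# acoustic data d (the real model couples diffusion and wave equations).
rng = np.random.default_rng(0)
n_pix, n_data = 64, 40
A = rng.normal(size=(n_data, n_pix))
mu_true = np.zeros(n_pix); mu_true[20:30] = 1.0
d = A @ mu_true + 0.01 * rng.normal(size=n_data)

lam = 0.1  # Tikhonov weight standing in for the tensor-framelet sparsity term

def objective(mu):
    r = A @ mu - d
    return 0.5 * r @ r + 0.5 * lam * mu @ mu

def gradient(mu):
    # the adjoint of the forward operator gives the data-misfit gradient
    return A.T @ (A @ mu - d) + lam * mu

res = minimize(objective, np.zeros(n_pix), jac=gradient,
               method="L-BFGS-B", options={"maxiter": 200})
print("relative error:", np.linalg.norm(res.x - mu_true) / np.linalg.norm(mu_true))
```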
Variable cycle control model for intersection based on multi-source information
NASA Astrophysics Data System (ADS)
Sun, Zhi-Yuan; Li, Yue; Qu, Wen-Cong; Chen, Yan-Yan
2018-05-01
In order to improve the efficiency of the traffic control system in the era of big data, a new variable cycle control model based on multi-source information is presented for intersections in this paper. Firstly, with consideration of multi-source information, a unified framework based on a cyber-physical system is proposed. Secondly, taking into account the variable length of cells, the hysteresis phenomenon of traffic flow, and the characteristics of lane groups, a Lane group-based Cell Transmission Model is established to describe the physical properties of traffic flow under different traffic signal control schemes. Thirdly, the variable cycle control problem is abstracted into a bi-level programming model. The upper-level model is put forward for cycle length optimization considering traffic capacity and delay. The lower-level model is a dynamic signal control decision model based on fairness analysis. Then, a Hybrid Intelligent Optimization Algorithm is developed to solve the proposed model. Finally, a case study shows the efficiency and applicability of the proposed model and algorithm.
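The bi-level structure described above can be illustrated with a deliberately simplified sketch: the upper level searches over candidate cycle lengths while the lower level allocates green time and evaluates delay. The Webster-style delay model and proportional green split below are generic stand-ins, not the paper's Lane group-based Cell Transmission Model or fairness criterion.

```python
import numpy as np

def webster_delay(cycle, green, flow, sat_flow=1800.0):
    """Approximate uniform delay (s/veh) for one movement (Webster's first term),
    standing in for the paper's Lane group-based Cell Transmission Model."""
    g_ratio = green / cycle
    x = min(flow / (sat_flow * g_ratio), 0.95)          # degree of saturation
    return 0.5 * cycle * (1 - g_ratio) ** 2 / (1 - g_ratio * x)

def lower_level(cycle, flows, lost_time=12.0):
    """Lower level: split effective green proportionally to demand (a simple
    fairness-style rule), then return total flow-weighted delay."""
    greens = (cycle - lost_time) * flows / flows.sum()
    delays = [webster_delay(cycle, g, q) for g, q in zip(greens, flows)]
    return float(np.dot(delays, flows)), greens

def upper_level(flows, cycles=range(60, 181, 5)):
    """Upper level: search candidate cycle lengths for minimum total delay."""
    best = min(cycles, key=lambda c: lower_level(c, flows)[0])
    return best, lower_level(best, flows)[1]

flows = np.array([700.0, 500.0, 300.0, 250.0])   # veh/h per critical lane group (illustrative)
cycle, greens = upper_level(flows)
print("cycle length:", cycle, "greens:", np.round(greens, 1))
```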
Multi-Source Cooperative Data Collection with a Mobile Sink for the Wireless Sensor Network.
Han, Changcai; Yang, Jinsheng
2017-10-30
The multi-source cooperation integrating distributed low-density parity-check codes is investigated to jointly collect data from multiple sensor nodes to the mobile sink in a wireless sensor network. One-round and two-round cooperative data collection schemes are proposed according to the moving trajectories of the sink node. Specifically, two sparse cooperation models are first formed based on the geographical locations of the sensor source nodes, the impairment of inter-node wireless channels, and the moving trajectories of the mobile sink. Then, distributed low-density parity-check codes are devised to match the directed graphs and cooperation matrices associated with the cooperation models. In the proposed schemes, each source node has quite low complexity attributed to the sparse cooperation and the distributed processing. Simulation results reveal that the proposed cooperative data collection schemes achieve good bit error rate performance and that the two-round cooperation outperforms the one-round scheme. The performance can be further improved when more source nodes participate in the sparse cooperation. For the two-round data collection schemes, the performance is evaluated for wireless sensor networks with different moving trajectories and varying data sizes.
Multi-Source Cooperative Data Collection with a Mobile Sink for the Wireless Sensor Network
Han, Changcai; Yang, Jinsheng
2017-01-01
The multi-source cooperation integrating distributed low-density parity-check codes is investigated to jointly collect data from multiple sensor nodes to the mobile sink in a wireless sensor network. One-round and two-round cooperative data collection schemes are proposed according to the moving trajectories of the sink node. Specifically, two sparse cooperation models are first formed based on the geographical locations of the sensor source nodes, the impairment of inter-node wireless channels, and the moving trajectories of the mobile sink. Then, distributed low-density parity-check codes are devised to match the directed graphs and cooperation matrices associated with the cooperation models. In the proposed schemes, each source node has quite low complexity attributed to the sparse cooperation and the distributed processing. Simulation results reveal that the proposed cooperative data collection schemes achieve good bit error rate performance and that the two-round cooperation outperforms the one-round scheme. The performance can be further improved when more source nodes participate in the sparse cooperation. For the two-round data collection schemes, the performance is evaluated for wireless sensor networks with different moving trajectories and varying data sizes. PMID:29084155
The optimal algorithm for Multi-source RS image fusion.
Fu, Wei; Huang, Shui-Guang; Li, Zeng-Shun; Shen, Hao; Li, Jun-Shuai; Wang, Peng-Yuan
2016-01-01
To address the issue that available fusion methods cannot self-adaptively adjust the fusion rules according to the subsequent processing requirements of Remote Sensing (RS) images, this paper puts forward GSDA (genetic-iterative self-organizing data analysis algorithm), which integrates the merits of the genetic algorithm with those of the iterative self-organizing data analysis algorithm for multi-source RS image fusion. The proposed algorithm takes the translation-invariant wavelet transform as the model operator and the contrast pyramid transform as the observation operator. It then designs the objective function as a weighted sum of evaluation indices and optimizes it with GSDA so as to obtain a higher-quality fused RS image. The main points of the text are summarized as follows.
• The contribution proposes the iterative self-organizing data analysis algorithm for multi-source RS image fusion.
• This article presents the GSDA algorithm for self-adaptive adjustment of the fusion rules.
• This text proposes the model operator and the observation operator as the fusion scheme of RS images based on GSDA.
The proposed algorithm opens up a novel algorithmic pathway for multi-source RS image fusion by means of GSDA.
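To make the weighted-sum objective concrete, the sketch below scores a candidate fused image with common evaluation indices (information entropy, standard deviation, average gradient) and scans a single fusion weight; the index weights, the scalar blending rule, and the grid search are stand-ins for the paper's GSDA optimizer.

```python
import numpy as np

def entropy(img, bins=64):
    hist, _ = np.histogram(img, bins=bins, density=True)
    p = hist[hist > 0] / hist[hist > 0].sum()
    return -np.sum(p * np.log2(p))

def avg_gradient(img):
    gy, gx = np.gradient(img.astype(float))
    return np.mean(np.sqrt(0.5 * (gx ** 2 + gy ** 2)))

def objective(fused, weights=(0.4, 0.3, 0.3)):
    """Weighted sum of evaluation indices (entropy, standard deviation,
    average gradient); the weights are illustrative."""
    idx = np.array([entropy(fused), fused.std(), avg_gradient(fused)])
    return float(np.dot(weights, idx))

rng = np.random.default_rng(1)
low_res_ms = rng.random((128, 128))      # stand-ins for co-registered source images
high_res_pan = rng.random((128, 128))

# Simple stand-in for the GSDA search: scan a scalar fusion weight alpha
alphas = np.linspace(0.0, 1.0, 21)
scores = [objective(a * high_res_pan + (1 - a) * low_res_ms) for a in alphas]
print("best fusion weight:", alphas[int(np.argmax(scores))])
```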
Multi-Source Multi-Target Dictionary Learning for Prediction of Cognitive Decline.
Zhang, Jie; Li, Qingyang; Caselli, Richard J; Thompson, Paul M; Ye, Jieping; Wang, Yalin
2017-06-01
Alzheimer's Disease (AD) is the most common type of dementia. Identifying correct biomarkers may determine pre-symptomatic AD subjects and enable early intervention. Recently, multi-task sparse feature learning has been successfully applied in many computer vision and biomedical informatics studies. It aims to improve generalization performance by exploiting the features shared among different tasks. However, most existing algorithms are formulated as supervised learning schemes, which suffer when feature numbers are insufficient or label information is missing. To address these challenges, we formulate an unsupervised framework for multi-task sparse feature learning based on a novel dictionary learning algorithm. To solve the unsupervised learning problem, we propose a two-stage Multi-Source Multi-Target Dictionary Learning (MMDL) algorithm. In stage 1, we propose a multi-source dictionary learning method to utilize the common and individual sparse features in different time slots. In stage 2, supported by a rigorous theoretical analysis, we develop a multi-task learning method to solve the missing label problem. Empirical studies on an N = 3970 longitudinal brain image data set, which involves 2 sources and 5 targets, demonstrate the improved prediction accuracy and speed efficiency of MMDL in comparison with other state-of-the-art algorithms.
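The stage-1 step above learns a dictionary with sparse codes across sources. A minimal sketch of that kind of step, using scikit-learn's DictionaryLearning on synthetic data pooled from two hypothetical sources, is shown below; it is an editorial illustration, not the MMDL algorithm itself.

```python
import numpy as np
from sklearn.decomposition import DictionaryLearning

rng = np.random.default_rng(0)

# Two "sources" (e.g., two imaging time slots) sharing common structure
common = rng.normal(size=(200, 50))
source_a = common + 0.1 * rng.normal(size=(200, 50))
source_b = common + 0.1 * rng.normal(size=(200, 50))
X = np.vstack([source_a, source_b])          # pooled multi-source samples

# Stage-1 stand-in: learn a dictionary and sparse codes shared across sources
dico = DictionaryLearning(n_components=20, alpha=1.0, max_iter=50,
                          transform_algorithm="lasso_lars", random_state=0)
codes = dico.fit_transform(X)                # sparse features, one row per sample
print("dictionary:", dico.components_.shape, "codes:", codes.shape)
print("avg nonzeros per code:", float((codes != 0).sum(axis=1).mean()))
```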
Multi-Source Multi-Target Dictionary Learning for Prediction of Cognitive Decline
Zhang, Jie; Li, Qingyang; Caselli, Richard J.; Thompson, Paul M.; Ye, Jieping; Wang, Yalin
2017-01-01
Alzheimer’s Disease (AD) is the most common type of dementia. Identifying correct biomarkers may determine pre-symptomatic AD subjects and enable early intervention. Recently, multi-task sparse feature learning has been successfully applied in many computer vision and biomedical informatics studies. It aims to improve generalization performance by exploiting the features shared among different tasks. However, most existing algorithms are formulated as supervised learning schemes, which suffer when feature numbers are insufficient or label information is missing. To address these challenges, we formulate an unsupervised framework for multi-task sparse feature learning based on a novel dictionary learning algorithm. To solve the unsupervised learning problem, we propose a two-stage Multi-Source Multi-Target Dictionary Learning (MMDL) algorithm. In stage 1, we propose a multi-source dictionary learning method to utilize the common and individual sparse features in different time slots. In stage 2, supported by a rigorous theoretical analysis, we develop a multi-task learning method to solve the missing label problem. Empirical studies on an N = 3970 longitudinal brain image data set, which involves 2 sources and 5 targets, demonstrate the improved prediction accuracy and speed efficiency of MMDL in comparison with other state-of-the-art algorithms. PMID:28943731
[Estimation of desert vegetation coverage based on multi-source remote sensing data].
Wan, Hong-Mei; Li, Xia; Dong, Dao-Rui
2012-12-01
Taking the lower reaches of the Tarim River in Xinjiang of Northwest China as the study area and based on ground investigation and multi-source remote sensing data of different resolutions, estimation models for desert vegetation coverage were built, and the precisions of different estimation methods and models were compared. The results showed that with increasing spatial resolution of the remote sensing data, the precision of the estimation models increased. The estimation precision of the models based on high, middle-high, and middle-low resolution remote sensing data was 89.5%, 87.0%, and 84.56%, respectively, and the precisions of the remote sensing models were higher than that of the vegetation index method. This study revealed the change patterns of the estimation precision of desert vegetation coverage based on different spatial resolution remote sensing data, and realized the quantitative conversion of parameters and scales among high, middle, and low spatial resolution remote sensing data of desert vegetation coverage, which would provide direct evidence for establishing and implementing a comprehensive remote sensing monitoring scheme for ecological restoration in the study area.
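For readers unfamiliar with vegetation-index-based coverage estimation, the sketch below shows two common forms such models take: the pixel-dichotomy model and an empirical NDVI regression calibrated on ground plots. The endmember values and sample numbers are illustrative, not taken from the paper.

```python
import numpy as np

def coverage_dichotomy(ndvi, ndvi_soil=0.05, ndvi_veg=0.80):
    """Pixel-dichotomy model: fractional vegetation coverage estimated as
    (NDVI - NDVI_soil) / (NDVI_veg - NDVI_soil), clipped to [0, 1].
    The soil/vegetation endmember values here are illustrative."""
    fvc = (ndvi - ndvi_soil) / (ndvi_veg - ndvi_soil)
    return np.clip(fvc, 0.0, 1.0)

# Alternatively, fit an empirical regression between image NDVI and
# ground-measured coverage at sample plots, then report its precision.
ndvi_samples = np.array([0.10, 0.22, 0.35, 0.48, 0.61, 0.74])   # illustrative
ground_fvc = np.array([0.05, 0.21, 0.37, 0.55, 0.70, 0.88])     # illustrative
slope, intercept = np.polyfit(ndvi_samples, ground_fvc, 1)
pred = slope * ndvi_samples + intercept
precision = 1.0 - np.mean(np.abs(pred - ground_fvc) / np.maximum(ground_fvc, 1e-6))
print(f"fvc = {slope:.2f}*NDVI + {intercept:.2f}, relative precision ~ {precision:.1%}")
```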
Lv, Ying; Huang, Guohe; Sun, Wei
2013-01-01
A scenario-based interval two-phase fuzzy programming (SITF) method was developed for water resources planning in a wetland ecosystem. The SITF approach incorporates two-phase fuzzy programming, interval mathematical programming, and scenario analysis within a general framework. It can tackle fuzzy and interval uncertainties in terms of cost coefficients, resource availabilities, water demands, hydrological conditions, and other parameters within a multi-source supply and multi-sector consumption context. The SITF method has the advantage of effectively improving the membership degrees of the system objective and all fuzzy constraints, so that both a higher satisfaction grade of the objective and more efficient utilization of system resources can be guaranteed. Under systematic consideration of the water demands of the ecosystem, the SITF method was successfully applied to Baiyangdian Lake, the largest wetland in North China. Multi-source supplies (including the inter-basin water sources of Yuecheng Reservoir and the Yellow River) and multiple water users (including agricultural, industrial and domestic sectors) were taken into account. The results indicated that the SITF approach would generate useful solutions for identifying long-term water allocation and transfer schemes under multiple economic, environmental, ecological, and system-security targets. It supports a comparative analysis of the system satisfaction degrees of decisions under various policy scenarios. Moreover, it is of significance for quantifying the relationship between hydrological change and human activities, such that a scheme for ecologically sustainable water supply to Baiyangdian Lake can be achieved.
Multisource Data Classification Using A Hybrid Semi-supervised Learning Scheme
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Bhaduri, Budhendra L; Shekhar, Shashi
2009-01-01
In many practical situations, thematic classes cannot be discriminated by spectral measurements alone. Often one needs additional features such as population density, road density, wetlands, elevation, and soil types, which are discrete attributes, whereas remote sensing image features are continuous attributes. Finding a suitable statistical model and estimating its parameters is a challenging task in multisource (e.g., discrete and continuous attributes) data classification. In this paper we present a semi-supervised learning method that assumes the samples were generated by a mixture model, where each component could be either a continuous or a discrete distribution. Overall classification accuracy of the proposed method improved by 12% in our initial experiments.
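A minimal sketch of the mixture-model idea, for continuous attributes only: an EM loop in which labeled samples keep fixed component assignments and unlabeled samples receive soft responsibilities. Handling discrete sources would additionally require categorical mixture components, as the abstract describes; the synthetic data and class count are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def semi_supervised_gmm(X, y, n_classes, n_iter=50, reg=1e-6):
    """EM for a Gaussian mixture in which labeled samples (y >= 0) keep their
    class assignment fixed and unlabeled samples (y == -1) get soft ones."""
    n, d = X.shape
    means = np.array([X[y == k].mean(axis=0) for k in range(n_classes)])
    covs = np.array([np.cov(X[y == k].T) + reg * np.eye(d) for k in range(n_classes)])
    priors = np.full(n_classes, 1.0 / n_classes)
    labeled = y >= 0
    for _ in range(n_iter):
        # E-step: posterior responsibilities, clamped for labeled samples
        resp = np.column_stack([priors[k] * multivariate_normal.pdf(X, means[k], covs[k])
                                for k in range(n_classes)])
        resp /= resp.sum(axis=1, keepdims=True)
        resp[labeled] = 0.0
        resp[labeled, y[labeled]] = 1.0
        # M-step: update parameters from all (labeled + unlabeled) samples
        Nk = resp.sum(axis=0)
        priors = Nk / n
        for k in range(n_classes):
            means[k] = resp[:, k] @ X / Nk[k]
            diff = X - means[k]
            covs[k] = (resp[:, k][:, None] * diff).T @ diff / Nk[k] + reg * np.eye(d)
    return resp.argmax(axis=1)

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (100, 2)), rng.normal(4, 1, (100, 2))])
y = np.full(200, -1); y[:5] = 0; y[100:105] = 1   # only a few labeled samples
print("accuracy:", (semi_supervised_gmm(X, y, 2) == np.repeat([0, 1], 100)).mean())
```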
Physics Model-Based Scatter Correction in Multi-Source Interior Computed Tomography.
Gong, Hao; Li, Bin; Jia, Xun; Cao, Guohua
2018-02-01
Multi-source interior computed tomography (CT) has great potential to provide ultra-fast and organ-oriented imaging at low radiation dose. However, X-ray cross scattering from multiple simultaneously activated X-ray imaging chains compromises image quality. Previously, we published two hardware-based scatter correction methods for multi-source interior CT. Here, we propose a software-based scatter correction method, with the benefit of requiring no hardware modifications. The new method is based on a physics model and an iterative framework. The physics model was derived analytically and was used to calculate X-ray scattering signals in both the forward direction and the cross directions in multi-source interior CT. The physics model was integrated into an iterative scatter correction framework to reduce scatter artifacts. The method was applied to phantom data from both Monte Carlo simulations and physical experiments designed to emulate image acquisition in a multi-source interior CT architecture recently proposed by our team. The proposed scatter correction method reduced scatter artifacts significantly, even with only one iteration. Within a few iterations, the reconstructed images converged rapidly toward the "scatter-free" reference images. After applying the scatter correction method, the maximum CT number error at the regions of interest (ROIs) was reduced to 46 HU in the numerical phantom dataset and 48 HU in the physical phantom dataset, respectively, and the contrast-to-noise ratio at those ROIs increased by up to 44.3% and 19.7%, respectively. The proposed physics model-based iterative scatter correction method could be useful for scatter correction in dual-source or multi-source CT.
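The iterative framework above can be sketched generically as: estimate scatter from the current image with the physics model, subtract it from the measurements, and reconstruct again. In the Python sketch below, physics_scatter_model and reconstruct are hypothetical placeholders standing in for the paper's analytic model and interior-CT reconstruction; only the loop structure is the point.

```python
import numpy as np

def physics_scatter_model(image, kernel_width=5):
    """Hypothetical stand-in for the analytic scatter model: here simply a
    smoothed, scaled copy of a crude forward projection of the current image."""
    proj = image.sum(axis=0)                       # crude forward projection
    kernel = np.ones(kernel_width) / kernel_width
    return 0.1 * np.convolve(proj, kernel, mode="same")

def reconstruct(projections):
    """Hypothetical stand-in for the interior-CT reconstruction operator."""
    return np.tile(projections / projections.size, (projections.size, 1))

def iterative_scatter_correction(measured, n_iter=3):
    image = reconstruct(measured)                  # initial, scatter-contaminated
    for _ in range(n_iter):
        scatter = physics_scatter_model(image)     # forward + cross scatter estimate
        corrected = np.clip(measured - scatter, 0, None)
        image = reconstruct(corrected)             # moves toward scatter-free
    return image

measured = np.abs(np.random.default_rng(0).normal(1.0, 0.1, 64))
print(iterative_scatter_correction(measured).shape)
```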
Three-dimensional inversion of multisource array electromagnetic data
NASA Astrophysics Data System (ADS)
Tartaras, Efthimios
Three-dimensional (3-D) inversion is increasingly important for the correct interpretation of geophysical data sets in complex environments. To this effect, several approximate solutions have been developed that allow the construction of relatively fast inversion schemes. One such method that is fast and provides satisfactory accuracy is the quasi-linear (QL) approximation. It has, however, the drawback that it is source-dependent and, therefore, impractical in situations where multiple transmitters in different positions are employed. I have, therefore, developed a localized form of the QL approximation that is source-independent. This so-called localized quasi-linear (LQL) approximation can have a scalar, a diagonal, or a full tensor form. Numerical examples of its comparison with the full integral equation solution, the Born approximation, and the original QL approximation are given. The objective behind developing this approximation is to use it in a fast 3-D inversion scheme appropriate for multisource array data such as those collected in airborne surveys, cross-well logging, and other similar geophysical applications. I have developed such an inversion scheme using the scalar and diagonal LQL approximation. It reduces the original nonlinear inverse electromagnetic (EM) problem to three linear inverse problems. The first of these problems is solved using a weighted regularized linear conjugate gradient method, whereas the last two are solved in the least squares sense. The algorithm I developed provides the option of obtaining either smooth or focused inversion images. I have applied the 3-D LQL inversion to synthetic 3-D EM data that simulate a helicopter-borne survey over different earth models. The results demonstrate the stability and efficiency of the method and show that the LQL approximation can be a practical solution to the problem of 3-D inversion of multisource array frequency-domain EM data. I have also applied the method to helicopter-borne EM data collected by INCO Exploration over the Voisey's Bay area in Labrador, Canada. The results of the 3-D inversion successfully delineate the shallow massive sulfides and show that the method can produce reasonable results even in areas of complex geology and large resistivity contrasts.
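The inversion described above reduces to linear subproblems solved with a weighted, regularized conjugate-gradient method. The sketch below shows that kind of step on a toy problem, applying the sensitivity matrix and its adjoint matrix-free through scipy's cg; the random matrix and Tikhonov weight are stand-ins for the LQL sensitivities and the regularization actually used.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, cg

rng = np.random.default_rng(0)
n_data, n_model = 120, 300
G = rng.normal(size=(n_data, n_model))        # stand-in sensitivity (Frechet) matrix
m_true = np.zeros(n_model); m_true[140:160] = 1.0
d = G @ m_true + 0.01 * rng.normal(size=n_data)

lam = 1.0                                     # regularization weight
# Solve the regularized normal equations (G^T G + lam I) m = G^T d with CG,
# applying G and G^T matrix-free as one would with a large EM sensitivity matrix.
A = LinearOperator((n_model, n_model),
                   matvec=lambda m: G.T @ (G @ m) + lam * m)
m_est, info = cg(A, G.T @ d, maxiter=200)
print("cg converged:", info == 0, "misfit:", float(np.linalg.norm(G @ m_est - d)))
```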
A novel image encryption scheme based on Kepler’s third law and random Hadamard transform
NASA Astrophysics Data System (ADS)
Luo, Yu-Ling; Zhou, Rong-Long; Liu, Jun-Xiu; Qiu, Sen-Hui; Cao, Yi
2017-12-01
Project supported by the National Natural Science Foundation of China (Grant Nos. 61661008 and 61603104), the Natural Science Foundation of Guangxi Zhuang Autonomous Region, China (Grant Nos. 2015GXNSFBA139256 and 2016GXNSFCA380017), the Funding of Overseas 100 Talents Program of Guangxi Provincial Higher Education, China, the Research Project of Guangxi University of China (Grant No. KY2016YB059), the Guangxi Key Laboratory of Multi-source Information Mining & Security, China (Grant No. MIMS15-07), the Doctoral Research Foundation of Guangxi Normal University, the Guangxi Provincial Experiment Center of Information Science, and the Innovation Project of Guangxi Graduate Education (Grant No. YCSZ2017055).
NASA Astrophysics Data System (ADS)
Ma, Y.; Liu, S.
2017-12-01
Accurate estimation of surface evapotranspiration (ET) with high quality is one of the biggest obstacles to routine applications of remote sensing in eco-hydrological studies and water resource management at the basin scale. Many aspects still need deeper research, such as the applicability of ET models, the optimization of parameterization schemes at the regional scale, temporal upscaling, the selection and development of spatiotemporal data fusion methods, and ground-based validation over heterogeneous land surfaces. This project is based on the theoretically robust Surface Energy Balance System (SEBS) model, whose mechanism needs further investigation, including its applicability and influencing factors such as the local environment and the heterogeneity of the landscape, in order to improve estimation accuracy. Due to technical and budget limitations, optical remote sensing data are frequently missing because of cloud contamination and other poor atmospheric conditions in Southwest China. Here, a multi-source remote sensing data fusion method (ESTARFM: Enhanced Spatial and Temporal Adaptive Reflectance Fusion Model) is proposed that blends multi-source remote sensing data acquired by optical and passive microwave sensors on board polar-orbiting satellite platforms. Accurate "all-weather" daily ET estimation will be carried out for the River Source Region in Southwest China, and the remotely sensed ET results will then be overlapped with the footprint-weighted images of eddy correlation (EC) measurements for ground-based validation.
NASA Technical Reports Server (NTRS)
Benediktsson, Jon A.; Swain, Philip H.; Ersoy, Okan K.
1990-01-01
Neural network learning procedures and statistical classification methods are applied and compared empirically in the classification of multisource remote sensing and geographic data. Statistical multisource classification by means of a method based on Bayesian classification theory is also investigated and modified. The modifications permit control of the influence of the data sources involved in the classification process. Reliability measures are introduced to rank the quality of the data sources. The data sources are then weighted according to these rankings in the statistical multisource classification. Four data sources are used in the experiments: Landsat MSS data and three forms of topographic data (elevation, slope, and aspect). Experimental results show that the two approaches have unique advantages and disadvantages in this classification application.
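A minimal sketch of the reliability-weighted combination described above: per-source class posteriors are combined by weighting each source's log-posterior before summation. The posterior values and reliability weights below are purely illustrative.

```python
import numpy as np

def weighted_consensus(source_probs, reliabilities):
    """Combine per-source class posteriors p_s(c|x) using reliability weights:
    each source's log-posterior is scaled by its weight before summation
    (a statistical multisource consensus rule in the spirit of the abstract)."""
    logp = np.log(np.clip(source_probs, 1e-12, 1.0))      # (n_sources, n_classes)
    combined = (np.asarray(reliabilities)[:, None] * logp).sum(axis=0)
    combined = np.exp(combined - combined.max())
    return combined / combined.sum()

# Illustrative posteriors for one pixel from four sources:
# Landsat MSS, elevation, slope, aspect
source_probs = np.array([[0.70, 0.20, 0.10],
                         [0.40, 0.40, 0.20],
                         [0.30, 0.50, 0.20],
                         [0.34, 0.33, 0.33]])
reliabilities = [1.0, 0.6, 0.5, 0.2]          # ranked data-source quality
print("fused posterior:", np.round(weighted_consensus(source_probs, reliabilities), 3))
```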
A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network
Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing
2015-01-01
This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis using only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurements are decomposed by the EEMD method, and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added to the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump, and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data are used. A case study has demonstrated that information from human observation or system repair records is very helpful to the fault diagnosis. The method is effective and efficient in diagnosing faults based on uncertain, incomplete information. PMID:25938760
A Fault Diagnosis Methodology for Gear Pump Based on EEMD and Bayesian Network.
Liu, Zengkai; Liu, Yonghong; Shan, Hongkai; Cai, Baoping; Huang, Qing
2015-01-01
This paper proposes a fault diagnosis methodology for a gear pump based on the ensemble empirical mode decomposition (EEMD) method and the Bayesian network. Essentially, the presented scheme is a multi-source information fusion based methodology. Compared with conventional fault diagnosis using only EEMD, the proposed method is able to take advantage of all useful information besides sensor signals. The presented diagnostic Bayesian network consists of a fault layer, a fault feature layer and a multi-source information layer. Vibration signals from sensor measurements are decomposed by the EEMD method, and the energies of the intrinsic mode functions (IMFs) are calculated as fault features. These features are added to the fault feature layer in the Bayesian network. The other sources of useful information are added to the information layer. The generalized three-layer Bayesian network can be developed by fully incorporating faults and fault symptoms as well as other useful information such as naked-eye inspection and maintenance records. Therefore, diagnostic accuracy and capacity can be improved. The proposed methodology is applied to the fault diagnosis of a gear pump, and the structure and parameters of the Bayesian network are established. Compared with artificial neural network and support vector machine classification algorithms, the proposed model has the best diagnostic performance when only sensor data are used. A case study has demonstrated that information from human observation or system repair records is very helpful to the fault diagnosis. The method is effective and efficient in diagnosing faults based on uncertain, incomplete information.
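The IMF-energy fault features described above can be computed in a few lines; the sketch below assumes the PyEMD package (installed as EMD-signal) and uses a synthetic vibration signal in place of a real gear-pump measurement.

```python
import numpy as np
from PyEMD import EEMD   # assumes the PyEMD package (pip install EMD-signal)

fs = 1000.0
t = np.arange(0, 1.0, 1.0 / fs)
# Synthetic vibration signal standing in for a gear-pump sensor measurement
signal = (np.sin(2 * np.pi * 35 * t) + 0.5 * np.sin(2 * np.pi * 180 * t)
          + 0.2 * np.random.default_rng(0).normal(size=t.size))

eemd = EEMD(trials=50)                 # ensemble EMD with added white noise
imfs = eemd.eemd(signal, t)            # intrinsic mode functions (IMFs)

# Fault features: normalized energy of each IMF, as in the fault feature layer
energies = np.array([np.sum(imf ** 2) for imf in imfs])
features = energies / energies.sum()
print("number of IMFs:", len(imfs))
print("normalized IMF energies:", np.round(features, 3))
```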
Research on precise modeling of buildings based on multi-source data fusion of air to ground
NASA Astrophysics Data System (ADS)
Li, Yongqiang; Niu, Lubiao; Yang, Shasha; Li, Lixue; Zhang, Xitong
2016-03-01
Aiming at the accuracy problem of precise modeling of buildings, an experimental study was conducted based on multi-source data for buildings in the same test area, including rooftop data from airborne LiDAR, aerial orthophotos, and facade data from vehicle-borne LiDAR. After accurately extracting the top and bottom outlines of the building clusters, a series of qualitative and quantitative analyses was carried out on the 2D interval between the outlines. The research results provide reliable accuracy support for precise modeling of buildings from air-ground multi-source data fusion, and at the same time solutions to key technical problems are discussed.
NASA Technical Reports Server (NTRS)
Benediktsson, J. A.; Swain, P. H.; Ersoy, O. K.
1993-01-01
Application of neural networks to classification of remote sensing data is discussed. Conventional two-layer backpropagation is found to give good results in classification of remote sensing data but is not efficient in training. A more efficient variant, based on conjugate-gradient optimization, is used for classification of multisource remote sensing and geographic data and very-high-dimensional data. The conjugate-gradient neural networks give excellent performance in classification of multisource data, but do not compare as well with statistical methods in classification of very-high-dimensional data.
Design and application of BIM based digital sand table for construction management
NASA Astrophysics Data System (ADS)
Fuquan, JI; Jianqiang, LI; Weijia, LIU
2018-05-01
This paper explores the design and application of a BIM-based digital sand table for construction management. Considering the demands and features of construction management planning for bridge and tunnel engineering, the key functional features of the digital sand table should include three-dimensional GIS, model navigation, virtual simulation, information layers, and data exchange. These involve the technologies of 3D visualization and 4D virtual simulation in BIM, breakdown structure of the BIM model and project data, multi-dimensional information layers, and multi-source data acquisition and interaction. Overall, the digital sand table is a visual and virtual engineering information integration terminal under a unified data standard system. The applications shall include visual construction schemes, virtual construction schedules, and construction monitoring. Finally, the applicability of several basic software packages to the digital sand table is analyzed.
Integrating multisource imagery and GIS analysis for mapping Bermuda's benthic habitats
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vierros, M.K.
1997-06-01
Bermuda is a group of isolated oceanic islands situated in the northwest Atlantic Ocean and surrounded by the Sargasso Sea. Bermuda possesses the northernmost coral reefs and mangroves in the Atlantic Ocean, and because of its high population density, both the terrestrial and marine environments are under intense human pressure. Although a long record of scientific research exists, this study is the first attempt to comprehensively map the area's benthic habitats, despite the need for such a map for resource assessment and management purposes. Multi-source and multi-date imagery were used for producing the habitat map due to the lack of a complete, up-to-date image. Classifications were performed with SPOT data, and the results were verified from recent aerial photography and current aerial video, along with extensive ground truthing. Stratification of the image into regions prior to classification reduced the confusing effects of varying water depth. Classification accuracy in shallow areas was increased by derivation of a texture pseudo-channel, while bathymetry was used as a classification tool in deeper areas, where local patterns of zonation were well known. Because of seasonal variation in the extent of seagrasses, a classification scheme based on density could not be used. Instead, a set of classes based on the seagrass area's exposure to the open ocean was developed. The resulting habitat map is currently being assessed for accuracy with promising preliminary results, indicating its usefulness as a basis for future resource assessment studies.
Dang, Yaoguo; Mao, Wenxin
2018-01-01
In view of the multi-attribute decision-making problem in which the attribute values are grey multi-source heterogeneous data, a decision-making method based on kernel and greyness degree is proposed. The definitions of the kernel and greyness degree of an extended grey number in a grey multi-source heterogeneous data sequence are given. On this basis, we construct the kernel vector and greyness degree vector of the sequence to whiten the multi-source heterogeneous information, and then a grey relational bi-directional projection ranking method is presented. Considering the multi-attribute, multi-level decision structure and the causalities between attributes in the decision-making problem, the HG-DEMATEL method is proposed to determine the hierarchical attribute weights. A green supplier selection example is provided to demonstrate the rationality and validity of the proposed method. PMID:29510521
Sun, Huifang; Dang, Yaoguo; Mao, Wenxin
2018-03-03
In view of the multi-attribute decision-making problem in which the attribute values are grey multi-source heterogeneous data, a decision-making method based on kernel and greyness degree is proposed. The definitions of the kernel and greyness degree of an extended grey number in a grey multi-source heterogeneous data sequence are given. On this basis, we construct the kernel vector and greyness degree vector of the sequence to whiten the multi-source heterogeneous information, and then a grey relational bi-directional projection ranking method is presented. Considering the multi-attribute, multi-level decision structure and the causalities between attributes in the decision-making problem, the HG-DEMATEL method is proposed to determine the hierarchical attribute weights. A green supplier selection example is provided to demonstrate the rationality and validity of the proposed method.
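For readers unfamiliar with grey numbers, the sketch below computes the kernel (midpoint) and greyness degree (interval width relative to the background domain) of interval grey numbers under common definitions; the attribute values and domain are illustrative, and the paper's extended grey numbers generalize this.

```python
import numpy as np

def kernel(lower, upper):
    """Kernel of an interval grey number [a, b]: its midpoint (a + b) / 2
    (a common definition; extended grey numbers generalize this)."""
    return 0.5 * (lower + upper)

def greyness_degree(lower, upper, domain):
    """Greyness degree: interval width relative to the measure of the
    background domain mu(Omega) on which the grey number is defined."""
    return (upper - lower) / domain

# A grey multi-source heterogeneous attribute sequence for one alternative,
# already expressed as interval grey numbers (values are illustrative)
lowers = np.array([0.55, 0.60, 0.40])
uppers = np.array([0.75, 0.80, 0.70])
domain = 1.0                                   # attribute values normalized to [0, 1]

kernel_vector = kernel(lowers, uppers)
greyness_vector = greyness_degree(lowers, uppers, domain)
print("kernel vector:  ", kernel_vector)       # whitened crisp values
print("greyness vector:", greyness_vector)     # uncertainty of each attribute
```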
An integrated multi-source energy harvester based on vibration and magnetic field energy
NASA Astrophysics Data System (ADS)
Hu, Zhengwen; Qiu, Jing; Wang, Xian; Gao, Yuan; Liu, Xin; Chang, Qijie; Long, Yibing; He, Xingduo
2018-05-01
In this paper, an integrated multi-source energy harvester (IMSEH) employing a specially shaped cantilever beam and a piezoelectric transducer to convert vibration and magnetic field energy into electrical energy is presented. The electrical output performance of the proposed IMSEH has been investigated. Compared with a traditional multi-source energy harvester (MSEH) or a single-source energy harvester (SSEH), the proposed IMSEH can simultaneously harvest vibration and magnetic field energy with an integrated structure, and its electrical output is greatly improved. When other conditions are kept identical, the IMSEH can reach a high voltage of 12.8 V. Remarkably, the proposed IMSEHs have great potential for application in wireless sensor networks.
Hill, Jacqueline J; Asprey, Anthea; Richards, Suzanne H; Campbell, John L
2012-01-01
Background UK revalidation plans for doctors include obtaining multisource feedback from patient and colleague questionnaires as part of the supporting information for appraisal and revalidation. Aim To investigate GPs' and appraisers' views of using multisource feedback data in appraisal, and of the emerging links between multisource feedback, appraisal, and revalidation. Design and setting A qualitative study in UK general practice. Method In total, 12 GPs who had recently completed the General Medical Council multisource feedback questionnaires and 12 appraisers undertook a semi-structured, telephone interview. A thematic analysis was performed. Results Participants supported multisource feedback for formative development, although most expressed concerns about some elements of its methodology (for example, ‘self’ selection of colleagues, or whether patients and colleagues can provide objective feedback). Some participants reported difficulties in understanding benchmark data and some were upset by their scores. Most accepted the links between appraisal and revalidation, and that multisource feedback could make a positive contribution. However, tensions between the formative processes of appraisal and the summative function of revalidation were identified. Conclusion Participants valued multisource feedback as part of formative assessment and saw a role for it in appraisal. However, concerns about some elements of multisource feedback methodology may undermine its credibility as a tool for identifying poor performance. Proposals linking multisource feedback, appraisal, and revalidation may limit the use of multisource feedback and appraisal for learning and development by some doctors. Careful consideration is required with respect to promoting the accuracy and credibility of such feedback processes so that their use for learning and development, and for revalidation, is maximised. PMID:22546590
Hill, Jacqueline J; Asprey, Anthea; Richards, Suzanne H; Campbell, John L
2012-05-01
UK revalidation plans for doctors include obtaining multisource feedback from patient and colleague questionnaires as part of the supporting information for appraisal and revalidation. To investigate GPs' and appraisers' views of using multisource feedback data in appraisal, and of the emerging links between multisource feedback, appraisal, and revalidation. A qualitative study in UK general practice. In total, 12 GPs who had recently completed the General Medical Council multisource feedback questionnaires and 12 appraisers undertook a semi-structured, telephone interview. A thematic analysis was performed. Participants supported multisource feedback for formative development, although most expressed concerns about some elements of its methodology (for example, 'self' selection of colleagues, or whether patients and colleagues can provide objective feedback). Some participants reported difficulties in understanding benchmark data and some were upset by their scores. Most accepted the links between appraisal and revalidation, and that multisource feedback could make a positive contribution. However, tensions between the formative processes of appraisal and the summative function of revalidation were identified. Participants valued multisource feedback as part of formative assessment and saw a role for it in appraisal. However, concerns about some elements of multisource feedback methodology may undermine its credibility as a tool for identifying poor performance. Proposals linking multisource feedback, appraisal, and revalidation may limit the use of multisource feedback and appraisal for learning and development by some doctors. Careful consideration is required with respect to promoting the accuracy and credibility of such feedback processes so that their use for learning and development, and for revalidation, is maximised.
A research on the positioning technology of vehicle navigation system from single source to "ASPN"
NASA Astrophysics Data System (ADS)
Zhang, Jing; Li, Haizhou; Chen, Yu; Chen, Hongyue; Sun, Qian
2017-10-01
Due to the suddenness and complexity of modern warfare, land-based weapon systems need precision strike capability on roads and railways. The vehicle navigation system is one of the most important pieces of equipment for land-based weapon systems with precision strike capability. Single-source navigation systems have inherent shortcomings in providing continuous and stable navigation information; to overcome them, multi-source positioning technology has been developed. The All Source Positioning and Navigation (ASPN) program was proposed in 2010, which seeks to enable low-cost, robust, and seamless navigation solutions for military use on any operational platform and in any environment, with or without GPS. The development trend of vehicle positioning technology is reviewed in this paper. The trend indicates that positioning technology is developing from single-source and multi-source approaches to ASPN. Data fusion techniques based on multi-source and ASPN approaches are analyzed in detail.
Multisource geological data mining and its utilization of uranium resources exploration
NASA Astrophysics Data System (ADS)
Zhang, Jie-lin
2009-10-01
Nuclear energy, as a clean energy source, plays an important role in China's economic development, and according to the national long-term development strategy, many more nuclear power plants will be built in the next few years, so uranium resources exploration faces a great challenge. Research and practice in mineral exploration demonstrate that utilizing modern Earth Observation System (EOS) technology and developing new multi-source geological data mining methods are effective approaches to uranium deposit prospecting. Based on data mining and knowledge discovery technology, this paper uses multi-source geological data to characterize the electromagnetic spectral, geophysical, and spatial information of uranium mineralization factors, and provides technical support for uranium prospecting, integrated with field remote sensing geological survey. The multi-source geological data used in this paper include satellite hyperspectral imagery (Hyperion), high-spatial-resolution remote sensing data, uranium geological information, airborne radiometric data, and aeromagnetic and gravity data, and related data mining methods have been developed, such as fusion of optical data and Radarsat imagery and integration of remote sensing and geophysical data. Based on the above approaches, multi-geoscience information on uranium mineralization factors, including complex polystage rock masses, mineralization-controlling faults, and hydrothermal alterations, has been identified; the metallogenic potential of uranium has been evaluated; and some predicted target areas have been located.
NASA Astrophysics Data System (ADS)
Prasad, S.; Bruce, L. M.
2007-04-01
There is a growing interest in using multiple sources for automatic target recognition (ATR) applications. One approach is to take multiple, independent observations of a phenomenon and perform a feature level or a decision level fusion for ATR. This paper proposes a method to utilize these types of multi-source fusion techniques to exploit hyperspectral data when only a small number of training pixels are available. Conventional hyperspectral image based ATR techniques project the high dimensional reflectance signature onto a lower dimensional subspace using techniques such as Principal Components Analysis (PCA), Fisher's linear discriminant analysis (LDA), subspace LDA and stepwise LDA. While some of these techniques attempt to solve the curse of dimensionality, or small sample size problem, these are not necessarily optimal projections. In this paper, we present a divide and conquer approach to address the small sample size problem. The hyperspectral space is partitioned into contiguous subspaces such that the discriminative information within each subspace is maximized, and the statistical dependence between subspaces is minimized. We then treat each subspace as a separate source in a multi-source multi-classifier setup and test various decision fusion schemes to determine their efficacy. Unlike previous approaches which use correlation between variables for band grouping, we study the efficacy of higher order statistical information (using average mutual information) for a bottom up band grouping. We also propose a confidence measure based decision fusion technique, where the weights associated with various classifiers are based on their confidence in recognizing the training data. To this end, training accuracies of all classifiers are used for weight assignment in the fusion process of test pixels. The proposed methods are tested using hyperspectral data with known ground truth, such that the efficacy can be quantitatively measured in terms of target recognition accuracies.
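The band-grouping and confidence-weighted fusion ideas above can be sketched as follows: adjacent bands are merged while their estimated mutual information stays high, one LDA classifier is trained per group, and the posteriors are fused with weights taken from each classifier's cross-validated training accuracy. The threshold, bin count, and toy data are assumptions for illustration.

```python
import numpy as np
from sklearn.metrics import mutual_info_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

def group_bands(X, threshold=0.4, bins=16):
    """Bottom-up grouping of contiguous bands: extend the current group while
    the estimated mutual information with the previous band stays high."""
    def mi(a, b):
        da = np.digitize(a, np.histogram_bin_edges(a, bins))
        db = np.digitize(b, np.histogram_bin_edges(b, bins))
        return mutual_info_score(da, db)
    groups, current = [], [0]
    for b in range(1, X.shape[1]):
        if mi(X[:, b - 1], X[:, b]) >= threshold:
            current.append(b)
        else:
            groups.append(current); current = [b]
    groups.append(current)
    return groups

def confidence_weighted_fusion(X_train, y_train, X_test, groups):
    """Train one LDA per subspace; fuse posteriors weighted by each
    classifier's cross-validated training accuracy (its confidence)."""
    fused = 0.0
    for g in groups:
        clf = LinearDiscriminantAnalysis().fit(X_train[:, g], y_train)
        w = cross_val_score(clf, X_train[:, g], y_train, cv=3).mean()
        fused = fused + w * clf.predict_proba(X_test[:, g])
    return np.asarray(fused).argmax(axis=1)

rng = np.random.default_rng(0)
y = np.repeat([0, 1], 40)
X = rng.normal(size=(80, 30)) + y[:, None] * np.linspace(0, 1, 30)  # toy "hyperspectral" samples
groups = group_bands(X)
idx = rng.permutation(80)
train, test = idx[:60], idx[60:]
pred = confidence_weighted_fusion(X[train], y[train], X[test], groups)
print("group sizes:", [len(g) for g in groups], " test accuracy:", (pred == y[test]).mean())
```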
NASA Astrophysics Data System (ADS)
Lin, Yueguan; Wang, Wei; Wen, Qi; Huang, He; Lin, Jingli; Zhang, Wei
2015-12-01
The Ms 8.0 Wenchuan earthquake that occurred on May 12, 2008 brought huge casualties and property losses to the Chinese people, and Beichuan County was destroyed in the earthquake. In order to leave a site for commemoration and for earthquake science education and research, the Beichuan National Earthquake Ruins Museum has been built on the ruins of the Beichuan county seat. Based on the demands for digital preservation of the earthquake ruins park and for collecting earthquake damage assessment research data, we set up a data set of the Beichuan National Earthquake Ruins Museum, including satellite remote sensing images, airborne remote sensing images, ground photogrammetry data, and ground-acquired data. At the same time, in order to better serve earthquake science research, we design the sharing ideas and schemes for this scientific data set.
Li, Hao; Zhang, Gaofei; Ma, Rui; You, Zheng
2014-01-01
An effective multisource energy harvesting system is presented as power supply for wireless sensor nodes (WSNs). The advanced system contains not only an expandable power management module including control of the charging and discharging process of the lithium polymer battery but also an energy harvesting system using the maximum power point tracking (MPPT) circuit with analog driving scheme for the collection of both solar and vibration energy sources. Since the MPPT and the power management module are utilized, the system is able to effectively achieve a low power consumption. Furthermore, a super capacitor is integrated in the system so that current fluctuations of the lithium polymer battery during the charging and discharging processes can be properly reduced. In addition, through a simple analog switch circuit with low power consumption, the proposed system can successfully switch the power supply path according to the ambient energy sources and load power automatically. A practical WSNs platform shows that efficiency of the energy harvesting system can reach about 75-85% through the 24-hour environmental test, which confirms that the proposed system can be used as a long-term continuous power supply for WSNs.
Li, Hao; Zhang, Gaofei; Ma, Rui; You, Zheng
2014-01-01
An effective multisource energy harvesting system is presented as power supply for wireless sensor nodes (WSNs). The advanced system contains not only an expandable power management module including control of the charging and discharging process of the lithium polymer battery but also an energy harvesting system using the maximum power point tracking (MPPT) circuit with analog driving scheme for the collection of both solar and vibration energy sources. Since the MPPT and the power management module are utilized, the system is able to effectively achieve a low power consumption. Furthermore, a super capacitor is integrated in the system so that current fluctuations of the lithium polymer battery during the charging and discharging processes can be properly reduced. In addition, through a simple analog switch circuit with low power consumption, the proposed system can successfully switch the power supply path according to the ambient energy sources and load power automatically. A practical WSNs platform shows that efficiency of the energy harvesting system can reach about 75–85% through the 24-hour environmental test, which confirms that the proposed system can be used as a long-term continuous power supply for WSNs. PMID:25032233
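The harvester above relies on an MPPT circuit. As background, the sketch below implements the classic perturb-and-observe MPPT rule in software against a toy source whose power peaks near 4.2 V; it is illustrative only and is not the analog driving scheme the authors describe.

```python
def perturb_and_observe(read_voltage, read_current, set_voltage,
                        v_start=3.0, step=0.05, n_steps=100):
    """Classic perturb-and-observe MPPT: nudge the operating voltage and keep
    the perturbation direction whenever output power increases."""
    v = v_start
    set_voltage(v)
    last_power = read_voltage() * read_current()
    direction = +1
    for _ in range(n_steps):
        v += direction * step
        set_voltage(v)
        power = read_voltage() * read_current()
        if power < last_power:          # power dropped: reverse the perturbation
            direction = -direction
        last_power = power
    return v

# Toy source for demonstration: power P(v) = v * I(v) peaks near v = 4.2 V
state = {"v": 0.0}
def set_voltage(v): state["v"] = v
def read_voltage(): return state["v"]
def read_current(): return max(0.0, (8.4 - state["v"]) / 4.0)

print("tracked voltage ~",
      round(perturb_and_observe(read_voltage, read_current, set_voltage), 2))
```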
Tang, Jun; Yao, Yibin; Zhang, Liang; Kong, Jian
2015-01-01
The insufficiency of data is the essential reason for the ill-posed problem in the computerized ionospheric tomography (CIT) technique. Therefore, a method of integrating multi-source data is proposed. Currently, multiple satellite navigation systems and various ionospheric observing instruments provide abundant data which can be employed to reconstruct ionospheric electron density (IED). In order to improve the vertical resolution of IED, we investigate IED reconstruction by integrating ground-based GPS data, occultation data from LEO satellites, satellite altimetry data from Jason-1 and Jason-2, and ionosonde data. We compared the CIT results with incoherent scatter radar (ISR) observations and found that the multi-source data fusion was effective and reliable for reconstructing electron density, showing its superiority over CIT with GPS data alone. PMID:26266764
Tang, Jun; Yao, Yibin; Zhang, Liang; Kong, Jian
2015-08-12
The insufficiency of data is the essential reason for the ill-posed problem in the computerized ionospheric tomography (CIT) technique. Therefore, a method of integrating multi-source data is proposed. Currently, multiple satellite navigation systems and various ionospheric observing instruments provide abundant data which can be employed to reconstruct ionospheric electron density (IED). In order to improve the vertical resolution of IED, we investigate IED reconstruction by integrating ground-based GPS data, occultation data from LEO satellites, satellite altimetry data from Jason-1 and Jason-2, and ionosonde data. We compared the CIT results with incoherent scatter radar (ISR) observations and found that the multi-source data fusion was effective and reliable for reconstructing electron density, showing its superiority over CIT with GPS data alone.
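A minimal sketch of how stacking multi-source observations mitigates the ill-posedness: geometry matrices for several hypothetical data types are stacked and the electron-density profile is recovered by Tikhonov-regularized least squares. The matrices, noise level, and regularization weight are illustrative, not the CIT system actually used.

```python
import numpy as np

rng = np.random.default_rng(0)
n_vox = 50                                   # electron-density voxels along a profile
x_true = np.exp(-((np.arange(n_vox) - 25) / 8.0) ** 2)   # synthetic IED profile

# Stand-in geometry matrices: rows are integration paths (GPS slant TEC,
# occultation, altimetry) or direct samples (ionosonde); values illustrative.
A_gps = rng.uniform(0, 1, (30, n_vox))
A_occ = rng.uniform(0, 1, (10, n_vox))
A_iono = np.eye(n_vox)[::5]                  # ionosonde constrains every 5th voxel

A = np.vstack([A_gps, A_occ, A_iono])        # fuse the multi-source observations
b = A @ x_true + 0.01 * rng.normal(size=A.shape[0])

# Tikhonov-regularized least squares tackles the ill-posed inversion:
# minimize ||Ax - b||^2 + lam ||x||^2
lam = 0.1
x_est = np.linalg.solve(A.T @ A + lam * np.eye(n_vox), A.T @ b)
print("relative reconstruction error:",
      float(np.linalg.norm(x_est - x_true) / np.linalg.norm(x_true)))
```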
Full Waveform Inversion with Multisource Frequency Selection of Marine Streamer Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Huang, Yunsong; Schuster, Gerard T.
The theory and practice of multisource full waveform inversion of marine supergathers are described with a frequency-selection strategy. The key enabling property of frequency selection is that it eliminates the crosstalk among sources, thus overcoming the aperture mismatch of marine multisource inversion. Tests on multisource full waveform inversion of synthetic marine data and Gulf of Mexico data show speedups of 4× and 8×, respectively, compared to conventional full waveform inversion.
Full Waveform Inversion with Multisource Frequency Selection of Marine Streamer Data
Huang, Yunsong; Schuster, Gerard T.
2017-10-26
The theory and practice of multisource full waveform inversion of marine supergathers are described with a frequency-selection strategy. The key enabling property of frequency selection is that it eliminates the crosstalk among sources, thus overcoming the aperture mismatch of marine multisource inversion. Tests on multisource full waveform inversion of synthetic marine data and Gulf of Mexico data show speedups of 4× and 8×, respectively, compared to conventional full waveform inversion.
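The frequency-selection idea above can be illustrated with a toy encoding: each source in the supergather is restricted to its own frequency bin, so sources can be recovered without crosstalk. The trace lengths, sampling interval, and assigned frequencies below are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n_sources, n_t, dt = 4, 512, 0.004
shots = rng.normal(size=(n_sources, n_t))        # stand-in single-shot records

freqs = np.fft.rfftfreq(n_t, dt)
assigned = [5.0, 7.0, 9.0, 11.0]                 # one unique frequency (Hz) per source

def bin_mask(f0):
    """Keep only the frequency bin closest to f0."""
    m = np.zeros_like(freqs)
    m[np.argmin(np.abs(freqs - f0))] = 1.0
    return m

# Encode: restrict each source to its assigned bin, then stack into one
# supergather; disjoint bins mean no crosstalk between sources.
encoded = [np.fft.irfft(np.fft.rfft(s) * bin_mask(f0), n=n_t)
           for s, f0 in zip(shots, assigned)]
supergather = np.sum(encoded, axis=0)

# Decode source 0 from the supergather and compare with its encoded record
decoded0 = np.fft.irfft(np.fft.rfft(supergather) * bin_mask(assigned[0]), n=n_t)
print("crosstalk (should be ~0):", float(np.max(np.abs(decoded0 - encoded[0]))))
```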
NASA Astrophysics Data System (ADS)
Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.
2014-04-01
Geospatial data resources are the foundation of the construction of a geo portal, which is designed to provide online geoinformation services for government, enterprises, and the public. It is vital to keep geospatial data fresh, accurate, and comprehensive in order to satisfy the requirements of application and development of geographic location, route navigation, geo search, and so on. One of the major problems we are facing is data acquisition. For us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach to integrating multi-source geospatial data with different data models, structures, and formats, which provided effective technical support for the construction of the National Geospatial Information Service Platform of China (NGISP). NGISP is China's official geo portal and provides online geoinformation services via the internet, the e-government network, and a classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial, and municipal. The geospatial data therefore come from these nodes, and the different datasets are heterogeneous. According to the results of analysis of the heterogeneous datasets, the first step is to define the basic principles of data fusion, covering the following aspects: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. Then the technical procedure is investigated, and a method for processing different categories of features, such as roads, railways, boundaries, rivers, settlements, and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrated the applicability of the principles, procedure, and method of multi-source geospatial data integration.
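As an illustration of how the fusion principles above might drive feature selection when the same road or river appears in several source nodes, the sketch below scores candidate features against a reference using location precision, up-to-date state, attribute completeness, and geometric detail (spatial-relationship checks are omitted); all weights, fields, and values are hypothetical, not NGISP's actual rules.

```python
from math import hypot

def score_candidate(feature, reference, w=(0.4, 0.2, 0.2, 0.2)):
    """Score one candidate feature from a provincial/municipal node against a
    reference, following four of the five fusion principles; the weights,
    fields, and scoring functions are illustrative."""
    loc = 1.0 / (1.0 + hypot(feature["x"] - reference["x"], feature["y"] - reference["y"]))
    fresh = 1.0 if feature["year"] >= reference["year"] else 0.5
    attrs = len([v for v in feature["attrs"].values() if v]) / max(len(feature["attrs"]), 1)
    geom = min(feature["n_vertices"] / max(reference["n_vertices"], 1), 1.0)
    return w[0] * loc + w[1] * fresh + w[2] * attrs + w[3] * geom

reference = {"x": 0.1, "y": 0.1, "year": 2010,
             "attrs": {"name": "G104", "lanes": None}, "n_vertices": 35}
national = {"x": 0.0, "y": 0.0, "year": 2012,
            "attrs": {"name": "G104", "lanes": None}, "n_vertices": 40}
provincial = {"x": 0.5, "y": 0.3, "year": 2014,
              "attrs": {"name": "G104", "lanes": 4}, "n_vertices": 120}

best = max([national, provincial], key=lambda f: score_candidate(f, reference))
print("selected source feature:", best["attrs"], best["year"])
```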
Multisource feedback to graduate nurses: a multimethod study.
McPhee, Samantha; Phillips, Nicole M; Ockerby, Cherene; Hutchinson, Alison M
2017-11-01
(1) To explore graduate nurses' perceptions of the influence of multisource feedback on their performance and (2) to explore perceptions of Clinical Nurse Educators involved in providing feedback regarding feasibility and benefit of the approach. Graduate registered nurses are expected to provide high-quality care for patients in demanding and unpredictable clinical environments. Receiving feedback is essential to their development. Performance appraisals are a common method used to provide feedback and typically involve a single source of feedback. Alternatively, multisource feedback allows the learner to gain insight into performance from a variety of perspectives. This study explores multisource feedback in an Australian setting within the graduate nurse context. Multimethod study. Eleven graduates were given structured performance feedback from four raters: Nurse Unit Manager, Clinical Nurse Educator, preceptor and a self-appraisal. Thirteen graduates received standard single-rater appraisals. Data regarding perceptions of feedback for both groups were obtained using a questionnaire. Semistructured interviews were conducted with nurses who received multisource feedback and the educators. In total, 94% (n = 15) of survey respondents perceived feedback was important during the graduate year. Four themes emerged from interviews: informal feedback, appropriateness of raters, elements of delivery and creating an appraisal process that is 'more real'. Multisource feedback was perceived as more beneficial compared to single-rater feedback. Educators saw value in multisource feedback; however, perceived barriers were engaging raters and collating feedback. Some evidence exists to indicate that feedback from multiple sources is valued by graduates. Further research in a larger sample and with more experienced nurses is required. Evidence resulting from this study indicates that multisource feedback is valued by both graduates and educators and informs graduates' development and transition into the role. Thus, a multisource approach to feedback for graduate nurses should be considered. © 2016 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Hortos, William S.
2008-04-01
Proposed distributed wavelet-based algorithms are a means to compress sensor data received at the nodes forming a wireless sensor network (WSN) by exchanging information between neighboring sensor nodes. Local collaboration among nodes compacts the measurements, yielding a reduced fused set with equivalent information at far fewer nodes. Nodes may be equipped with multiple sensor types, each capable of sensing distinct phenomena: thermal, humidity, chemical, voltage, or image signals with low or no frequency content as well as audio, seismic or video signals within defined frequency ranges. Compression of the multi-source data through wavelet-based methods, distributed at active nodes, reduces downstream processing and storage requirements along the paths to sink nodes; it also enables noise suppression and more energy-efficient query routing within the WSN. Targets are first detected by the multiple sensors; then wavelet compression and data fusion are applied to the target returns, followed by feature extraction from the reduced data; feature data are input to target recognition/classification routines; targets are tracked during their sojourns through the area monitored by the WSN. Algorithms to perform these tasks are implemented in a distributed manner, based on a partition of the WSN into clusters of nodes. In this work, a scheme of collaborative processing is applied for hierarchical data aggregation and decorrelation, based on the sensor data itself and any redundant information, enabled by a distributed, in-cluster wavelet transform with lifting that allows multiple levels of resolution. The wavelet-based compression algorithm significantly decreases RF bandwidth and other resource use in target processing tasks. Following wavelet compression, features are extracted. The objective of feature extraction is to maximize the probabilities of correct target classification based on multi-source sensor measurements, while minimizing the resource expenditures at participating nodes. Therefore, the feature-extraction method based on the Haar DWT is presented that employs a maximum-entropy measure to determine significant wavelet coefficients. Features are formed by calculating the energy of coefficients grouped around the competing clusters. A DWT-based feature extraction algorithm used for vehicle classification in WSNs can be enhanced by an added rule for selecting the optimal number of resolution levels to improve the correct classification rate and reduce energy consumption expended in local algorithm computations. Published field trial data for vehicular ground targets, measured with multiple sensor types, are used to evaluate the wavelet-assisted algorithms. Extracted features are used in established target recognition routines, e.g., the Bayesian minimum-error-rate classifier, to compare the effects on the classification performance of the wavelet compression. Simulations of feature sets and recognition routines at different resolution levels in target scenarios indicate the impact on classification rates, while formulas are provided to estimate reduction in resource use due to distributed compression.
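A minimal sketch of the Haar DWT energy features described above, assuming the PyWavelets (pywt) package: the signal is decomposed to a few resolution levels, per-level coefficient energies form the feature vector, and an entropy measure over the energy distribution can guide the choice of the number of levels. The synthetic signal stands in for a real vehicle signature.

```python
import numpy as np
import pywt   # assumes the PyWavelets package

fs = 1024.0
t = np.arange(0, 1.0, 1.0 / fs)
# Synthetic acoustic/seismic return standing in for a vehicle signature
signal = (np.sin(2 * np.pi * 60 * t) + 0.4 * np.sin(2 * np.pi * 150 * t)
          + 0.2 * np.random.default_rng(0).normal(size=t.size))

levels = 4
coeffs = pywt.wavedec(signal, "haar", level=levels)   # [cA4, cD4, cD3, cD2, cD1]

# Feature vector: normalized energy of the coefficients at each resolution level
energies = np.array([np.sum(c ** 2) for c in coeffs])
features = energies / energies.sum()

# Entropy of the energy distribution (cf. the maximum-entropy criterion above)
entropy = -np.sum(features * np.log2(features + 1e-12))
print("energy features:", np.round(features, 3), " entropy:", round(float(entropy), 3))
```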
Multisource Feedback in the Ambulatory Setting
Warm, Eric J.; Schauer, Daniel; Revis, Brian; Boex, James R.
2010-01-01
Background The Accreditation Council for Graduate Medical Education has mandated multisource feedback (MSF) in the ambulatory setting for internal medicine residents. Few published reports demonstrate actual MSF results for a residency class, and fewer still include clinical quality measures and knowledge-based testing performance in the data set. Methods Residents participating in a year-long group practice experience called the “long-block” received MSF that included self, peer, staff, attending physician, and patient evaluations, as well as concomitant clinical quality data and knowledge-based testing scores. Residents were given a rank for each data point compared with peers in the class, and these data were reviewed with the chief resident and program director over the course of the long-block. Results Multisource feedback identified residents who performed well on most measures compared with their peers (10%), residents who performed poorly on most measures compared with their peers (10%), and residents who performed well on some measures and poorly on others (80%). Each high-, intermediate-, and low-performing resident had at least one aspect of the MSF that was significantly lower than the others, and this served as the basis of formative feedback during the long-block. Conclusion Use of multisource feedback in the ambulatory setting can identify high-, intermediate-, and low-performing residents and suggest specific formative feedback for each. More research needs to be done on the effect of such feedback, as well as the relationships between each of the components in the MSF data set. PMID:21975632
ERIC Educational Resources Information Center
Goldring, Ellen B.; Mavrogordato, Madeline; Haynes, Katherine Taylor
2015-01-01
Purpose: A relatively new approach to principal evaluation is the use of multisource feedback, which typically entails a leader's self-evaluation as well as parallel evaluations from subordinates, peers, and/or superiors. However, there is little research on how principals interact with evaluation data from multisource feedback systems. This…
Multisource feedback analysis of pediatric outpatient teaching
2013-01-01
Background This study aims to evaluate the outpatient communication skills of medical students via multisource feedback, which may be useful to map future directions in improving physician-patient communication. Methods Family respondents of patients, a nurse, a clinical teacher, and a research assistant evaluated video-recorded medical students’ interactions with outpatients by using multisource feedback questionnaires; students also assessed their own skills. The questionnaire was answered based on the video-recorded interactions between outpatients and the medical students. Results A total of 60 family respondents of the 60 patients completed the questionnaires, 58 (96.7%) of them agreed with the video recording. Two reasons for reluctance were “personal privacy” issues and “simply disagree” with the video recording. The average satisfaction score of the 58 students was 85.1 points, indicating students’ performance was in the category between satisfied and very satisfied. The family respondents were most satisfied with the “teacher's attitude”, followed by “teaching quality”. In contrast, the family respondents were least satisfied with “being open to questions”. Among the 6 assessment domains of communication skills, the students scored highest on “explaining” and lowest on “giving recommendations”. In the detailed assessment by family respondents, the students scored lowest on “asking about life/school burden”. In the multisource analysis, the nurses’ mean score was much higher and the students’ mean self-assessment score was lower than the average scores on all domains. Conclusion The willingness and satisfaction of family respondents were high in this study. Students scored the lowest on giving recommendations to patients. Multisource feedback with video recording is useful in providing more accurate evaluation of students’ communication competence and in identifying the areas of communication that require enhancement. PMID:24180615
Defect inspection in hot slab surface: multi-source CCD imaging based fuzzy-rough sets method
NASA Astrophysics Data System (ADS)
Zhao, Liming; Zhang, Yi; Xu, Xiaodong; Xiao, Hong; Huang, Chao
2016-09-01
To provide an accurate surface defect inspection method and make the automation of a robust image region of interest (ROI) delineation strategy a reality in the production line, a multi-source CCD imaging based fuzzy-rough sets method is proposed for hot slab surface quality assessment. The applicability of the presented method and the devised system is mainly tied to surface quality inspection for strip, billet, slab and similar products. In this work we take into account the complementary advantages of two common machine vision (MV) systems (line array CCD traditional scanning imaging (LS-imaging) and area array CCD laser three-dimensional (3D) scanning imaging (AL-imaging)). By establishing a fuzzy-rough sets model in the detection system, the seeds for relative fuzzy connectedness (RFC) delineation of the ROI can be placed adaptively; the model introduces upper and lower approximation sets for ROI definition, by which the boundary region can be delineated through the RFC region competitive classification mechanism. For the first time, a multi-source CCD imaging based fuzzy-rough sets strategy is attempted for CC-slab surface defect inspection, allowing AI algorithms and powerful ROI delineation strategies to be applied to the MV inspection field in an automatic way.
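For readers unfamiliar with the rough-set machinery behind the upper and lower approximation sets mentioned above, the toy sketch below (a deliberate simplification, not the authors' system) computes the approximations of a candidate ROI from equivalence classes of pixels; the boundary region is their set difference.

```python
def rough_approximations(equivalence_classes, target):
    """Lower/upper approximations of a target pixel set under a partition."""
    target = set(target)
    lower, upper = set(), set()
    for cls in equivalence_classes:
        cls = set(cls)
        if cls <= target:          # class certainly inside the ROI
            lower |= cls
        if cls & target:           # class possibly inside the ROI
            upper |= cls
    return lower, upper, upper - lower   # boundary region = upper \ lower

# Toy example: pixels 0..7 partitioned by a similarity relation
classes = [{0, 1}, {2, 3}, {4, 5}, {6, 7}]
roi_candidate = {1, 2, 3, 4}
lower, upper, boundary = rough_approximations(classes, roi_candidate)
print(lower, upper, boundary)   # {2, 3}  {0, 1, 2, 3, 4, 5}  {0, 1, 4, 5}
```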
Wang, Bao-Zhen; Chen, Zhi
2013-01-01
This article presents a GIS-based multi-source and multi-box modeling approach (GMSMB) to predict the spatial concentration distributions of airborne pollutants on local and regional scales. In this method, an extended multi-box model combined with a multi-source and multi-grid Gaussian model is developed within the GIS framework to examine the contributions from both point- and area-source emissions. By using GIS, a large amount of data required for air quality modeling, including emission sources, air quality monitoring, meteorological data, and spatial location information, is brought into an integrated modeling environment, allowing the spatial variation in source distribution and meteorological conditions to be analyzed quantitatively in more detail. The developed modeling approach has been examined by predicting the spatial concentration distributions of four air pollutants (CO, NO2, SO2 and PM2.5) for the State of California. The modeling results are compared with monitoring data. Good agreement was obtained, which demonstrates that the developed modeling approach can deliver an effective air pollution assessment on both regional and local scales to support air pollution control and management planning.
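A minimal sketch of the point-source building block that such a multi-source Gaussian model rests on (the standard steady-state Gaussian plume equation with ground reflection, not the authors' GMSMB code); the dispersion coefficients below are rough rural power-law assumptions.

```python
import numpy as np

def gaussian_plume(q, u, x, y, z, stack_height):
    """Ground-reflected Gaussian plume concentration (g/m^3).

    q: emission rate (g/s), u: wind speed (m/s),
    x, y, z: downwind, crosswind, vertical receptor coordinates (m).
    Sigma parameterisations are rough rural (neutral stability) assumptions.
    """
    sigma_y = 0.08 * x * (1 + 0.0001 * x) ** -0.5
    sigma_z = 0.06 * x * (1 + 0.0015 * x) ** -0.5
    coeff = q / (2.0 * np.pi * u * sigma_y * sigma_z)
    crosswind = np.exp(-y**2 / (2.0 * sigma_y**2))
    vertical = (np.exp(-(z - stack_height) ** 2 / (2.0 * sigma_z**2)) +
                np.exp(-(z + stack_height) ** 2 / (2.0 * sigma_z**2)))
    return coeff * crosswind * vertical

# Contribution of one point source at a receptor 1 km downwind, on the plume axis
print(gaussian_plume(q=50.0, u=3.0, x=1000.0, y=0.0, z=1.5, stack_height=30.0))
```

Summing such terms over all point sources, and adding the multi-box contribution for area sources, gives the total concentration at each GIS grid cell.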
NASA Astrophysics Data System (ADS)
Tang, Jian; Qiao, Junfei; Wu, ZhiWei; Chai, Tianyou; Zhang, Jian; Yu, Wen
2018-01-01
Frequency spectral data of mechanical vibration and acoustic signals relate to difficult-to-measure production quality and quantity parameters of complex industrial processes. A selective ensemble (SEN) algorithm can be used to build a soft sensor model of these process parameters by fusing valued information selectively from different perspectives. However, a combination of several optimized ensemble sub-models with SEN cannot guarantee the best prediction model. In this study, we use several techniques to construct a data-driven model of industrial process parameters from mechanical vibration and acoustic frequency spectra, based on the selective fusion of multi-condition samples and multi-source features. A multi-layer SEN (MLSEN) strategy is used to simulate the domain expert cognitive process. A genetic algorithm and kernel partial least squares are used to construct the inside-layer SEN sub-models, one for each mechanical vibration and acoustic frequency spectral feature subset. Branch-and-bound and adaptive weighted fusion algorithms are integrated to select and combine outputs of the inside-layer SEN sub-models. Then, the outside-layer SEN is constructed. Thus, "sub-sampling training examples"-based and "manipulating input features"-based ensemble construction methods are integrated, thereby realizing the selective information fusion process based on multi-condition history samples and multi-source input features. This novel approach is applied to a laboratory-scale ball mill grinding process. A comparison with other methods indicates that the proposed MLSEN approach effectively models mechanical vibration and acoustic signals.
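The outside-layer combination step can be pictured with the small sketch below: validation-error-based adaptive weights fuse the outputs of several already-trained sub-models. This is a generic illustration under assumed interfaces, not the MLSEN code.

```python
import numpy as np

def adaptive_weights(val_errors):
    """Weights inversely proportional to sub-model validation error (sum to 1)."""
    inv = 1.0 / (np.asarray(val_errors, dtype=float) + 1e-12)
    return inv / inv.sum()

def fuse_predictions(sub_model_outputs, weights):
    """Weighted fusion of the selected sub-model predictions."""
    return np.average(np.asarray(sub_model_outputs), axis=0, weights=weights)

# Three hypothetical sub-models predicting a mill-load parameter on two samples
outputs = [np.array([10.2, 11.0]), np.array([9.8, 10.6]), np.array([10.5, 11.4])]
w = adaptive_weights(val_errors=[0.8, 0.3, 1.1])   # the most accurate model dominates
print(w, fuse_predictions(outputs, w))
```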
LINKS: learning-based multi-source IntegratioN frameworK for Segmentation of infant brain images.
Wang, Li; Gao, Yaozong; Shi, Feng; Li, Gang; Gilmore, John H; Lin, Weili; Shen, Dinggang
2015-03-01
Segmentation of infant brain MR images is challenging due to insufficient image quality, severe partial volume effect, and ongoing maturation and myelination processes. In the first year of life, the image contrast between white and gray matters of the infant brain undergoes dramatic changes. In particular, the image contrast is inverted around 6-8 months of age, and the white and gray matter tissues are isointense in both T1- and T2-weighted MR images and thus exhibit the extremely low tissue contrast, which poses significant challenges for automated segmentation. Most previous studies used multi-atlas label fusion strategy, which has the limitation of equally treating the different available image modalities and is often computationally expensive. To cope with these limitations, in this paper, we propose a novel learning-based multi-source integration framework for segmentation of infant brain images. Specifically, we employ the random forest technique to effectively integrate features from multi-source images together for tissue segmentation. Here, the multi-source images include initially only the multi-modality (T1, T2 and FA) images and later also the iteratively estimated and refined tissue probability maps of gray matter, white matter, and cerebrospinal fluid. Experimental results on 119 infants show that the proposed method achieves better performance than other state-of-the-art automated segmentation methods. Further validation was performed on the MICCAI grand challenge and the proposed method was ranked top among all competing methods. Moreover, to alleviate the possible anatomical errors, our method can also be combined with an anatomically-constrained multi-atlas labeling approach for further improving the segmentation accuracy. Copyright © 2014 Elsevier Inc. All rights reserved.
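A heavily simplified sketch of the learning-based integration idea, using scikit-learn's random forest rather than the authors' implementation and synthetic volumes standing in for real data: per-voxel features from several co-registered modalities are stacked, a forest is trained on labelled voxels, and the predicted tissue probability maps could then be appended as extra features for a further iteration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def stack_voxel_features(*modalities):
    """Stack per-voxel intensities from several co-registered modalities."""
    return np.column_stack([m.reshape(-1) for m in modalities])

# Hypothetical co-registered T1, T2 and FA volumes plus a label volume (0=CSF, 1=GM, 2=WM)
rng = np.random.default_rng(0)
t1, t2, fa = (rng.random((16, 16, 16)) for _ in range(3))
labels = rng.integers(0, 3, size=(16, 16, 16))

X = stack_voxel_features(t1, t2, fa)
forest = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, labels.reshape(-1))

# Estimated tissue probability maps; in an iterative scheme these would be
# appended to X as additional "source" features for the next training round.
prob_maps = forest.predict_proba(X).T.reshape(3, 16, 16, 16)
print(prob_maps.shape)
```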
Dong, Yingying; Luo, Ruisen; Feng, Haikuan; Wang, Jihua; Zhao, Jinling; Zhu, Yining; Yang, Guijun
2014-01-01
Differences exist among analysis results of agriculture monitoring and crop production based on remote sensing observations, which are obtained at different spatial scales from multiple remote sensors in the same time period, and processed by the same algorithms, models or methods. These differences can be mainly quantitatively described from three aspects, i.e. multiple remote sensing observations, crop parameter estimation models, and spatial scale effects of surface parameters. Our research proposed a new method to analyse and correct the differences between multi-source and multi-scale spatial remote sensing surface reflectance datasets, aiming to provide references for further studies in agricultural applications with multiple remotely sensed observations from different sources. The new method was constructed on the basis of the physical and mathematical properties of multi-source and multi-scale reflectance datasets. Theories of statistics were used to extract statistical characteristics of the multiple surface reflectance datasets, and to further quantitatively analyse spatial variations of these characteristics at multiple spatial scales. Then, taking the surface reflectance at the small spatial scale as the baseline data, theories of Gaussian distribution were selected for correcting the multiple surface reflectance datasets based on the above obtained physical characteristics and mathematical distribution properties, and their spatial variations. The proposed method was verified with two sets of multiple satellite images, which were obtained in two experimental fields located in Inner Mongolia and Beijing, China, with different degrees of homogeneity of the underlying surfaces. Experimental results indicate that differences of surface reflectance datasets at multiple spatial scales could be effectively corrected over non-homogeneous underlying surfaces, which provides a database for further multi-source and multi-scale crop growth monitoring and yield prediction, and their corresponding consistency analysis and evaluation.
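The correction idea (taking the fine-scale reflectance statistics as the baseline and adjusting coarser datasets toward them under a Gaussian assumption) can be sketched as simple moment matching; this is an interpretation of the described procedure, not the authors' code, and the band values are synthetic.

```python
import numpy as np

def moment_match(coarse_reflectance, baseline_reflectance):
    """Rescale a coarse-scale reflectance band so its mean/std match the baseline.

    Assumes both bands are approximately Gaussian, as in the described method.
    """
    c_mu, c_sd = np.nanmean(coarse_reflectance), np.nanstd(coarse_reflectance)
    b_mu, b_sd = np.nanmean(baseline_reflectance), np.nanstd(baseline_reflectance)
    return (coarse_reflectance - c_mu) * (b_sd / max(c_sd, 1e-12)) + b_mu

# Hypothetical red-band reflectances at two spatial scales
fine = np.clip(np.random.normal(0.12, 0.02, 10000), 0, 1)
coarse = np.clip(np.random.normal(0.15, 0.03, 2500), 0, 1)
corrected = moment_match(coarse, fine)
print(corrected.mean(), corrected.std())   # ~0.12, ~0.02
```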
Impact of workplace based assessment on doctors' education and performance: a systematic review.
Miller, Alice; Archer, Julian
2010-09-24
To investigate the literature for evidence that workplace based assessment affects doctors' education and performance. Systematic review. The primary data sources were the databases Journals@Ovid, Medline, Embase, CINAHL, PsycINFO, and ERIC. Evidence based reviews (Bandolier, Cochrane Library, DARE, HTA Database, and NHS EED) were accessed and searched via the Health Information Resources website. Reference lists of relevant studies and bibliographies of review articles were also searched. Review methods Studies of any design that attempted to evaluate either the educational impact of workplace based assessment, or the effect of workplace based assessment on doctors' performance, were included. Studies were excluded if the sampled population was non-medical or the study was performed with medical students. Review articles, commentaries, and letters were also excluded. The final exclusion criterion was the use of simulated patients or models rather than real life clinical encounters. Sixteen studies were included. Fifteen of these were non-comparative descriptive or observational studies; the other was a randomised controlled trial. Study quality was mixed. Eight studies examined multisource feedback with mixed results; most doctors felt that multisource feedback had educational value, although the evidence for practice change was conflicting. Some junior doctors and surgeons displayed little willingness to change in response to multisource feedback, whereas family physicians might be more prepared to initiate change. Performance changes were more likely to occur when feedback was credible and accurate or when coaching was provided to help subjects identify their strengths and weaknesses. Four studies examined the mini-clinical evaluation exercise, one looked at direct observation of procedural skills, and three were concerned with multiple assessment methods: all these studies reported positive results for the educational impact of workplace based assessment tools. However, there was no objective evidence of improved performance with these tools. Considering the emphasis placed on workplace based assessment as a method of formative performance assessment, there are few published articles exploring its impact on doctors' education and performance. This review shows that multisource feedback can lead to performance improvement, although individual factors, the context of the feedback, and the presence of facilitation have a profound effect on the response. There is no evidence that alternative workplace based assessment tools (mini-clinical evaluation exercise, direct observation of procedural skills, and case based discussion) lead to improvement in performance, although subjective reports on their educational impact are positive.
Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M
2017-02-01
Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variabilities when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions, and defined in the context of data quality assessment. Specifically, a global probabilistic deviation metric and a source probabilistic outlyingness metric are proposed. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the sources' PDFs. The metrics have been evaluated and demonstrated their correct behaviour on a simulated benchmark and with real multi-source biomedical data using the UCI Heart Disease data set. The biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
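To make the construction concrete, the sketch below computes the pairwise Jensen-Shannon distance matrix among the sources' PDFs and, as a crude stand-in for the proposed metrics, the distance of each source to the pooled distribution; the simplex projection of the paper is not reproduced here, and the hospital data are invented.

```python
import numpy as np
from scipy.spatial.distance import jensenshannon

def js_distance_matrix(pdfs):
    """Pairwise Jensen-Shannon distances among discrete source PDFs."""
    n = len(pdfs)
    d = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            d[i, j] = d[j, i] = jensenshannon(pdfs[i], pdfs[j], base=2)
    return d

def source_outlyingness(pdfs):
    """JS distance of each source to the pooled distribution (rough central reference)."""
    pooled = np.mean(pdfs, axis=0)
    return np.array([jensenshannon(p, pooled, base=2) for p in pdfs])

# Three hypothetical hospitals reporting the same categorical variable
pdfs = np.array([[0.20, 0.50, 0.30],
                 [0.25, 0.45, 0.30],
                 [0.60, 0.20, 0.20]])   # the third source deviates
print(js_distance_matrix(pdfs))
print(source_outlyingness(pdfs))        # largest value flags the third source
```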
Objected-oriented remote sensing image classification method based on geographic ontology model
NASA Astrophysics Data System (ADS)
Chu, Z.; Liu, Z. J.; Gu, H. Y.
2016-11-01
Nowadays, with the development of high-resolution remote sensing imagery and the wide application of laser point cloud data, object-oriented remote sensing classification based on the characteristic knowledge of multi-source spatial data has become an important trend in the field of remote sensing image classification, gradually replacing the traditional approach of improving algorithms to optimize classification results. For this purpose, the paper puts forward a remote sensing image classification method that uses the characteristic knowledge of multi-source spatial data to build a geographic ontology semantic network model, and carries out an object-oriented classification experiment for urban feature classification. The experiment uses the Protégé software developed by Stanford University in the United States and the intelligent image analysis software eCognition as the experimental platform, with hyperspectral imagery and Lidar data obtained through flight over DaFeng City, JiangSu, as the main data sources. First, the hyperspectral image is used to obtain feature knowledge of the remote sensing image and related special indices; second, the Lidar data are used to generate an nDSM (Normalized DSM, Normalized Digital Surface Model) providing elevation information; finally, the image feature knowledge, special indices and elevation information are used to build the geographic ontology semantic network model that implements urban feature classification. The experimental results show that this method achieves significantly higher classification accuracy than traditional classification algorithms, and performs especially well for building classification. The method not only exploits the advantages of multi-source spatial data such as remote sensing imagery and Lidar data, but also realizes the integration of multi-source spatial data knowledge and its application to remote sensing image classification, providing an effective way forward for object-oriented remote sensing image classification.
NASA Astrophysics Data System (ADS)
Wang, Feiyan; Morten, Jan Petter; Spitzer, Klaus
2018-05-01
In this paper, we present a recently developed anisotropic 3-D inversion framework for interpreting controlled-source electromagnetic (CSEM) data in the frequency domain. The framework integrates a high-order finite-element forward operator and a Gauss-Newton inversion algorithm. Conductivity constraints are applied using a parameter transformation. We discretize the continuous forward and inverse problems on unstructured grids for a flexible treatment of arbitrarily complex geometries. Moreover, an unstructured mesh is more desirable in comparison to a single rectilinear mesh for multisource problems because local grid refinement will not significantly influence the mesh density outside the region of interest. The non-uniform spatial discretization facilitates parametrization of the inversion domain at a suitable scale. For a rapid simulation of multisource EM data, we opt to use a parallel direct solver. We further accelerate the inversion process by decomposing the entire data set into subsets with respect to frequencies (and transmitters if memory requirement is affordable). The computational tasks associated with each data subset are distributed to different processes and run in parallel. We validate the scheme using a synthetic marine CSEM model with rough bathymetry, and finally, apply it to an industrial-size 3-D data set from the Troll field oil province in the North Sea acquired in 2008 to examine its robustness and practical applicability.
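One common way to apply conductivity constraints through a parameter transformation (offered only as an assumed example of the general idea, not necessarily the transform used in this paper) is a logit-style mapping that keeps sigma strictly inside [sigma_min, sigma_max] while the Gauss-Newton update acts on the unbounded variable m.

```python
import numpy as np

def to_unbounded(sigma, s_min, s_max):
    """Map a bounded conductivity to an unbounded inversion parameter m."""
    return np.log((sigma - s_min) / (s_max - sigma))

def to_bounded(m, s_min, s_max):
    """Inverse map: any real m yields a conductivity strictly inside (s_min, s_max)."""
    return s_min + (s_max - s_min) / (1.0 + np.exp(-m))

# Round trip for a sediment conductivity of 1.0 S/m constrained to [0.01, 5] S/m
s_min, s_max = 0.01, 5.0
m = to_unbounded(1.0, s_min, s_max)
print(m, to_bounded(m + 0.3, s_min, s_max))   # a Gauss-Newton step in m stays feasible
```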
NASA Technical Reports Server (NTRS)
Kim, H.; Swain, P. H.
1991-01-01
A method of classifying multisource data in remote sensing is presented. The proposed method considers each data source as an information source providing a body of evidence, represents statistical evidence by interval-valued probabilities, and uses Dempster's rule to integrate information based on multiple data sources. The method is applied to the problems of ground-cover classification of multispectral data combined with digital terrain data such as elevation, slope, and aspect. Then this method is applied to simulated 201-band High Resolution Imaging Spectrometer (HIRIS) data by dividing the dimensionally huge data source into smaller and more manageable pieces based on the global statistical correlation information. It produces higher classification accuracy than the Maximum Likelihood (ML) classification method when the Hughes phenomenon is apparent.
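A minimal implementation of Dempster's rule of combination for two bodies of evidence over a common frame of discernment, illustrating the integration step described above (the interval-valued extension of the paper is not shown, and the ground-cover masses are invented).

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two mass functions given as {frozenset: mass} dictionaries."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                      # mass assigned to disjoint sets
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Two sources assigning mass over ground-cover classes {forest, water}
frame = frozenset({"forest", "water"})
spectral = {frozenset({"forest"}): 0.6, frame: 0.4}
terrain  = {frozenset({"forest"}): 0.3, frozenset({"water"}): 0.4, frame: 0.3}
print(dempster_combine(spectral, terrain))   # forest ~0.63, water ~0.21, frame ~0.16
```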
Mercury⊕: An evidential reasoning image classifier
NASA Astrophysics Data System (ADS)
Peddle, Derek R.
1995-12-01
MERCURY⊕ is a multisource evidential reasoning classification software system based on the Dempster-Shafer theory of evidence. The design and implementation of this software package are described for improving the classification and analysis of multisource digital image data necessary for addressing advanced environmental and geoscience applications. In the remote-sensing context, the approach provides a more appropriate framework for classifying modern, multisource, and ancillary data sets which may contain a large number of disparate variables with different statistical properties, scales of measurement, and levels of error which cannot be handled using conventional Bayesian approaches. The software uses a nonparametric, supervised approach to classification, and provides a more objective and flexible interface to the evidential reasoning framework using a frequency-based method for computing support values from training data. The MERCURY⊕ software package has been implemented efficiently in the C programming language, with extensive use made of dynamic memory allocation procedures and compound linked list and hash-table data structures to optimize the storage and retrieval of evidence in a Knowledge Look-up Table. The software is complete with a full user interface and runs under Unix, Ultrix, VAX/VMS, MS-DOS, and Apple Macintosh operating systems. An example of classifying alpine land cover and permafrost active layer depth in northern Canada is presented to illustrate the use and application of these ideas.
WE-DE-201-08: Multi-Source Rotating Shield Brachytherapy Apparatus for Prostate Cancer
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dadkhah, H; Wu, X; Kim, Y
Purpose: To introduce a novel multi-source rotating shield brachytherapy (RSBT) apparatus for the precise simultaneous angular and linear positioning of all partially-shielded 153Gd radiation sources in interstitial needles for treating prostate cancer. The mechanism is designed to lower the detrimental dose to healthy tissues, the urethra in particular, relative to conventional high-dose-rate brachytherapy (HDR-BT) techniques. Methods: Following needle implantation, the delivery system is docked to the patient template. Each needle is coupled to a multi-source afterloader catheter by a connector passing through a shaft. The shafts are rotated by translating a moving template between two stationary templates. Shaft walls as well as moving template holes are threaded such that the resistive friction produced between the two parts exerts enough force on the shafts to bring about the rotation. Rotation of the shaft is then transmitted to the shielded source via several keys. Thus, shaft angular position is fully correlated with the position of the moving template. The catheter angles are simultaneously incremented throughout treatment as needed, and only a single 360° rotation of all catheters is needed for a full treatment. For each rotation angle, source depth in each needle is controlled by a multi-source afterloader, which is proposed as an array of belt-driven linear actuators, each of which drives a source wire. Results: Optimized treatment plans based on Monte Carlo dose calculations demonstrated RSBT with the proposed apparatus reduced urethral D1cc below that of conventional HDR-BT by 35% for urethral dose gradient volume within 3 mm of the urethra surface. Treatment time to deliver 20 Gy with multi-source RSBT apparatus using nineteen 62.4 GBq 153Gd sources is 117 min. Conclusions: The proposed RSBT delivery apparatus in conjunction with multiple nitinol catheter-mounted platinum-shielded 153Gd sources enables a mechanically feasible urethra-sparing treatment technique for prostate cancer in a clinically reasonable timeframe.
Malling, Bente; Mortensen, Lene; Bonderup, Thomas; Scherpbier, Albert; Ringsted, Charlotte
2009-12-10
Leadership courses and multi-source feedback are widely used developmental tools for leaders in health care. On this background we aimed to study the additional effect of a leadership course following a multi-source feedback procedure compared to multi-source feedback alone, especially regarding development of leadership skills over time. Study participants were consultants responsible for postgraduate medical education at clinical departments. The study design was pre-post measures with an intervention and control group. The intervention was participation in a seven-day leadership course. Scores of multi-source feedback from the consultants responsible for education and respondents (heads of department, consultants and doctors in specialist training) were collected before and one year after the intervention and analysed using Mann-Whitney's U-test and multivariate analysis of variance. There were no differences in multi-source feedback scores at one year follow up compared to baseline measurements, either in the intervention or in the control group (p = 0.149). The study indicates that a leadership course following a MSF procedure compared to MSF alone does not improve leadership skills of consultants responsible for education in clinical departments. Developing leadership skills takes time and the time frame of one year might have been too short to show improvement in leadership skills of consultants responsible for education. Further studies are needed to investigate if other combinations of initiatives to develop leadership might have more impact in the clinical setting.
Zhang, J L; Li, Y P; Huang, G H; Baetz, B W; Liu, J
2017-06-01
In this study, a Bayesian estimation-based simulation-optimization modeling approach (BESMA) is developed for identifying effluent trading strategies. BESMA incorporates nutrient fate modeling with soil and water assessment tool (SWAT), Bayesian estimation, and probabilistic-possibilistic interval programming with fuzzy random coefficients (PPI-FRC) within a general framework. Based on the water quality protocols provided by SWAT, posterior distributions of parameters can be analyzed through Bayesian estimation; stochastic characteristic of nutrient loading can be investigated which provides the inputs for the decision making. PPI-FRC can address multiple uncertainties in the form of intervals with fuzzy random boundaries and the associated system risk through incorporating the concept of possibility and necessity measures. The possibility and necessity measures are suitable for optimistic and pessimistic decision making, respectively. BESMA is applied to a real case of effluent trading planning in the Xiangxihe watershed, China. A number of decision alternatives can be obtained under different trading ratios and treatment rates. The results can not only facilitate identification of optimal effluent-trading schemes, but also gain insight into the effects of trading ratio and treatment rate on decision making. The results also reveal that decision maker's preference towards risk would affect decision alternatives on trading scheme as well as system benefit. Compared with the conventional optimization methods, it is proved that BESMA is advantageous in (i) dealing with multiple uncertainties associated with randomness and fuzziness in effluent-trading planning within a multi-source, multi-reach and multi-period context; (ii) reflecting uncertainties existing in nutrient transport behaviors to improve the accuracy in water quality prediction; and (iii) supporting pessimistic and optimistic decision making for effluent trading as well as promoting diversity of decision alternatives. Copyright © 2017 Elsevier Ltd. All rights reserved.
Field Trials of the Multi-Source Approach for Resistivity and Induced Polarization Data Acquisition
NASA Astrophysics Data System (ADS)
LaBrecque, D. J.; Morelli, G.; Fischanger, F.; Lamoureux, P.; Brigham, R.
2013-12-01
Implementing systems of distributed receivers and transmitters for resistivity and induced polarization data is an almost inevitable result of the availability of wireless data communication modules and GPS modules offering precise timing and instrument locations. Such systems have a number of advantages; for example, they can be deployed around obstacles such as rivers, canyons, or mountains which would be difficult with traditional 'hard-wired' systems. However, deploying a system of identical, small, battery powered transceivers, each capable of injecting a known current and measuring the induced potential, has an additional and less obvious advantage in that multiple units can inject current simultaneously. The original purpose for using multiple simultaneous current sources (multi-source) was to increase signal levels. In traditional systems, to double the received signal you inject twice the current, which requires you to apply twice the voltage and thus four times the power. Alternatively, one approach to increasing signal levels for large-scale surveys collected using small, battery powered transceivers is to allow multiple units to transmit in parallel. In theory, using four 400 watt transmitters on separate, parallel dipoles yields roughly the same signal as a single 6400 watt transmitter. Furthermore, implementing the multi-source approach creates the opportunity to apply more complex current flow patterns than simple, parallel dipoles. For a perfect, noise-free system, the multi-source approach adds no new information to a data set that contains a comprehensive set of data collected using single sources. However, for realistic, noisy systems, it appears that multi-source data can substantially impact survey results. In preliminary model studies, the multi-source data produced such startling improvements in subsurface images that even the authors questioned their veracity. Between December of 2012 and July of 2013, we completed multi-source surveys at five sites with depths of exploration ranging from 150 to 450 m. The sites included shallow geothermal sites near Reno Nevada, Pomarance Italy, and Volterra Italy; a mineral exploration site near Timmins Quebec; and a landslide investigation near Vajont Dam in northern Italy. These sites provided a series of challenges in survey design and deployment including some extremely difficult terrain and a broad range of background resistivity and induced polarization values. Despite these challenges, comparison of multi-source results to resistivity and induced polarization data collected with more traditional methods supports the thesis that the multi-source approach is capable of providing substantial improvements in both depth of penetration and resolution over conventional approaches.
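The power argument above follows from the signal scaling with injected current while transmitter power scales with the square of current; a short worked version under the stated 400 W assumption:

```latex
% Signal is proportional to injected current I; power grows as I^2 (P = I^2 R).
% Four parallel 400 W transmitters each driving current I give a total signal of 4I
% for 4 x 400 W = 1600 W. A single transmitter producing the same 4I would need
% (4I)^2 R = 16 I^2 R, i.e. 16 x 400 W = 6400 W.
\[
  P_{\text{single}} = (4I)^{2}R = 16\,I^{2}R = 16 \times 400\,\mathrm{W} = 6400\,\mathrm{W},
  \qquad
  P_{\text{multi}} = 4 \times 400\,\mathrm{W} = 1600\,\mathrm{W}.
\]
```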
Kim, Sunghun; Sterling, Bobbie Sue; Latimer, Lara
2010-01-01
Developing focused and relevant health promotion interventions is critical for behavioral change in a low-resource or special population. Evidence-based interventions, however, may not match the specific population or health concern of interest. This article describes the Multi-Source Method (MSM) which, in combination with a workshop format, may be used by health professionals and researchers in health promotion program development. The MSM draws on positive deviance practices and processes, focus groups, community advisors, behavioral change theory, and evidence-based strategies. Use of the MSM is illustrated in development of ethnic-specific weight loss interventions for low-income postpartum women. The MSM may be useful in designing future health programs designed for other special populations for whom existing interventions are unavailable or lack relevance. PMID:20433674
NASA Astrophysics Data System (ADS)
Yu, Le; Zhang, Dengrong; Holden, Eun-Jung
2008-07-01
Automatic registration of multi-source remote-sensing images is a difficult task as it must deal with the varying illuminations and resolutions of the images, different perspectives and the local deformations within the images. This paper proposes a fully automatic and fast non-rigid image registration technique that addresses those issues. The proposed technique performs a pre-registration process that coarsely aligns the input image to the reference image by automatically detecting their matching points by using the scale invariant feature transform (SIFT) method and an affine transformation model. Once the coarse registration is completed, it performs a fine-scale registration process based on a piecewise linear transformation technique using feature points that are detected by the Harris corner detector. The fine registration process first finds tie point pairs in succession between the input and the reference image by detecting Harris corners and applying a cross-matching strategy based on a wavelet pyramid for fast searching. Tie point pairs with large errors are pruned by an error-checking step. The input image is then rectified by using triangulated irregular networks (TINs) to deal with irregular local deformations caused by the fluctuation of the terrain. For each triangular facet of the TIN, affine transformations are estimated and applied for rectification. Experiments with Quickbird, SPOT5, SPOT4, TM remote-sensing images of the Hangzhou area in China demonstrate the efficiency and the accuracy of the proposed technique for multi-source remote-sensing image registration.
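A compact OpenCV sketch of the coarse pre-registration stage only (SIFT keypoints, ratio-test matching, robust affine model); the Harris/TIN fine-registration stage is not reproduced. The file names are placeholders, and SIFT availability assumes opencv-python 4.4 or newer.

```python
import cv2
import numpy as np

def coarse_register(input_path, reference_path):
    """Coarsely align an input image to a reference via SIFT matches and an affine fit."""
    img = cv2.imread(input_path, cv2.IMREAD_GRAYSCALE)
    ref = cv2.imread(reference_path, cv2.IMREAD_GRAYSCALE)

    sift = cv2.SIFT_create()
    k1, d1 = sift.detectAndCompute(img, None)
    k2, d2 = sift.detectAndCompute(ref, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(d1, d2, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]  # Lowe ratio test

    src = np.float32([k1[m.queryIdx].pt for m in good])
    dst = np.float32([k2[m.trainIdx].pt for m in good])
    A, _ = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)          # robust affine model
    h, w = ref.shape
    return cv2.warpAffine(img, A, (w, h))

# aligned = coarse_register("spot5_scene.tif", "quickbird_reference.tif")
```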
Multi-sources data fusion framework for remote triage prioritization in telehealth.
Salman, O H; Rasid, M F A; Saripan, M I; Subramaniam, S K
2014-09-01
The healthcare industry is streamlining processes to offer more timely and effective services to all patients. Computerized software algorithms and smart devices can streamline the relation between users and doctors by providing more services inside healthcare telemonitoring systems. This paper proposes a multi-source framework to support advanced healthcare applications. The proposed framework, named Multi Sources Healthcare Architecture (MSHA), considers multiple sources: sensors (ECG, SpO2 and blood pressure) and text-based inputs from wireless and pervasive devices of a Wireless Body Area Network. The framework is used to improve healthcare scalability and efficiency by enhancing the remote triaging and remote prioritization processes for patients, and to provide intelligent services over telemonitoring healthcare systems by using a data fusion method and a prioritization technique. As the telemonitoring system consists of three tiers (sensors/sources, base station and server), the simulation of the MSHA algorithm in the base station is demonstrated in this paper. Achieving a high level of accuracy in remotely prioritizing and triaging patients is our main goal. Meanwhile, the role of multi-source data fusion in telemonitoring healthcare services systems is demonstrated, and we discuss how the proposed framework can be applied in a healthcare telemonitoring scenario. Simulation results for different symptoms, relating to different emergency levels of chronic heart disease, demonstrate the superiority of our algorithm compared with conventional algorithms in classifying and prioritizing patients remotely.
Analysis of flood inundation in ungauged basins based on multi-source remote sensing data.
Gao, Wei; Shen, Qiu; Zhou, Yuehua; Li, Xin
2018-02-09
Floods are among the most expensive natural hazards experienced in many places of the world and can result in heavy losses of life and economic damages. The objective of this study is to analyze flood inundation in ungauged basins by performing near-real-time detection of flood extent and depth based on multi-source remote sensing data. Via spatial distribution analysis of flood extent and depth in a time series, the inundation condition and the characteristics of the flood disaster can be reflected. The results show that multi-source remote sensing data can make up for the lack of hydrological data in ungauged basins, which is helpful for reconstructing the hydrological sequence; the combination of MODIS (moderate-resolution imaging spectroradiometer) surface reflectance products and the DFO (Dartmouth Flood Observatory) flood database can achieve macro-dynamic monitoring of flood inundation in ungauged basins, and the differencing of high-resolution optical and microwave images before and after floods can then be used to calculate flood extent and reflect spatial changes of inundation; the monitoring algorithm for flood depth, combining RS and GIS, is simple and can quickly calculate the depth from a known flood extent obtained from remote sensing images in ungauged basins. Relevant results can provide effective help for the disaster relief work performed by government departments.
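An illustrative NumPy sketch of the differencing and depth steps (a simplification under assumed inputs, not the authors' processing chain): water masks derived before and after the event give the newly inundated extent, and subtracting ground elevation from an assumed water-surface elevation gives a crude depth map within that extent.

```python
import numpy as np

def flood_extent(pre_water_mask, post_water_mask):
    """Pixels newly covered by water after the event (boolean arrays)."""
    return post_water_mask & ~pre_water_mask

def flood_depth(extent, dem, water_surface_elevation):
    """Crude depth: water surface minus ground elevation inside the flooded extent."""
    depth = np.where(extent, water_surface_elevation - dem, 0.0)
    return np.clip(depth, 0.0, None)

# Toy 3x3 scene: DEM in metres, water masks from pre-/post-event imagery
dem = np.array([[12.0, 11.5, 11.0],
                [11.2, 10.8, 10.5],
                [10.9, 10.4, 10.1]])
pre = np.array([[0, 0, 0], [0, 0, 1], [0, 1, 1]], dtype=bool)
post = np.array([[0, 0, 1], [0, 1, 1], [1, 1, 1]], dtype=bool)
extent = flood_extent(pre, post)
print(flood_depth(extent, dem, water_surface_elevation=11.3))
```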
A multi-source data assimilation framework for flood forecasting: Accounting for runoff routing lags
NASA Astrophysics Data System (ADS)
Meng, S.; Xie, X.
2015-12-01
In flood forecasting practice, model performance is usually degraded by various sources of uncertainty, including uncertainties from input data, model parameters, model structures and output observations. Data assimilation is a useful methodology to reduce uncertainties in flood forecasting. For short-term flood forecasting, an accurate estimation of the initial soil moisture condition will improve forecasting performance, and the time delay of runoff routing is another important effect on forecasting performance. Moreover, observation data of hydrological variables (including ground observations and satellite observations) are becoming easily available, so the reliability of short-term flood forecasting could be improved by assimilating multi-source data. The objective of this study is to develop a multi-source data assimilation framework for real-time flood forecasting. In this framework, the first step is assimilating upper-layer soil moisture observations to update the model state and generated runoff based on the ensemble Kalman filter (EnKF) method, and the second step is assimilating discharge observations to update the model state and runoff within a fixed time window based on the ensemble Kalman smoother (EnKS) method. This smoothing technique is adopted to account for the runoff routing lag. Using such an assimilation framework for the soil moisture and discharge observations is expected to improve flood forecasting. To assess the effectiveness of this dual-step assimilation framework, we designed a dual-EnKF algorithm in which the observed soil moisture and discharge are assimilated separately without accounting for the runoff routing lag. The results show that the multi-source data assimilation framework can effectively improve flood forecasting, especially when the runoff routing has a distinct time lag. Thus, this new data assimilation framework holds great potential in operational flood forecasting by merging observations from ground measurements and remote sensing retrievals.
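A minimal ensemble Kalman filter analysis step for the soil-moisture half of such a framework (a generic stochastic EnKF sketch with perturbed observations, under assumed dimensions; the EnKS discharge step and the routing-lag window are not shown).

```python
import numpy as np

def enkf_update(ensemble, obs, obs_operator_h, obs_error_std, rng):
    """Update a state ensemble (n_state x n_members) with perturbed observations."""
    n_obs = len(obs)
    n_members = ensemble.shape[1]
    # Perturb the observation for each member (stochastic EnKF)
    obs_pert = obs[:, None] + rng.normal(0.0, obs_error_std, size=(n_obs, n_members))

    x_mean = ensemble.mean(axis=1, keepdims=True)
    hx = obs_operator_h @ ensemble
    hx_mean = hx.mean(axis=1, keepdims=True)

    # Sample cross- and observation-space covariances
    pxy = (ensemble - x_mean) @ (hx - hx_mean).T / (n_members - 1)
    pyy = (hx - hx_mean) @ (hx - hx_mean).T / (n_members - 1) + obs_error_std**2 * np.eye(n_obs)

    gain = pxy @ np.linalg.inv(pyy)                 # Kalman gain
    return ensemble + gain @ (obs_pert - hx)        # analysis ensemble

# Toy example: 3 soil-moisture layers, only the upper layer is observed, 50 members
rng = np.random.default_rng(1)
ens = 0.25 + 0.05 * rng.standard_normal((3, 50))
H = np.array([[1.0, 0.0, 0.0]])
analysis = enkf_update(ens, obs=np.array([0.32]), obs_operator_h=H,
                       obs_error_std=0.02, rng=rng)
print(analysis.mean(axis=1))
```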
Multisource inverse-geometry CT. Part I. System concept and development
De Man, Bruno; Uribe, Jorge; Baek, Jongduk; Harrison, Dan; Yin, Zhye; Longtin, Randy; Roy, Jaydeep; Waters, Bill; Wilson, Colin; Short, Jonathan; Inzinna, Lou; Reynolds, Joseph; Neculaes, V. Bogdan; Frutschy, Kristopher; Senzig, Bob; Pelc, Norbert
2016-01-01
Purpose: This paper presents an overview of multisource inverse-geometry computed tomography (IGCT) as well as the development of a gantry-based research prototype system. The development of the distributed x-ray source is covered in a companion paper [V. B. Neculaes et al., “Multisource inverse-geometry CT. Part II. X-ray source design and prototype,” Med. Phys. 43, 4617–4627 (2016)]. While progress updates of this development have been presented at conferences and in journal papers, this paper is the first comprehensive overview of the multisource inverse-geometry CT concept and prototype. The authors also provide a review of all previous IGCT related publications. Methods: The authors designed and implemented a gantry-based 32-source IGCT scanner with 22 cm field-of-view, 16 cm z-coverage, 1 s rotation time, 1.09 × 1.024 mm detector cell size, as low as 0.4 × 0.8 mm focal spot size and 80–140 kVp x-ray source voltage. The system is built using commercially available CT components and a custom made distributed x-ray source. The authors developed dedicated controls, calibrations, and reconstruction algorithms and evaluated the system performance using phantoms and small animals. Results: The authors performed IGCT system experiments and demonstrated tube current up to 125 mA with up to 32 focal spots. The authors measured a spatial resolution of 13 lp/cm at 5% cutoff. The scatter-to-primary ratio is estimated 62% for a 32 cm water phantom at 140 kVp. The authors scanned several phantoms and small animals. The initial images have relatively high noise due to the low x-ray flux levels but minimal artifacts. Conclusions: IGCT has unique benefits in terms of dose-efficiency and cone-beam artifacts, but comes with challenges in terms of scattered radiation and x-ray flux limits. To the authors’ knowledge, their prototype is the first gantry-based IGCT scanner. The authors summarized the design and implementation of the scanner and the authors presented results with phantoms and small animals. PMID:27487877
Wu, Peng; Huang, Yiyin; Kang, Longtian; Wu, Maoxiang; Wang, Yaobing
2015-01-01
A series of palladium-based catalysts with metal alloying (Sn, Pb) and/or (N-doped) graphene supports showing regularly enhanced electrocatalytic activity were investigated. The peak current density (118.05 mA cm⁻²) of PdSn/NG is higher than the summed current density (45.63 + 47.59 mA cm⁻²) of Pd/NG and PdSn/G, revealing a synergistic electrocatalytic oxidation effect in the PdSn/N-doped graphene nanocomposite. Extended experiments show that this multisource synergistic catalytic effect of metal alloying and the N-doped graphene support in one catalyst is universal for small organic molecule (methanol, ethanol and ethylene glycol) oxidation over PdM (M = Sn, Pb)/NG catalysts. Further, the high dispersion of small nanoparticles and the altered electronic structure and Pd(0)/Pd(II) ratio of Pd in the catalysts, induced by the strong coupling between metal alloying and N-doped graphene, are responsible for the multisource synergistic catalytic effect in PdM (M = Sn, Pb)/NG catalysts. Finally, the catalytic durability and stability are also greatly improved. PMID:26434949
NASA Astrophysics Data System (ADS)
Fan, Hong; Li, Huan
2015-12-01
Location-related data are playing an increasingly irreplaceable role in business, government and scientific research. At the same time, the amount and types of data are rapidly increasing. It is a challenge to quickly find required information in this rapidly growing volume of data, as well as to efficiently provide different levels of geospatial data to users. This paper puts forward a data-oriented access model for geographic information science data. First, we analyze the features of GIS data, including traditional types such as vector and raster data and new types such as Volunteered Geographic Information (VGI). Taking these analyses into account, a classification scheme for geographic data is proposed and TRAFIE is introduced to describe the establishment of a multi-level model for geographic data. Based on this model, a multi-level, scalable access system for geospatial information is put forward. Users can select different levels of data according to their concrete application needs. Pull-based and push-based data access mechanisms based on this model are presented, and a Service Oriented Architecture (SOA) was chosen for the data processing. The model is demonstrated through a simulation of fire disaster data collection supporting the decision-making processes of government departments. The use case shows that the data model and the data provision system are flexible and adaptable.
Castro, Eduardo; Martínez-Ramón, Manel; Pearlson, Godfrey; Sui, Jing; Calhoun, Vince D.
2011-01-01
Pattern classification of brain imaging data can enable the automatic detection of differences in cognitive processes of specific groups of interest. Furthermore, it can also give neuroanatomical information related to the regions of the brain that are most relevant to detect these differences by means of feature selection procedures, which are also well-suited to deal with the high dimensionality of brain imaging data. This work proposes the application of recursive feature elimination using a machine learning algorithm based on composite kernels to the classification of healthy controls and patients with schizophrenia. This framework, which evaluates nonlinear relationships between voxels, analyzes whole-brain fMRI data from an auditory task experiment that is segmented into anatomical regions and recursively eliminates the uninformative ones based on their relevance estimates, thus yielding the set of most discriminative brain areas for group classification. The collected data was processed using two analysis methods: the general linear model (GLM) and independent component analysis (ICA). GLM spatial maps as well as ICA temporal lobe and default mode component maps were then input to the classifier. A mean classification accuracy of up to 95% estimated with a leave-two-out cross-validation procedure was achieved by doing multi-source data classification. In addition, it is shown that the classification accuracy rate obtained by using multi-source data surpasses that reached by using single-source data, hence showing that this algorithm takes advantage of the complementary nature of GLM and ICA. PMID:21723948
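A simplified scikit-learn analogue of the region-elimination idea (linear-kernel RFE over region-averaged features rather than the composite-kernel formulation of the paper, with synthetic data standing in for the fMRI maps):

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import LeavePOut, cross_val_score

# Hypothetical data: 40 subjects x 90 anatomical-region features (e.g. region-mean GLM betas)
rng = np.random.default_rng(42)
X = rng.standard_normal((40, 90))
y = np.repeat([0, 1], 20)                    # controls vs patients
X[y == 1, :5] += 0.8                         # make a few regions informative

selector = RFE(SVC(kernel="linear", C=1.0), n_features_to_select=10, step=1)
selector.fit(X, y)
print("retained regions:", np.flatnonzero(selector.support_))

# Leave-two-out accuracy on the selected regions (780 splits for 40 subjects).
# Note: for a rigorous estimate, feature selection should be nested inside the CV loop.
clf = SVC(kernel="linear", C=1.0)
scores = cross_val_score(clf, X[:, selector.support_], y, cv=LeavePOut(2))
print("mean leave-two-out accuracy:", scores.mean())
```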
Multimethod-Multisource Approach for Assessing High-Technology Training Systems.
ERIC Educational Resources Information Center
Shlechter, Theodore M.; And Others
This investigation examined the value of using a multimethod-multisource approach to assess high-technology training systems. The research strategy was utilized to provide empirical information on the instructional effectiveness of the Reserve Component Virtual Training Program (RCVTP), which was developed to improve the training of Army National…
Understanding the Influence of Emotions and Reflection upon Multi-Source Feedback Acceptance and Use
ERIC Educational Resources Information Center
Sargeant, Joan; Mann, Karen; Sinclair, Douglas; Van der Vleuten, Cees; Metsemakers, Job
2008-01-01
Introduction: Receiving negative performance feedback can elicit negative emotional reactions which can interfere with feedback acceptance and use. This study investigated emotional responses of family physicians' participating in a multi-source feedback (MSF) program, sources of these emotions, and their influence upon feedback acceptance and…
Long-term monitoring on environmental disasters using multi-source remote sensing technique
NASA Astrophysics Data System (ADS)
Kuo, Y. C.; Chen, C. F.
2017-12-01
Environmental disasters are extreme events within the earth system that cause deaths and injuries to humans, as well as damage to and loss of valuable assets such as buildings, communication systems, farmland and forests. Disaster management requires a large amount of multi-temporal spatial data, and multi-source remote sensing data with different spatial, spectral and temporal resolutions are widely applied to environmental disaster monitoring. With multi-source, multi-temporal high-resolution images, we conduct rapid, systematic and continuous observations of economic damage and environmental disasters on earth, based on three monitoring platforms: remote sensing, UAS (Unmanned Aircraft Systems) and ground investigation. The advantages of UAS technology include great mobility, rapid real-time availability and more flexible operation under varying weather conditions. The system can produce long-term spatial distribution information on environmental disasters, obtaining high-resolution remote sensing data and field verification data in key monitoring areas. It also supports the prevention and control of ocean pollution, illegally disposed waste and pine pests at different scales. Meanwhile, digital photogrammetry can be applied, using the camera interior and exterior orientation parameters, to produce Digital Surface Model (DSM) data. The latest terrain information is simulated from the DSM data and can serve as a reference for future disaster recovery.
Lyness, Karen S; Judiesch, Michael K
2008-07-01
The present study was the first cross-national examination of whether managers who were perceived to be high in work-life balance were expected to be more or less likely to advance in their careers than were less balanced, more work-focused managers. Using self ratings, peer ratings, and supervisor ratings of 9,627 managers in 33 countries, the authors examined within-source and multisource relationships with multilevel analyses. The authors generally found that managers who were rated higher in work-life balance were rated higher in career advancement potential than were managers who were rated lower in work-life balance. However, national gender egalitarianism, measured with Project GLOBE scores, moderated relationships based on supervisor and self ratings, with stronger positive relationships in low egalitarian cultures. The authors also found 3-way interactions of work-life balance ratings, ratee gender, and gender egalitarianism in multisource analyses in which self balance ratings predicted supervisor and peer ratings of advancement potential. Work-life balance ratings were positively related to advancement potential ratings for women in high egalitarian cultures and men in low gender egalitarian cultures, but relationships were nonsignificant for men in high egalitarian cultures and women in low egalitarian cultures.
Harth, Yoram
2015-03-01
In the last decade, Radiofrequency (RF) energy has proven to be safe and highly efficacious for face and neck skin tightening, body contouring, and cellulite reduction. In contrast to first-generation Monopolar/Bipolar and "X-Polar" RF systems, which use one RF generator connected to one or more skin electrodes, multisource radiofrequency devices use six independent RF generators allowing efficient dermal heating to 52-55°C, with no pain or risk of other side effects. In this review, the basic science and clinical results of body contouring and cellulite treatment using a multisource radiofrequency system (Endymed PRO, Endymed, Cesarea, Israel) will be discussed and analyzed. © 2015 Wiley Periodicals, Inc.
Using the 360 degrees multisource feedback model to evaluate teaching and professionalism.
Berk, Ronald A
2009-12-01
Student ratings have dominated as the primary and, frequently, only measure of teaching performance at colleges and universities for the past 50 years. Recently, there has been a trend toward augmenting those ratings with other data sources to broaden and deepen the evidence base. The 360 degrees multisource feedback (MSF) model used in management and industry for half a century and in clinical medicine for the last decade seemed like a best fit to evaluate teaching performance and professionalism. To adapt the 360 degrees MSF model to the assessment of teaching performance and professionalism of medical school faculty. The salient characteristics of the MSF models in industry and medicine were extracted from the literature. These characteristics along with 14 sources of evidence from eight possible raters, including students, self, peers, outside experts, mentors, alumni, employers, and administrators, based on the research in higher education were adapted to formative and summative decisions. Three 360 degrees MSF models were generated for three different decisions: (1) formative decisions and feedback about teaching improvement; (2) summative decisions and feedback for merit pay and contract renewal; and (3) formative decisions and feedback about professional behaviors in the academic setting. The characteristics of each model were listed. Finally, a top-10 list of the most persistent and, perhaps, intractable psychometric issues in executing these models was suggested to guide future research. The 360 degrees MSF model appears to be a useful framework for implementing a multisource evaluation of faculty teaching performance and professionalism in medical schools. This model can provide more accurate, reliable, fair, and equitable decisions than the one based on just a single source.
Development of Physical Therapy Practical Assessment System by Using Multisource Feedback
ERIC Educational Resources Information Center
Hengsomboon, Ninwisan; Pasiphol, Shotiga; Sujiva, Siridej
2017-01-01
The purposes of the research were (1) to develop the physical therapy practical assessment system by using the multisource feedback (MSF) approach and (2) to investigate the effectiveness of the implementation of the developed physical therapy practical assessment system. The development of physical therapy practical assessment system by using MSF…
ERIC Educational Resources Information Center
Roberts, Martin J.; Campbell, John L.; Richards, Suzanne H.; Wright, Christine
2013-01-01
Introduction: Multisource feedback (MSF) ratings provided by patients and colleagues are often poorly correlated with doctors' self-assessments. Doctors' reactions to feedback depend on its agreement with their own perceptions, but factors influencing self-other agreement in doctors' MSF ratings have received little attention. We aimed to identify…
Multi-Source Evaluation of Interpersonal and Communication Skills of Family Medicine Residents
ERIC Educational Resources Information Center
Leung, Kai-Kuen; Wang, Wei-Dan; Chen, Yen-Yuan
2012-01-01
There is a lack of information on the use of multi-source evaluation to assess trainees' interpersonal and communication skills in Oriental settings. This study is conducted to assess the reliability and applicability of assessing the interpersonal and communication skills of family medicine residents by patients, peer residents, nurses, and…
NASA Astrophysics Data System (ADS)
Pan, X. G.; Wang, J. Q.; Zhou, H. Y.
2013-05-01
Because coupled systematic model errors and gross errors exist in the multi-source heterogeneous measurement data of combined space- and ground-based TT&C (Telemetry, Tracking and Command), a variance component estimation (VCE) method based on a semi-parametric estimator with a data-depth weighting matrix is proposed. The uncertain model error is estimated with the semi-parametric estimator, and outliers are suppressed with the data-depth weighting matrix. With the model error and outliers thus restrained, the VCE can be improved and used to estimate the weighting matrix for observation data affected by uncertain model errors or outliers. A simulation experiment was carried out for the combined space and ground TT&C scenario. The results show that the new VCE, based on model error compensation, can determine rational weights for the multi-source heterogeneous data and suppress outlier data.
The design and implementation of hydrographical information management system (HIMS)
NASA Astrophysics Data System (ADS)
Sui, Haigang; Hua, Li; Wang, Qi; Zhang, Anming
2005-10-01
With the development of hydrographical work and information techniques, a large variety of hydrographical information, including electronic charts, documents and other materials, is widely used, and the traditional management mode and techniques are unsuitable for the development of the Chinese Marine Safety Administration Bureau (CMSAB). How to manage all kinds of hydrographical information has become an important and urgent problem. A range of advanced techniques, including GIS, RS, spatial database management and VR, are introduced to solve these problems. The design principles and key techniques of the HIMS are illustrated in detail, including the mixed architecture based on B/S, C/S and stand-alone modes, multi-source and multi-scale data organization and management, multi-source data integration, diverse visualization of digital charts, and efficient security control strategies. Based on the above ideas and strategies, an integrated system named the Hydrographical Information Management System (HIMS) was developed. The HIMS has been applied in the Shanghai Marine Safety Administration Bureau and has received positive evaluations.
NASA Astrophysics Data System (ADS)
Luo, Qiu; Xin, Wu; Qiming, Xiong
2017-06-01
Existing approaches to extracting vegetation information from remote sensing data often neglect phenological features and suffer from low analysis performance. To address this problem, a method for remote sensing vegetation information extraction is proposed that combines EVI time series with a decision-tree classification based on multi-source branch similarity. Firstly, to improve the stability of recognition accuracy over the time series, the seasonal features of vegetation are extracted based on the fitted span of the time series. Secondly, decision-tree similarity is assessed through an adaptively selected path or the probability parameter of component prediction; this index evaluates the degree of task association, decides whether migration of the multi-source decision tree should be performed, and ensures the speed of migration. Finally, for Dalbergia hainanensis commercial forest, the classification and recognition accuracy for pests and diseases reaches 87%-98%, significantly better than the 80%-96% accuracy of MODIS coverage in this area, verifying the validity of the proposed method.
Application of Ontology Technology in Health Statistic Data Analysis.
Guo, Minjiang; Hu, Hongpu; Lei, Xingyun
2017-01-01
Research Purpose: to establish a health management ontology for the analysis of health statistics data. Proposed Methods: this paper established a health management ontology based on an analysis of the concepts in the China Health Statistics Yearbook, and used Protégé to define the syntactic and semantic structure of health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and object properties and data properties were defined to construct these classes. By ontology instantiation, multi-source heterogeneous data can be integrated, enabling administrators to gain an overall understanding and analysis of the health statistics data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source, heterogeneous health system management data and for improved management efficiency.
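As an illustration of the kind of class, object-property and data-property definitions plus instantiation the abstract describes, here is a minimal sketch using the owlready2 Python library; the class and property names (HealthInstitution, HealthIndicator, etc.) are hypothetical stand-ins, not the authors' actual ontology built in Protégé.

```python
# A minimal sketch of ontology construction and instantiation with owlready2;
# all class and property names below are hypothetical stand-ins.
from owlready2 import Thing, ObjectProperty, DataProperty, get_ontology

onto = get_ontology("http://example.org/health_management.owl")

with onto:
    class HealthInstitution(Thing): pass      # e.g. hospitals, clinics
    class HealthIndicator(Thing): pass        # e.g. bed counts, visit totals
    class reportsIndicator(ObjectProperty):   # object property linking the two classes
        domain = [HealthInstitution]
        range = [HealthIndicator]
    class indicatorValue(DataProperty):       # data property holding the statistic
        domain = [HealthIndicator]
        range = [float]

# Ontology instantiation: integrate one record from a (hypothetical) yearbook table.
hospital = HealthInstitution("ProvincialHospital_01")
beds = HealthIndicator("HospitalBeds_2016")
beds.indicatorValue = [1250.0]
hospital.reportsIndicator = [beds]
onto.save(file="health_management.owl")
```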
Multi-source and ontology-based retrieval engine for maize mutant phenotypes
USDA-ARS?s Scientific Manuscript database
In the midst of this genomics era, major plant genome databases are collecting massive amounts of heterogeneous information, including sequence data, gene product information, images of mutant phenotypes, etc., as well as textual descriptions of many of these entities. While basic browsing and sear...
Foundational Technologies for Activity-Based Intelligence - A Review of the Literature
2014-02-01
academic community. The Center for Multisource Information Fusion (CMIF) at the University at Buffalo, Harvard University, and the University of...depth of researchers conducting high-value Multi-INT research; these efforts are delivering high-value research outcomes, e.g., [46-47]. CMIF
NASA Astrophysics Data System (ADS)
Li, J.; Wen, G.; Li, D.
2018-04-01
To improve its capacity for detailed grassland management by establishing background information on grassland resource utilization and ecological conditions, the Yunnan provincial agriculture department carried out a grassland resource investigation in 2017. The traditional investigation method is ground-based survey, which is time-consuming and inefficient and is especially unsuitable for large-scale or hard-to-reach areas. Remote sensing, by contrast, is low cost, wide in coverage and efficient, and can objectively reflect the present state of grassland resources; it has become an indispensable monitoring technology and data source and has gained increasing recognition and application in grassland resource monitoring research. This paper studies the application of multi-source remote sensing imagery to the grassland resource investigation of Yunnan province. First, grassland thematic information is extracted and field investigation conducted through segmentation of BJ-2 high-spatial-resolution images. Second, grassland types are classified and the degree of grassland degradation evaluated using the high-resolution characteristics of Landsat 8 imagery. Third, a grass yield model and quality classification are obtained using the high-resolution, wide-swath characteristics of MODIS images together with sample survey data. Finally, qualitative field analysis of grassland is performed with UAV remote sensing images. The project implementation shows that multi-source remote sensing data can be applied to the grassland resource investigation of Yunnan province and is an indispensable method.
Modeling multi-source flooding disaster and developing simulation framework in Delta
NASA Astrophysics Data System (ADS)
Liu, Y.; Cui, X.; Zhang, W.
2016-12-01
Most delta regions of the world are densely populated and economically advanced. However, because of multi-source flooding (upstream floods, rainstorm waterlogging and storm surge floods), delta regions are highly vulnerable, and multi-source flooding disasters in these areas have attracted considerable academic attention. The Pearl River Delta urban agglomeration in south China is selected as the research area. Based on analysis of natural and environmental data of the delta urban agglomeration (remote sensing data, land use data, topographic maps, etc.) and hydrological monitoring data, and on research into the uneven distribution and dynamics of regional rainfall, the relationship between the underlying surface and runoff parameters, and the effect of flood storage patterns, we use an automatic or semi-automatic method to divide spatial units that reflect the runoff characteristics of the urban agglomeration, and develop a Multi-model Ensemble System for a changing environment. The system includes an urban hydrologic model, a parallel 1D and 2D hydrodynamic model, a storm surge forecast model and other specialized models, and provides capabilities such as real-time setting of a variety of boundary conditions, fast real-time calculation, dynamic presentation of results and powerful statistical analysis. The models can be optimized and improved through a variety of verification methods. This work was supported by the National Natural Science Foundation of China (41471427); Special Basic Research Key Fund for Central Public Scientific Research Institutes.
Ni, Li-Jun; Luan, Shao-Rong; Zhang, Li-Guo
2016-10-01
Because of the numerous varieties of herbal species and active ingredients in traditional Chinese medicine (TCM), traditional methods can hardly satisfy current determination requirements for TCM. The present work proposes an approach to rapid determination of TCM quality based on near infrared (NIR) spectroscopy and an internet sharing mode. A low-cost, portable multi-source composite spectrometer was developed by our group for in-site fast measurement of the spectra of TCM samples. A database can be set up by sharing spectra and quality detection data of TCM samples among TCM enterprises on an internet platform. A novel method, keeping the same relationship between X and Y space based on K nearest neighbors (KNN-KSR for short), was applied to predict the contents of effective compounds in the samples, and a comparative study between KNN-KSR and partial least squares (PLS) was conducted. Two datasets were used to validate the idea: one consisted of 58 Ginkgo Folium samples measured with four near-infrared spectrometers and two multi-source composite spectrometers; the other consisted of 80 corn samples available online, measured with three NIR instruments. The results show that the KNN-KSR method can obtain more reliable outcomes without spectral correction, whereas transferring the PLS models to other instruments can hardly achieve good predictive results until spectral calibration is performed. Meanwhile, similar analysis results for the total flavonoids and total lactones of the Ginkgo Folium samples are achieved on the multi-source composite spectrometers and the near-infrared spectrometers, and the prediction results of KNN-KSR are better than those of PLS. The idea proposed in the present study needs more sample spectra and should be verified by further case studies. Copyright © by the Chinese Pharmaceutical Association.
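As a rough illustration of nearest-neighbour prediction of constituent content from NIR spectra, the sketch below uses a plain scikit-learn KNN regressor on synthetic data; it is not the paper's KNN-KSR method, which additionally preserves the X-Y neighbourhood relationship across instruments.

```python
# A minimal sketch of predicting constituent content from NIR spectra with a
# plain KNN regressor on synthetic data (not the paper's KNN-KSR method).
import numpy as np
from sklearn.neighbors import KNeighborsRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
spectra = rng.normal(size=(80, 700))     # 80 samples x 700 wavelengths (synthetic)
content = spectra[:, :50].mean(axis=1) + rng.normal(scale=0.05, size=80)  # e.g. total flavonoids

X_train, X_test, y_train, y_test = train_test_split(
    spectra, content, test_size=0.25, random_state=0)
model = KNeighborsRegressor(n_neighbors=5, weights="distance").fit(X_train, y_train)
print("predicted content for three held-out samples:", model.predict(X_test[:3]))
```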
The Finnish multisource national forest inventory: small-area estimation and map production
Erkki Tomppo
2009-01-01
A driving force motivating development of the multisource national forest inventory (MS-NFI) in connection with the Finnish national forest inventory (NFI) was the desire to obtain forest resource information for smaller areas than is possible using field data only without significantly increasing the cost of the inventory. A basic requirement for the method was that...
On Meaningful Measurement: Concepts, Technology and Examples.
ERIC Educational Resources Information Center
Cheung, K. C.
This paper discusses how concepts and procedural skills in problem-solving tasks, as well as affects and emotions, can be subjected to meaningful measurement (MM), based on a multisource model of learning and a constructivist information-processing theory of knowing. MM refers to the quantitative measurement of conceptual and procedural knowledge…
Estimating error cross-correlations in soil moisture data sets using extended collocation analysis
USDA-ARS?s Scientific Manuscript database
Consistent global soil moisture records are essential for studying the role of hydrologic processes within the larger earth system. Various studies have shown the benefit of assimilating satellite-based soil moisture data into water balance models or merging multi-source soil moisture retrievals int...
Unified Research on Network-Based Hard/Soft Information Fusion
2016-02-02
types). There are a number of search tree run parameters which must be set depending on the experimental setting. A pilot study was run to identify...Unlimited Final Report: Unified Research on Network-Based Hard/Soft Information Fusion The views, opinions and/or findings contained in this report...Final Report: Unified Research on Network-Based Hard/Soft Information Fusion Report Title The University at Buffalo (UB) Center for Multisource
Multisource energy system project
NASA Astrophysics Data System (ADS)
Dawson, R. W.; Cowan, R. A.
1987-03-01
The mission of this project is to investigate methods of providing uninterruptible power to Army communications and navigational facilities, many of which have limited access or are located in rugged terrain. Two alternatives are currently available for deploying terrestrial stand-alone power systems: (1) conventional electric systems powered by diesel fuel, propane, or natural gas, and (2) alternative power systems using renewable energy sources such as solar photovoltaics (PV) or wind turbines (WT). The increased cost of fuels for conventional systems and the high cost of energy storage for single-source renewable energy systems have created interest in the hybrid or multisource energy system. This report will provide a summary of the first and second interim reports, final test results, and a user's guide for software that will assist in applying and designing multi-source energy systems.
A multi-source dataset of urban life in the city of Milan and the Province of Trentino
Barlacchi, Gianni; De Nadai, Marco; Larcher, Roberto; Casella, Antonio; Chitic, Cristiana; Torrisi, Giovanni; Antonelli, Fabrizio; Vespignani, Alessandro; Pentland, Alex; Lepri, Bruno
2015-01-01
The study of socio-technical systems has been revolutionized by the unprecedented amount of digital records that are constantly being produced by human activities such as accessing Internet services, using mobile devices, and consuming energy and knowledge. In this paper, we describe the richest open multi-source dataset ever released on two geographical areas. The dataset is composed of telecommunications, weather, news, social networks and electricity data from the city of Milan and the Province of Trentino. The unique multi-source composition of the dataset makes it an ideal testbed for methodologies and approaches aimed at tackling a wide range of problems including energy consumption, mobility planning, tourist and migrant flows, urban structures and interactions, event detection, urban well-being and many others. PMID:26528394
Fan, Yuanjie; Yin, Yuehong
2013-12-01
Although exoskeletons have received enormous attention and have been widely used in gait training and walking assistance in recent years, few reports addressed their application during early poststroke rehabilitation. This paper presents a healthcare technology for active and progressive early rehabilitation using multisource information fusion from surface electromyography and force-position extended physiological proprioception. The active-compliance control based on interaction force between patient and exoskeleton is applied to accelerate the recovery of the neuromuscular function, whereby progressive treatment through timely evaluation contributes to an effective and appropriate physical rehabilitation. Moreover, a clinic-oriented rehabilitation system, wherein a lower extremity exoskeleton with active compliance is mounted on a standing bed, is designed to ensure comfortable and secure rehabilitation according to the structure and control requirements. Preliminary experiments and clinical trial demonstrate valuable information on the feasibility, safety, and effectiveness of the progressive exoskeleton-assisted training.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yonggang
In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.
Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan Miguel
2013-01-01
Research biobanks are often composed of data from multiple sources. In some cases, these different subsets of data may present dissimilarities among their probability density functions (PDF) due to spatial shifts. This may lead to wrong hypotheses when the data are treated as a whole, and the overall quality of the data is diminished. With the purpose of developing a generic and comparable metric to assess the stability of multi-source datasets, we have studied the applicability and behaviour of several PDF distances under shifts in different conditions (uni- and multivariate data, different variable types, and multi-modality) that may appear in real biomedical data. Among the studied distances, we found information-theoretic distances and the Earth Mover's Distance to be the most practical for most conditions. We discuss the properties and usefulness of each distance according to the possible requirements of a general stability metric.
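A minimal sketch of comparing two data sources through the kinds of PDF distances the abstract mentions (an information-theoretic distance and the Earth Mover's Distance) is given below, using SciPy on synthetic data; the variables and binning are illustrative assumptions, not the authors' exact stability metric.

```python
# A minimal sketch of PDF-distance comparison between two data sources;
# the variables and binning are illustrative assumptions.
import numpy as np
from scipy.spatial.distance import jensenshannon
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(2)
source_a = rng.normal(loc=0.0, scale=1.0, size=5000)   # a variable from source A
source_b = rng.normal(loc=0.4, scale=1.2, size=5000)   # same variable, shifted, from source B

# Information-theoretic distance on histogram estimates of the two PDFs.
bins = np.linspace(-5, 5, 51)
p, _ = np.histogram(source_a, bins=bins, density=True)
q, _ = np.histogram(source_b, bins=bins, density=True)
print("Jensen-Shannon distance:", jensenshannon(p, q))

# The Earth Mover's Distance works directly on the samples.
print("Earth Mover's Distance:", wasserstein_distance(source_a, source_b))
```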
Using multilevel, multisource needs assessment data for planning community interventions.
Levy, Susan R; Anderson, Emily E; Issel, L Michele; Willis, Marilyn A; Dancy, Barbara L; Jacobson, Kristin M; Fleming, Shirley G; Copper, Elizabeth S; Berrios, Nerida M; Sciammarella, Esther; Ochoa, Mónica; Hebert-Beirne, Jennifer
2004-01-01
African Americans and Latinos share higher rates of cardiovascular disease (CVD) and diabetes compared with Whites. These diseases have common risk factors that are amenable to primary and secondary prevention. The goal of the Chicago REACH 2010-Lawndale Health Promotion Project is to eliminate disparities related to CVD and diabetes experienced by African Americans and Latinos in two contiguous Chicago neighborhoods using a community-based prevention approach. This article shares findings from the Phase 1 participatory planning process and discusses the implications these findings and lessons learned may have for programs aiming to reduce health disparities in multiethnic communities. The triangulation of data sources from the planning phase enriched interpretation and led to more creative and feasible suggestions for programmatic interventions across the four levels of the ecological framework. Multisource data yielded useful information for program planning and a better understanding of the cultural differences and similarities between African Americans and Latinos.
NASA Astrophysics Data System (ADS)
Yongzhi, WANG; hui, WANG; Lixia, LIAO; Dongsen, LI
2017-02-01
To analyse the geological characteristics of salt rock and the stability of salt caverns, rough three-dimensional (3D) models of the salt rock strata and 3D models of the salt caverns in the study areas are built with 3D GIS spatial modeling techniques. The implementation uses multi-source data such as basic geographic data, DEMs, geological plan and section maps, engineering geological data, and sonar data. In this study, 3D spatial analysis and calculation methods, including 3D GIS intersection detection, Boolean operations between 3D entities, and 3D grid discretization, are used to build models of the wall rock of the salt caverns. These methods provide effective calculation models for numerical simulation and analysis of the creep behaviour of the wall rock in salt caverns.
A multi-source probabilistic hazard assessment of tephra dispersal in the Neapolitan area
NASA Astrophysics Data System (ADS)
Sandri, Laura; Costa, Antonio; Selva, Jacopo; Folch, Arnau; Macedonio, Giovanni; Tonini, Roberto
2015-04-01
In this study we present the results obtained from a long-term Probabilistic Hazard Assessment (PHA) of tephra dispersal in the Neapolitan area. Usual PHA for tephra dispersal needs the definition of eruptive scenarios (usually by grouping eruption sizes and possible vent positions in a limited number of classes) with associated probabilities, a meteorological dataset covering a representative time period, and a tephra dispersal model. PHA then results from combining simulations considering different volcanological and meteorological conditions through weights associated to their specific probability of occurrence. However, volcanological parameters (i.e., erupted mass, eruption column height, eruption duration, bulk granulometry, fraction of aggregates) typically encompass a wide range of values. Because of such natural variability, single representative scenarios or size classes cannot be adequately defined using single values for the volcanological inputs. In the present study, we use a method that accounts for this within-size-class variability in the framework of Event Trees. The variability of each parameter is modeled with specific Probability Density Functions, and meteorological and volcanological input values are chosen by using a stratified sampling method. This procedure allows for quantifying hazard without relying on the definition of scenarios, thus avoiding potential biases introduced by selecting single representative scenarios. Embedding this procedure into the Bayesian Event Tree scheme enables the tephra fall PHA and its epistemic uncertainties. We have applied this scheme to analyze long-term tephra fall PHA from Vesuvius and Campi Flegrei, in a multi-source paradigm. We integrate two tephra dispersal models (the analytical HAZMAP and the numerical FALL3D) into BET_VH. The ECMWF reanalysis dataset is used for exploring different meteorological conditions. The results obtained show that PHA accounting for the whole natural variability is consistent with previous probability maps elaborated for Vesuvius and Campi Flegrei on the basis of single representative scenarios, but shows significant differences. In particular, the area characterized by a 300 kg/m2-load exceedance probability larger than 5%, accounting for the whole range of variability (that is, from small violent strombolian to plinian eruptions), is similar to that displayed in the maps based on the medium magnitude reference eruption, but it is of a smaller extent. This is due to the relatively higher weight of the small magnitude eruptions considered in this study, but neglected in the reference scenario maps. On the other hand, in our new maps the area characterized by a 300 kg/m2-load exceedance probability larger than 1% is much larger than that of the medium magnitude reference eruption, due to the contribution of plinian eruptions at lower probabilities, again neglected in the reference scenario maps.
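The stratified sampling of volcanological inputs from probability density functions can be sketched as below; the distributions, parameter ranges and pairing with wind profiles are illustrative assumptions, not the values used for Vesuvius or Campi Flegrei.

```python
# A minimal sketch of stratified (Latin-hypercube-like) sampling of
# volcanological inputs; distributions and ranges are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(3)
n_runs = 100                               # number of tephra-dispersal simulations

def stratified_uniform(n, rng):
    """One draw per equal-probability stratum of [0, 1), returned in random order."""
    return rng.permutation((np.arange(n) + rng.random(n)) / n)

mass_kg = 10.0 ** (10.0 + 2.0 * stratified_uniform(n_runs, rng))  # log-uniform 1e10-1e12 kg (assumed)
column_km = 5.0 + 20.0 * stratified_uniform(n_runs, rng)          # uniform 5-25 km (assumed)
wind_index = rng.integers(0, 3650, size=n_runs)                   # one reanalysis profile per run (assumed)

for m, h, w in zip(mass_kg[:3], column_km[:3], wind_index[:3]):
    print(f"run: mass={m:.2e} kg, column height={h:.1f} km, wind profile #{w}")
```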
NASA Astrophysics Data System (ADS)
Camporese, M.; Botto, A.
2017-12-01
Data assimilation is becoming increasingly popular in hydrological and earth system modeling, as it allows for direct integration of multisource observation data in modeling predictions and uncertainty reduction. For this reason, data assimilation has been recently the focus of much attention also for integrated surface-subsurface hydrological models, whereby multiple terrestrial compartments (e.g., snow cover, surface water, groundwater) are solved simultaneously, in an attempt to tackle environmental problems in a holistic approach. Recent examples include the joint assimilation of water table, soil moisture, and river discharge measurements in catchment models of coupled surface-subsurface flow using the ensemble Kalman filter (EnKF). Although the EnKF has been specifically developed to deal with nonlinear models, integrated hydrological models based on the Richards equation still represent a challenge, due to strong nonlinearities that may significantly affect the filter performance. Thus, more studies are needed to investigate the capabilities of EnKF to correct the system state and identify parameters in cases where the unsaturated zone dynamics are dominant. Here, the model CATHY (CATchment HYdrology) is applied to reproduce the hydrological dynamics observed in an experimental hillslope, equipped with tensiometers, water content reflectometer probes, and tipping bucket flow gages to monitor the hillslope response to a series of artificial rainfall events. We assimilate pressure head, soil moisture, and subsurface outflow with EnKF in a number of assimilation scenarios and discuss the challenges, issues, and tradeoffs arising from the assimilation of multisource data in a real-world test case, with particular focus on the capability of DA to update the subsurface parameters.
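For reference, a minimal sketch of the ensemble Kalman filter analysis step used to merge multisource observations (e.g. pressure head, soil moisture, subsurface outflow) into a state ensemble is given below; the state size, observation operator and error levels are illustrative, not those of the CATHY experiments.

```python
# A minimal sketch of a perturbed-observation EnKF analysis step merging three
# observation types into a model-state ensemble; sizes and errors are illustrative.
import numpy as np

rng = np.random.default_rng(4)
n_state, n_obs, n_ens = 200, 3, 32

X = rng.normal(size=(n_state, n_ens))      # forecast ensemble of model states
H = np.zeros((n_obs, n_state))             # observation operator: pick three state nodes
H[0, 10] = H[1, 50] = H[2, 150] = 1.0
R = np.diag([0.05, 0.02, 0.10]) ** 2       # observation-error covariance
y = np.array([0.30, 0.25, 1.20])           # e.g. pressure head, soil moisture, outflow

A = X - X.mean(axis=1, keepdims=True)                          # ensemble anomalies
HA = H @ A
K = (A @ HA.T) @ np.linalg.inv(HA @ HA.T + (n_ens - 1) * R)    # Kalman gain

# Update every ensemble member against perturbed observations.
Y_pert = y[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, size=n_ens).T
X_analysis = X + K @ (Y_pert - H @ X)
print("analysis ensemble mean at node 10:", X_analysis[10].mean())
```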
NASA Astrophysics Data System (ADS)
Zhao, Junsan; Chen, Guoping; Yuan, Lei
2017-04-01
New technologies such as 3D laser scanning, InSAR, GNSS, unmanned aerial vehicles and the Internet of Things provide far richer data resources for surveying and monitoring, and thus for the development of Early Warning Systems (EWS). This paper presents the design and implementation of a geological disaster monitoring and early warning system (GDMEWS) covering landslides and debris flows, based on multi-source data collected with the technologies mentioned above. The complex and changeable characteristics of the GDMEWS are described, and the architecture of the system, the composition of the multi-source database, the development mode and service logic, and the methods and key technologies of system development are analyzed. To elaborate the implementation process of the GDMEWS, Deqin Tibetan County, with its unique terrain and diverse types of typical landslides and debris flows, is selected as a case study area. Firstly, the functional requirements and the monitoring and forecasting models of the system are discussed. Secondly, the logical relationships across the whole disaster process, including pre-disaster preparation, disaster rescue and post-disaster reconstruction, are studied, and support tools for disaster prevention, disaster reduction and geological disaster management are developed. Thirdly, the methods for integrating multi-source monitoring data and for generating and simulating the mechanism models of geological hazards are described. Finally, the construction of the GDMEWS is presented; the system will be applied to the real-time, dynamic management, monitoring and forecasting of the whole disaster process in Deqin Tibetan County. Keywords: multi-source spatial data; geological disaster; monitoring and warning system; Deqin Tibetan County
L1-norm locally linear representation regularization multi-source adaptation learning.
Tao, Jianwen; Wen, Shiting; Hu, Wenjun
2015-09-01
In most supervised domain adaptation learning (DAL) tasks, one has access to only a small number of labeled examples from the target domain. The success of supervised DAL in this "small sample" regime therefore requires effective use of the large amounts of unlabeled data to extract information that is useful for generalization. Toward this end, we use the geometric intuition of the manifold assumption to extend the frameworks established in existing model-based DAL methods for function learning by incorporating additional information about the target geometric structure of the marginal distribution. We would like the solution to be smooth with respect to both the ambient space and the target marginal distribution. To this end, we propose a novel L1-norm locally linear representation regularization multi-source adaptation learning framework that exploits the geometry of the probability distribution and comprises two techniques. First, an L1-norm locally linear representation method is presented for robust graph construction by replacing the L2-norm reconstruction measure in LLE with an L1-norm one, termed L1-LLR for short. Second, for robust graph regularization, we replace the traditional graph Laplacian regularization with our new L1-LLR graph Laplacian regularization and thereby construct a new graph-based semi-supervised learning framework with a multi-source adaptation constraint, coined the L1-MSAL method. Moreover, to deal with nonlinear learning problems, we also generalize the L1-MSAL method by mapping the input data points from the input space to a high-dimensional reproducing kernel Hilbert space (RKHS) via a nonlinear mapping. Promising experimental results have been obtained on several real-world datasets such as face, visual video and object data. Copyright © 2015 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Han, P.; Long, D.
2017-12-01
Snow water equivalent (SWE) and total water storage (TWS) changes are important hydrological state variables over cryospheric regions, such as China's Upper Yangtze River (UYR) basin. Accurate simulation of these two state variables plays a critical role in understanding hydrological processes over this region and, in turn, benefits water resource management, hydropower development, and ecological integrity over the lower reaches of the Yangtze River, one of the largest rivers globally. In this study, an improved CREST model coupled with a snow and glacier melting module was used to simulate SWE and TWS changes over the UYR, and to quantify contributions of snow and glacier meltwater to the total runoff. Forcing, calibration, and validation data are mainly from multi-source remote sensing observations, including satellite-based precipitation estimates, passive microwave remote sensing-based SWE, and GRACE-derived TWS changes, along with streamflow measurements at the Zhimenda gauging station. Results show that multi-source remote sensing information can be extremely valuable in model forcing, calibration, and validation over the poorly gauged region. The simulated SWE and TWS changes and the observed counterparts are highly consistent, showing NSE coefficients higher than 0.8. The results also show that the contributions of snow and glacier meltwater to the total runoff are 8% and 6%, respectively, during the period 2003‒2014, which is an important source of runoff. Moreover, from this study, the TWS is found to increase at a rate of 5 mm/a (0.72 Gt/a) for the period 2003‒2014. The snow melting module may overestimate SWE for high precipitation events and was improved in this study. Key words: CREST model; Remote Sensing; Melting model; Source Region of the Yangtze River
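One common way a snow module of this kind is implemented is with degree-day accounting; the sketch below illustrates that idea with assumed parameter values and is not necessarily the scheme calibrated in the improved CREST model.

```python
# A minimal sketch of degree-day snowmelt accounting; the degree-day factor and
# melt threshold are assumed values, not calibrated parameters from the paper.
import numpy as np

temp_c = np.array([-6.0, -2.0, 1.5, 4.0, 7.0, 3.0, -1.0])   # daily mean air temperature, degC
precip_mm = np.array([5.0, 0.0, 3.0, 8.0, 0.0, 2.0, 4.0])   # daily precipitation, mm

ddf = 3.5       # degree-day factor, mm of melt per degC per day (assumed)
t_melt = 0.0    # melt threshold temperature, degC (assumed)

swe = 0.0
for t, p in zip(temp_c, precip_mm):
    snowfall = p if t <= t_melt else 0.0             # precipitation falls as snow when cold
    swe += snowfall
    melt = min(swe, ddf * max(t - t_melt, 0.0))      # melt limited by available SWE
    swe -= melt
    print(f"T={t:5.1f} degC  snowfall={snowfall:4.1f}  melt={melt:4.1f}  SWE={swe:5.1f} mm")
```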
The Multi-energy High precision Data Processor Based on AD7606
NASA Astrophysics Data System (ADS)
Zhao, Chen; Zhang, Yanchi; Xie, Da
2017-11-01
This paper designs an information collector based on the AD7606 to realize high-precision simultaneous acquisition of multi-source information from multi-energy systems, forming the information platform of the energy Internet at Laogang, where electricity is the major energy source. Combined with information fusion technologies, the collected data are analyzed to improve the overall scheduling capability and reliability of the energy system.
NASA Astrophysics Data System (ADS)
Zhang, Z.; Xiao, R.; Li, X.
2015-12-01
The peri-urban area is a new type of region shaped by both rural industrialization and the influence of the metropolis during rapid urbanization. Because of its complex natural and social characteristics and unique development patterns, many problems such as environmental pollution and wasteful land use have emerged and urgently need to be addressed. The study area in this paper covers three typical peri-urban districts (Pudong, Fengxian and Jinshan) around the Shanghai inner city. Coupling a cellular automata and multi-agent system model as the basic tool, this research focuses on modelling urban land expansion and its driving mechanism in the peri-urban area. Big data are also combined with the Bayesian maximum entropy method (BME) for spatiotemporal prediction from multi-source data, expanding the dataset available to urban expansion models. Data assimilation is used to optimize the parameters of the coupled model and minimize the uncertainty of the observations, improving the precision of future simulations of the peri-urban area. By setting quantitative parameters, the coupled model can effectively simulate the process of urban land expansion under different policies and management schemes, providing scientific implications for the new urbanization strategy. This research refines urban land expansion simulation and prediction for the peri-urban area, expands the scope of data acquisition measurements and methods, develops new applications of data assimilation in geographical science, offers a new way to understand the inherent rules of urban land expansion, and gives theoretical and practical support for urban planning and decision making in the peri-urban area.
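A minimal sketch of a single cellular-automata transition step of the kind such a coupled CA/multi-agent model builds on is shown below; the neighbourhood rule, suitability surface and conversion probability are illustrative assumptions, not the calibrated model.

```python
# A minimal sketch of one cellular-automata urban growth step; the rule and
# surfaces are illustrative assumptions, not the calibrated coupled model.
import numpy as np
from scipy.ndimage import convolve

rng = np.random.default_rng(6)
grid = (rng.random((50, 50)) < 0.10).astype(int)   # 1 = urban, 0 = rural (synthetic seed)
suitability = rng.random((50, 50))                 # land suitability surface (synthetic)

# Count urban cells in the 3x3 Moore neighbourhood of every cell.
kernel = np.array([[1, 1, 1], [1, 0, 1], [1, 1, 1]])
neighbours = convolve(grid, kernel, mode="constant", cval=0)

# Rural cells convert with a probability that grows with neighbourhood
# pressure and local suitability.
p_convert = suitability * neighbours / 8.0
converts = (grid == 0) & (rng.random(grid.shape) < p_convert)
grid[converts] = 1
print("urban cells after one step:", grid.sum())
```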
Lai, Michelle Mei Yee; Roberts, Noel; Martin, Jenepher
2014-09-17
Oral feedback from clinical educators is the traditional teaching method for improving clinical consultation skills in medical students. New approaches are needed to enhance this teaching model. Multisource feedback is a commonly used assessment method for learning among practising clinicians, but this assessment has not been explored rigorously in medical student education. This study seeks to evaluate if additional feedback on patient satisfaction improves medical student performance. The Patient Teaching Associate (PTA) Feedback Study is a single site randomized controlled, double-blinded trial with two parallel groups.An after-hours general practitioner clinic in Victoria, Australia, is adapted as a teaching clinic during the day. Medical students from two universities in their first clinical year participate in six simulated clinical consultations with ambulatory patient volunteers living with chronic illness. Eligible students will be randomized in equal proportions to receive patient satisfaction score feedback with the usual multisource feedback and the usual multisource feedback alone as control. Block randomization will be performed. We will assess patient satisfaction and consultation performance outcomes at baseline and after one semester and will compare any change in mean scores at the last session from that at baseline. We will model data using regression analysis to determine any differences between intervention and control groups. Full ethical approval has been obtained for the study. This trial will comply with CONSORT guidelines and we will disseminate data at conferences and in peer-reviewed journals. This is the first proposed trial to determine whether consumer feedback enhances the use of multisource feedback in medical student education, and to assess the value of multisource feedback in teaching and learning about the management of ambulatory patients living with chronic conditions. Australian New Zealand Clinical Trials Registry (ANZCTR): ACTRN12613001055796.
Cui, Tianxiang; Wang, Yujie; Sun, Rui; Qiao, Chen; Fan, Wenjie; Jiang, Guoqing; Hao, Lvyuan; Zhang, Lei
2016-01-01
Estimating gross primary production (GPP) and net primary production (NPP) is of significant importance in studying carbon cycles. Using models driven by multi-source and multi-scale data is a promising approach to estimate GPP and NPP at regional and global scales. With a focus on data that are openly accessible, this paper presents a GPP and NPP model driven by remotely sensed data and meteorological data with spatial resolutions varying from 30 m to 0.25 degree and temporal resolutions ranging from 3 hours to 1 month, by integrating remote sensing techniques and eco-physiological process theories. Our model is also designed as part of the Multi-source data Synergized Quantitative (MuSyQ) Remote Sensing Production System. In the presented MuSyQ-NPP algorithm, daily GPP for a 10-day period was calculated as a product of incident photosynthetically active radiation (PAR) and its fraction absorbed by vegetation (FPAR) using a light use efficiency (LUE) model. The autotrophic respiration (Ra) was determined using eco-physiological process theories and the daily NPP was obtained as the balance between GPP and Ra. To test its feasibility at regional scales, our model was performed in an arid and semi-arid region of Heihe River Basin, China to generate daily GPP and NPP during the growing season of 2012. The results indicated that both GPP and NPP exhibit clear spatial and temporal patterns in their distribution over Heihe River Basin during the growing season due to the temperature, water and solar influx conditions. After validation against ground-based measurements, the MODIS GPP product (MOD17A2H) and results reported in recent literature, we found the MuSyQ-NPP algorithm could yield an RMSE of 2.973 gC m-2 d-1 and an R of 0.842 when compared with ground-based GPP, while an RMSE of 8.010 gC m-2 d-1 and an R of 0.682 can be achieved for MODIS GPP; the estimated NPP values were also well within the range of previous literature, which proved the reliability of our modelling results. This research suggested that the utilization of multi-source data at various scales would help establish an appropriate model for calculating GPP and NPP at regional scales with relatively high spatial and temporal resolution. PMID:27088356
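The light-use-efficiency bookkeeping described in the abstract (GPP as the product of incident PAR, FPAR and a light use efficiency, with NPP as GPP minus autotrophic respiration Ra) can be sketched as follows; the scalar values and the fixed Ra/GPP ratio are illustrative assumptions, not MuSyQ-NPP outputs.

```python
# A minimal sketch of the light-use-efficiency bookkeeping: GPP = PAR x FPAR x LUE,
# NPP = GPP - Ra; all scalar values below are illustrative assumptions.
import numpy as np

par = np.array([9.5, 10.2, 11.0])        # incident PAR, MJ m-2 d-1 (example values)
fpar = np.array([0.55, 0.60, 0.62])      # fraction of PAR absorbed by vegetation
lue_max = 1.8                            # maximum light use efficiency, gC MJ-1 (assumed)
t_scalar = np.array([0.90, 0.95, 1.00])  # temperature stress scalar (assumed)
w_scalar = np.array([0.70, 0.75, 0.80])  # water stress scalar (assumed)

gpp = par * fpar * lue_max * t_scalar * w_scalar   # gC m-2 d-1
ra = 0.45 * gpp                                    # autotrophic respiration, assumed fixed ratio
npp = gpp - ra
print("daily GPP:", gpp.round(2), "daily NPP:", npp.round(2))
```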
A Bayesian approach to multisource forest area estimation
Andrew O. Finley
2007-01-01
In efforts such as land use change monitoring, carbon budgeting, and forecasting ecological conditions and timber supply, demand is increasing for regional and national data layers depicting forest cover. These data layers must permit small area estimates of forest and, most importantly, provide associated error estimates. This paper presents a model-based approach for...
ERIC Educational Resources Information Center
Burns, G. Leonard; Desmul, Chris; Walsh, James A.; Silpakit, Chatchawan; Ussahawanitchakit, Phapruke
2009-01-01
Confirmatory factor analysis was used with a multitrait (attention-deficit/hyperactivity disorder-inattention, attention-deficit/hyperactivity disorder-hyperactivity/impulsivity, oppositional defiant disorder toward adults, academic competence, and social competence) by multisource (mothers and fathers) matrix to test the invariance and…
Ng, Kok-Yee; Koh, Christine; Ang, Soon; Kennedy, Jeffrey C; Chan, Kim-Yin
2011-09-01
This study extends multisource feedback research by assessing the effects of rater source and raters' cultural value orientations on rating bias (leniency and halo). Using a motivational perspective of performance appraisal, the authors posit that subordinate raters followed by peers will exhibit more rating bias than superiors. More important, given that multisource feedback systems were premised on low power distance and individualistic cultural assumptions, the authors expect raters' power distance and individualism-collectivism orientations to moderate the effects of rater source on rating bias. Hierarchical linear modeling on data collected from 1,447 superiors, peers, and subordinates who provided developmental feedback to 172 military officers show that (a) subordinates exhibit the most rating leniency, followed by peers and superiors; (b) subordinates demonstrate more halo than superiors and peers, whereas superiors and peers do not differ; (c) the effects of power distance on leniency and halo are strongest for subordinates than for peers and superiors; (d) the effects of collectivism on leniency were stronger for subordinates and peers than for superiors; effects on halo were stronger for subordinates than superiors, but these effects did not differ for subordinates and peers. The present findings highlight the role of raters' cultural values in multisource feedback ratings. PsycINFO Database Record (c) 2011 APA, all rights reserved
NASA Astrophysics Data System (ADS)
Ni, X. Y.; Huang, H.; Du, W. P.
2017-02-01
The PM2.5 problem is proving to be a major public crisis and is of great public concern requiring an urgent response. Information about, and prediction of, PM2.5 from the perspective of atmospheric dynamic theory is still limited due to the complexity of the formation and development of PM2.5. In this paper, we attempt a relevance analysis and short-term prediction of PM2.5 concentrations in Beijing, China, using multi-source data mining. A correlation analysis model relating PM2.5 to physical data (meteorological data, including regional average rainfall, daily mean temperature, average relative humidity, average wind speed and maximum wind speed; and other pollutant concentrations, including CO, NO2, SO2 and PM10) and social media data (microblog data) is proposed, based on multivariate statistical analysis. The study found that among these factors, the average wind speed, the concentrations of CO, NO2 and PM10, and the daily number of microblog entries with the key words 'Beijing; Air pollution' show high correlation with PM2.5 concentrations. The correlation analysis was further studied with a machine learning model, the Back Propagation Neural Network (hereinafter referred to as BPNN), which was found to perform better in correlation mining. Finally, an Autoregressive Integrated Moving Average (hereinafter referred to as ARIMA) time series model was applied to explore short-term prediction of PM2.5. The predicted results were in good agreement with the observed data. This study helps realize real-time monitoring, analysis and pre-warning of PM2.5 and also broadens the application of big data and multi-source data mining methods.
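A minimal sketch of short-term PM2.5 forecasting with an ARIMA model on a synthetic daily series is given below; the ARIMA order and the series itself are assumptions for illustration, not the model identified in the study.

```python
# A minimal sketch of short-term PM2.5 forecasting with ARIMA on a synthetic
# daily series; the order (2, 1, 1) is an assumption for illustration.
import numpy as np
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(5)
days = pd.date_range("2016-01-01", periods=120, freq="D")
pm25 = pd.Series(80 + 30 * np.sin(np.arange(120) / 7.0) + rng.normal(0, 10, 120),
                 index=days, name="pm25")

fit = ARIMA(pm25, order=(2, 1, 1)).fit()
print(fit.forecast(steps=3))    # predicted PM2.5 for the next three days
```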
Gradient-Type Magnetoelectric Current Sensor with Strong Multisource Noise Suppression.
Zhang, Mingji; Or, Siu Wing
2018-02-14
A novel gradient-type magnetoelectric (ME) current sensor operating in magnetic field gradient (MFG) detection and conversion mode is developed based on a pair of ME composites that have a back-to-back capacitor configuration under a baseline separation and a magnetic biasing in an electrically-shielded and mechanically-enclosed housing. The physics behind the current sensing process is the product effect of the current-induced MFG effect associated with vortex magnetic fields of current-carrying cables (i.e., MFG detection) and the MFG-induced ME effect in the ME composite pair (i.e., MFG conversion). The sensor output voltage is directly obtained from the gradient ME voltage of the ME composite pair and is calibrated against cable current to give the current sensitivity. The current sensing performance of the sensor is evaluated, both theoretically and experimentally, under multisource noises of electric fields, magnetic fields, vibrations, and thermals. The sensor combines the merits of small nonlinearity in the current-induced MFG effect with those of high sensitivity and high common-mode noise rejection rate in the MFG-induced ME effect to achieve a high current sensitivity of 0.65-12.55 mV/A in the frequency range of 10 Hz-170 kHz, a small input-output nonlinearity of <500 ppm, a small thermal drift of <0.2%/℃ in the current range of 0-20 A, and a high common-mode noise rejection rate of 17-28 dB from multisource noises.
2010-07-01
The University at Buffalo (UB) Center for Multisource Information Fusion (CMIF), along with a team including the Pennsylvania State University (PSU), Iona College (Iona), and Tennessee State... of CMIF current research on methods for Test and Evaluation ([7], [8]) involving, for example, large-factor-space experimental design techniques ([9]...
Objective Work-Nonwork Conflict: From Incompatible Demands to Decreased Work Role Performance
ERIC Educational Resources Information Center
Haun, Sascha; Steinmetz, Holger; Dormann, Christian
2011-01-01
Research on work-nonwork conflict (WNC) is based on the assumption that incompatible demands from the work and the nonwork domain hamper role performance. This assumption implies that role demands from both domains interact in predicting role performance, but research has been largely limited to main effects. In this multi-source study, we analyze…
Multi-Sensor Triangulation of Multi-Source Spatial Data
NASA Technical Reports Server (NTRS)
Habib, Ayman; Kim, Chang-Jae; Bang, Ki-In
2007-01-01
The introduced methodologies are successful in: a) using LIDAR features for photogrammetric geo-referencing; b) delivering geo-referenced imagery of the same quality as point-based geo-referencing procedures; and c) taking advantage of the synergistic characteristics of spatial data acquisition systems. The triangulation output can be used for the generation of 3-D perspective views.
Sound Localization in Multisource Environments
2009-03-01
A total of 7 paid volunteer listeners (3 males and 4 females, 20-25 years of age) participated in the experiment. All had normal hearing (i.e... effects of the loudspeaker frequency responses, and were then sent from an experimental control computer to a Mark of the Unicorn (MOTU 24 I/O) digital-to... after the overall multisource stimulus has been presented (the 'post-cue' condition)... Eight listeners, ranging in age from
A beam optics study of a modular multi-source X-ray tube for novel computed tomography applications
NASA Astrophysics Data System (ADS)
Walker, Brandon J.; Radtke, Jeff; Chen, Guang-Hong; Eliceiri, Kevin W.; Mackie, Thomas R.
2017-10-01
A modular implementation of a scanning multi-source X-ray tube is designed for the increasing number of multi-source imaging applications in computed tomography (CT). An electron beam array coupled with an oscillating magnetic deflector is proposed as a means for producing an X-ray focal spot at any position along a line. The preliminary multi-source model includes three thermionic electron guns that are deflected in tandem by a slowly varying magnetic field and pulsed according to a scanning sequence that is dependent on the intended imaging application. Particle tracking simulations with particle dynamics analysis software demonstrate that three 100 keV electron beams are laterally swept a combined distance of 15 cm over a stationary target with an oscillating magnetic field of 102 G perpendicular to the beam axis. Beam modulation is accomplished using 25 μs pulse widths to a grid electrode with a reverse gate bias of -500 V and an extraction voltage of +1000 V. Projected focal spot diameters are approximately 1 mm for 138 mA electron beams and the stationary target stays within thermal limits for the 14 kW module. This concept could be used as a research platform for investigating high-speed stationary CT scanners, for lowering dose with virtual fan beam formation, for reducing scatter radiation in cone-beam CT, or for other industrial applications.
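The lateral sweep implied by the quoted field strength can be sanity-checked with a short back-of-the-envelope calculation of the relativistic gyroradius of a 100 keV electron in a 102 G transverse field; this is only a rough consistency check, not the particle-dynamics simulation used in the study, and all constants are standard physical values.

```python
# Back-of-the-envelope check (not the particle-dynamics simulation in the paper):
# gyroradius r = p / (qB) for a 100 keV electron in a 102 G transverse field.
import math

m_e_c2 = 510.999e3          # electron rest energy, eV
E_kin = 100e3               # kinetic energy, eV
B = 102e-4                  # 102 gauss in tesla
c = 299_792_458.0           # speed of light, m/s

E_tot = E_kin + m_e_c2                          # total energy, eV
pc = math.sqrt(E_tot**2 - m_e_c2**2)            # momentum times c, in eV
r = pc / (B * c)                                # r = p/(eB); the eV and e cancel
print(f"gyroradius ~ {r*100:.1f} cm")           # roughly 11 cm
```

A gyroradius of roughly 11 cm is at least of the right order of magnitude for steering a beam across a 15 cm target line, although the actual sweep geometry is set by the simulated deflector and gun layout in the paper.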
General practitioner registrars' experiences of multisource feedback: a qualitative study.
Findlay, Nigel
2012-09-01
To explore the experiences of general practitioner (GP) specialty training registrars, thereby generating more understanding of the ways in which multisource feedback impacts upon their self-perceptions and professional behaviour, and provide information that might guide its use in the revalidation process of practising GPs. Complete transcripts of semi-structured, audio-taped qualitative interviews were analysed using the constant comparative method, to describe the experiences of multisource feedback for individual registrars. Five GP registrars participated. The first theme to emerge was the importance of the educational supervisor in encouraging the registrar through the emotional response, then facilitating interpretation of feedback and personal development. The second was the differing attitudes to learning and development, which may be in conflict with threats to self-image. The current RCGP format for obtaining multisource feedback for GP registrars may not always be achieving its purpose of challenging self-perceptions and motivating improved performance. An enhanced qualitative approach, through personal interviews rather than anonymous questionnaires, may provide a more accurate picture. This would address the concerns of some registrars by reducing their logistical burden and may facilitate more constructive feedback. The educational supervisor has an important role in promoting personal development, once this feedback is shared. The challenge for teaching organisations is to create a climate of comfort for learning, yet encourage learning beyond a 'comfort zone'.
Research on multi-source image fusion technology in haze environment
NASA Astrophysics Data System (ADS)
Ma, GuoDong; Piao, Yan; Li, Bing
2017-11-01
In a haze environment, the visible image collected by a single sensor can express the details of the shape, color and texture of the target very well, but because of the haze, its sharpness is low and some of the target subjects are lost. Because infrared images express thermal radiation and have strong penetration ability, an infrared image collected by a single sensor can clearly express the target subject, but it loses detail information. Therefore, a multi-source image fusion method is proposed to exploit their respective advantages. Firstly, an improved Dark Channel Prior algorithm is used to preprocess the hazy visible image. Secondly, an improved SURF algorithm is used to register the infrared image and the dehazed visible image. Finally, a weighted fusion algorithm based on information complementarity is used to fuse the images. Experiments show that the proposed method can improve the clarity of the visible target and highlight the occluded infrared target for target recognition.
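A condensed sketch of two of the steps mentioned above: computing a dark channel for the hazy visible image and blending the registered visible and infrared images with fixed weights. It assumes already co-registered input files (the names are placeholders) and uses plain OpenCV/NumPy; the paper's improved Dark Channel Prior and improved SURF variants are not reproduced here.

```python
# Simplified sketch of two steps from the pipeline described above.
# Assumes visible.png and infrared.png are already co-registered; the paper's
# improved Dark Channel Prior / SURF variants are not reproduced here.
import cv2
import numpy as np

def dark_channel(img_bgr, patch=15):
    """Per-pixel minimum over color channels, then a local minimum filter."""
    min_rgb = np.min(img_bgr, axis=2)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))
    return cv2.erode(min_rgb, kernel)

def weighted_fusion(visible_gray, infrared_gray, w_vis=0.6):
    """Fixed-weight complementary fusion (the paper derives adaptive weights)."""
    return cv2.addWeighted(visible_gray, w_vis, infrared_gray, 1.0 - w_vis, 0)

visible = cv2.imread("visible.png")                      # hazy visible image
infrared = cv2.imread("infrared.png", cv2.IMREAD_GRAYSCALE)

dc = dark_channel(visible)                               # input to haze estimation
vis_gray = cv2.cvtColor(visible, cv2.COLOR_BGR2GRAY)
fused = weighted_fusion(vis_gray, infrared)
cv2.imwrite("fused.png", fused)
```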
Ren, Yin; Deng, Lu-Ying; Zuo, Shu-Di; Song, Xiao-Dong; Liao, Yi-Lan; Xu, Cheng-Dong; Chen, Qi; Hua, Li-Zhong; Li, Zheng-Wei
2016-09-01
Identifying factors that influence the land surface temperature (LST) of urban forests can help improve simulations and predictions of spatial patterns of urban cool islands. This requires a quantitative analytical method that combines spatial statistical analysis with multi-source observational data. The purpose of this study was to reveal how human activities and ecological factors jointly influence LST in clustering regions (hot or cool spots) of urban forests. Using Xiamen City, China from 1996 to 2006 as a case study, we explored the interactions between human activities and ecological factors, as well as their influences on urban forest LST. Population density was selected as a proxy for human activity. We integrated multi-source data (forest inventory, digital elevation models (DEM), population, and remote sensing imagery) to develop a database on a unified urban scale. The driving mechanism of urban forest LST was revealed through a combination of multi-source spatial data and spatial statistical analysis of clustering regions. The results showed that the main factors contributing to urban forest LST were dominant tree species and elevation. The interactions between human activity and specific ecological factors linearly or nonlinearly increased LST in urban forests. Strong interactions between elevation and dominant species were generally observed and were prevalent in both hot-spot and cold-spot areas in different years. In conclusion, quantitative studies based on spatial statistics and GeogDetector models should be conducted in urban areas to reveal interactions between human activities, ecological factors, and LST. Copyright © 2016 Elsevier Ltd. All rights reserved.
Group Multilateral Relation Analysis Based on Large Data
NASA Astrophysics Data System (ADS)
LIU, Qiang; ZHOU, Guo-min; CHEN, Guang-xuan; XU, Yong
2017-09-01
Massive, multi-source, heterogeneous police data and social data bring challenges to current police work. Taking the existing massive data resources as the research object, this study mines group multilateral relations by applying big data technology to the archived data. The results of the study could provide technical support to police enforcement departments for fighting and preventing crime.
NASA Astrophysics Data System (ADS)
Bi, Chuan-Xing; Geng, Lin; Zhang, Xiao-Zheng
2016-05-01
In the sound field with multiple non-stationary sources, the measured pressure is the sum of the pressures generated by all sources, and thus cannot be used directly for studying the vibration and sound radiation characteristics of every source alone. This paper proposes a separation model based on the interpolated time-domain equivalent source method (ITDESM) to separate the pressure field belonging to every source from the non-stationary multi-source sound field. In the proposed method, ITDESM is first extended to establish the relationship between the mixed time-dependent pressure and all the equivalent sources distributed on every source with known location and geometry information, and all the equivalent source strengths at each time step are solved by an iterative solving process; then, the corresponding equivalent source strengths of one interested source are used to calculate the pressure field generated by that source alone. Numerical simulation of two baffled circular pistons demonstrates that the proposed method can be effective in separating the non-stationary pressure generated by every source alone in both time and space domains. An experiment with two speakers in a semi-anechoic chamber further evidences the effectiveness of the proposed method.
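A toy frequency-domain analog of the equivalent-source separation idea described above: solve for all equivalent source strengths from the mixed pressure measured at an array, then re-radiate only the strengths attached to one source. The actual method (ITDESM) works in the time domain with an iterative solver; the free-field Green's function, single frequency, and random geometry below are simplifying assumptions for illustration only.

```python
# Toy frequency-domain analog of equivalent-source separation (the paper's
# ITDESM is time-domain and iterative; geometry and frequency here are assumed).
import numpy as np

def greens(src, rec, k):
    """Free-field Green's function exp(-jkr)/(4*pi*r) between point sets."""
    r = np.linalg.norm(rec[:, None, :] - src[None, :, :], axis=2)
    return np.exp(-1j * k * r) / (4 * np.pi * r)

k = 2 * np.pi * 500 / 343.0                             # wavenumber at 500 Hz
eq_src_A = np.random.rand(20, 3) * 0.1                  # equivalent sources on source A
eq_src_B = np.random.rand(20, 3) * 0.1 + [1.0, 0, 0]    # equivalent sources on source B
mics = np.c_[np.random.rand(64, 2), np.full(64, 0.5)]   # measurement plane at z = 0.5 m

G = greens(np.vstack([eq_src_A, eq_src_B]), mics, k)    # mixed transfer matrix
q_true = np.random.randn(40) + 1j * np.random.randn(40)
p_mixed = G @ q_true                                    # "measured" mixed field

q_est, *_ = np.linalg.lstsq(G, p_mixed, rcond=None)     # solve all strengths at once
p_A_only = greens(eq_src_A, mics, k) @ q_est[:20]       # re-radiate source A alone
err = np.linalg.norm(p_A_only - greens(eq_src_A, mics, k) @ q_true[:20])
print("residual of separated field:", err)
```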
A multi-source feedback tool for measuring a subset of Pediatrics Milestones.
Schwartz, Alan; Margolis, Melissa J; Multerer, Sara; Haftel, Hilary M; Schumacher, Daniel J
2016-10-01
The Pediatrics Milestones Assessment Pilot employed a new multisource feedback (MSF) instrument to assess nine Pediatrics Milestones among interns and subinterns in the inpatient context. To report validity evidence for the MSF tool for informing milestone classification decisions. We obtained MSF instruments from different raters for each learner in each rotation. We present evidence for validity based on the unified validity framework. One hundred and ninety-two interns and 41 subinterns at 18 Pediatrics residency programs received a total of 1084 MSF forms from faculty (40%), senior residents (34%), nurses (22%), and other staff (4%). Variance in ratings was associated primarily with rater (32%) and learner (22%). The milestone factor structure fit the data better than simpler structures. In domains except professionalism, ratings by nurses were significantly lower than those by faculty, and ratings by other staff were significantly higher. Ratings were higher when the rater observed the learner for longer periods and had a positive global opinion of the learner. Ratings of interns and subinterns did not differ, except for ratings by senior residents. MSF-based scales correlated with summative milestone scores. We obtain moderately reliable MSF ratings of interns and subinterns in the inpatient context to inform some milestone assignments.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gastelum, Zoe N.; White, Amanda M.; Whitney, Paul D.
2013-06-04
The Multi-Source Signatures for Nuclear Programs project, part of Pacific Northwest National Laboratory's (PNNL) Signature Discovery Initiative, seeks to computationally capture expert assessment of multi-type information such as text, sensor output, imagery, or audio/video files, to assess nuclear activities through a series of Bayesian network (BN) models. These models incorporate knowledge from a diverse range of information sources in order to help assess a country's nuclear activities. The models span engineering topic areas, state-level indicators, and facility-specific characteristics. To illustrate the development, calibration, and use of BN models for multi-source assessment, we present a model that predicts a country's likelihood to participate in the international nuclear nonproliferation regime. We validate this model by examining the extent to which the model assists non-experts in arriving at conclusions similar to those provided by nuclear proliferation experts. We also describe the PNNL-developed software used throughout the lifecycle of the Bayesian network model development.
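For readers unfamiliar with the mechanics, a two-node Bayesian network evaluated by direct enumeration illustrates the kind of conditional-probability reasoning such models encode. The node names, states, and probabilities below are invented for the example and are unrelated to the actual PNNL models.

```python
# Illustrative two-node Bayesian network, evaluated by direct enumeration.
# Node names and probabilities are invented; they do not reflect the PNNL models.
# Structure: Engagement -> Observed_Activity (both binary).
p_engagement = {"yes": 0.7, "no": 0.3}                   # prior P(E)
p_activity_given_e = {                                   # likelihood P(A | E)
    "yes": {"observed": 0.8, "not_observed": 0.2},
    "no":  {"observed": 0.3, "not_observed": 0.7},
}

def posterior_engagement(activity_state):
    """P(Engagement | Observed_Activity = activity_state) via Bayes' rule."""
    joint = {e: p_engagement[e] * p_activity_given_e[e][activity_state]
             for e in p_engagement}
    z = sum(joint.values())
    return {e: v / z for e, v in joint.items()}

print(posterior_engagement("observed"))   # e.g. {'yes': ~0.86, 'no': ~0.14}
```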
Prediction With Dimension Reduction of Multiple Molecular Data Sources for Patient Survival.
Kaplan, Adam; Lock, Eric F
2017-01-01
Predictive modeling from high-dimensional genomic data is often preceded by a dimension reduction step, such as principal component analysis (PCA). However, the application of PCA is not straightforward for multisource data, wherein multiple sources of 'omics data measure different but related biological components. In this article, we use recent advances in the dimension reduction of multisource data for predictive modeling. In particular, we apply exploratory results from Joint and Individual Variation Explained (JIVE), an extension of PCA for multisource data, for the prediction of differing response types. We conduct simulations to illustrate the practical advantages and interpretability of our approach. As an application example, we consider predicting survival for patients with glioblastoma multiforme from 3 data sources measuring messenger RNA expression, microRNA expression, and DNA methylation. We also introduce a method to estimate JIVE scores for new samples that were not used in the initial dimension reduction and study its theoretical properties; this method is implemented in the R package R.JIVE on CRAN, in the function jive.predict.
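As a simplified stand-in for the multisource dimension-reduction workflow (not JIVE itself, which additionally separates joint from source-specific variation), the sketch below reduces each 'omics block with its own PCA, concatenates the component scores, and fits a predictive model; the block sizes and the binarized outcome are synthetic placeholders.

```python
# Simplified stand-in for multisource dimension reduction before prediction.
# Unlike JIVE, per-block PCA does not separate joint from source-specific
# variation; data shapes and the binary outcome are synthetic placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n = 120
mrna = rng.normal(size=(n, 500))        # messenger RNA expression block
mirna = rng.normal(size=(n, 200))       # microRNA expression block
methyl = rng.normal(size=(n, 800))      # DNA methylation block
outcome = rng.integers(0, 2, size=n)    # binarized survival as a stand-in

scores = np.hstack([PCA(n_components=5, random_state=0).fit_transform(block)
                    for block in (mrna, mirna, methyl)])
clf = LogisticRegression(max_iter=1000)
print(cross_val_score(clf, scores, outcome, cv=5).mean())
```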
A Hybrid Semi-supervised Classification Scheme for Mining Multisource Geospatial Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vatsavai, Raju; Bhaduri, Budhendra L
2011-01-01
Supervised learning methods such as Maximum Likelihood (ML) are often used in land cover (thematic) classification of remote sensing imagery. The ML classifier relies exclusively on the spectral characteristics of thematic classes, whose statistical distributions (class conditional probability densities) are often overlapping. The spectral response distributions of thematic classes are dependent on many factors including elevation, soil types, and ecological zones. A second problem with statistical classifiers is the requirement of a large number of accurate training samples (10 to 30 |dimensions|), which are often costly and time consuming to acquire over large geographic regions. With the increasing availability of geospatial databases, it is possible to exploit the knowledge derived from these ancillary datasets to improve classification accuracies even when the class distributions are highly overlapping. Likewise, newer semi-supervised techniques can be adopted to improve the parameter estimates of the statistical model by utilizing a large number of easily available unlabeled training samples. Unfortunately, there is no convenient multivariate statistical model that can be employed for multisource geospatial databases. In this paper we present a hybrid semi-supervised learning algorithm that effectively exploits freely available unlabeled training samples from multispectral remote sensing images and also incorporates ancillary geospatial databases. We have conducted several experiments on real datasets, and our new hybrid approach shows a 25 to 35% improvement in overall classification accuracy over conventional classification schemes.
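A compact sketch of the semi-supervised idea in the abstract: start from ML estimates of Gaussian class models on a few labeled samples and refine the parameters with EM over many unlabeled samples. One-dimensional "spectra", two classes, and scalar variances are simplifying assumptions, and the ancillary-geospatial-knowledge part of the hybrid scheme is not modeled.

```python
# Sketch of semi-supervised refinement of Gaussian class models with EM:
# labeled pixels give initial ML estimates, unlabeled pixels refine them.
# 1-D "spectra" and two classes are simplifying assumptions; the ancillary
# geospatial knowledge used by the hybrid scheme is not modeled here.
import numpy as np

rng = np.random.default_rng(2)
x_lab = np.r_[rng.normal(0.2, 0.05, 15), rng.normal(0.35, 0.05, 15)]
y_lab = np.r_[np.zeros(15, int), np.ones(15, int)]
x_unl = np.r_[rng.normal(0.2, 0.05, 500), rng.normal(0.35, 0.05, 500)]

# Initial ML estimates from the (small) labeled set.
mu = np.array([x_lab[y_lab == c].mean() for c in (0, 1)])
sd = np.array([x_lab[y_lab == c].std() for c in (0, 1)])
prior = np.array([0.5, 0.5])

def gauss(x, m, s):
    return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2 * np.pi))

for _ in range(20):                                      # EM on unlabeled samples
    resp = prior * np.stack([gauss(x_unl, mu[c], sd[c]) for c in (0, 1)], axis=1)
    resp /= resp.sum(axis=1, keepdims=True)              # E-step: class posteriors
    nk = resp.sum(axis=0)
    mu = (resp * x_unl[:, None]).sum(axis=0) / nk        # M-step: update means
    sd = np.sqrt((resp * (x_unl[:, None] - mu) ** 2).sum(axis=0) / nk)
    prior = nk / nk.sum()

print("refined class means:", mu)
```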
Luck, Margaux; Bertho, Gildas; Bateson, Mathilde; Karras, Alexandre; Yartseva, Anastasia; Thervet, Eric
2016-01-01
1H Nuclear Magnetic Resonance (NMR)-based metabolic profiling is very promising for the diagnosis of the stages of chronic kidney disease (CKD). Because of the high dimension of NMR spectra datasets and the complex mixture of metabolites in biological samples, the identification of discriminant biomarkers of a disease is challenging. None of the widely used chemometric methods in NMR metabolomics performs a local exhaustive exploration of the data. We developed a descriptive and easily understandable approach that searches for discriminant local phenomena using an original exhaustive rule-mining algorithm in order to predict two groups of patients: 1) patients having low to mild CKD stages with no renal failure and 2) patients having moderate to established CKD stages with renal failure. Our predictive algorithm explores the m-dimensional variable space to capture the local overdensities of the two groups of patients in the form of easily interpretable rules. Afterwards, an L2-penalized logistic regression on the discriminant rules was used to build predictive models of the CKD stages. We explored a complex multi-source dataset that included the clinical, demographic, clinical chemistry, renal pathology and urine metabolomic data of a cohort of 110 patients. Given this multi-source dataset and the complex nature of metabolomic data, we analyzed 1- and 2-dimensional rules in order to integrate the information carried by the interactions between the variables. The results indicated that our local algorithm is a valuable analytical method for the precise characterization of multivariate CKD stage profiles and is as efficient as the classical global model using chi2 variable selection, with approximately 70% correct classification. The resulting predictive models predominantly identify urinary metabolites (such as 3-hydroxyisovalerate, carnitine, citrate, dimethylsulfone, creatinine and N-methylnicotinamide) as relevant variables, indicating that CKD significantly affects the urinary metabolome. In addition, the simple knowledge of the concentration of urinary metabolites classifies the CKD stage of the patients correctly. PMID:27861591
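A minimal sketch of the final modeling step described above: encode simple 1-dimensional interval rules as binary features and fit an L2-penalized logistic regression on them. The metabolite names, thresholds, and data are placeholders, and the exhaustive 1-D/2-D rule-mining algorithm itself is not reproduced.

```python
# Sketch of the final step described above: binary rule features followed by an
# L2-penalized logistic regression. Metabolite names, thresholds and data are
# placeholders; the exhaustive 1-D/2-D rule mining itself is not reproduced.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 110
X = pd.DataFrame({
    "citrate": rng.normal(1.0, 0.3, n),
    "carnitine": rng.normal(0.5, 0.2, n),
})
y = (X["citrate"] + rng.normal(0, 0.2, n) > 1.1).astype(int)   # CKD stage group

# 1-D "rules": indicators of a concentration falling in a local interval.
rules = pd.DataFrame({
    "citrate_in_[0.8,1.1)": X["citrate"].between(0.8, 1.1, inclusive="left"),
    "citrate_>=_1.1": X["citrate"] >= 1.1,
    "carnitine_<_0.4": X["carnitine"] < 0.4,
}).astype(int)

clf = LogisticRegression(penalty="l2", C=1.0, max_iter=1000).fit(rules, y)
print(dict(zip(rules.columns, clf.coef_.round(2)[0])))
```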
A Bayesian Framework of Uncertainties Integration in 3D Geological Model
NASA Astrophysics Data System (ADS)
Liang, D.; Liu, X.
2017-12-01
3D geological models can describe complicated geological phenomena in an intuitive way, but their application may be limited by uncertain factors. Great progress has been made over the years, yet many studies decompose the uncertainties of a geological model and analyze them item by item from each source, ignoring the comprehensive impact of multi-source uncertainties. To evaluate this synthetical uncertainty, we choose probability distributions to quantify uncertainty and propose a Bayesian framework for uncertainty integration. With this framework, we integrated data errors, spatial randomness, and cognitive information into a posterior distribution to evaluate the synthetical uncertainty of a geological model. Uncertainties propagate and accumulate in the modeling process, so the gradual integration of multi-source uncertainty is a kind of simulation of uncertainty propagation. Bayesian inference accomplishes uncertainty updating in the modeling process. The maximum entropy principle works well for estimating the prior probability distribution, ensuring that the prior distribution is subject to the constraints supplied by the given information with minimum prejudice. In the end, we obtained a posterior distribution to evaluate the synthetical uncertainty of the geological model. This posterior distribution represents the synthetical impact of all the uncertain factors on the spatial structure of the geological model. The framework provides a solution for evaluating the synthetical impact of multi-source uncertainties on a geological model and a way to study the uncertainty propagation mechanism in geological modeling.
Processing multisource feedback during residency under the guidance of a non-medical coach
Eckenhausen, Marina A.W.; ten Cate, Olle
2018-01-01
Objectives The present study aimed to investigate residents' preferences in dealing with personal multi-source feedback (MSF) reports with or without the support of a coach. Methods Residents employed for at least half a year in the study hospital were eligible to participate. All 43 residents opting to discuss their MSF report with a psychologist-coach before discussing results with the program director were included. Semi-structured interviews were conducted following individual coaching sessions. Qualitative and quantitative data were gathered using field notes. Results Seventy-four percent (n = 32) preferred always sharing the MSF report with a coach, 21% (n = 9) if either the feedback or the relationship with the program director was less favorable, and 5% (n = 2) saw no difference between discussing with a coach or with the program director. In the final stage of training, residents more often preferred the coach (82.6%, n = 19) than in the first stages (65%, n = 13). Reasons for discussing the report with a coach included her neutral and objective position, her expertise, and the open and safe context during the discussion. Conclusions Most residents preferred discussing multisource feedback results with a coach before their meeting with a program director, particularly if the results were negative. They appeared to struggle with the dual role of the program director (coaching and judging) and appreciated the expertise of a dedicated coach to navigate this confrontation. We encourage residency programs to consider offering residents neutral coaching when processing multisource feedback. PMID:29478041
Data-to-Decisions S&T Priority Initiative
2011-11-08
Context Mapping, Track Performance Model; Multi-Source Tracking: Track Fusion, Tracking through Gaps, Move-Stop-Move; Performance Based ... Data-to-Decisions S&T Priority Initiative, Dr. Carey Schwartz, PSC Lead, Office of Naval Research, NDIA Disruptive Technologies Conference, November 8-9, 2011...
SU-C-207-01: Four-Dimensional Inverse Geometry Computed Tomography: Concept and Its Validation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kim, K; Kim, D; Kim, T
2015-06-15
Purpose: In the past few years, the inverse geometry computed tomography (IGCT) system has been developed to overcome shortcomings of the conventional computed tomography (CT) system, such as the scatter problem induced by large detector size and the cone-beam artifact. In this study, we present a concept of a four-dimensional (4D) IGCT system that retains these advantages while adding temporal resolution for dynamic studies and reduction of motion artifacts. Methods: Contrary to a conventional CT system, the projection data at a certain angle in IGCT are a group of fractionated narrow cone-beam projections, a projection group (PG), acquired from a multi-source array whose sources are operated sequentially with an extremely short time gap. For 4D IGCT imaging, time-related data acquisition parameters were determined by combining the multi-source scanning time for collecting one PG with a conventional 4D CBCT data acquisition sequence. Over a gantry rotation, the PGs acquired from the multi-source array were tagged with time and angle for 4D image reconstruction. The acquired PGs were sorted into 10 phases and image reconstructions were performed independently for each phase. An image reconstruction algorithm based on filtered backprojection was used in this study. Results: The 4D IGCT gave uniform images without cone-beam artifact, in contrast to the 4D CBCT images. In addition, the 4D IGCT images of each phase had no significant artifact induced by motion compared with 3D CT. Conclusion: The 4D IGCT images appear to give relatively accurate dynamic information of patient anatomy, as the results were more robust to motion artifacts than 3D CT. Accordingly, it will be useful for dynamic studies and respiratory-correlated radiation therapy. This work was supported by the Industrial R&D program of MOTIE/KEIT [10048997, Development of the core technology for integrated therapy devices based on real-time MRI guided tumor tracking] and the Mid-career Researcher Program (2014R1A2A1A10050270) through the National Research Foundation of Korea funded by the Ministry of Science, ICT&Future Planning.
Cervo, Silvia; Rovina, Jane; Talamini, Renato; Perin, Tiziana; Canzonieri, Vincenzo; De Paoli, Paolo; Steffan, Agostino
2013-07-30
Efforts to improve patients' understanding of their own medical treatments or of research in which they are involved are progressing, especially with regard to informed consent procedures. We aimed to design a multisource informed consent procedure that is easily adaptable to both clinical and research applications, and to evaluate its effectiveness in terms of understanding and awareness, even in less educated patients. We designed a multisource informed consent procedure for patients' enrolment in a Cancer Institute Biobank (CRO-Biobank). From October 2009 to July 2011, a total of 550 cancer patients admitted to the Centro di Riferimento Oncologico IRCCS Aviano, who agreed to contribute to its biobank, were consecutively enrolled. Participants were asked to answer a self-administered questionnaire aimed at exploring their understanding of biobanks and their needs for information on this topic, before and after study participation. Chi-square tests were performed on the questionnaire answers, according to gender or education. Of the 430 patients who returned the questionnaire, only 36.5% knew what a biobank was before participating in the study. Patients with less formal education were less informed by some sources (the Internet, newspapers, magazines, and our Institute). The final assessment test, taken after the multisource informed consent procedure, showed more than 95% correct answers. The information received was judged to be very or fairly understandable in almost all cases. More than 95% of patients were aware of participating in a biobank project, and gave helping cancer research (67.5%), moral obligation, and supporting cancer care as the main reasons for their involvement. Our multisource informed consent information system allowed a high rate of understanding and awareness of study participation, even among less-educated participants, and could be an effective and easy-to-apply model for contributing to a well-informed decision-making process in several fields, from clinical practice to research. Further studies are needed to explore the effects on study comprehension of each source of information, and of other sources suggested by participants in the questionnaire.
Validating Remotely Sensed Land Surface Evapotranspiration Based on Multi-scale Field Measurements
NASA Astrophysics Data System (ADS)
Jia, Z.; Liu, S.; Ziwei, X.; Liang, S.
2012-12-01
Land surface evapotranspiration plays an important role in the surface energy balance and the water cycle. There have been significant technical and theoretical advances in our knowledge of evapotranspiration over the past two decades. Acquisition of the temporally and spatially continuous distribution of evapotranspiration using remote sensing technology has attracted the widespread attention of researchers and managers. However, remote sensing technology still has many uncertainties coming from model mechanisms, model inputs, parameterization schemes, and scaling issues in regional estimation. Obtaining remotely sensed evapotranspiration (RS_ET) with known certainty is required but difficult. As a result, it is indispensable to develop validation methods to quantitatively assess the accuracy and error sources of regional RS_ET estimations. This study proposes an innovative validation method based on multi-scale evapotranspiration acquired from field measurements, with the validation results including the accuracy assessment, error source analysis, and uncertainty analysis of the validation process. It is a potentially useful approach to evaluate the accuracy and analyze the spatio-temporal properties of RS_ET at both the basin and local scales, and is appropriate for validating RS_ET at diverse resolutions and different time-scales. An independent RS_ET validation using this method was presented over the Hai River Basin, China, for 2002-2009 as a case study. Validation at the basin scale showed good agreement between the 1 km annual RS_ET and the validation data such as the water-balance evapotranspiration, MODIS evapotranspiration products, precipitation, and landuse types. Validation at the local scale also had good results for monthly and daily RS_ET at 30 m and 1 km resolutions, compared to the multi-scale evapotranspiration measurements from the EC and LAS, respectively, with the footprint model over three typical landscapes. Although some validation experiments demonstrated that the models yield accurate estimates at flux measurement sites, the question remains whether they perform well over the broader landscape. Moreover, a large number of RS_ET products have been released in recent years. Thus, we also pay attention to the cross-validation of RS_ET derived from multi-source models. "The Multi-scale Observation Experiment on Evapotranspiration over Heterogeneous Land Surfaces: Flux Observation Matrix" campaign was carried out at the middle reaches of the Heihe River Basin, China, in 2012. Flux measurements from an observation matrix composed of 22 EC and 4 LAS instruments were acquired to investigate the cross-validation of multi-source models over different landscapes. In this case, six remote sensing models, including the empirical statistical model, the one-source and two-source models, the Penman-Monteith equation based model, the Priestley-Taylor equation based model, and the complementary relationship based model, are used to perform an intercomparison. All the results from the two cases of RS_ET validation showed that the proposed validation methods are reasonable and feasible.
NASA Astrophysics Data System (ADS)
Guarnieri, A.; Masiero, A.; Piragnolo, M.; Pirotti, F.; Vettore, A.
2016-06-01
In this paper we present the results of the development of a Web-based archiving and documentation system aimed at the management of multisource and multitemporal data related to cultural heritage. As a case study we selected the building complex of Villa Revedin Bolasco in Castelfranco Veneto (Treviso, Italy) and its park. The buildings and park were built in the XIX century after several restorations of the original XIV century area. The data management system relies on a geodatabase framework in which different kinds of datasets are stored. More specifically, the geodatabase elements consist of historical information, documents, and descriptions of artistic characteristics of the building and the park, in the form of text and images. In addition, we used floorplans, sections and views of the outer facades of the building extracted from a TLS-based 3D model of the whole Villa. In order to manage and explore this rich dataset, we developed a geodatabase using PostgreSQL with PostGIS as the spatial plugin. The Web-GIS platform, based on the HTML5 and PHP programming languages, implements NASA Web World Wind, a 3D virtual globe we used to enable the navigation and interactive exploration of the park. Furthermore, through a specific timeline function, the user can explore the historical evolution of the building complex.
School adjustment of children in residential care: a multi-source analysis.
Martín, Eduardo; Muñoz de Bustillo, María del Carmen
2009-11-01
School adjustment is one of the greatest challenges in residential child care programs. This study has two aims: to analyze school adjustment compared to a normative population, and to carry out a multi-source analysis (child, classmates, and teacher) of this adjustment. A total of 50 classrooms containing 60 children from residential care units were studied. The "Método de asignación de atributos perceptivos" (Allocation of perceptive attributes; Díaz-Aguado, 2006), the "Test Autoevaluativo Multifactorial de Adaptación Infantil" (TAMAI [Multifactor Self-assessment Test of Child Adjustment]; Hernández, 1996) and the "Protocolo de valoración para el profesorado" (Evaluation Protocol for Teachers; Fernández del Valle, 1998) were applied. The main results indicate that, compared with their classmates, children in residential care are perceived as more controversial and less integrated at school, although no differences were observed in problems of isolation. The multi-source analysis shows that there is agreement among the different sources when the externalized and visible aspects are evaluated. These results are discussed in connection with the practices that are being developed in residential child care programs.
NASA Astrophysics Data System (ADS)
Hu, Wenmin; Wang, Zhongcheng; Li, Chunhua; Zhao, Jin; Li, Yi
2018-02-01
Multi-source remote sensing data is rarely used for the comprehensive assessment of land ecological environment quality. In this study, a digital environmental model was proposed with an inversion algorithm for land and environmental factors based on multi-source remote sensing data, and a comprehensive index (Ecoindex) was applied to reconstruct and predict the land environment quality of the Dongting Lake Area to assess the effect of human activities on the environment. The main finding was that land of Grade I and Grade II quality showed a decreasing tendency in the lake area, mostly in suburbs and wetlands. Atmospheric water vapour, land use intensity, surface temperature, vegetation coverage, and soil water content were the main driving factors. The cause of degradation was the interference of multi-factor combinations, which led to positive and negative environmental agglomeration effects. Positive agglomeration, such as increased rainfall and vegetation coverage and reduced land use intensity, could increase environmental quality, while negative agglomeration resulted in the opposite. Therefore, reasonable ecological restoration measures would be beneficial for limiting the negative effects and the decreasing tendency, improving the land ecological environment quality and providing references for macroscopic planning by the government.
Ye, Hongqiang; Ma, Qijun; Hou, Yuezhong; Li, Man; Zhou, Yongsheng
2017-12-01
Digital techniques are not clinically applied for 1-piece maxillary prostheses containing an obturator and removable partial denture retained by the remaining teeth because of the difficulty in obtaining sufficiently accurate 3-dimensional (3D) images. The purpose of this pilot clinical study was to generate 3D digital casts of maxillary defects, including the defective region and the maxillary dentition, based on multisource data registration and to evaluate their effectiveness. Twelve participants with maxillary defects were selected. The maxillofacial region was scanned with spiral computed tomography (CT), and the maxillary arch and palate were scanned using an intraoral optical scanner. The 3D images from the CT and intraoral scanner were registered and merged to form a 3D digital cast of the maxillary defect containing the anatomic structures needed for the maxillary prosthesis. This included the defect cavity, maxillary dentition, and palate. Traditional silicone impressions were also made, and stone casts were poured. The accuracy of the digital cast in comparison with that of the stone cast was evaluated by measuring the distance between 4 anatomic landmarks. Differences and consistencies were assessed using paired Student t tests and the intraclass correlation coefficient (ICC). In 3 participants, physical resin casts were produced by rapid prototyping from digital casts. Based on the resin casts, maxillary prostheses were fabricated by using conventional methods and then evaluated in the participants to assess the clinical applicability of the digital casts. Digital casts of the maxillary defects were generated and contained all the anatomic details needed for the maxillary prosthesis. Comparing the digital and stone casts, a paired Student t test indicated that differences in the linear distances between landmarks were not statistically significant (P>.05). High ICC values (0.977 to 0.998) for the interlandmark distances further indicated the high degree of consistency between the digital and stone casts. The maxillary prostheses showed good clinical effectiveness, indicating that the corresponding digital casts met the requirements for clinical application. Based on multisource data from spiral CT and the intraoral scanner, 3D digital casts of maxillary defects were generated using the registration technique. These casts were consistent with conventional stone casts in terms of accuracy and were suitable for clinical use. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Multisource passive acoustic tracking: an application of random finite set data fusion
NASA Astrophysics Data System (ADS)
Ali, Andreas M.; Hudson, Ralph E.; Lorenzelli, Flavio; Yao, Kung
2010-04-01
Multisource passive acoustic tracking is useful in animal bio-behavioral studies by replacing or enhancing human involvement during and after field data collection. Multiple simultaneous vocalizations are a common occurrence in a forest or a jungle, where many species are encountered. Given a set of nodes that are capable of producing multiple direction-of-arrival (DOA) estimates, such data need to be combined into meaningful estimates. The Random Finite Set framework provides a probabilistic model that is suitable for analysis and for the synthesis of optimal estimation algorithms. The proposed algorithm has been verified using a simulation and a controlled test experiment.
A computer vision system for the recognition of trees in aerial photographs
NASA Technical Reports Server (NTRS)
Pinz, Axel J.
1991-01-01
Increasing problems of forest damage in Central Europe set the demand for an appropriate forest damage assessment tool. The Vision Expert System (VES) is presented which is capable of finding trees in color infrared aerial photographs. Concept and architecture of VES are discussed briefly. The system is applied to a multisource test data set. The processing of this multisource data set leads to a multiple interpretation result for one scene. An integration of these results will provide a better scene description by the vision system. This is achieved by an implementation of Steven's correlation algorithm.
NASA Astrophysics Data System (ADS)
Loubet, Benjamin; Carozzi, Marco
2015-04-01
Tropospheric ammonia (NH3) is a key player in atmospheric chemistry and its deposition is a threat to the environment (ecosystem eutrophication, soil acidification and reduction in species biodiversity). Most of the global NH3 emissions derive from agriculture, mainly from livestock manure (storage and field application) but also from nitrogen-based fertilisers. Inverse dispersion modelling has been widely used to infer emissions from a homogeneous source of known geometry. When the emission derives from different sources inside the measured footprint, the emission should be treated as a multi-source problem. This work aims at estimating whether multi-source inverse dispersion modelling can be used to infer NH3 emissions from different agronomic treatments, composed of small fields (typically squares of 25 m side) located near to each other, using low-cost NH3 measurements (diffusion samplers). To do that, a numerical experiment was designed with a combination of 3 x 3 square field sources (625 m2), and a set of sensors placed at the centre of each field at several heights as well as at 200 m away from the sources in each cardinal direction. The concentration at each sensor location was simulated with a forward Lagrangian Stochastic (WindTrax) and a Gaussian-like (FIDES) dispersion model. The concentrations were averaged over various integration times (3 hours to 28 days) to mimic the diffusion sampler behaviour under several sampling strategies. The sources were then inferred by inverse modelling using the averaged concentrations and the same models in backward mode. The source patterns were evaluated using a soil-vegetation-atmosphere model (SurfAtm-NH3) that incorporates the response of the NH3 emissions to surface temperature. A combination of emission patterns (constant, linearly decreasing, exponentially decreasing and Gaussian type) and strengths was used to evaluate the uncertainty of the inversion method. Each numerical experiment covered a period of 28 days. The meteorological dataset of the fluxnet FR-Gri site (Grignon, FR) in 2008 was employed. Several sensor heights were tested, from 0.25 m to 2 m. The multi-source inverse problem was solved based on several sampling and field trial strategies: considering 1 or 2 heights over each field, considering the background concentration as known or unknown, and considering block repetitions in the field set-up (3 repetitions). The inverse modelling approach proved suitable for discriminating large differences in NH3 emissions from small agronomic plots using integrating sensors. The method is sensitive to sensor height. The uncertainties and systematic biases are evaluated and discussed.
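In the linear source-receptor formulation underlying such inverse approaches, each measured concentration is a weighted sum of the unknown source strengths plus a background term, so the multi-source inversion reduces to a least-squares problem. The sketch below assumes a known dispersion matrix D (which in the study would be produced by WindTrax or FIDES) and treats the background as an extra unknown; all numbers are synthetic.

```python
# Multi-source inversion sketch: measured concentrations C are modeled as
# C = D @ S + background, with D the source-receptor (dispersion) matrix that
# a forward model such as WindTrax or FIDES would provide. D here is random,
# purely to exercise the algebra; it is not a dispersion calculation.
import numpy as np

rng = np.random.default_rng(4)
n_receptors, n_sources = 12, 9                  # e.g. 3 x 3 plots, 12 samplers
D = rng.uniform(0.01, 0.2, size=(n_receptors, n_sources))        # (s m-1), assumed

S_true = rng.uniform(0.5, 3.0, size=n_sources)  # NH3 source strengths (ug m-2 s-1)
background = 1.0                                # background concentration (ug m-3)
C = D @ S_true + background + rng.normal(0, 0.05, n_receptors)   # "measurements"

# Augment the design matrix with a column of ones so background is co-estimated.
A = np.c_[D, np.ones(n_receptors)]
est, *_ = np.linalg.lstsq(A, C, rcond=None)
S_est, bg_est = est[:-1], est[-1]
print("relative error per plot:", np.round((S_est - S_true) / S_true, 2))
```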
NASA Astrophysics Data System (ADS)
Majasalmi, Titta; Eisner, Stephanie; Astrup, Rasmus; Fridman, Jonas; Bright, Ryan M.
2018-01-01
Forest management affects the distribution of tree species and the age class of a forest, shaping its overall structure and functioning and in turn the surface-atmosphere exchanges of mass, energy, and momentum. In order to attribute climate effects to anthropogenic activities like forest management, good accounts of forest structure are necessary. Here, using Fennoscandia as a case study, we make use of Fennoscandic National Forest Inventory (NFI) data to systematically classify forest cover into groups of similar aboveground forest structure. An enhanced forest classification scheme and related lookup table (LUT) of key forest structural attributes (i.e., maximum growing season leaf area index (LAImax), basal-area-weighted mean tree height, tree crown length, and total stem volume) was developed, and the classification was applied for multisource NFI (MS-NFI) maps from Norway, Sweden, and Finland. To provide a complete surface representation, our product was integrated with the European Space Agency Climate Change Initiative Land Cover (ESA CCI LC) map of present day land cover (v.2.0.7). Comparison of the ESA LC and our enhanced LC products (https://doi.org/10.21350/7zZEy5w3) showed that forest extent notably (κ = 0.55, accuracy 0.64) differed between the two products. To demonstrate the potential of our enhanced LC product to improve the description of the maximum growing season LAI (LAImax) of managed forests in Fennoscandia, we compared our LAImax map with reference LAImax maps created using the ESA LC product (and related cross-walking table) and PFT-dependent LAImax values used in three leading land models. Comparison of the LAImax maps showed that our product provides a spatially more realistic description of LAImax in managed Fennoscandian forests compared to reference maps. This study presents an approach to account for the transient nature of forest structural attributes due to human intervention in different land models.
Comparison of Frequency-Domain Array Methods for Studying Earthquake Rupture Process
NASA Astrophysics Data System (ADS)
Sheng, Y.; Yin, J.; Yao, H.
2014-12-01
Seismic array methods, in both the time and frequency domains, have been widely used to study the rupture process and energy radiation of earthquakes. With better spatial resolution, high-resolution frequency-domain methods, such as Multiple Signal Classification (MUSIC) (Schmidt, 1986; Meng et al., 2011) and the recently developed Compressive Sensing (CS) technique (Yao et al., 2011, 2013), are revealing new features of earthquake rupture processes. We have performed various tests on the methods of MUSIC, CS, minimum-variance distortionless response (MVDR) beamforming and conventional beamforming in order to better understand the advantages and features of these methods for studying earthquake rupture processes. We use the Ricker wavelet to synthesize seismograms and use these frequency-domain techniques to relocate the synthetic sources we set, for instance, two sources separated in space whose waveforms completely overlap in the time domain. We also test the effects of the sliding-window scheme on the recovery of a series of input sources, in particular, some artifacts that are caused by the sliding-window scheme. Based on our tests, we find that CS, which is developed from the theory of sparse inversion, has relatively higher spatial resolution than the other frequency-domain methods and has better performance at lower frequencies. In high-frequency bands, MUSIC, as well as MVDR beamforming, is more stable, especially in the multi-source situation. Meanwhile, CS tends to produce more artifacts when data have a poor signal-to-noise ratio. Although these techniques can distinctly improve the spatial resolution, they still produce some artifacts as the time window slides. Furthermore, we propose a new method, which combines both time-domain and frequency-domain techniques, to suppress these artifacts and obtain more reliable earthquake rupture images. Finally, we apply this new technique to the 2013 Okhotsk deep mega-earthquake in order to better capture its rupture characteristics (e.g., rupture area and velocity).
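A compact narrowband MUSIC example on a uniform linear array shows the covariance eigendecomposition and noise-subspace pseudospectrum that the comparison above refers to; the array geometry, half-wavelength spacing, and the two closely spaced sources are illustrative choices, not the seismic configuration of the study.

```python
# Narrowband MUSIC on a uniform linear array: two closely spaced sources,
# covariance eigendecomposition, noise-subspace pseudospectrum. Geometry and
# frequencies are illustrative, not the seismic array setup of the study.
import numpy as np

rng = np.random.default_rng(5)
n_sensors, n_snapshots, n_sources = 16, 200, 2
d_over_lambda = 0.5                                   # half-wavelength spacing
true_doas = np.deg2rad([18.0, 24.0])                  # two close sources

def steering(theta):
    n = np.arange(n_sensors)
    return np.exp(2j * np.pi * d_over_lambda * n[:, None] * np.sin(theta))

A = steering(true_doas)                               # (sensors, sources)
S = (rng.normal(size=(n_sources, n_snapshots)) +
     1j * rng.normal(size=(n_sources, n_snapshots)))  # source waveforms
N = 0.1 * (rng.normal(size=(n_sensors, n_snapshots)) +
           1j * rng.normal(size=(n_sensors, n_snapshots)))
X = A @ S + N

R = X @ X.conj().T / n_snapshots                      # sample covariance
eigvals, eigvecs = np.linalg.eigh(R)                  # eigenvalues ascending
En = eigvecs[:, :-n_sources]                          # noise-subspace vectors

grid = np.deg2rad(np.linspace(0, 90, 901))
a = steering(grid)                                    # (sensors, grid points)
p_music = 1.0 / np.linalg.norm(En.conj().T @ a, axis=0) ** 2

# Pick the two largest local maxima of the pseudospectrum as DOA estimates.
peaks = np.where((p_music[1:-1] > p_music[:-2]) &
                 (p_music[1:-1] > p_music[2:]))[0] + 1
top2 = peaks[np.argsort(p_music[peaks])[-2:]]
print("estimated DOAs (deg):", np.sort(np.rad2deg(grid[top2])).round(1))
```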
Classification of forest land attributes using multi-source remotely sensed data
NASA Astrophysics Data System (ADS)
Pippuri, Inka; Suvanto, Aki; Maltamo, Matti; Korhonen, Kari T.; Pitkänen, Juho; Packalen, Petteri
2016-02-01
The aim of the study was to (1) examine the classification of forest land using airborne laser scanning (ALS) data, satellite images and sample plots of the Finnish National Forest Inventory (NFI) as training data and to (2) identify the best performing metrics for classifying forest land attributes. Six different schemes of forest land classification were studied: land use/land cover (LU/LC) classification using both national classes and FAO (Food and Agricultural Organization of the United Nations) classes, main type, site type, peat land type and drainage status. Of special interest was testing different ALS-based surface metrics in the classification of forest land attributes. Field data consisted of 828 NFI plots collected in 2008-2012 in southern Finland, and remotely sensed data were from summer 2010. Multinomial logistic regression was used as the classification method. Classification of LU/LC classes was highly accurate (kappa values 0.90 and 0.91), but the classification of site type, peat land type and drainage status also succeeded moderately well (kappa values 0.51, 0.69 and 0.52). ALS-based surface metrics were found to be the most important predictor variables in the classification of LU/LC class, main type and drainage status. In the best classification models of forest site types, both spectral metrics from satellite data and point cloud metrics from ALS were used. In turn, in the classification of peat land types, ALS point cloud metrics played the most important role. The results indicated that the prediction of site type and forest land category could be incorporated into the stand-level forest management inventory system in Finland.
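A minimal sketch of the classification setup described above: a multinomial logistic regression fitted on plot-level remote sensing metrics, evaluated with Cohen's kappa. The feature names and data are synthetic placeholders for the ALS and satellite metrics used in the study.

```python
# Sketch of the classification setup: multinomial logistic regression on
# plot-level metrics, evaluated with Cohen's kappa. Feature names and data are
# synthetic placeholders for the ALS / satellite metrics used in the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import cohen_kappa_score

rng = np.random.default_rng(6)
n = 828                                           # number of NFI plots
X = np.column_stack([
    rng.gamma(2.0, 5.0, n),                       # ALS surface metric (e.g. mean height)
    rng.uniform(0, 1, n),                         # ALS point-cloud metric (e.g. canopy cover)
    rng.normal(0.5, 0.15, n),                     # satellite spectral metric (e.g. NDVI)
])
y = rng.integers(0, 4, n)                         # site-type class labels (placeholder)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(multi_class="multinomial", max_iter=1000).fit(X_tr, y_tr)
print("kappa:", round(cohen_kappa_score(y_te, clf.predict(X_te)), 2))
```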
Evaluating the potential of improving residential water balance at building scale.
Agudelo-Vera, Claudia M; Keesman, Karel J; Mels, Adriaan R; Rijnaarts, Huub H M
2013-12-15
Earlier results indicated that, for an average household, self-sufficiency in water supply can be achieved by following the Urban Harvest Approach (UHA), through a combination of demand minimization, cascading and multi-sourcing. To achieve these results, it was assumed that all available local resources can be harvested. In reality, however, temporal, spatial and location-bound factors pose limitations to this harvest and, thus, to self-sufficiency. This article investigates potential spatial and temporal limitations to harvesting local water resources at the building level in the Netherlands, with a focus on indoor demand. Two building types were studied, a free-standing house (one four-person household) and a mid-rise apartment flat (28 two-person households). To be able to model yearly water balances, daily patterns considering household occupancy and the presence of water-using appliances were defined per building type. Three strategies were defined: demand minimization, light grey water (LGW) recycling, and rainwater harvesting (multi-sourcing). Recycling and multi-sourcing cater for toilet flushing and the laundry machine. Results showed that water-saving devices may reduce the conventional demand by 30%. Recycling of LGW can supply 100% of second-quality water (DQ2), which represents 36% of the conventional demand or up to 20% of the minimized demand. Rainwater harvesting may supply approximately 80% of the minimized demand in the case of the apartment flat and 60% in the case of the free-standing house. To harvest these potentials, different system specifications, related to the household type, are required. Two constraints on recycling and multi-sourcing were identified, namely i) limitations in grey water production and available rainfall; and ii) the potential to harvest water as determined by the temporal pattern in water availability, water use, and storage and treatment capacities. Copyright © 2013 Elsevier Ltd. All rights reserved.
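The storage constraint mentioned above can be illustrated with a toy daily water-balance loop: harvested supply is limited by rainfall, roof area, and tank size, not by annual totals. The roof area, tank capacity, demand, and rainfall series below are invented for the example and are not values from the study.

```python
# Toy daily balance for rainwater harvesting against a fixed indoor demand.
# Roof area, tank size, demand and the rainfall series are invented numbers,
# used only to show how storage limits the harvestable fraction.
import numpy as np

rng = np.random.default_rng(7)
rain_mm = rng.exponential(2.0, 365)          # synthetic daily rainfall (mm)
roof_m2 = 60.0                               # collection area
runoff_coeff = 0.8                           # fraction of rain actually captured
tank_max_l = 2000.0                          # storage capacity (litres)
demand_l = 90.0                              # daily demand met from rainwater (litres)

tank = 0.0
supplied = 0.0
for mm in rain_mm:
    tank = min(tank + mm * roof_m2 * runoff_coeff, tank_max_l)  # 1 mm on 1 m2 = 1 L
    use = min(demand_l, tank)                # can only use what is stored
    tank -= use
    supplied += use

coverage = supplied / (demand_l * 365)
print(f"rainwater covers {coverage:.0%} of the targeted demand")
```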
Challenges with secondary use of multi-source water-quality data in the United States
Sprague, Lori A.; Oelsner, Gretchen P.; Argue, Denise M.
2017-01-01
Combining water-quality data from multiple sources can help counterbalance diminishing resources for stream monitoring in the United States and lead to important regional and national insights that would not otherwise be possible. Individual monitoring organizations understand their own data very well, but issues can arise when their data are combined with data from other organizations that have used different methods for reporting the same common metadata elements. Such use of multi-source data is termed "secondary use": the use of data beyond the original intent determined by the organization that collected the data. In this study, we surveyed more than 25 million nutrient records collected by 488 organizations in the United States since 1899 to identify major inconsistencies in metadata elements that limit the secondary use of multi-source data. Nearly 14.5 million of these records had missing or ambiguous information for one or more key metadata elements, including (in decreasing order of records affected) sample fraction, chemical form, parameter name, units of measurement, precise numerical value, and remark codes. As a result, metadata harmonization to make secondary use of these multi-source data will be time consuming, expensive, and inexact. Different data users may make different assumptions about the same ambiguous data, potentially resulting in different conclusions about important environmental issues. The value of these ambiguous data is estimated at $US12 billion, a substantial collective investment by water-resource organizations in the United States. By comparison, the value of unambiguous data is estimated at $US8.2 billion. The ambiguous data could be preserved for uses beyond the original intent by developing and implementing standardized metadata practices for future and legacy water-quality data throughout the United States.
Distributed software framework and continuous integration in hydroinformatics systems
NASA Astrophysics Data System (ADS)
Zhou, Jianzhong; Zhang, Wei; Xie, Mengfei; Lu, Chengwei; Chen, Xiao
2017-08-01
When multiple complicated models, multisource structured and unstructured data, and complex requirements analyses are encountered, the platform design and integration of hydroinformatics systems become a challenge. To properly solve these problems, we describe a distributed software framework and its continuous integration process in hydroinformatics systems. This distributed framework mainly consists of a server cluster for models, a distributed database, GIS (Geographic Information System) servers, a master node and clients. Based on it, a GIS-based decision support system for the joint regulation of water quantity and water quality of a group of lakes in Wuhan, China, was established.
Towards Device-Independent Information Processing on General Quantum Networks
NASA Astrophysics Data System (ADS)
Lee, Ciarán M.; Hoban, Matty J.
2018-01-01
The violation of certain Bell inequalities allows for device-independent information processing secure against nonsignaling eavesdroppers. However, this only holds for the Bell network, in which two or more agents perform local measurements on a single shared source of entanglement. To overcome the practical constraints that entangled systems can only be transmitted over relatively short distances, large-scale multisource networks have been employed. Do there exist analogs of Bell inequalities for such networks, whose violation is a resource for device independence? In this Letter, the violation of recently derived polynomial Bell inequalities will be shown to allow for device independence on multisource networks, secure against nonsignaling eavesdroppers.
Advanced techniques for the storage and use of very large, heterogeneous spatial databases
NASA Technical Reports Server (NTRS)
Peuquet, Donna J.
1987-01-01
Progress is reported in the development of a prototype knowledge-based geographic information system. The overall purpose of this project is to investigate and demonstrate the use of advanced methods in order to greatly improve the capabilities of geographic information system technology in the handling of large, multi-source collections of spatial data in an efficient manner, and to make these collections of data more accessible and usable for the Earth scientist.
In-vehicle group activity modeling and simulation in sensor-based virtual environment
NASA Astrophysics Data System (ADS)
Shirkhodaie, Amir; Telagamsetti, Durga; Poshtyar, Azin; Chan, Alex; Hu, Shuowen
2016-05-01
Human group activity recognition is a very complex and challenging task, especially for Partially Observable Group Activities (POGA) that occur in confined spaces with limited visual observability and often under severe occlusion. In this paper, we present the IRIS Virtual Environment Simulation Model (VESM) for the modeling and simulation of dynamic POGA. More specifically, we address sensor-based modeling and simulation of a specific category of POGA, called In-Vehicle Group Activities (IVGA). In VESM, human-like animated characters, called humanoids, are employed to simulate complex in-vehicle group activities within the confined space of a modeled vehicle. Each articulated humanoid is kinematically modeled with comparable physical attributes and appearances that are linkable to its human counterpart. Each humanoid exhibits harmonious full-body motion, simulating human-like gestures and postures, facial expressions, and hand motions for coordinated dexterity. VESM facilitates the creation of interactive scenarios consisting of multiple humanoids with different personalities and intentions, which are capable of performing complicated human activities within the confined space inside a typical vehicle. In this paper, we demonstrate the efficiency and effectiveness of VESM in terms of its capabilities to seamlessly generate time-synchronized, multi-source, and correlated imagery datasets of IVGA, which are useful for the training and testing of multi-source full-motion video processing and annotation. Furthermore, we demonstrate full-motion video processing of such simulated scenarios under different operational contextual constraints.
Design and realization of disaster assessment algorithm after forest fire
NASA Astrophysics Data System (ADS)
Xu, Aijun; Wang, Danfeng; Tang, Lihua
2008-10-01
Based on GIS technology, this paper focuses on the application of a post-fire disaster assessment algorithm and studies the design and realization of GIS-based disaster assessment. Through the analysis and processing of multi-source, heterogeneous data collected after a forest fire, the paper integrates the foundations laid by domestic and foreign research on forest fire loss assessment with related knowledge of valuation, accounting, and forest resource appraisal, in order to develop a theoretical framework and assessment indices for forest fire loss. Techniques for boundary extraction, overlay analysis, and partitioning of multi-source spatial data are used to implement the survey of burnt forest area and the computation of the fire area. Based on the direct and indirect economic losses and the ecological and environmental damage caused by fires under different fire danger classes and different amounts of forest growing stock, the assessment provides evidence for clearing burnt areas and for policy making on restoration, thereby making forest resource protection faster, more efficient, and more economical. Finally, Lin'an city in Zhejiang province is taken as a test area to validate the key technologies of the proposed method.
NASA Technical Reports Server (NTRS)
Kim, Hakil; Swain, Philip H.
1990-01-01
An axiomatic approach to interval-valued (IV) probabilities is presented, where the IV probability is defined by a pair of set-theoretic functions which satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated. This entails more intelligent strategies for making decisions. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. Then the method is applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the high-dimensional data into smaller and more manageable pieces based on the global statistical correlation information. By a divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.
A 3D modeling approach to complex faults with multi-source data
NASA Astrophysics Data System (ADS)
Wu, Qiang; Xu, Hua; Zou, Xukai; Lei, Hongzhuan
2015-04-01
Fault modeling is a very important step in making an accurate and reliable 3D geological model. Typical existing methods demand enough fault data to be able to construct complex fault models; however, it is well known that the available fault data are generally sparse and undersampled. In this paper, we propose a fault-modeling workflow that integrates multi-source data to construct fault models. For the faults that are not modeled with these data, especially those that are small-scale or approximately parallel to the sections, we propose a fault deduction method to infer the hanging wall and footwall lines after displacement calculation. Moreover, a fault cutting algorithm can supplement the available fault points at locations where faults cut each other. Adding fault points in poorly sampled areas can not only efficiently construct fault models, but also reduce manual intervention. By using a fault-based interpolation and remeshing the horizons, an accurate 3D geological model can be constructed. The method can naturally simulate geological structures no matter whether the available geological data are sufficient or not. A concrete example of using the method in Tangshan, China, shows that the method can be applied to broad and complex geological areas.
NASA Astrophysics Data System (ADS)
Zhang, Kongwen; Hu, Baoxin; Robinson, Justin
2014-01-01
The emerald ash borer (EAB) poses a significant economic and environmental threat to ash trees in southern Ontario, Canada, and the northern states of the USA. It is critical that effective technologies are urgently developed to detect, monitor, and control the spread of EAB. This paper presents a methodology using multisourced data to predict potential infestations of EAB in the town of Oakville, Ontario, Canada. The information combined in this study includes remotely sensed data, such as high spatial resolution aerial imagery, commercial ground and airborne hyperspectral data, and Google Earth imagery, in addition to nonremotely sensed data, such as archived paper maps and documents. This wide range of data provides extensive information that can be used for early detection of EAB, yet their effective employment and use remain a significant challenge. A prediction function was developed to estimate the EAB infestation states of individual ash trees using three major attributes: leaf chlorophyll content, tree crown spatial pattern, and prior knowledge. Comparison between these predicted values and a ground-based survey demonstrated an overall accuracy of 62.5%, with 22.5% omission and 18.5% commission errors.
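For readers who want to reproduce the kind of accuracy assessment quoted above, the sketch below shows how overall accuracy and omission/commission errors can be computed from predicted versus surveyed infestation labels. The labels and numbers are purely illustrative, not the Oakville survey data.

```python
# Hypothetical sketch: overall accuracy, omission and commission errors
# from predicted vs. surveyed infestation labels (toy data only).
import numpy as np

def accuracy_report(y_true, y_pred, positive=1):
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    tp = np.sum((y_pred == positive) & (y_true == positive))
    fp = np.sum((y_pred == positive) & (y_true != positive))
    fn = np.sum((y_pred != positive) & (y_true == positive))
    overall = np.mean(y_pred == y_true)
    omission = fn / max(tp + fn, 1)     # infested trees missed by the prediction
    commission = fp / max(tp + fp, 1)   # predicted infestations that are wrong
    return overall, omission, commission

# Toy labels only, for illustration.
truth = [1, 1, 0, 0, 1, 0, 1, 0]
pred  = [1, 0, 0, 1, 1, 0, 1, 0]
print(accuracy_report(truth, pred))
```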
Research for the jamming mechanism of high-frequency laser to the laser seeker
NASA Astrophysics Data System (ADS)
Zheng, Xingyuan; Zhang, Haiyang; Wang, Yunping; Feng, Shuang; Zhao, Changming
2013-08-01
A high-repetition-frequency laser can enter an enemy laser signal processing system without encoded identification or code replication, which makes it a promising direction for new interference sources. To study the interference mechanism of high-repetition-frequency lasers against laser-guided weapons, a series of theoretical models is established according to the principles of high-frequency laser jamming, including a semi-active laser seeker code identification model, a time-gate model, a multi-signal processing model, and an interference signal modulation model. A 3σ criterion for effective seeker jamming is then proposed. On this basis, the effect of multi-source interference and the signal characteristics governing the effect of high-repetition-frequency laser jamming are the key topics studied. Simulation tests show that multi-source interference and frequency modulation of the interference signal can effectively enhance the jamming effect, whereas amplitude modulation of the interference signal has little effect. The results provide a basis for evaluating the effectiveness of high-frequency laser jamming and theoretical references for the application of high-frequency laser jamming systems.
Kaplan, Haim; Kaplan, Lilach
2016-12-01
In recent years, there has been growing demand for radiofrequency (RF)-based procedures to improve skin texture, laxity and contour. The new generation of systems allows non-invasive and fractional resurfacing treatments on one platform. The aim of this study was to evaluate the safety and efficacy of a new treatment protocol using multisource RF, combining 3 different modalities in each patient: [1] non-ablative RF skin tightening, [2] fractional skin resurfacing, and [3] microneedling RF for non-ablative coagulation and collagen remodelling. 14 subjects were enrolled in this study using the EndyMed PRO™ platform. Each patient had 8 non-ablative treatments and 4 fractional treatments (fractional skin resurfacing and Intensif). The global aesthetic score was used to evaluate improvement. All patients had improvement in skin appearance. About 43% had excellent or very good improvement above 50%, 18% had good improvement between 25 and 50%, and the remaining 39% had a mild improvement of < 25%. Downtime was minimal and no adverse effects were reported. Our data show significant improvement of skin texture, skin laxity and wrinkle reduction achieved using the RF treatment platform.
Velpuri, Naga Manohar; Senay, Gabriel B.
2012-01-01
Lake Turkana, the largest desert lake in the world, is fed by ungauged or poorly gauged river systems. To meet the demand for electricity in the East African region, Ethiopia is currently building the Gibe III hydroelectric dam on the Omo River, which supplies more than 80% of the inflows to Lake Turkana. On completion, the Gibe III dam will be the tallest dam in Africa, with a height of 241 m. However, the nature of interactions and the potential impacts of regulated inflows on Lake Turkana are not well understood due to its remote location and the unavailability of reliable in-situ datasets. In this study, we used 12 years (1998–2009) of existing multi-source satellite and model-assimilated global weather data. We used a calibrated, multi-source satellite data-driven water balance model for Lake Turkana that takes into account model-routed runoff, lake/reservoir evapotranspiration, direct rain on lakes/reservoirs, and releases from the dam to compute lake water levels. The model evaluates the impact of the Gibe III dam using three different approaches (a historical approach, a knowledge-based approach, and a nonparametric bootstrap resampling approach) to generate rainfall-runoff scenarios. All the approaches provided comparable and consistent results. Model results indicated that the hydrological impact of the dam on Lake Turkana would vary with the magnitude and distribution of rainfall after dam commencement. On average, the reservoir would take 8–10 months after commencement to reach a minimum operation level of 201 m depth of water. During the dam-filling period, the lake level would drop by up to 2 m (95% confidence) compared to the lake level modelled without the dam. The lake level variability caused by regulated inflows after dam commissioning was found to be within the natural variability of the lake of 4.8 m. Moreover, modelling results indicated that the hydrological impact of the Gibe III dam would depend on the initial lake level at the time of dam commencement. Areas along the Lake Turkana shoreline that are vulnerable to fluctuations in lake levels were also identified. This study demonstrates the effectiveness of using existing multi-source satellite data in a basic modeling framework to assess the potential hydrological impact of an upstream dam on a terminal downstream lake. The results obtained from this study could also be used to evaluate alternative dam-filling scenarios and assess the potential impact of the dam on Lake Turkana under different operational strategies.
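As a rough illustration of the kind of bookkeeping such a water balance model performs, the sketch below updates a lake level from monthly inflow, rainfall, evaporation, and dam releases. All inputs and the lake geometry are hypothetical placeholders, not the calibrated Lake Turkana model.

```python
# Minimal monthly water-balance sketch (hypothetical inputs):
# level change = (inflow + rain - evapotranspiration - release) / lake area.
import numpy as np

def simulate_lake_level(level0_m, area_km2, inflow_mcm, rain_mm, et_mm, release_mcm):
    """All series are per month; volumes in million m^3, depths in mm."""
    area_m2 = area_km2 * 1e6
    levels = [level0_m]
    for q_in, p, et, q_out in zip(inflow_mcm, rain_mm, et_mm, release_mcm):
        dvol_m3 = (q_in - q_out) * 1e6 + (p - et) * 1e-3 * area_m2
        levels.append(levels[-1] + dvol_m3 / area_m2)
    return np.array(levels)

# Illustrative numbers only.
levels = simulate_lake_level(360.0, 6750.0,
                             inflow_mcm=[1500, 900, 400],
                             rain_mm=[30, 10, 5],
                             et_mm=[200, 210, 220],
                             release_mcm=[0, 0, 0])
print(levels)
```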
NASA Astrophysics Data System (ADS)
Eberle, J.; Schmullius, C.
2017-12-01
The growing archives of global satellite data present a new challenge: handling multi-source satellite data in a user-friendly way. Any user is confronted with different data formats and data access services. In addition, the handling of time-series data is complex, as automated processing and execution of data processing steps are needed to supply the user with the desired product for a specific area of interest. In order to simplify the access to data archives of various satellite missions and to facilitate the subsequent processing, a regional data and processing middleware has been developed. The aim of this system is to provide standardized and web-based interfaces to multi-source time-series data for individual regions on Earth. For further use and analysis, uniform data formats and data access services are provided. Interfaces to data archives of the sensor MODIS (NASA) as well as the satellites Landsat (USGS) and Sentinel (ESA) have been integrated into the middleware. Various scientific algorithms, such as the calculation of trends and breakpoints in time-series data, can be carried out on the preprocessed data on the basis of uniform data management. Jupyter Notebooks are linked to the data, and further processing can be conducted directly on the server using Python and the statistical language R. In addition to accessing EO data, the middleware is also used as an intermediary between the user and external databases (e.g., Flickr, YouTube). Standardized web services as specified by OGC are provided for all tools of the middleware. Currently, the use of cloud services is being researched to bring algorithms to the data. As a thematic example, an operational monitoring of vegetation phenology is being implemented on the basis of various optical satellite data and validation data from the German Weather Service. Other examples demonstrate the monitoring of wetlands, focusing on automated discovery and access of Landsat and Sentinel data for local areas.
Image fusion based on Bandelet and sparse representation
NASA Astrophysics Data System (ADS)
Zhang, Jiuxing; Zhang, Wei; Li, Xuzhi
2018-04-01
The Bandelet transform can capture geometrically regular directions and geometric flow, while sparse representation can represent signals with as few atoms as possible over an over-complete dictionary; both can be used for image fusion. Therefore, a new fusion method based on the Bandelet transform and sparse representation is proposed to fuse the Bandelet coefficients of multi-source images and obtain high-quality fusion results. Tests were performed on remote sensing images and simulated multi-focus images; the experimental results show that the performance of the new method is better than that of the tested methods according to objective evaluation indexes and subjective visual effects.
Efficient Inversion of Multi-frequency and Multi-Source Electromagnetic Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gary D. Egbert
2007-03-22
The project covered by this report focused on development of efficient but robust non-linear inversion algorithms for electromagnetic induction data, in particular for data collected with multiple receivers and multiple transmitters, a situation extremely common in geophysical EM subsurface imaging methods. A key observation is that for such multi-transmitter problems each step in commonly used linearized iterative limited memory search schemes such as conjugate gradients (CG) requires solution of forward and adjoint EM problems for each of the N frequencies or sources, essentially generating data sensitivities for an N dimensional data-subspace. These multiple sensitivities allow a good approximation to the full Jacobian of the data mapping to be built up in many fewer search steps than would be required by application of textbook optimization methods, which take no account of the multiplicity of forward problems that must be solved for each search step. We have applied this idea to develop a hybrid inversion scheme that combines features of the iterative limited memory type methods with a Newton-type approach using a partial calculation of the Jacobian. Initial tests on 2D problems show that the new approach produces results essentially identical to a Newton type Occam minimum structure inversion, while running more rapidly than an iterative (fixed regularization parameter) CG style inversion. Memory requirements, while greater than for something like CG, are modest enough that even in 3D the scheme should allow 3D inverse problems to be solved on a common desktop PC, at least for modest (~ 100 sites, 15-20 frequencies) data sets. A secondary focus of the research has been development of a modular system for EM inversion, using an object oriented approach. This system has proven useful for more rapid prototyping of inversion algorithms, in particular allowing initial development and testing to be conducted with two-dimensional example problems, before approaching more computationally cumbersome three-dimensional problems.
Pattern recognition methods and air pollution source identification. [based on wind direction
NASA Technical Reports Server (NTRS)
Leibecki, H. F.; King, R. B.
1978-01-01
Directional air samplers, used for resolving suspended particulate matter on the basis of time and wind direction, were used to assess the feasibility of characterizing and identifying emission source types in urban multisource environments. Filters were evaluated for 16 elements, and X-ray fluorescence methods yielded elemental concentrations by direction, day, and the interaction of direction and day. Large numbers of samples are necessary to compensate for large day-to-day variations caused by wind perturbations and/or source changes.
Energy Harvesting Research: The Road from Single Source to Multisource.
Bai, Yang; Jantunen, Heli; Juuti, Jari
2018-06-07
Energy harvesting technology may be considered an ultimate solution to replace batteries and provide a long-term power supply for wireless sensor networks. Looking back into its research history, individual energy harvesters for the conversion of single energy sources into electricity were developed first, followed by hybrid counterparts designed for use with multiple energy sources. Very recently, the concept of a truly multisource energy harvester built from only a single piece of material as the energy conversion component has been proposed. This review, from the perspective of materials and device configurations, covers a wide scope in detail to give an overview of energy harvesting research. It covers single-source devices including solar, thermal, kinetic and other types of energy harvesters, hybrid energy harvesting configurations for both single and multiple energy sources, and single-material multisource energy harvesters. It also includes the energy conversion principles of photovoltaic, electromagnetic, piezoelectric, triboelectric, electrostatic, electrostrictive, thermoelectric, pyroelectric, magnetostrictive, and dielectric devices. This is one of the most comprehensive reviews conducted to date, focusing on the entire energy harvesting research scene and providing a guide to seeking deeper and more specific research references and resources from every corner of the scientific community. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Kostyuchenko, Yuriy V.; Yuschenko, Maxim; Movchan, Dmytro; Kopachevsky, Ivan
2017-10-01
The problem of harnessing remote sensing data for decision making in conflict territories is considered. An approach for analyzing socio-economic and demographic parameters with a limited set of data and deep uncertainty is described. A number of interlinked techniques for estimating population and economic activity in crisis territories are proposed: a stochastic method for assessing population dynamics from multi-source remote sensing data, and an adaptive Markov-chain-based method for studying land-use changes from satellite data. The proposed approach is applied to the analysis of the socio-economic situation in the Donbas (East Ukraine) conflict territory in 2014-2015. Land-use and land-cover patterns for different periods were analyzed using Landsat and MODIS data. The land-use classification scheme includes the following categories: (1) urban or built-up land, (2) barren land, (3) cropland, (4) horticulture farms, (5) livestock farms, (6) forest, and (7) water. No drastic changes in the land-use structure of the study area were detected during 2014-2015; heterogeneously distributed decreases in horticulture farms (4-6%), livestock farms (5-6%), and croplands (3-4%), and an increase in barren land (6-7%), were observed. A way to analyze variations in land-cover productivity from satellite data is proposed, based on the analysis of time series of NDVI and NDWI distributions; drastic changes in crop area and its productivity were detected. A set of indirect indicators, such as night light intensity, is also considered. Using the proposed approach and the data described, local and regional GDP, the local population, and its dynamics are estimated.
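The land-use projection component can be illustrated with a plain (non-adaptive) Markov chain over the seven classes listed above; the transition probabilities below are invented for illustration and do not come from the study.

```python
# Sketch of a Markov-chain land-use projection. Transition probabilities are
# invented placeholders; the paper's adaptive estimation is not reproduced.
import numpy as np

classes = ["urban", "barren", "cropland", "horticulture",
           "livestock", "forest", "water"]
# Rows: current class; columns: probability of the next-period class.
P = np.full((7, 7), 0.01)
np.fill_diagonal(P, 0.94)
P /= P.sum(axis=1, keepdims=True)        # each row must sum to 1

share = np.array([0.08, 0.10, 0.40, 0.07, 0.06, 0.24, 0.05])  # current class shares
for year in range(2014, 2016):
    share = share @ P                    # propagate one time step
print(dict(zip(classes, np.round(share, 3))))
```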
Multi-Source Learning for Joint Analysis of Incomplete Multi-Modality Neuroimaging Data
Yuan, Lei; Wang, Yalin; Thompson, Paul M.; Narayan, Vaibhav A.; Ye, Jieping
2013-01-01
Incomplete data present serious problems when integrating large-scale brain imaging data sets from different imaging modalities. In the Alzheimer’s Disease Neuroimaging Initiative (ADNI), for example, over half of the subjects lack cerebrospinal fluid (CSF) measurements; an independent half of the subjects do not have fluorodeoxyglucose positron emission tomography (FDG-PET) scans; many lack proteomics measurements. Traditionally, subjects with missing measures are discarded, resulting in a severe loss of available information. We address this problem by proposing two novel learning methods where all the samples (with at least one available data source) can be used. In the first method, we divide our samples according to the availability of data sources, and we learn shared sets of features with state-of-the-art sparse learning methods. Our second method learns a base classifier for each data source independently, based on which we represent each source using a single column of prediction scores; we then estimate the missing prediction scores, which, combined with the existing prediction scores, are used to build a multi-source fusion model. To illustrate the proposed approaches, we classify patients from the ADNI study into groups with Alzheimer’s disease (AD), mild cognitive impairment (MCI) and normal controls, based on the multi-modality data. At baseline, ADNI’s 780 participants (172 AD, 397 MCI, 211 Normal) have at least one of four data types: magnetic resonance imaging (MRI), FDG-PET, CSF and proteomics. These data are used to test our algorithms. Comprehensive experiments show that our proposed methods yield stable and promising results. PMID:24014189
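The second strategy (per-source base classifiers, score imputation, and a fusion model) can be sketched roughly as follows on synthetic data; the sources, missingness rates, and mean imputation of missing scores are simplifying assumptions, not the ADNI pipeline.

```python
# Hedged sketch of per-source score fusion on toy data: one base classifier
# per data source, missing scores imputed, then a fusion classifier on the
# stacked score matrix.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.impute import SimpleImputer

rng = np.random.default_rng(0)
n = 200
y = rng.integers(0, 2, n)
# Three hypothetical sources with different feature sets; some subjects miss a source.
sources = [rng.normal(y[:, None], 1.0, size=(n, d)) for d in (10, 6, 4)]
available = [rng.random(n) > miss for miss in (0.3, 0.5, 0.2)]

scores = np.full((n, len(sources)), np.nan)
for j, (X, ok) in enumerate(zip(sources, available)):
    clf = LogisticRegression(max_iter=1000).fit(X[ok], y[ok])
    scores[ok, j] = clf.predict_proba(X[ok])[:, 1]

# Estimate missing scores (mean imputation here; the paper estimates them jointly).
scores_filled = SimpleImputer(strategy="mean").fit_transform(scores)
fusion = LogisticRegression().fit(scores_filled, y)
print("fusion training accuracy:", fusion.score(scores_filled, y))
```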
Enhancing the performance of regional land cover mapping
NASA Astrophysics Data System (ADS)
Wu, Weicheng; Zucca, Claudio; Karam, Fadi; Liu, Guangping
2016-10-01
Different pixel-based, object-based and subpixel-based methods such as time-series analysis, decision trees, and different supervised approaches have been proposed to conduct land use/cover classification. However, despite their proven advantages in small dataset tests, their performance is variable and less satisfactory when dealing with large datasets, particularly for regional-scale mapping with high resolution data, due to the complexity and diversity in landscapes and land cover patterns and the unacceptably long processing time. The objective of this paper is to demonstrate the superior performance of an operational approach based on the integration of multisource information, ensuring high mapping accuracy over large areas with acceptable processing time. The information used includes phenologically contrasted multiseasonal and multispectral bands, vegetation index, land surface temperature, and topographic features. The performance of different conventional and machine learning classifiers, namely Mahalanobis Distance (MD), Maximum Likelihood (ML), Artificial Neural Networks (ANNs), Support Vector Machines (SVMs) and Random Forests (RFs), was compared using the same datasets in the same IDL (Interactive Data Language) environment. An Eastern Mediterranean area with complex landscape and steep climate gradients was selected to test and develop the operational approach. The results showed that the SVM and RF classifiers produced the most accurate mapping at the local scale (up to 96.85% in overall accuracy), but were very time-consuming in whole-scene classification (more than five days per scene), whereas ML fulfilled the task rapidly (about 10 min per scene) with satisfactory accuracy (94.2-96.4%). Thus, the approach composed of the integration of seasonally contrasted multisource data and sampling at the subclass level, followed by ML classification, is a suitable candidate for an operational and effective regional land cover mapping method.
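A minimal sketch of such a classifier comparison, using scikit-learn on synthetic features, is given below; QuadraticDiscriminantAnalysis stands in for a Gaussian Maximum Likelihood classifier, and the data are not the paper's Mediterranean scenes.

```python
# Illustrative comparison of classifiers on synthetic "multisource" features
# (not the paper's IDL implementation). QDA is used as a stand-in for a
# Gaussian Maximum Likelihood classifier.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import QuadraticDiscriminantAnalysis
from sklearn.svm import SVC
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=2000, n_features=12, n_informative=8,
                           n_classes=5, n_clusters_per_class=1, random_state=1)
models = {
    "ML (QDA)": QuadraticDiscriminantAnalysis(),
    "SVM (RBF)": SVC(kernel="rbf", gamma="scale"),
    "Random Forest": RandomForestClassifier(n_estimators=200, random_state=1),
}
for name, model in models.items():
    acc = cross_val_score(model, X, y, cv=5).mean()
    print(f"{name}: {acc:.3f}")
```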
Multisource Data Integration in Remote Sensing
NASA Technical Reports Server (NTRS)
Tilton, James C. (Editor)
1991-01-01
Papers presented at the workshop on Multisource Data Integration in Remote Sensing are compiled. The full text of these papers is included. New instruments and new sensors are discussed that can provide us with a large variety of new views of the real world. This huge amount of data has to be combined and integrated in a (computer-) model of this world. Multiple sources may give complementary views of the world - consistent observations from different (and independent) data sources support each other and increase their credibility, while contradictions may be caused by noise, errors during processing, or misinterpretations, and can be identified as such. As a consequence, integration results are very reliable and represent a valid source of information for any geographical information system.
WHO Expert Committee on Specifications for Pharmaceutical Preparations.
2012-01-01
The Expert Committee on Specifications for Pharmaceutical Preparations works towards clear, independent and practical standards and guidelines for the quality assurance of medicines. Standards are developed by the Committee through worldwide consultation and an international consensus-building process. The following new guidelines were adopted and recommended for use: Development of monographs for The International Pharmacopoeia; WHO good manufacturing practices: water for pharmaceutical use; Pharmaceutical development of multisource (generic) pharmaceutical products--points to consider; Guidelines on submission of documentation for a multisource (generic) finished pharmaceutical product for the WHO Prequalification of Medicines Programme: quality part; Development of paediatric medicines: points to consider in formulation; Recommendations for quality requirements for artemisinin as a starting material in the production of antimalarial active pharmaceutical ingredients.
Evaluation of the Maximum Allowable Cost Program
Lee, A. James; Hefner, Dennis; Dobson, Allen; Hardy, Ralph
1983-01-01
This article summarizes an evaluation of the Maximum Allowable Cost (MAC)-Estimated Acquisition Cost (EAC) program, the Federal Government's cost-containment program for prescription drugs.1 The MAC-EAC regulations which became effective on August 26, 1976, have four major components: (1) Maximum Allowable Cost reimbursement limits for selected multisource or generically available drugs; (2) Estimated Acquisition Cost reimbursement limits for all drugs; (3) “usual and customary” reimbursement limits for all drugs; and (4) a directive that professional fee studies be performed by each State. The study examines the benefits and costs of the MAC reimbursement limits for 15 dosage forms of five multisource drugs and EAC reimbursement limits for all drugs for five selected States as of 1979. PMID:10309857
NASA Astrophysics Data System (ADS)
Heitlager, Ilja; Helms, Remko; Brinkkemper, Sjaak
Information Technology Outsourcing practice and research mainly consider the outsourcing phenomenon as a generic fulfilment of the IT function by external parties. Inspired by the logic of commodity, core competencies and economies of scale, organisations transfer assets, existing departments and IT functions to external parties. Although the generic approach might work for desktop outsourcing, where standardisation is the dominant factor, it does not work for the management of mission-critical applications. Managing mission-critical applications requires a different approach, in which building relationships is critical. The relationships involve inter- and intra-organisational parties in a multi-sourcing arrangement, called an IT service chain, consisting of multiple (specialist) parties that have to collaborate closely to deliver high-quality services.
MSWEP V2 global 3-hourly 0.1° precipitation: methodology and quantitative appraisal
NASA Astrophysics Data System (ADS)
Beck, H.; Yang, L.; Pan, M.; Wood, E. F.; William, L.
2017-12-01
Here, we present Multi-Source Weighted-Ensemble Precipitation (MSWEP) V2, the first fully global gridded precipitation (P) dataset with a 0.1° spatial resolution. The dataset covers the period 1979-2016, has a 3-hourly temporal resolution, and was derived by optimally merging a wide range of data sources based on gauges (WorldClim, GHCN-D, GSOD, and others), satellites (CMORPH, GridSat, GSMaP, and TMPA 3B42RT), and reanalyses (ERA-Interim, JRA-55, and NCEP-CFSR). MSWEP V2 implements some major improvements over V1, such as (i) the correction of distributional P biases using cumulative distribution function matching, (ii) increasing the spatial resolution from 0.25° to 0.1°, (iii) the inclusion of ocean areas, (iv) the addition of NCEP-CFSR P estimates, (v) the addition of thermal infrared-based P estimates for the pre-TRMM era, (vi) the addition of 0.1° daily interpolated gauge data, (vii) the use of a daily gauge correction scheme that accounts for regional differences in the 24-hour accumulation period of gauges, and (viii) extension of the data record to 2016. The gauge-based assessment of the reanalysis and satellite P datasets, necessary for establishing the merging weights, revealed that the reanalysis datasets strongly overestimate the P frequency for the entire globe, and that the satellite (resp. reanalysis) datasets consistently performed better at low (high) latitudes. Compared to other state-of-the-art P datasets, MSWEP V2 exhibits more plausible global patterns in mean annual P, percentiles, and annual number of dry days, and better resolves the small-scale variability over topographically complex terrain. Other P datasets appear to consistently underestimate P amounts over mountainous regions. Long-term mean P estimates for the global, land, and ocean domains based on MSWEP V2 are 959, 796, and 1026 mm/yr, respectively, in close agreement with the best previous published estimates.
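The CDF-matching step mentioned in improvement (i) can be sketched with a simple quantile-mapping routine; the data below are synthetic, and the real MSWEP scheme is applied per grid cell and is considerably more elaborate.

```python
# Minimal quantile-mapping (CDF matching) sketch for bias-correcting a
# precipitation series against a reference (synthetic data only).
import numpy as np

def cdf_match(biased, reference, values):
    """Map `values` from the biased distribution onto the reference distribution."""
    q = np.linspace(0, 1, 101)
    src_q = np.quantile(biased, q)
    ref_q = np.quantile(reference, q)
    # Position of each value in the biased CDF, then read off the reference CDF.
    pos = np.interp(values, src_q, q)
    return np.interp(pos, q, ref_q)

rng = np.random.default_rng(42)
gauge = rng.gamma(shape=0.8, scale=6.0, size=5000)          # "true" daily rainfall
satellite = 0.6 * gauge + rng.normal(0, 1, 5000).clip(0)    # biased estimate
corrected = cdf_match(satellite, gauge, satellite[:5])
print(satellite[:5].round(2), corrected.round(2))
```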
The MiPACQ Clinical Question Answering System
Cairns, Brian L.; Nielsen, Rodney D.; Masanz, James J.; Martin, James H.; Palmer, Martha S.; Ward, Wayne H.; Savova, Guergana K.
2011-01-01
The Multi-source Integrated Platform for Answering Clinical Questions (MiPACQ) is a QA pipeline that integrates a variety of information retrieval and natural language processing systems into an extensible question answering system. We present the system’s architecture and an evaluation of MiPACQ on a human-annotated evaluation dataset based on the Medpedia health and medical encyclopedia. Compared with our baseline information retrieval system, the MiPACQ rule-based system demonstrates 84% improvement in Precision at One and the MiPACQ machine-learning-based system demonstrates 134% improvement. Other performance metrics including mean reciprocal rank and area under the precision/recall curves also showed significant improvement, validating the effectiveness of the MiPACQ design and implementation. PMID:22195068
NASA Astrophysics Data System (ADS)
Vieira, João; da Conceição Cunha, Maria
2017-04-01
A multi-objective decision model has been developed to identify the Pareto-optimal set of management alternatives for the conjunctive use of surface water and groundwater of a multisource urban water supply system. A multi-objective evolutionary algorithm, Borg MOEA, is used to solve the multi-objective decision model. The multiple solutions can be shown to stakeholders, allowing them to choose their own solutions depending on their preferences. The multisource urban water supply system studied here is dependent on surface water and groundwater and located in the Algarve region, southernmost province of Portugal, with a typical warm Mediterranean climate. The rainfall is low, intermittent and concentrated in a short winter, followed by a long and dry period. A base population of 450 000 inhabitants and visits by more than 13 million tourists per year, mostly in summertime, make water management critical and challenging. Previous studies on single objective optimization after aggregating multiple objectives together have already concluded that only an integrated and interannual water resources management perspective can be efficient for water resource allocation in this drought prone region. A simulation model of the multisource urban water supply system using mathematical functions to represent the water balance in the surface reservoirs, the groundwater flow in the aquifers, and the water transport in the distribution network with explicit representation of water quality is coupled with Borg MOEA. The multi-objective problem formulation includes five objectives. Two objectives separately evaluate the water quantity and the water quality supplied for urban use over a finite time horizon, one objective calculates the operating costs, and two objectives appraise the state of the two water sources - the storage in the surface reservoir and the piezometric levels in the aquifer - at the end of the time horizon. The decision variables are the volume of withdrawals from each water source in each time step (i.e., reservoir diversion and groundwater pumping). The results provide valuable information for analysing the impacts of the conjunctive use of surface water and groundwater. For example, considering a drought scenario, the results show how the same level of total water supplied can be achieved by different management alternatives with different impacts on water quality, costs, and the state of the water sources at the end of the time horizon. The results also allow a clear understanding of the potential benefits of the conjunctive use of surface water and groundwater, through the mitigation of variations in surface water availability, the improvement of the water quantity and/or quality delivered to users, or the better adaptation of such systems to a changing world.
Incomplete Multisource Transfer Learning.
Ding, Zhengming; Shao, Ming; Fu, Yun
2018-02-01
Transfer learning is generally exploited to adapt well-established source knowledge for learning tasks in weakly labeled or unlabeled target domain. Nowadays, it is common to see multiple sources available for knowledge transfer, each of which, however, may not include complete classes information of the target domain. Naively merging multiple sources together would lead to inferior results due to the large divergence among multiple sources. In this paper, we attempt to utilize incomplete multiple sources for effective knowledge transfer to facilitate the learning task in target domain. To this end, we propose incomplete multisource transfer learning through two directions of knowledge transfer, i.e., cross-domain transfer from each source to the target, and cross-source transfer. In particular, in the cross-domain direction, we deploy latent low-rank transfer learning guided by iterative structure learning to transfer knowledge from each single source to the target domain. This helps compensate for any missing data in each source using the complete target data. In the cross-source direction, an unsupervised manifold regularizer and effective multisource alignment are explored to jointly compensate for missing data from one portion of the sources to another. In this way, both marginal and conditional distribution discrepancies in the two directions would be mitigated. Experimental results on standard cross-domain benchmarks and synthetic data sets demonstrate the effectiveness of our proposed model in knowledge transfer from incomplete multiple sources.
NASA Technical Reports Server (NTRS)
Brooks, Colin; Bourgeau-Chavez, Laura; Endres, Sarah; Battaglia, Michael; Shuchman, Robert
2015-01-01
The objective was to assist with the evaluation and measurement of wetland hydroperiod at the Plum Brook Station using multi-source remote sensing data, as part of a larger effort on projecting climate change-related impacts on the station's wetland ecosystems. MTRI expanded on the multi-source remote sensing capabilities to help estimate and measure hydroperiod and the relative soil moisture of wetlands at NASA's Plum Brook Station. Multi-source remote sensing capabilities are useful in estimating and measuring hydroperiod and relative soil moisture of wetlands. This is important as a changing regional climate poses several potential risks for wetland ecosystem function. The year two analysis built on the first year of the project by acquiring and analyzing remote sensing data for additional dates and types of imagery, combined with focused field work. Five deliverables were planned and completed: (1) showing the relative length of hydroperiod using available remote sensing datasets; (2) a date-linked table of wetland extent over time for all feasible non-forested wetlands; (3) use of LIDAR data to measure the topographic height above sea level of all wetlands, the wetland-to-catchment area ratio, wetland slope, and other useful variables; (4) a demonstration of how analyzed results from multiple remote sensing data sources can help with wetland vulnerability assessment; and (5) an MTRI-style report summarizing year 2 results.
Multi-source energy harvester to power sensing hardware on rotating structures
NASA Astrophysics Data System (ADS)
Schlichting, Alexander; Ouellette, Scott; Carlson, Clinton; Farinholt, Kevin M.; Park, Gyuhae; Farrar, Charles R.
2010-04-01
The U.S. Department of Energy (DOE) proposes to meet 20% of the nation's energy needs through wind power by the year 2030. To accomplish this goal, the industry will need to produce larger (>100m diameter) turbines to increase efficiency and maximize energy production. It will be imperative to instrument the large composite structures with onboard sensing to provide structural health monitoring capabilities to understand the global response and integrity of these systems as they age. A critical component in the deployment of such a system will be a robust power source that can operate for the lifespan of the wind turbine. In this paper we consider the use of discrete, localized power sources that derive energy from the ambient (solar, thermal) or operational (kinetic) environment. This approach will rely on a multi-source configuration that scavenges energy from photovoltaic and piezoelectric transducers. Each harvester is first characterized individually in the laboratory and then they are combined through a multi-source power conditioner that is designed to combine the output of each harvester in series to power a small wireless sensor node that has active-sensing capabilities. The advantages/disadvantages of each approach are discussed, along with the proposed design for a field ready energy harvester that will be deployed on a small-scale 19.8m diameter wind turbine.
Wang, Qi; Xie, Zhiyi; Li, Fangbai
2015-11-01
This study aims to identify and apportion multi-source and multi-phase heavy metal pollution from natural and anthropogenic inputs using ensemble models that include stochastic gradient boosting (SGB) and random forest (RF) in agricultural soils on the local scale. The heavy metal pollution sources were quantitatively assessed, and the results illustrated the suitability of the ensemble models for the assessment of multi-source and multi-phase heavy metal pollution in agricultural soils on the local scale. The results of SGB and RF consistently demonstrated that anthropogenic sources contributed the most to the concentrations of Pb and Cd in agricultural soils in the study region and that SGB performed better than RF. Copyright © 2015 Elsevier Ltd. All rights reserved.
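A rough sketch of the tool chain (not the paper's apportionment procedure) is shown below: stochastic gradient boosting and random forest regressors are fitted to a synthetic Cd concentration, and their feature importances are compared for hypothetical natural and anthropogenic predictors.

```python
# Sketch (synthetic data): fit stochastic gradient boosting and random forest
# regressors to a metal concentration and compare the relative importances of
# hypothetical natural vs. anthropogenic predictors. Illustrates the tool chain
# only, not the paper's quantitative source apportionment.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor, RandomForestRegressor

rng = np.random.default_rng(7)
n = 500
X = pd.DataFrame({
    "parent_rock_index": rng.normal(size=n),          # "natural" proxy
    "soil_pH": rng.normal(6.5, 0.5, n),
    "distance_to_road_km": rng.exponential(2.0, n),   # "anthropogenic" proxies
    "irrigation_water_index": rng.normal(size=n),
})
cd = (0.3 * X["parent_rock_index"] - 0.6 * np.log1p(X["distance_to_road_km"])
      + 0.5 * X["irrigation_water_index"] + rng.normal(0, 0.2, n))

for model in (GradientBoostingRegressor(subsample=0.8, random_state=7),
              RandomForestRegressor(n_estimators=300, random_state=7)):
    model.fit(X, cd)
    print(type(model).__name__,
          dict(zip(X.columns, model.feature_importances_.round(2))))
```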
Gomez, Rapson; Burns, G Leonard; Walsh, James A; Hafetz, Nina
2005-04-01
Confirmatory factor analysis (CFA) was used to model a multitrait by multisource matrix to determine the convergent and discriminant validity of measures of attention-deficit hyperactivity disorder (ADHD)-inattention (IN), ADHD-hyperactivity/impulsivity (HI), and oppositional defiant disorder (ODD) in 917 Malaysian elementary school children. The three trait factors were ADHD-IN, ADHD-HI, and ODD. The two source factors were parents and teachers. Similar to earlier studies with Australian and Brazilian children, the parent and teacher measures failed to show convergent and discriminant validity with Malaysian children. The study outlines the implications of such strong source effects in ADHD-IN, ADHD-HI, and ODD measures for the use of such parent and teacher scales to study the symptom dimensions.
Multisource least-squares reverse-time migration with structure-oriented filtering
NASA Astrophysics Data System (ADS)
Fan, Jing-Wen; Li, Zhen-Chun; Zhang, Kai; Zhang, Min; Liu, Xue-Tong
2016-09-01
The technology of simultaneous-source acquisition of seismic data excited by several sources can significantly improve the data collection efficiency. However, direct imaging of simultaneous-source data or blended data may introduce crosstalk noise and affect the imaging quality. To address this problem, we introduce a structure-oriented filtering operator as preconditioner into the multisource least-squares reverse-time migration (LSRTM). The structure-oriented filtering operator is a nonstationary filter along structural trends that suppresses crosstalk noise while maintaining structural information. The proposed method uses the conjugate-gradient method to minimize the mismatch between predicted and observed data, while effectively attenuating the interference noise caused by exciting several sources simultaneously. Numerical experiments using synthetic data suggest that the proposed method can suppress the crosstalk noise and produce highly accurate images.
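The idea of preconditioning a conjugate-gradient data-fitting loop with a smoothing operator can be sketched on a toy linear problem; the tridiagonal smoother below is a crude, symmetric positive-definite stand-in for a structure-oriented filter, and the operators and data are synthetic placeholders rather than a migration engine.

```python
# Toy sketch: preconditioned conjugate gradients on the normal equations,
# with a simple smoother playing the role of the structure-oriented filter.
import numpy as np

def smooth(x):
    # Symmetric (SPD) smoothing filter [0.25, 0.5, 0.25]; a crude stand-in
    # for structure-oriented filtering along reflector trends.
    return np.convolve(x, np.array([0.25, 0.5, 0.25]), mode="same")

rng = np.random.default_rng(3)
n = 200
A = rng.normal(size=(300, n))                    # placeholder forward operator
x_true = np.convolve(rng.normal(size=n), np.ones(15) / 15, mode="same")
b = A @ x_true + 0.05 * rng.normal(size=300)     # "observed" blended data

x = np.zeros(n)
r = A.T @ (b - A @ x)                            # residual of the normal equations
z = smooth(r)
p = z.copy()
for _ in range(50):
    Ap = A.T @ (A @ p)
    alpha = (r @ z) / (p @ Ap)
    x = x + alpha * p
    r_new = r - alpha * Ap
    z_new = smooth(r_new)
    beta = (r_new @ z_new) / (r @ z)
    p = z_new + beta * p
    r, z = r_new, z_new
print("relative model error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```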
Global 3-D ionospheric electron density reanalysis based on multisource data assimilation
NASA Astrophysics Data System (ADS)
Yue, Xinan; Schreiner, William S.; Kuo, Ying-Hwa; Hunt, Douglas C.; Wang, Wenbin; Solomon, Stanley C.; Burns, Alan G.; Bilitza, Dieter; Liu, Jann-Yenq; Wan, Weixing; Wickert, Jens
2012-09-01
We report preliminary results of a global 3-D ionospheric electron density reanalysis demonstration study during 2002-2011 based on multisource data assimilation. The monthly global ionospheric electron density reanalysis has been done by assimilating quiet-day ionospheric data into a data assimilation model constructed using the International Reference Ionosphere (IRI) 2007 model and a Kalman filter technique. These data include global navigation satellite system (GNSS) observations of ionospheric total electron content (TEC) from ground-based stations, ionospheric radio occultations by CHAMP, GRACE, COSMIC, SAC-C, Metop-A, and the TerraSAR-X satellites, and Jason-1 and 2 altimeter TEC measurements. The outputs of the reanalysis are 3-D gridded ionospheric electron densities with temporal and spatial resolutions of 1 h in universal time, 5° in latitude, 10° in longitude, and ~30 km in altitude. The climatological features of the reanalysis results, such as solar activity dependence, seasonal variations, and the global morphology of the ionosphere, agree well with those in the empirical models and observations. The global electron content derived from the international GNSS service global ionospheric maps, the observed electron density profiles from the Poker Flat Incoherent Scatter Radar during 2007-2010, and foF2 observed by the global ionosonde network during 2002-2011 are used to validate the reanalysis method. All comparisons show that the reanalysis has smaller deviations and biases than the IRI-2007 predictions. Especially after April 2006, when the six COSMIC satellites were launched, the reanalysis shows significant improvement over the IRI predictions. The obvious overestimation of the low-latitude ionospheric F region densities by the IRI model during the 23/24 solar minimum is corrected well by the reanalysis. The potential application and improvements of the reanalysis are also discussed.
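At its core, the assimilation blends a background state with observations through a Kalman gain. The sketch below shows a single analysis step with toy numbers; the real system operates on a 3-D gridded electron density with far larger state and observation vectors.

```python
# Minimal Kalman-style analysis step (toy numbers): blend a background state
# (standing in for the IRI prediction) with observations, weighted by their
# error covariances.
import numpy as np

def kalman_update(x_b, B, y, H, R):
    """x_b: background state, B: background covariance,
    y: observations, H: observation operator, R: observation covariance."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman gain
    x_a = x_b + K @ (y - H @ x_b)                  # analysis state
    A = (np.eye(len(x_b)) - K @ H) @ B             # analysis covariance
    return x_a, A

x_b = np.array([1.0e11, 8.0e11, 3.0e11])   # electron density at 3 grid points (el/m^3)
B = np.diag([4e21, 9e21, 4e21])            # background error covariance (toy)
H = np.array([[0.5, 0.5, 0.0]])            # one TEC-like observation averaging two points
y = np.array([6.0e11])
R = np.array([[1e21]])
x_a, A = kalman_update(x_b, B, y, H, R)
print(x_a)
```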
New geomorphic data on the active Taiwan orogen: A multisource approach
NASA Technical Reports Server (NTRS)
Deffontaines, B.; Lee, J.-C.; Angelier, J.; Carvalho, J.; Rudant, J.-P.
1994-01-01
A multisource and multiscale approach of Taiwan morphotectonics combines different complementary geomorphic analyses based on a new elevation model (DEM), side-looking airborne radar (SLAR), and satellite (SPOT) imagery, aerial photographs, and control from independent field data. This analysis enables us not only to present an integrated geomorphic description of the Taiwan orogen but also to highlight some new geodynamic aspects. Well-known, major geological structures such as the Longitudinal Valley, Lishan, Pingtung, and the Foothills fault zones are of course clearly recognized, but numerous, previously unrecognized structures appear distributed within different regions of Taiwan. For instance, transfer fault zones within the Western Foothills and the Central Range are identified based on analyses of lineaments and general morphology. In many cases, the existence of geomorphic features identified in general images is supported by the results of geological field analyses carried out independently. In turn, the field analyses of structures and mechanisms at some sites provide a key for interpreting similar geomorphic features in other areas. Examples are the conjugate pattern of strike-slip faults within the Central Range and the oblique fold-and-thrust pattern of the Coastal Range. Furthermore, neotectonic and morphological analyses (drainage and erosional surfaces) have been combined in order to obtain a more comprehensive description and interpretation of neotectonic features in Taiwan, such as for the Longitudinal Valley Fault. Next, at a more general scale, numerical processing of digital elevation models, resulting in average topography, summit level or base level maps, allows identification of major features related to the dynamics of uplift and erosion and estimates of erosion balance. Finally, a preliminary morphotectonic sketch map of Taiwan, combining information from all the sources listed above, is presented.
Connected Vehicle Applications : Mobility
DOT National Transportation Integrated Search
2017-03-03
Connected vehicle mobility applications are commonly referred to as dynamic mobility applications (DMAs). DMAs seek to fully leverage frequently collected and rapidly disseminated multi-source data gathered from connected travelers, vehicles, and inf...
NASA Astrophysics Data System (ADS)
Wason, H.; Herrmann, F. J.; Kumar, R.
2016-12-01
Current efforts towards dense shot (or receiver) sampling and full azimuthal coverage to produce high resolution images have led to the deployment of multiple source vessels (or streamers) across marine survey areas. Densely sampled marine seismic data acquisition, however, is expensive, and hence necessitates the adoption of sampling schemes that save acquisition costs and time. Compressed sensing is a sampling paradigm that aims to reconstruct a signal--that is sparse or compressible in some transform domain--from relatively fewer measurements than required by the Nyquist sampling criterion. Leveraging ideas from the field of compressed sensing, we show how marine seismic acquisition can be set up as a compressed sensing problem. A step ahead from multi-source seismic acquisition is simultaneous source acquisition--an emerging technology that is stimulating both geophysical research and commercial efforts--where multiple source arrays/vessels fire shots simultaneously, resulting in better coverage in marine surveys. Following the design principles of compressed sensing, we propose a pragmatic simultaneous time-jittered time-compressed marine acquisition scheme where single or multiple source vessels sail across an ocean-bottom array firing airguns at jittered times and source locations, resulting in better spatial sampling and faster acquisition. Our acquisition is low cost since our measurements are subsampled. Simultaneous source acquisition generates data with overlapping shot records, which need to be separated for further processing. We show that conventional seismic data can be reconstructed with high quality from jittered data and demonstrate successful recovery by sparsity promotion. In contrast to random (sub)sampling, acquisition via jittered (sub)sampling helps in controlling the maximum gap size, which is a practical requirement of wavefield reconstruction with localized sparsifying transforms. We illustrate our results with simulations of simultaneous time-jittered marine acquisition for 2D and 3D ocean-bottom cable surveys.
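A one-dimensional toy of jittered subsampling followed by sparsity-promoting recovery (ISTA with a DCT sparsifying transform) is sketched below; it is illustrative only, since the actual workflow handles blended 2-D/3-D marine data and localized transforms such as curvelets.

```python
# 1-D toy: jittered subsampling of a DCT-sparse signal and recovery by
# iterative soft thresholding (ISTA).
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(0)
n, keep = 256, 64
coeffs = np.zeros(n)
coeffs[rng.choice(n, 8, replace=False)] = rng.normal(0, 5, 8)
signal = idct(coeffs, norm="ortho")          # signal that is sparse in the DCT domain

# Jittered subsampling: one sample per bin, at a random offset within the bin.
bin_size = n // keep
idx = np.arange(keep) * bin_size + rng.integers(0, bin_size, keep)
y = signal[idx]

x = np.zeros(n)
step, lam = 0.9, 0.1
for _ in range(300):
    grad = np.zeros(n)
    grad[idx] = x[idx] - y                                    # gradient of 0.5*||Rx - y||^2
    z = dct(x - step * grad, norm="ortho")                    # go to the sparse (DCT) domain
    z = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # soft threshold
    x = idct(z, norm="ortho")
print("relative recovery error:",
      np.linalg.norm(x - signal) / np.linalg.norm(signal))
```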
NASA Astrophysics Data System (ADS)
Ma, Yingzhao; Yang, Yuan; Han, Zhongying; Tang, Guoqiang; Maguire, Lane; Chu, Zhigang; Hong, Yang
2018-01-01
The objective of this study is to comprehensively evaluate the new Ensemble Multi-Satellite Precipitation Dataset using the Dynamic Bayesian Model Averaging scheme (EMSPD-DBMA) at daily and 0.25° scales from 2001 to 2015 over the Tibetan Plateau (TP). Error analysis against gauge observations revealed that EMSPD-DBMA captured the spatiotemporal pattern of daily precipitation with an acceptable Correlation Coefficient (CC) of 0.53 and a Relative Bias (RB) of -8.28%. Moreover, EMSPD-DBMA outperformed IMERG and GSMaP-MVK in almost all metrics in the summers of 2014 and 2015, with the lowest RB and Root Mean Square Error (RMSE) values of -2.88% and 8.01 mm/d, respectively. It also better reproduced the Probability Density Function (PDF) in terms of daily rainfall amount and estimated moderate and heavy rainfall better than both IMERG and GSMaP-MVK. Further, hydrological evaluation with the Coupled Routing and Excess STorage (CREST) model in the Upper Yangtze River region indicated that the EMSPD-DBMA forced simulation showed satisfactory hydrological performance in terms of streamflow prediction, with Nash-Sutcliffe coefficient of Efficiency (NSE) values of 0.82 and 0.58, compared to the gauge-forced simulation (0.88 and 0.60) for the calibration and validation periods, respectively. EMSPD-DBMA also showed a better fit for peak flow simulation than a new Multi-Source Weighted-Ensemble Precipitation Version 2 (MSWEP V2) product, indicating a promising prospect of hydrological utility for the ensemble satellite precipitation data. This study is among the early comprehensive evaluations of blended multi-satellite precipitation data across the TP, and its findings should help improve the DBMA algorithm in regions with complex terrain.
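For reference, the skill metrics quoted above can be computed as in the short sketch below (Nash-Sutcliffe Efficiency, relative bias, and correlation coefficient); the streamflow arrays are purely illustrative.

```python
# Short sketch of the skill metrics used above: NSE, relative bias (RB) and
# correlation coefficient (CC). Arrays are illustrative only.
import numpy as np

def nse(sim, obs):
    sim = np.asarray(sim, float); obs = np.asarray(obs, float)
    return 1.0 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

def relative_bias(sim, obs):
    return 100.0 * (np.sum(sim) - np.sum(obs)) / np.sum(obs)

obs = np.array([120.0, 200.0, 340.0, 280.0, 150.0])   # observed streamflow (m^3/s)
sim = np.array([110.0, 215.0, 320.0, 300.0, 140.0])   # simulated streamflow
print(f"NSE={nse(sim, obs):.2f}, RB={relative_bias(sim, obs):.1f}%, "
      f"CC={np.corrcoef(sim, obs)[0, 1]:.2f}")
```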
Elman, Monica; Harth, Yoram
2011-01-01
The basic properties of lasers and pulsed light sources limit their ability to deliver high energy to the dermis and subcutaneous tissues without excessive damage to the epidermis. Radiofrequency was shown to penetrate deeper than optical light sources, independent of skin color. The early RF-based devices used single-source bipolar RF, which is safe but limited in use due to the superficial flow of energy between the two bipolar electrodes. Another type of single-source RF employs a single electrode (monopolar), in which the RF energy flows from one electrode on the surface of the skin through the entire body to a plate under the body. Although more effective than bipolar, these devices require intense active cooling of the skin and may be associated with considerable pain and other systemic and local safety concerns. The latest generation of RF technology, developed by EndyMed Medical Ltd. (Caesarea, Israel), simultaneously utilizes six or more phase-controlled RF generators (3DEEP technology). The multiple electrical fields created by the multiple sources “repel” or “attract” each other, leading to the precise 3-dimensional delivery of RF energy to the dermal and sub-dermal targets, minimizing the energy flow through the epidermis without the need for active cooling. Confocal microscopy of the skin has shown that 6 treatment sessions of Multisource RF technology improve skin structure features. After treatment, the skin had longer and narrower dermal papillae and denser, finer collagen fibers typical of younger skin, as compared to pre-treatment skin. Ultrasound of the skin showed a 10 percent reduction in the thickness of the subcutaneous fat layer after 6 treatment sessions. Non-ablative facial clinical studies showed a significant reduction of wrinkles after treatment, with further reduction at the 3-month follow-up. Body treatment studies showed a circumference reduction of 2.9 cm immediately after 6 treatments, and 2 cm at 12 months after the end of treatment, demonstrating a long-term collagen remodeling effect. Clinical studies of the multisource fractional RF application have shown significant effects on wrinkle reduction and deep atrophic acne scars after 1–3 treatment sessions. PMID:24155523
Multisource image fusion method using support value transform.
Zheng, Sheng; Shi, Wen-Zhong; Liu, Jian; Zhu, Guang-Xi; Tian, Jin-Wen
2007-07-01
With the development of numerous imaging sensors, many images can be simultaneously pictured by various sensors. However, there are many scenarios where no one sensor can give the complete picture. Image fusion is an important approach to solve this problem and produces a single image which preserves all relevant information from a set of different sensors. In this paper, we propose a new image fusion method using the support value transform, which uses the support value to represent the salient features of the image. This is based on the fact that, in support vector machines (SVMs), the data with larger support values have a physical meaning in the sense that they reveal the relatively greater importance of those data points in contributing to the SVM model. The mapped least squares SVM (mapped LS-SVM) is used to efficiently compute the support values of the image. The support value analysis is developed by using a series of multiscale support value filters, which are obtained by filling zeros in the basic support value filter deduced from the mapped LS-SVM to match the resolution of the desired level. Compared with the widely used image fusion methods, such as the Laplacian pyramid and discrete wavelet transform methods, the proposed method is an undecimated transform-based approach. The fusion experiments are undertaken on multisource images. The results demonstrate that the proposed approach is effective and is superior to the conventional image fusion methods in terms of the pertinent quantitative fusion evaluation indexes, such as quality of visual information (Q(AB/F)), the mutual information, etc.
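A generic multiscale fusion sketch in the same spirit is given below: each source image is decomposed with a bank of smoothing filters and, at each scale and pixel, the coefficient with larger magnitude is kept. Plain Gaussian filters stand in for the support value filters derived from the mapped LS-SVM, so this illustrates the structure of the approach rather than the method itself.

```python
# Generic undecimated multiscale fusion sketch: detail layers from a filter
# bank, fused by keeping the larger-magnitude (more "salient") coefficient.
# Gaussian filters are a stand-in for the support value filters.
import numpy as np
from scipy.ndimage import gaussian_filter

def decompose(img, sigmas=(1, 2, 4)):
    levels, residual = [], img.astype(float)
    for s in sigmas:
        low = gaussian_filter(residual, s)
        levels.append(residual - low)   # detail layer at this scale
        residual = low
    return levels, residual

def fuse(img_a, img_b):
    la, ra = decompose(img_a)
    lb, rb = decompose(img_b)
    fused = 0.5 * (ra + rb)             # average the coarse residuals
    for da, db in zip(la, lb):
        fused += np.where(np.abs(da) >= np.abs(db), da, db)
    return fused

a = np.random.default_rng(0).random((64, 64))
b = np.random.default_rng(1).random((64, 64))
print(fuse(a, b).shape)
```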
Integrated Dynamic Transit Operations (IDTO) concept of operations.
DOT National Transportation Integrated Search
2012-05-01
In support of USDOT's Intelligent Transportation Systems (ITS) Mobility Program, the Dynamic Mobility Applications (DMA) program seeks to create applications that fully leverage frequently collected and rapidly disseminated multi-source data gat...
Imputation for multisource data with comparison and assessment techniques
Casleton, Emily Michele; Osthus, David Allen; Van Buren, Kendra Lu
2017-12-27
Missing data are a prevalent issue in analyses involving data collection. The problem is exacerbated in multisource analysis, where data from multiple sensors are combined to arrive at a single conclusion: missing values are more likely to occur and can lead to discarding a large amount of the data collected. However, the information from observed sensors can be leveraged to estimate the values that were not observed. We propose two methods for imputation of multisource data, both of which take advantage of potential correlation between data from different sensors, through ridge regression and a state-space model. These methods, as well as the common median imputation, are applied to data collected from a variety of sensors monitoring an experimental facility. Performance of the imputation methods is compared with the mean absolute deviation; however, rather than using this metric solely to rank the methods, we also propose an approach to identify significant differences. The imputation techniques are also assessed by their ability to produce appropriate confidence intervals, through coverage and length, around the imputed values. Finally, performance of the imputed datasets is compared with a marginalized dataset through a weighted k-means clustering. In general, we found that imputation through a dynamic linear model tended to be the most accurate and to produce the most precise confidence intervals, and that imputing the missing values and down-weighting them with respect to observed values in the analysis led to the most accurate performance.
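A minimal sketch of the ridge-regression side of the idea, assuming one sensor's gaps are filled from the other, correlated sensors; this is an illustration on synthetic data, not the paper's model or its state-space counterpart.

```python
# Ridge-regression imputation across correlated sensors (illustrative sketch).
import numpy as np

def ridge_impute(data, target_col, lam=1.0):
    """Fill NaNs in one sensor column using a ridge fit on the other sensors."""
    data = data.copy()
    missing = np.isnan(data[:, target_col])
    predictors = np.delete(data, target_col, axis=1)
    complete = ~np.isnan(predictors).any(axis=1)
    train = complete & ~missing
    X, y = predictors[train], data[train, target_col]
    X_mean, y_mean = X.mean(axis=0), y.mean()
    Xc = X - X_mean
    beta = np.linalg.solve(Xc.T @ Xc + lam * np.eye(X.shape[1]), Xc.T @ (y - y_mean))
    fill_rows = missing & complete
    data[fill_rows, target_col] = (predictors[fill_rows] - X_mean) @ beta + y_mean
    return data

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    truth = rng.normal(size=(200, 1)) @ np.ones((1, 3)) + 0.1 * rng.normal(size=(200, 3))
    obs = truth.copy()
    obs[rng.random(200) < 0.2, 2] = np.nan            # knock out 20% of sensor 2
    filled = ridge_impute(obs, target_col=2)
    print("mean absolute deviation:", np.mean(np.abs(filled[:, 2] - truth[:, 2])))
```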
The assessment of pathologists/laboratory medicine physicians through a multisource feedback tool.
Lockyer, Jocelyn M; Violato, Claudio; Fidler, Herta; Alakija, Pauline
2009-08-01
There is increasing interest in ensuring that physicians demonstrate the full range of Accreditation Council for Graduate Medical Education competencies. To determine whether it is possible to develop a feasible and reliable multisource feedback instrument for pathologists and laboratory medicine physicians. Surveys with 39, 30, and 22 items were developed to assess individual physicians by 8 peers, 8 referring physicians, and 8 coworkers (eg, technologists, secretaries), respectively, using 5-point scales and an unable-to-assess category. Physicians completed a self-assessment survey. Items addressed key competencies related to clinical competence, collaboration, professionalism, and communication. Data from 101 pathologists and laboratory medicine physicians were analyzed. The mean number of respondents per physician was 7.6, 7.4, and 7.6 for peers, referring physicians, and coworkers, respectively. The reliability of the internal consistency, measured by Cronbach alpha, was ≥ .95 for the full scale of all instruments. Analysis indicated that the medical peer, referring physician, and coworker instruments achieved a generalizability coefficient of .78, .81, and .81, respectively. Factor analysis showed 4 factors on the peer questionnaire accounted for 68.8% of the total variance: reports and clinical competency, collaboration, educational leadership, and professional behavior. For the referring physician survey, 3 factors accounted for 66.9% of the variance: professionalism, reports, and clinical competency. Two factors on the coworker questionnaire accounted for 59.9% of the total variance: communication and professionalism. It is feasible to assess this group of physicians using multisource feedback with instruments that are reliable.
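A minimal sketch of the internal-consistency statistic quoted above (Cronbach's alpha) computed on simulated rating data; the item structure and rater counts are stand-ins, not the study's data.

```python
# Cronbach's alpha for a raters x items matrix of numeric ratings (sketch).
import numpy as np

def cronbach_alpha(scores):
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total scores)."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    latent = rng.normal(size=(60, 1))                  # one underlying trait per rater
    items = latent + 0.5 * rng.normal(size=(60, 39))   # 39 correlated survey items
    print(round(cronbach_alpha(items), 3))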
NASA Astrophysics Data System (ADS)
Wang, Gongwen; Ma, Zhenbo; Li, Ruixi; Song, Yaowu; Qu, Jianan; Zhang, Shouting; Yan, Changhai; Han, Jiangwei
2017-04-01
In this paper, multi-source (geophysical, geochemical, geological and remote sensing) datasets were used to construct multi-scale (district-, deposit-, and orebody-scale) 3D geological models and to extract 3D exploration criteria for subsurface Mo-polymetallic exploration targeting in the Luanchuan district, China. The results indicate that (i) a series of region-/district-scale NW-trending thrusts controlled the main Mo-polymetallic mineralization; they were formed by regional Indosinian Qinling orogenic events, while the secondary NW-trending district-scale folds, the NE-trending faults and the intrusive stock structures developed on the basis of the thrust structures during Caledonian-Indosinian orogenic events and constitute the ore-bearing zones and ore-forming structures; (ii) the NW-trending district-scale and NE-trending deposit-scale normal faults are crossed and controlled by the Jurassic granite stocks in 3D space and are associated with the magma-skarn Mo-polymetallic mineralization (the 3D buffer distance of the ore-forming granite stocks is 600 m), while the NW-trending hydrothermal Pb-Zn deposits are surrounded by the Jurassic granite stocks and constrained by NW-trending or NE-trending faults (the 3D buffer distance of the ore-forming faults is 700 m); and (iii) nine Mo-polymetallic and four Pb-Zn targets were identified in the subsurface of the Luanchuan district.
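A minimal sketch of how buffer-distance criteria of the kind quoted above (600 m around granite stocks, 700 m around ore-controlling faults) could be turned into a binary targeting mask over voxel centres; combining the two criteria with a logical AND, and all the geometry, are illustrative assumptions rather than the study's workflow.

```python
# 3D buffer-based targeting mask over synthetic voxel centres (sketch).
import numpy as np

def within_buffer(points, sources, radius):
    """True where a point lies within `radius` (metres) of any source point."""
    d = np.linalg.norm(points[:, None, :] - sources[None, :, :], axis=2)
    return d.min(axis=1) <= radius

if __name__ == "__main__":
    rng = np.random.default_rng(5)
    voxels = rng.uniform(0, 5000, size=(1000, 3))   # candidate voxel centres
    stocks = rng.uniform(0, 5000, size=(5, 3))      # sampled granite stock points
    faults = rng.uniform(0, 5000, size=(20, 3))     # sampled fault surface points
    target = within_buffer(voxels, stocks, 600) & within_buffer(voxels, faults, 700)
    print("candidate target voxels:", int(target.sum()))
```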
Developing a multisource feedback tool for postgraduate medical educational supervisors.
Archer, Julian; Swanwick, Tim; Smith, Daniel; O'Keeffe, Catherine; Cater, Nerys
2013-01-01
Supervisors play a key role in the development of postgraduate medical trainees, both in the oversight of their day-to-day clinical practice and in the support of their learning experiences. In the UK, a clear distinction has been made between these two activities. In this article, we report on the development of a web-based multisource feedback (MSF) tool for educational supervisors in the London Deanery, an organisation responsible for 20% of the UK's doctors and dentists in training. A narrative review of the literature generated a question framework for a series of focus groups. Data were analysed using an interpretative thematic approach and the resulting instrument was piloted online. Instrument performance was analysed using a variety of tools including factor analysis, generalisability theory and analysis of performance in the first year of implementation. Two factors were initially identified. Three questions performed inadequately and were subsequently discarded. Educational supervisors scored well, generally rating themselves lower than they were rated by their trainees. The instrument was launched in July 2010, requiring five respondents to generate a summated report, with further validity evidence collated over the first year of implementation. Arising out of a robust development process, the London Deanery MSF instrument for educational supervisors is a tool that demonstrates considerable evidence of validity and can provide supervisors with useful evidence of their effectiveness.
Test readiness assessment summary for Integrated Dynamic Transit Operations (IDTO).
DOT National Transportation Integrated Search
2012-10-01
In support of USDOT's Intelligent Transportation Systems (ITS) Mobility Program, the Dynamic Mobility Applications (DMA) program seeks to create applications that fully leverage frequently collected and rapidly disseminated multi-source data gat...
Olesen, Alexander Neergaard; Christensen, Julie A E; Sorensen, Helge B D; Jennum, Poul J
2016-08-01
Reducing the number of recording modalities for sleep staging research can benefit both researchers and patients, provided that the reduced set-up gives results as accurate as conventional systems. This paper investigates the possibility of exploiting the multisource nature of electrooculography (EOG) signals by presenting a method for automatic sleep staging that uses the complete ensemble empirical mode decomposition with adaptive noise algorithm and a random forest classifier. It achieves a high overall accuracy of 82% and a Cohen's kappa of 0.74, indicating substantial agreement between automatic and manual scoring.
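A minimal sketch of the classification and scoring stage only (random forest, accuracy, Cohen's kappa); the CEEMDAN feature extraction from EOG is not reproduced, and the per-epoch features and labels below are synthetic stand-ins.

```python
# Random forest sleep-stage classification on stand-in features (sketch).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, cohen_kappa_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(6)
X = rng.normal(size=(2000, 20))                               # per-epoch features
y = (X[:, :5].sum(axis=1) > 0).astype(int) + (X[:, 5:10].sum(axis=1) > 1)  # 3 classes
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print(f"accuracy = {accuracy_score(y_te, pred):.2f}, "
      f"kappa = {cohen_kappa_score(y_te, pred):.2f}")
```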
Content-based image exploitation for situational awareness
NASA Astrophysics Data System (ADS)
Gains, David
2008-04-01
Image exploitation is of increasing importance to the enterprise of building situational awareness from multi-source data. It involves image acquisition, identification of objects of interest in imagery, storage, search and retrieval of imagery, and the distribution of imagery over possibly bandwidth limited networks. This paper describes an image exploitation application that uses image content alone to detect objects of interest, and that automatically establishes and preserves spatial and temporal relationships between images, cameras and objects. The application features an intuitive user interface that exposes all images and information generated by the system to an operator thus facilitating the formation of situational awareness.
Emke, Amanda R; Cheng, Steven; Chen, Ling; Tian, Dajun; Dufault, Carolyn
2017-01-01
Phenomenon: Professionalism is integral to the role of the physician. Most professionalism assessments in medical training are delayed until clinical rotations where multisource feedback is available. This leaves a gap in student assessment portfolios and potentially delays professional development. A total of 246 second-year medical students (2013-2015) completed self- and peer assessments of professional behaviors in 2 courses following a series of Team-Based Learning exercises. Correlation and regression analyses were used to examine the alignment or misalignment in the relationship between the 2 types of assessments. Four subgroups were formed based on observed patterns of initial self- and peer assessment alignment or misalignment, and subgroup membership stability over time was assessed. A missing data analysis examined differences between average peer assessment scores as a function of selective nonparticipation. Spearman correlation demonstrated moderate to strong correlation between self-assessments completed alone (no simultaneous peer assessment) and self-assessments completed at the time of peer assessments (ρ = .59, p < .0001) but weak correlation between the two self-assessments and peer assessments (alone: ρ = .13, p < .013; at time of peer: ρ = .21, p < .0001). Generalized estimating equation models revealed that self-assessments done alone (p < .0001) were a significant predictor of self-assessments done at the time of peer. Course was also a significant predictor (p = .01) of self-assessment scores done at the time of peer. Peer assessment score was not a significant predictor. Bhapkar's test revealed subgroup membership based on the relationship between self- and peer ratings was relatively stable across Time 1 and Time 2 assessments (χ 2 = 0.83, p = .84) for all but one subgroup; members of the subgroup with initially high self-assessment and low peer assessment were significantly more likely to move to a new classification at the second measurement. A missing data analysis revealed that students who completed all self-assessments had significantly higher average peer assessment ratings compared to students who completed one or no self-assessments with a difference of -0.32, 95% confidence interval [-0.48, -0.15]. Insights: Multiple measurements of simultaneous self- and peer assessment identified a subgroup of students who consistently rated themselves higher on professionalism attributes relative to the low ratings given by their peers. This subgroup of preclinical students, along with those who elected to not complete self-assessments, may be at risk for professionalism concerns. Use of this multisource feedback tool to measure perceptual stability of professionalism behaviors is a new approach that may assist with early identification of at-risk students during preclinical years.
Optimal rotated staggered-grid finite-difference schemes for elastic wave modeling in TTI media
NASA Astrophysics Data System (ADS)
Yang, Lei; Yan, Hongyong; Liu, Hong
2015-11-01
The rotated staggered-grid finite-difference (RSFD) method is an effective approach for numerical modeling of wavefield characteristics in tilted transversely isotropic (TTI) media. However, it suffers from serious numerical dispersion, which directly affects the modeling accuracy. In this paper, we propose two different optimal RSFD schemes, based on the sampling approximation (SA) method and the least-squares (LS) method respectively, to overcome this problem. We first briefly introduce the RSFD theory, based on which we derive the SA-based RSFD scheme and the LS-based RSFD scheme. Then different forms of analysis are used to compare the SA-based and LS-based RSFD schemes with the conventional RSFD scheme, which is based on the Taylor-series expansion (TE) method. The numerical accuracy analysis verifies the greater accuracy of the two proposed optimal schemes and indicates that they can effectively widen the wavenumber range with high accuracy compared with the TE-based RSFD scheme. Further comparisons between the two optimal schemes show that at small wavenumbers the SA-based RSFD scheme performs better, while at large wavenumbers the LS-based RSFD scheme leads to a smaller error. Finally, the modeling results demonstrate that, for the same operator length, the SA-based and LS-based RSFD schemes achieve greater accuracy than the TE-based RSFD scheme, while for the same accuracy the optimal schemes can adopt shorter difference operators to save computing time.
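A minimal sketch of the LS flavour of the idea on a plain (non-rotated) staggered grid: fit the finite-difference coefficients by least squares over a target wavenumber band instead of by Taylor expansion. The objective and the 1D staggered operator are simplifying assumptions; the paper's RSFD geometry and exact cost function are not reproduced.

```python
# Least-squares fit of 1D staggered-grid first-derivative coefficients (sketch).
import numpy as np

def ls_staggered_coeffs(order=4, kmax=2.5, h=1.0, nk=400):
    """Minimise | (2/h) * sum_m c_m * sin((2m-1)kh/2) - k | over 0 < kh <= kmax."""
    m = np.arange(1, order // 2 + 1)
    k = np.linspace(1e-3, kmax / h, nk)
    A = 2.0 * np.sin(np.outer(k, (2 * m - 1)) * h / 2.0) / h
    c, *_ = np.linalg.lstsq(A, k, rcond=None)
    return c

def taylor_staggered_coeffs_4th():
    """Classical 4th-order Taylor weights for the same stencil, for comparison."""
    return np.array([9.0 / 8.0, -1.0 / 24.0])

if __name__ == "__main__":
    print("LS coefficients:    ", np.round(ls_staggered_coeffs(), 5))
    print("Taylor coefficients:", np.round(taylor_staggered_coeffs_4th(), 5))
```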
Are multisource levothyroxine sodium tablets marketed in Egypt interchangeable?
Abou-Taleb, Basant A; Bondok, Maha; Nounou, Mohamed Ismail; Khalafallah, Nawal; Khalil, Saleh
2018-02-01
A clinical study was initiated in response to patients' complaints, supported by the treating physicians, of suspected differences in efficacy among multisource levothyroxine sodium tablets marketed in Egypt. The study used a multiple-dose design (one 100 μg levothyroxine sodium tablet once daily for 6 months) and involved 50 female patients with primary hypothyroidism (5 equal groups). The tablets administered comprised five tablet batches (two brands, three origin locations) purchased from local pharmacies in Alexandria. Assessment parameters (measured on consecutive visits) included thyroid stimulating hormone and total and free levothyroxine. The tablet dissolution rate was determined (BP/EP 2014 & USP 2014), and in vitro-in vivo correlations were developed. The clinical and pharmaceutical data confirmed inter-brand and inter-source differences in efficacy. The correlations examined indicated the potential usefulness of the in vitro dissolution test for detecting poorly performing levothyroxine sodium tablets during shelf life. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Multisource drug policies in Latin America: survey of 10 countries.
Homedes, Núria; Ugalde, Antonio
2005-01-01
Essential drug lists and generic drug policies have been promoted as strategies to improve access to pharmaceuticals and control their rapidly escalating costs. This article reports the results of a preliminary survey conducted in 10 Latin American countries. The study aimed to document the experiences of different countries in defining and implementing generic drug policies, determine the cost of registering different types of pharmaceutical products and the time needed to register them, and uncover the incentives governments have developed to promote the use of multisource drugs. The survey instrument was administered in person in Chile, Ecuador and Peru and by email in Argentina, Brazil, Bolivia, Colombia, Costa Rica, Nicaragua and Uruguay. There was a total of 22 respondents. Survey responses indicated that countries use the terms generic and bioequivalence differently. We suggest there is a need to harmonize definitions and technical concepts. PMID:15682251
Effective Coping With Supervisor Conflict Depends on Control: Implications for Work Strains.
Eatough, Erin M; Chang, Chu-Hsiang
2018-01-11
This study examined the interactive effects of interpersonal conflict at work, coping strategy, and perceived control specific to the conflict on employee work strain using multisource and time-lagged data across two samples. In Sample 1, multisource data was collected from 438 employees as well as data from participant-identified secondary sources (e.g., significant others, best friends). In Sample 2, time-lagged data from 100 full-time employees was collected in a constructive replication. Overall, findings suggested that the success of coping efforts as indicated by lower strains hinges on the combination of the severity of the stressor, perceived control over the stressor, and coping strategy used (problem-focused vs. emotion-focused coping). Results from the current study provide insights for why previous efforts to document the moderating effects of coping have been inconsistent, especially with regards to emotion-focused coping. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
NASA Astrophysics Data System (ADS)
López de Ipiña, JM; Vaquero, C.; Gutierrez-Cañas, C.
2017-06-01
A progressive increase is expected in the number of industrial processes that manufacture intermediate (iNEPs) and end products incorporating ENMs (eNEPs) to bring about improved properties. The assessment of occupational exposure to airborne NOAA will therefore migrate from the simple and well-controlled exposure scenarios found in research laboratories and ENM production plants using innovative production technologies to much more complex exposure scenarios located around eNEP manufacturing processes that, in many cases, will be modified conventional production processes. This paper discusses some of the typical challenging situations in the risk assessment of inhalation exposure to NOAA in Multi-Source Industrial Scenarios (MSIS), based on the lessons learned when confronting those scenarios in the framework of several European and Spanish research projects.
Nanomaterial-based x-ray sources
NASA Astrophysics Data System (ADS)
Cole, Matthew T.; Parmee, R. J.; Milne, William I.
2016-02-01
Following the recent global excitement and investment in the emerging, and rapidly growing, classes of one and two-dimensional nanomaterials, we here present a perspective on one of the viable applications of such materials: field electron emission based x-ray sources. These devices, which have a notable history in medicine, security, industry and research, to date have almost exclusively incorporated thermionic electron sources. Since the middle of the last century, field emission based cathodes were demonstrated, but it is only recently that they have become practicable. We outline some of the technological achievements of the past two decades, and describe a number of the seminal contributions. We explore the foremost market hurdles hindering their roll-out and broader industrial adoption and summarise the recent progress in miniaturised, pulsed and multi-source devices.
Integrating multisource land use and land cover data
Wright, Bruce E.; Tait, Mike; Lins, K.F.; Crawford, J.S.; Benjamin, S.P.; Brown, Jesslyn F.
1995-01-01
As part of the U.S. Geological Survey's (USGS) land use and land cover (LULC) program, the USGS in cooperation with the Environmental Systems Research Institute (ESRI) is collecting and integrating LULC data for a standard USGS 1:100,000-scale product. The LULC data collection techniques include interpreting spectrally clustered Landsat Thematic Mapper (TM) images; interpreting 1-meter resolution digital panchromatic orthophoto images; and, for comparison, aggregating locally available large-scale digital data of urban areas. The area selected is the Vancouver, WA-OR quadrangle, which has a mix of urban, rural agriculture, and forest land. Anticipated products include an integrated LULC prototype data set in a standard classification scheme referenced to the USGS digital line graph (DLG) data of the area and prototype software to develop digital LULC data sets. This project will evaluate a draft standard LULC classification system developed by the USGS for use with various source material and collection techniques. Federal, State, and local governments, and private sector groups will have an opportunity to evaluate the resulting prototype software and data sets and to provide recommendations. It is anticipated that this joint research endeavor will increase future collaboration among interested organizations, public and private, for LULC data collection using common standards and tools.
NASA Astrophysics Data System (ADS)
Anton, S. R.; Taylor, S. G.; Raby, E. Y.; Farinholt, K. M.
2013-03-01
With a global interest in the development of clean, renewable energy, wind energy has seen steady growth over the past several years. Advances in wind turbine technology bring larger, more complex turbines and wind farms. An important issue in the development of these complex systems is the ability to monitor the state of each turbine in an effort to improve the efficiency and power generation. Wireless sensor nodes can be used to interrogate the current state and health of wind turbine structures; however, a drawback of most current wireless sensor technology is their reliance on batteries for power. Energy harvesting solutions present the ability to create autonomous power sources for small, low-power electronics through the scavenging of ambient energy; however, most conventional energy harvesting systems employ a single mode of energy conversion, and thus are highly susceptible to variations in the ambient energy. In this work, a multi-source energy harvesting system is developed to power embedded electronics for wind turbine applications in which energy can be scavenged simultaneously from several ambient energy sources. Field testing is performed on a full-size, residential scale wind turbine where both vibration and solar energy harvesting systems are utilized to power wireless sensing systems. Two wireless sensors are investigated, including the wireless impedance device (WID) sensor node, developed at Los Alamos National Laboratory (LANL), and an ultra-low power RF system-on-chip board that is the basis for an embedded wireless accelerometer node currently under development at LANL. Results indicate the ability of the multi-source harvester to successfully power both sensors.
van der Meulen, Mirja W; Boerebach, Benjamin C M; Smirnova, Alina; Heeneman, Sylvia; Oude Egbrink, Mirjam G A; van der Vleuten, Cees P M; Arah, Onyebuchi A; Lombarts, Kiki M J M H
2017-01-01
Multisource feedback (MSF) instruments are used to provide reliable and valid data on physicians' performance from multiple perspectives, and must do so feasibly. The "INviting Co-workers to Evaluate Physicians Tool" (INCEPT) is a multisource feedback instrument used to evaluate physicians' professional performance as perceived by peers, residents, and coworkers. In this study, we report on the validity, reliability, and feasibility of the INCEPT. The performance of 218 physicians was assessed by 597 peers, 344 residents, and 822 coworkers. Using explorative and confirmatory factor analyses, multilevel regression analyses between narrative and numerical feedback, item-total correlations, interscale correlations, Cronbach's α and generalizability analyses, the psychometric qualities and feasibility of the INCEPT were investigated. For all respondent groups, three factors were identified, although constructed slightly differently: "professional attitude," "patient-centeredness," and "organization and (self)-management." Internal consistency was high for all constructs (Cronbach's α ≥ 0.84 and item-total correlations ≥ 0.52). Confirmatory factor analyses indicated acceptable to good fit. Further validity evidence was given by the associations between narrative and numerical feedback. For reliable total INCEPT scores, three peer, two resident and three coworker evaluations were needed; for subscale scores, evaluations of three peers, three residents and three to four coworkers were sufficient. The INCEPT instrument provides physicians with performance feedback in a valid and reliable way. The number of evaluations needed to establish reliable scores is achievable in a regular clinical department. When interpreting feedback, physicians should consider that respondent groups' perceptions differ, as indicated by the different item clustering per performance factor.
Senay, Gabriel B.; Velpuri, Naga Manohar; Alemu, Henok; Pervez, Shahriar Md; Asante, Kwabena O; Karuki, Gatarwa; Taa, Asefa; Angerer, Jay
2013-01-01
Timely information on the availability of water and forage is important for the sustainable development of pastoral regions. The lack of such information increases the dependence of pastoral communities on perennial sources, which often leads to competition and conflicts. The provision of timely information is a challenging task, especially due to the scarcity or non-existence of conventional station-based hydrometeorological networks in the remote pastoral regions. A multi-source water balance modelling approach driven by satellite data was used to operationally monitor daily water level fluctuations across the pastoral regions of northern Kenya and southern Ethiopia. Advanced Spaceborne Thermal Emission and Reflection Radiometer data were used for mapping and estimating the surface area of the waterholes. Satellite-based rainfall, modelled run-off and evapotranspiration data were used to model daily water level fluctuations. Mapping of waterholes was achieved with 97% accuracy. Validation of modelled water levels with field-installed gauge data demonstrated the ability of the model to capture the seasonal patterns and variations. Validation results indicate that the model explained 60% of the observed variability in water levels, with an average root-mean-squared error of 22%. Up-to-date information on rainfall, evaporation, scaled water depth and condition of the waterholes is made available daily in near-real time via the Internet (http://watermon.tamu.edu). Such information can be used by non-governmental organizations, governmental organizations and other stakeholders for early warning and decision making. This study demonstrated an integrated approach for establishing an operational waterhole monitoring system using multi-source satellite data and hydrologic modelling.
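A minimal sketch of the daily water-balance bookkeeping that drives such a waterhole monitor: gains from rainfall and run-off, losses to evapotranspiration, and a simple proportional seepage term. The seepage fraction, the variable names and all values are illustrative assumptions, not the operational model behind the system described above.

```python
# One-day update of a scaled waterhole depth from satellite-derived inputs (sketch).
def update_depth(depth_m, rain_m, runoff_m, et_m, seepage_frac=0.002):
    """Advance scaled water depth by one day (all inputs in metres of water)."""
    depth_m += rain_m + runoff_m - et_m      # gains minus open-water losses
    depth_m -= seepage_frac * depth_m        # assumed proportional seepage loss
    return max(depth_m, 0.0)                 # depth cannot go negative

if __name__ == "__main__":
    depth = 1.2
    daily = [(0.004, 0.010, 0.006), (0.0, 0.0, 0.007), (0.020, 0.035, 0.005)]
    for rain, runoff, et in daily:
        depth = update_depth(depth, rain, runoff, et)
    print(f"scaled water depth after 3 days: {depth:.3f} m")
```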
Crossley, James G M
2015-01-01
Nurse appraisal is well established in the Western world because of its obvious educational advantages. Appraisal works best with many sources of information on performance. Multisource feedback (MSF) is widely used in business and in other clinical disciplines to provide such information. It has also been incorporated into nursing appraisals, but, so far, none of the instruments in use for nurses has been validated. We set out to develop an instrument aligned with the UK Knowledge and Skills Framework (KSF) and to evaluate its reliability and feasibility across a wide hospital-based nursing population. The KSF framework provided a content template. Focus groups developed an instrument based on consensus. The instrument was administered to all the nursing staff in 2 large NHS hospitals forming a single trust in London, England. We used generalizability analysis to estimate reliability, response rates and unstructured interviews to evaluate feasibility, and factor structure and correlation studies to evaluate validity. On a voluntary basis the response rate was moderate (60%). A failure to engage with information technology and employment-related concerns were commonly cited as reasons for not responding. In this population, 11 responses provided a profile with sufficient reliability to inform appraisal (G = 0.7). Performance on the instrument was closely and significantly correlated with performance on a KSF questionnaire. This is the first contemporary psychometric evaluation of an MSF instrument for nurses. MSF appears to be as valid and reliable as an assessment method to inform appraisal in nurses as it is in other health professional groups. © 2015 The Alliance for Continuing Education in the Health Professions, the Society for Academic Continuing Medical Education, and the Council on Continuing Medical Education, Association for Hospital Medical Education.
Scalable Metadata Management for a Large Multi-Source Seismic Data Repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaylord, J. M.; Dodge, D. A.; Magana-Zook, S. A.
In this work, we implemented the key metadata management components of a scalable seismic data ingestion framework to address limitations in our existing system, and to position it for anticipated growth in volume and complexity.
A Multi-Scale Settlement Matching Algorithm Based on ARG
NASA Astrophysics Data System (ADS)
Yue, Han; Zhu, Xinyan; Chen, Di; Liu, Lingjia
2016-06-01
Homonymous entity matching is an important part of multi-source spatial data integration, automatic updating and change detection. Considering the low accuracy of existing methods in matching multi-scale settlement data, an algorithm based on the Attributed Relational Graph (ARG) is proposed. The algorithm first divides two settlement scenes at different scales into blocks using the small-scale road network and constructs local ARGs in each block. It then ascertains candidate sets through merging procedures and obtains the optimal matching pairs by iteratively comparing the similarity of the ARGs. Finally, the corresponding relations between settlements at large and small scales are identified. A demonstration at the end of this article indicates that the proposed algorithm is capable of handling sophisticated cases.
Shadow detection of moving objects based on multisource information in Internet of things
NASA Astrophysics Data System (ADS)
Ma, Zhen; Zhang, De-gan; Chen, Jie; Hou, Yue-xian
2017-05-01
Moving object detection is an important part of intelligent video surveillance within the Internet of things, and detecting a moving target's shadow is an important step in moving object detection: the accuracy of shadow detection directly affects the object detection results. From a review of existing shadow detection methods, we find that using only one feature cannot produce accurate detection results. We therefore present a new method for shadow detection that combines colour information, optical invariance and texture features. Through comprehensive analysis of the detection results from these three kinds of information, shadows are effectively determined. By combining the advantages of the individual methods, the approach achieves good results in experiments.
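A minimal sketch of the multi-cue idea: a pixel is labelled shadow only when a majority of independent cues agree. The three individual detectors are stand-ins (boolean masks), and the majority-vote rule is an assumption about how the cues are combined, not the paper's exact fusion.

```python
# Majority-vote fusion of per-cue shadow masks (sketch).
import numpy as np

def fuse_shadow_masks(colour_mask, invariance_mask, texture_mask):
    """Label a pixel as shadow when at least two of the three cues agree."""
    votes = (colour_mask.astype(int) + invariance_mask.astype(int)
             + texture_mask.astype(int))
    return votes >= 2

if __name__ == "__main__":
    rng = np.random.default_rng(4)
    masks = [rng.random((4, 4)) > 0.5 for _ in range(3)]   # stand-in cue outputs
    print(fuse_shadow_masks(*masks).astype(int))
```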
Lakshminarayana, Indumathy; Wall, David; Bindal, Taruna; Goodyear, Helen M
2015-05-01
Leading a ward round is an essential skill for hospital consultants and senior trainees but is rarely assessed during training. To investigate the key attributes for ward round leadership and to use these results to develop a multisource feedback (MSF) tool to assess the ward round leadership skills of senior specialist trainees. A panel of experts comprising four senior paediatric consultants and two nurse managers were interviewed from May to August 2009. From analysis of the interview transcripts, 10 key themes emerged. A structured questionnaire based on the key themes was designed and sent electronically to paediatric consultants, nurses and trainees at a large university hospital (June-October 2010). 81 consultants, nurses and trainees responded to the survey. The internal consistency of this tool was high (Cronbach's α 0.95). Factor analysis showed that five factors accounted for 72% of variance. The five key areas for ward round leadership were communication skills, preparation and organisation, teaching and enthusiasm, team working and punctuality; communication was the most important key theme. A MSF tool for ward round leadership skills was developed with these areas as five domains. We believe that this tool will add to the current assessment tools available by providing feedback about ward round leadership skills. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
Shepherd, Annabel; Lough, Murray
2010-05-01
Although multi-source feedback (MSF) has been used in primary healthcare, the development of an MSF instrument specific to this setting in the UK has not been previously described. The aims of this study were to develop and evaluate an MSF instrument for GPs in Scotland taking part in appraisal. The members of ten primary healthcare teams in the west of Scotland were asked to provide comments in answer to the question, 'What is a good GP?'. The data were reduced and coded by two researchers and questions were devised. Following content validity testing the MSF process was evaluated with volunteers using face-to-face interviews and a postal survey. Thirty-seven statements covering the six domains of communication skills, professional values, clinical care, working with colleagues, personality issues and duties and responsibilities were accepted as relevant by ten primary healthcare teams using a standard of 80 percent agreement. The evaluation found the MSF process to be feasible and acceptable and participants provided some evidence of educational impact. An MSF instrument for GPs has been developed based on the concept of 'the good GP' as described by the primary healthcare team. The evaluation of the resultant MSF process illustrates the potential of MSF, when delivered in the supportive environment of GP appraisal, to provide feedback which has the possibility of improving working relationships between GPs and their colleagues.
Terrestrial laser scanning in monitoring of anthropogenic objects
NASA Astrophysics Data System (ADS)
Zaczek-Peplinska, Janina; Kowalska, Maria
2017-12-01
The registered xyz coordinates in the form of a point cloud captured by terrestrial laser scanner and the intensity values (I) assigned to them make it possible to perform geometric and spectral analyses. Comparison of point clouds registered in different time periods requires conversion of the data to a common coordinate system and proper data selection is necessary. Factors like point distribution dependant on the distance between the scanner and the surveyed surface, angle of incidence, tasked scan's density and intensity value have to be taken into consideration. A prerequisite for running a correct analysis of the obtained point clouds registered during periodic measurements using a laser scanner is the ability to determine the quality and accuracy of the analysed data. The article presents a concept of spectral data adjustment based on geometric analysis of a surface as well as examples of geometric analyses integrating geometric and physical data in one cloud of points: cloud point coordinates, recorded intensity values, and thermal images of an object. The experiments described here show multiple possibilities of usage of terrestrial laser scanning data and display the necessity of using multi-aspect and multi-source analyses in anthropogenic object monitoring. The article presents examples of multisource data analyses with regard to Intensity value correction due to the beam's incidence angle. The measurements were performed using a Leica Nova MS50 scanning total station, Z+F Imager 5010 scanner and the integrated Z+F T-Cam thermal camera.
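A minimal sketch of an incidence-angle correction of laser return intensity using a simple Lambertian cosine model; this is a common first-order correction and an assumption here, the paper's exact adjustment based on its geometric surface analysis may differ.

```python
# Cosine (Lambertian) correction of scan intensity for incidence angle (sketch).
import numpy as np

def correct_intensity(intensity, incidence_deg, max_angle_deg=80.0):
    """Scale raw intensity by 1/cos(angle); very grazing returns are masked out."""
    angle = np.radians(np.asarray(incidence_deg, float))
    corrected = np.asarray(intensity, float) / np.cos(angle)
    corrected[np.degrees(angle) > max_angle_deg] = np.nan   # unreliable returns
    return corrected

if __name__ == "__main__":
    raw = np.array([0.62, 0.48, 0.30, 0.12])
    angles = np.array([10.0, 35.0, 60.0, 85.0])
    print(np.round(correct_intensity(raw, angles), 3))
```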
Detecting misinformation and knowledge conflicts in relational data
NASA Astrophysics Data System (ADS)
Levchuk, Georgiy; Jackobsen, Matthew; Riordan, Brian
2014-06-01
Information fusion is required for many mission-critical intelligence analysis tasks. Using knowledge extracted from various sources, including entities, relations, and events, intelligence analysts respond to commander's information requests, integrate facts into summaries about current situations, augment existing knowledge with inferred information, make predictions about the future, and develop action plans. However, information fusion solutions often fail because of conflicting and redundant knowledge contained in multiple sources. Most knowledge conflicts in the past were due to translation errors and reporter bias, and thus could be managed. Current and future intelligence analysis, especially in denied areas, must deal with open source data processing, where there is much greater presence of intentional misinformation. In this paper, we describe a model for detecting conflicts in multi-source textual knowledge. Our model is based on constructing semantic graphs representing patterns of multi-source knowledge conflicts and anomalies, and detecting these conflicts by matching pattern graphs against the data graph constructed using soft co-reference between entities and events in multiple sources. The conflict detection process maintains the uncertainty throughout all phases, providing full traceability and enabling incremental updates of the detection results as new knowledge or modification to previously analyzed information are obtained. Detected conflicts are presented to analysts for further investigation. In the experimental study with SYNCOIN dataset, our algorithms achieved perfect conflict detection in ideal situation (no missing data) while producing 82% recall and 90% precision in realistic noise situation (15% of missing attributes).
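A minimal sketch of how the recall and precision figures quoted above would be computed from a set of detected conflicts against a labelled ground truth; the conflict identifiers are dummies.

```python
# Precision and recall of detected knowledge conflicts (sketch).
def precision_recall(detected, ground_truth):
    detected, ground_truth = set(detected), set(ground_truth)
    true_pos = len(detected & ground_truth)
    precision = true_pos / len(detected) if detected else 0.0
    recall = true_pos / len(ground_truth) if ground_truth else 0.0
    return precision, recall

if __name__ == "__main__":
    p, r = precision_recall(detected={"c1", "c2", "c4"},
                            ground_truth={"c1", "c2", "c3"})
    print(f"precision = {p:.2f}, recall = {r:.2f}")
```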
DOT National Transportation Integrated Search
2012-03-01
In support of USDOT's Intelligent Transportation Systems (ITS) Mobility Program, the Dynamic Mobility Applications (DMA) program seeks to create applications that fully leverage frequently collected and rapidly disseminated multi-source data gat...
Small Scale Multisource Site – Hydrogeology Investigation
A site impacted by brackish water was evaluated using traditional hydrogeologic and geochemical site characterization techniques. No single, specific source of the brine impacted ground water was identified. However, the extent of the brine impacted ground water was found to be...
Present situation and trend of precision guidance technology and its intelligence
NASA Astrophysics Data System (ADS)
Shang, Zhengguo; Liu, Tiandong
2017-11-01
This paper first introduces the basic concepts of precision guidance technology and artificial intelligence technology. It then gives a brief introduction to intelligent precision guidance technology and, with reference to foreign projects developing intelligent weapons based on deep learning (the LRASM missile project, the TRACE project, and the BLADE project), provides an overview of the current state of foreign precision guidance technology. Finally, the future development trend of intelligent precision guidance technology is summarized; it is mainly concentrated in multiple-target handling, intelligent classification, weak target detection and recognition, intelligent jamming in complex environments, multi-source and multi-missile cooperative fighting, and other aspects.
Murphy, Douglas J; Bruce, David A; Mercer, Stewart W; Eva, Kevin W
2009-05-01
To investigate the reliability and feasibility of six potential workplace-based assessment methods in general practice training: criterion audit, multi-source feedback from clinical and non-clinical colleagues, patient feedback (the CARE Measure), referral letters, significant event analysis, and video analysis of consultations. Performance of GP registrars (trainees) was evaluated with each tool to assess the reliabilities of the tools and feasibility, given raters and number of assessments needed. Participant experience of process determined by questionnaire. 171 GP registrars and their trainers, drawn from nine deaneries (representing all four countries in the UK), participated. The ability of each tool to differentiate between doctors (reliability) was assessed using generalisability theory. Decision studies were then conducted to determine the number of observations required to achieve an acceptably high reliability for "high-stakes assessment" using each instrument. Finally, descriptive statistics were used to summarise participants' ratings of their experience using these tools. Multi-source feedback from colleagues and patient feedback on consultations emerged as the two methods most likely to offer a reliable and feasible opinion of workplace performance. Reliability co-efficients of 0.8 were attainable with 41 CARE Measure patient questionnaires and six clinical and/or five non-clinical colleagues per doctor when assessed on two occasions. For the other four methods tested, 10 or more assessors were required per doctor in order to achieve a reliable assessment, making the feasibility of their use in high-stakes assessment extremely low. Participant feedback did not raise any major concerns regarding the acceptability, feasibility, or educational impact of the tools. The combination of patient and colleague views of doctors' performance, coupled with reliable competence measures, may offer a suitable evidence-base on which to monitor progress and completion of doctors' training in general practice.
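A rough illustration of the decision-study style question behind these results: how many observations are needed to reach a reliability of 0.8 given the reliability of a single observation. The Spearman-Brown prophecy formula used here is a simpler stand-in for the generalisability-theory decision studies reported in the paper, and the single-observation reliabilities are illustrative.

```python
# Observations needed for a target reliability (Spearman-Brown prophecy, sketch).
def raters_needed(single_observation_reliability, target=0.8):
    r = single_observation_reliability
    return target * (1 - r) / (r * (1 - target))

if __name__ == "__main__":
    for r in (0.09, 0.25, 0.40):   # assumed single-observation reliabilities
        print(f"r = {r:.2f}: about {raters_needed(r):.1f} observations for 0.8")
```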
DOT National Transportation Integrated Search
2018-01-01
Connected vehicle mobility applications are commonly referred to as dynamic mobility applications (DMAs). DMAs seek to fully leverage frequently collected and rapidly disseminated multi-source data gathered from connected travelers, vehicles, and inf...
Advancing Future Network Science through Content Understanding
2014-05-01
BitTorrent, PostgreSQL, MySQL, and GRSecurity) and emerging technologies (HadoopDFS, Tokutera, Sector/Sphere, HBase, and other BigTable-like...result. • Multi-Source Network Pulse Analyzer and Correlator provides course of action planning by enhancing the understanding of the complex dynamics
DOT National Transportation Integrated Search
2012-08-01
In support of USDOT's Intelligent Transportation Systems (ITS) Mobility Program, the Dynamic Mobility Applications (DMA) program seeks to create applications that fully leverage frequently collected and rapidly disseminated multi-source data gat...
DOT National Transportation Integrated Search
2011-11-01
In support of USDOT's Intelligent Transportation Systems (ITS) Mobility Program, the Dynamic Mobility Applications (DMA) program seeks to create applications that fully leverage frequently collected and rapidly disseminated multi-source data gat...
Velpuri, N.M.; Senay, G.B.; Asante, K.O.
2011-01-01
Managing limited surface water resources is a great challenge in areas where ground-based data are either limited or unavailable. Direct or indirect measurements of surface water resources through remote sensing offer several advantages for monitoring ungauged basins. A physically based hydrologic technique to monitor lake water levels in ungauged basins using multi-source satellite data such as satellite-based rainfall estimates, modelled runoff, evapotranspiration, a digital elevation model, and other data is presented. This approach is applied to model Lake Turkana water levels from 1998 to 2009. Modelling results showed that the model can reasonably capture all the patterns and seasonal variations of the lake water level fluctuations. A composite lake level product of TOPEX/Poseidon, Jason-1, and ENVISAT satellite altimetry data is used for model calibration (1998-2000) and model validation (2001-2009). Validation results showed that model-based lake levels are in good agreement with observed satellite altimetry data. Compared with the satellite altimetry data, the Pearson's correlation coefficient was found to be 0.81 during the validation period. The model efficiency estimated using NSCE is found to be 0.93, 0.55 and 0.66 for the calibration, validation and combined periods, respectively. Further, the model-based estimates showed a root mean square error of 0.62 m and a mean absolute error of 0.46 m, with a positive mean bias error of 0.36 m, for the validation period (2001-2009). These error estimates were found to be less than 15% of the natural variability of the lake, thus giving high confidence in the modelled lake level estimates. The approach presented in this paper can be used to (a) simulate patterns of lake water level variations in data-scarce regions, (b) operationally monitor lake water levels in ungauged basins, (c) derive historical lake level information using satellite rainfall and evapotranspiration data, and (d) augment the information provided by satellite altimetry systems on changes in lake water levels. © Author(s) 2011.
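A minimal sketch of the validation statistics quoted for the modelled lake levels (Nash-Sutcliffe efficiency, RMSE, MAE, mean bias, correlation), applied to dummy altimetry and model series rather than the study's data.

```python
# Standard hydrologic validation statistics for modelled vs observed levels (sketch).
import numpy as np

def validation_stats(observed, modelled):
    observed, modelled = np.asarray(observed, float), np.asarray(modelled, float)
    err = modelled - observed
    nsce = 1.0 - np.sum(err**2) / np.sum((observed - observed.mean())**2)
    return {
        "NSCE": nsce,
        "RMSE": np.sqrt(np.mean(err**2)),
        "MAE": np.mean(np.abs(err)),
        "bias": np.mean(err),
        "r": np.corrcoef(observed, modelled)[0, 1],
    }

if __name__ == "__main__":
    rng = np.random.default_rng(3)
    obs = 3.0 + np.cumsum(rng.normal(0, 0.05, 365))   # synthetic altimetry levels (m)
    mod = obs + rng.normal(0.3, 0.5, 365)             # model with bias and noise
    print({k: round(float(v), 3) for k, v in validation_stats(obs, mod).items()})
```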
NASA Astrophysics Data System (ADS)
Gao, M.; Huang, S. T.; Wang, P.; Zhao, Y. A.; Wang, H. B.
2016-11-01
The geological disposal of high-level radioactive waste (hereinafter "geological disposal") is a long-term, complex, and systematic scientific project. The data and information resources produced in its research and development (R&D) process provide significant support for the R&D of the geological disposal system and lay a foundation for the long-term stability and safety assessment of the repository site. However, the data related to research and engineering in the siting of geological disposal repositories are complex (multi-source, multi-dimensional and changeable), and the requirements for data accuracy and comprehensive application have become much higher than before, so the data model design of a geo-information database for the disposal repository faces serious challenges. In this paper, the data resources of the pre-selected areas of the repository are comprehensively reviewed and systematically analyzed. Based on a thorough understanding of the application requirements, the research provides a solution for the key technical problems, including a reasonable classification system for multi-source data entities, complex logical relations and effective physical storage structures. The new solution breaks through the data classification and conventional spatial data organization models applied in the traditional industry and realizes data organization and integration with data entities and spatial relationships as the basic units, which are independent, complete and of significant application value in HLW geological disposal. Reasonable, feasible and flexible conceptual, logical and physical data models have been established to ensure the effective integration, and facilitate the application development, of multi-source data in the pre-selected areas for geological disposal.
Multisource, Phase-controlled Radiofrequency for Treatment of Skin Laxity
Moreno-Moraga, Javier; Muñoz, Estefania; Cornejo Navarro, Paloma
2011-01-01
Objective: The objective of this study was to analyze the correlation between degrees of clinical improvement and microscopic changes detected using confocal microscopy at the temperature gradients reached in patients treated for skin laxity with a phase-controlled, multisource radiofrequency system. Design and setting: Patients with skin laxity in the abdominal area were treated in six sessions with radiofrequency (the first 4 sessions were held at 2-week intervals and the 2 remaining sessions at 3-week intervals). Patients attended monitoring at 6, 9, and 12 months. Participants: 33 patients (all women). Measurements: The authors recorded the following: variations in weight, measurements of the contour of the treated area and control area, evaluation of clinical improvement by the clinician and by the patient, images taken using an infrared camera, temperature (before, immediately after, and 20 minutes after the procedure), and confocal microscopy images (before treatment and at 6, 9, and 12 months). The degree of clinical improvement was assessed by two external observers (clinicians). The procedure was performed using a new phase-controlled, multipolar radiofrequency system. Results: The results reveal a greater degree of clinical improvement in patients with surface temperature increases greater than 11.5°C at the end of the procedure and remaining greater than 4.5°C 20 minutes later. These changes induced by radiofrequency were compared with the structural improvements observed at the dermal-epidermal junction using confocal microscopy. Changes are more intense and are statistically correlated with patients who show a greater degree of improvement and have higher temperature gradients at the end of the procedure and 20 minutes later. Conclusion: Monitoring and the use of parameters to evaluate end-point values in skin quality treatment by multisource, phase-controlled radiofrequency can help optimize aesthetic outcome. PMID:21278896
Specialty-specific multi-source feedback: assuring validity, informing training.
Davies, Helena; Archer, Julian; Bateman, Adrian; Dewar, Sandra; Crossley, Jim; Grant, Janet; Southgate, Lesley
2008-10-01
The white paper 'Trust, Assurance and Safety: the Regulation of Health Professionals in the 21st Century' proposes a single, generic multi-source feedback (MSF) instrument in the UK. Multi-source feedback was proposed as part of the assessment programme for Year 1 specialty training in histopathology. An existing instrument was modified following blueprinting against the histopathology curriculum to establish content validity. Trainees were also assessed using an objective structured practical examination (OSPE). Factor analysis and correlation between trainees' OSPE performance and the MSF were used to explore validity. All 92 trainees participated and the assessor response rate was 93%. Reliability was acceptable with eight assessors (95% confidence interval 0.38). Factor analysis revealed two factors: 'generic' and 'histopathology'. Pearson correlation of MSF scores with OSPE performances was 0.48 (P = 0.001) and the histopathology factor correlated more highly (histopathology r = 0.54, generic r = 0.42; t = - 2.76, d.f. = 89, P < 0.01). Trainees scored least highly in relation to ability to use histopathology to solve clinical problems (mean = 4.39) and provision of good reports (mean = 4.39). Three of six doctors whose means were < 4.0 received free text comments about report writing. There were 83 forms with aggregate scores of < 4. Of these, 19.2% included comments about report writing. Specialty-specific MSF is feasible and achieves satisfactory reliability. The higher correlation of the 'histopathology' factor with the OSPE supports validity. This paper highlights the importance of validating an MSF instrument within the specialty-specific context as, in addition to assuring content validity, the PATH-SPRAT (Histopathology-Sheffield Peer Review Assessment Tool) also demonstrates the potential to inform training as part of a quality improvement model.
Adaptive Packet Combining Scheme in Three State Channel Model
NASA Astrophysics Data System (ADS)
Saring, Yang; Bulo, Yaka; Bhunia, Chandan Tilak
2018-01-01
Two popular packet-combining-based error correction schemes are the Packet Combining (PC) scheme and the Aggressive Packet Combining (APC) scheme. Each has its own merits and demerits: the PC scheme has better throughput than the APC scheme, but suffers from a higher packet error rate. Because the state of a wireless channel is random and time-varying, applying the SR ARQ scheme, the PC scheme or the APC scheme individually cannot deliver the desired level of throughput; better throughput can be achieved if the appropriate transmission scheme is selected according to the channel condition. Based on this approach, an adaptive packet combining scheme is proposed that adapts to the channel condition, carrying out transmission using the PC scheme, the APC scheme or the SR ARQ scheme as appropriate. Experimentally, the error correction capability and throughput of the proposed scheme were observed to be significantly better than those of the SR ARQ, PC and APC schemes.
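A minimal sketch of the adaptive idea: pick SR ARQ, PC or APC per transmission from an estimate of the current channel quality. The bit-error-rate thresholds are illustrative assumptions, not the paper's three-state channel model or its switching rule.

```python
# Channel-state driven selection of retransmission/combining scheme (sketch).
def choose_scheme(estimated_bit_error_rate):
    """Map the current channel quality to a transmission scheme."""
    if estimated_bit_error_rate < 1e-4:
        return "SR-ARQ"   # good channel: plain selective-repeat suffices
    if estimated_bit_error_rate < 1e-2:
        return "PC"       # moderate channel: combine two received copies
    return "APC"          # bad channel: aggressive bit-level combining

if __name__ == "__main__":
    for ber in (1e-5, 3e-3, 5e-2):
        print(f"BER {ber:.0e} -> {choose_scheme(ber)}")
```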
Shemer, Avner; Levy, Hanna; Sadick, Neil S; Harth, Yoram; Dorizas, Andrew S
2014-11-01
In the last decade, energy-based aesthetic treatments, using light, radiofrequency (RF), and ultrasound, have gained scientific acceptance as safe and efficacious for non-invasive treatment for aesthetic skin disorders. The phase-controlled multisource radiofrequency technology (3DEEP™), which is based on the simultaneous use of multiple RF generators, was proven to allow significant pigment-independent dermal heating without pain or the need of epidermal cooling. This study was performed in order to evaluate the efficacy and safety of a new handheld device delivering multisource radiofrequency to the skin for wrinkle reduction and skin tightening in the home setting. A total of 69 participants (age 54.3 years ± 8.09; age range 37-72 years) were enrolled in the study after meeting all inclusion/exclusion criteria (100%) and providing informed consent. Participants were provided with the tested device together with a user manual and treatment diary, to perform independent treatments at home for 4 weeks. The tested device, (Newa™, EndyMed Medical, Cesarea, Israel) emits 12 W of 1Mhz, RF energy through six electrodes arranged in a linear fashion. Independent control of RF polarity through each one of the 6 electrodes allows significant reduction of energy flow through the epidermis with increased dermal penetration. Participants were instructed to perform at least 5 treatments a week, for one month. Four follow-up visits were scheduled (once a week) during the period of independent treatments at home, following 4 weeks of home treatments, 1 month follow-up visit (1 month after treatment end) and at 3 months follow-up (3 months following treatment end). Analysis of pre-and post treatment images was conducted by three uninvolved physicians experienced with the Fitzpatrick Wrinkle and Elastosis Scale. Fitzpatrick Wrinkle and Elastosis score of each time point (4 weeks following home use treatments; 1 month follow-up, 3 months follow-up) was compared to baseline. Participants were asked a series of questions designed to explore usability concerns and level of satisfaction regarding the device use and subjective efficacy. Altogether, 62 subjects completed the study course and follow-up visits. No unexpected adverse effects were detected or reported throughout the independent treatment. All study participants did not experience any difficulties while operating the tested device for independent wrinkle reduction treatments. Photographic analysis of pre- and post-one month of independent home use treatments, and one and three months follow-up after end of treatment course, was conducted by three uninvolved board certified dermatologists. Analysis of results revealed improvement (downgrade of at least 1 score according to the Fitzpatrick scale) in 91.93%, 96.77%, and 98.39% of study subjects (according to the first, second, and third reviewer, respectively). Results were found to be statistically significant. The majority of study participants were very satisfied from the results of the independent treatment using the tested device for wrinkle reduction.
Runtime Simulation for Post-Disaster Data Fusion Visualization
2006-10-01
Center for Multisource Information Fusion (CMIF), The State University of New York at Buffalo, Buffalo, NY 14260 USA (kesh@eng.buffalo.edu)
ERIC Educational Resources Information Center
Frederiksen, H. Allan
In the belief that "the spread of technological development and the attendant rapidly changing environment creates the necessity for multi-source feedback systems to maximize the alternatives available in dealing with global problems," the author shows how to participate in the process of alternate video. He offers detailed information…
Multisource oil spill detection
NASA Astrophysics Data System (ADS)
Salberg, Arnt B.; Larsen, Siri O.; Zortea, Maciel
2013-10-01
In this paper we discuss how multisource data (wind, ocean-current, optical, bathymetric, automatic identification systems (AIS)) may be used to improve oil spill detection in SAR images, with emphasis on the use of automatic oil spill detection algorithms. We focus particularly on AIS, optical, and bathymetric data. For the AIS data we propose an algorithm for integrating AIS ship tracks into automatic oil spill detection in order to improve the confidence estimate of a potential oil spill. We demonstrate the use of ancillary data on a set of SAR images. Regarding the use of optical data, we did not observe a clear correspondence between high chlorophyll values (estimated from products derived from optical data) and observed slicks in the SAR image. Bathymetric data were shown to be a good data source for removing false detections caused by, e.g., sand banks at low tide. For the AIS data we observed that a polluter could be identified for some dark slicks; however, a precise oil drift model is needed in order to identify the polluter with high certainty.
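One plausible way to read the AIS integration step is as a confidence adjustment driven by the distance from a detected dark slick to the nearest AIS ship track. The sketch below illustrates that idea only; the influence radius, the 0.3 weighting, and the coordinates are invented for illustration and are not taken from the paper.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = p2 - p1
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def adjusted_confidence(base_conf, slick_pos, ais_tracks, influence_km=20.0):
    """Raise detector confidence when an AIS ship track passes near the dark slick."""
    if not ais_tracks:
        return base_conf
    d_min = min(haversine_km(*slick_pos, *p) for p in ais_tracks)
    boost = max(0.0, 1.0 - d_min / influence_km)   # 1 on top of a track, 0 beyond range
    return min(1.0, base_conf + 0.3 * boost)       # 0.3 weight is illustrative

if __name__ == "__main__":
    tracks = [(60.10, 4.52), (60.12, 4.61), (60.15, 4.70)]
    print(adjusted_confidence(0.55, (60.11, 4.55), tracks))
```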
Sadick, Neil S; Sato, Masaki; Palmisano, Diana; Frank, Ido; Cohen, Hila; Harth, Yoram
2011-10-01
Acne scars are one of the most difficult disorders to treat in dermatology. The optimal treatment system would provide minimal-downtime resurfacing for the epidermis and non-ablative deep volumetric heating for collagen remodeling in the dermis. A novel therapy system (EndyMed Ltd., Cesarea, Israel) uses phase-controlled multi-source radiofrequency (RF) to provide one-pulse microfractional resurfacing with simultaneous volumetric skin tightening. The study included 26 subjects (Fitzpatrick skin types 2-5) with moderate to severe wrinkles and 4 subjects with depressed acne scars. Treatment was repeated each month up to a total of three treatment sessions. Patients' photographs were graded according to accepted scales by two uninvolved blinded evaluators. A significant reduction in the depth of wrinkles and acne scars was noted 4 weeks after therapy, with further improvement at the 3-month follow-up. Our data show the histological impact and clinical beneficial effects of simultaneous RF fractional microablation and volumetric deep dermal heating for the treatment of wrinkles and acne scars.
Multisource feedback, human capital, and the financial performance of organizations.
Kim, Kyoung Yong; Atwater, Leanne; Patel, Pankaj C; Smither, James W
2016-11-01
We investigated the relationship between organizations' use of multisource feedback (MSF) programs and their financial performance. We proposed a moderated mediation framework in which the employees' ability and knowledge sharing mediate the relationship between MSF and organizational performance and the purpose for which MSF is used moderates the relationship of MSF with employees' ability and knowledge sharing. With a sample of 253 organizations representing 8,879 employees from 2005 to 2007 in South Korea, we found that MSF had a positive effect on organizational financial performance via employees' ability and knowledge sharing. We also found that when MSF was used for dual purpose (both administrative and developmental purposes), the relationship between MSF and knowledge sharing was stronger, and this interaction carried through to organizational financial performance. However, the purpose of MSF did not moderate the relationship between MSF and employees' ability. The theoretical relevance and practical implications of the findings are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).
Multi-source recruitment strategies for advancing addiction recovery research beyond treated samples
Subbaraman, Meenakshi Sabina; Laudet, Alexandre B.; Ritter, Lois A.; Stunz, Aina; Kaskutas, Lee Ann
2014-01-01
Background The lack of established sampling frames makes reaching individuals in recovery from substance problems difficult. Although general population studies are most generalizable, the low prevalence of individuals in recovery makes this strategy costly and inefficient. Though more efficient, treatment samples are biased. Aims To describe multi-source recruitment for capturing participants from heterogeneous pathways to recovery; assess which sources produced the most respondents within subgroups; and compare treatment and non-treatment samples to address generalizability. Results Family/friends, Craigslist, social media and non-12-step groups produced the most respondents from hard-to-reach groups, such as racial minorities and treatment-naïve individuals. Recovery organizations yielded twice as many African-Americans and more rural dwellers, while social media yielded twice as many young people as other sources. Treatment samples had proportionally fewer females and older individuals compared to non-treated samples. Conclusions Future research on recovery should utilize previously neglected recruiting strategies to maximize the representativeness of samples. PMID:26166909
NASA Astrophysics Data System (ADS)
LIU, Yiping; XU, Qing; ZHANG, Heng; LV, Liang; LU, Wanjie; WANG, Dandi
2016-11-01
The purpose of this paper is to solve the problems of the traditional single system for interpretation and draughting, such as inconsistent standards, single function, dependence on plug-ins, closed architecture and a low level of integration. On the basis of a comprehensive analysis of the composition of target elements, map representation and the features of similar systems, a 3D interpretation and draughting integrated service platform for multi-source, multi-scale and multi-resolution geospatial objects is established based on HTML5 and WebGL. The platform not only integrates object recognition, access, retrieval, three-dimensional display and test evaluation, but also achieves the collection, transfer, storage, refreshing and maintenance of data about geospatial objects, and shows promising application prospects and potential for growth.
Control of parallel manipulators using force feedback
NASA Technical Reports Server (NTRS)
Nanua, Prabjot
1994-01-01
Two control schemes are compared for parallel robotic mechanisms actuated by hydraulic cylinders. One scheme, the 'rate based scheme', uses the position and rate information only for feedback. The second scheme, the 'force based scheme' feeds back the force information also. The force control scheme is shown to improve the response over the rate control one. It is a simple constant gain control scheme better suited to parallel mechanisms. The force control scheme can be easily modified for the dynamic forces on the end effector. This paper presents the results of a computer simulation of both the rate and force control schemes. The gains in the force based scheme can be individually adjusted in all three directions, whereas the adjustment in just one direction of the rate based scheme directly affects the other two directions.
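The difference between the two feedback structures can be illustrated with a toy one-axis simulation: a constant-gain position/rate loop, and the same loop with the measured external force fed back and cancelled. The Python sketch below uses an invented plant, gains, and disturbance; it only mirrors the structure described in the abstract, not the reported computer simulation.

```python
# Toy 1-DOF comparison of the two feedback ideas in the abstract: a "rate based"
# controller using position and velocity only, and a "force based" controller that
# additionally feeds back the measured external force on the end effector.
# The plant, gains and disturbance are illustrative, not from the paper.

def simulate(use_force_feedback, t_end=2.0, dt=1e-3):
    m, b = 5.0, 2.0            # mass and damping of one actuated axis
    kp, kd, kf = 400.0, 60.0, 1.0
    x, v = 0.0, 0.0
    x_ref = 0.1                # 10 cm step command
    worst_err = 0.0
    for i in range(int(t_end / dt)):
        f_ext = 50.0 if i * dt > 1.0 else 0.0        # disturbance force after 1 s
        u = kp * (x_ref - x) - kd * v                # position/rate feedback
        if use_force_feedback:
            u -= kf * f_ext                          # cancel the measured load force
        a = (u + f_ext - b * v) / m
        v += a * dt
        x += v * dt
        if i * dt > 1.0:
            worst_err = max(worst_err, abs(x_ref - x))
    return worst_err

if __name__ == "__main__":
    print("rate-based  worst tracking error after disturbance:", simulate(False))
    print("force-based worst tracking error after disturbance:", simulate(True))
```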
NASA Astrophysics Data System (ADS)
Tormos, T.; Kosuth, P.; Souchon, Y.; Villeneuve, B.; Durrieu, S.; Chandesris, A.
2010-12-01
Preservation and restoration of river ecosystems require an improved understanding of the mechanisms through which they are influenced by the landscape at multiple spatial scales, and particularly at the river corridor scale, considering the role of riparian vegetation in regulating and protecting river ecological status and the relevance of this specific area for implementing efficient and realistic strategies. Assessing this influence correctly over large river networks involves accurate broad-scale (i.e. at least regional) information on Land Cover within Riparian Areas (LCRA). As the structure of land cover along rivers is generally not accessible using moderate-scale satellite imagery, finer spatial resolution imagery and specific mapping techniques are needed. For this purpose we developed a generic multi-scale Object Based Image Analysis (OBIA) scheme able to produce LCRA maps in different geographic contexts by exploiting information available from very high spatial resolution imagery (satellite or airborne) and/or metric to decametric spatial thematic data on a given study zone thanks to fuzzy expert knowledge classification rules. A first experimentation was carried out on the Herault river watershed (south of France), a 2650 square kilometer basin that presents a contrasted landscape (different ecoregions) and a total stream length of 1150 km, using high and very high resolution multispectral remotely sensed images (10 m Spot5 multispectral images and 0.5 m aerial photography) and existing spatial thematic data. Application of the OBIA scheme produced a detailed (22 classes) LCRA map with an overall accuracy of 89% and a Kappa index of 83% according to a land cover pressures typology (six categories). A second experimentation (using the same data sources) was carried out on a larger test zone, part of the Normandy river network (25,000 square kilometer basin; 6000 km long river network; 155 ecological stations). This second work aimed at elaborating a robust statistical eco-regional model to study links between land cover spatial indicators calculated at local and watershed scales, and river ecological status assessed with macroinvertebrate indicators. Application of the OBIA scheme produced a detailed (62 classes) LCRA map which allowed the model to highlight the influence of specific land use patterns: (i) the significant beneficial effect of a 20-m riparian tree vegetation strip near a station and a 20-m riparian grassland strip along the upstream network of a station, and (ii) the negative impact on river ecological status of urban areas and roads on the upstream flood plain of a station. The results of these two experimentations highlight (i) that the application of an OBIA scheme using multi-source spatial data provides an efficient approach for mapping and monitoring LCRA that can be implemented operationally at regional or national scale, and (ii) the interest of using LCRA maps derived from very high spatial resolution imagery (satellite or airborne) and/or metric spatial thematic data to study landscape influence on river ecological status and support managers in the definition of optimized riparian preservation and restoration strategies.
Satisfaction Formation Processes in Library Users: Understanding Multisource Effects
ERIC Educational Resources Information Center
Shi, Xi; Holahan, Patricia J.; Jurkat, M. Peter
2004-01-01
This study explores whether disconfirmation theory can explain satisfaction formation processes in library users. Both library users' needs and expectations are investigated as disconfirmation standards. Overall library user satisfaction is predicted to be a function of two independent sources--satisfaction with the information product received…
Gao, Lin; Li, Chang-chun; Wang, Bao-shan; Yang, Gui-jun; Wang, Lei; Fu, Kui
2016-01-01
With the innovation of remote sensing technology, remote sensing data sources are more and more abundant. The main aim of this study was to analyze the retrieval accuracy of soybean leaf area index (LAI) based on multi-source remote sensing data, including ground hyperspectral, unmanned aerial vehicle (UAV) multispectral and Gaofen-1 (GF-1) WFV data. The ratio vegetation index (RVI), normalized difference vegetation index (NDVI), soil-adjusted vegetation index (SAVI), difference vegetation index (DVI), and triangle vegetation index (TVI) were used to establish LAI retrieval models, respectively. The models with the highest calibration accuracy were used in the validation. The capability of these three kinds of remote sensing data for LAI retrieval was assessed according to the estimation accuracy of the models. The experimental results showed that the models based on the ground hyperspectral and UAV multispectral data achieved better estimation accuracy (R² was more than 0.69 and RMSE was less than 0.4 at the 0.01 significance level) than the model based on WFV data. The RVI logarithmic model based on ground hyperspectral data was slightly superior to the NDVI linear model based on UAV multispectral data (the differences in E(A), R² and RMSE were 0.3%, 0.04 and 0.006, respectively). The models based on WFV data had the lowest estimation accuracy, with R² less than 0.30 and RMSE more than 0.70. The effects of sensor spectral response characteristics, sensor geometric location and spatial resolution on soybean LAI retrieval are discussed. The results demonstrate that ground hyperspectral data are advantageous but not markedly superior to traditional multispectral data in soybean LAI retrieval. WFV imagery with 16 m spatial resolution could not meet the requirements of crop growth monitoring at the field scale. Under the condition of ensuring high precision in retrieving soybean LAI while working efficiently, acquiring agricultural information by UAV remote sensing can be regarded as an optimal plan. Therefore, with more and more remote sensing information sources available, agricultural UAV remote sensing could become an important information resource for guiding field-scale crop management and provide more scientific and accurate information for precision agriculture research.
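The index-based retrieval models mentioned above are ordinary regressions of measured LAI on a vegetation index. The sketch below shows, with synthetic reflectance and LAI values (not the study's data), how an RVI logarithmic model and an NDVI linear model of the kind compared in the paper can be fitted.

```python
import numpy as np

# Illustrative reflectance samples (red, NIR) and measured LAI; synthetic numbers,
# used only to show how the index-based retrieval models are formed.
red = np.array([0.08, 0.06, 0.05, 0.04, 0.035])
nir = np.array([0.30, 0.38, 0.45, 0.52, 0.58])
lai = np.array([1.2, 2.0, 2.9, 3.8, 4.5])

ndvi = (nir - red) / (nir + red)          # normalized difference vegetation index
rvi = nir / red                           # ratio vegetation index

# RVI logarithmic model: LAI = a * ln(RVI) + b, fitted by ordinary least squares.
A = np.column_stack([np.log(rvi), np.ones_like(rvi)])
(a, b), *_ = np.linalg.lstsq(A, lai, rcond=None)
pred = a * np.log(rvi) + b
rmse = float(np.sqrt(np.mean((pred - lai) ** 2)))
print(f"LAI = {a:.2f} * ln(RVI) + {b:.2f},  RMSE = {rmse:.3f}")

# NDVI linear model for comparison: LAI = c * NDVI + d.
B = np.column_stack([ndvi, np.ones_like(ndvi)])
(c, d), *_ = np.linalg.lstsq(B, lai, rcond=None)
print(f"LAI = {c:.2f} * NDVI + {d:.2f}")
```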
NASA Astrophysics Data System (ADS)
Li, Deying; Yin, Kunlong; Gao, Huaxi; Liu, Changchun
2009-10-01
Although the Three Gorges Dam across the Yangtze River in China can exploit a huge potential source of hydroelectric power and eliminate the loss of life and damage caused by floods, it also causes environmental problems, such as geo-hazards, due to the large rise and fluctuation of the water level. In order to prevent and predict geo-hazards, the establishment of a geo-hazard prediction system is necessary. To implement the functions of regional and urban geo-hazard prediction, single geo-hazard prediction, landslide surge prediction and risk evaluation, the logical layers of the system consist of a data capturing layer, a data manipulation and processing layer, an analysis and application layer, and an information publication layer. Because of the existence of multi-source spatial data, research on the transformation and fusion of multi-source data is also carried out in this paper. The applicability of the system was tested on the spatial prediction of landslide hazard through GIS spatial analysis, in which the information value method was applied to identify areas susceptible to future landslides on the basis of historical records of past landslides, terrain parameters, geology, rainfall and anthropogenic activity. A detailed discussion of the spatial distribution characteristics of landslide hazard in the new town of Badong is presented. These results can be used for risk evaluation. The system can be implemented as an early-warning and emergency management tool by the relevant authorities of the Three Gorges Reservoir in the future.
Jouhet, V; Defossez, G; Ingrand, P
2013-01-01
The aim of this study was to develop and evaluate a selection algorithm of relevant records for the notification of incident cases of cancer on the basis of the individual data available in a multi-source information system. This work was conducted on data for the year 2008 in the general cancer registry of the Poitou-Charentes region (France). The selection algorithm hierarchizes information according to its level of relevance for tumoral topography and tumoral morphology independently. The selected data are combined to form composite records. These records are then grouped according to the notification rules of the International Agency for Research on Cancer for multiple primary cancers. The evaluation, based on recall, precision and F-measure, compared cases validated manually by the registry's physicians with tumours notified with and without record selection. The analysis involved 12,346 tumours validated among 11,971 individuals. The data used were hospital discharge data (104,474 records), pathology data (21,851 records), healthcare insurance data (7508 records) and cancer care centre data (686 records). The selection algorithm improved performance for the notification of tumour topography (F-measure 0.926 with vs. 0.857 without selection) and tumour morphology (F-measure 0.805 with vs. 0.750 without selection). These results show that selecting information according to its origin is efficient in reducing the noise generated by imprecise coding. Further research is needed to solve the semantic problems relating to the integration of heterogeneous data and the use of non-structured information.
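The reported metrics follow the usual definitions of recall, precision and F-measure over the sets of validated versus notified tumours. A minimal sketch with hypothetical record identifiers (not the registry's data):

```python
def evaluation_metrics(validated: set, notified: set):
    """Recall, precision and F-measure of notified tumours against
    the registry physicians' manually validated cases."""
    tp = len(validated & notified)
    recall = tp / len(validated) if validated else 0.0
    precision = tp / len(notified) if notified else 0.0
    f = (2 * precision * recall / (precision + recall)) if (precision + recall) else 0.0
    return recall, precision, f

# Hypothetical tumour identifiers, just to exercise the function.
validated = {"T1", "T2", "T3", "T4", "T5"}
notified_with_selection = {"T1", "T2", "T3", "T4", "T9"}
print(evaluation_metrics(validated, notified_with_selection))
```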
Al Ansari, Ahmed; Al Khalifa, Khalid; Al Azzawi, Mohamed; Al Amer, Rashed; Al Sharqi, Dana; Al-Mansoor, Anwar; Munshi, Fadi M
2015-01-01
We aimed to design, implement, and evaluate the feasibility and reliability of a multisource feedback (MSF) system to assess interns in their clerkship year in a Middle Eastern culture, the Kingdom of Bahrain. The study was undertaken in the Bahrain Defense Force Hospital, a military teaching hospital in the Kingdom of Bahrain. A total of 21 interns (nine males and 12 females), representing the entire intern population for the given year, were assessed; all were rotating through our hospital during their year-long clerkship rotation. Each participating intern was evaluated by three groups of raters: eight medical intern colleagues, eight senior medical colleagues, and eight coworkers from different departments. The total mean response rate was 62.3%. A factor analysis found that the questionnaire data grouped into three factors accounting for 76.4% of the total variance; these factors were labeled professionalism, collaboration, and communication. Reliability analysis indicated that the full instrument scale had high internal consistency (Cronbach's α 0.98). The generalizability coefficient for the surveys was estimated to be 0.78. Based on our results and analysis, we conclude that the MSF tool used with interns rotating in their clerkship year within our Middle Eastern culture provides an effective method of evaluation because it offers a reliable, valid, and feasible process.
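Cronbach's alpha, used above to report internal consistency, can be computed directly from a ratees-by-items score matrix. The sketch below uses made-up ratings purely to exercise the standard formula; it is not the study's data.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for an (n_ratees x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = scores.shape[1]
    item_var = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_var / total_var)

# Hypothetical ratings (5 ratees x 4 questionnaire items), for illustration only.
ratings = np.array([
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 5],
], dtype=float)
print(round(cronbach_alpha(ratings), 3))
```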
A Newton-Raphson Method Approach to Adjusting Multi-Source Solar Simulators
NASA Technical Reports Server (NTRS)
Snyder, David B.; Wolford, David S.
2012-01-01
NASA Glenn Research Center has been using an in-house designed X25-based multi-source solar simulator since 2003. The simulator is set up for triple-junction solar cells prior to measurements by adjusting the three sources to produce the correct short-circuit current, Isc, in each of three AM0-calibrated sub-cells. The past practice has been to adjust one source on one sub-cell at a time, iterating until all the sub-cells have the calibrated Isc. The new approach is to create a matrix of measured Isc for small source changes on each sub-cell. A matrix, A, is produced and normalized to unit changes in the sources, so that A·Δs = ΔIsc. This matrix can then be inverted and used with the known Isc differences from the AM0-calibrated values to indicate changes in the source settings, Δs = A⁻¹·ΔIsc. This approach is still an iterative one, but all sources are changed during each iteration step. It typically takes four to six steps to converge on the calibrated Isc values. Even though the source lamps may degrade over time, the initial matrix evaluation is not performed each time, since the measurement matrix needs to be only approximate; because an iterative approach is used, the method continues to be valid. This method may become more important as state-of-the-art solar cell junction responses overlap the sources of the simulator. Also, as the number of cell junctions and sources increases, this method should remain applicable.
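The adjustment procedure amounts to estimating a sensitivity matrix A of sub-cell Isc responses to source changes, then iterating Δs = A⁻¹·ΔIsc. The sketch below mimics that loop with an invented linear simulator model (so it converges faster than the real hardware, which reportedly needs four to six steps); the sensitivity values and calibrated currents are placeholders.

```python
import numpy as np

# Stand-in for the real simulator + sub-cell measurements; all values are made up.
TRUE_SENSITIVITY = np.array([[1.00, 0.15, 0.05],
                             [0.10, 0.90, 0.10],
                             [0.05, 0.20, 0.95]])   # d(Isc_i)/d(source_j), hypothetical

def measure_isc(sources: np.ndarray) -> np.ndarray:
    """Pretend measurement of the three sub-cell short-circuit currents."""
    return TRUE_SENSITIVITY @ sources

def build_matrix(sources: np.ndarray, step: float = 0.05) -> np.ndarray:
    """Estimate A by perturbing one source at a time and recording the Isc changes."""
    base = measure_isc(sources)
    cols = []
    for j in range(len(sources)):
        s = sources.copy()
        s[j] += step
        cols.append((measure_isc(s) - base) / step)
    return np.column_stack(cols)

calibrated_isc = np.array([0.95, 1.02, 0.88])   # AM0-calibrated targets (made up)
sources = np.array([0.8, 0.8, 0.8])             # initial lamp settings
A_inv = np.linalg.inv(build_matrix(sources))    # evaluated once, reused each step

for it in range(6):
    delta_isc = calibrated_isc - measure_isc(sources)
    sources = sources + A_inv @ delta_isc       # delta_s = A^-1 * delta_Isc
    print(f"iteration {it}: max |delta Isc| = {np.abs(delta_isc).max():.2e}")
```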
Yuan, Lei; Wang, Yalin; Thompson, Paul M.; Narayan, Vaibhav A.; Ye, Jieping
2012-01-01
Analysis of incomplete data is a big challenge when integrating large-scale brain imaging datasets from different imaging modalities. In the Alzheimer’s Disease Neuroimaging Initiative (ADNI), for example, over half of the subjects lack cerebrospinal fluid (CSF) measurements; an independent half of the subjects do not have fluorodeoxyglucose positron emission tomography (FDG-PET) scans; many lack proteomics measurements. Traditionally, subjects with missing measures are discarded, resulting in a severe loss of available information. In this paper, we address this problem by proposing an incomplete Multi-Source Feature (iMSF) learning method where all the samples (with at least one available data source) can be used. To illustrate the proposed approach, we classify patients from the ADNI study into groups with Alzheimer’s disease (AD), mild cognitive impairment (MCI) and normal controls, based on the multi-modality data. At baseline, ADNI’s 780 participants (172 AD, 397 MCI, 211 NC), have at least one of four data types: magnetic resonance imaging (MRI), FDG-PET, CSF and proteomics. These data are used to test our algorithm. Depending on the problem being solved, we divide our samples according to the availability of data sources, and we learn shared sets of features with state-of-the-art sparse learning methods. To build a practical and robust system, we construct a classifier ensemble by combining our method with four other methods for missing value estimation. Comprehensive experiments with various parameters show that our proposed iMSF method and the ensemble model yield stable and promising results. PMID:22498655
Li, Wen-Jie; Zhang, Shi-Huang; Wang, Hui-Min
2011-12-01
Ecosystem services evaluation is a hot topic in current ecosystem management and is closely linked with human welfare. This paper summarized the research progress on the evaluation of ecosystem services based on geographic information system (GIS) and remote sensing (RS) technology, which can be reduced to the following three characteristics: ecological economics theory is widely applied as a key method in quantifying ecosystem services; GIS and RS technology play a key role in multi-source data acquisition, spatiotemporal analysis, and integrated platforms; and ecosystem mechanism models have become a powerful tool for understanding the relationships between natural phenomena and human activities. Considering the present research status and its inadequacies, this paper put forward an "Assembly Line" framework, which is a distributed framework with scalable characteristics, and discussed the future development trend of integration research on ecosystem services evaluation based on GIS and RS technologies.
A novel artificial bee colony based clustering algorithm for categorical data.
Ji, Jinchao; Pang, Wei; Zheng, Yanlin; Wang, Zhe; Ma, Zhiqiang
2015-01-01
Data with categorical attributes are ubiquitous in the real world. However, existing partitional clustering algorithms for categorical data are prone to fall into local optima. To address this issue, in this paper we propose a novel clustering algorithm, ABC-K-Modes (Artificial Bee Colony clustering based on K-Modes), based on the traditional k-modes clustering algorithm and the artificial bee colony approach. In our approach, we first introduce a one-step k-modes procedure, and then integrate this procedure with the artificial bee colony approach to deal with categorical data. In the search process performed by scout bees, we adopt the multi-source search inspired by the idea of batch processing to accelerate the convergence of ABC-K-Modes. The performance of ABC-K-Modes is evaluated by a series of experiments in comparison with that of the other popular algorithms for categorical data.
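The core of the approach is the one-step k-modes procedure: assign each record to the nearest mode under simple matching dissimilarity, then update each mode attribute-by-attribute with the most frequent category. A minimal sketch with toy categorical data (the bee-colony search layer is omitted):

```python
from collections import Counter

def matching_dissimilarity(a, b):
    """Simple matching distance between two categorical records."""
    return sum(x != y for x, y in zip(a, b))

def one_step_kmodes(records, modes):
    """One k-modes step: assign each record to its nearest mode, then recompute
    each cluster's mode attribute-by-attribute (the most frequent category)."""
    clusters = [[] for _ in modes]
    for r in records:
        j = min(range(len(modes)), key=lambda k: matching_dissimilarity(r, modes[k]))
        clusters[j].append(r)
    new_modes = []
    for j, cluster in enumerate(clusters):
        if not cluster:
            new_modes.append(modes[j])      # keep the old mode for an empty cluster
            continue
        mode = tuple(Counter(col).most_common(1)[0][0] for col in zip(*cluster))
        new_modes.append(mode)
    return new_modes, clusters

# Toy categorical data (colour, shape, size); initial modes picked arbitrarily.
data = [("red", "round", "small"), ("red", "round", "large"),
        ("blue", "square", "large"), ("blue", "square", "small"),
        ("blue", "round", "large")]
modes, clusters = one_step_kmodes(data, [data[0], data[2]])
print(modes)
```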
Economic dispatch optimization for system integrating renewable energy sources
NASA Astrophysics Data System (ADS)
Jihane, Kartite; Mohamed, Cherkaoui
2018-05-01
Nowadays, the use of energy is growing, especially in the transportation and electricity industries. However, this energy is largely based on conventional sources, which pollute the environment. A multi-source system is seen as the best route to sustainable development. This paper proposes the Economic Dispatch (ED) of a hybrid renewable power system. The hybrid system is composed of ten thermal generators, a photovoltaic (PV) generator and a wind turbine generator. To show the importance of renewable energy sources (RES) in the energy mix, we ran the simulation for the system integrating PV only and PV plus wind. The results show that the system with RES is more promising than the system without RES in terms of fuel cost.
[Effect of different snow depth and area on the snow cover retrieval using remote sensing data].
Jiang, Hong-bo; Qin, Qi-ming; Zhang, Ning; Dong, Heng; Chen, Chao
2011-12-01
To meet the needs of snow cover monitoring using multi-source remote sensing data, this article discusses, based on spectral analysis of snow of different depths and areas, the effect of snow depth on the results of snow cover retrieval using the normalized difference snow index (NDSI). Meanwhile, taking HJ-1B and MODIS remote sensing data as an example, the effect of snow area on snow cover monitoring is also studied. The results show that differences in snow depth do not affect the retrieval results, while the snow area affects the retrieval results to some extent because of the constraints of spatial resolution.
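NDSI itself is a simple band ratio of green and shortwave-infrared reflectance, usually thresholded to flag snow pixels. The sketch below uses illustrative reflectance values and the commonly cited threshold of about 0.4; the exact band definitions and threshold used in the study are not specified here.

```python
import numpy as np

def ndsi(green: np.ndarray, swir: np.ndarray) -> np.ndarray:
    """Normalized Difference Snow Index from green and shortwave-infrared reflectance."""
    return (green - swir) / (green + swir + 1e-12)

def snow_mask(green, swir, threshold=0.4):
    """Flag snow-covered pixels using a commonly used NDSI cutoff of about 0.4."""
    return ndsi(green, swir) > threshold

# Illustrative reflectance values: first two pixels snow-like, last two snow-free.
green = np.array([0.70, 0.65, 0.20, 0.15])
swir = np.array([0.10, 0.15, 0.25, 0.20])
print(ndsi(green, swir).round(2), snow_mask(green, swir))
```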
Deep Learning Based Binaural Speech Separation in Reverberant Environments.
Zhang, Xueliang; Wang, DeLiang
2017-05-01
Speech signals are usually degraded by room reverberation and additive noise in real environments. This paper focuses on separating the target speech signal in reverberant conditions from binaural inputs. Binaural separation is formulated as a supervised learning problem, and we employ deep learning to map from both spatial and spectral features to a training target. With binaural inputs, we first apply a fixed beamformer and then extract several spectral features. A new spatial feature is proposed and extracted to complement the spectral features. The training target is the recently suggested ideal ratio mask. Systematic evaluations and comparisons show that the proposed system achieves very good separation performance and substantially outperforms related algorithms under challenging multi-source and reverberant environments.
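The ideal ratio mask used as the training target is typically defined per time-frequency unit from the speech and noise energies. A minimal sketch of that common definition, with toy power values and the frequently used exponent of 0.5 (the paper's exact formulation may differ):

```python
import numpy as np

def ideal_ratio_mask(speech_power: np.ndarray, noise_power: np.ndarray, beta: float = 0.5):
    """Ideal ratio mask used as a training target:
    IRM = (S / (S + N)) ** beta, computed per time-frequency unit."""
    return (speech_power / (speech_power + noise_power + 1e-12)) ** beta

# Toy time-frequency power values (2 frames x 3 frequency bins), purely illustrative.
speech = np.array([[1.0, 0.2, 0.05],
                   [0.8, 0.6, 0.01]])
noise = np.array([[0.1, 0.2, 0.50],
                  [0.4, 0.1, 0.30]])
print(ideal_ratio_mask(speech, noise).round(2))
```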
NASA Technical Reports Server (NTRS)
Nez, G. (Principal Investigator); Mutter, D.
1977-01-01
The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.
Efficient Multi-Source Data Fusion for Decentralized Sensor Networks
2006-10-01
[Report fragments: a Common Operating Picture (COP) in which Robovolc accesses a single DDF node associated with a CCTV camera (marked in orange in Figure 3a); Figure 10: Particle Distribution Snapshots, showing position error between each target and the particle set for bearing-only tracking in Gaussian environments.]
Data Mining Algorithms for Classification of Complex Biomedical Data
ERIC Educational Resources Information Center
Lan, Liang
2012-01-01
In my dissertation, I will present my research, which contributes to solving the following three open problems from biomedical informatics: (1) Multi-task approaches for microarray classification; (2) Multi-label classification of gene and protein prediction from multi-source biological data; (3) Spatial scan for movement data. In microarray…
Directed Vapor Deposition: Low Vacuum Materials Processing Technology
2000-01-01
[Figure labels from the directed vapor deposition schematic: crucible with constituent A, crucible with constituent B, electron beam, substrate, deposit, flux of A, flux of B, composition, "skull" melt, coolant, copper crucible, evaporation target, evaporant material, vapor flux, fibrous coating surface, panels a) and b).] sharp (0.5 mm) beam focussing. When used with multisource
Evaluation of Professional Role Competency during Psychiatry Residency
ERIC Educational Resources Information Center
Grujich, Nikola N.; Razmy, Ajmal; Zaretsky, Ari; Styra, Rima G.; Sockalingam, Sanjeev
2012-01-01
Objective: The authors sought to determine psychiatry residents' perceptions on the current method of evaluating professional role competency and the use of multi-source feedback (MSF) as an assessment tool. Method: Authors disseminated a structured, anonymous survey to 128 University of Toronto psychiatry residents, evaluating the current mode of…
Cross-Modulation Interference with Lateralization of Mixed-Modulated Waveforms
ERIC Educational Resources Information Center
Hsieh, I-Hui; Petrosyan, Agavni; Goncalves, Oscar F.; Hickok, Gregory; Saberi, Kourosh
2010-01-01
Purpose: This study investigated the ability to use spatial information in mixed-modulated (MM) sounds containing concurrent frequency-modulated (FM) and amplitude-modulated (AM) sounds by exploring patterns of interference when different modulation types originated from different loci as may occur in a multisource acoustic field. Method:…
The Effect of Surgeon Empathy and Emotional Intelligence on Patient Satisfaction
ERIC Educational Resources Information Center
Weng, Hui-Ching; Steed, James F.; Yu, Shang-Won; Liu, Yi-Ten; Hsu, Chia-Chang; Yu, Tsan-Jung; Chen, Wency
2011-01-01
We investigated the associations of surgeons' emotional intelligence and surgeons' empathy with patient-surgeon relationships, patient perceptions of their health, and patient satisfaction before and after surgical procedures. We used multi-source approaches to survey 50 surgeons and their 549 outpatients during initial and follow-up visits.…
Single Mothers of Early Adolescents: Perceptions of Competence
ERIC Educational Resources Information Center
Beckert, Troy E.; Strom, Paris S.; Strom, Robert D.; Darre, Kathryn; Weed, Ane
2008-01-01
The purpose of this study was to examine similarities and differences in single mothers' and adolescents' perceptions of parenting competencies from a developmental assets approach. A multi-source (mothers [n = 29] and 10-14-year-old adolescent children [n = 29]), single-method (both generations completed the Parent Success Indicator)…
Genetic algorithms with memory- and elitism-based immigrants in dynamic environments.
Yang, Shengxiang
2008-01-01
In recent years the genetic algorithm community has shown a growing interest in studying dynamic optimization problems. Several approaches have been devised. The random immigrants and memory schemes are two major ones. The random immigrants scheme addresses dynamic environments by maintaining the population diversity while the memory scheme aims to adapt genetic algorithms quickly to new environments by reusing historical information. This paper investigates a hybrid memory and random immigrants scheme, called memory-based immigrants, and a hybrid elitism and random immigrants scheme, called elitism-based immigrants, for genetic algorithms in dynamic environments. In these schemes, the best individual from memory or the elite from the previous generation is retrieved as the base to create immigrants into the population by mutation. This way, not only can diversity be maintained but it is done more efficiently to adapt genetic algorithms to the current environment. Based on a series of systematically constructed dynamic problems, experiments are carried out to compare genetic algorithms with the memory-based and elitism-based immigrants schemes against genetic algorithms with traditional memory and random immigrants schemes and a hybrid memory and multi-population scheme. The sensitivity analysis regarding some key parameters is also carried out. Experimental results show that the memory-based and elitism-based immigrants schemes efficiently improve the performance of genetic algorithms in dynamic environments.
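The elitism-based immigrants scheme replaces the worst part of the population each generation with mutated copies of the current elite. The sketch below shows that mechanism on a toy OneMax-style problem; the replacement ratio and mutation probability are illustrative, not the paper's settings.

```python
import random

def mutate(bits, pm):
    """Bit-flip mutation with probability pm per gene."""
    return [b ^ (random.random() < pm) for b in bits]

def elitism_based_immigrants(population, fitness, ratio=0.2, pm=0.1):
    """Create immigrants by mutating the current elite and replace the worst
    individuals with them, as in the elitism-based immigrants scheme."""
    pop = sorted(population, key=fitness, reverse=True)
    elite = pop[0]
    n_imm = int(len(pop) * ratio)
    immigrants = [mutate(elite, pm) for _ in range(n_imm)]
    return pop[: len(pop) - n_imm] + immigrants     # worst individuals are replaced

# Toy dynamic OneMax-style problem: fitness is the number of ones.
random.seed(0)
population = [[random.randint(0, 1) for _ in range(20)] for _ in range(30)]
population = elitism_based_immigrants(population, fitness=sum)
print(max(sum(ind) for ind in population))
```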
NASA Astrophysics Data System (ADS)
Lv, Zheng; Sui, Haigang; Zhang, Xilin; Huang, Xianfeng
2007-11-01
As one of the most important geo-spatial objects and military establishments, an airport is always a key target in the fields of transportation and military affairs. Therefore, automatic recognition and extraction of airports from remote sensing images is very important and urgent for civil aviation updating and military applications. In this paper, a new multi-source data fusion approach to automatic airport information extraction, updating and 3D modeling is addressed. The corresponding key technologies, including feature extraction of airport information based on a modified Otsu algorithm, automatic change detection based on a new parallel-lines-based buffer detection algorithm, 3D modeling based on a gradual elimination of non-building points algorithm, 3D change detection between the old airport model and LIDAR data, the import of typical CAD models, and so on, are discussed in detail. Finally, based on these technologies, we develop a prototype system, and the results show that our method achieves good results.
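The thresholding step builds on Otsu's method, which picks the gray level that maximizes between-class variance of the image histogram. The sketch below implements the standard (unmodified) form on a synthetic bimodal image; the paper's modification is not reproduced here.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray, bins: int = 256) -> float:
    """Classic Otsu threshold: maximize between-class variance of the histogram."""
    hist, edges = np.histogram(gray.ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    w0 = np.cumsum(p)                       # class probability of the "dark" class
    w1 = 1.0 - w0
    mu = np.cumsum(p * centers)             # cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * w0 - mu) ** 2 / (w0 * w1)
    sigma_b = np.nan_to_num(sigma_b)
    return float(centers[np.argmax(sigma_b)])

# Synthetic bimodal "image": darker runway-like region plus brighter surroundings.
rng = np.random.default_rng(1)
img = np.concatenate([rng.normal(60, 10, 5000), rng.normal(170, 15, 5000)]).clip(0, 255)
print(round(otsu_threshold(img), 1))
```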
You, Siming; Wang, Wei; Dai, Yanjun; Tong, Yen Wah; Wang, Chi-Hwa
2016-10-01
The compositions of food wastes and their co-gasification producer gas were compared with the existing data of sewage sludge. Results showed that food wastes are more favorable than sewage sludge for co-gasification based on residue generation and energy output. Two decentralized gasification-based schemes were proposed to dispose of the sewage sludge and food wastes in Singapore. Monte Carlo simulation-based cost-benefit analysis was conducted to compare the proposed schemes with the existing incineration-based scheme. It was found that the gasification-based schemes are financially superior to the incineration-based scheme based on the data of net present value (NPV), benefit-cost ratio (BCR), and internal rate of return (IRR). Sensitivity analysis was conducted to suggest effective measures to improve the economics of the schemes. Copyright © 2016 Elsevier Ltd. All rights reserved.
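The cost-benefit comparison rests on Monte Carlo sampling of uncertain cash flows and discounting them to an NPV. The sketch below shows that calculation with entirely invented capital costs, benefits, and discount rate; it does not reproduce the paper's figures for Singapore.

```python
import random

def npv(cash_flows, rate):
    """Net present value of yearly cash flows (year 0 is the capital cost)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate_scheme(capex, annual_benefit_mean, annual_benefit_sd,
                    years=20, rate=0.05, runs=10_000):
    """Monte Carlo NPV for one waste-to-energy scheme; all figures are placeholders."""
    results = []
    for _ in range(runs):
        flows = [-capex] + [random.gauss(annual_benefit_mean, annual_benefit_sd)
                            for _ in range(years)]
        results.append(npv(flows, rate))
    results.sort()
    return sum(results) / runs, results[runs // 2]   # mean and median NPV

random.seed(42)
print("gasification scheme (mean, median NPV):", simulate_scheme(80e6, 9e6, 2e6))
print("incineration scheme (mean, median NPV):", simulate_scheme(60e6, 6e6, 1.5e6))
```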
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2008-09-15
We present two PMD compensation schemes suitable for use in multilevel (M ≥ 2) block-coded modulation schemes with coherent detection. The first scheme is based on a BLAST-type polarization-interference cancellation scheme, and the second is based on iterative polarization cancellation. Both schemes use LDPC codes as channel codes. The proposed PMD compensation schemes are evaluated by employing coded OFDM and coherent detection. When used in combination with girth-10 LDPC codes, those schemes outperform polarization-time-coding-based OFDM by 1 dB at a BER of 10⁻⁹ and provide two times higher spectral efficiency. The proposed schemes perform comparably and are able to compensate even 1200 ps of differential group delay with negligible penalty.
Multisource Estimation of Long-term Global Terrestrial Surface Radiation
NASA Astrophysics Data System (ADS)
Peng, L.; Sheffield, J.
2017-12-01
Land surface net radiation is the essential energy source at the earth's surface. It determines the surface energy budget and its partitioning, drives the hydrological cycle by providing available energy, and offers heat, light, and energy for biological processes. Individual components in net radiation have changed historically due to natural and anthropogenic climate change and land use change. Decadal variations in radiation such as global dimming or brightening have important implications for hydrological and carbon cycles. In order to assess the trends and variability of net radiation and evapotranspiration, there is a need for accurate estimates of long-term terrestrial surface radiation. While large progress in measuring top of atmosphere energy budget has been made, huge discrepancies exist among ground observations, satellite retrievals, and reanalysis fields of surface radiation, due to the lack of observational networks, the difficulty in measuring from space, and the uncertainty in algorithm parameters. To overcome the weakness of single source datasets, we propose a multi-source merging approach to fully utilize and combine multiple datasets of radiation components separately, as they are complementary in space and time. First, we conduct diagnostic analysis of multiple satellite and reanalysis datasets based on in-situ measurements such as Global Energy Balance Archive (GEBA), existing validation studies, and other information such as network density and consistency with other meteorological variables. Then, we calculate the optimal weighted average of multiple datasets by minimizing the variance of error between in-situ measurements and other observations. Finally, we quantify the uncertainties in the estimates of surface net radiation and employ physical constraints based on the surface energy balance to reduce these uncertainties. The final dataset is evaluated in terms of the long-term variability and its attribution to changes in individual components. The goal of this study is to provide a merged observational benchmark for large-scale diagnostic analyses, remote sensing and land surface modeling.
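The merging step described here, minimizing the variance of the combined error, corresponds to an inverse-variance weighted average when the input errors are treated as independent. A minimal sketch under that assumption, with hypothetical estimates and error variances:

```python
import numpy as np

def merge_datasets(estimates: np.ndarray, error_variances: np.ndarray):
    """Minimum-variance weighted average of independent estimates of the same
    radiation component: weights proportional to 1 / error variance."""
    w = 1.0 / error_variances
    w = w / w.sum()
    merged = float(np.dot(w, estimates))
    merged_var = 1.0 / np.sum(1.0 / error_variances)
    return merged, merged_var, w

# Hypothetical net-radiation estimates (W m-2) for one grid cell from three products,
# with error variances assumed from comparison against in-situ measurements.
estimates = np.array([85.0, 92.0, 78.0])
variances = np.array([25.0, 64.0, 100.0])
value, var, weights = merge_datasets(estimates, variances)
print(f"merged = {value:.1f} W m-2, variance = {var:.1f}, weights = {weights.round(2)}")
```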
Müller, Andreas; Weigl, Matthias
2017-01-01
Background: Individuals’ behavioral strategies like selection, optimization, and compensation (SOC) contribute to efficient use of available resources. In the work context, previous studies revealed positive associations between employees’ SOC use and favorable individual outcomes, like engagement and job performance. However, the social implications of self-directed behaviors like SOC that are favorable for the employee but may imply consequences for coworkers have not been investigated yet in an interpersonal work context. Objective: This study aimed to assess associations between employees’ use of SOC behaviors at work and their organizational citizenship behaviors (OCB) toward the benefits of co-workers rated by their peers at work. We further sought to identify age-specific associations between SOC use and OCB. Design and Method: A cross-sectional design combining multi-source data was applied in primary school teachers (age range: 23–58 years) who frequently teach in dyads. N = 114 dyads were finally included. Teachers reported on their SOC strategies at work. Their peer colleagues evaluated teachers’ OCB. Control variables were gender, workload, working hours, and perceived proximity of relationship between the dyads. Results: We observed a positive effect of loss-based selection behaviors on peer-rated OCB. Moreover, there was a significant two-way interaction effect between the use of compensation strategies and age on OCB, such that there was a positive association for older employees and a negative association for younger employees. There were no significant main and age-related interaction effects of elective selection, optimization, and of overall SOC strategies on OCB. Conclusion: Our study suggests that high use of loss-based selection and high use of compensation strategies in older employees is positively related with OCB as perceived by their colleagues. However, high use of compensation strategies in younger employees is perceived negatively related with OCB. Our findings contribute to a better understanding of the age-differentiated interpersonal effects of successful aging strategies in terms of SOC in organizations. PMID:29085315
Report on Pairing-based Cryptography
Moody, Dustin; Peralta, Rene; Perlner, Ray; Regenscheid, Andrew; Roginsky, Allen; Chen, Lily
2015-01-01
This report summarizes study results on pairing-based cryptography. The main purpose of the study is to form NIST’s position on standardizing and recommending pairing-based cryptography schemes currently published in research literature and standardized in other standard bodies. The report reviews the mathematical background of pairings. This includes topics such as pairing-friendly elliptic curves and how to compute various pairings. It includes a brief introduction to existing identity-based encryption (IBE) schemes and other cryptographic schemes using pairing technology. The report provides a complete study of the current status of standard activities on pairing-based cryptographic schemes. It explores different application scenarios for pairing-based cryptography schemes. As an important aspect of adopting pairing-based schemes, the report also considers the challenges inherent in validation testing of cryptographic algorithms and modules. Based on the study, the report suggests an approach for including pairing-based cryptography schemes in the NIST cryptographic toolkit. The report also outlines several questions that will require further study if this approach is followed. PMID:26958435
ERIC Educational Resources Information Center
Blackman, Gabrielle L.; Ostrander, Rick; Herman, Keith C.
2005-01-01
Although ADHD and depression are common comorbidities in youth, few studies have examined this particular clinical presentation. To address method bias limitations of previous research, this study uses multiple informants to compare the academic, social, and clinical functioning of children with ADHD, children with ADHD and depression, and…
ERIC Educational Resources Information Center
Powers, Joshua B.
This study investigated institutional resource factors that may explain differential performance with university technology transfer--the process by which university research is transformed into marketable products. Using multi-source data on 108 research universities, a set of internal resources (financial, physical, human capital, and…
ERIC Educational Resources Information Center
Sargeant, Joan; MacLeod, Tanya; Sinclair, Douglas; Power, Mary
2011-01-01
Introduction: The Colleges of Physicians and Surgeons of Alberta and Nova Scotia (CPSNS) use a standardized multisource feedback program, the Physician Achievement Review (PAR/NSPAR), to provide physicians with performance assessment data via questionnaires from medical colleagues, coworkers, and patients on 5 practice domains: consultation…
ERIC Educational Resources Information Center
Lans, Thomas; Biemans, Harm; Mulder, Martin; Verstegen, Jos
2010-01-01
An important assumption of entrepreneurial competence is that (at least part of) it can be learned and developed. However, human resources development (HRD) practices aimed at further strengthening and developing small-business owner-managers' entrepreneurial competence are complex and underdeveloped. A multisource assessment of owner-managers'…
Advances in audio source separation and multisource audio content retrieval
NASA Astrophysics Data System (ADS)
Vincent, Emmanuel
2012-06-01
Audio source separation aims to extract the signals of individual sound sources from a given recording. In this paper, we review three recent advances which improve the robustness of source separation in real-world challenging scenarios and enable its use for multisource content retrieval tasks, such as automatic speech recognition (ASR) or acoustic event detection (AED) in noisy environments. We present a Flexible Audio Source Separation Toolkit (FASST) and discuss its advantages compared to earlier approaches such as independent component analysis (ICA) and sparse component analysis (SCA). We explain how cues as diverse as harmonicity, spectral envelope, temporal fine structure or spatial location can be jointly exploited by this toolkit. We subsequently present the uncertainty decoding (UD) framework for the integration of audio source separation and audio content retrieval. We show how the uncertainty about the separated source signals can be accurately estimated and propagated to the features. Finally, we explain how this uncertainty can be efficiently exploited by a classifier, both at the training and the decoding stage. We illustrate the resulting performance improvements in terms of speech separation quality and speaker recognition accuracy.
WHO Expert Committee on Specifications for Pharmaceutical Preparations. Forty-ninth report.
2015-01-01
The Expert Committee on Specifications for Pharmaceutical Preparations works towards clear, independent and practical standards and guidelines for the quality assurance of medicines. Standards are developed by the Committee through worldwide consultation and an international consensus-building process. The following new guidelines were adopted and recommended for use. Revised procedure for the development of monographs and other texts for The International Pharmacopoeia; Revised updating mechanism for the section on radiopharmaceuticals in The International Pharmacopoeia; Revision of the supplementary guidelines on good manufacturing practices: validation, Appendix 7: non-sterile process validation; General guidance for inspectors on hold-time studies; 16 technical supplements to Model guidance for the storage and transport of time- and temperature-sensitive pharmaceutical products; Recommendations for quality requirements when plant-derived artemisinin is used as a starting material in the production of antimalarial active pharmaceutical ingredients; Multisource (generic) pharmaceutical products: guidelines on registration requirements to establish interchangeability: revision; Guidance on the selection of comparator pharmaceutical products for equivalence assessment of interchangeable multisource (generic) products: revision; and Good review practices: guidelines for national and regional regulatory authorities.
Miranda, Elaine Silva; Pinto, Cláudia Du Bocage Santos; dos Reis, André Luis de Almeida; Emmerick, Isabel Cristina Martins; Campos, Mônica Rodrigues; Luiza, Vera Lucia; Osorio-de-Castro, Claudia Garcia Serpa
2009-10-01
A study to identify availability and prices of medicines, according to type of provider, was conducted in the five regions of Brazil. A list of medicines to treat prevalent diseases was investigated, using the medicines price methodology developed by the World Health Organization and Health Action International, adapted for Brazil. In the public sector, bioequivalent (vis-à-vis reference brand) generics are less available than multisource products. For most medicines (71.4%), the availability of bioequivalent generics was less than 10%. In the private sector, the average number of different bioequivalent generic versions in the outlets was far smaller than the number of versions on the market. There was a positive correlation between the number of generics on the market, or those found at outlets, and the price variation in bioequivalent generic products, in relation to the maximum consumer price. It is estimated that price competition is occurring among bioequivalent generic drugs and between them and multisource products for the same substance, but not with reference brands.
Lee, Tian-Fu; Liu, Chuan-Ming
2013-06-01
A smart-card-based authentication scheme for telecare medicine information systems enables patients, doctors, nurses, health visitors and the medicine information systems to establish a secure communication platform over public networks. Zhu recently presented an improved authentication scheme to address the weakness of the scheme of Wei et al., which cannot resist off-line password guessing attacks. This investigation shows that Zhu's improved scheme has faults of its own: the authentication procedure cannot execute correctly, and the scheme is vulnerable to parallel session attacks. Additionally, an enhanced authentication scheme based on Zhu's scheme is proposed. The enhanced scheme not only avoids the weaknesses of the original scheme, but also provides user anonymity and authenticated key agreement for secure data communications.
Wang, Chengqi; Zhang, Xiao; Zheng, Zhiming
2016-01-01
With the security requirements of networks, biometrics-based authentication schemes applied in multi-server environments are becoming more crucial and more widely deployed. In this paper, we propose a novel biometric-based multi-server authentication and key agreement scheme built on our cryptanalysis of the scheme of Mishra et al. Informal and formal security analyses of our scheme are given, demonstrating that it satisfies the desirable security requirements. The presented scheme provides a variety of significant functionalities, some of which are not considered in most existing authentication schemes, such as user revocation or re-registration and biometric information protection. Compared with several related schemes, our scheme has more security properties and lower computation cost. It is therefore more appropriate for practical applications in remote distributed networks.
ID-based encryption scheme with revocation
NASA Astrophysics Data System (ADS)
Othman, Hafizul Azrie; Ismail, Eddie Shahril
2017-04-01
In 2015, Meshram proposed an efficient ID-based encryption scheme based on the difficulty of solving the discrete logarithm and integer factorization problems. The scheme is pairing-free and was claimed to be secure against adaptive chosen plaintext attacks (CPA). Later, Tan et al. proved that the scheme was insecure by presenting a method to recover the secret master key and to obtain the prime factorization of the modulus n. In this paper, we propose a new pairing-free ID-based encryption scheme with revocation based on Meshram's ID-based encryption scheme, which is also secure against Tan et al.'s attacks.
A secure biometrics-based authentication scheme for telecare medicine information systems.
Yan, Xiaopeng; Li, Weiheng; Li, Ping; Wang, Jiantao; Hao, Xinhong; Gong, Peng
2013-10-01
The telecare medicine information system (TMIS) allows patients and doctors to access medical services or medical information at remote sites, and can therefore bring great convenience. To safeguard patients' privacy, authentication schemes for the TMIS have attracted wide attention. Recently, Tan proposed an efficient biometrics-based authentication scheme for the TMIS and claimed that the scheme could withstand various attacks. However, in this paper, we point out that Tan's scheme is vulnerable to the Denial-of-Service attack. To enhance security, we also propose an improved scheme based on Tan's work. Security and performance analysis shows that our scheme not only overcomes the weakness in Tan's scheme but also has better performance.
Carlson, Josh J; Sullivan, Sean D; Garrison, Louis P; Neumann, Peter J; Veenstra, David L
2010-08-01
To identify, categorize and examine performance-based health outcomes reimbursement schemes for medical technology, we performed a review of such schemes over the past 10 years (7/98-10/09) using publicly available databases, web and grey literature searches, and input from healthcare reimbursement experts. We developed a taxonomy of scheme types by inductively organizing the schemes identified according to the timing, execution, and health outcomes measured in the schemes. Our search yielded 34 coverage with evidence development schemes, 10 conditional treatment continuation schemes, and 14 performance-linked reimbursement schemes. The majority of schemes are in Europe and Australia, with an increasing number in Canada and the U.S. These schemes have the potential to alter the reimbursement and pricing landscape for medical technology, but significant challenges, including high transaction costs and insufficient information systems, may limit their long-term impact. Future studies regarding the experiences and outcomes of implemented schemes are necessary. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.
Multisource inverse-geometry CT. Part II. X-ray source design and prototype
Neculaes, V. Bogdan; Caiafa, Antonio; Cao, Yang; De Man, Bruno; Edic, Peter M.; Frutschy, Kristopher; Gunturi, Satish; Inzinna, Lou; Reynolds, Joseph; Vermilyea, Mark; Wagner, David; Zhang, Xi; Zou, Yun; Pelc, Norbert J.; Lounsberry, Brian
2016-01-01
Purpose: This paper summarizes the development of a high-power distributed x-ray source, or “multisource,” designed for inverse-geometry computed tomography (CT) applications [see B. De Man et al., “Multisource inverse-geometry CT. Part I. System concept and development,” Med. Phys. 43, 4607–4616 (2016)]. The paper presents the evolution of the source architecture, component design (anode, emitter, beam optics, control electronics, high voltage insulator), and experimental validation. Methods: Dispenser cathode emitters were chosen as electron sources. A modular design was adopted, with eight electron emitters (two rows of four emitters) per module, wherein tungsten targets were brazed onto copper anode blocks—one anode block per module. A specialized ceramic connector provided high voltage standoff capability and cooling oil flow to the anode. A matrix topology and low-noise electronic controls provided switching of the emitters. Results: Four modules (32 x-ray sources in two rows of 16) have been successfully integrated into a single vacuum vessel and operated on an inverse-geometry computed tomography system. Dispenser cathodes provided high beam current (>1000 mA) in pulse mode, and the electrostatic lenses focused the current beam to a small optical focal spot size (0.5 × 1.4 mm). Controlled emitter grid voltage allowed the beam current to be varied for each source, providing the ability to modulate beam current across the fan of the x-ray beam, denoted as a virtual bowtie filter. The custom designed controls achieved x-ray source switching in <1 μs. The cathode-grounded source was operated successfully up to 120 kV. Conclusions: A high-power, distributed x-ray source for inverse-geometry CT applications was successfully designed, fabricated, and operated. Future embodiments may increase the number of spots and utilize fast read out detectors to increase the x-ray flux magnitude further, while still staying within the stationary target inherent thermal limitations. PMID:27487878
Multisource inverse-geometry CT. Part II. X-ray source design and prototype
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neculaes, V. Bogdan, E-mail: neculaes@ge.com; Caia
2016-08-15
Purpose: This paper summarizes the development of a high-power distributed x-ray source, or “multisource,” designed for inverse-geometry computed tomography (CT) applications [see B. De Man et al., “Multisource inverse-geometry CT. Part I. System concept and development,” Med. Phys. 43, 4607–4616 (2016)]. The paper presents the evolution of the source architecture, component design (anode, emitter, beam optics, control electronics, high voltage insulator), and experimental validation. Methods: Dispenser cathode emitters were chosen as electron sources. A modular design was adopted, with eight electron emitters (two rows of four emitters) per module, wherein tungsten targets were brazed onto copper anode blocks—one anode block per module. A specialized ceramic connector provided high voltage standoff capability and cooling oil flow to the anode. A matrix topology and low-noise electronic controls provided switching of the emitters. Results: Four modules (32 x-ray sources in two rows of 16) have been successfully integrated into a single vacuum vessel and operated on an inverse-geometry computed tomography system. Dispenser cathodes provided high beam current (>1000 mA) in pulse mode, and the electrostatic lenses focused the current beam to a small optical focal spot size (0.5 × 1.4 mm). Controlled emitter grid voltage allowed the beam current to be varied for each source, providing the ability to modulate beam current across the fan of the x-ray beam, denoted as a virtual bowtie filter. The custom designed controls achieved x-ray source switching in <1 μs. The cathode-grounded source was operated successfully up to 120 kV. Conclusions: A high-power, distributed x-ray source for inverse-geometry CT applications was successfully designed, fabricated, and operated. Future embodiments may increase the number of spots and utilize fast readout detectors to increase the x-ray flux magnitude further, while still staying within the stationary target inherent thermal limitations.
van Ruitenbeek, Gemma M C; Zijlstra, Fred R H; Hülsheger, Ute R
2018-06-04
Purpose Participation in regular paid jobs positively affects the mental and physical health of all people, including people with limited work capacities (LWC), that is, people whose work capacity is limited as a consequence of a disability such as chronic mental illness or a psychological or developmental disorder. For successful participation, a good fit is necessary between a person's capacities on the one hand and well-suited individual support and a suitable work environment on the other, so that the demands of work can be met. However, to date there is a striking paucity of validated measures that indicate the work capability of people with LWC and that outline directions for support to facilitate this fit. The goal of the present study was therefore to develop such an instrument. Specifically, we adjusted measures of mental ability, conscientiousness, self-efficacy, and coping by simplifying their language to make the scales accessible to people with low literacy. To validate these adjusted self-report and observer measures we conducted two studies using multi-source, longitudinal data. Method Study 1 was a longitudinal multi-source study in which the newly developed instrument was administered twice to people with LWC and a significant other. We statistically tested the psychometric properties with respect to dimensionality and reliability. In Study 2, we collected new multi-source data and conducted a confirmatory factor analysis (CFA). Results Both studies yielded a congruous factor structure, internally consistent measures with adequate content validity of scales and subscales, and high test-retest reliability. The CFA confirmed the factorial validity of the scales. Conclusion The adjusted self-report and observer scales of mental ability, conscientiousness, self-efficacy, and coping are reliable measures that are well suited to assessing the work capability of people with LWC. Further research is needed to examine criterion-related validity with respect to work demands such as work behaviour and task performance.
Graph-based Data Modeling and Analysis for Data Fusion in Remote Sensing
NASA Astrophysics Data System (ADS)
Fan, Lei
Hyperspectral imaging provides increased sensitivity and discrimination over traditional imaging methods by combining standard digital imaging with spectroscopic methods. For each individual pixel in a hyperspectral image (HSI), a continuous spectrum is sampled as the spectral reflectance/radiance signature to facilitate identification of ground cover and surface material. The abundant spectral knowledge allows all available information in the data to be mined. These qualities give hyperspectral imaging wide applications such as mineral exploration, agriculture monitoring, and ecological surveillance. The processing of massive high-dimensional HSI datasets is a challenge, since many data processing techniques have a computational complexity that grows exponentially with the dimension. Moreover, an HSI dataset may contain a limited number of degrees of freedom due to the high correlations between data points and among the spectra. On the other hand, merely exploiting the sampled spectrum of an individual HSI data point may produce inaccurate results due to the mixed nature of raw HSI data, such as mixed pixels and optical interference. Fusion strategies are widely adopted in data processing to achieve better performance, especially in the fields of classification and clustering. There are mainly three types of fusion strategies, namely low-level data fusion, intermediate-level feature fusion, and high-level decision fusion. Low-level data fusion combines multi-source data that are expected to be complementary or cooperative. Intermediate-level feature fusion aims at the selection and combination of features to remove redundant information. Decision-level fusion exploits a set of classifiers to provide more accurate results. These fusion strategies have wide applications, including HSI data processing. With the fast development of multiple remote sensing modalities, e.g. Very High Resolution (VHR) optical sensors and LiDAR, fusion of multi-source data can in principle produce more detailed information than each single source. Besides the abundant spectral information contained in HSI data, features such as texture and shape may be employed to represent data points from a spatial perspective. Furthermore, feature fusion also includes the strategy of removing redundant and noisy features from the dataset. One of the major problems in machine learning and pattern recognition is to develop appropriate representations for complex nonlinear data. In HSI processing, a particular data point is usually described as a vector whose coordinates correspond to the intensities measured in the spectral bands. This vector representation permits the application of linear and nonlinear transformations from linear algebra to find an alternative representation of the data. More generally, HSI is multi-dimensional in nature and the vector representation may lose contextual correlations. Tensor representation provides a more sophisticated modeling technique and a higher-order generalization of linear subspace analysis. In graph theory, data points can be generalized as nodes with connectivities measured from the proximity of a local neighborhood. The graph-based framework efficiently characterizes the relationships among the data and allows convenient mathematical manipulation in many applications, such as data clustering, feature extraction, feature selection and data alignment.
In this thesis, graph-based approaches to multi-source feature and data fusion in the remote sensing area are explored. We mainly investigate the fusion of spatial, spectral and LiDAR information with linear and multilinear algebra under a graph-based framework for data clustering and classification problems.
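As a concrete, simplified illustration of the graph-based clustering idea described above, the sketch below builds a k-nearest-neighbour graph over pixel spectra and applies spectral clustering to the resulting affinity matrix. It assumes a generic pixels-by-bands array with placeholder data; it is not the thesis's specific fusion pipeline.

# Minimal sketch: k-NN graph over pixel spectra + spectral clustering.
# Placeholder synthetic HSI data; not the thesis's specific fusion pipeline.
import numpy as np
from sklearn.neighbors import kneighbors_graph
from sklearn.cluster import SpectralClustering

rng = np.random.default_rng(0)
hsi = rng.random((500, 100))            # 500 pixels, 100 spectral bands (placeholder)

# Graph construction: each pixel is a node, edges connect spectrally similar pixels.
affinity = kneighbors_graph(hsi, n_neighbors=10, mode="connectivity", include_self=False)
affinity = 0.5 * (affinity + affinity.T)    # symmetrize the adjacency matrix

labels = SpectralClustering(
    n_clusters=5, affinity="precomputed", assign_labels="kmeans", random_state=0
).fit_predict(affinity)
print(labels[:20])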
Automatic vs. manual curation of a multi-source chemical dictionary: the impact on text mining.
Hettne, Kristina M; Williams, Antony J; van Mulligen, Erik M; Kleinjans, Jos; Tkachenko, Valery; Kors, Jan A
2010-03-23
Previously, we developed a combined dictionary dubbed Chemlist for the identification of small molecules and drugs in text based on a number of publicly available databases and tested it on an annotated corpus. To achieve an acceptable recall and precision we used a number of automatic and semi-automatic processing steps together with disambiguation rules. However, it remained to be investigated what impact extensive manual curation of a multi-source chemical dictionary would have on chemical term identification in text. ChemSpider is a chemical database that has undergone extensive manual curation aimed at establishing valid chemical name-to-structure relationships. We acquired the component of ChemSpider containing only manually curated names and synonyms. Rule-based term filtering, semi-automatic manual curation, and disambiguation rules were applied. We tested the dictionary from ChemSpider on an annotated corpus and compared the results with those for the Chemlist dictionary. The ChemSpider dictionary of ca. 80 k names was only a third to a quarter of the size of Chemlist, which contains around 300 k names. The ChemSpider dictionary had a precision of 0.43 and a recall of 0.19 before the application of filtering and disambiguation and a precision of 0.87 and a recall of 0.19 after filtering and disambiguation. The Chemlist dictionary had a precision of 0.20 and a recall of 0.47 before the application of filtering and disambiguation and a precision of 0.67 and a recall of 0.40 after filtering and disambiguation. We conclude the following: (1) The ChemSpider dictionary achieved the best precision but the Chemlist dictionary had a higher recall and the best F-score; (2) Rule-based filtering and disambiguation is necessary to achieve a high precision for both the automatically generated and the manually curated dictionary. ChemSpider is available as a web service at http://www.chemspider.com/ and the Chemlist dictionary is freely available as an XML file in Simple Knowledge Organization System format on the web at http://www.biosemantics.org/chemlist.
Automatic vs. manual curation of a multi-source chemical dictionary: the impact on text mining
2010-01-01
Background Previously, we developed a combined dictionary dubbed Chemlist for the identification of small molecules and drugs in text based on a number of publicly available databases and tested it on an annotated corpus. To achieve an acceptable recall and precision we used a number of automatic and semi-automatic processing steps together with disambiguation rules. However, it remained to be investigated what impact extensive manual curation of a multi-source chemical dictionary would have on chemical term identification in text. ChemSpider is a chemical database that has undergone extensive manual curation aimed at establishing valid chemical name-to-structure relationships. Results We acquired the component of ChemSpider containing only manually curated names and synonyms. Rule-based term filtering, semi-automatic manual curation, and disambiguation rules were applied. We tested the dictionary from ChemSpider on an annotated corpus and compared the results with those for the Chemlist dictionary. The ChemSpider dictionary of ca. 80 k names was only a third to a quarter of the size of Chemlist, which contains around 300 k names. The ChemSpider dictionary had a precision of 0.43 and a recall of 0.19 before the application of filtering and disambiguation and a precision of 0.87 and a recall of 0.19 after filtering and disambiguation. The Chemlist dictionary had a precision of 0.20 and a recall of 0.47 before the application of filtering and disambiguation and a precision of 0.67 and a recall of 0.40 after filtering and disambiguation. Conclusions We conclude the following: (1) The ChemSpider dictionary achieved the best precision but the Chemlist dictionary had a higher recall and the best F-score; (2) Rule-based filtering and disambiguation is necessary to achieve a high precision for both the automatically generated and the manually curated dictionary. ChemSpider is available as a web service at http://www.chemspider.com/ and the Chemlist dictionary is freely available as an XML file in Simple Knowledge Organization System format on the web at http://www.biosemantics.org/chemlist. PMID:20331846
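For reference, the F-scores implied by the reported precision and recall figures follow directly from the harmonic-mean formula; the snippet below is a minimal check of the comparison stated in the conclusions.

# Minimal check: F1 (harmonic mean of precision and recall) for the two
# dictionaries after filtering and disambiguation, using the figures above.
def f1(precision, recall):
    return 2 * precision * recall / (precision + recall)

print("ChemSpider:", round(f1(0.87, 0.19), 2))   # ~0.31
print("Chemlist:  ", round(f1(0.67, 0.40), 2))   # ~0.50, the higher F-score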
NASA Astrophysics Data System (ADS)
Frommholz, D.; Linkiewicz, M.; Poznanska, A. M.
2016-06-01
This paper proposes an in-line method for the simplified reconstruction of city buildings from nadir and oblique aerial images that at the same time are being used for multi-source texture mapping with minimal resampling. Further, the resulting unrectified texture atlases are analyzed for façade elements like windows to be reintegrated into the original 3D models. Tests on real-world data of Heligoland/Germany comprising more than 800 buildings exposed a median positional deviation of 0.31 m at the façades compared to the cadastral map, a correctness of 67% for the detected windows and good visual quality when being rendered with GPU-based perspective correction. As part of the process, building reconstruction takes the oriented input images and transforms them into dense point clouds by semi-global matching (SGM). The point sets undergo local RANSAC-based regression and topology analysis to detect adjacent planar surfaces and determine their semantics. Based on this information, the detected roof, wall and ground surfaces are intersected and limited in their extension to form a closed 3D building hull. For texture mapping, the hull polygons are projected into each possible input bitmap to find suitable color sources regarding coverage and resolution. Occlusions are detected by ray-casting a full-scale digital surface model (DSM) of the scene and stored in pixel-precise visibility maps. These maps are used to derive overlap statistics and radiometric adjustment coefficients to be applied when the visible image parts for each building polygon are copied into a compact texture atlas without resampling whenever possible. The atlas bitmap is passed to a commercial object-based image analysis (OBIA) tool running a custom rule set to identify windows on the contained façade patches. Following multi-resolution segmentation and classification based on brightness and contrast differences, potential window objects are evaluated against geometric constraints and conditionally grown, fused and filtered morphologically. The output polygons are vectorized and reintegrated into the previously reconstructed buildings by sparsely ray-tracing their vertices. Finally, the enhanced 3D models are stored as textured geometry for visualization and semantically annotated "LOD-2.5" CityGML objects for GIS applications.
Mishra, Dheerendra
2015-03-01
Smart card based authentication and key agreement schemes for telecare medicine information systems (TMIS) enable doctors, nurses, patients and health visitors to use smart cards for secure login to medical information systems. In recent years, several authentication and key agreement schemes have been proposed to present secure and efficient solutions for TMIS. Most of the existing authentication schemes for TMIS have either higher computation overhead or are vulnerable to attacks. To reduce the computational overhead and enhance security, Lee recently proposed an authentication and key agreement scheme using chaotic maps for TMIS. Xu et al. also proposed a password based authentication and key agreement scheme for TMIS using elliptic curve cryptography. Both schemes provide better efficiency than conventional public key cryptography based schemes and are important as they present an efficient solution for TMIS. We analyze the security of both Lee's scheme and Xu et al.'s scheme. Unfortunately, we identify that both schemes are vulnerable to denial of service attack. To understand the security failures of these cryptographic schemes, which is key to patching existing schemes and designing future ones, we demonstrate the security loopholes of Lee's scheme and Xu et al.'s scheme in this paper.
Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Xie, Dong; Yang, Yixian
2015-06-01
The Telecare Medicine Information Systems (TMISs) provide an efficient communication platform that allows patients to access health-care delivery services via the internet or mobile networks. Authentication becomes an essential need when a remote patient logs into the telecare server. Recently, many extended chaotic maps based authentication schemes using smart cards for TMISs have been proposed. Li et al. proposed a secure smart cards based authentication scheme for TMISs using extended chaotic maps, based on Lee's and Jiang et al.'s schemes. In this study, we show that Li et al.'s scheme still has some weaknesses, such as violation of session key security, vulnerability to user impersonation attacks and lack of local verification. To overcome these flaws, we propose a chaotic maps and smart cards based password authentication scheme by applying biometrics techniques and hash function operations. Through informal and formal security analyses, we demonstrate that our scheme is resilient against possible known attacks, including the attacks found in Li et al.'s scheme. Compared with previous authentication schemes, the proposed scheme is more secure and efficient and hence more practical for telemedical environments.
Enhanced smartcard-based password-authenticated key agreement using extended chaotic maps.
Lee, Tian-Fu; Hsiao, Chia-Hung; Hwang, Shi-Han; Lin, Tsung-Hung
2017-01-01
A smartcard based password-authenticated key agreement scheme enables a legal user to log in to a remote authentication server and access remote services through public networks using a weak password and a smart card. Lin recently presented an improved chaotic maps-based password-authenticated key agreement scheme that used smartcards to eliminate the weaknesses of the scheme of Guo and Chang, which does not provide strong user anonymity and violates session key security. However, the improved scheme of Lin does not exhibit the freshness property and the validity of messages so it still fails to withstand denial-of-service and privileged-insider attacks. Additionally, a single malicious participant can predetermine the session key such that the improved scheme does not exhibit the contributory property of key agreements. This investigation discusses these weaknesses and proposes an enhanced smartcard-based password-authenticated key agreement scheme that utilizes extended chaotic maps. The session security of this enhanced scheme is based on the extended chaotic map-based Diffie-Hellman problem, and is proven in the real-or-random and the sequence of games models. Moreover, the enhanced scheme ensures the freshness of communicating messages by appending timestamps, and thereby avoids the weaknesses in previous schemes.
Enhanced smartcard-based password-authenticated key agreement using extended chaotic maps
Lee, Tian-Fu; Hsiao, Chia-Hung; Hwang, Shi-Han
2017-01-01
A smartcard based password-authenticated key agreement scheme enables a legal user to log in to a remote authentication server and access remote services through public networks using a weak password and a smart card. Lin recently presented an improved chaotic maps-based password-authenticated key agreement scheme that used smartcards to eliminate the weaknesses of the scheme of Guo and Chang, which does not provide strong user anonymity and violates session key security. However, the improved scheme of Lin does not exhibit the freshness property and the validity of messages so it still fails to withstand denial-of-service and privileged-insider attacks. Additionally, a single malicious participant can predetermine the session key such that the improved scheme does not exhibit the contributory property of key agreements. This investigation discusses these weaknesses and proposes an enhanced smartcard-based password-authenticated key agreement scheme that utilizes extended chaotic maps. The session security of this enhanced scheme is based on the extended chaotic map-based Diffie-Hellman problem, and is proven in the real-or-random and the sequence of games models. Moreover, the enhanced scheme ensures the freshness of communicating messages by appending timestamps, and thereby avoids the weaknesses in previous schemes. PMID:28759615
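The extended chaotic maps used in such schemes rely on the semigroup property of Chebyshev polynomials, T_a(T_b(x)) = T_b(T_a(x)) = T_ab(x), which yields a Diffie-Hellman-style exchange. The toy sketch below only illustrates this commutativity over the reals; actual schemes operate over a large prime field with further protections, and this is not the authors' protocol.

# Toy illustration of the Chebyshev-polynomial semigroup property behind
# extended chaotic-map key agreement.  Conceptual only; real schemes work
# over a finite field and add authentication, timestamps, etc.
import math

def chebyshev(n, x):
    # T_n(x) = cos(n * arccos(x)) for x in [-1, 1]
    return math.cos(n * math.acos(x))

x = 0.53          # public seed value
a, b = 7, 11      # private exponents of the two parties

A = chebyshev(a, x)            # party 1 publishes T_a(x)
B = chebyshev(b, x)            # party 2 publishes T_b(x)
k1 = chebyshev(a, B)           # party 1 computes T_a(T_b(x))
k2 = chebyshev(b, A)           # party 2 computes T_b(T_a(x))
print(abs(k1 - k2) < 1e-9)     # True: both share T_{ab}(x)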
Wang, Chengqi; Zhang, Xiao; Zheng, Zhiming
2016-01-01
With the growing security requirements of networks, biometrics-based authentication schemes applied in multi-server environments have become more crucial and more widely deployed. In this paper, we propose a novel biometric-based multi-server authentication and key agreement scheme built on the cryptanalysis of Mishra et al.'s scheme. Informal and formal security analyses of our scheme are given, demonstrating that it satisfies the desirable security requirements. The presented scheme provides a variety of significant functionalities, including features not considered in most existing authentication schemes, such as user revocation or re-registration and biometric information protection. Compared with several related schemes, our scheme has more security properties and lower computation cost, making it more appropriate for practical applications in remote distributed networks. PMID:26866606
Potential effects of sulfur pollutants on grape production in New York State
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knudson, D.A.; Viessman, S.
1983-01-01
This paper presents the results of a prototype analysis of the effects of sulfur pollutants on grape production in New York State. Principal grape production areas for the state are defined and predictions of sulfur dioxide concentrations associated with present and projected sources are computed. Sulfur dioxide concentrations are based on the results of a multi-source dispersion model, whereas concentrations for other pollutants are derived from observations. This information is used in conjunction with results from experiments conducted to identify threshold levels of damage and/or injury to a variety of grape species exposed to pollutants. A determination is then made whether the subject crop is at risk from present and projected concentrations of pollutants.
Manzo, C; Mei, A; Zampetti, E; Bassani, C; Paciucci, L; Manetti, P
2017-04-15
This paper describes a methodology to perform chemical analyses in landfill areas by integrating multisource geomatic data. We used a top-down approach to identify Environmental Point of Interest (EPI) based on very high-resolution satellite data (Pleiades and WorldView 2) and on in situ thermal and photogrammetric surveys. Change detection techniques and geostatistical analysis supported the chemical survey, undertaken using an accumulation chamber and an RIIA, an unmanned ground vehicle developed by CNR IIA, equipped with a multiparameter sensor platform for environmental monitoring. Such an approach improves site characterization, identifying the key environmental points of interest where it is necessary to perform detailed chemical analyses. Copyright © 2017 Elsevier B.V. All rights reserved.
A parametric method for determining the number of signals in narrow-band direction finding
NASA Astrophysics Data System (ADS)
Wu, Qiang; Fuhrmann, Daniel R.
1991-08-01
A novel and more accurate method to determine the number of signals in the multisource direction finding problem is developed. The information-theoretic criteria of Yin and Krishnaiah (1988) are applied to a set of quantities which are evaluated from the log-likelihood function. Based on proven asymptotic properties of the maximum likelihood estimation, these quantities have the properties required by the criteria. Since the information-theoretic criteria use these quantities instead of the eigenvalues of the estimated correlation matrix, this approach possesses the advantage of not requiring a subjective threshold, and also provides higher performance than when eigenvalues are used. Simulation results are presented and compared to those obtained from the nonparametric method given by Wax and Kailath (1985).
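For context, the non-parametric baseline of Wax and Kailath selects the model order that minimizes an information-theoretic criterion over the eigenvalues of the sample correlation matrix. The sketch below shows the classic eigenvalue-based MDL variant of that baseline; it is an assumed illustration of the reference method, not the parametric, log-likelihood-based approach proposed in the paper.

# Classic eigenvalue-based MDL criterion (Wax & Kailath), shown as the
# baseline; the paper's method replaces eigenvalues with log-likelihood
# derived quantities.
import numpy as np

def mdl_num_sources(R, n_snapshots):
    """Estimate the number of sources from a p x p sample correlation matrix R."""
    eigvals = np.sort(np.linalg.eigvalsh(R))[::-1]      # descending eigenvalues
    p = len(eigvals)
    scores = []
    for k in range(p):                                   # candidate number of sources
        tail = eigvals[k:]
        geo = np.exp(np.mean(np.log(tail)))              # geometric mean of noise eigenvalues
        arith = np.mean(tail)                            # arithmetic mean of noise eigenvalues
        loglik = -n_snapshots * (p - k) * np.log(geo / arith)
        penalty = 0.5 * k * (2 * p - k) * np.log(n_snapshots)
        scores.append(loglik + penalty)
    return int(np.argmin(scores))

# Example: 3 sources impinging on an 8-element array, 200 snapshots.
rng = np.random.default_rng(0)
p, n, n_src = 8, 200, 3
A = rng.normal(size=(p, n_src))
X = A @ rng.normal(size=(n_src, n)) + 0.1 * rng.normal(size=(p, n))
print(mdl_num_sources(X @ X.T / n, n))   # expected to print 3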
NASA Astrophysics Data System (ADS)
Tuia, Devis; Marcos, Diego; Camps-Valls, Gustau
2016-10-01
Remote sensing image classification exploiting multiple sensors is a very challenging problem: data from different modalities are affected by spectral distortions and mis-alignments of all kinds, and this hampers re-using models built for one image successfully in other scenes. In order to adapt and transfer models across image acquisitions, one must be able to cope with datasets that are not co-registered, acquired under different illumination and atmospheric conditions, by different sensors, and with scarce ground references. Traditionally, methods based on histogram matching have been used. However, they fail when densities have very different shapes or when there is no corresponding band to be matched between the images. An alternative builds upon manifold alignment. Manifold alignment performs a multidimensional relative normalization of the data prior to product generation that can cope with data of different dimensionality (e.g. different numbers of bands) and possibly unpaired examples. Aligning data distributions is an appealing strategy, since it provides data spaces that are more similar to each other, regardless of the subsequent use of the transformed data. In this paper, we study a methodology that aligns data from different domains in a nonlinear way through kernelization. We introduce the Kernel Manifold Alignment (KEMA) method, which provides a flexible and discriminative projection map, exploits only a few labeled samples (or semantic ties) in each domain, and reduces to solving a generalized eigenvalue problem. We successfully test KEMA in multi-temporal and multi-source very high resolution classification tasks, as well as on the task of making a model invariant to shadowing for hyperspectral imaging.
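Since the method reduces to a generalized eigenvalue problem, its core numerical step can be sketched as below with placeholder symmetric matrices A and B; constructing those matrices from the per-domain kernels and semantic ties is the paper's actual contribution and is not reproduced here.

# Core numerical step such methods reduce to: the generalized eigenvalue
# problem A v = w B v.  A and B are placeholders here, not the KEMA matrices.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
M = rng.random((50, 50))
A = M @ M.T + 50 * np.eye(50)        # symmetric positive definite placeholder
K = rng.random((50, 50))
B = K @ K.T + 50 * np.eye(50)        # symmetric positive definite placeholder

w, V = eigh(A, B)                    # solves A v = w B v
projection = V[:, -10:]              # eigenvectors of the 10 largest eigenvalues
print(w[-10:])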
Al Ansari, Ahmed; Al Khalifa, Khalid; Al Azzawi, Mohamed; Al Amer, Rashed; Al Sharqi, Dana; Al-Mansoor, Anwar; Munshi, Fadi M
2015-01-01
Background We aimed to design, implement, and evaluate the feasibility and reliability of a multisource feedback (MSF) system to assess interns in their clerkship year in the Middle Eastern culture, the Kingdom of Bahrain. Method The study was undertaken in the Bahrain Defense Force Hospital, a military teaching hospital in the Kingdom of Bahrain. A total of 21 interns (who represent the total population of the interns for the given year) were assessed in this study. All of the interns were rotating through our hospital during their year-long clerkship rotation. The study sample consisted of nine males and 12 females. Each participating intern was evaluated by three groups of raters: eight medical intern colleagues, eight senior medical colleagues, and eight coworkers from different departments. Results A total of 21 interns (nine males and 12 females) were assessed in this study. The total mean response rate was 62.3%. A factor analysis found that the questionnaire data grouped into three factors that accounted for 76.4% of the total variance. These three factors were labeled as professionalism, collaboration, and communication. Reliability analysis indicated that the full instrument scale had high internal consistency (Cronbach’s α 0.98). The generalizability coefficients for the surveys were estimated to be 0.78. Conclusion Based on our results and analysis, we conclude that the MSF tool we used on the interns rotating in their clerkship year within our Middle Eastern culture provides an effective method of evaluation because it offers a reliable, valid, and feasible process. PMID:26316836
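Cronbach's alpha, the internal-consistency statistic reported above, is computed directly from item-level ratings; the minimal sketch below uses a placeholder ratings matrix (rows are respondents, columns are questionnaire items), not the study's actual data.

# Minimal Cronbach's alpha computation on a placeholder ratings matrix.
import numpy as np

def cronbach_alpha(ratings):
    ratings = np.asarray(ratings, dtype=float)
    k = ratings.shape[1]                                  # number of items
    item_vars = ratings.var(axis=0, ddof=1).sum()         # sum of item variances
    total_var = ratings.sum(axis=1).var(ddof=1)           # variance of total scores
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(1)
base = rng.normal(size=(40, 1))
ratings = base + 0.5 * rng.normal(size=(40, 10))          # 10 correlated items
print(round(cronbach_alpha(ratings), 2))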
NASA Astrophysics Data System (ADS)
Liu, Wenbin; Sun, Fubao; Li, Yanzhong; Zhang, Guoqing; Sang, Yan-Fang; Lim, Wee Ho; Liu, Jiahong; Wang, Hong; Bai, Peng
2018-01-01
The dynamics of basin-scale water budgets over the Tibetan Plateau (TP) are not well understood due to the lack of in situ hydro-climatic observations. In this study, we investigate the seasonal cycles and trends of water budget components (e.g. precipitation P, evapotranspiration ET and runoff Q) in 18 TP river basins during the period 1982-2011 through the use of multi-source datasets (e.g. in situ observations, satellite retrievals, reanalysis outputs and land surface model simulations). A water balance-based two-step procedure, which considers the changes in basin-scale water storage on the annual scale, is also adopted to calculate actual ET. The results indicated that precipitation, which is mainly concentrated during June-October (the timing varying among basins affected by different monsoons) and falls mainly as snow from mid-autumn to the next spring, was the major contributor to runoff in the TP basins. P, ET and Q were found to increase marginally in most TP basins during the past 30 years, except for the upper Yellow River basin and some sub-basins of the Yalong River, which were mainly affected by the weakening East Asian monsoon. Moreover, the aridity index (PET/P) and runoff coefficient (Q/P) decreased slightly in most basins, in agreement with the warming and moistening climate of the Tibetan Plateau. The results demonstrated the usefulness of integrating multi-source datasets for hydrological applications in data-sparse regions. More generally, such an approach might offer helpful insights into understanding the water and energy budgets and the sustainability of water resource management practices of data-sparse regions in a changing environment.
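The water-balance step described above amounts to closing the annual budget P = ET + Q + dS; a minimal sketch with hypothetical annual values (not the study's datasets) is given below.

# Annual water-balance closure: ET = P - Q - dS, where dS is the change in
# basin water storage.  Values are hypothetical, for illustration only.
import numpy as np

P  = np.array([480.0, 455.0, 510.0])    # annual precipitation, mm
Q  = np.array([210.0, 195.0, 230.0])    # annual runoff, mm
dS = np.array([  5.0,  -8.0,  12.0])    # annual change in water storage, mm

ET = P - Q - dS                          # actual evapotranspiration, mm
runoff_coeff = Q / P                     # Q/P, the runoff coefficient discussed above
print(ET, runoff_coeff.round(2))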
Validation of multisource electronic health record data: an application to blood transfusion data.
Hoeven, Loan R van; Bruijne, Martine C de; Kemper, Peter F; Koopman, Maria M W; Rondeel, Jan M M; Leyte, Anja; Koffijberg, Hendrik; Janssen, Mart P; Roes, Kit C B
2017-07-14
Although data from electronic health records (EHR) are often used for research purposes, systematic validation of these data prior to their use is not standard practice. Existing validation frameworks discuss validity concepts without translating these into practical implementation steps or addressing the potential influence of linking multiple sources. Therefore we developed a practical approach for validating routinely collected data from multiple sources and applied it to a blood transfusion data warehouse to evaluate its usability in practice. The approach consists of identifying existing validation frameworks for EHR data or linked data, selecting validity concepts from these frameworks and establishing quantifiable validity outcomes for each concept. The approach distinguishes external validation concepts (e.g. concordance with external reports, previous literature and expert feedback) and internal consistency concepts which use expected associations within the dataset itself (e.g. completeness, uniformity and plausibility). In an example case, the selected concepts were applied to a transfusion dataset and specified in more detail. Application of the approach to a transfusion dataset resulted in a structured overview of data validity aspects. This allowed improvement of these aspects through further processing of the data and in some cases adjustment of the data extraction. For example, the proportion of transfused products that could not be linked to the corresponding issued products was initially 2.2%, but this was reduced to 0.17% by adjusting the data extraction criteria. This stepwise approach for validating linked multisource data provides a basis for evaluating data quality and enhancing interpretation. When the process of data validation is adopted more broadly, this contributes to increased transparency and greater reliability of research based on routinely collected electronic health records.
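The internal-consistency check quoted above (the share of transfused products that cannot be linked to an issued product) is straightforward to quantify once the two sources are joined on a product identifier; the sketch below uses hypothetical column names, not the data warehouse's actual schema.

# Sketch of one internal-consistency check: the proportion of transfused
# products with no matching issued product.  Column names are hypothetical.
import pandas as pd

issued = pd.DataFrame({"product_id": ["A1", "A2", "A3", "A4"]})
transfused = pd.DataFrame({"product_id": ["A1", "A2", "A5"]})

merged = transfused.merge(issued, on="product_id", how="left", indicator=True)
unlinked = (merged["_merge"] == "left_only").mean()
print(f"proportion of transfused products not linkable: {unlinked:.2%}")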
An efficient and provable secure revocable identity-based encryption scheme.
Wang, Changji; Li, Yuan; Xia, Xiaonan; Zheng, Kangjia
2014-01-01
Revocation functionality is necessary and crucial to identity-based cryptosystems. Revocable identity-based encryption (RIBE) has attracted a lot of attention in recent years; many RIBE schemes have been proposed in the literature but have been shown to be either insecure or inefficient. In this paper, we propose a new scalable RIBE scheme with decryption key exposure resilience by combining Lewko and Waters' identity-based encryption scheme with the complete subtree method, and prove our RIBE scheme to be semantically secure using the dual system encryption methodology. Compared to existing scalable and semantically secure RIBE schemes, our proposed RIBE scheme is more efficient in terms of ciphertext size, public parameter size and decryption cost, at the price of a slightly looser security reduction. To the best of our knowledge, this is the first construction of a scalable and semantically secure RIBE scheme with constant-size public system parameters.
Wang, Shangping; Zhang, Xiaoxue; Zhang, Yaling
2016-01-01
Ciphertext-policy attribute-based encryption (CP-ABE) focuses on the problem of access control, while keyword-based searchable encryption focuses on quickly finding the files a user is interested in within cloud storage. Designing a searchable attribute-based encryption scheme is a new challenge. In this paper, we propose an efficient multi-user searchable attribute-based encryption scheme with attribute revocation and grant for cloud storage. In the new scheme, the attribute revocation and grant processes of users are delegated to a proxy server, and multiple attributes can be revoked and granted simultaneously. Moreover, the keyword search function is achieved in our proposed scheme. The security of our proposed scheme is reduced to the bilinear Diffie-Hellman (BDH) assumption. Furthermore, the scheme is proven secure under the security model of indistinguishability against selective ciphertext-policy and chosen plaintext attack (IND-sCP-CPA), and it is also semantically secure under indistinguishability against chosen keyword attack (IND-CKA) in the random oracle model. PMID:27898703
Wang, Shangping; Zhang, Xiaoxue; Zhang, Yaling
2016-01-01
Ciphertext-policy attribute-based encryption (CP-ABE) focuses on the problem of access control, while keyword-based searchable encryption focuses on quickly finding the files a user is interested in within cloud storage. Designing a searchable attribute-based encryption scheme is a new challenge. In this paper, we propose an efficient multi-user searchable attribute-based encryption scheme with attribute revocation and grant for cloud storage. In the new scheme, the attribute revocation and grant processes of users are delegated to a proxy server, and multiple attributes can be revoked and granted simultaneously. Moreover, the keyword search function is achieved in our proposed scheme. The security of our proposed scheme is reduced to the bilinear Diffie-Hellman (BDH) assumption. Furthermore, the scheme is proven secure under the security model of indistinguishability against selective ciphertext-policy and chosen plaintext attack (IND-sCP-CPA), and it is also semantically secure under indistinguishability against chosen keyword attack (IND-CKA) in the random oracle model.
A provably-secure ECC-based authentication scheme for wireless sensor networks.
Nam, Junghyun; Kim, Moonseong; Paik, Juryon; Lee, Youngsook; Won, Dongho
2014-11-06
A smart-card-based user authentication scheme for wireless sensor networks (in short, a SUA-WSN scheme) is designed to restrict access to the sensor data only to users who are in possession of both a smart card and the corresponding password. While a significant number of SUA-WSN schemes have been suggested in recent years, their intended security properties lack formal definitions and proofs in a widely-accepted model. One consequence is that SUA-WSN schemes insecure against various attacks have proliferated. In this paper, we devise a security model for the analysis of SUA-WSN schemes by extending the widely-accepted model of Bellare, Pointcheval and Rogaway (2000). Our model provides formal definitions of authenticated key exchange and user anonymity while capturing side-channel attacks, as well as other common attacks. We also propose a new SUA-WSN scheme based on elliptic curve cryptography (ECC), and prove its security properties in our extended model. To the best of our knowledge, our proposed scheme is the first SUA-WSN scheme that provably achieves both authenticated key exchange and user anonymity. Our scheme is also computationally competitive with other ECC-based (non-provably secure) schemes.
A Provably-Secure ECC-Based Authentication Scheme for Wireless Sensor Networks
Nam, Junghyun; Kim, Moonseong; Paik, Juryon; Lee, Youngsook; Won, Dongho
2014-01-01
A smart-card-based user authentication scheme for wireless sensor networks (in short, a SUA-WSN scheme) is designed to restrict access to the sensor data only to users who are in possession of both a smart card and the corresponding password. While a significant number of SUA-WSN schemes have been suggested in recent years, their intended security properties lack formal definitions and proofs in a widely-accepted model. One consequence is that SUA-WSN schemes insecure against various attacks have proliferated. In this paper, we devise a security model for the analysis of SUA-WSN schemes by extending the widely-accepted model of Bellare, Pointcheval and Rogaway (2000). Our model provides formal definitions of authenticated key exchange and user anonymity while capturing side-channel attacks, as well as other common attacks. We also propose a new SUA-WSN scheme based on elliptic curve cryptography (ECC), and prove its security properties in our extended model. To the best of our knowledge, our proposed scheme is the first SUA-WSN scheme that provably achieves both authenticated key exchange and user anonymity. Our scheme is also computationally competitive with other ECC-based (non-provably secure) schemes. PMID:25384009
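The elliptic-curve primitive underlying such ECC-based authenticated key exchange can be illustrated with a plain ECDH exchange plus key derivation; this is a generic sketch using the pyca/cryptography library, not the SUA-WSN protocol itself, which additionally involves the password, smart card and gateway-node interactions.

# Generic ECDH key exchange with key derivation, illustrating the ECC
# building block only.  NOT the SUA-WSN protocol: the actual scheme also
# involves the password, smart card and gateway node.
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives import hashes

user_priv = ec.generate_private_key(ec.SECP256R1())      # user's ephemeral key
sensor_priv = ec.generate_private_key(ec.SECP256R1())    # sensor node's ephemeral key

# Each side combines its private key with the other's public key.
shared_user = user_priv.exchange(ec.ECDH(), sensor_priv.public_key())
shared_sensor = sensor_priv.exchange(ec.ECDH(), user_priv.public_key())
assert shared_user == shared_sensor

session_key = HKDF(algorithm=hashes.SHA256(), length=32,
                   salt=None, info=b"demo session").derive(shared_user)
print(session_key.hex())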
A soft-hard combination-based cooperative spectrum sensing scheme for cognitive radio networks.
Do, Nhu Tri; An, Beongku
2015-02-13
In this paper we propose a soft-hard combination scheme, called the SHC scheme, for cooperative spectrum sensing in cognitive radio networks. The SHC scheme deploys a cluster-based network in which Likelihood Ratio Test (LRT)-based soft combination is applied at each cluster, and weighted decision fusion rule-based hard combination is utilized at the fusion center. The novelties of the SHC scheme are as follows: the structure of the SHC scheme reduces the complexity of cooperative detection, which is an inherent limitation of soft combination schemes. By using the LRT, we can detect primary signals in a low signal-to-noise ratio regime (around an average of -15 dB). In addition, the computational complexity of the LRT is reduced since we derive a closed-form expression for the probability density function of the LRT value. The SHC scheme also takes into account the different effects of large-scale fading on different users in the wide-area network. The simulation results show that the SHC scheme not only provides better sensing performance than conventional hard combination schemes, but also reduces the sensing overhead in terms of reporting time compared to the conventional soft combination scheme using the LRT.
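A toy numerical sketch of the two-stage idea (soft combination inside each cluster, hard fusion across clusters) is given below; it uses simplified energy statistics under Gaussian assumptions and is not the exact LRT or weighted fusion rule derived in the paper.

# Toy two-stage fusion: each cluster soft-combines its members' statistics and
# makes a local decision; the fusion center hard-combines the cluster votes.
# Simplified Gaussian energy statistics; not the paper's exact LRT.
import numpy as np

rng = np.random.default_rng(0)
n_clusters, users_per_cluster, n_samples = 4, 5, 5000
snr_linear = 10 ** (-15 / 10)                      # about -15 dB average SNR

signal = np.sqrt(snr_linear) * rng.normal(size=n_samples)   # primary user present
decisions = []
for _ in range(n_clusters):
    # Soft combination: average the members' normalized received energies.
    stats = []
    for _ in range(users_per_cluster):
        received = signal + rng.normal(size=n_samples)       # unit-variance noise
        stats.append(np.sum(received ** 2) / n_samples)
    cluster_stat = np.mean(stats)
    decisions.append(cluster_stat > 1 + 0.5 * snr_linear)    # heuristic local threshold

# Hard combination at the fusion center: simple majority vote over clusters.
print("primary user detected:", sum(decisions) > n_clusters / 2)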
A Novel Passive Tracking Scheme Exploiting Geometric and Intercept Theorems
Zhou, Biao; Sun, Chao; Ahn, Deockhyeon; Kim, Youngok
2018-01-01
Passive tracking aims to track targets without assistant devices, that is, device-free targets. Passive tracking based on Radio Frequency (RF) Tomography in wireless sensor networks has recently been addressed as an emerging field. The passive tracking scheme using geometric theorems (GTs) is one of the most popular RF Tomography schemes, because the GT-based method can effectively mitigate the demand for a high density of wireless nodes. In the GT-based tracking scheme, the tracking scenario is considered as a two-dimensional geometric topology and then geometric theorems are applied to estimate crossing points (CPs) of the device-free target on line-of-sight links (LOSLs), which reveal the target’s trajectory information in a discrete form. In this paper, we review existing GT-based tracking schemes, and then propose a novel passive tracking scheme by exploiting the Intercept Theorem (IT). To create an IT-based CP estimation scheme available in the noisy non-parallel LOSL situation, we develop the equal-ratio traverse (ERT) method. Finally, we analyze properties of three GT-based tracking algorithms and the performance of these schemes is evaluated experimentally under various trajectories, node densities, and noisy topologies. Analysis of experimental results shows that tracking schemes exploiting geometric theorems can achieve remarkable positioning accuracy even under rather a low density of wireless nodes. Moreover, the proposed IT scheme can provide generally finer tracking accuracy under even lower node density and noisier topologies, in comparison to other schemes. PMID:29562621
NASA Astrophysics Data System (ADS)
Gao, Zhiqiang; Xu, Fuxiang; Song, Debin; Zheng, Xiangyu; Chen, Maosi
2017-09-01
This paper conducted dynamic monitoring of the green tide (the large green alga Ulva prolifera) that occurred in the Yellow Sea from 2014 to 2016, using multi-source remote sensing data, including GF-1 WFV, HJ-1A/1B CCD, CBERS-04 WFI, Landsat-7 ETM+ and Landsat-8 OLI, and combining VB-FAH (index of Virtual-Baseline Floating macroAlgae Height) with manually assisted interpretation based on remote sensing and geographic information system technologies. The results show that unmanned aerial vehicle (UAV) and shipborne platforms could accurately monitor the distribution of Ulva prolifera in small areas, and therefore provide validation data for the remote sensing monitoring of Ulva prolifera. The results of this research can provide effective information support for the prevention and control of Ulva prolifera.
Flow experience in teams: The role of shared leadership.
Aubé, Caroline; Rousseau, Vincent; Brunelle, Eric
2018-04-01
The present study tests a multilevel mediation model concerning the effect of shared leadership on team members' flow experience. Specifically, we investigate the mediating role of teamwork behaviors in the relationships between 2 complementary indicators of shared leadership (i.e., density and centralization) and flow. Based on a multisource approach, we collected data through observation and survey of 111 project teams (521 individuals) made up of university students participating in a project management simulation. The results show that density and centralization have both an additive effect and an interaction effect on teamwork behaviors, such that the relationship between density and teamwork behaviors is stronger when centralization is low. In addition, teamwork behaviors play a mediating role in the relationship between shared leadership and flow. Overall, the findings highlight the importance of promoting team-based shared leadership in organizations to favor the flow experience. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
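The two network indicators used here, density and centralization, can be computed from a team's leadership-nomination network; the minimal sketch below uses a hypothetical five-member team and Freeman degree centralization, which may differ from the exact operationalization in the study.

# Density and Freeman degree centralization for a hypothetical 5-member team's
# leadership-nomination network.  Assumed illustration only; the study's exact
# operationalization may differ.
import networkx as nx

G = nx.Graph()
G.add_nodes_from(range(5))
G.add_edges_from([(0, 1), (0, 2), (0, 3), (1, 2)])    # who nominates whom as leading

density = nx.density(G)                                # share of possible ties present
degrees = [d for _, d in G.degree()]
n = G.number_of_nodes()
centralization = sum(max(degrees) - d for d in degrees) / ((n - 1) * (n - 2))
print(round(density, 2), round(centralization, 2))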
Disaster Emergency Rapid Assessment Based on Remote Sensing and Background Data
NASA Astrophysics Data System (ADS)
Han, X.; Wu, J.
2018-04-01
The period from onset to stable conditions is an important stage of disaster development. In addition to collecting and reporting information on the disaster situation, remote sensing images from satellites and drones and monitoring results from the disaster-stricken areas should be obtained. Fusion of multi-source background data, such as population, geography and topography, with remote sensing monitoring information can be used in geographic information system analysis to quickly and objectively assess the disaster. According to the characteristics of different hazards, models and methods driven by the requirements of rapid assessment missions are tested and screened. Based on remote sensing images, the features of exposed elements are used to quickly determine disaster-affected areas and intensity levels, to extract key disaster information about affected hospitals and schools as well as cultivated land and crops, and to support decision making after the emergency response with visual assessment results.
How to retrieve additional information from the multiplicity distributions
NASA Astrophysics Data System (ADS)
Wilk, Grzegorz; Włodarczyk, Zbigniew
2017-01-01
Multiplicity distributions (MDs) P(N) measured in multiparticle production processes are most frequently described by the negative binomial distribution (NBD). However, with increasing collision energy some systematic discrepancies have become more and more apparent. They are usually attributed to the possible multi-source structure of the production process and described using a multi-NBD form of the MD. We investigate the possibility of keeping a single NBD but with its parameters depending on the multiplicity N. This is done by modifying the widely known clan model of particle production leading to the NBD form of P(N). This is then confronted with the approach based on the so-called cascade-stochastic formalism which is based on different types of recurrence relations defining P(N). We demonstrate that a combination of both approaches allows the retrieval of additional valuable information from the MDs, namely the oscillatory behavior of the counting statistics apparently visible in the high energy data.
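For reference, the NBD form of P(N) mentioned above, parameterized by the mean multiplicity and the shape parameter k, can be evaluated with standard libraries; the minimal sketch below uses generic parameter values, not a fit to any data set.

# NBD multiplicity distribution P(N) with mean <N> and shape parameter k,
# via scipy's negative binomial (success probability p = k / (k + <N>)).
# Parameter values are generic, not fitted to any data set.
import numpy as np
from scipy.stats import nbinom

mean_N, k = 25.0, 2.0
p = k / (k + mean_N)
N = np.arange(0, 300)
P = nbinom.pmf(N, k, p)
print(P.sum().round(4), (N * P).sum().round(2))   # ~1.0 and ~25.0 (mean check)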
Study on Data Clustering and Intelligent Decision Algorithm of Indoor Localization
NASA Astrophysics Data System (ADS)
Liu, Zexi
2018-01-01
Indoor positioning technology gives human beings positional perception in architectural space, but single-network coverage is limited and location data are highly redundant. This article therefore investigates data clustering and intelligent decision making for indoor positioning: it sets out the basic ideas of multi-source indoor positioning technology and analyzes fingerprint localization based on distance measurement, integrated with position and orientation from inertial devices. By optimizing the clustering of massive indoor location data, data normalization pretreatment, multi-dimensional controllable clustering centers, and multi-factor clustering are realized, reducing the redundancy of the location data. In addition, path inference and decision making based on a neural network are proposed, with a sparse-data input layer, a dynamic feedback hidden layer, and an output layer; the low-dimensional results improve intelligent navigation path planning.
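A minimal sketch of the fingerprint localization step discussed above, weighted k-nearest neighbours in received-signal-strength space over a tiny synthetic radio map, is shown below; it is illustrative only and does not include the clustering or neural-network components proposed in the paper.

# Minimal fingerprint localization: weighted k-NN in RSSI space over a tiny
# synthetic radio map.  Illustrative only; not the paper's clustering or
# neural-network components.
import numpy as np

radio_map_rssi = np.array([   # rows = reference points, columns = access points (dBm)
    [-40, -70, -80],
    [-55, -60, -75],
    [-70, -50, -65],
    [-80, -45, -55],
])
radio_map_xy = np.array([[0, 0], [0, 5], [5, 5], [5, 10]], dtype=float)

def locate(rssi_sample, k=2):
    dists = np.linalg.norm(radio_map_rssi - rssi_sample, axis=1)
    nearest = np.argsort(dists)[:k]
    weights = 1.0 / (dists[nearest] + 1e-6)              # inverse-distance weighting
    return (radio_map_xy[nearest] * weights[:, None]).sum(axis=0) / weights.sum()

print(locate(np.array([-52, -62, -74])))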
Long-Term Stability of Core Language Skill in Children with Contrasting Language Skills
ERIC Educational Resources Information Center
Bornstein, Marc H.; Hahn, Chun-Shin; Putnick, Diane L.
2016-01-01
This 4-wave longitudinal study evaluated stability of core language skill in 421 European American and African American children, half of whom were identified as low (n = 201) and half of whom were average-to-high (n = 220) in later language skill. Structural equation modeling supported loadings of multivariate age-appropriate multisource measures…
Stability of Core Language Skill from Early Childhood to Adolescence: A Latent Variable Approach
ERIC Educational Resources Information Center
Bornstein, Marc H.; Hahn, Chun-Shin; Putnick, Diane L.; Suwalsky, Joan T. D.
2014-01-01
This four-wave prospective longitudinal study evaluated stability of language in 324 children from early childhood to adolescence. Structural equation modeling supported loadings of multiple age-appropriate multisource measures of child language on single-factor core language skills at 20 months and 4, 10, and 14 years. Large stability…
ERIC Educational Resources Information Center
Liao, Hui; Chuang, Aichia
2007-01-01
This longitudinal field study integrates the theories of transformational leadership (TFL) and relationship marketing to examine how TFL influences employee service performance and customer relationship outcomes by transforming both (at the micro level) the service employees' attitudes and (at the macro level) the work unit's service climate.…
ERIC Educational Resources Information Center
Coplan, Robert J.; Arbeau, Kimberley A.; Armer, Mandana
2008-01-01
The goal of this study was to explore the moderating role of maternal personality and parenting characteristics in the links between shyness and adjustment in kindergarten. Participants were 197 children enrolled in kindergarten programs (and their mothers and teachers). Multisource assessment was employed, including maternal ratings, behavioral…
Influence of Feedback, Teacher Praise, and Parental Support on Self-Competency of Third Graders.
ERIC Educational Resources Information Center
Davis, Lonnie H.; And Others
The purpose of this study was to demonstrate how an early assessment of self-competency can be combined with an effective program for preventing maladaptive affective (self-competency) and academic skills. Eleven third graders participated in this study of three interventions. Feedback of multisource data, teacher praise (positive reinforcement),…
USDA-ARS?s Scientific Manuscript database
The overall objectives of this study were to determine if a correlation exists between individual pharmacokinetic parameters and treatment outcome when feeder cattle were diagnosed with bovine respiratory disease (BRD) and treated with gamithromycin (Zactran®) at the label dose, and if there was a s...
ERIC Educational Resources Information Center
Coplan, Robert J.; Weeks, Murray
2010-01-01
The goal of this study was to explore the socioemotional adjustment of unsociable (versus shy) children in middle childhood. The participants in this study were 186 children aged 6-8 years (M[subscript age] = 7.59 years, SD = 0.31). Multisource assessment was employed, including maternal ratings, teacher ratings, and individual child interviews.…
An Improved Biometrics-Based Remote User Authentication Scheme with User Anonymity
Kumari, Saru
2013-01-01
The authors review the biometrics-based user authentication scheme proposed by An in 2012. The authors show that there exist loopholes in the scheme which are detrimental for its security. Therefore the authors propose an improved scheme eradicating the flaws of An's scheme. Then a detailed security analysis of the proposed scheme is presented followed by its efficiency comparison. The proposed scheme not only withstands security problems found in An's scheme but also provides some extra features with mere addition of only two hash operations. The proposed scheme allows user to freely change his password and also provides user anonymity with untraceability. PMID:24350272
An improved biometrics-based remote user authentication scheme with user anonymity.
Khan, Muhammad Khurram; Kumari, Saru
2013-01-01
The authors review the biometrics-based user authentication scheme proposed by An in 2012. The authors show that there exist loopholes in the scheme which are detrimental for its security. Therefore the authors propose an improved scheme eradicating the flaws of An's scheme. Then a detailed security analysis of the proposed scheme is presented followed by its efficiency comparison. The proposed scheme not only withstands security problems found in An's scheme but also provides some extra features with mere addition of only two hash operations. The proposed scheme allows user to freely change his password and also provides user anonymity with untraceability.
Provably secure identity-based identification and signature schemes from code assumptions
Zhao, Yiming
2017-01-01
Code-based cryptography is one of the few alternatives expected to remain secure in a post-quantum world. Meanwhile, identity-based identification and signature (IBI/IBS) schemes are two of the most fundamental cryptographic primitives, and several code-based IBI/IBS schemes have been proposed. However, as research on coding theory has deepened, the security reductions and efficiency of such schemes have been invalidated or challenged. In this paper, we construct IBI/IBS schemes from code assumptions that are provably secure against impersonation under active and concurrent attacks, using a provably secure code-based signature technique proposed by Preetha, Vasant and Rangan (the PVR signature) and an Or-proof security-enhancement technique. We also present the parallel-PVR technique to decrease parameter values while maintaining the standard security level. Compared to other code-based IBI/IBS schemes, our schemes not only achieve preferable public parameter size, private key size, communication cost and signature length thanks to better parameter choices, but are also provably secure. PMID:28809940
Provably secure identity-based identification and signature schemes from code assumptions.
Song, Bo; Zhao, Yiming
2017-01-01
Code-based cryptography is one of the few alternatives expected to remain secure in a post-quantum world. Meanwhile, identity-based identification and signature (IBI/IBS) schemes are two of the most fundamental cryptographic primitives, and several code-based IBI/IBS schemes have been proposed. However, as research on coding theory has deepened, the security reductions and efficiency of such schemes have been invalidated or challenged. In this paper, we construct IBI/IBS schemes from code assumptions that are provably secure against impersonation under active and concurrent attacks, using a provably secure code-based signature technique proposed by Preetha, Vasant and Rangan (the PVR signature) and an Or-proof security-enhancement technique. We also present the parallel-PVR technique to decrease parameter values while maintaining the standard security level. Compared to other code-based IBI/IBS schemes, our schemes not only achieve preferable public parameter size, private key size, communication cost and signature length thanks to better parameter choices, but are also provably secure.
Satellite radiothermovision of atmospheric mesoscale processes: case study of tropical cyclones
NASA Astrophysics Data System (ADS)
Ermakov, D. M.; Sharkov, E. A.; Chernushich, A. P.
2015-04-01
Satellite radiothermovision is a set of processing techniques applicable to multisource data from radiothermal monitoring of the ocean-atmosphere system, which allows creating a dynamic description of mesoscale and synoptic atmospheric processes and estimating physically meaningful integral characteristics of the observed processes (such as the advective flow of latent heat through a given border). The approach is based on spatiotemporal interpolation of the satellite measurements, which allows reconstructing the radiothermal fields (as well as the fields of geophysical parameters) of the ocean-atmosphere system at global scale with a spatial resolution of about 0.125° and a temporal resolution of 1.5 hours. The accuracy of the spatiotemporal interpolation was estimated by direct comparison of interpolated data with the data of independent asynchronous measurements and was shown to correspond to the best achievable reported in the literature (for total precipitable water fields the accuracy is about 0.8 mm). The advantages of the implemented interpolation scheme are: closure under input radiothermal data, homogeneity in time scale (all data are interpolated through the same time intervals), and automatic estimation of both the intermediate states of the scalar field of the studied geophysical parameter and of the vector field of the effective velocity of advection (horizontal movements). Using this pair of fields one can calculate the flow of a given geophysical quantity through any given border. For example, in the case of the total precipitable water field, this flow (under proper calibration) has the meaning of an advective latent heat flux. This opportunity was used to evaluate the latent heat flux through a set of circular contours, enclosing a tropical cyclone and drifting with it during its evolution. A remarkable interrelation was observed between the calculated magnitude and sign of the advective latent heat flux and the intensity of a tropical cyclone. This interrelation is demonstrated in several examples of hurricanes and tropical cyclones of August 2000, and typhoons of November 2013, including super typhoon Haiyan.
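The contour-flux calculation described above can be illustrated with a short numerical sketch. The snippet below is a simplified, hypothetical example (synthetic gridded fields, a local tangent-plane circle, and plain index lookup instead of the study's spatiotemporal interpolation); all names, values and units are ours, not the authors'.

```python
# Hypothetical sketch (not the authors' code): estimate the advective flux of a
# scalar field W (e.g. total precipitable water) through a circular contour from
# gridded fields, as the closed line integral of W * (v . n) dl.
import numpy as np

def advective_flux(W, u, v, lon, lat, center, radius_km, n_points=360):
    """Approximate the outward advective flux of W through a circle.

    W, u, v : 2-D arrays on a regular lon/lat grid (scalar field and velocity, m/s).
    lon, lat: 1-D coordinate arrays in degrees; center = (lon0, lat0); radius in km.
    """
    R = 6371e3                              # Earth radius, m
    lon0, lat0 = center
    theta = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    # Contour points on a local tangent-plane (small-circle) approximation.
    dx = radius_km * 1e3 * np.cos(theta)
    dy = radius_km * 1e3 * np.sin(theta)
    lons = lon0 + np.degrees(dx / (R * np.cos(np.radians(lat0))))
    lats = lat0 + np.degrees(dy / R)

    # Index the gridded fields along the contour with a simple searchsorted lookup
    # (bilinear interpolation would be used in practice).
    i = np.clip(np.searchsorted(lat, lats), 0, len(lat) - 1)
    j = np.clip(np.searchsorted(lon, lons), 0, len(lon) - 1)
    W_c, u_c, v_c = W[i, j], u[i, j], v[i, j]

    # Outward unit normal of the circle and the line element dl.
    nx, ny = np.cos(theta), np.sin(theta)
    dl = 2.0 * np.pi * radius_km * 1e3 / n_points
    return np.sum(W_c * (u_c * nx + v_c * ny) * dl)   # > 0 means net outflow

# Toy example on synthetic fields with 0.125-degree spacing.
lon = np.arange(120.0, 160.0, 0.125)
lat = np.arange(0.0, 40.0, 0.125)
LON, LAT = np.meshgrid(lon, lat)
W = 40.0 + 10.0 * np.exp(-((LON - 140) ** 2 + (LAT - 20) ** 2) / 50.0)  # mm
u = np.full_like(W, 5.0)                    # uniform eastward advection, m/s
v = np.zeros_like(W)
print(advective_flux(W, u, v, lon, lat, center=(140.0, 20.0), radius_km=500.0))
```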
Mishra, Dheerendra; Srinivas, Jangirala; Mukhopadhyay, Sourav
2014-10-01
Advancement in network technology provides new ways to utilize telecare medicine information systems (TMIS) for patient care. However, TMIS usually faces various attacks because its services are provided over public networks. Recently, Jiang et al. proposed a chaotic map-based remote user authentication scheme for TMIS. Their scheme has the merits of low cost and session key agreement using chaos theory. It enhances the security of the system by resisting various attacks. In this paper, we analyze the security of Jiang et al.'s scheme and demonstrate that their scheme is vulnerable to a denial of service attack. Moreover, we demonstrate flaws in the password change phase of their scheme. Further, our aim is to propose a new chaotic map-based anonymous user authentication scheme for TMIS to overcome the weaknesses of Jiang et al.'s scheme, while also retaining the original merits of their scheme. We also show that our scheme is secure against various known attacks, including the attacks found in Jiang et al.'s scheme. The proposed scheme is comparable in terms of the communication and computational overheads with Jiang et al.'s scheme and other related existing schemes. Moreover, we demonstrate the validity of the proposed scheme through the BAN (Burrows, Abadi, and Needham) logic.
Research to Assembly Scheme for Satellite Deck Based on Robot Flexibility Control Principle
NASA Astrophysics Data System (ADS)
Guo, Tao; Hu, Ruiqin; Xiao, Zhengyi; Zhao, Jingjing; Fang, Zhikai
2018-03-01
Deck assembly is a critical quality control point in the final satellite assembly process, and cable extrusion and structural collision problems during assembly directly affect the development quality and schedule of the satellite. To address the problems in the deck assembly process, an assembly scheme for the satellite deck based on the robot flexibility control principle is proposed in this paper. The scheme is introduced first; secondly, the key technologies of end force perception and flexible docking control in the scheme are studied; then, the implementation process of the assembly scheme for the satellite deck is described in detail; finally, an actual application case of the assembly scheme is given. The results show that, compared with the traditional assembly scheme, the assembly scheme for the satellite deck based on the robot flexibility control principle has obvious advantages in work efficiency, reliability, universality, etc.
A keyword searchable attribute-based encryption scheme with attribute update for cloud storage.
Wang, Shangping; Ye, Jian; Zhang, Yaling
2018-01-01
Ciphertext-policy attribute-based encryption (CP-ABE) is a new type of data encryption primitive, which is very suitable for cloud data storage because of its fine-grained access control. A keyword-based searchable encryption scheme enables users to quickly find interesting data stored in the cloud server without revealing any information about the searched keywords. In this work, we provide a keyword searchable attribute-based encryption scheme with attribute update for cloud storage, which is a combination of an attribute-based encryption scheme and a keyword searchable encryption scheme. The new scheme supports user attribute update: in particular, when a user's attribute needs to be updated, only that user's secret key related to the attribute needs to be updated, while the other users' secret keys and the ciphertexts related to this attribute need not be updated, with the help of the cloud server. In addition, we outsource the operations with high computation cost to the cloud server to reduce the user's computational burden. Moreover, our scheme is proven to be semantically secure against chosen ciphertext-policy and chosen plaintext attacks in the generic bilinear group model, and it is also proven to be semantically secure against chosen keyword attacks under the bilinear Diffie-Hellman (BDH) assumption.
A keyword searchable attribute-based encryption scheme with attribute update for cloud storage
Wang, Shangping; Zhang, Yaling
2018-01-01
Ciphertext-policy attribute-based encryption (CP-ABE) is a new type of data encryption primitive, which is very suitable for cloud data storage because of its fine-grained access control. A keyword-based searchable encryption scheme enables users to quickly find interesting data stored in the cloud server without revealing any information about the searched keywords. In this work, we provide a keyword searchable attribute-based encryption scheme with attribute update for cloud storage, which is a combination of an attribute-based encryption scheme and a keyword searchable encryption scheme. The new scheme supports user attribute update: in particular, when a user's attribute needs to be updated, only that user's secret key related to the attribute needs to be updated, while the other users' secret keys and the ciphertexts related to this attribute need not be updated, with the help of the cloud server. In addition, we outsource the operations with high computation cost to the cloud server to reduce the user's computational burden. Moreover, our scheme is proven to be semantically secure against chosen ciphertext-policy and chosen plaintext attacks in the generic bilinear group model, and it is also proven to be semantically secure against chosen keyword attacks under the bilinear Diffie-Hellman (BDH) assumption. PMID:29795577
Guo, Hua; Zheng, Yandong; Zhang, Xiyong; Li, Zhoujun
2016-01-01
In resource-constrained wireless networks, resources such as storage space and communication bandwidth are limited. To guarantee secure communication in resource-constrained wireless networks, group keys should be distributed to users. The self-healing group key distribution (SGKD) scheme is a promising cryptographic tool, which can be used to distribute and update the group key for secure group communication over unreliable wireless networks. Among all known SGKD schemes, exponential-arithmetic-based SGKD (E-SGKD) schemes reduce the storage overhead to a constant, and thus are suitable for resource-constrained wireless networks. In this paper, we provide a new mechanism to achieve E-SGKD schemes with backward secrecy. We first propose a basic E-SGKD scheme based on a known polynomial-based SGKD, which has optimal storage overhead but no backward secrecy. To obtain backward secrecy and reduce the communication overhead, we introduce a novel approach for message broadcasting and self-healing. Compared with other E-SGKD schemes, our new E-SGKD scheme has optimal storage overhead, high communication efficiency and satisfactory security. The simulation results in Zigbee-based networks show that the proposed scheme is suitable for resource-constrained wireless networks. Finally, we show the application of our proposed scheme. PMID:27136550
Time reversal optical tomography locates fluorescent targets in a turbid medium
NASA Astrophysics Data System (ADS)
Wu, Binlin; Cai, W.; Gayen, S. K.
2013-03-01
A fluorescence optical tomography approach that extends time reversal optical tomography (TROT) to locate fluorescent targets embedded in a turbid medium is introduced. It uses a multi-source illumination and multi-detector signal acquisition scheme, along with the TR matrix formalism, and multiple signal classification (MUSIC) to construct a pseudo-image of the targets. The samples consisted of a single or two small tubes filled with a water solution of Indocyanine Green (ICG) dye as targets embedded in a 250 mm × 250 mm × 60 mm rectangular cell filled with Intralipid-20% suspension as the scattering medium. The ICG concentration was 1 μM, and the Intralipid-20% concentration was adjusted to provide a transport length of ~1 mm for both the excitation wavelength of 790 nm and the fluorescence wavelength around 825 nm. The data matrix was constructed using the diffusely transmitted fluorescence signals for all scan positions, and the TR matrix was constructed by multiplying the data matrix with its transpose. A pseudo-spectrum was calculated using the signal subspace of the TR matrix. Tomographic images were generated using the pseudo-spectrum. The peaks in the pseudo-images provided locations of the target(s) with sub-millimeter accuracy. Concurrent transmission TROT measurements corroborated the fluorescence-TROT findings. The results demonstrate that TROT is a fast approach that can be used to obtain accurate three-dimensional position information of fluorescent targets embedded deep inside a highly scattering medium, such as a contrast-enhanced tumor in a human breast.
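The TR-MUSIC localization step can be sketched in a few lines of numerical code. The example below is a toy reconstruction under stated assumptions (a crude diffusion-type kernel in place of the true diffuse-photon Green's function, synthetic noise, a known number of targets); it is not the experimental processing chain of the paper.

```python
# Simplified numerical sketch: build a multi-source x multi-detector response matrix
# for point-like targets, form the time-reversal (TR) matrix K K^T, and localize the
# targets with a MUSIC pseudospectrum over a reconstruction grid.
import numpy as np

rng = np.random.default_rng(0)
ltr = 1.0                                    # transport length scale (arbitrary units)

def green(r1, r2):
    d = np.linalg.norm(np.asarray(r1) - np.asarray(r2))
    return np.exp(-d / ltr) / max(d, 1e-6)   # crude diffuse-light kernel (assumption)

# Source and detector scan positions on opposite faces of a slab (z = 0 and z = 6).
xs = np.linspace(-10, 10, 9)
sources   = [(x, 0.0) for x in xs]           # (x, z)
detectors = [(x, 6.0) for x in xs]
targets   = [(-3.0, 2.5), (4.0, 3.5)]        # true (unknown) target positions

# Response matrix K[d, s] = sum_t G(detector_d, target_t) * G(target_t, source_s).
K = np.array([[sum(green(d, t) * green(t, s) for t in targets)
               for s in sources] for d in detectors])
K += 1e-4 * rng.standard_normal(K.shape)     # measurement noise

# TR matrix T = K K^T; its leading eigenvectors span the signal subspace.
U, svals, _ = np.linalg.svd(K)
n_sig = 2                                    # number of targets (assumed known here)
Us = U[:, :n_sig]

# MUSIC pseudospectrum: peaks mark positions whose steering vector lies in the
# signal subspace, i.e. the target locations.
xg = np.linspace(-10, 10, 81)
zg = np.linspace(0.5, 5.5, 51)
pseudo = np.zeros((len(zg), len(xg)))
for iz, z in enumerate(zg):
    for ix, x in enumerate(xg):
        g = np.array([green(d, (x, z)) for d in detectors])
        g /= np.linalg.norm(g)
        resid = g - Us @ (Us.T @ g)          # projection onto the noise subspace
        pseudo[iz, ix] = 1.0 / (np.linalg.norm(resid) ** 2 + 1e-12)

iz, ix = np.unravel_index(np.argmax(pseudo), pseudo.shape)
print("strongest peak near x =", xg[ix], ", z =", zg[iz])
```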
Wang, Jie-sheng; Han, Shuang; Shen, Na-na
2014-01-01
For predicting the key technology indicators (concentrate grade and tailings recovery rate) of the flotation process, an echo state network (ESN) based fusion soft-sensor model optimized by an improved glowworm swarm optimization (GSO) algorithm is proposed. Firstly, the color features (saturation and brightness) and texture features (angular second moment, sum entropy, inertia moment, etc.) based on the grey-level co-occurrence matrix (GLCM) are adopted to describe the visual characteristics of the flotation froth image. Then the kernel principal component analysis (KPCA) method is used to reduce the dimensionality of the high-dimensional input vector composed of the flotation froth image characteristics and process data, and to extract the nonlinear principal components in order to reduce the ESN dimension and network complexity. The ESN soft-sensor model of the flotation process is optimized by the GSO algorithm with a congestion factor. Simulation results show that the model has better generalization and prediction accuracy to meet the online soft-sensor requirements of real-time control in the flotation process. PMID:24982935
An improved biometrics-based authentication scheme for telecare medical information systems.
Guo, Dianli; Wen, Qiaoyan; Li, Wenmin; Zhang, Hua; Jin, Zhengping
2015-03-01
Telecare medical information system (TMIS) offers healthcare delivery services and patients can acquire their desired medical services conveniently through public networks. The protection of patients' privacy and data confidentiality are significant. Very recently, Mishra et al. proposed a biometrics-based authentication scheme for telecare medical information system. Their scheme can protect user privacy and is believed to resist a range of network attacks. In this paper, we analyze Mishra et al.'s scheme and identify that their scheme is insecure to against known session key attack and impersonation attack. Thereby, we present a modified biometrics-based authentication scheme for TMIS to eliminate the aforementioned faults. Besides, we demonstrate the completeness of the proposed scheme through BAN-logic. Compared to the related schemes, our protocol can provide stronger security and it is more practical.
NASA Astrophysics Data System (ADS)
Saeb Gilani, T.; Villringer, C.; Zhang, E.; Gundlach, H.; Buchmann, J.; Schrader, S.; Laufer, J.
2018-02-01
Tomographic photoacoustic (PA) images acquired using a Fabry-Perot (FP) based scanner offer high resolution and image fidelity but can result in long acquisition times due to the need for raster scanning. To reduce the acquisition times, a parallelised camera-based PA signal detection scheme is developed. The scheme is based on using an sCMOS camera and FPI sensors with high homogeneity of optical thickness. PA signals were acquired using the camera-based setup and the signal to noise ratio (SNR) was measured. A comparison of the SNR of the PA signal detected using 1) a photodiode in a conventional raster scanning detection scheme and 2) an sCMOS camera in the parallelised detection scheme is made. The results show that the parallelised interrogation scheme has the potential to provide high speed PA imaging.
NASA Astrophysics Data System (ADS)
Guo, Kai; Xie, Yongjie; Ye, Hu; Zhang, Song; Li, Yunfei
2018-04-01
Due to the uncertainty of a stratospheric airship's shape and the safety problems caused by this uncertainty, surface reconstruction and surface deformation monitoring of the airship were conducted based on laser scanning technology, and a √3-subdivision scheme based on Shepard interpolation was developed. Then a comparison was conducted between our subdivision scheme and the original √3-subdivision scheme. The result shows that our subdivision scheme can reduce the shrinkage of the surface and the number of narrow triangles. In addition, our subdivision scheme can preserve sharp features. Therefore, surface reconstruction and surface deformation monitoring of the airship can be conducted precisely with our subdivision scheme.
An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks.
Zhu, Hongfei; Tan, Yu-An; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang
2018-05-22
With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives, for example through e-payment and e-voting systems. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on the number theory research unit (NTRU) lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, this scheme does not depend on complex public key infrastructure and can resist quantum computer attacks. Then we design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms other traditional identity-based blind signature schemes in signing speed and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.
An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks
Zhu, Hongfei; Tan, Yu-an; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang
2018-01-01
With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives, for example through e-payment and e-voting systems. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on the number theory research unit (NTRU) lattice; this scheme mainly uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, this scheme does not depend on complex public key infrastructure and can resist quantum computer attacks. Then we design an e-payment protocol using the proposed scheme. Furthermore, we prove our scheme is secure in the random oracle model and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms other traditional identity-based blind signature schemes in signing speed and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size. PMID:29789475
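The rejection-sampling idea mentioned in the abstract can be illustrated with a small one-dimensional experiment. The sketch below is our own toy (continuous Gaussians and a scalar stand-in for the secret-dependent term), not the paper's NTRU-lattice construction; it only shows why the accepted values leak nothing about the secret-dependent shift.

```python
# 1-D illustration of rejection sampling as used in lattice-based (blind) signatures:
# a candidate z = y + c*s is accepted with probability D_sigma(z) / (M * D_{c*s,sigma}(z)),
# which makes the distribution of accepted z independent of the secret-dependent shift.
import numpy as np

rng = np.random.default_rng(0)
shift = 25.0                 # stands for the secret-dependent quantity <c, s>
sigma = 12.0 * abs(shift)    # common choice: sigma proportional to ||c*s||
M = np.exp(1.0 + 1.0 / 288)  # expected number of repetitions for this sigma

accepted = []
for _ in range(200_000):
    y = rng.normal(0.0, sigma)           # masking value sampled by the signer
    z = y + shift                        # candidate signature component
    # Acceptance probability D_sigma(z) / (M * D_{shift,sigma}(z)).
    p = np.exp((shift * shift - 2.0 * z * shift) / (2.0 * sigma * sigma)) / M
    if rng.uniform() < min(1.0, p):
        accepted.append(z)

accepted = np.array(accepted)
print(f"acceptance rate ~ 1/M: {len(accepted) / 200_000:.3f} (1/M = {1 / M:.3f})")
print(f"accepted z: mean {accepted.mean():.2f} (target 0, no leak), std {accepted.std():.1f} (sigma = {sigma})")
```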
Adaptive Numerical Dissipative Control in High Order Schemes for Multi-D Non-Ideal MHD
NASA Technical Reports Server (NTRS)
Yee, H. C.; Sjoegreen, B.
2004-01-01
The goal is to extend our adaptive numerical dissipation control in high order filter schemes and our new divergence-free methods for ideal MHD to non-ideal MHD that include viscosity and resistivity. The key idea consists of automatic detection of different flow features as distinct sensors to signal the appropriate type and amount of numerical dissipation/filter where needed and leave the rest of the region free of numerical dissipation contamination. These scheme-independent detectors are capable of distinguishing shocks/shears, flame sheets, turbulent fluctuations and spurious high-frequency oscillations. The detection algorithm is based on an artificial compression method (ACM) (for shocks/shears), and redundant multi-resolution wavelets (WAV) (for the above types of flow feature). These filter approaches also provide a natural and efficient way for the minimization of Div(B) numerical error. The filter scheme consists of spatially sixth order or higher non-dissipative spatial difference operators as the base scheme for the inviscid flux derivatives. If necessary, a small amount of high order linear dissipation is used to remove spurious high frequency oscillations. For example, an eighth-order centered linear dissipation (AD8) might be included in conjunction with a spatially sixth-order base scheme. The inviscid difference operator is applied twice for the viscous flux derivatives. After the completion of a full time step of the base scheme step, the solution is adaptively filtered by the product of a 'flow detector' and the 'nonlinear dissipative portion' of a high-resolution shock-capturing scheme. In addition, the scheme independent wavelet flow detector can be used in conjunction with spatially compact, spectral or spectral element type of base schemes. The ACM and wavelet filter schemes using the dissipative portion of a second-order shock-capturing scheme with sixth-order spatial central base scheme for both the inviscid and viscous MHD flux derivatives and a fourth-order Runge-Kutta method are denoted.
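A one-dimensional toy of the base-scheme-plus-linear-dissipation idea is sketched below. It is our own simplification (linear advection instead of non-ideal MHD, and a plain eighth-order linear dissipation term in place of the flow-detector-controlled ACM/wavelet filter); the coefficients and parameter values are illustrative assumptions.

```python
# Sketch: sixth-order central-difference base scheme for u_t + a u_x = 0 with a small
# eighth-order linear dissipation term (AD8-style), advanced with classical RK4 and
# periodic boundaries.
import numpy as np
from math import comb

N, a, cfl, eps = 200, 1.0, 0.4, 1e-3
x = np.linspace(0.0, 1.0, N, endpoint=False)
dx = x[1] - x[0]
dt = cfl * dx / abs(a)
u = np.exp(-200.0 * (x - 0.5) ** 2)              # smooth initial pulse

def rhs(u):
    # Sixth-order central difference approximation of du/dx (periodic via np.roll).
    ux = (np.roll(u, 3) * (-1 / 60) + np.roll(u, 2) * (3 / 20) + np.roll(u, 1) * (-3 / 4)
          + np.roll(u, -1) * (3 / 4) + np.roll(u, -2) * (-3 / 20) + np.roll(u, -3) * (1 / 60)) / dx
    # Eighth-order undivided difference: its Fourier symbol is 256*sin^8(k*dx/2) >= 0,
    # so subtracting eps*d8 damps only poorly resolved high-frequency oscillations.
    d8 = sum(((-1) ** k) * comb(8, k) * np.roll(u, 4 - k) for k in range(9))
    return -a * ux - eps * d8

steps = int(0.5 / dt)
for _ in range(steps):                           # advect to t ~ 0.5 with RK4
    k1 = rhs(u)
    k2 = rhs(u + 0.5 * dt * k1)
    k3 = rhs(u + 0.5 * dt * k2)
    k4 = rhs(u + dt * k3)
    u = u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

t_end = steps * dt
exact = np.exp(-200.0 * ((((x - a * t_end) % 1.0) - 0.5) ** 2))
print("max error vs exact solution:", float(np.max(np.abs(u - exact))))
```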
Discriminant Validity of Self-Reported Emotional Intelligence: A Multitrait-Multisource Study
ERIC Educational Resources Information Center
Joseph, Dana L.; Newman, Daniel A.
2010-01-01
A major stumbling block for emotional intelligence (EI) research has been the lack of adequate evidence for discriminant validity. In a sample of 280 dyads, self- and peer-reports of EI and Big Five personality traits were used to confirm an a priori four-factor model for the Wong and Law Emotional Intelligence Scale (WLEIS) and a five-factor…
ERIC Educational Resources Information Center
Blitz, Mark H.; Modeste, Marsha
2015-01-01
The Comprehensive Assessment of Leadership for Learning (CALL) is a multi-source assessment of distributed instructional leadership. As part of the validation of CALL, researchers examined differences between teacher and leader ratings in assessing distributed leadership practices. The authors utilized a t-test for equality of means for the…
Applying an efficient K-nearest neighbor search to forest attribute imputation
Andrew O. Finley; Ronald E. McRoberts; Alan R. Ek
2006-01-01
This paper explores the utility of an efficient nearest neighbor (NN) search algorithm for applications in multi-source kNN forest attribute imputation. The search algorithm reduces the number of distance calculations between a given target vector and each reference vector, thereby, decreasing the time needed to discover the NN subset. Results of five trials show gains...
Conceptualizing High School Students' Mental Health through a Dual-Factor Model
ERIC Educational Resources Information Center
Suldo, Shannon M; Thalji-Raitano, Amanda; Kiefer, Sarah M.; Ferron, John M.
2016-01-01
Mental health is increasingly viewed as a complete state of being, consisting of the absence of psychopathology and the presence of positive factors such as subjective well-being (SWB). This cross-sectional study analyzed multimethod and multisource data for 500 high school students (ages 14-18 years, M = 15.27 years, SD = 1.0 years) to examine…
ERIC Educational Resources Information Center
Berkovich, Izhak; Eyal, Ori
2017-01-01
The present study aims to examine whether principals' emotional intelligence (specifically, their ability to recognize emotions in others) makes them more effective transformational leaders, measured by the reframing of teachers' emotions. The study uses multisource data from principals and their teachers in 69 randomly sampled primary schools.…
ERIC Educational Resources Information Center
Prager, Carolyn; And Others
The education and reeducation of health care professionals remain essential, if somewhat neglected, elements in reforming the nation's health care system. The Pew Health Professions Commission (PHPC) has made the reform of health care contingent upon the reform of education, urging educational institutions to design core curricula with…
Latent Profiles of Parental Self-Efficacy and Children's Multisource-Evaluated Social Competence
ERIC Educational Resources Information Center
Junttila, Niina; Vauras, Marja
2014-01-01
Background: The interrelation between mothers' parental self-efficacy (PSE) and their school-aged children's well-being has been repeatedly proved. The lack of research in this area situates mainly on the absence of fathers, non-existent family-level studies, the paucity of independent evaluators, and the use of global PSE estimates.…
NASA Astrophysics Data System (ADS)
Nghiem, S. V.; Small, C.; Jacobson, M. Z.; Brakenridge, G. R.; Balk, D.; Sorichetta, A.; Masetti, M.; Gaughan, A. E.; Stevens, F. R.; Mathews, A.; Frazier, A. E.; Das, N. N.
2017-12-01
An innovative paradigm to observe the rural-urban transformation over the landscape using multi-sourced satellite data is formulated as a time and space continuum, extensively in space across South and Southeast Asia and in time over a decadal scale. Rather than a disparate array of individual cities and their vicinities in separate areas and at a discontinuous collection of points in time, the time-space continuum paradigm enables significant advances in addressing rural-urban change as a continuous gradient across the landscape, from wilderness to rural to urban areas, to study challenging environmental and socioeconomic issues. We use satellite data including QuikSCAT scatterometer, SRTM and Sentinel-1 SAR, Landsat, WorldView, MODIS, and SMAP together with environmental and demographic data and modeling products to investigate land cover and land use change in South and Southeast Asia and associated impacts. Utilizing the new observational advances and effectively capitalizing on current capabilities, we will present interdisciplinary results on urbanization in three dimensions, flood and drought, wildfire, air and water pollution, urban change, policy effects, population dynamics and vector-borne disease, agricultural assessment, and land degradation and desertification.
Error function attack of chaos synchronization based encryption schemes.
Wang, Xingang; Zhan, Meng; Lai, C-H; Gang, Hu
2004-03-01
Different chaos synchronization based encryption schemes are reviewed and compared from a practical point of view. As an efficient cryptanalysis tool for chaos encryption, a proposal based on the error function attack is presented systematically and used to evaluate system security. We define a quantitative measure (quality factor) of the effective applicability of a chaos encryption scheme, which takes into account the security, the encryption speed, and the robustness against channel noise. A comparison is made of several encryption schemes, and it is found that a scheme based on one-way coupled chaotic map lattices performs outstandingly well, as judged by the quality factor. Copyright 2004 American Institute of Physics.
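The error function attack itself is easy to demonstrate on a toy chaotic-masking cipher. The sketch below uses a single logistic map whose receiver is driven by the public signal, which is a deliberate simplification of the coupled-map systems analyzed in the paper; the key grid, amplitudes and parameter values are our own assumptions.

```python
# Toy error function attack: for a chaotic-masking transmission whose keystream is a
# logistic map with secret parameter r, the intruder's error function e(r') has its
# minimum at (or very near) the true key, because the driven receiver's residual
# varies smoothly with the trial key.
import numpy as np

rng = np.random.default_rng(1)
n, r_true, x0 = 4000, 3.91, 0.3
m = 0.001 * rng.standard_normal(n)             # small-amplitude secret message

# Transmitter: logistic map with secret parameter r_true, chaotic masking s = x + m.
x = np.empty(n); x[0] = x0
for i in range(n - 1):
    x[i + 1] = r_true * x[i] * (1.0 - x[i])
s = x + m                                      # transmitted (public) signal

def error_function(r_trial):
    # The intruder's receiver is driven by the public signal s at every step.
    y = r_trial * s[:-1] * (1.0 - s[:-1])      # one-step prediction with the trial key
    return np.mean(np.abs(s[1:] - y))

r_trials = np.linspace(3.7, 4.0, 301)
errors = np.array([error_function(r) for r in r_trials])
r_hat = r_trials[int(np.argmin(errors))]
print(f"error-function minimum at r = {r_hat:.3f} (true key {r_true})")
```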
An, Younghwa
2012-01-01
Recently, many biometrics-based user authentication schemes using smart cards have been proposed to improve the security weaknesses in user authentication systems. In 2011, Das proposed an efficient biometrics-based remote user authentication scheme using smart cards that can provide strong authentication and mutual authentication. In this paper, we analyze the security of Das's authentication scheme and show that it is still insecure against various attacks. We also propose an enhanced scheme that removes these security problems of Das's authentication scheme, even if the secret information stored in the smart card is revealed to an attacker. As a result of the security analysis, we can see that the enhanced scheme is secure against the user impersonation attack, the server masquerading attack, the password guessing attack, and the insider attack, and provides mutual authentication between the user and the server.
An, Younghwa
2012-01-01
Recently, many biometrics-based user authentication schemes using smart cards have been proposed to improve the security weaknesses in user authentication systems. In 2011, Das proposed an efficient biometrics-based remote user authentication scheme using smart cards that can provide strong authentication and mutual authentication. In this paper, we analyze the security of Das's authentication scheme and show that it is still insecure against various attacks. We also propose an enhanced scheme that removes these security problems of Das's authentication scheme, even if the secret information stored in the smart card is revealed to an attacker. As a result of the security analysis, we can see that the enhanced scheme is secure against the user impersonation attack, the server masquerading attack, the password guessing attack, and the insider attack, and provides mutual authentication between the user and the server. PMID:22899887
Lee, Tian-Fu
2013-12-01
A smartcard-based authentication and key agreement scheme for telecare medicine information systems enables patients, doctors, nurses and health visitors to use smartcards for secure login to medical information systems. Authorized users can then efficiently access remote services provided by the medicine information systems through public networks. Guo and Chang recently improved the efficiency of a smartcard authentication and key agreement scheme by using chaotic maps. Later, Hao et al. reported that the scheme developed by Guo and Chang had two weaknesses: inability to provide anonymity and inefficient double secrets. Therefore, Hao et al. proposed an authentication scheme for telecare medicine information systems that solved these weaknesses and improved performance. However, a limitation of both schemes is their violation of the contributory property of key agreement. This investigation discusses these weaknesses and proposes a new smartcard-based authentication and key agreement scheme that uses chaotic maps for telecare medicine information systems. Compared to conventional schemes, the proposed scheme has fewer weaknesses and provides better security and efficiency.
Lightweight ECC based RFID authentication integrated with an ID verifier transfer protocol.
He, Debiao; Kumar, Neeraj; Chilamkurti, Naveen; Lee, Jong-Hyouk
2014-10-01
The radio frequency identification (RFID) technology has been widely adopted and is being deployed as a dominant identification technology in the health care domain, for applications such as medical information authentication, patient tracking and blood transfusion medicine. With more and more stringent security and privacy requirements for RFID-based authentication schemes, elliptic curve cryptography (ECC) based RFID authentication schemes have been proposed to meet the requirements. However, many recently published ECC-based RFID authentication schemes have serious security weaknesses. In this paper, we propose a new ECC-based RFID authentication scheme integrated with an ID verifier transfer protocol that overcomes the weaknesses of the existing schemes. A comprehensive security analysis has been conducted to show the strong security properties provided by the proposed authentication scheme. Moreover, the performance of the proposed authentication scheme is analyzed in terms of computational cost, communication cost, and storage requirement.
Searchable attribute-based encryption scheme with attribute revocation in cloud storage.
Wang, Shangping; Zhao, Duqiao; Zhang, Yaling
2017-01-01
Attribute-based encryption (ABE) is a good way to achieve flexible and secure access control to data; attribute revocation is an extension of attribute-based encryption, and keyword search is an indispensable part of cloud storage. The combination of both has an important application in cloud storage. In this paper, we construct a searchable attribute-based encryption scheme with attribute revocation in cloud storage. The keyword search in our scheme is attribute based with access control: when the search succeeds, the cloud server returns the corresponding ciphertext to the user, who can then decrypt it. Besides, our scheme supports multiple-keyword search, which makes the scheme more practical. Under the decisional bilinear Diffie-Hellman exponent (q-BDHE) and decisional Diffie-Hellman (DDH) assumptions in the selective security model, we prove that our scheme is secure.
Simple scheme to implement decoy-state reference-frame-independent quantum key distribution
NASA Astrophysics Data System (ADS)
Zhang, Chunmei; Zhu, Jianrong; Wang, Qin
2018-06-01
We propose a simple scheme to implement decoy-state reference-frame-independent quantum key distribution (RFI-QKD), where signal states are prepared in the Z, X, and Y bases, decoy states are prepared in the X and Y bases, and vacuum states are assigned to no basis. Different from the original decoy-state RFI-QKD scheme, whose decoy states are prepared in the Z, X and Y bases, in our scheme decoy states are only prepared in the X and Y bases, which avoids the redundancy of decoy states in the Z basis, saves random number consumption, simplifies the encoding device of practical RFI-QKD systems, and makes the most of the finite pulses available in a short time. Numerical simulations show that, considering the finite size effect with a reasonable number of pulses in practical scenarios, our simple decoy-state RFI-QKD scheme exhibits at least comparable or even better performance than the original decoy-state RFI-QKD scheme. Especially, in terms of the resistance to the relative rotation of reference frames, our proposed scheme behaves much better than the original scheme, which gives it great potential to be adopted in current QKD systems.
ERIC Educational Resources Information Center
Kis, Viktoria
2016-01-01
Realising the potential of work-based learning schemes as a driver of productivity requires careful design and support. The length of work-based learning schemes should be adapted to the profile of productivity gains. A scheme that is too long for a given skill set might be unattractive for learners and waste public resources, but a scheme that is…
Mishra, Dheerendra; Mukhopadhyay, Sourav; Chaturvedi, Ankita; Kumari, Saru; Khan, Muhammad Khurram
2014-06-01
Remote user authentication is desirable for a Telecare Medicine Information System (TMIS) for the safety, security and integrity of data transmitted over the public channel. In 2013, Tan presented a biometric-based remote user authentication scheme and claimed that his scheme is secure. Recently, Yan et al. demonstrated some drawbacks in Tan's scheme and proposed an improved scheme to erase those drawbacks. We analyze Yan et al.'s scheme and identify that it is vulnerable to an off-line password guessing attack and does not protect anonymity. Moreover, in their scheme, the login and password change phases are inefficient at identifying the correctness of input, and the inefficiency in the password change phase can cause a denial of service attack. Further, we design an improved scheme for TMIS with the aim of eliminating the drawbacks of Yan et al.'s scheme.
Moon, Jongho; Choi, Younsung; Kim, Jiye; Won, Dongho
2016-03-01
Recently, numerous extended chaotic map-based password authentication schemes that employ smart card technology were proposed for Telecare Medical Information Systems (TMISs). In 2015, Lu et al. used Li et al.'s scheme as a basis to propose a password authentication scheme for TMISs that is based on biometrics and smart card technology and employs extended chaotic maps. Lu et al. demonstrated that Li et al.'s scheme contains some weaknesses, such as a violation of session-key security, a vulnerability to the user impersonation attack, and a lack of local verification. In this paper, however, we show that Lu et al.'s scheme is still insecure with respect to issues such as a violation of session-key security, and that it is vulnerable to both the outsider attack and the impersonation attack. To overcome these drawbacks, we retain the useful properties of Lu et al.'s scheme to propose a new password authentication scheme that is based on smart card technology and requires the use of chaotic maps. We then show that our proposed scheme is more secure and efficient and supports the desired security properties.
[Real-time detection of quality of Chinese materia medica: strategy of NIR model evaluation].
Wu, Zhi-sheng; Shi, Xin-yuan; Xu, Bing; Dai, Xing-xing; Qiao, Yan-jiang
2015-07-01
The definition of critical quality attributes of Chinese materia medica (CMM) was put forward based on the top-level design concept. Coupled with the development of rapid analytical science, the rapid assessment of critical quality attributes of CMM has been carried out as a secondary discipline branch of CMM. Taking near infrared (NIR) spectroscopy as an example, which has been a rapid analytical technology in pharmaceutical processes over the past decade, the chemometric parameters used in NIR model evaluation are systematically reviewed. According to the characteristics of the complexity of CMM and trace component analysis, a multi-source information fusion strategy for NIR models was developed for the assessment of critical quality attributes of CMM. The strategy provides a guideline for reliable NIR analysis of the critical quality attributes of CMM.
Transformational and transactional leadership: a meta-analytic test of their relative validity.
Judge, Timothy A; Piccolo, Ronald F
2004-10-01
This study provided a comprehensive examination of the full range of transformational, transactional, and laissez-faire leadership. Results (based on 626 correlations from 87 sources) revealed an overall validity of .44 for transformational leadership, and this validity generalized over longitudinal and multisource designs. Contingent reward (.39) and laissez-faire (-.37) leadership had the next highest overall relations; management by exception (active and passive) was inconsistently related to the criteria. Surprisingly, there were several criteria for which contingent reward leadership had stronger relations than did transformational leadership. Furthermore, transformational leadership was strongly correlated with contingent reward (.80) and laissez-faire (-.65) leadership. Transformational and contingent reward leadership generally predicted criteria controlling for the other leadership dimensions, although transformational leadership failed to predict leader job performance. (c) 2004 APA, all rights reserved
Mishra, Raghavendra; Barnwal, Amit Kumar
2015-05-01
The Telecare Medicine Information System (TMIS) presents effective healthcare delivery services by employing information and communication technologies. Privacy and security are always a matter of great concern in TMIS. Recently, Chen et al. presented a password-based authentication scheme to address privacy and security. Later, it was proved insecure against various active and passive attacks. To erase the drawbacks of Chen et al.'s anonymous authentication scheme, several password-based authentication schemes have been proposed using public key cryptosystems. However, most of them do not provide pre-smart-card authentication, which leads to inefficient login and password change phases. To present an authentication scheme with pre-smart-card authentication, we present an improved anonymous smart card based authentication scheme for TMIS. The proposed scheme protects user anonymity and satisfies all the desirable security attributes. Moreover, the proposed scheme presents efficient login and password change phases where incorrect input can be quickly detected and a user can freely change his password without server assistance. We also demonstrate the validity of the proposed scheme by utilizing the widely accepted BAN (Burrows, Abadi, and Needham) logic. The proposed scheme is also comparable in terms of computational overheads with relevant schemes.
NASA Astrophysics Data System (ADS)
Cheng, Siyang; Zhou, Lingxi; Tans, Pieter P.; An, Xingqin; Liu, Yunsong
2018-05-01
As CO2 is a primary driving factor of climate change, the mole fraction and source-sink characteristics of atmospheric CO2 over China are constantly inferred from multi-source and multi-site data. In this paper, we compared ground-based CO2 measurements with satellite retrievals and investigated the source-sink regional representativeness at China's four WMO/GAW stations. The results indicate that, firstly, atmospheric CO2 mole fractions from ground-based sampling measurement and Greenhouse Gases Observing Satellite (GOSAT) products reveal similar seasonal variation. The seasonal amplitude of the column-averaged CO2 mole fractions is smaller than that of the ground-based CO2 at all stations. The extrema of the seasonal cycle of ground-based and column CO2 mole fractions are basically synchronous except a slight phase delay at Lin'an (LAN) station. For the two-year average, the column CO2 is lower than ground-based CO2, and both of them reveal the lowest CO2 mole fraction at Waliguan (WLG) station. The lowest (∼4 ppm) and largest (∼8 ppm) differences between the column and ground-based CO2 appear at WLG and Longfengshan (LFS) stations, respectively. The CO2 mole fraction and its difference between GOSAT and ground-based measurement are smaller in summer than in winter. The differences of summer column CO2 among these stations are also much smaller than their ground-based counterparts. In winter, the maximum of ground-based CO2 mole fractions and the greatest difference between the two (ground-based and column) datasets appear at the LFS station. Secondly, the representative areas of the monthly CO2 background mole fractions at each station were found by employing footprints and emissions. Smaller representative areas appeared at Shangdianzi (SDZ) and LFS, whereas larger ones were seen at WLG and LAN. The representative areas in summer are larger than those in winter at WLG and SDZ, but the situation is opposite at LAN and LFS. The representative areas for the stations are different in summer and winter, distributed in four typical regions. The CO2 net fluxes in these representative areas show obvious seasonal cycles with similar trends but different varying ranges and different time of the strongest sink. The intensities and uncertainties of the CO2 fluxes are different at different stations in different months and source-sink sectors. Overall, the WLG station is almost a carbon sink, but the other three stations present stronger carbon sources for most of the year. These findings could be conducive to the application of multi-source CO2 data and the understanding of regional CO2 source-sink characteristics and patterns over China.
Color encryption scheme based on adapted quantum logistic map
NASA Astrophysics Data System (ADS)
Zaghloul, Alaa; Zhang, Tiejun; Amin, Mohamed; Abd El-Latif, Ahmed A.
2014-04-01
This paper presents a new color image encryption scheme based on a quantum chaotic system. In this scheme, encryption is accomplished by generating an intermediate chaotic key stream with the help of a quantum chaotic logistic map. Then, each pixel is encrypted using the cipher value of the previous pixel and the adapted quantum logistic map. The results show that the proposed scheme has adequate security for the confidentiality of color images.
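A classical stand-in makes the structure of such a scheme easy to see. The sketch below is a hedged toy (an ordinary logistic map replaces the adapted quantum logistic map, and a small random RGB array stands in for a real image); it only reproduces the keystream-plus-previous-cipher-feedback pattern described in the abstract, not the paper's exact algorithm.

```python
# Toy pixel-stream cipher: each byte is encrypted with a chaotic keystream byte and
# the previous cipher byte (cipher feedback), mirroring the structure in the abstract.
import numpy as np

def logistic_keystream(length, r=3.99, x0=0.613):
    x, ks = x0, np.empty(length, dtype=np.uint8)
    for _ in range(200):                      # discard the chaotic transient
        x = r * x * (1.0 - x)
    for i in range(length):
        x = r * x * (1.0 - x)
        ks[i] = int(x * 256) % 256            # quantize the chaotic value to a byte
    return ks

def encrypt(img, key=(3.99, 0.613), c0=0x5A):
    flat = img.reshape(-1)                    # process the R, G, B planes as one stream
    ks = logistic_keystream(flat.size, *key)
    cipher = np.empty_like(flat)
    prev = np.uint8(c0)
    for i in range(flat.size):
        cipher[i] = flat[i] ^ ks[i] ^ prev    # keystream + previous-cipher feedback
        prev = cipher[i]
    return cipher.reshape(img.shape)

def decrypt(cip, key=(3.99, 0.613), c0=0x5A):
    flat = cip.reshape(-1)
    ks = logistic_keystream(flat.size, *key)
    plain = np.empty_like(flat)
    prev = np.uint8(c0)
    for i in range(flat.size):
        plain[i] = flat[i] ^ ks[i] ^ prev
        prev = flat[i]                        # feedback uses the *cipher* byte
    return plain.reshape(cip.shape)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(8, 8, 3), dtype=np.uint8)   # small RGB test "image"
assert np.array_equal(decrypt(encrypt(img)), img)
print("round-trip OK")
```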
Li, Chun-Ta; Weng, Chi-Yao; Lee, Cheng-Chi; Wang, Chun-Cheng
2015-11-01
To protect patient privacy and ensure authorized access to remote medical services, many remote user authentication schemes for the integrated electronic patient record (EPR) information system have been proposed in the literature. In a recent paper, Das proposed a hash based remote user authentication scheme using passwords and smart cards for the integrated EPR information system, and claimed that the proposed scheme could resist various passive and active attacks. However, in this paper, we found that Das's authentication scheme is still vulnerable to modification and user duplication attacks. Thereafter we propose a secure and efficient authentication scheme for the integrated EPR information system based on lightweight hash function and bitwise exclusive-or (XOR) operations. The security proof and performance analysis show our new scheme is well-suited to adoption in remote medical healthcare services.
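The general hash-and-XOR pattern behind such lightweight schemes can be sketched briefly. The code below is a generic illustration under our own assumptions (SHA-256, a masked card value, and a nonce- and timestamp-bound authenticator); it is not the exact protocol proposed in the paper, and all identifiers are hypothetical.

```python
# Generic lightweight hash-and-XOR authentication sketch: the smart card stores
# A = h(ID||x) XOR h(PW); at login the card proves knowledge of h(ID||x) with a
# nonce- and timestamp-bound digest, and the server replies for mutual authentication.
import hashlib, os, time

def h(*parts: bytes) -> bytes:
    return hashlib.sha256(b"||".join(parts)).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

MASTER_X = os.urandom(32)                     # server's long-term secret

def register(identity: bytes, password: bytes) -> bytes:
    """Server-side registration: the masked value written onto the smart card."""
    return xor(h(identity, MASTER_X), h(password))

def login(identity: bytes, password: bytes, card_A: bytes):
    """Card-side login message: (ID, nonce, timestamp, authenticator)."""
    K = xor(card_A, h(password))              # recovers h(ID || x)
    nonce, ts = os.urandom(16), str(int(time.time())).encode()
    return identity, nonce, ts, h(K, nonce, ts)

def server_verify(identity, nonce, ts, authenticator):
    K = h(identity, MASTER_X)
    if authenticator != h(K, nonce, ts):
        return None                           # reject
    return h(K, nonce, b"server")             # reply proving the server also knows K

card = register(b"alice", b"correct horse")
msg = login(b"alice", b"correct horse", card)
print("mutual authentication succeeded" if server_verify(*msg) else "rejected")
```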
A Practical and Secure Coercion-Resistant Scheme for Internet Voting
NASA Astrophysics Data System (ADS)
Araújo, Roberto; Foulle, Sébastien; Traoré, Jacques
Juels, Catalano, and Jakobsson (JCJ) proposed at WPES 2005 the first voting scheme that considers real-world threats and that is more realistic for Internet elections. Their scheme, though, has a quadratic work factor and thereby is not efficient for large scale elections. Based on the work of JCJ, Smith proposed an efficient scheme that has a linear work factor. In this paper we first show that Smith's scheme is insecure. Then we present a new coercion-resistant election scheme with a linear work factor that overcomes the flaw of Smith's proposal. Our solution is based on the group signature scheme of Camenisch and Lysyanskaya (Crypto 2004).
We compared classification schemes based on watershed storage (wetland + lake area/watershed area) and forest fragmentation with a geographically-based classification scheme for two case studies involving 1) Lake Superior tributaries and 2) watersheds of riverine coastal wetlands...
NASA Astrophysics Data System (ADS)
Yang, Lei; Yan, Hongyong; Liu, Hong
2017-03-01
The implicit staggered-grid finite-difference (ISFD) scheme is competitive for its great accuracy and stability, but its coefficients are conventionally determined by the Taylor-series expansion (TE) method, leading to a loss in numerical precision. In this paper, we modify the TE method using the minimax approximation (MA), and propose a new optimal ISFD scheme based on the modified TE (MTE) with the MA method. The new ISFD scheme takes advantage of the TE method, which guarantees great accuracy at small wavenumbers, and at the same time keeps the property of the MA method, which keeps the numerical errors within a limited bound. Thus, it leads to great accuracy in the numerical solution of the wave equations. We derive the optimal ISFD coefficients by applying the new method to the construction of the objective function, and using a Remez algorithm to minimize its maximum. A numerical analysis is made in comparison with the conventional TE-based ISFD scheme, indicating that the MTE-based ISFD scheme with appropriate parameters can widen the wavenumber range with high accuracy and achieve greater precision than the conventional ISFD scheme. The numerical modeling results also demonstrate that the MTE-based ISFD scheme performs well in elastic wave simulation and is more efficient than the conventional ISFD scheme for elastic modeling.
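The conventional TE starting point can be reproduced in a few lines: solve the small linear system for explicit staggered-grid derivative coefficients and inspect the wavenumber error that the minimax modification is designed to bound. The sketch below is our own illustration of that baseline, not the authors' implicit scheme or Remez implementation.

```python
# Taylor-expansion (TE) baseline for staggered-grid first-derivative coefficients and
# the associated wavenumber (dispersion) error.
import numpy as np

def te_staggered_coeffs(M):
    """Coefficients c_1..c_M of the 2M-th order explicit staggered-grid derivative
    f'(x) ~ (1/h) * sum_m c_m [f(x + (m-1/2)h) - f(x - (m-1/2)h)]."""
    m = np.arange(1, M + 1)
    A = np.array([(m - 0.5) ** (2 * j - 1) for j in range(1, M + 1)], dtype=float)
    b = np.zeros(M); b[0] = 0.5
    return np.linalg.solve(A, b)

def wavenumber_error(c, beta):
    """|k_eff*h - k*h| of the stencil at normalized wavenumbers beta = k*h."""
    m = np.arange(1, len(c) + 1)
    k_eff = 2.0 * np.sum(c[:, None] * np.sin((m - 0.5)[:, None] * beta[None, :]), axis=0)
    return np.abs(k_eff - beta)

for M in (2, 4, 6):
    c = te_staggered_coeffs(M)
    beta = np.linspace(1e-3, 0.8 * np.pi, 400)
    err = wavenumber_error(c, beta)
    print(f"M={M}: c = {np.round(c, 6)}, max error up to 0.8*pi = {err.max():.3e}")
# For M=2 this reproduces the classical 4th-order coefficients [9/8, -1/24]; the error
# grows rapidly toward the Nyquist limit, which is what motivates minimax optimization.
```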
An effective and secure key-management scheme for hierarchical access control in E-medicine system.
Odelu, Vanga; Das, Ashok Kumar; Goswami, Adrijit
2013-04-01
Recently, several hierarchical access control schemes have been proposed in the literature to provide security in e-medicine systems. However, most of them are either insecure against the man-in-the-middle attack or require high storage and computational overheads. Wu and Chen proposed a key management method to solve dynamic access control problems in a user hierarchy based on a hybrid cryptosystem. Though their scheme improves computational efficiency over Nikooghadam et al.'s approach, it suffers from large storage space for public parameters in the public domain and computational inefficiency due to costly elliptic curve point multiplication. Recently, Nikooghadam and Zakerolhosseini showed that Wu-Chen's scheme is vulnerable to the man-in-the-middle attack. In order to remedy this security weakness in Wu-Chen's scheme, they proposed a secure scheme which is again based on ECC (elliptic curve cryptography) and an efficient one-way hash function. However, their scheme incurs a huge computational cost for providing verification of public information in the public domain, as it uses an ECC digital signature, which is costly compared to a symmetric-key cryptosystem. In this paper, we propose an effective access control scheme for a user hierarchy which is based only on a symmetric-key cryptosystem and an efficient one-way hash function. We show that our scheme significantly reduces the storage space for both public and private domains, as well as the computational complexity, when compared to Wu-Chen's scheme, Nikooghadam-Zakerolhosseini's scheme, and other related schemes. Through informal and formal security analysis, we further show that our scheme is secure against different attacks, including the man-in-the-middle attack. Moreover, dynamic access control problems in our scheme are also solved efficiently compared to other related schemes, making our scheme much more suitable for practical applications of e-medicine systems.
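The core symmetric-key idea (deriving descendant class keys with a one-way function so that keys flow only downward in the hierarchy) can be sketched directly. The example below is a minimal illustration using HMAC-SHA256 and an invented toy hierarchy; it is not the full key-management protocol of the paper.

```python
# Minimal sketch of hierarchical key derivation with a keyed one-way hash: a higher
# security class can compute every descendant's key, but the one-wayness of HMAC
# prevents moving upward in the hierarchy.
import hmac, hashlib, os

def derive(parent_key: bytes, child_id: str) -> bytes:
    return hmac.new(parent_key, child_id.encode(), hashlib.sha256).digest()

# Example hierarchy for an e-medicine setting (names are illustrative only):
# hospital -> {cardiology, radiology}; cardiology -> {doctor, nurse}.
hierarchy = {"hospital": ["cardiology", "radiology"], "cardiology": ["doctor", "nurse"]}

keys = {"hospital": os.urandom(32)}           # the root class chooses a random key
def assign(node):
    for child in hierarchy.get(node, []):
        keys[child] = derive(keys[node], child)
        assign(child)
assign("hospital")

# A member of "hospital" can recompute the "nurse" key on the fly...
k = derive(derive(keys["hospital"], "cardiology"), "nurse")
assert k == keys["nurse"]
# ...but a "nurse" cannot invert HMAC to obtain the "cardiology" or "hospital" key.
print("downward key derivation verified")
```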
High-Order Central WENO Schemes for Multi-Dimensional Hamilton-Jacobi Equations
NASA Technical Reports Server (NTRS)
Bryson, Steve; Levy, Doron; Biegel, Bryan (Technical Monitor)
2002-01-01
We present new third- and fifth-order Godunov-type central schemes for approximating solutions of the Hamilton-Jacobi (HJ) equation in an arbitrary number of space dimensions. These are the first central schemes for approximating solutions of the HJ equations with an order of accuracy that is greater than two. In two space dimensions we present two versions for the third-order scheme: one scheme that is based on a genuinely two-dimensional Central WENO reconstruction, and another scheme that is based on a simpler dimension-by-dimension reconstruction. The simpler dimension-by-dimension variant is then extended to a multi-dimensional fifth-order scheme. Our numerical examples in one, two and three space dimensions verify the expected order of accuracy of the schemes.
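A minimal Godunov-type building block helps fix ideas before the high-order reconstructions. The sketch below solves a one-dimensional HJ equation with a first-order Lax-Friedrichs numerical Hamiltonian; it is our own simplified baseline, not the third- or fifth-order central WENO schemes of the paper, whose reconstructions replace the one-sided differences used here.

```python
# First-order Godunov-type solver for phi_t + H(phi_x) = 0 on a periodic 1-D grid,
# using one-sided differences and a monotone Lax-Friedrichs numerical Hamiltonian.
import numpy as np

N, T = 200, 0.5
x = np.linspace(0.0, 2.0, N, endpoint=False)
dx = x[1] - x[0]
phi = -np.cos(np.pi * x)                       # classical HJ test initial data

def H(p):                                      # convex Hamiltonian H(p) = p^2 / 2
    return 0.5 * p * p

t = 0.0
while t < T:
    p_minus = (phi - np.roll(phi, 1)) / dx     # backward difference approximating phi_x
    p_plus = (np.roll(phi, -1) - phi) / dx     # forward difference approximating phi_x
    alpha = np.max(np.abs(np.concatenate([p_minus, p_plus])))  # bound on |H'(p)| = |p|
    dt = min(0.4 * dx / max(alpha, 1e-12), T - t)
    # Monotone Lax-Friedrichs numerical Hamiltonian.
    H_hat = H(0.5 * (p_minus + p_plus)) - 0.5 * alpha * (p_plus - p_minus)
    phi = phi - dt * H_hat                     # forward Euler (first order in time)
    t += dt

print("phi range after t = 0.5:", float(phi.min()), float(phi.max()))
```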
A Quantum Proxy Signature Scheme Based on Genuine Five-qubit Entangled State
NASA Astrophysics Data System (ADS)
Cao, Hai-Jing; Huang, Jun; Yu, Yao-Feng; Jiang, Xiu-Li
2014-09-01
In this paper a very efficient and secure proxy signature scheme is proposed. It is based on controlled quantum teleportation. A genuine five-qubit entangled state functions as the quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. Quantum key distribution and the one-time pad are adopted in our scheme, which guarantees not only the unconditional security of the scheme but also the anonymity of the message owner.
Das, Ashok Kumar; Bruhadeshwar, Bezawada
2013-10-01
Recently, Lee and Liu proposed an efficient password based authentication and key agreement scheme using smart cards for the telecare medicine information system [J. Med. Syst. (2013) 37:9933]. In this paper, we show that though their scheme is efficient, it still has two security weaknesses: (1) design flaws in the authentication phase and (2) design flaws in the password change phase. In order to withstand the flaws found in Lee-Liu's scheme, we propose an improvement of their scheme. Our improved scheme also keeps the original merits of Lee-Liu's scheme. We show that our scheme is efficient as compared to Lee-Liu's scheme. Further, through the security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to show that our scheme is secure against passive and active attacks.
ERIC Educational Resources Information Center
Blickle, Gerhard; Schneider, Paula B.; Perrewe, Pamela L.; Blass, Fred R.; Ferris, Gerald R.
2008-01-01
Purpose: The purpose of this study was to investigate the role of protege self-presentation by self-disclosure, modesty, and self-monitoring in mentoring. Design/methodology/approach: This study used three data sources (i.e. employees, peers, and mentors) and a longitudinal design over a period of two years. Findings: Employee self-disclosure and…
Time-Resolved and Spectroscopic Three-Dimensional Optical Breast Tomography
2009-03-01
Report fragment: development of a near-infrared center-of-intensity time-gated imaging approach; polarization sensitive imaging; a spectroscopic imaging arrangement; and a multi-source illumination and multi-detector signal acquisition arrangement.
Wei Wu; James Clark; James Vose
2010-01-01
Hierarchical Bayesian (HB) modeling allows for multiple sources of uncertainty by factoring complex relationships into conditional distributions that can be used to draw inference and make predictions. We applied an HB model to estimate the parameters and state variables of a parsimonious hydrological model, GR4J, by coherently assimilating the uncertainties from the...
ERIC Educational Resources Information Center
Sonnentag, Sabine; Kuttler, Iris; Fritz, Charlotte
2010-01-01
This paper examines psychological detachment (i.e., mentally "switching off") from work during non-work time as a partial mediator between job stressors and low work-home boundaries on the one hand and strain reactions (emotional exhaustion, need for recovery) on the other hand. Survey data were collected from a sample of protestant pastors (N =…
1979-12-01
Vegetation shows on the imagery, but emphasis has been placed on the detection of wooded and scrub areas and the differentiation between deciduous and...
ERIC Educational Resources Information Center
Kim, Loretta; Wong, Shun Han Rebekah
2015-01-01
This article discusses the objectives and outcomes of a project to enhance digital humanities training at the undergraduate level in a Hong Kong university. The co-investigators re-designed a multi-source data-set as an example and then taught a multi-step curriculum about gathering, organizing, and presenting original data to an introductory…
ERIC Educational Resources Information Center
Restubog, Simon Lloyd D.; Scott, Kristin L.; Zagenczyk, Thomas J.
2011-01-01
We developed a model of the relationships among aggressive norms, abusive supervision, psychological distress, family undermining, and supervisor-directed deviance. We tested the model in 2 studies using multisource data: a 3-wave investigation of 184 full-time employees (Study 1) and a 2-wave investigation of 188 restaurant workers (Study 2).…
Moon, Jongho; Lee, Donghoon; Lee, Youngsook; Won, Dongho
2017-04-25
User authentication in wireless sensor networks is more difficult than in traditional networks owing to sensor network characteristics such as unreliable communication, limited resources, and unattended operation. For these reasons, various authentication schemes have been proposed to provide secure and efficient communication. In 2016, Park et al. proposed a secure biometric-based authentication scheme with smart card revocation/reissue for wireless sensor networks. However, we found that their scheme was still insecure against the impersonation attack and had a problem in the smart card revocation/reissue phase. In this paper, we show how an adversary can impersonate a legitimate user or sensor node and perform illegal smart card revocation/reissue, and we prove that Park et al.'s scheme fails to provide secure revocation/reissue. In addition, we propose an enhanced scheme that provides efficiency, as well as anonymity and security. Finally, we provide a security and performance analysis comparing previous schemes and the proposed scheme, and provide a formal analysis based on the random oracle model. The results prove that the proposed scheme solves the weaknesses of the impersonation attack and the other security flaws identified in the security analysis section. Furthermore, the performance analysis shows that the computational cost is lower than that of the previous scheme.
Moon, Jongho; Lee, Donghoon; Lee, Youngsook; Won, Dongho
2017-01-01
User authentication in wireless sensor networks is more difficult than in traditional networks owing to sensor network characteristics such as unreliable communication, limited resources, and unattended operation. For these reasons, various authentication schemes have been proposed to provide secure and efficient communication. In 2016, Park et al. proposed a secure biometric-based authentication scheme with smart card revocation/reissue for wireless sensor networks. However, we found that their scheme was still insecure against the impersonation attack and had a problem in the smart card revocation/reissue phase. In this paper, we show how an adversary can impersonate a legitimate user or sensor node and perform illegal smart card revocation/reissue, and we prove that Park et al.'s scheme fails to provide secure revocation/reissue. In addition, we propose an enhanced scheme that provides efficiency, as well as anonymity and security. Finally, we provide a security and performance analysis comparing previous schemes and the proposed scheme, and provide a formal analysis based on the random oracle model. The results prove that the proposed scheme solves the weaknesses of the impersonation attack and the other security flaws identified in the security analysis section. Furthermore, the performance analysis shows that the computational cost is lower than that of the previous scheme. PMID:28441331
Estimating average alcohol consumption in the population using multiple sources: the case of Spain.
Sordo, Luis; Barrio, Gregorio; Bravo, María J; Villalbí, Joan R; Espelt, Albert; Neira, Montserrat; Regidor, Enrique
2016-01-01
National estimates on per capita alcohol consumption are provided regularly by various sources and may have validity problems, so corrections are needed for monitoring and assessment purposes. Our objectives were to compare different alcohol availability estimates for Spain, to build the best estimate (actual consumption), characterize its time trend during 2001-2011, and quantify the extent to which other estimates (coverage) approximated actual consumption. Estimates were: alcohol availability from the Spanish Tax Agency (Tax Agency availability), World Health Organization (WHO availability) and other international agencies, self-reported purchases from the Spanish Food Consumption Panel, and self-reported consumption from population surveys. Analyses included calculating: between-agency discrepancy in availability, multisource availability (correcting Tax Agency availability by underestimation of wine and cider), actual consumption (adjusting multisource availability by unrecorded alcohol consumption/purchases and alcohol losses), and coverage of selected estimates. Sensitivity analyses were undertaken. Time trends were characterized by joinpoint regression. Between-agency discrepancy in alcohol availability remained high in 2011, mainly because of wine and spirits, although some decrease was observed during the study period. The actual consumption was 9.5 l of pure alcohol/person-year in 2011, decreasing 2.3 % annually, mainly due to wine and spirits. 2011 coverage of WHO availability, Tax Agency availability, self-reported purchases, and self-reported consumption was 99.5, 99.5, 66.3, and 28.0 %, respectively, generally with downward trends (last three estimates, especially self-reported consumption). The multisource availability overestimated actual consumption by 12.3 %, mainly due to tourism imbalance. Spanish estimates of per capita alcohol consumption show considerable weaknesses. Using uncorrected estimates, especially self-reported consumption, for monitoring or other purposes is misleading. To obtain conservative estimates of alcohol-attributable disease burden or heavy drinking prevalence, self-reported consumption should be shifted upwards by more than 85 % (91 % in 2011) of Tax Agency or WHO availability figures. The weaknesses identified can probably also be found worldwide, thus much empirical work remains to be done to improve estimates of per capita alcohol consumption.
An improved scheme for Flip-OFDM based on Hartley transform in short-range IM/DD systems.
Zhou, Ji; Qiao, Yaojun; Cai, Zhuo; Ji, Yuefeng
2014-08-25
In this paper, an improved Flip-OFDM scheme is proposed for IM/DD optical systems, in which the modulation/demodulation processing takes advantage of the fast Hartley transform (FHT) algorithm. We realize the improved scheme within one symbol period, whereas the conventional Flip-OFDM scheme based on the fast Fourier transform (FFT) requires two consecutive symbol periods. Consequently, the complexity of many operations in the improved scheme, such as the CP operation, polarity inversion and symbol delay, is half of that in the conventional scheme. Compared to the FFT with complex input constellations, the complexity of the FHT with real input constellations is also halved. A transmission experiment over 50-km SSMF has been carried out to verify the feasibility of the improved scheme. In conclusion, the improved scheme has the same BER performance as the conventional scheme but a clear advantage in complexity.
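A minimal numpy sketch of the transform relationship the abstract relies on: the discrete Hartley transform (DHT) of a real sequence can be obtained from its FFT as Re(FFT) - Im(FFT), and the DHT is its own inverse up to a factor of N. The toy "block" below is only illustrative; it is not the authors' exact Flip-OFDM transmitter.

```python
import numpy as np

def dht(x):
    """Discrete Hartley transform of a real sequence via the FFT:
    H[k] = sum_n x[n] * cas(2*pi*k*n/N), with cas = cos + sin,
    which equals Re(FFT) - Im(FFT) for real input."""
    X = np.fft.fft(x)
    return X.real - X.imag

def idht(H):
    """The DHT is an involution up to scaling: x = DHT(H) / N."""
    return dht(H) / len(H)

# Toy check: a real "OFDM-like" block survives a DHT/IDHT round trip.
rng = np.random.default_rng(0)
block = rng.choice([-1.0, 1.0], size=16)      # real constellation (e.g., BPSK-like)
assert np.allclose(idht(dht(block)), block)
```

Because both the transform and the constellation stay real, only half of the arithmetic of an equally sized complex FFT is needed, which is the complexity saving the abstract refers to.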
Chain-Based Communication in Cylindrical Underwater Wireless Sensor Networks
Javaid, Nadeem; Jafri, Mohsin Raza; Khan, Zahoor Ali; Alrajeh, Nabil; Imran, Muhammad; Vasilakos, Athanasios
2015-01-01
Appropriate network design is very significant for Underwater Wireless Sensor Networks (UWSNs). Application-oriented UWSNs are planned to achieve certain objectives. Therefore, there is always a demand for efficient data routing schemes that can fulfill certain requirements of application-oriented UWSNs. These networks can be of any shape, i.e., rectangular, cylindrical or square. In this paper, we propose chain-based routing schemes for application-oriented cylindrical networks and also formulate mathematical models to find a global optimum path for data transmission. In the first scheme, we devise four interconnected chains of sensor nodes to perform data communication. In the second scheme, we propose a routing scheme in which two chains of sensor nodes are interconnected, whereas in the third scheme single-chain-based routing is done in cylindrical networks. After finding local optimum paths in separate chains, we find global optimum paths through their interconnection. Moreover, we develop a computational model for the analysis of end-to-end delay. We compare the performance of the above three proposed schemes with that of the Power Efficient Gathering System in Sensor Information Systems (PEGASIS) and Congestion-adjusted PEGASIS (C-PEGASIS). Simulation results show that our proposed 4-chain based scheme performs better than the other selected schemes in terms of network lifetime, end-to-end delay, path loss, transmission loss, and packet sending rate. PMID:25658394
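A small sketch of the greedy, PEGASIS-style chain construction that chain-based routing schemes commonly start from: connect each node to its nearest not-yet-chained neighbour, beginning with the node farthest from the sink. The 2-D node layout and distance cost are hypothetical stand-ins; the paper's cylindrical formulation and optimization models are not reproduced here.

```python
import numpy as np

def build_chain(nodes, sink):
    """Greedy chain: start from the node farthest from the sink and
    repeatedly append the nearest remaining node (PEGASIS-style)."""
    nodes = np.asarray(nodes, dtype=float)
    remaining = list(range(len(nodes)))
    start = max(remaining, key=lambda i: np.linalg.norm(nodes[i] - sink))
    chain = [start]
    remaining.remove(start)
    while remaining:
        last = nodes[chain[-1]]
        nxt = min(remaining, key=lambda i: np.linalg.norm(nodes[i] - last))
        chain.append(nxt)
        remaining.remove(nxt)
    return chain

# Hypothetical 2-D deployment with the sink at the origin.
rng = np.random.default_rng(1)
sensors = rng.uniform(0, 100, size=(8, 2))
print(build_chain(sensors, sink=np.array([0.0, 0.0])))
```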
NASA Astrophysics Data System (ADS)
Li, D.
2016-12-01
Sudden water pollution accidents are unavoidable risk events that we must learn to co-exist with. In China's Taihu River Basin, river flow conditions are complicated by frequent artificial interference. Sudden water pollution accidents occur mainly in the form of large abnormal discharges of wastewater and are characterized by sudden occurrence, uncontrollable scope, uncertain affected objects and a concentrated distribution of many risk sources. Effective prevention of pollution accidents that may occur is of great significance for water quality safety management. Bayesian networks can be applied to represent the relationship between pollution sources and river water quality intuitively. Using a time-sequential Monte Carlo algorithm, the pollution-source state switching model, the water quality model for the river network and Bayesian reasoning are integrated, and a sudden water pollution risk assessment model for the river network is developed to quantify the water quality risk under the collective influence of multiple pollution sources. Based on the isotope water transport mechanism, a dynamic tracing model of multiple pollution sources is established, which can describe the relationship between the excessive risk of the system and the multiple risk sources. Finally, the diagnostic reasoning algorithm based on the Bayesian network is coupled with the multi-source tracing model, which can identify the contribution of each risk source to the system risk under complex flow conditions. The model is applied to the Taihu Lake water system, and reasonable results are obtained for three typical years. The studies show that the water quality risk at critical sections is influenced by the pollution risk sources, the boundary water quality, the hydrological conditions and the self-purification capacity, and that multiple pollution sources have an obvious effect on the water quality risk of the receiving water body. The water quality risk assessment approach developed in this study offers an effective tool for systematically quantifying the random uncertainty in a plain river network system, and it also provides technical support for decision-making on controlling sudden water pollution through identification of critical pollution sources.
Al Ansari, Ahmed; Donnon, Tyrone; Al Khalifa, Khalid; Darwish, Abdulla; Violato, Claudio
2014-01-01
Background The purpose of this study was to conduct a meta-analysis on the construct and criterion validity of multi-source feedback (MSF) to assess physicians and surgeons in practice. Methods In this study, we followed the guidelines for the reporting of observational studies included in a meta-analysis. In addition to PubMed and MEDLINE databases, the CINAHL, EMBASE, and PsycINFO databases were searched from January 1975 to November 2012. All articles listed in the references of the MSF studies were reviewed to ensure that all relevant publications were identified. All 35 articles were independently coded by two authors (AA, TD), and any discrepancies (eg, effect size calculations) were reviewed by the other authors (KA, AD, CV). Results Physician/surgeon performance measures from 35 studies were identified. A random-effects model of weighted mean effect size differences (d) resulted in: construct validity coefficients for the MSF system on physician/surgeon performance across different levels in practice ranged from d=0.14 (95% confidence interval [CI] 0.40–0.69) to d=1.78 (95% CI 1.20–2.30); construct validity coefficients for the MSF on physician/surgeon performance on two different occasions ranged from d=0.23 (95% CI 0.13–0.33) to d=0.90 (95% CI 0.74–1.10); concurrent validity coefficients for the MSF based on differences in assessor group ratings ranged from d=0.50 (95% CI 0.47–0.52) to d=0.57 (95% CI 0.55–0.60); and predictive validity coefficients for the MSF on physician/surgeon performance across different standardized measures ranged from d=1.28 (95% CI 1.16–1.41) to d=1.43 (95% CI 0.87–2.00). Conclusion The construct and criterion validity of the MSF system is supported by small to large effect size differences based on the MSF process and physician/surgeon performance across different clinical and nonclinical domain measures. PMID:24600300
Public-key quantum digital signature scheme with one-time pad private-key
NASA Astrophysics Data System (ADS)
Chen, Feng-Lin; Liu, Wan-Fang; Chen, Su-Gen; Wang, Zhi-Hua
2018-01-01
A quantum digital signature scheme based on a public-key quantum cryptosystem is proposed for the first time. In the scheme, the verification public key is derived from the signer's identity information (such as an e-mail address) in the manner of identity-based encryption, and the signature private key is generated by a one-time pad (OTP) protocol. The public-key/private-key pair consists of classical bits, but the signature cipher consists of quantum qubits. After the signer announces the public key and generates the final quantum signature, each verifier can publicly verify whether the signature is valid using the public key and the quantum digital digest. Analysis shows that the proposed scheme satisfies non-repudiation and unforgeability. The information-theoretic security of the scheme is ensured by quantum indistinguishability and the OTP protocol. Being based on a public-key cryptosystem, the proposed scheme is easier to realize than other quantum signature schemes under current technical conditions.
Gas-Kinetic Theory Based Flux Splitting Method for Ideal Magnetohydrodynamics
NASA Technical Reports Server (NTRS)
Xu, Kun
1998-01-01
A gas-kinetic solver is developed for the ideal magnetohydrodynamics (MHD) equations. The new scheme is based on the direct splitting of the flux function of the MHD equations with the inclusion of "particle" collisions in the transport process. Consequently, the artificial dissipation in the new scheme is much reduced in comparison with the MHD Flux Vector Splitting scheme. At the same time, the new scheme is compared with the well-developed Roe-type MHD solver. It is concluded that the kinetic MHD scheme is more robust and efficient than the Roe-type method, and its accuracy is competitive. In this paper the general principle of splitting the macroscopic flux function based on gas-kinetic theory is presented. The flux construction strategy may shed some light on possible modifications of AUSM- and CUSP-type schemes for the compressible Euler equations, as well as on the development of new schemes for non-strictly hyperbolic systems.
Universal block diagram based modeling and simulation schemes for fractional-order control systems.
Bai, Lu; Xue, Dingyü
2017-05-08
Universal block-diagram-based schemes are proposed for modeling and simulating fractional-order control systems in this paper. A fractional operator block in Simulink is designed to evaluate the fractional-order derivative and integral. Based on this block, fractional-order control systems with zero initial conditions can be modeled conveniently. For modeling systems with nonzero initial conditions, an auxiliary signal is constructed in the compensation scheme. Since the compensation scheme is very complicated, an integrator chain scheme is further proposed to simplify the modeling procedure. The accuracy and effectiveness of the schemes are assessed through examples; the computational results confirm that the block diagram scheme is efficient for Caputo fractional-order ordinary differential equations (FODEs) of any complexity, including implicit Caputo FODEs. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
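A fractional operator block of the kind described here has to approximate a fractional-order derivative numerically. One common discrete realization, sketched below under the assumption of zero initial conditions (where it coincides with the Caputo derivative), is the Grünwald-Letnikov formula with recursively computed binomial weights; this is an illustration of the underlying operator, not the paper's Simulink block.

```python
import numpy as np

def gl_fractional_derivative(x, alpha, h):
    """Grunwald-Letnikov approximation of the order-alpha derivative of the
    sampled signal x (step h), assuming zero history before the first sample:
        D^alpha x(t_k) ~= h**(-alpha) * sum_j w_j * x[k - j],
    with w_0 = 1 and w_j = w_{j-1} * (1 - (alpha + 1) / j)."""
    n = len(x)
    w = np.empty(n)
    w[0] = 1.0
    for j in range(1, n):
        w[j] = w[j - 1] * (1.0 - (alpha + 1.0) / j)
    y = np.empty(n)
    for k in range(n):
        y[k] = np.dot(w[: k + 1], x[k::-1]) / h**alpha
    return y

# Sanity check: for alpha = 1 the result reduces to a backward difference.
t = np.linspace(0.0, 1.0, 201)
approx = gl_fractional_derivative(t**2, alpha=1.0, h=t[1] - t[0])
print(approx[-1])   # close to d(t^2)/dt = 2 at t = 1
```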
Secure Communications in CIoT Networks with a Wireless Energy Harvesting Untrusted Relay
Hu, Hequn; Liao, Xuewen
2017-01-01
The Internet of Things (IoT) represents a bright prospect in which a variety of common appliances can connect to one another, as well as to the rest of the Internet, to vastly improve our lives. Unique communication and security challenges are raised by the limited hardware, low complexity, and severe energy constraints of IoT devices. In addition, a severe spectrum scarcity problem has been exacerbated by the use of a large number of IoT devices. In this paper, cognitive IoT (CIoT) is considered, where an IoT network works as the secondary system using underlay spectrum sharing. A wireless energy harvesting (EH) node is used as a relay to improve the coverage of an IoT device. However, the relay could be a potential eavesdropper that intercepts the IoT device's messages. This paper considers the problem of secure communication between the IoT device (e.g., a sensor) and a destination (e.g., a controller) via the wireless EH untrusted relay. Since the destination can be equipped with an adequate energy supply, secure schemes based on destination-aided jamming are proposed under power splitting (PS) and time splitting (TS) policies, called the intuitive secure scheme based on PS (Int-PS), the precoded secure scheme based on PS (Pre-PS), the intuitive secure scheme based on TS (Int-TS) and the precoded secure scheme based on TS (Pre-TS), respectively. The secure performance of the proposed schemes is evaluated through the metric of the probability of successfully secure transmission (PSST), which represents the probability that the interference constraint of the primary user is satisfied and the secrecy rate is positive. PSST is analyzed for the proposed secure schemes, and closed-form expressions of PSST for Pre-PS and Pre-TS are derived and validated through simulation results. Numerical results show that the precoded secure schemes have better PSST than the intuitive secure schemes under similar power consumption. When the secure schemes based on the PS and TS policies have similar PSST, the average transmit power consumption of the secure scheme based on TS is lower. The influences of the power splitting and time splitting ratios are also discussed through simulations. PMID:28869540
Efficient and Provable Secure Pairing-Free Security-Mediated Identity-Based Identification Schemes
Chin, Ji-Jian; Tan, Syh-Yuan; Heng, Swee-Huay; Phan, Raphael C.-W.
2014-01-01
Security-mediated cryptography was first introduced by Boneh et al. in 2001. The main motivation behind security-mediated cryptography was the capability to allow instant revocation of a user's secret key by necessitating the cooperation of a security mediator in any given transaction. Subsequently in 2003, Boneh et al. showed how to convert a RSA-based security-mediated encryption scheme from a traditional public key setting to an identity-based one, where certificates would no longer be required. Following these two pioneering papers, other cryptographic primitives that utilize a security-mediated approach began to surface. However, the security-mediated identity-based identification scheme (SM-IBI) was not introduced until Chin et al. in 2013 with a scheme built on bilinear pairings. In this paper, we improve on the efficiency results for SM-IBI schemes by proposing two schemes that are pairing-free and are based on well-studied complexity assumptions: the RSA and discrete logarithm assumptions. PMID:25207333
Efficient and provable secure pairing-free security-mediated identity-based identification schemes.
Chin, Ji-Jian; Tan, Syh-Yuan; Heng, Swee-Huay; Phan, Raphael C-W
2014-01-01
Security-mediated cryptography was first introduced by Boneh et al. in 2001. The main motivation behind security-mediated cryptography was the capability to allow instant revocation of a user's secret key by necessitating the cooperation of a security mediator in any given transaction. Subsequently in 2003, Boneh et al. showed how to convert a RSA-based security-mediated encryption scheme from a traditional public key setting to an identity-based one, where certificates would no longer be required. Following these two pioneering papers, other cryptographic primitives that utilize a security-mediated approach began to surface. However, the security-mediated identity-based identification scheme (SM-IBI) was not introduced until Chin et al. in 2013 with a scheme built on bilinear pairings. In this paper, we improve on the efficiency results for SM-IBI schemes by proposing two schemes that are pairing-free and are based on well-studied complexity assumptions: the RSA and discrete logarithm assumptions.
Zhang, Liping; Zhu, Shaohui; Tang, Shanyu
2017-03-01
Telecare medicine information systems (TMIS) provide flexible and convenient e-health care. However, the medical records transmitted in TMIS are exposed over unsecured public networks, so TMIS are more vulnerable to various types of security threats and attacks. To provide privacy protection for TMIS, a secure and efficient authenticated key agreement scheme is urgently needed to protect the sensitive medical data. Recently, Mishra et al. proposed a biometrics-based authenticated key agreement scheme for TMIS using a hash function and nonce, and claimed that their scheme could eliminate the security weaknesses of Yan et al.'s scheme and provide dynamic identity protection and user anonymity. In this paper, however, we demonstrate that Mishra et al.'s scheme suffers from replay attacks and man-in-the-middle attacks and fails to provide perfect forward secrecy. To overcome the weaknesses of Mishra et al.'s scheme, we then propose a three-factor authenticated key agreement scheme to enable the patient to enjoy remote healthcare services via TMIS with privacy protection. Chaotic map-based cryptography is employed in the proposed scheme to achieve a delicate balance between security and performance. Security analysis demonstrates that the proposed scheme resists various attacks and provides several attractive security properties. Performance evaluation shows that the proposed scheme increases efficiency in comparison with other related schemes.
PHACK: An Efficient Scheme for Selective Forwarding Attack Detection in WSNs.
Liu, Anfeng; Dong, Mianxiong; Ota, Kaoru; Long, Jun
2015-12-09
In this paper, a Per-Hop Acknowledgement (PHACK)-based scheme is proposed for each packet transmission to detect selective forwarding attacks. In our scheme, the sink and each node along the forwarding path generate an acknowledgement (ACK) message for each received packet to confirm the normal packet transmission. The scheme, in which each ACK is returned to the source node along a different routing path, can significantly increase the resilience against attacks because it prevents an attacker from compromising nodes in the return routing path, which can otherwise interrupt the return of nodes' ACK packets. For this case, the PHACK scheme also has better potential to detect abnormal packet loss and identify suspect nodes as well as better resilience against attacks. Another pivotal issue is the network lifetime of the PHACK scheme, as it generates more acknowledgements than previous ACK-based schemes. We demonstrate that the network lifetime of the PHACK scheme is not lower than that of other ACK-based schemes because the scheme just increases the energy consumption in non-hotspot areas and does not increase the energy consumption in hotspot areas. Moreover, the PHACK scheme greatly simplifies the protocol and is easy to implement. Both theoretical and simulation results are given to demonstrate the effectiveness of the proposed scheme in terms of high detection probability and the ability to identify suspect nodes.
PHACK: An Efficient Scheme for Selective Forwarding Attack Detection in WSNs
Liu, Anfeng; Dong, Mianxiong; Ota, Kaoru; Long, Jun
2015-01-01
In this paper, a Per-Hop Acknowledgement (PHACK)-based scheme is proposed for each packet transmission to detect selective forwarding attacks. In our scheme, the sink and each node along the forwarding path generate an acknowledgement (ACK) message for each received packet to confirm the normal packet transmission. The scheme, in which each ACK is returned to the source node along a different routing path, can significantly increase the resilience against attacks because it prevents an attacker from compromising nodes in the return routing path, which can otherwise interrupt the return of nodes’ ACK packets. For this case, the PHACK scheme also has better potential to detect abnormal packet loss and identify suspect nodes as well as better resilience against attacks. Another pivotal issue is the network lifetime of the PHACK scheme, as it generates more acknowledgements than previous ACK-based schemes. We demonstrate that the network lifetime of the PHACK scheme is not lower than that of other ACK-based schemes because the scheme just increases the energy consumption in non-hotspot areas and does not increase the energy consumption in hotspot areas. Moreover, the PHACK scheme greatly simplifies the protocol and is easy to implement. Both theoretical and simulation results are given to demonstrate the effectiveness of the proposed scheme in terms of high detection probability and the ability to identify suspect nodes. PMID:26690178
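A toy, purely illustrative simulation of the per-hop acknowledgement idea described above: every node on the forwarding path is expected to return an ACK over a different route, so the link between the last node that ACKed and the first that did not becomes suspect. The topology, loss model and function names are hypothetical and far simpler than the PHACK protocol itself.

```python
import random

def forward_with_phack(path, dropper=None, loss_prob=0.0, seed=0):
    """Simulate one packet along `path`. Every node that receives the packet
    returns an ACK, assumed to travel back over a *different* route, so the
    attacker cannot suppress it. Returns the set of nodes that ACKed."""
    rng = random.Random(seed)
    acked = set()
    for node in path:
        acked.add(node)                      # node got the packet -> it ACKs
        if node == dropper:                  # selective-forwarding attacker drops here
            break
        if rng.random() < loss_prob:         # ordinary channel loss
            break
    return acked

def suspect_link(path, acked):
    """Flag the link between the last node that ACKed and the first that did not;
    either endpoint of that link may be misbehaving."""
    for prev, nxt in zip(path, path[1:]):
        if prev in acked and nxt not in acked:
            return (prev, nxt)
    return None                              # packet reached the sink normally

path = ["A", "B", "C", "D", "sink"]
acks = forward_with_phack(path, dropper="C")
print(suspect_link(path, acks))              # -> ('C', 'D')
```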
Sutrala, Anil Kumar; Das, Ashok Kumar; Odelu, Vanga; Wazid, Mohammad; Kumari, Saru
2016-10-01
Information and communication technology (ICT) has changed the entire paradigm of society. ICT enables people to use medical services over the Internet, thereby greatly reducing travel cost, hospitalization cost and time. Recent advancements in Telecare Medicine Information Systems (TMIS) allow users/patients to access medical services over the Internet by gaining health monitoring facilities at home. Amin and Biswas recently proposed an RSA-based user authentication and session key agreement protocol usable for TMIS, which is an improvement over Giri et al.'s RSA-based user authentication scheme for TMIS. In this paper, we show that though Amin-Biswas's scheme considerably improves on the security drawbacks of Giri et al.'s scheme, their scheme has security weaknesses as it suffers from attacks such as privileged insider attack, user impersonation attack, replay attack and offline password guessing attack. A new RSA-based user authentication scheme for TMIS is proposed, which overcomes the security pitfalls of Amin-Biswas's scheme and also preserves the user anonymity property. A careful formal security analysis is done using the widely accepted Burrows-Abadi-Needham (BAN) logic and the random oracle model. Moreover, an informal security analysis of the scheme is also done. These security analyses show the robustness of our new scheme against various known attacks as well as the attacks found in Amin-Biswas's scheme. The proposed scheme is also simulated using the widely accepted Automated Validation of Internet Security Protocols and Applications (AVISPA) tool. We present a new user authentication and session key agreement scheme for TMIS, which fixes the mentioned security pitfalls found in Amin-Biswas's scheme, and we also show that the proposed scheme provides better security than other existing schemes through rigorous security analysis and the verification tool. Furthermore, we present the formal security verification of our scheme using the widely accepted AVISPA tool. High security and extra functionality features make our proposed scheme applicable for telecare medicine information systems used for e-health care medical applications. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Regolith thermal energy storage for lunar nighttime power
NASA Technical Reports Server (NTRS)
Tillotson, Brian
1992-01-01
A scheme for providing nighttime electric power to a lunar base is described. This scheme stores thermal energy in a pile of regolith. Any such scheme must somehow improve on the poor thermal conductivity of lunar regolith in vacuum. Two previous schemes accomplish this by casting or melting the regolith. The scheme described here wraps the regolith in a gas-tight bag and introduces a light gas to enhance thermal conductivity. This allows the system to be assembled with less energy and equipment than schemes which require melting of regolith. A point design based on the new scheme is presented. Its mass from Earth compares favorably with the mass of a regenerative fuel cell of equal capacity.
NASA Technical Reports Server (NTRS)
Lee, H.-W.; Lam, K. S.; Devries, P. L.; George, T. F.
1980-01-01
A new semiclassical decoupling scheme (the trajectory-based decoupling scheme) is introduced in a computational study of vibrational-to-electronic energy transfer for a simple model system that simulates collinear atom-diatom collisions. The probability of energy transfer (P) is calculated quasiclassically using the new scheme as well as quantum mechanically as a function of the atomic electronic-energy separation (lambda), with overall good agreement between the two sets of results. Classical mechanics with the new decoupling scheme is found to be capable of predicting resonance behavior whereas an earlier decoupling scheme (the coordinate-based decoupling scheme) failed. Interference effects are not exhibited in P vs lambda results.
Chung, Yun Won; Kwon, Jae Kyun; Park, Suwon
2014-01-01
One of the key technologies to support the mobility of a mobile station (MS) in mobile communication systems is location management, which consists of location update and paging. In this paper, an improved movement-based location management scheme with two movement thresholds is proposed, considering the bursty data traffic characteristics of packet-switched (PS) services. The analytical modeling of the location update and paging signaling loads of the proposed scheme is developed thoroughly, and the performance of the proposed scheme is compared with that of the conventional scheme. We show that the proposed scheme outperforms the conventional scheme in terms of total signaling load with an appropriate selection of the movement thresholds.
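A minimal sketch of movement-threshold logic with two thresholds (a smaller one while bursty PS traffic is expected and paging cost matters, a larger one during idle periods when update cost dominates). The threshold values and the counting rule are hypothetical stand-ins for the analytical model in the abstract.

```python
def movement_based_updates(crossings, threshold):
    """Trigger a location update once the MS has crossed `threshold` cell
    boundaries since its last update; return the number of updates."""
    count, updates = 0, 0
    for _ in range(crossings):
        count += 1
        if count >= threshold:
            updates += 1
            count = 0
    return updates

# Two hypothetical thresholds: small during active/bursty sessions, large when idle.
D_ACTIVE, D_IDLE = 2, 6
print(movement_based_updates(crossings=12, threshold=D_ACTIVE))  # 6 updates
print(movement_based_updates(crossings=12, threshold=D_IDLE))    # 2 updates
```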
NASA Astrophysics Data System (ADS)
Ahmed, Rounaq; Srinivasa Pai, P.; Sriram, N. S.; Bhat, Vasudeva
2018-02-01
Vibration analysis has been extensively used in the recent past for gear fault diagnosis. The extracted vibration signals are usually contaminated with noise, which may lead to wrong interpretation of results. Denoising the extracted vibration signals helps fault diagnosis by giving meaningful results. The Wavelet Transform (WT) increases the signal-to-noise ratio (SNR), reduces the root mean square error (RMSE) and is effective for denoising gear vibration signals. The extracted signals have to be denoised by selecting a proper denoising scheme in order to prevent the loss of signal information along with the noise. An approach has been made in this work to show the effectiveness of Principal Component Analysis (PCA) for denoising gear vibration signals. In this regard, three selected wavelet-based denoising schemes, namely PCA, Empirical Mode Decomposition (EMD) and Neighcoeff Coefficient (NC), have been compared with Adaptive Threshold (AT), an extensively used wavelet-based denoising scheme for gear vibration signals. The vibration signals acquired from a customized gear test rig were denoised by the above-mentioned four denoising schemes. The fault identification capability as well as the SNR, kurtosis and RMSE of the four denoising schemes have been compared. Features extracted from the denoised signals have been used to train and test artificial neural network (ANN) models. The performances of the four denoising schemes have been evaluated based on the performance of the ANN models. The best denoising scheme has been identified based on the classification accuracy results. PCA proved effective in all these regards and was identified as the best denoising scheme.
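The comparison in this abstract is driven by SNR, RMSE and kurtosis of the denoised signals; below is a small numpy sketch of those three metrics. Computing them against a known clean reference is an assumption for illustration (in practice the clean gear signal is not available and the metrics are estimated differently), and the sinusoidal "gear" signal is a stand-in.

```python
import numpy as np

def snr_db(clean, denoised):
    """Signal-to-noise ratio in dB of the residual left after denoising."""
    noise = denoised - clean
    return 10.0 * np.log10(np.sum(clean**2) / np.sum(noise**2))

def rmse(clean, denoised):
    return np.sqrt(np.mean((denoised - clean) ** 2))

def kurtosis(x):
    """Fourth standardized moment; impulsive gear faults raise this value."""
    x = x - np.mean(x)
    return np.mean(x**4) / np.mean(x**2) ** 2

rng = np.random.default_rng(2)
t = np.linspace(0.0, 1.0, 2048)
clean = np.sin(2 * np.pi * 50 * t)                 # stand-in "gear" tone
noisy = clean + 0.3 * rng.standard_normal(t.size)
print(snr_db(clean, noisy), rmse(clean, noisy), kurtosis(noisy))
```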
NASA Astrophysics Data System (ADS)
Siswantyo, Sepha; Susanti, Bety Hayat
2016-02-01
Preneel-Govaerts-Vandewalle (PGV) schemes comprise 64 possible single-block-length schemes that can be used to build a hash function from a block cipher. Of those 64 schemes, Preneel claimed that 4 are secure. In this paper, we apply a length extension attack on those 4 secure PGV schemes, instantiated with the RC5 algorithm in their basic construction, to test their collision resistance property. The attack results show that collisions occur in those 4 secure PGV schemes. Based on the analysis, we indicate that the Feistel structure and the data-dependent rotation operation in the RC5 algorithm, the XOR operations in the schemes, and the selection of the additional message block value all contribute to the occurrence of collisions.
Enhancing the LVRT Capability of PMSG-Based Wind Turbines Based on R-SFCL
NASA Astrophysics Data System (ADS)
Xu, Lin; Lin, Ruixing; Ding, Lijie; Huang, Chunjun
2018-03-01
A novel low voltage ride-through (LVRT) scheme for PMSG-based wind turbines based on the Resistor Superconducting Fault Current Limiter (R-SFCL) is proposed in this paper. The LVRT scheme is mainly formed by connecting the R-SFCL in series between the transformer and the Grid Side Converter (GSC), and its basic modelling is discussed in detail. The proposed LVRT scheme is implemented to interact with a PMSG model in PSCAD/EMTDC under a three-phase short-circuit fault condition, which shows that the proposed scheme based on the R-SFCL can improve the transient performance and LVRT capability and consolidate the grid connection of wind turbines.
A robust anonymous biometric-based authenticated key agreement scheme for multi-server environments
Huang, Yuanfei; Ma, Fangchao
2017-01-01
In order to improve the security in remote authentication systems, numerous biometric-based authentication schemes using smart cards have been proposed. Recently, Moon et al. presented an authentication scheme to remedy the flaws of Lu et al.'s scheme, and claimed that their improved protocol supports the required security properties. Unfortunately, we found that Moon et al.'s scheme still has weaknesses. In this paper, we show that Moon et al.'s scheme is vulnerable to insider attack, server spoofing attack, user impersonation attack and guessing attack. Furthermore, we propose a robust anonymous multi-server authentication scheme using public key encryption to remove the aforementioned problems. Through the subsequent formal and informal security analysis, we demonstrate that our proposed scheme provides strong mutual authentication and satisfies the desirable security requirements. The functional and performance analysis shows that the improved scheme has the best secure functionality and is computationally efficient. PMID:29121050
A robust anonymous biometric-based authenticated key agreement scheme for multi-server environments.
Guo, Hua; Wang, Pei; Zhang, Xiyong; Huang, Yuanfei; Ma, Fangchao
2017-01-01
In order to improve the security in remote authentication systems, numerous biometric-based authentication schemes using smart cards have been proposed. Recently, Moon et al. presented an authentication scheme to remedy the flaws of Lu et al.'s scheme, and claimed that their improved protocol supports the required security properties. Unfortunately, we found that Moon et al.'s scheme still has weaknesses. In this paper, we show that Moon et al.'s scheme is vulnerable to insider attack, server spoofing attack, user impersonation attack and guessing attack. Furthermore, we propose a robust anonymous multi-server authentication scheme using public key encryption to remove the aforementioned problems. Through the subsequent formal and informal security analysis, we demonstrate that our proposed scheme provides strong mutual authentication and satisfies the desirable security requirements. The functional and performance analysis shows that the improved scheme has the best secure functionality and is computationally efficient.
An Efficient Quantum Somewhat Homomorphic Symmetric Searchable Encryption
NASA Astrophysics Data System (ADS)
Sun, Xiaoqiang; Wang, Ting; Sun, Zhiwei; Wang, Ping; Yu, Jianping; Xie, Weixin
2017-04-01
In 2009, Gentry first introduced a fully homomorphic encryption (FHE) scheme based on ideal lattices. Later, based on the approximate greatest common divisor problem, the learning with errors problem, or the learning with errors over rings problem, FHE developed rapidly, though with low efficiency and only computational security. Combining quantum mechanics, Liang proposed a symmetric quantum somewhat homomorphic encryption (QSHE) scheme based on the quantum one-time pad, which is unconditionally secure, and converted it into a quantum fully homomorphic encryption scheme whose evaluation algorithm is based on the secret key. Compared with Liang's QSHE scheme, we propose a more efficient QSHE scheme for classical input states with perfect security, which is used to encrypt classical messages, and the secret key is not required in the evaluation algorithm. Furthermore, an efficient symmetric searchable encryption (SSE) scheme is constructed based on our QSHE scheme. SSE is important in cloud storage, as it allows users to offload search queries to the untrusted cloud. The cloud is then responsible for returning encrypted files that match the (also encrypted) search queries, which protects users' privacy.
A Quantum Multi-proxy Blind Signature Scheme Based on Genuine Four-Qubit Entangled State
NASA Astrophysics Data System (ADS)
Tian, Juan-Hong; Zhang, Jian-Zhong; Li, Yan-Ping
2016-02-01
In this paper, we propose a multi-proxy blind signature scheme based on controlled teleportation. A genuine four-qubit entangled state functions as the quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. The security analysis shows that the scheme satisfies the security features of multi-proxy signatures: unforgeability, undeniability, blindness and unconditional security.
A multihop key agreement scheme for wireless ad hoc networks based on channel characteristics.
Hao, Zhuo; Zhong, Sheng; Yu, Nenghai
2013-01-01
A number of key agreement schemes based on wireless channel characteristics have been proposed recently. However, previous key agreement schemes require that two nodes which need to agree on a key are within the communication range of each other. Hence, they are not suitable for multihop wireless networks, in which nodes do not always have direct connections with each other. In this paper, we first propose a basic multihop key agreement scheme for wireless ad hoc networks. The proposed basic scheme is resistant to external eavesdroppers. Nevertheless, this basic scheme is not secure when there exist internal eavesdroppers or Man-in-the-Middle (MITM) adversaries. In order to cope with these adversaries, we propose an improved multihop key agreement scheme. We show that the improved scheme is secure against internal eavesdroppers and MITM adversaries in a single path. Both performance analysis and simulation results demonstrate that the improved scheme is efficient. Consequently, the improved key agreement scheme is suitable for multihop wireless ad hoc networks.
A Multihop Key Agreement Scheme for Wireless Ad Hoc Networks Based on Channel Characteristics
Yu, Nenghai
2013-01-01
A number of key agreement schemes based on wireless channel characteristics have been proposed recently. However, previous key agreement schemes require that two nodes which need to agree on a key are within the communication range of each other. Hence, they are not suitable for multihop wireless networks, in which nodes do not always have direct connections with each other. In this paper, we first propose a basic multihop key agreement scheme for wireless ad hoc networks. The proposed basic scheme is resistant to external eavesdroppers. Nevertheless, this basic scheme is not secure when there exist internal eavesdroppers or Man-in-the-Middle (MITM) adversaries. In order to cope with these adversaries, we propose an improved multihop key agreement scheme. We show that the improved scheme is secure against internal eavesdroppers and MITM adversaries in a single path. Both performance analysis and simulation results demonstrate that the improved scheme is efficient. Consequently, the improved key agreement scheme is suitable for multihop wireless ad hoc networks. PMID:23766725
Comparison of two SVD-based color image compression schemes.
Li, Ying; Wei, Musheng; Zhang, Fengxia; Zhao, Jianli
2017-01-01
Color image compression is a commonly used process to represent image data with as few bits as possible, which removes redundancy in the data while maintaining an appropriate level of quality for the user. Color image compression algorithms based on quaternions have become very common in recent years. In this paper, we propose a color image compression scheme based on the real SVD, named the real compression scheme. First, we form a new real rectangular matrix C according to the red, green and blue components of the original color image and perform the real SVD on C. Then we select several of the largest singular values and the corresponding vectors in the left and right unitary matrices to compress the color image. We compare the real compression scheme with the quaternion compression scheme obtained by performing the quaternion SVD using the real structure-preserving algorithm. We compare the two schemes in terms of operation count, assignment number, operation speed, PSNR and CR. The experimental results show that with the same number of selected singular values, the real compression scheme offers a higher CR and much less operation time, but a slightly lower PSNR than the quaternion compression scheme. When the two schemes have the same CR, the real compression scheme shows more prominent advantages in both operation time and PSNR.
Comparison of two SVD-based color image compression schemes
Li, Ying; Wei, Musheng; Zhang, Fengxia; Zhao, Jianli
2017-01-01
Color image compression is a commonly used process to represent image data with as few bits as possible, which removes redundancy in the data while maintaining an appropriate level of quality for the user. Color image compression algorithms based on quaternions have become very common in recent years. In this paper, we propose a color image compression scheme based on the real SVD, named the real compression scheme. First, we form a new real rectangular matrix C according to the red, green and blue components of the original color image and perform the real SVD on C. Then we select several of the largest singular values and the corresponding vectors in the left and right unitary matrices to compress the color image. We compare the real compression scheme with the quaternion compression scheme obtained by performing the quaternion SVD using the real structure-preserving algorithm. We compare the two schemes in terms of operation count, assignment number, operation speed, PSNR and CR. The experimental results show that with the same number of selected singular values, the real compression scheme offers a higher CR and much less operation time, but a slightly lower PSNR than the quaternion compression scheme. When the two schemes have the same CR, the real compression scheme shows more prominent advantages in both operation time and PSNR. PMID:28257451
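A compact numpy sketch of the real-SVD compression step described above: stack the R, G and B channels into one real matrix, truncate its SVD to the k largest singular values, and rebuild the channels. The vertical stacking direction and the choice of k are assumptions for illustration; the paper's exact construction of the matrix C may differ.

```python
import numpy as np

def compress_color_svd(rgb, k):
    """rgb: (H, W, 3) array. Stack channels vertically into a real matrix C,
    keep the k largest singular values, and reconstruct each channel."""
    h, w, _ = rgb.shape
    C = np.vstack([rgb[..., c] for c in range(3)]).astype(float)   # (3H, W)
    U, s, Vt = np.linalg.svd(C, full_matrices=False)
    C_k = (U[:, :k] * s[:k]) @ Vt[:k, :]                           # rank-k approximation
    return np.stack([C_k[c * h:(c + 1) * h, :] for c in range(3)], axis=-1)

# Toy usage on a random "image": roughly k*(3H + W + 1) values are stored
# instead of 3*H*W, which is where the compression ratio (CR) comes from.
rng = np.random.default_rng(3)
img = rng.random((64, 48, 3))
approx = compress_color_svd(img, k=10)
print(np.sqrt(np.mean((img - approx) ** 2)))        # reconstruction RMSE
```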
A Target Coverage Scheduling Scheme Based on Genetic Algorithms in Directional Sensor Networks
Gil, Joon-Min; Han, Youn-Hee
2011-01-01
As a promising tool for monitoring the physical world, directional sensor networks (DSNs) consisting of a large number of directional sensors are attracting increasing attention. As directional sensors in DSNs have limited battery power and restricted angles of sensing range, maximizing the network lifetime while monitoring all the targets in a given area remains a challenge. A major technique to conserve the energy of directional sensors is to use a node wake-up scheduling protocol by which some sensors remain active to provide sensing services, while the others are inactive to conserve their energy. In this paper, we first address a Maximum Set Covers for DSNs (MSCD) problem, which is known to be NP-complete, and present a greedy algorithm-based target coverage scheduling scheme that can solve this problem by heuristics. This scheme is used as a baseline for comparison. We then propose a target coverage scheduling scheme based on a genetic algorithm that can find the optimal cover sets to extend the network lifetime while monitoring all targets by the evolutionary global search technique. To verify and evaluate these schemes, we conducted simulations and showed that the schemes can contribute to extending the network lifetime. Simulation results indicated that the genetic algorithm-based scheduling scheme had better performance than the greedy algorithm-based scheme in terms of maximizing network lifetime. PMID:22319387
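A deliberately small genetic-algorithm sketch for the cover-set flavour of the problem above: a binary chromosome activates a subset of (hypothetical) directional sensors, the fitness rewards covering all targets with as few active sensors as possible, and standard selection, crossover and mutation evolve the population. This illustrates the evolutionary search idea only; it is not the authors' MSCD formulation or their lifetime objective.

```python
import random

COVERS = {0: {0, 1}, 1: {1, 2}, 2: {0, 3}, 3: {2, 3}, 4: {0, 2}}   # sensor -> covered targets
TARGETS = {0, 1, 2, 3}

def fitness(chrom):
    """Prefer chromosomes covering every target, then those with fewer active sensors."""
    covered = set().union(*[COVERS[i] for i, bit in enumerate(chrom) if bit])
    return (covered == TARGETS, -sum(chrom), len(covered))

def evolve(pop_size=20, generations=60, pm=0.1, seed=4):
    rng = random.Random(seed)
    n = len(COVERS)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]                    # truncation selection
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            cut = rng.randrange(1, n)
            child = a[:cut] + b[cut:]                   # one-point crossover
            child = [bit ^ (rng.random() < pm) for bit in child]   # bit-flip mutation
            children.append(child)
        pop = elite + children
    return max(pop, key=fitness)

print(evolve())   # e.g. a small set of sensors that still covers every target
```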
Shawky, S
2010-06-01
The current health insurance system in Egypt targets the productive population through an employment-based scheme bounded by a cost ceiling and focusing on curative care. Egypt Social Contract Survey data from 2005 were used to evaluate the impact of the employment-based scheme on health system accessibility and financing. Only 22.8% of the population in the productive age range (19-59 years) benefited from any health insurance scheme. The employment-based scheme covered 39.3% of the working population and was skewed towards urban areas, older people, females and the wealthier. It did not increase service utilization, but reduced out-of-pocket expenditure. Egypt should blend all health insurance schemes and adopt an innovative approach to reach universal coverage.
Quantum attack-resistant certificateless multi-receiver signcryption scheme.
Li, Huixian; Chen, Xubao; Pang, Liaojun; Shi, Weisong
2013-01-01
The existing certificateless signcryption schemes were designed mainly based on the traditional public key cryptography, in which the security relies on the hard problems, such as factor decomposition and discrete logarithm. However, these problems will be easily solved by the quantum computing. So the existing certificateless signcryption schemes are vulnerable to the quantum attack. Multivariate public key cryptography (MPKC), which can resist the quantum attack, is one of the alternative solutions to guarantee the security of communications in the post-quantum age. Motivated by these concerns, we proposed a new construction of the certificateless multi-receiver signcryption scheme (CLMSC) based on MPKC. The new scheme inherits the security of MPKC, which can withstand the quantum attack. Multivariate quadratic polynomial operations, which have lower computation complexity than bilinear pairing operations, are employed in signcrypting a message for a certain number of receivers in our scheme. Security analysis shows that our scheme is a secure MPKC-based scheme. We proved its security under the hardness of the Multivariate Quadratic (MQ) problem and its unforgeability under the Isomorphism of Polynomials (IP) assumption in the random oracle model. The analysis results show that our scheme also has the security properties of non-repudiation, perfect forward secrecy, perfect backward secrecy and public verifiability. Compared with the existing schemes in terms of computation complexity and ciphertext length, our scheme is more efficient, which makes it suitable for terminals with low computation capacity like smart cards.
NASA Astrophysics Data System (ADS)
Nisar, Ubaid Ahmed; Ashraf, Waqas; Qamar, Shamsul
2016-08-01
Numerical solutions of the hydrodynamical model of semiconductor devices are presented in one and two space dimensions. The model describes charge transport in semiconductor devices. Mathematically, the model can be written as a convection-diffusion type system with a right-hand side describing the relaxation effects and the interaction with a self-consistent electric field. The proposed numerical scheme is a splitting scheme based on the conservation element and solution element (CE/SE) method for the hyperbolic step and a semi-implicit scheme for the relaxation step. The numerical results of the suggested scheme are compared with those of a splitting scheme based on the Nessyahu-Tadmor (NT) central scheme for the convection step and the same semi-implicit scheme for the relaxation step. The effects of various parameters such as low-field mobility, device length, lattice temperature and voltage for the one-dimensional hydrodynamic model are explored to further validate the generic applicability of the CE/SE method for the current model equations. A two-dimensional simulation is also performed by the CE/SE method for a MESFET device, producing results in good agreement with those obtained by the NT central scheme.
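A minimal 1-D illustration of the splitting idea in the abstract (a hyperbolic/convection step followed by a semi-implicit relaxation step), applied to the toy model u_t + a u_x = -(u - u_eq)/tau. A first-order upwind update stands in for the CE/SE or NT convection step; the hydrodynamic model itself is not reproduced.

```python
import numpy as np

def split_step(u, a, dx, dt, u_eq, tau):
    """One operator-splitting step:
    (1) explicit upwind update for u_t + a*u_x = 0 (stand-in for the CE/SE step),
    (2) semi-implicit (backward Euler) update for the stiff relaxation source:
        u_new = (u + dt/tau * u_eq) / (1 + dt/tau)."""
    u = u - a * dt / dx * (u - np.roll(u, 1))      # convection, a > 0, periodic boundary
    return (u + dt / tau * u_eq) / (1.0 + dt / tau)

# Toy run: a square pulse is advected while relaxing toward u_eq = 0.
nx, a = 200, 1.0
dx = 1.0 / nx
dt, tau = 0.4 * dx / a, 5e-3
u = np.where((np.arange(nx) > 40) & (np.arange(nx) < 80), 1.0, 0.0)
for _ in range(100):
    u = split_step(u, a, dx, dt, u_eq=0.0, tau=tau)
print(u.max())      # pulse amplitude decays because of the relaxation term
```

Treating the relaxation source implicitly keeps the step stable even when tau is much smaller than dt, which is the motivation for the semi-implicit relaxation step mentioned in the abstract.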
Lu, Yanrong; Li, Lixiang; Peng, Haipeng; Yang, Yixian
2015-03-01
Telecare medical information systems (TMISs) enable patients to conveniently enjoy telecare services at home. The protection of patients' privacy is a key issue due to the openness of the communication environment. Authentication, as a typical approach, is adopted to guarantee confidential and authorized interaction between the patient and the remote server. In order to achieve these goals, numerous remote authentication schemes based on cryptography have been presented. Recently, Arshad et al. (J Med Syst 38(12): 2014) presented a secure and efficient three-factor authenticated key exchange scheme to remedy the weaknesses of Tan et al.'s scheme (J Med Syst 38(3): 2014). In this paper, we found that a successful off-line password attack on Arshad et al.'s scheme allows an adversary to impersonate any user of the system. In order to thwart these security attacks, an enhanced biometric and smart card based remote authentication scheme for TMISs is proposed. In addition, the BAN logic is applied to demonstrate the completeness of the enhanced scheme. Security and performance analyses show that our enhanced scheme satisfies more security properties and has a lower computational cost than previously proposed schemes.
Cryptanalysis of Chatterjee-Sarkar Hierarchical Identity-Based Encryption Scheme at PKC 06
NASA Astrophysics Data System (ADS)
Park, Jong Hwan; Lee, Dong Hoon
In 2006, Chatterjee and Sarkar proposed a hierarchical identity-based encryption (HIBE) scheme which can support an unbounded number of identity levels. This property is particularly useful in providing forward secrecy by embedding time components within hierarchical identities. In this paper we show that their scheme does not provide the claimed property. Our analysis shows that if the number of identity levels becomes larger than the value of a fixed public parameter, an unintended receiver can reconstruct a new valid ciphertext and decrypt the ciphertext using his or her own private key. The analysis is similarly applied to a multi-receiver identity-based encryption scheme presented as an application of Chatterjee and Sarkar's HIBE scheme.
Algorithms for adaptive stochastic control for a class of linear systems
NASA Technical Reports Server (NTRS)
Toda, M.; Patel, R. V.
1977-01-01
Control of linear, discrete time, stochastic systems with unknown control gain parameters is discussed. Two suboptimal adaptive control schemes are derived: one is based on underestimating future control and the other is based on overestimating future control. Both schemes require little on-line computation and incorporate in their control laws some information on estimation errors. The performance of these laws is studied by Monte Carlo simulations on a computer. Two single input, third order systems are considered, one stable and the other unstable, and the performance of the two adaptive control schemes is compared with that of the scheme based on enforced certainty equivalence and the scheme where the control gain parameters are known.
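A hypothetical, minimal sketch of the enforced-certainty-equivalence baseline mentioned above: the unknown control gain b of a scalar system x_{k+1} = a x_k + b u_k + w_k is estimated by recursive least squares and then treated as if it were exact in a deadbeat control law. The specific system, estimator and control law are illustrative, not the paper's schemes.

```python
import numpy as np

def simulate_ce_adaptive(a=1.2, b_true=0.8, steps=50, seed=5):
    """Scalar adaptive control: estimate the unknown gain b online and apply
    the certainty-equivalence deadbeat law u = -a*x/b_hat at every step."""
    rng = np.random.default_rng(seed)
    x, b_hat, p = 1.0, 0.1, 100.0          # state, gain estimate, estimate variance
    for _ in range(steps):
        u = -a * x / b_hat                 # certainty equivalence: treat b_hat as exact
        x_next = a * x + b_true * u + 0.01 * rng.standard_normal()
        # Recursive least squares update of b_hat from x_next = a*x + b*u + noise.
        if abs(u) > 1e-9:
            k = p * u / (1.0 + p * u * u)
            b_hat += k * (x_next - a * x - b_hat * u)
            p *= 1.0 - k * u
        x = x_next
    return x, b_hat

print(simulate_ce_adaptive())   # state driven near zero, b_hat near the true gain
```

The schemes discussed in the abstract go beyond this baseline by folding information about the estimation error into the control law rather than ignoring it.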
Efficient Fair Exchange from Identity-Based Signature
NASA Astrophysics Data System (ADS)
Yum, Dae Hyun; Lee, Pil Joong
A fair exchange scheme is a protocol by which two parties Alice and Bob exchange items or services without allowing either party to gain an advantage by quitting prematurely or otherwise misbehaving. To this end, modern cryptographic solutions use a semi-trusted arbitrator who is involved only in cases where one party attempts to cheat or simply crashes. We call such a fair exchange scheme optimistic. When no registration is required between the signer and the arbitrator, we say that the fair exchange scheme is setup free. To date, a setup-free optimistic fair exchange scheme under the standard RSA assumption was only possible from the generic construction of [12], which uses ring signatures. In this paper, we introduce a new setup-free optimistic fair exchange scheme under the standard RSA assumption. Our scheme uses the GQ identity-based signature and is more efficient than [12]. The construction can also be generalized by using various identity-based signature schemes. Our main technique is to allow each user to choose his (or her) own "random" public key in the identity-based signature scheme.
Genetic and economic evaluation of Japanese Black (Wagyu) cattle breeding schemes.
Kahi, A K; Hirooka, H
2005-09-01
Deterministic simulation was used to evaluate 10 breeding schemes for genetic gain and profitability in the context of maximizing returns from investment in Japanese Black cattle breeding. A breeding objective that integrated the cow-calf and feedlot segments was considered. Ten breeding schemes that differed in the records available for use as selection criteria were defined. The schemes ranged from one that used the carcass traits currently available to Japanese Black cattle breeders (Scheme 1) to one that also included linear measurements and male and female reproduction traits (Scheme 10). The latter scheme represented the highest level of performance recording. In all breeding schemes, sires were chosen from the proportion selected during the first selection stage (performance testing), modeling a two-stage selection process. The effect on genetic gain and profitability of varying test capacity and number of progeny per sire and of ultrasound scanning of live animals was examined for all breeding schemes. Breeding schemes that selected young bulls during performance testing based on additional individual traits and on information on carcass traits from their relatives generated additional genetic gain and profitability. Increasing test capacity resulted in an increase in genetic gain in all schemes. Profitability was optimal in Schemes 2 (a scheme similar to Scheme 1, but in which selection of young bulls was also based on information on carcass traits from their relatives) through 10 when 900 to 1,000 places were available for performance testing. Similarly, as the number of progeny used in the selection of sires increased, genetic gain first increased sharply and then gradually in all schemes. Profit was optimal across all breeding schemes when sires were selected based on information from 150 to 200 progeny. Additional genetic gain and profitability were generated in each breeding scheme by ultrasound scanning of live animals for carcass traits. Ultrasound scanning of live animals was more important than the addition of any other traits to the selection criteria. These results may be used to provide guidance to Japanese Black cattle breeders.
Das, Ashok Kumar
2015-03-01
An integrated EPR (Electronic Patient Record) information system of all the patients provides medical institutions and academia with most of the patients' information in detail, enabling them to make corrective and clinical decisions in order to maintain and analyze patients' health. In such a system, illegal access must be restricted and information theft during transmission over the insecure Internet must be prevented. Lee et al. proposed an efficient password-based remote user authentication scheme using smart cards for the integrated EPR information system. Their scheme is very efficient due to the usage of a one-way hash function and bitwise exclusive-or (XOR) operations. However, in this paper, we show that though their scheme is very efficient, it has three security weaknesses: (1) it has design flaws in the password change phase, (2) it fails to protect against privileged insider attack and (3) it lacks formal security verification. We also find that another recently proposed scheme, Wen's scheme, has the same security drawbacks as Lee et al.'s scheme. In order to remedy the security weaknesses found in Lee et al.'s scheme and Wen's scheme, we propose a secure and efficient password-based remote user authentication scheme using smart cards for the integrated EPR information system. We show that our scheme is as efficient as Lee et al.'s scheme and Wen's scheme, as it only uses a one-way hash function and bitwise exclusive-or (XOR) operations. Through the security analysis, we show that our scheme is secure against possible known attacks. Furthermore, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that our scheme is secure against passive and active attacks.
Multi Sensor Fusion Using Fitness Adaptive Differential Evolution
NASA Astrophysics Data System (ADS)
Giri, Ritwik; Ghosh, Arnob; Chowdhury, Aritra; Das, Swagatam
The rising popularity of multi-source, multi-sensor networks supporting real-life applications calls for an efficient and intelligent approach to information fusion. Traditional optimization techniques often fail to meet the demands. The evolutionary approach provides a valuable alternative due to its inherent parallel nature and its ability to deal with difficult problems. We present a new evolutionary approach based on a modified version of Differential Evolution (DE), called Fitness Adaptive Differential Evolution (FiADE). FiADE treats sensors in the network as distributed intelligent agents with various degrees of autonomy. Existing approaches based on intelligent agents cannot completely answer the question of how their agents could coordinate their decisions in a complex environment. The proposed approach is formulated to produce good results for problems that are high-dimensional, highly nonlinear, and random. The proposed approach gives better results in the case of optimal allocation of sensors. The performance of the proposed approach is compared with that of an evolutionary algorithm, the coordination generalized particle model (C-GPM).
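A compact DE/rand/1/bin sketch with a simple fitness-dependent scale factor, meant only to illustrate the kind of algorithm FiADE builds on; the actual FiADE adaptation rules for F and Cr are not reproduced here, and the test function and parameters are arbitrary.

```python
import numpy as np

def de_fitness_adaptive(fobj, bounds, pop_size=30, gens=200, cr=0.9, seed=6):
    """Minimize fobj with DE/rand/1/bin; the scale factor F grows with how far
    an individual's fitness is from the current best (illustrative rule only)."""
    rng = np.random.default_rng(seed)
    dim = len(bounds)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([fobj(x) for x in pop])
    for _ in range(gens):
        for i in range(pop_size):
            idx = rng.choice([j for j in range(pop_size) if j != i], 3, replace=False)
            a, b, c = pop[idx]
            f = 0.4 + 0.5 * (fit[i] - fit.min()) / (fit.max() - fit.min() + 1e-12)
            mutant = np.clip(a + f * (b - c), lo, hi)          # mutation
            cross = rng.random(dim) < cr                       # binomial crossover
            cross[rng.integers(dim)] = True
            trial = np.where(cross, mutant, pop[i])
            f_trial = fobj(trial)
            if f_trial <= fit[i]:                              # greedy selection
                pop[i], fit[i] = trial, f_trial
    return pop[fit.argmin()], fit.min()

sphere = lambda x: float(np.sum(x**2))
print(de_fitness_adaptive(sphere, bounds=[(-5, 5)] * 5))
```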
Progressive simplification and transmission of building polygons based on triangle meshes
NASA Astrophysics Data System (ADS)
Li, Hongsheng; Wang, Yingjie; Guo, Qingsheng; Han, Jiafu
2010-11-01
Digital earth is a virtual representation of our planet and a data integration platform which aims at harnessing multisource, multi-resolution, multi-format spatial data. This paper introduces a research framework integrating progressive cartographic generalization and transmission of vector data. Progressive cartographic generalization provides multi-resolution data from coarse to fine, both at key scales and as increments between them, which is not available in the traditional generalization framework. Based on the progressive simplification algorithm, the building polygons are triangulated into meshes and encoded according to the simplification sequence of two basic operations, edge collapse and vertex split. The map data at key scales and the encoded increments between them are stored in a multi-resolution file. As the client submits requests to the server, the coarsest map is transmitted first and then the increments. After data decoding and mesh refinement, the building polygons are visualized with more detail. Progressive generalization and transmission of building polygons is demonstrated in the paper.
Multi-Source Sensor Fusion for Small Unmanned Aircraft Systems Using Fuzzy Logic
NASA Technical Reports Server (NTRS)
Cook, Brandon; Cohen, Kelly
2017-01-01
As applications for using small Unmanned Aircraft Systems (sUAS) beyond visual line of sight (BVLOS) continue to grow in the coming years, it is imperative that intelligent sensor fusion techniques be explored. In BVLOS scenarios the vehicle position must be accurately tracked over time to ensure that no two vehicles collide with one another, that no vehicle crashes into surrounding structures, and that off-nominal scenarios are identified. Therefore, in this study an intelligent systems approach is used to estimate the position of sUAS given a variety of sensor platforms, including GPS, radar, and on-board detection hardware. Common research challenges include asynchronous sensor rates and sensor reliability. To address these challenges, techniques such as Maximum a Posteriori estimation and a Fuzzy Logic based sensor confidence determination are used.
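As a hedged sketch of the fusion idea (not the paper's fuzzy inference system), the snippet below weights asynchronous position reports by a simple confidence that decays with report age and sensor noise, and fuses them into one estimate; the decay rule and the sensor values are illustrative assumptions.

```python
import numpy as np

def fuse_position(readings, now):
    """Fuse asynchronous 2-D position reports into one estimate.
    Each reading is (position, timestamp, noise_std). Confidence falls with
    staleness and reported noise, a crude stand-in for the fuzzy-logic
    confidence and MAP estimation mentioned in the abstract."""
    weights, positions = [], []
    for pos, t, sigma in readings:
        staleness = max(now - t, 0.0)
        confidence = 1.0 / (1.0 + staleness)   # older reports count less
        precision = 1.0 / (sigma ** 2)         # noisier sensors count less
        weights.append(confidence * precision)
        positions.append(np.asarray(pos, dtype=float))
    w = np.array(weights)
    return (w[:, None] * np.vstack(positions)).sum(axis=0) / w.sum()

# Hypothetical reports from GPS, radar and on-board detection at different times.
reports = [((10.0, 5.0), 0.8, 3.0),    # GPS
           ((10.6, 4.7), 1.0, 1.5),    # radar
           ((9.5, 5.4), 0.2, 5.0)]     # on-board detection, stale and noisy
print(fuse_position(reports, now=1.0))
```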
Yang, Yi-Feng
2013-12-01
The present paper evaluates the relation between transformational leadership and market orientation along with the mediating and moderating effects of change commitment for employees in customer centers in Taiwan. 327 questionnaires were returned by personnel at several customer centers in four different insurance companies. Inter-rater agreement was acceptable based on the multiple raters (i.e., the consumer-related employees from the division groups) of one individual (i.e., a manager)--indicating the aggregated measures were acceptable. The multi-source sample comprised data taken from the four division centers: phone services, customer representatives, financial specialists, and front-line salespeople. The relations were assessed using a multiple mediation procedure incorporating bootstrap techniques and PRODCLIN2 with structural equation modeling analysis. The results reflect a mediating role for change commitment.
WHO Expert Committee on Specifications for Pharmaceutical Preparations.
2014-01-01
The Expert Committee on Specifications for Pharmaceutical Preparations works towards clear, independent and practical standards and guidelines for the quality assurance of medicines. Standards are developed by the Committee through worldwide consultation and an international consensus-building process. The following new guidelines were adopted and recommended for use, in addition to 20 monographs and general texts for inclusion in The International Pharmacopoeia and 11 new International Chemical Reference Substances. The International Pharmacopoeia--updating mechanism for the section on radiopharmaceuticals; WHO good manufacturing practices for pharmaceutical products: main principles; Model quality assurance system for procurement agencies; Assessment tool based on the model quality assurance system for procurement agencies: aide-memoire for inspection; Guidelines on submission of documentation for prequalification of finished pharmaceutical products approved by stringent regulatory authorities; and Guidelines on submission of documentation for a multisource (generic) finished pharmaceutical product: quality part.
Ultra-compact coherent receiver with serial interface for pluggable transceiver.
Itoh, Toshihiro; Nakajima, Fumito; Ohno, Tetsuichiro; Yamanaka, Shogo; Soma, Shunichi; Saida, Takashi; Nosaka, Hideyuki; Murata, Koichi
2014-09-22
An ultra-compact integrated coherent receiver with a volume of 1.3 cc using a quad-channel transimpedance amplifier (TIA)-IC chip with a serial peripheral interface (SPI) is demonstrated for the first time. The TIA with the SPI and photodiode (PD) bias circuits, a miniature dual polarization optical hybrid, an octal-PD and small optical coupling system enabled the realization of the compact receiver. Measured transmission performance with 32 Gbaud dual-polarization quadrature phase shift keying signal is equivalent to that of the conventional multi-source agreement-based integrated coherent receiver with dual channel TIA-ICs. By comparing the bit-error rate (BER) performance with that under continuous SPI access, we also confirmed that there is no BER degradation caused by SPI interface access. Such an ultra-compact receiver is promising for realizing a new generation of pluggable transceivers.
Object-oriented recognition of high-resolution remote sensing image
NASA Astrophysics Data System (ADS)
Wang, Yongyan; Li, Haitao; Chen, Hong; Xu, Yuannan
2016-01-01
With the development of remote sensing imaging technology and the improvement of the resolution of multi-source imagery in the visible, multispectral and hyperspectral domains, high-resolution remote sensing images have been widely used in various fields, for example the military, surveying and mapping, geophysical prospecting, and the environment. In remote sensing imagery, the segmentation of ground targets, feature extraction and automatic recognition are hot and difficult topics in modern information technology research. This paper presents an object-oriented remote sensing image scene classification method. The method consists of typical-object (vehicle) classification generation, nonparametric density estimation, mean shift segmentation, a multi-scale corner detection algorithm, and template-based local shape matching. A remote sensing vehicle image classification software system is designed and implemented to meet these requirements.
Construction of Low Dissipative High Order Well-Balanced Filter Schemes for Non-Equilibrium Flows
NASA Technical Reports Server (NTRS)
Wang, Wei; Yee, H. C.; Sjogreen, Bjorn; Magin, Thierry; Shu, Chi-Wang
2009-01-01
The goal of this paper is to generalize the well-balanced approach for non-equilibrium flow studied by Wang et al. [26] to a class of low dissipative high order shock-capturing filter schemes and to explore more advantages of well-balanced schemes in reacting flows. The class of filter schemes developed by Yee et al. [30], Sjoegreen & Yee [24] and Yee & Sjoegreen [35] consists of two steps, a full time step of a spatially high order non-dissipative base scheme and an adaptive nonlinear filter containing shock-capturing dissipation. A good property of the filter scheme is that the base scheme and the filter are stand-alone modules by design. Therefore, the idea of designing a well-balanced filter scheme is straightforward, i.e., choosing a well-balanced base scheme with a well-balanced filter (both with high order). A typical class of these schemes shown in this paper is the high order central difference/predictor-corrector (PC) schemes with a high order well-balanced WENO filter. The new filter scheme combines the features of filter methods with the well-balanced property: it can preserve certain steady state solutions exactly; it is able to capture small perturbations, e.g., turbulence fluctuations; and it adaptively controls numerical dissipation. Thus it shows high accuracy, efficiency and stability in shock/turbulence interactions. Numerical examples containing 1D and 2D smooth problems, a 1D stationary contact discontinuity problem and 1D turbulence/shock interactions are included to verify the improved accuracy, in addition to the well-balanced behavior.
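The two-step structure (non-dissipative base scheme, then an adaptive nonlinear filter) can be sketched on a toy problem. The snippet below is only a structural illustration for 1D linear advection with a first-order time step and a crude second-difference sensor; it is not high order, not well-balanced, and not the WENO filter of the paper.

```python
import numpy as np

def base_step(u, a, dx, dt):
    """Non-dissipative base scheme: one forward-Euler step of central differencing
    for u_t + a u_x = 0 (purely illustrative, not the paper's base scheme)."""
    dudx = (np.roll(u, -1) - np.roll(u, 1)) / (2.0 * dx)
    return u - dt * a * dudx

def filter_step(u, eps=0.5):
    """Adaptive nonlinear filter: add limited dissipation only where the local
    second difference (a crude shock sensor) is large."""
    d2 = np.roll(u, -1) - 2.0 * u + np.roll(u, 1)
    sensor = np.abs(d2) / (np.abs(u).max() + 1e-12)
    return u + eps * np.where(sensor > 0.01, d2, 0.0)

# Advect a smooth profile on a periodic grid; the two stages stay separate modules,
# which is the design property emphasised in the abstract.
n, a = 200, 1.0
x = np.linspace(0.0, 1.0, n, endpoint=False)
dx = x[1] - x[0]
dt = 0.4 * dx / a
u = np.sin(2 * np.pi * x)
for _ in range(100):
    u = filter_step(base_step(u, a, dx, dt))
print("max |u| after 100 steps:", float(np.abs(u).max()))
```

The point of the sketch is modularity: either stage can be swapped for a higher-order or well-balanced variant without touching the other.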
A Secure ECC-based RFID Mutual Authentication Protocol to Enhance Patient Medication Safety.
Jin, Chunhua; Xu, Chunxiang; Zhang, Xiaojun; Li, Fagen
2016-01-01
Patient medication safety is an important issue in patient medication systems. In order to prevent medication errors, integrating Radio Frequency Identification (RFID) technology into automated patient medication systems is required in hospitals. Based on RFID technology, such systems can provide medical evidence for patients' prescriptions and medicine doses, etc. Because of the mutual authentication between the medication server and the tag, an RFID authentication scheme is the best choice for automated patient medication systems. In this paper, we present an RFID mutual authentication scheme based on elliptic curve cryptography (ECC) to enhance patient medication safety. Our scheme meets the security requirements and overcomes various attacks that exist in other schemes. In addition, our scheme has better performance in terms of computational cost and communication overhead. Therefore, the proposed scheme is well suited for patient medication systems.
Young, Bridget; Ward, Jo; Forsey, Mary; Gravenhorst, Katja; Salmon, Peter
2011-10-01
We explored parent-doctor relationships in the care of children with leukaemia from three perspectives simultaneously: parents', doctors' and observers'. Our aim was to investigate convergence and divergence between these perspectives and thereby examine the validity of a unitary theory of emotionality and authority in clinical relationships. We qualitatively analysed 33 audiorecorded parent-doctor consultations and separate interviews with parents and doctors, from which we selected three prototype cases. Across the whole sample, doctors' sense of relationship generally converged with our observations of consultations, but parents' sense of relationship diverged strongly from each. Contrary to current assumptions, parents' sense of emotional connection with doctors did not depend on doctors' emotional behaviour, and parents did not feel disempowered by doctors' authority. Moreover, authority and emotionality were not conceptually distinct for parents, who gained emotional support from doctors' exercise of authority. The relationships looked very different from the three perspectives. These divergences indicate weaknesses in current ideas of emotionality and authority in clinical relationships and the necessity of multisource datasets to develop these ideas in a way that characterises clinical relationships from all perspectives. Methodological development will be needed to address the challenges posed by multisource datasets. Copyright © 2010 Elsevier Ireland Ltd. All rights reserved.
Greenbaum, Rebecca L; Quade, Matthew J; Mawritz, Mary B; Kim, Joongseo; Crosby, Durand
2014-11-01
We integrate deontological ethics (Folger, 1998, 2001; Kant, 1785/1948, 1797/1991) with conservation of resources theory (Hobfoll, 1989) to propose that an employee's repeated exposure to violations of moral principle can diminish the availability of resources to appropriately attend to other personal and work domains. In particular, we identify customer unethical behavior as a morally charged work demand that leads to a depletion of resources as captured by employee emotional exhaustion. In turn, emotionally exhausted employees experience higher levels of work-family conflict, relationship conflict with coworkers, and job neglect. Employee emotional exhaustion serves as the mediator between customer unethical behavior and such outcomes. To provide further evidence of a deontological effect, we demonstrate the unique effect of customer unethical behavior onto emotional exhaustion beyond perceptions of personal mistreatment and trait negative affectivity. In Study 1, we found support for our theoretical model using multisource field data from customer-service professionals across a variety of industries. In Study 2, we also found support for our theoretical model using multisource, longitudinal field data from service employees in a large government organization. Theoretical and practical implications are discussed. (PsycINFO Database Record (c) 2014 APA, all rights reserved).
Multi-source remotely sensed data fusion for improving land cover classification
NASA Astrophysics Data System (ADS)
Chen, Bin; Huang, Bo; Xu, Bing
2017-02-01
Although many advances have been made in past decades, land cover classification of fine-resolution remotely sensed (RS) data integrating multiple temporal, angular, and spectral features remains limited, and the contribution of different RS features to land cover classification accuracy remains uncertain. We proposed to improve land cover classification accuracy by integrating multi-source RS features through data fusion. We further investigated the effect of different RS features on classification performance. The results of fusing Landsat-8 Operational Land Imager (OLI) data with Moderate Resolution Imaging Spectroradiometer (MODIS), China Environment 1A series (HJ-1A), and Advanced Spaceborne Thermal Emission and Reflection (ASTER) digital elevation model (DEM) data, showed that the fused data integrating temporal, spectral, angular, and topographic features achieved better land cover classification accuracy than the original RS data. Compared with the topographic feature, the temporal and angular features extracted from the fused data played more important roles in classification performance, especially those temporal features containing abundant vegetation growth information, which markedly increased the overall classification accuracy. In addition, the multispectral and hyperspectral fusion successfully discriminated detailed forest types. Our study provides a straightforward strategy for hierarchical land cover classification by making full use of available RS data. All of these methods and findings could be useful for land cover classification at both regional and global scales.
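A minimal sketch of the multi-source integration step, assuming per-pixel features have already been extracted from the fused products: the feature groups are stacked and fed to a supervised classifier, and group-wise feature importances give a rough view of each source's contribution. The data here are synthetic and the classifier choice is an assumption, not the paper's method.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for per-pixel features from different sources: spectral bands
# of the fused fine-resolution image, a temporal vegetation-index profile, angular
# features and elevation from a DEM. None of this is the paper's data.
rng = np.random.default_rng(0)
n_pixels = 2000
spectral = rng.normal(size=(n_pixels, 6))
temporal = rng.normal(size=(n_pixels, 12))
angular = rng.normal(size=(n_pixels, 2))
topographic = rng.normal(size=(n_pixels, 1))

# Pseudo land-cover labels that depend on a few features, so the demo is learnable.
score = spectral[:, 0] + 0.5 * temporal[:, :3].sum(axis=1) + 0.3 * topographic[:, 0]
labels = np.digitize(score, np.quantile(score, [0.25, 0.5, 0.75]))

# Multi-source integration by feature stacking, then supervised classification.
X = np.hstack([spectral, temporal, angular, topographic])
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("overall accuracy:", round(clf.score(X_test, y_test), 3))

# Group-wise sums of feature importances hint at each source's contribution.
groups = {"spectral": slice(0, 6), "temporal": slice(6, 18),
          "angular": slice(18, 20), "topographic": slice(20, 21)}
for name, sl in groups.items():
    print(name, round(float(clf.feature_importances_[sl].sum()), 3))
```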
Binaural segregation in multisource reverberant environments.
Roman, Nicoleta; Srinivasan, Soundararajan; Wang, DeLiang
2006-12-01
In a natural environment, speech signals are degraded by both reverberation and concurrent noise sources. While human listening is robust under these conditions using only two ears, current two-microphone algorithms perform poorly. The psychological process of figure-ground segregation suggests that the target signal is perceived as a foreground while the remaining stimuli are perceived as a background. Accordingly, the goal is to estimate an ideal time-frequency (T-F) binary mask, which selects the target if it is stronger than the interference in a local T-F unit. In this paper, a binaural segregation system that extracts the reverberant target signal from multisource reverberant mixtures by utilizing only the location information of target source is proposed. The proposed system combines target cancellation through adaptive filtering and a binary decision rule to estimate the ideal T-F binary mask. The main observation in this work is that the target attenuation in a T-F unit resulting from adaptive filtering is correlated with the relative strength of target to mixture. A comprehensive evaluation shows that the proposed system results in large SNR gains. In addition, comparisons using SNR as well as automatic speech recognition measures show that this system outperforms standard two-microphone beamforming approaches and a recent binaural processor.
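The ideal time-frequency binary mask the abstract refers to has a simple definition: keep a T-F unit when the target is stronger than the interference there. The sketch below computes such a mask on toy magnitude spectrograms; the local criterion value and the random test data are assumptions.

```python
import numpy as np

def ideal_binary_mask(target_tf, interference_tf, lc_db=0.0):
    """Ideal time-frequency binary mask: keep a T-F unit when the target energy
    exceeds the interference energy by the local criterion lc_db (in dB)."""
    target_db = 10.0 * np.log10(np.abs(target_tf) ** 2 + 1e-12)
    interf_db = 10.0 * np.log10(np.abs(interference_tf) ** 2 + 1e-12)
    return (target_db - interf_db > lc_db).astype(float)

# Toy spectrograms: random magnitudes stand in for STFT frames of target and noise.
rng = np.random.default_rng(1)
target = rng.rayleigh(1.0, size=(257, 100))
interference = rng.rayleigh(0.8, size=(257, 100))
mask = ideal_binary_mask(target, interference)
separated = mask * (target + interference)   # apply the mask to the mixture
print("fraction of units assigned to the target:", float(mask.mean()))
```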
A Multisource Approach to Assessing Child Maltreatment From Records, Caregivers, and Children.
Sierau, Susan; Brand, Tilman; Manly, Jody Todd; Schlesier-Michel, Andrea; Klein, Annette M; Andreas, Anna; Garzón, Leonhard Quintero; Keil, Jan; Binser, Martin J; von Klitzing, Kai; White, Lars O
2017-02-01
Practitioners and researchers alike face the challenge that different sources report inconsistent information regarding child maltreatment. The present study capitalizes on concordance and discordance between different sources and probes applicability of a multisource approach to data from three perspectives on maltreatment-Child Protection Services (CPS) records, caregivers, and children. The sample comprised 686 participants in early childhood (3- to 8-year-olds; n = 275) or late childhood/adolescence (9- to 16-year-olds; n = 411), 161 from two CPS sites and 525 from the community oversampled for psychosocial risk. We established three components within a factor-analytic approach: the shared variance between sources on presence of maltreatment (convergence), nonshared variance resulting from the child's own perspective, and the caregiver versus CPS perspective. The shared variance between sources was the strongest predictor of caregiver- and self-reported child symptoms. Child perspective and caregiver versus CPS perspective mainly added predictive strength of symptoms in late childhood/adolescence over and above convergence in the case of emotional maltreatment, lack of supervision, and physical abuse. By contrast, convergence almost fully accounted for child symptoms for failure to provide. Our results suggest consistent information from different sources reporting on maltreatment is, on average, the best indicator of child risk.
Molina, Iñigo; Martinez, Estibaliz; Arquero, Agueda; Pajares, Gonzalo; Sanchez, Javier
2012-01-01
Landcover is subject to continuous changes on a wide variety of temporal and spatial scales. Those changes produce significant effects on human and natural activities. Maintaining an updated spatial database of the changes that have occurred allows better monitoring of the Earth's resources and management of the environment. Change detection (CD) techniques using images from different sensors, such as satellite imagery, aerial photographs, etc., have proven to be suitable and secure data sources from which updated information can be extracted efficiently, so that changes can also be inventoried and monitored. In this paper, a multisource CD methodology for multiresolution datasets is applied. First, different change indices are processed; then, different thresholding algorithms for change/no_change are applied to these indices in order to better estimate the statistical parameters of these categories; finally, the indices are integrated into a multisource change detection fusion process, which allows a single CD result to be generated from several combinations of indices. This methodology has been applied to datasets with different spectral and spatial resolution properties. The results are then evaluated by means of a quality control analysis, as well as with complementary graphical representations. The suggested methodology has also proved efficient for identifying the change detection index with the highest contribution. PMID:22737023
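A minimal sketch of the pipeline's two final stages, thresholding each change index and fusing the binary maps, under simplifying assumptions: a mean-plus-k-standard-deviations threshold stands in for the paper's thresholding algorithms, and majority voting stands in for its fusion process.

```python
import numpy as np

def change_map(index, k=1.5):
    """Threshold a change index into change / no_change using a simple
    mean + k*std rule (one of many thresholding choices; purely illustrative)."""
    return index > index.mean() + k * index.std()

def fuse_change_maps(maps):
    """Multisource fusion by majority vote over the per-index binary maps."""
    stack = np.stack(maps).astype(int)
    return stack.sum(axis=0) > stack.shape[0] / 2

# Synthetic change indices (e.g., image difference, ratio, NDVI difference) for a
# 100 x 100 scene; a square patch of simulated change is added to each index.
rng = np.random.default_rng(2)
indices = [rng.normal(0.0, 1.0, size=(100, 100)) for _ in range(3)]
for idx in indices:
    idx[40:60, 40:60] += 4.0

fused = fuse_change_maps([change_map(idx) for idx in indices])
print("detected change pixels:", int(fused.sum()))
```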
Family physician practice visits arising from the Alberta Physician Achievement Review
2013-01-01
Background Licensed physicians in Alberta are required to participate in the Physician Achievement Review (PAR) program every 5 years, comprising multi-source feedback questionnaires with confidential feedback, and practice visits for a minority of physicians. We wished to identify and classify issues requiring change or improvement from the family practice visits, and the responses to advice. Methods Retrospective analysis of narrative practice visit reports data using a mixed methods design to study records of visits to 51 family physicians and general practitioners who participated in PAR during the period 2010 to 2011, and whose ratings in one or more major assessment domains were significantly lower than their peer group. Results Reports from visits to the practices of family physicians and general practitioners confirmed opportunities for change and improvement, with two main groupings – practice environment and physician performance. For 40/51 physicians (78%) suggested actions were discussed with physicians and changes were confirmed. Areas of particular concern included problems arising from practice isolation and diagnostic conclusions being reached with incomplete clinical evidence. Conclusion This study provides additional evidence for the construct validity of a regulatory authority educational program in which multi-source performance feedback identifies areas for practice quality improvement, and change is encouraged by supplementary contact for selected physicians. PMID:24010980
Changes in the flood frequency in the Mahanadi basin under observed and projected future climate
NASA Astrophysics Data System (ADS)
Modi, P. A.; Lakshmi, V.; Mishra, V.
2017-12-01
The Mahanadi river basin is vulnerable to multiple types of extreme events due to its topography and river networks. These extreme events are not efficiently captured by current LSMs, partly due to the lack of spatial hydrological data and uncertainty in the models. This study compares and evaluates the hydrologic simulations of the recently developed community Noah model with multi-parameterization options (Noah-MP), an upgrade of the baseline Noah LSM. The model is calibrated and validated for the Mahanadi river basin and is driven by major atmospheric forcing from the Indian Meteorological Department (IMD), Global Precipitation Measurement (GPM), Tropical Rainfall Measuring Mission (TRMM) and Multi-Source Weighted-Ensemble Precipitation (MSWEP, designed for hydrological modeling) precipitation datasets, along with additional forcing derived from the VIC model at 0.25-degree spatial resolution. The Noah-MP LSM is calibrated using observed daily streamflow data from 1978-1989 (India-WRIS) at gauge stations with the least human intervention, achieving a Nash-Sutcliffe Efficiency (NSE) higher than 0.60. Noah-MP was calibrated using different runoff schemes, with variation in all parameters sensitive to surface and sub-surface runoff. Streamflow routing was performed using a stand-alone model (VIC model) to route daily model runoff to the required gauge station. Surface runoff is mainly affected by the uncertainties in the major atmospheric forcing and by highly sensitive parameters pertaining to soil properties. Noah-MP is validated using observed streamflow from 1975-2010, which shows that the simulated streamflow is consistent with the historical observations (NSE > 0.65) and indicates an increase in the probability of future flood events.
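The calibration target quoted in the abstract is the Nash-Sutcliffe Efficiency. A small helper for it is sketched below with hypothetical streamflow values; only the metric itself is standard, the numbers are made up.

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe Efficiency: 1 is a perfect fit, 0 means the model is no
    better than the mean of the observations."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Hypothetical daily streamflow (m^3/s) at a gauge, and a model run with small errors.
obs = np.array([120.0, 150.0, 300.0, 800.0, 650.0, 400.0, 220.0, 160.0])
sim = obs * 0.95 + 10.0
print("NSE:", round(float(nash_sutcliffe(obs, sim)), 3))
```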
Energy-efficient writing scheme for magnetic domain-wall motion memory
NASA Astrophysics Data System (ADS)
Kim, Kab-Jin; Yoshimura, Yoko; Ham, Woo Seung; Ernst, Rick; Hirata, Yuushou; Li, Tian; Kim, Sanghoon; Moriyama, Takahiro; Nakatani, Yoshinobu; Ono, Teruo
2017-04-01
We present an energy-efficient magnetic domain-writing scheme for domain wall (DW) motion-based memory devices. A cross-shaped nanowire is employed to inject a domain into the nanowire through current-induced DW propagation. The energy required for injecting the magnetic domain is more than one order of magnitude lower than that for the conventional field-based writing scheme. The proposed scheme is beneficial for device miniaturization because the threshold current for DW propagation scales with the device size, which cannot be achieved in the conventional field-based technique.
A quantum proxy group signature scheme based on an entangled five-qubit state
NASA Astrophysics Data System (ADS)
Wang, Meiling; Ma, Wenping; Wang, Lili; Yin, Xunru
2015-09-01
A quantum proxy group signature (QPGS) scheme based on controlled teleportation is presented, using an entangled five-qubit state as the quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. The security of the scheme is guaranteed by the entanglement correlations of the five-qubit state, by secret keys based on quantum key distribution (QKD) and the one-time pad algorithm, both of which have been proven to be unconditionally secure, and by the anonymity of the signature.
A FRACTAL-BASED STOCHASTIC INTERPOLATION SCHEME IN SUBSURFACE HYDROLOGY
The need for a realistic and rational method for interpolating sparse data sets is widespread. Real porosity and hydraulic conductivity data do not vary smoothly over space, so an interpolation scheme that preserves irregularity is desirable. Such a scheme based on the properties...
CP-ABE Based Privacy-Preserving User Profile Matching in Mobile Social Networks
Cui, Weirong; Du, Chenglie; Chen, Jinchao
2016-01-01
Privacy-preserving profile matching, a challenging task in mobile social networks, is getting more attention in recent years. In this paper, we propose a novel scheme that is based on ciphertext-policy attribute-based encryption to tackle this problem. In our scheme, a user can submit a preference-profile and search for users with matching-profile in decentralized mobile social networks. In this process, no participant’s profile and the submitted preference-profile is exposed. Meanwhile, a secure communication channel can be established between the pair of successfully matched users. In contrast to existing related schemes which are mainly based on the secure multi-party computation, our scheme can provide verifiability (both the initiator and any unmatched user cannot cheat each other to pretend to be matched), and requires few interactions among users. We provide thorough security analysis and performance evaluation on our scheme, and show its advantages in terms of security, efficiency and usability over state-of-the-art schemes. PMID:27337001
Han, Xue; Hu, Shi; Guo, Qi; Wang, Hong-Fu; Zhu, Ai-Dong; Zhang, Shou
2015-08-05
We propose effective fusion schemes for stationary electronic W states and flying photonic W states, respectively, using a quantum-dot-microcavity coupled system. The present schemes can fuse an n-qubit W state and an m-qubit W state into a (m + n - 1)-qubit W state; that is, they can be used not only to create large W states from small ones, but also to prepare 3-qubit W states from Bell states. The schemes are based on the optical selection rules and the transmission and reflection rules of the cavity, and can be achieved with high probability. We evaluate the effect of experimental imperfections and the feasibility of the schemes, which shows that they can be realized with high fidelity in both the weak coupling and the strong coupling regimes. These schemes may be meaningful for large-scale solid-state-based quantum computation and photon-qubit-based quantum communication.
Revocable identity-based proxy re-signature against signing key exposure.
Yang, Xiaodong; Chen, Chunlin; Ma, Tingchun; Wang, Jinli; Wang, Caifen
2018-01-01
Identity-based proxy re-signature (IDPRS) is a novel cryptographic primitive that allows a semi-trusted proxy to convert a signature under one identity into another signature under another identity on the same message by using a re-signature key. Due to this transformation function, IDPRS is very useful in constructing privacy-preserving schemes for various information systems. Key revocation functionality is important in practical IDPRS for managing users dynamically; however, the existing IDPRS schemes do not provide revocation mechanisms that allow the removal of misbehaving or compromised users from the system. In this paper, we first introduce a notion called revocable identity-based proxy re-signature (RIDPRS) to achieve the revocation functionality. We provide a formal definition of RIDPRS as well as its security model. Then, we present a concrete RIDPRS scheme that can resist signing key exposure and prove that the proposed scheme is existentially unforgeable against adaptive chosen identity and message attacks in the standard model. To further improve the performance of signature verification in RIDPRS, we introduce a notion called server-aided revocable identity-based proxy re-signature (SA-RIDPRS). Moreover, we extend the proposed RIDPRS scheme to the SA-RIDPRS scheme and prove that this extended scheme is secure against adaptive chosen message and collusion attacks. The analysis results show that our two schemes remain efficient in terms of computational complexity when implementing user revocation procedures. In particular, in the SA-RIDPRS scheme, the verifier needs to perform only a bilinear pairing and four exponentiation operations to verify the validity of the signature. Compared with other IDPRS schemes in the standard model, our SA-RIDPRS scheme greatly reduces the computation overhead of verification.
Chaudhry, Shehzad Ashraf; Mahmood, Khalid; Naqvi, Husnain; Khan, Muhammad Khurram
2015-11-01
A telecare medicine information system (TMIS) offers patients convenient and expedited healthcare services remotely, from anywhere. Patient security and privacy have emerged as key issues during remote access because of the underlying open architecture. An authentication scheme can verify the patient's as well as the TMIS server's legitimacy during remote healthcare services. To achieve security and privacy, a number of authentication schemes have been proposed. Very recently, Lu et al. (J. Med. Syst. 39(3):1-8, 2015) proposed a biometric-based three-factor authentication scheme for TMIS to eliminate the vulnerabilities of Arshad et al.'s (J. Med. Syst. 38(12):136, 2014) scheme. Further, they emphasized the robustness of their scheme against several attacks. However, in this paper we establish that Lu et al.'s scheme is vulnerable to numerous attacks, including (1) a patient anonymity violation attack, (2) a patient impersonation attack, and (3) a TMIS server impersonation attack. Furthermore, their scheme does not provide patient untraceability. We then propose an improvement of Lu et al.'s scheme. We have analyzed the security of the improved scheme using the popular automated tool ProVerif. The proposed scheme, while retaining the strengths of Lu et al.'s scheme, is also robust against known attacks.
Secure Communications in CIoT Networks with a Wireless Energy Harvesting Untrusted Relay.
Hu, Hequn; Gao, Zhenzhen; Liao, Xuewen; Leung, Victor C M
2017-09-04
The Internet of Things (IoT) represents a bright prospect in which a variety of common appliances can connect to one another, as well as to the rest of the Internet, to vastly improve our lives. Unique communication and security challenges arise from the limited hardware, low complexity, and severe energy constraints of IoT devices. In addition, a severe spectrum scarcity problem has been aggravated by the use of a large number of IoT devices. In this paper, cognitive IoT (CIoT) is considered, where an IoT network works as the secondary system using underlay spectrum sharing. A wireless energy harvesting (EH) node is used as a relay to improve the coverage of an IoT device. However, the relay could be a potential eavesdropper intercepting the IoT device's messages. This paper considers the problem of secure communication between the IoT device (e.g., sensor) and a destination (e.g., controller) via the wireless EH untrusted relay. Since the destination can be equipped with an adequate energy supply, secure schemes based on destination-aided jamming are proposed under power splitting (PS) and time splitting (TS) policies, called the intuitive secure scheme based on PS (Int-PS), the precoded secure scheme based on PS (Pre-PS), the intuitive secure scheme based on TS (Int-TS) and the precoded secure scheme based on TS (Pre-TS), respectively. The secure performance of the proposed schemes is evaluated through the metric of the probability of successfully secure transmission (PSST), which represents the probability that the interference constraint of the primary user is satisfied and the secrecy rate is positive. PSST is analyzed for the proposed secure schemes, and closed-form expressions of PSST for Pre-PS and Pre-TS are derived and validated through simulation results. Numerical results show that the precoded secure schemes have better PSST than the intuitive secure schemes under similar power consumption. When the secure schemes based on the PS and TS policies have similar PSST, the average transmit power consumption of the secure scheme based on TS is lower. The influences of the power splitting and time splitting ratios are also discussed through simulations.
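The PSST metric has a direct Monte Carlo interpretation: draw fading realizations and count the fraction in which the primary-user interference constraint holds and the secrecy rate is positive. The sketch below assumes unit-mean Rayleigh fading and illustrative powers; it is not the paper's closed-form derivation or its exact system model.

```python
import numpy as np

def psst_monte_carlo(p_tx, p_jam, interference_limit, trials=200_000, seed=3):
    """Monte Carlo estimate of the probability of successfully secure transmission
    (PSST): the interference caused to the primary user stays below its limit and
    the secrecy rate (legitimate rate minus eavesdropper rate) is positive.
    Channel gains are drawn as unit-mean Rayleigh fading; all parameters are
    illustrative, not the expressions derived in the paper."""
    rng = np.random.default_rng(seed)
    g_main = rng.exponential(1.0, trials)      # IoT device -> destination link
    g_eve = rng.exponential(1.0, trials)       # IoT device -> untrusted relay (eavesdropper)
    g_jam = rng.exponential(1.0, trials)       # destination jamming -> relay
    g_primary = rng.exponential(1.0, trials)   # IoT device -> primary user

    rate_legit = np.log2(1.0 + p_tx * g_main)
    rate_eve = np.log2(1.0 + p_tx * g_eve / (1.0 + p_jam * g_jam))
    secrecy_ok = rate_legit - rate_eve > 0.0
    interference_ok = p_tx * g_primary < interference_limit
    return float(np.mean(secrecy_ok & interference_ok))

print("estimated PSST:", psst_monte_carlo(p_tx=1.0, p_jam=2.0, interference_limit=3.0))
```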
Identity-Based Verifiably Encrypted Signatures without Random Oracles
NASA Astrophysics Data System (ADS)
Zhang, Lei; Wu, Qianhong; Qin, Bo
Fair exchange protocol plays an important role in electronic commerce in the case of exchanging digital contracts. Verifiably encrypted signatures provide an optimistic solution to these scenarios with an off-line trusted third party. In this paper, we propose an identity-based verifiably encrypted signature scheme. The scheme is non-interactive to generate verifiably encrypted signatures and the resulting encrypted signature consists of only four group elements. Based on the computational Diffie-Hellman assumption, our scheme is proven secure without using random oracles. To the best of our knowledge, this is the first identity-based verifiably encrypted signature scheme provably secure in the standard model.
A Secure and Privacy-Preserving Navigation Scheme Using Spatial Crowdsourcing in Fog-Based VANETs
Wang, Lingling; Liu, Guozhu; Sun, Lijun
2017-01-01
Fog-based VANETs (Vehicular ad hoc networks) is a new paradigm of vehicular ad hoc networks with the advantages of both vehicular cloud and fog computing. Real-time navigation schemes based on fog-based VANETs can promote the scheme performance efficiently. In this paper, we propose a secure and privacy-preserving navigation scheme by using vehicular spatial crowdsourcing based on fog-based VANETs. Fog nodes are used to generate and release the crowdsourcing tasks, and cooperatively find the optimal route according to the real-time traffic information collected by vehicles in their coverage areas. Meanwhile, the vehicle performing the crowdsourcing task can get a reasonable reward. The querying vehicle can retrieve the navigation results from each fog node successively when entering its coverage area, and follow the optimal route to the next fog node until it reaches the desired destination. Our scheme fulfills the security and privacy requirements of authentication, confidentiality and conditional privacy preservation. Some cryptographic primitives, including the Elgamal encryption algorithm, AES, randomized anonymous credentials and group signatures, are adopted to achieve this goal. Finally, we analyze the security and the efficiency of the proposed scheme. PMID:28338620
Vijay, G S; Kumar, H S; Srinivasa Pai, P; Sriram, N S; Rao, Raj B K N
2012-01-01
Wavelet-based denoising has proven its ability to denoise bearing vibration signals by improving the signal-to-noise ratio (SNR) and reducing the root-mean-square error (RMSE). In this paper, seven wavelet-based denoising schemes are evaluated based on the performance of an Artificial Neural Network (ANN) and a Support Vector Machine (SVM) for bearing condition classification. The work consists of two parts. In the first part, a synthetic signal simulating a defective bearing vibration signal with Gaussian noise was subjected to these denoising schemes, and the best scheme based on the SNR and the RMSE was identified. In the second part, vibration signals collected from a customized Rolling Element Bearing (REB) test rig for four bearing conditions were subjected to these denoising schemes. Several time and frequency domain features were extracted from the denoised signals, out of which a few sensitive features were selected using Fisher's Criterion (FC). The extracted features were used to train and test the ANN and the SVM. The best denoising scheme identified, based on the classification performance of the ANN and the SVM, was found to be the same as the one obtained using the synthetic signal.
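The two scores used to rank the denoising schemes, SNR and RMSE, are easy to compute once a clean reference exists. The sketch below evaluates a placeholder smoothing filter on a synthetic impulsive signal; the signal model and the stand-in filter are assumptions, and any wavelet-based scheme would be scored the same way.

```python
import numpy as np

def snr_db(clean, denoised):
    """Output signal-to-noise ratio (dB) of a denoised signal against the clean reference."""
    noise = clean - denoised
    return 10.0 * np.log10(np.sum(clean ** 2) / np.sum(noise ** 2))

def rmse(clean, denoised):
    return float(np.sqrt(np.mean((clean - denoised) ** 2)))

# Synthetic "defective bearing" signal: a carrier modulated by periodic impulses,
# plus Gaussian noise (a rough stand-in for the paper's simulated signal).
fs = 12_000
t = np.arange(0, 1.0, 1.0 / fs)
impulses = (np.sin(2 * np.pi * 30 * t) > 0.995).astype(float)
clean = np.sin(2 * np.pi * 3000 * t) * np.convolve(impulses, np.exp(-np.arange(200) / 30.0), "same")
noisy = clean + 0.3 * np.random.default_rng(4).normal(size=clean.size)

# Placeholder "denoising scheme" (simple moving average); a wavelet scheme would
# be scored with exactly the same two metrics before feature extraction.
kernel = np.ones(5) / 5.0
denoised = np.convolve(noisy, kernel, "same")
print("SNR (dB):", round(snr_db(clean, denoised), 2), " RMSE:", round(rmse(clean, denoised), 4))
```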
a Semi-Empirical Topographic Correction Model for Multi-Source Satellite Images
NASA Astrophysics Data System (ADS)
Xiao, Sa; Tian, Xinpeng; Liu, Qiang; Wen, Jianguang; Ma, Yushuang; Song, Zhenwei
2018-04-01
Topographic correction of surface reflectance in rugged terrain is a prerequisite for the quantitative application of remote sensing in mountainous areas. A physics-based radiative transfer model can be applied to correct the topographic effect and accurately retrieve the reflectance of the slope surface from high-quality satellite images such as Landsat 8 OLI. However, as more and more image data become available from a variety of sensors, the accurate sensor calibration parameters and atmospheric conditions required by physics-based topographic correction models are sometimes unavailable. This paper proposes a semi-empirical atmospheric and topographic correction model for multi-source satellite images that does not require accurate calibration parameters. Based on this model, topographically corrected surface reflectance can be obtained from DN data; the model was tested and verified with image data from the Chinese satellites HJ and GF. The results show that, for HJ, the correlation factor was reduced by almost 85% for the near-infrared bands and the overall classification accuracy increased by 14% after correction, and the reflectance difference between sun-facing and shaded slopes was reduced after correction.
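As a hedged illustration of the semi-empirical family the abstract refers to (not the authors' exact model), the sketch below applies the classic C-correction, in which the empirical constant is estimated from a regression of reflectance against the illumination cosine, and then checks how much the correlation with terrain illumination drops after correction.

```python
import numpy as np

def c_correction(reflectance, cos_i, cos_sz):
    """Semi-empirical C-correction: rho_corr = rho * (cos(sz) + c) / (cos(i) + c),
    where c = a / b comes from the regression rho = a + b * cos(i). Shown only to
    illustrate the family of semi-empirical models, not the paper's exact model."""
    b, a = np.polyfit(cos_i, reflectance, 1)   # slope, intercept
    c = a / b
    return reflectance * (cos_sz + c) / (cos_i + c)

# Synthetic slope pixels: true reflectance 0.3, modulated by illumination cos(i).
rng = np.random.default_rng(5)
cos_i = rng.uniform(0.2, 1.0, 5000)            # local illumination angle cosine
cos_sz = np.cos(np.deg2rad(35.0))              # solar zenith of the scene
observed = 0.3 * (0.4 + 0.6 * cos_i) + rng.normal(0.0, 0.005, cos_i.size)

corrected = c_correction(observed, cos_i, cos_sz)
print("correlation with cos(i) before:", round(float(np.corrcoef(observed, cos_i)[0, 1]), 3))
print("correlation with cos(i) after: ", round(float(np.corrcoef(corrected, cos_i)[0, 1]), 3))
```

The drop in correlation with the illumination cosine mirrors the reduction of the correlation factor reported in the abstract.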
State-of-the-Art: DTM Generation Using Airborne LIDAR Data
Chen, Ziyue; Gao, Bingbo; Devereux, Bernard
2017-01-01
Digital terrain model (DTM) generation is the fundamental application of airborne Lidar data. In past decades, a large body of studies has been conducted to present and experiment with a variety of DTM generation methods. Although great progress has been made, DTM generation, especially in specific terrain situations, remains challenging. This research introduces the general principles of DTM generation and reviews diverse mainstream DTM generation methods. In accordance with the filtering strategy, these methods are classified into six categories: surface-based adjustment, morphology-based filtering, triangulated irregular network (TIN)-based refinement, segmentation and classification, statistical analysis, and multi-scale comparison. Typical methods for each category are briefly introduced, and the merits and limitations of each category are discussed accordingly. Despite their different filtering strategies, these DTM generation methods present similar difficulties when implemented in sharply changing terrain, areas with dense non-ground features and complicated landscapes. This paper suggests that the fusion of multi-source data and the integration of different methods can be effective ways to improve the performance of DTM generation. PMID:28098810
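To make the morphology-based filtering category concrete, the sketch below classifies ground cells in a synthetic rasterised surface by comparing it with morphological openings at growing window sizes; the window sizes, height thresholds, and synthetic scene are illustrative assumptions rather than any published algorithm's parameters.

```python
import numpy as np
from scipy.ndimage import grey_opening

def simple_morphological_ground_filter(surface, windows=(3, 9, 21), thresholds=(0.3, 1.0, 2.5)):
    """Toy morphology-based ground filter on a rasterised elevation surface: a cell
    stays classified as ground while it remains close to the morphological opening
    at each growing window size. A crude illustration, not a production algorithm."""
    ground = np.ones(surface.shape, dtype=bool)
    for size, dh in zip(windows, thresholds):
        opened = grey_opening(surface, size=(size, size))
        ground &= (surface - opened) < dh
    return ground

# Synthetic 1 m raster: gently sloping terrain plus two "buildings" and noise.
rng = np.random.default_rng(6)
x, y = np.meshgrid(np.arange(100), np.arange(100))
terrain = 0.05 * x + 0.02 * y + rng.normal(0.0, 0.05, (100, 100))
surface = terrain.copy()
surface[20:35, 20:40] += 8.0      # building 1
surface[60:70, 55:75] += 5.0      # building 2

ground = simple_morphological_ground_filter(surface)
print("cells classified as ground:", int(ground.sum()), "of", ground.size)
```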
Physical environment virtualization for human activities recognition
NASA Astrophysics Data System (ADS)
Poshtkar, Azin; Elangovan, Vinayak; Shirkhodaie, Amir; Chan, Alex; Hu, Shuowen
2015-05-01
Human activity recognition research relies heavily on extensive datasets to verify and validate performance of activity recognition algorithms. However, obtaining real datasets are expensive and highly time consuming. A physics-based virtual simulation can accelerate the development of context based human activity recognition algorithms and techniques by generating relevant training and testing videos simulating diverse operational scenarios. In this paper, we discuss in detail the requisite capabilities of a virtual environment to aid as a test bed for evaluating and enhancing activity recognition algorithms. To demonstrate the numerous advantages of virtual environment development, a newly developed virtual environment simulation modeling (VESM) environment is presented here to generate calibrated multisource imagery datasets suitable for development and testing of recognition algorithms for context-based human activities. The VESM environment serves as a versatile test bed to generate a vast amount of realistic data for training and testing of sensor processing algorithms. To demonstrate the effectiveness of VESM environment, we present various simulated scenarios and processed results to infer proper semantic annotations from the high fidelity imagery data for human-vehicle activity recognition under different operational contexts.
NASA Astrophysics Data System (ADS)
Moran, Niklas; Nieland, Simon; Tintrup gen. Suntrup, Gregor; Kleinschmit, Birgit
2017-02-01
Manual field surveys for nature conservation management are expensive and time-consuming and could be supplemented and streamlined by using Remote Sensing (RS). RS is critical to meet the requirements of existing laws such as the EU Habitats Directive (HabDir) and, more importantly, to meet future challenges. The full potential of RS has yet to be harnessed, as different nomenclatures and procedures hinder interoperability, comparison and provenance. Therefore, automated tools are needed to use RS data to produce comparable, empirical data outputs that lend themselves to data discovery and provenance. These issues are addressed by a novel, semi-automatic ontology-based classification method that uses machine learning algorithms and Web Ontology Language (OWL) ontologies and yields traceable, interoperable and observation-based classification outputs. The method was tested on European Union Nature Information System (EUNIS) grasslands in Rheinland-Palatinate, Germany. The developed methodology is a first step in developing observation-based ontologies in the field of nature conservation. The tests show promising results for the determination of the grassland indicators wetness and alkalinity, with an overall accuracy of 85% for alkalinity and 76% for wetness.
Tiede, Dirk; Baraldi, Andrea; Sudmanns, Martin; Belgiu, Mariana; Lang, Stefan
2017-01-01
Spatiotemporal analytics of multi-source Earth observation (EO) big data is a pre-condition for semantic content-based image retrieval (SCBIR). As a proof of concept, an innovative EO semantic querying (EO-SQ) subsystem was designed and prototypically implemented in series with an EO image understanding (EO-IU) subsystem. The EO-IU subsystem is automatically generating ESA Level 2 products (scene classification map, up to basic land cover units) from optical satellite data. The EO-SQ subsystem comprises a graphical user interface (GUI) and an array database embedded in a client server model. In the array database, all EO images are stored as a space-time data cube together with their Level 2 products generated by the EO-IU subsystem. The GUI allows users to (a) develop a conceptual world model based on a graphically supported query pipeline as a combination of spatial and temporal operators and/or standard algorithms and (b) create, save and share within the client-server architecture complex semantic queries/decision rules, suitable for SCBIR and/or spatiotemporal EO image analytics, consistent with the conceptual world model. PMID:29098143
Noubiap, Jean Jacques N; Joko, Walburga Yvonne A; Obama, Joel Marie N; Bigna, Jean Joel R
2013-01-01
Introduction For the last two decades, promoted by many governments and international organizations, community-based health insurance (CBHI) schemes have grown in number in sub-Saharan Africa. In 2005 in Cameroon, there were only 60 CBHI schemes nationwide, covering less than 1% of the population. In 2006, the Cameroon government adopted a national strategy aimed at creating at least one CBHI scheme in each health district and covering at least 40% of the population with CBHI schemes by 2015. Unfortunately, there is almost no published data on the awareness and implementation of CBHI schemes in Cameroon. Methods Structured interviews were conducted in January 2010 with 160 informal sector workers in the Bonassama health district (BHD) of Douala, aiming to evaluate their knowledge, concern and preferences regarding CBHI schemes and their financial plans to cover health costs. Results Awareness of the existence of CBHI schemes was poor among these informal workers. Awareness of CBHI schemes was significantly associated with a high level of education (p = 0.0001). Only 4.4% of respondents had health insurance, and specifically 1.2% were involved in a CBHI scheme. However, 128 (86.2%) respondents thought that belonging to a CBHI scheme could facilitate their access to adequate health care, and were thus willing to be involved in CBHI schemes. Our respondents would have preferred CBHI schemes run by missionaries to CBHI schemes run by the government or people of the same ethnic group (p). Conclusion There is very low participation in CBHI schemes among the informal sector workers of the BHD. This is mainly due to the lack of awareness and limited knowledge of the basic concepts of CBHI in this target population. Solidarity-based community associations, to which the vast majority of this target population belong, are prime areas for sensitization on CBHI schemes. Hence these associations could possibly federate to create CBHI schemes. PMID:24498466
NASA Astrophysics Data System (ADS)
Guan, X.; Shen, H.; Li, X.; Gan, W.
2017-12-01
Mountainous areas host approximately a quarter of the global land surface, with complex climate and ecosystem conditions. More knowledge about mountainous ecosystems could greatly advance our understanding of the global carbon cycle and climate change. Net Primary Productivity (NPP), the biomass increment of plants, is a widely used ecological indicator that can be obtained by remote sensing methods. However, because no single sensor provides both a long-term record and sufficient spatial detail, mountainous NPP has been far from well understood. In this study, a multi-sensor fusion framework was applied to synthesize a 1-km NPP series from 1982 to 2014 in mountainous southwest China, where elevation ranges from 76 m to 6740 m. Validation with field measurements showed that this framework greatly improved the accuracy of NPP (r = 0.79, p < 0.01). Detailed spatial and temporal analysis indicated that NPP trends changed from decreasing to increasing with ascending elevation, as a result of a warmer and drier climate over the region. The correlation of NPP with temperature varied from negative to positive at almost the same elevation break-point as the NPP trends, and the opposite held for precipitation. This phenomenon was determined by the altitudinally and seasonally uneven allocation of climatic factors, as well as by downward run-off. Moreover, NPP variation over the region showed three distinct stages, with break-points at 1992 and 2002. In all three stages, NPP in the low-elevation area varied almost three times more drastically than in the high-elevation area, due to the much greater rate of change of precipitation. In summary, this study conducted a long-term and accurate NPP analysis of a poorly understood mountainous ecosystem using multi-source data; the framework and conclusions will be beneficial for further understanding of global climate change.
Lian, Jijian; Zhang, Wenjiao; Ma, Bin; Liu, Dongming
2017-01-01
As excess water is discharged from a high dam, low frequency noise (air pulsation below 10 Hz, LFN) is generated and propagated in the surrounding areas, causing environmental hazards such as the vibration of windows and doors and the discomfort of local residents. To study the generation mechanisms and key influencing factors of LFN induced by flood discharge and energy dissipation from a high dam with a ski-jump type spillway, detailed prototype observations and analyses of LFN are carried out. The discharge flow field is simulated and analyzed using a gas-liquid turbulent flow model. The acoustic response characteristics of the air cavity, which is formed between the discharge nappe and the dam body, are analyzed using an acoustic numerical model. The multi-source generation mechanisms are first proposed based on the prototype observations, a vortex sound model, the turbulent flow model and the acoustic numerical model. Two kinds of LFN sources are studied: one comes from the energy dissipation of submerged jets in the plunge pool, the other from nappe-cavity coupled vibration. The analyses reveal that the submerged jets in the plunge pool only contribute to on-site LFN energy of 0–1.0 Hz, and the strong shear layers around the high-velocity submerged jets and the wall jet development areas are the main acoustic source regions of LFN in the plunge pool. In addition, the nappe-cavity coupled vibration, which is induced when the discharge nappe vibrates at a frequency close to a modal frequency of the cavity, can induce on-site LFN energy with a wider frequency spectrum, within 0–4.0 Hz. The contributions to LFN energy from the two acoustic sources are almost the same, with the contribution from nappe-cavity coupled vibration slightly higher. PMID:29189750
Surveillance for work-related skull fractures in Michigan.
Kica, Joanna; Rosenman, Kenneth D
2014-12-01
The objective was to develop a multisource surveillance system for work-related skull fractures. Records on work-related skull fractures were obtained from Michigan's 134 hospitals, Michigan's Workers' Compensation Agency and death certificates. Cases from the three sources were matched to eliminate duplicates reported by more than one source. Workplaces where the most severe injuries occurred were referred to OSHA for an enforcement inspection. There were 318 work-related skull fractures, not including facial fractures, between 2010 and 2012. In 2012, after the inclusion of facial fractures, 316 fractures were identified, of which 218 (69%) were facial fractures. The Bureau of Labor Statistics' (BLS) 2012 estimate of skull fractures in Michigan, which includes facial fractures, was 170, which was 53.8% of those identified from our review of medical records. The inclusion of facial fractures in the surveillance system increased the percentage of women identified from 15.4% to 31.2%; decreased severity (hospitalization went from 48.7% to 10.6% and loss of consciousness from 56.5% to 17.8%); decreased falls from 48.2% to 27.6% and increased assaults from 5.0% to 20.2%; shifted the most common industry from construction (13.3%) to health care and social assistance (15.0%); and shifted the highest incidence rate from males 65+ (6.8 per 100,000) to young men, 20-24 years (9.6 per 100,000). Workplace inspections resulted in 45 violations and $62,750 in penalties. The Michigan multisource surveillance system of workplace injuries had two major advantages over the existing national system: (a) workplace investigations were initiated, hazards identified and safety changes implemented at the facilities where the injuries occurred; and (b) a more accurate count was derived, with 86% more work-related skull fractures identified than in BLS's employer-based estimate. A more comprehensive system to identify and target interventions for workplace injuries was implemented using hospital and emergency department medical records. Copyright © 2014 National Safety Council and Elsevier Ltd. All rights reserved.
Lee, Tian-Fu; Chang, I-Pin; Lin, Tsung-Hung; Wang, Ching-Cheng
2013-06-01
The integrated EPR information system supports convenient and rapid e-medicine services. A secure and efficient authentication scheme for the integrated EPR information system safeguards patients' electronic patient records (EPRs) and helps health care workers and medical personnel rapidly make correct clinical decisions. Recently, Wu et al. proposed an efficient password-based user authentication scheme using smart cards for the integrated EPR information system, and claimed that the proposed scheme could resist various malicious attacks. However, their scheme is still vulnerable to lost smart card and stolen verifier attacks. This investigation discusses these weaknesses and proposes a secure and efficient authentication scheme for the integrated EPR information system as an alternative. Compared with related approaches, the proposed scheme not only has a lower computational cost and does not require verifier tables for storing users' secrets, but also solves the security problems in previous schemes and withstands possible attacks.
NASA Astrophysics Data System (ADS)
Su, Yonggang; Tang, Chen; Li, Biyuan; Lei, Zhenkun
2018-05-01
This paper presents a novel optical colour image watermarking scheme based on phase-truncated linear canonical transform (PT-LCT) and image decomposition (ID). In this proposed scheme, a PT-LCT-based asymmetric cryptography is designed to encode the colour watermark into a noise-like pattern, and an ID-based multilevel embedding method is constructed to embed the encoded colour watermark into a colour host image. The PT-LCT-based asymmetric cryptography, which can be optically implemented by double random phase encoding with a quadratic phase system, can provide a higher security to resist various common cryptographic attacks. And the ID-based multilevel embedding method, which can be digitally implemented by a computer, can make the information of the colour watermark disperse better in the colour host image. The proposed colour image watermarking scheme possesses high security and can achieve a higher robustness while preserving the watermark’s invisibility. The good performance of the proposed scheme has been demonstrated by extensive experiments and comparison with other relevant schemes.
Das, Ashok Kumar; Goswami, Adrijit
2014-06-01
Recently, Awasthi and Srivastava proposed a novel biometric remote user authentication scheme for the telecare medicine information system (TMIS) with nonce. Their scheme is very efficient as it is based on an efficient chaotic one-way hash function and bitwise XOR operations. In this paper, we first analyze Awasthi-Srivastava's scheme and then show that it has several drawbacks: (1) an incorrect password change phase, (2) failure to preserve the user anonymity property, (3) failure to establish a secret session key between a legal user and the server, (4) failure to protect against strong replay attacks, and (5) lack of rigorous formal security analysis. We then propose a novel and secure biometric-based remote user authentication scheme in order to withstand the security flaws found in Awasthi-Srivastava's scheme and enhance the features required of an ideal user authentication scheme. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. In addition, we simulate our scheme for formal security verification using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool and show that it is secure against passive and active attacks, including replay and man-in-the-middle attacks. Our scheme is also efficient compared to Awasthi-Srivastava's scheme.
Moon, Jongho; Choi, Younsung; Jung, Jaewook; Won, Dongho
2015-01-01
In multi-server environments, user authentication is a very important issue because it provides the authorization that enables users to access their data and services; furthermore, remote user authentication schemes for multi-server environments have solved the problem arising from users' management of different identities and passwords. For this reason, numerous user authentication schemes designed for multi-server environments have been proposed over recent years. In 2015, Lu et al. improved upon Mishra et al.'s scheme, claiming that their remote user authentication scheme is more secure and practical; however, we found that Lu et al.'s scheme is still insecure and incorrect. In this paper, we demonstrate that Lu et al.'s scheme is vulnerable to outsider attack and user impersonation attack, and we propose a new biometrics-based scheme for authentication and key agreement that can be used in multi-server environments; we then show that our proposed scheme is more secure and supports the required security properties.
NASA Astrophysics Data System (ADS)
Ford, Neville J.; Connolly, Joseph A.
2009-07-01
We give a comparison of the efficiency of three alternative decomposition schemes for the approximate solution of multi-term fractional differential equations using the Caputo form of the fractional derivative. The schemes we compare are based on conversion of the original problem into a system of equations. We review alternative approaches and consider how the most appropriate numerical scheme may be chosen to solve a particular equation.
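As a concrete illustration of the kind of conversion the abstract refers to (a standard construction, not necessarily the specific variants compared in the paper), a commensurate two-term Caputo equation can be rewritten as a system of single-order equations, assuming the Caputo composition rule applies to the solution considered:

```latex
% Standard reduction of a commensurate two-term equation, 0 < \alpha \le 1,
% assuming D^{2\alpha} = D^{\alpha} D^{\alpha} holds for the solution:
\begin{align*}
  & D^{2\alpha} y(t) + a\,D^{\alpha} y(t) + b\,y(t) = f(t), \\
  & \text{set } y_1 = y, \qquad y_2 = D^{\alpha} y_1, \\
  & \Rightarrow\quad
    \begin{cases}
      D^{\alpha} y_1(t) = y_2(t), \\
      D^{\alpha} y_2(t) = f(t) - a\,y_2(t) - b\,y_1(t).
    \end{cases}
\end{align*}
```

The resulting first-order (in the fractional sense) system can then be advanced with any of the single-term solvers whose efficiency the paper compares.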
Lou, Der-Chyuan; Lee, Tian-Fu; Lin, Tsung-Hung
2015-05-01
Authenticated key agreement for telecare medicine information systems provides patients, doctors, nurses and health visitors with access to medical information systems and remote services efficiently and conveniently over an open network. To achieve higher security, many authenticated key agreement schemes have added biometric keys for identification, in addition to passwords and smartcards. Owing to their many transmissions and high computational costs, these authenticated key agreement schemes are inefficient in communication and computation. This investigation develops two secure and efficient authenticated key agreement schemes for telecare medicine information systems using biometric keys and extended chaotic maps. One scheme is synchronization-based, while the other is nonce-based. Compared to related approaches, the proposed schemes not only retain the same security properties as previous schemes, but also provide users with privacy protection and have fewer transmissions and lower computational cost.
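Extended chaotic maps are typically instantiated with Chebyshev polynomials over a prime field, whose semigroup property T_a(T_b(x)) = T_b(T_a(x)) = T_ab(x) supports a Diffie-Hellman-style exchange. The sketch below is a minimal illustration of that building block only, not the two schemes proposed in the paper; the prime, seed and private exponents are arbitrary toy values.

```python
# Minimal sketch of key agreement from extended (Chebyshev) chaotic maps.
# Toy parameters only; not the schemes proposed in the paper.

def cheb_mod(n, x, p):
    """Compute the Chebyshev polynomial T_n(x) mod p via the recurrence
    T_{k+1} = 2x*T_k - T_{k-1}, using fast 2x2 matrix exponentiation."""
    if n == 0:
        return 1 % p
    def mat_mul(A, B):
        return [[(A[0][0]*B[0][0] + A[0][1]*B[1][0]) % p,
                 (A[0][0]*B[0][1] + A[0][1]*B[1][1]) % p],
                [(A[1][0]*B[0][0] + A[1][1]*B[1][0]) % p,
                 (A[1][0]*B[0][1] + A[1][1]*B[1][1]) % p]]
    # [T_k, T_{k-1}]^T = M^(k-1) [T_1, T_0]^T with M = [[2x, -1], [1, 0]]
    M = [[(2 * x) % p, p - 1], [1, 0]]
    R = [[1, 0], [0, 1]]               # identity
    e = n - 1
    while e:
        if e & 1:
            R = mat_mul(R, M)
        M = mat_mul(M, M)
        e >>= 1
    return (R[0][0] * x + R[0][1]) % p

p, x = 2**31 - 1, 123456789            # public prime modulus and seed (toy values)
a, b = 987654321, 192837465            # private exponents of the two parties
ta, tb = cheb_mod(a, x, p), cheb_mod(b, x, p)   # exchanged public values
key_a = cheb_mod(a, tb, p)             # T_a(T_b(x))
key_b = cheb_mod(b, ta, p)             # T_b(T_a(x))
assert key_a == key_b == cheb_mod(a * b, x, p)  # semigroup property
```

The commutativity checked by the assertion is what lets two parties derive the same session key without transmitting their private exponents.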
Phase-Image Encryption Based on 3D-Lorenz Chaotic System and Double Random Phase Encoding
NASA Astrophysics Data System (ADS)
Sharma, Neha; Saini, Indu; Yadav, AK; Singh, Phool
2017-12-01
In this paper, an encryption scheme for phase-images based on the 3D-Lorenz chaotic system in the Fourier domain under a 4f optical system is presented. The encryption scheme uses a random amplitude mask in the spatial domain and a random phase mask in the frequency domain. Its inputs are phase-images, which are relatively more secure than intensity images because of non-linearity. The proposed scheme further derives its strength from the use of the 3D-Lorenz transform in the frequency domain. Although the experimental setup for optical realization of the proposed scheme has been provided, the results presented here are based on simulations in MATLAB. The scheme has been validated for grayscale images and is found to be sensitive to the encryption parameters of the Lorenz system. The attack analysis shows that the key-space is large enough to resist brute-force attack, and the scheme is also resistant to noise and occlusion attacks. Statistical analysis and the analysis based on the correlation distribution of adjacent pixels have been performed to test the efficacy of the encryption scheme. The results indicate that the proposed encryption scheme possesses a high level of security.
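For orientation, the classical double random phase encoding step that such schemes build on can be simulated digitally in a few lines. The sketch below is a minimal numpy illustration of plain DRPE with independent pseudo-random masks; it does not include the 3D-Lorenz keystream, the amplitude mask, or the 4f optical implementation described in the paper.

```python
# Minimal digital simulation of double random phase encoding (DRPE).
# Plain DRPE with pseudo-random masks; the Lorenz-driven masks of the
# paper's scheme are not reproduced here.
import numpy as np

rng = np.random.default_rng(seed=42)

def drpe_encrypt(img, mask_spatial, mask_fourier):
    """Encrypt: multiply by a random phase mask in the spatial domain,
    then by a second random phase mask in the Fourier domain."""
    field = img * np.exp(2j * np.pi * mask_spatial)
    spectrum = np.fft.fft2(field) * np.exp(2j * np.pi * mask_fourier)
    return np.fft.ifft2(spectrum)

def drpe_decrypt(cipher, mask_spatial, mask_fourier):
    """Decrypt by undoing the two phase modulations in reverse order."""
    spectrum = np.fft.fft2(cipher) * np.exp(-2j * np.pi * mask_fourier)
    field = np.fft.ifft2(spectrum) * np.exp(-2j * np.pi * mask_spatial)
    return np.abs(field)

img = rng.random((64, 64))                 # stand-in for a grayscale image
m1 = rng.random(img.shape)                 # spatial-domain random phase mask
m2 = rng.random(img.shape)                 # Fourier-domain random phase mask
cipher = drpe_encrypt(img, m1, m2)         # noise-like complex ciphertext
recovered = drpe_decrypt(cipher, m1, m2)
assert np.allclose(recovered, img)         # lossless with the correct keys
```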
A scheme of hidden-structure attribute-based encryption with multiple authorities
NASA Astrophysics Data System (ADS)
Ling, J.; Weng, A. X.
2018-05-01
In most CP-ABE schemes with a hidden access structure, all the user attributes and the key generation are managed by a single authority. The key generation efficiency decreases as the number of users increases, and the data face security issues if that sole authority is compromised. We propose a hidden-structure attribute-based encryption scheme with multiple authorities, which introduces multiple semi-trusted attribute authorities and thereby mitigates the threat even if one or more authorities are attacked. We also realize user revocation by managing a revocation list. Based on the DBDH assumption, we prove that our scheme achieves IND-CMA security. The analysis shows that our scheme improves the key generation efficiency.
Quantum Attack-Resistant Certificateless Multi-Receiver Signcryption Scheme
Li, Huixian; Chen, Xubao; Pang, Liaojun; Shi, Weisong
2013-01-01
The existing certificateless signcryption schemes were designed mainly based on traditional public key cryptography, in which security relies on hard problems such as integer factorization and the discrete logarithm. However, these problems can be solved easily by quantum computing, so the existing certificateless signcryption schemes are vulnerable to quantum attack. Multivariate public key cryptography (MPKC), which can resist quantum attack, is one of the alternative solutions to guarantee the security of communications in the post-quantum age. Motivated by these concerns, we propose a new construction of a certificateless multi-receiver signcryption scheme (CLMSC) based on MPKC. The new scheme inherits the security of MPKC and can withstand quantum attack. Multivariate quadratic polynomial operations, which have lower computational complexity than bilinear pairing operations, are employed to signcrypt a message for a certain number of receivers in our scheme. Security analysis shows that our scheme is a secure MPKC-based scheme. We prove its security under the hardness of the Multivariate Quadratic (MQ) problem and its unforgeability under the Isomorphism of Polynomials (IP) assumption in the random oracle model. The analysis results also show that our scheme has the security properties of non-repudiation, perfect forward secrecy, perfect backward secrecy and public verifiability. Compared with existing schemes in terms of computational complexity and ciphertext length, our scheme is more efficient, which makes it suitable for terminals with low computation capacity such as smart cards. PMID:23967037
Li, Chun-Ta; Wu, Tsu-Yang; Chen, Chin-Ling; Lee, Cheng-Chi; Chen, Chien-Ming
2017-06-23
In recent years, with the increase in degenerative diseases and the aging population in advanced countries, demands for the medical care of older or solitary people have increased continually in hospitals and healthcare institutions. Applying wireless sensor networks in IoT-based telemedicine systems enables doctors, caregivers or families to monitor patients' physiological conditions at any time and place according to the acquired information. However, transmitting physiological data through the Internet raises concerns about the personal privacy of patients. Therefore, before users can access medical care services in an IoT-based medical care system, they must be authenticated. Typically, user authentication and data encryption are most critical for securing network communications over a public channel between two or more participants. In 2016, Liu and Chung proposed a bilinear pairing-based password authentication scheme for wireless healthcare sensor networks. They claimed that their authentication scheme not only secures sensor data transmission, but also resists various well-known security attacks. In this paper, we demonstrate that Liu-Chung's scheme has some security weaknesses, and we further present an improved secure authentication and data encryption scheme for the IoT-based medical care system, which provides user anonymity and prevents the security threats of replay and password/sensed data disclosure attacks. Moreover, we modify the authentication process to reduce redundancy in the protocol design, and the proposed scheme is more efficient in performance compared with previous related schemes. Finally, the proposed scheme is provably secure in the random oracle model under ECDHP.
Li, Chun-Ta; Lee, Cheng-Chi; Weng, Chi-Yao; Chen, Song-Jhih
2016-11-01
Secure user authentication schemes in many e-Healthcare applications try to prevent unauthorized users from intruding into e-Healthcare systems, and a remote user and a medical server can establish session keys to secure the subsequent communications. However, many schemes do not mask the users' identity information while constructing a login session between two or more parties, even though the personal privacy of users is a significant topic for e-Healthcare systems. In order to preserve personal privacy, dynamic identity based authentication schemes hide the user's real identity during network communications, so that only the medical server knows the login user's identity. In addition, most existing dynamic identity based authentication schemes ignore input verification during the login phase, and this flaw may lead to inefficiency in the case of incorrect inputs. Regarding the use of secure authentication mechanisms for e-Healthcare systems, this paper presents a new dynamic identity and chaotic maps based authentication scheme, and a secure data protection approach is employed in every session to prevent illegal intrusions. The proposed scheme can not only quickly detect incorrect inputs during the login and password change phases but also invalidate the future use of a lost/stolen smart card. Compared with other recent authentication schemes in terms of functionality and efficiency, the proposed scheme satisfies desirable security attributes and maintains acceptable efficiency in terms of computational overhead for e-Healthcare systems.
A more secure anonymous user authentication scheme for the integrated EPR information system.
Wen, Fengtong
2014-05-01
Secure and efficient mutual user authentication is an essential task for an integrated electronic patient record (EPR) information system. Recently, several authentication schemes have been proposed to meet this requirement. In a recent paper, Lee et al. proposed an efficient and secure password-based authentication scheme using smart cards for the integrated EPR information system. This scheme is believed to resist a range of network attacks; in particular, the authors claimed that it could resist the lost smart card attack. However, we reanalyze the security of Lee et al.'s scheme and show that it fails to protect against off-line password guessing attacks if the secret information stored in the smart card is compromised. This also renders their scheme insecure against user impersonation attacks. We then propose a new user authentication scheme for integrated EPR information systems based on quadratic residues. The new scheme not only resists a range of network attacks but also provides user anonymity. We show that our proposed scheme provides stronger security.
Tan, Maxine; Aghaei, Faranak; Wang, Yunzhi; Zheng, Bin
2017-01-01
The purpose of this study is to evaluate a new method to improve performance of computer-aided detection (CAD) schemes of screening mammograms with two approaches. In the first approach, we developed a new case based CAD scheme using a set of optimally selected global mammographic density, texture, spiculation, and structural similarity features computed from all four full-field digital mammography (FFDM) images of the craniocaudal (CC) and mediolateral oblique (MLO) views by using a modified fast and accurate sequential floating forward selection feature selection algorithm. Selected features were then applied to a “scoring fusion” artificial neural network (ANN) classification scheme to produce a final case based risk score. In the second approach, we combined the case based risk score with the conventional lesion based scores of a conventional lesion based CAD scheme using a new adaptive cueing method that is integrated with the case based risk scores. We evaluated our methods using a ten-fold cross-validation scheme on 924 cases (476 cancer and 448 recalled or negative), whereby each case had all four images from the CC and MLO views. The area under the receiver operating characteristic curve was AUC = 0.793±0.015 and the odds ratio monotonically increased from 1 to 37.21 as CAD-generated case based detection scores increased. Using the new adaptive cueing method, the region based and case based sensitivities of the conventional CAD scheme at a false positive rate of 0.71 per image increased by 2.4% and 0.8%, respectively. The study demonstrated that supplementary information can be derived by computing global mammographic density image features to improve CAD-cueing performance on the suspicious mammographic lesions. PMID:27997380
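The sketch below is a schematic outline of the kind of pipeline described (sequential forward feature selection feeding a neural-network classifier evaluated by cross-validated AUC). The data, feature counts and model sizes are synthetic placeholders, and the study's modified SFFS algorithm, scoring fusion and adaptive cueing step are not reproduced.

```python
# Schematic sketch of a case-based risk scoring pipeline: forward feature
# selection + neural-network classifier + cross-validated AUC.
# Synthetic data; not the study's modified SFFS or adaptive cueing method.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.model_selection import cross_val_predict
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import roc_auc_score

# Stand-in for case-based features (density, texture, spiculation, ...)
X, y = make_classification(n_samples=924, n_features=30, n_informative=12,
                           random_state=0)

ann = make_pipeline(StandardScaler(),
                    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000,
                                  random_state=0))

# Greedy forward selection of a small feature subset
selector = SequentialFeatureSelector(ann, n_features_to_select=8,
                                     direction="forward", cv=3)
selector.fit(X, y)
X_sel = selector.transform(X)

# Ten-fold cross-validated case-based risk scores and AUC
scores = cross_val_predict(ann, X_sel, y, cv=10, method="predict_proba")[:, 1]
print("AUC =", round(roc_auc_score(y, scores), 3))
```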
Genetic progress in multistage dairy cattle breeding schemes using genetic markers.
Schrooten, C; Bovenhuis, H; van Arendonk, J A M; Bijma, P
2005-04-01
The aim of this paper was to explore general characteristics of multistage breeding schemes and to evaluate multistage dairy cattle breeding schemes that use information on quantitative trait loci (QTL). Evaluation was either for additional genetic response or for reduction in number of progeny-tested bulls while maintaining the same response. The reduction in response in multistage breeding schemes relative to comparable single-stage breeding schemes (i.e., with the same overall selection intensity and the same amount of information in the final stage of selection) depended on the overall selection intensity, the selection intensity in the various stages of the breeding scheme, and the ratio of the accuracies of selection in the various stages of the breeding scheme. When overall selection intensity was constant, reduction in response increased with increasing selection intensity in the first stage. The decrease in response was highest in schemes with lower overall selection intensity. Reduction in response was limited in schemes with low to average emphasis on first-stage selection, especially if the accuracy of selection in the first stage was relatively high compared with the accuracy in the final stage. Closed nucleus breeding schemes in dairy cattle that use information on QTL were evaluated by deterministic simulation. In the base scheme, the selection index consisted of pedigree information and own performance (dams), or pedigree information and performance of 100 daughters (sires). In alternative breeding schemes, information on a QTL was accounted for by simulating an additional index trait. The fraction of the variance explained by the QTL determined the correlation between the additional index trait and the breeding goal trait. Response in progeny test schemes relative to a base breeding scheme without QTL information ranged from +4.5% (QTL explaining 5% of the additive genetic variance) to +21.2% (QTL explaining 50% of the additive genetic variance). A QTL explaining 5% of the additive genetic variance allowed a 35% reduction in the number of progeny tested bulls, while maintaining genetic response at the level of the base scheme. Genetic progress was up to 31.3% higher for schemes with increased embryo production and selection of embryos based on QTL information. The challenge for breeding organizations is to find the optimum breeding program with regard to additional genetic progress and additional (or reduced) cost.
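For readers outside animal breeding, the deterministic predictions being compared rest on the standard single-stage response-to-selection relation below (a textbook formula stated as background, not an equation quoted from the paper); multistage and marker-assisted schemes effectively change the selection intensity i and the accuracy r entering this relation.

```latex
% Standard response to selection per round:
%   \Delta G  genetic response
%   i         selection intensity
%   r         accuracy of selection (correlation between index and true
%             breeding value)
%   \sigma_A  additive genetic standard deviation
\Delta G = i \, r \, \sigma_A
```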
Proposed new classification scheme for chemical injury to the human eye.
Bagley, Daniel M; Casterton, Phillip L; Dressler, William E; Edelhauser, Henry F; Kruszewski, Francis H; McCulley, James P; Nussenblatt, Robert B; Osborne, Rosemarie; Rothenstein, Arthur; Stitzel, Katherine A; Thomas, Karluss; Ward, Sherry L
2006-07-01
Various ocular alkali burn classification schemes have been published and used to grade human chemical eye injuries for the purpose of identifying treatments and forecasting outcomes. The ILSI chemical eye injury classification scheme was developed for the additional purpose of collecting detailed human eye injury data to provide information on the mechanisms associated with chemical eye injuries. This information will have clinical application, as well as use in the development and validation of new methods to assess ocular toxicity. A panel of ophthalmic researchers proposed the new classification scheme based upon current knowledge of the mechanisms of eye injury, and their collective clinical and research experience. Additional ophthalmologists and researchers were surveyed to critique the scheme. The draft scheme was revised, and the proposed scheme represents the best consensus from at least 23 physicians and scientists. The new scheme classifies chemical eye injury into five categories based on clinical signs, symptoms, and expected outcomes. Diagnostic classification is based primarily on two clinical endpoints: (1) the extent (area) of injury at the limbus, and (2) the degree of injury (area and depth) to the cornea. The new classification scheme provides a uniform system for scoring eye injury across chemical classes, and provides enough detail for the clinician to collect data that will be relevant to identifying the mechanisms of ocular injury.
Törnros, Tobias; Dorn, Helen; Reichert, Markus; Ebner-Priemer, Ulrich; Salize, Hans-Joachim; Tost, Heike; Meyer-Lindenberg, Andreas; Zipf, Alexander
2016-11-21
Self-reporting is a well-established approach within the medical and psychological sciences. In order to avoid recall bias, i.e. past events being remembered inaccurately, the reports can be filled out on a smartphone in real-time and in the natural environment. This is often referred to as ambulatory assessment and the reports are usually triggered at regular time intervals. With this sampling scheme, however, rare events (e.g. a visit to a park or recreation area) are likely to be missed. When addressing the correlation between mood and the environment, it may therefore be beneficial to include participant locations within the ambulatory assessment sampling scheme. Based on the geographical coordinates, the database query system then decides if a self-report should be triggered or not. We simulated four different ambulatory assessment sampling schemes based on movement data (coordinates by minute) from 143 voluntary participants tracked for seven consecutive days. Two location-based sampling schemes incorporating the environmental characteristics (land use and population density) at each participant's location were introduced and compared to a time-based sampling scheme triggering a report on the hour as well as to a sampling scheme incorporating physical activity. We show that location-based sampling schemes trigger a report less often, but we obtain more unique trigger positions and a greater spatial spread in comparison to sampling strategies based on time and distance. Additionally, the location-based methods trigger significantly more often at rarely visited types of land use and less often outside the study region where no underlying environmental data are available.
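A minimal sketch of how such a location-based trigger decision might look is given below; the land-use categories, density threshold and rate limit are hypothetical placeholders, not the rules used in the study.

```python
# Hypothetical sketch of a location-based ambulatory-assessment trigger.
# Land-use categories, thresholds and the rate limit are invented
# placeholders, not the rules used in the study.
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Optional

RARE_LAND_USE = {"park", "forest", "water", "recreation"}  # assumed categories
MIN_GAP = timedelta(minutes=60)                            # assumed rate limit

@dataclass
class Fix:
    timestamp: datetime
    lat: float
    lon: float
    land_use: str              # e.g. looked up from a land-cover database
    population_density: float  # inhabitants per km^2 around this position

def should_trigger(fix: Fix, last_trigger: Optional[datetime]) -> bool:
    """Trigger a self-report when the participant is at a rarely visited
    land-use type or in a sparsely populated area, at most once per hour."""
    if last_trigger is not None and fix.timestamp - last_trigger < MIN_GAP:
        return False
    return fix.land_use in RARE_LAND_USE or fix.population_density < 500

# A coordinate-by-minute stream would be fed through this check:
fix = Fix(datetime(2016, 5, 1, 14, 30), 49.41, 8.69, "park", 120.0)
print(should_trigger(fix, last_trigger=None))   # -> True
```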
Multiparty Quantum Blind Signature Scheme Based on Graph States
NASA Astrophysics Data System (ADS)
Jian-Wu, Liang; Xiao-Shu, Liu; Jin-Jing, Shi; Ying, Guo
2018-05-01
A multiparty quantum blind signature scheme is proposed based on the principle of graph states, in which the unitary operations on graph-state particles can be applied to generate the quantum blind signature and achieve verification. Different from classical blind signatures based on mathematical difficulty, the scheme can guarantee not only anonymity but also unconditional security. The analysis shows that the length of the signature generated by our scheme does not grow as the number of signers increases, and it is easy to increase or decrease the number of signers.
Triangle based TVD schemes for hyperbolic conservation laws
NASA Technical Reports Server (NTRS)
Durlofsky, Louis J.; Osher, Stanley; Engquist, Bjorn
1990-01-01
A triangle based total variation diminishing (TVD) scheme for the numerical approximation of hyperbolic conservation laws in two space dimensions is constructed. The novelty of the scheme lies in the nature of the preprocessing of the cell averaged data, which is accomplished via a nearest neighbor linear interpolation followed by a slope limiting procedure. Two such limiting procedures are suggested. The resulting method is considerably simpler than other triangle based non-oscillatory approximations which, like this scheme, approximate the flux up to second order accuracy. Numerical results for linear advection and Burgers' equation are presented.
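To make the slope-limiting idea concrete, the sketch below shows the standard one-dimensional analogue, a minmod-limited linear reconstruction of cell averages; it is illustrative only and is not the paper's triangle-based two-dimensional procedure.

```python
# One-dimensional analogue of slope limiting: minmod-limited linear
# reconstruction of cell averages (illustrative; not the paper's 2-D
# triangle-based procedure).
import numpy as np

def minmod(a, b):
    """Return the argument of smaller magnitude when a and b share a sign,
    and zero otherwise (this is what keeps the reconstruction TVD)."""
    return np.where(a * b > 0.0, np.where(np.abs(a) < np.abs(b), a, b), 0.0)

def limited_slopes(u, dx):
    """Limited cell slopes built from neighbouring differences of averages."""
    fwd = np.empty_like(u)
    bwd = np.empty_like(u)
    fwd[:-1] = (u[1:] - u[:-1]) / dx    # forward difference
    fwd[-1] = 0.0
    bwd[1:] = (u[1:] - u[:-1]) / dx     # backward difference
    bwd[0] = 0.0
    return minmod(fwd, bwd)

# Example: reconstruct right-face values without creating new extrema
dx = 0.1
u = np.array([1.0, 1.0, 2.0, 4.0, 4.0, 3.0, 0.0])   # cell averages
s = limited_slopes(u, dx)
u_right_face = u + 0.5 * dx * s
print(u_right_face)
```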
Mishra, Dheerendra
2015-01-01
Telecare medical information systems (TMIS) enable healthcare delivery services. However, accessing these services via a public channel raises security and privacy issues. In recent years, several smart card based authentication schemes have been introduced to ensure secure and authorized communication between remote entities over the public channel for TMIS. We analyze the security of some recently proposed authentication schemes for TMIS, namely those of Lin, Xie et al., Cao and Zhai, and Wu and Xu. Unfortunately, we find that these schemes fail to satisfy desirable security attributes. In this article we briefly discuss four dynamic ID-based authentication schemes and demonstrate their failure to satisfy desirable security attributes. The study aims to demonstrate how an inefficient password change phase can lead to a denial-of-service scenario for an authorized user, and how an inefficient login phase increases the communication and computational overhead and decreases the performance of the system. Moreover, we show the vulnerability of Cao and Zhai's scheme to the known session-specific temporary information attack, the vulnerability of Wu and Xu's scheme to the off-line password guessing attack, and the vulnerability of Xie et al.'s scheme to the untraceable on-line password guessing attack.
Dinov, Ivo D; Heavner, Ben; Tang, Ming; Glusman, Gustavo; Chard, Kyle; Darcy, Mike; Madduri, Ravi; Pa, Judy; Spino, Cathie; Kesselman, Carl; Foster, Ian; Deutsch, Eric W; Price, Nathan D; Van Horn, John D; Ames, Joseph; Clark, Kristi; Hood, Leroy; Hampstead, Benjamin M; Dauer, William; Toga, Arthur W
2016-01-01
A unique archive of Big Data on Parkinson's Disease is collected, managed and disseminated by the Parkinson's Progression Markers Initiative (PPMI). The integration of such complex and heterogeneous Big Data from multiple sources offers unparalleled opportunities to study the early stages of prevalent neurodegenerative processes, track their progression and quickly identify the efficacies of alternative treatments. Many previous human and animal studies have examined the relationship of Parkinson's disease (PD) risk to trauma, genetics, environment, co-morbidities, or life style. The defining characteristics of Big Data (large size, incongruency, incompleteness, complexity, multiplicity of scales, and heterogeneity of information-generating sources) all pose challenges to the classical techniques for data management, processing, visualization and interpretation. We propose, implement, test and validate complementary model-based and model-free approaches for PD classification and prediction. To explore PD risk using Big Data methodology, we jointly processed complex PPMI imaging, genetics, clinical and demographic data. Collective representation of the multi-source data facilitates the aggregation and harmonization of complex data elements. This enables joint modeling of the complete data, leading to the development of Big Data analytics, predictive synthesis, and statistical validation. Using heterogeneous PPMI data, we developed a comprehensive protocol for end-to-end data characterization, manipulation, processing, cleaning, analysis and validation. Specifically, we (i) introduce methods for rebalancing imbalanced cohorts, (ii) utilize a wide spectrum of classification methods to generate consistent and powerful phenotypic predictions, and (iii) generate reproducible machine-learning based classification that enables the reporting of model parameters and diagnostic forecasting based on new data. We evaluated several complementary model-based predictive approaches, which failed to generate accurate and reliable diagnostic predictions. However, the results of several machine-learning based classification methods indicated significant power to predict Parkinson's disease in the PPMI subjects (consistent accuracy, sensitivity, and specificity exceeding 96%, confirmed using statistical n-fold cross-validation). Clinical (e.g., Unified Parkinson's Disease Rating Scale (UPDRS) scores), demographic (e.g., age), genetics (e.g., rs34637584, chr12), and derived neuroimaging biomarker (e.g., cerebellum shape index) data all contributed to the predictive analytics and diagnostic forecasting. Model-free Big Data machine learning-based classification methods (e.g., adaptive boosting, support vector machines) can outperform model-based techniques in terms of predictive precision and reliability (e.g., forecasting patient diagnosis). We observed that statistical rebalancing of cohort sizes yields better discrimination of group differences, specifically for predictive analytics based on heterogeneous and incomplete PPMI data. UPDRS scores play a critical role in predicting diagnosis, which is expected based on the clinical definition of Parkinson's disease. Even without longitudinal UPDRS data, however, the accuracy of model-free machine learning based classification is over 80%.
The methods, software and protocols developed here are openly shared and can be employed to study other neurodegenerative disorders (e.g., Alzheimer's, Huntington's, amyotrophic lateral sclerosis), as well as for other predictive Big Data analytics applications.
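The sketch below is a toy outline of the model-free workflow the abstract describes (cohort rebalancing followed by n-fold cross-validated boosting); it runs on synthetic data, not on PPMI data, and the rebalancing step is deliberately simplified.

```python
# Toy sketch of rebalancing + model-free classification with n-fold
# cross-validation. Synthetic data only; not the PPMI analysis itself.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import cross_val_score
from sklearn.utils import resample

# Imbalanced synthetic cohort (stand-in for cases vs. controls)
X, y = make_classification(n_samples=600, n_features=20, weights=[0.85, 0.15],
                           random_state=1)

# Simple rebalancing by up-sampling the minority class
# (in practice rebalancing should be done inside each training fold
#  to avoid leakage; this is a simplification for illustration)
minority, majority = X[y == 1], X[y == 0]
minority_up = resample(minority, replace=True, n_samples=len(majority),
                       random_state=1)
X_bal = np.vstack([majority, minority_up])
y_bal = np.concatenate([np.zeros(len(majority)), np.ones(len(minority_up))])

# Model-free classifier evaluated with 5-fold cross-validation
clf = AdaBoostClassifier(n_estimators=200, random_state=1)
acc = cross_val_score(clf, X_bal, y_bal, cv=5, scoring="accuracy")
print("cross-validated accuracy:", acc.mean().round(3))
```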
High-Order Semi-Discrete Central-Upwind Schemes for Multi-Dimensional Hamilton-Jacobi Equations
NASA Technical Reports Server (NTRS)
Bryson, Steve; Levy, Doron; Biegel, Bryan (Technical Monitor)
2002-01-01
We present the first fifth-order, semi-discrete central-upwind method for approximating solutions of multi-dimensional Hamilton-Jacobi equations. Unlike most of the commonly used high-order upwind schemes, our scheme is formulated as a Godunov-type scheme. The scheme is based on the fluxes of Kurganov-Tadmor and Kurganov-Tadmor-Petrova, and is derived for an arbitrary number of space dimensions. A theorem establishing the monotonicity of these fluxes is provided. The spatial discretization is based on a weighted essentially non-oscillatory reconstruction of the derivative. The accuracy and stability properties of our scheme are demonstrated in a variety of examples. A comparison between our method and other fifth-order schemes for Hamilton-Jacobi equations shows that our method exhibits smaller errors without any increase in the complexity of the computations.
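For orientation, in one space dimension the semi-discrete central-upwind update for an equation of the form phi_t + H(phi_x) = 0 is commonly written as below, with nonnegative one-sided local speeds and one-sided derivative approximations at each grid point (a generic textbook form quoted as background; the paper's contribution is the fifth-order WENO reconstruction of these derivatives and the multi-dimensional extension).

```latex
% Generic 1-D semi-discrete central-upwind scheme for \phi_t + H(\phi_x) = 0,
% with nonnegative one-sided local speeds a_j^{\pm} and one-sided derivative
% reconstructions \phi_x^{\pm} at the grid point x_j; it reduces to the local
% Lax-Friedrichs scheme when a_j^{+} = a_j^{-}.
\frac{d\phi_j}{dt}
  = -\,\frac{a_j^{+}\,H\!\left(\phi_x^{-}\right) + a_j^{-}\,H\!\left(\phi_x^{+}\right)}
            {a_j^{+} + a_j^{-}}
    \;+\; \frac{a_j^{+} a_j^{-}}{a_j^{+} + a_j^{-}}\,
          \left(\phi_x^{+} - \phi_x^{-}\right)
```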
A Quantum Proxy Blind Signature Scheme Based on Genuine Five-Qubit Entangled State
NASA Astrophysics Data System (ADS)
Zeng, Chuan; Zhang, Jian-Zhong; Xie, Shu-Cui
2017-06-01
In this paper, a quantum proxy blind signature scheme based on controlled quantum teleportation is proposed. The scheme uses a genuine five-qubit entangled state as the quantum channel and adopts the classical Vernam algorithm to blind the message. We use the physical characteristics of quantum mechanics to implement delegation, signature and verification. Security analysis shows that our scheme is valid and satisfies the properties of a proxy blind signature, such as blindness, verifiability, unforgeability and undeniability.
2011-07-01
10%. These results demonstrate that the IOP-based BRDF correction scheme (which is composed of the R„ model along with the IOP retrieval...distribution was averaged over 10 min. 5. Validation of the IOP-Based BRDF Correction Scheme. The IOP-based BRDF correction scheme is applied to both...oceanic and coastal waters were very consistent qualitatively and quantitatively and thus validate the IOP-based BRDF correction system, at least
NASA Astrophysics Data System (ADS)
Zhang, Junwei; Hong, Xuezhi; Liu, Jie; Guo, Changjian
2018-04-01
In this work, we investigate and experimentally demonstrate an orthogonal frequency division multiplexing (OFDM) based high-speed wavelength-division multiplexed (WDM) visible light communication (VLC) system using an inter-block data precoding and superimposed pilots (DP-SP) based channel estimation (CE) scheme. The residual signal-to-pilot interference (SPI) can be eliminated by using inter-block data precoding, resulting in a significant improvement in estimation accuracy and overall system performance compared with the uncoded SP based CE scheme. We also study the power allocation/overhead problem of the training for the DP-SP, uncoded SP and conventional preamble based CE schemes, from which we obtain the optimum signal-to-pilot power ratio (SPR)/overhead percentage for all the above cases. Intra-symbol frequency-domain averaging (ISFA) is also adopted to further enhance the accuracy of CE. By using the DP-SP based CE scheme, aggregate data rates of 1.87-Gbit/s and 1.57-Gbit/s are experimentally demonstrated over 0.8-m and 2-m indoor free-space transmission, respectively, using a commercially available red, green and blue (RGB) light emitting diode (LED) with WDM. Experimental results show that the DP-SP based CE scheme is comparable to the conventional preamble based CE scheme in terms of received Q factor and data rate while entailing a much smaller overhead.
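The toy numpy sketch below illustrates why pairing data blocks removes signal-to-pilot interference in superimposed-pilot channel estimation: the second block carries the negated data of the first, so averaging the two received blocks leaves only the pilot contribution. The pairing precoder, channel model and constellation here are illustrative assumptions, not necessarily the precoder or link used in the paper.

```python
# Toy illustration of superimposed-pilot channel estimation with a simple
# inter-block precoder: block 2 carries the negated data of block 1, so the
# data term cancels when the two received blocks are averaged.
# Arbitrary channel and QPSK constellation; not the paper's exact scheme.
import numpy as np

rng = np.random.default_rng(0)
n_sc = 64                                          # OFDM subcarriers
h = (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc)) / np.sqrt(2)  # channel
pilot = np.ones(n_sc, dtype=complex)               # superimposed pilot sequence
data1 = ((rng.integers(0, 2, n_sc) * 2 - 1) +
         1j * (rng.integers(0, 2, n_sc) * 2 - 1)) / np.sqrt(2)   # QPSK data
data2 = -data1                                     # inter-block pairing precoder

def receive(data):
    tx = data + 0.3 * pilot                        # superimpose pilot on data
    noise = 0.05 * (rng.normal(size=n_sc) + 1j * rng.normal(size=n_sc))
    return h * tx + noise

y1, y2 = receive(data1), receive(data2)

# Averaging the paired blocks removes the data (SPI) term exactly
h_dp_sp = (y1 + y2) / (2 * 0.3 * pilot)

# Without precoding, the data of a single block leaks into the estimate
h_uncoded = y1 / (0.3 * pilot)

print("DP-SP estimation error:    ", np.mean(np.abs(h_dp_sp - h) ** 2))
print("uncoded SP estimation error:", np.mean(np.abs(h_uncoded - h) ** 2))
```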
A digital memories based user authentication scheme with privacy preservation.
Liu, JunLiang; Lyu, Qiuyun; Wang, Qiuhua; Yu, Xiangxiang
2017-01-01
The traditional username/password or PIN based authentication scheme, which still remains the most popular form of authentication, has been proved insecure, unmemorable and vulnerable to guessing, dictionary attack, key-logger, shoulder-surfing and social engineering. Based on this, a large number of new alternative methods have recently been proposed. However, most of them rely on users being able to accurately recall complex and unmemorable information or using extra hardware (such as a USB Key), which makes authentication more difficult and confusing. In this paper, we propose a Digital Memories based user authentication scheme adopting homomorphic encryption and a public key encryption design which can protect users' privacy effectively, prevent tracking and provide multi-level security in an Internet & IoT environment. Also, we prove the superior reliability and security of our scheme compared to other schemes and present a performance analysis and promising evaluation results.
1980-12-01
Prepared for the U.S. Army Corps of Engineers, Engineer Topographic Laboratories, Fort Belvoir; report date December 1980; approved for public release, distribution unlimited. Infrared and panchromatic imagery was collected by the Oregon Army National Guard at the Corvallis, Oregon, test site on 13 and 19 August 1980.
Multi-Source Fusion for Explosive Hazard Detection in Forward Looking Sensors
2016-12-01
include: (1) investigating (a) thermal, (b) synthetic aperture acoustics (SAA) and (c) voxel-space radar for buried and side attack threats; (2) ...detection; (3) with respect to SAA, we developed new approaches in the time and frequency domains for analyzing signatures of concealed targets (called ...Fraz). We also developed a method to extract a multi-spectral signature from SAA, and deep learning was used with limited training data and class imbalance
A Robust and Effective Smart-Card-Based Remote User Authentication Mechanism Using Hash Function
Odelu, Vanga; Goswami, Adrijit
2014-01-01
In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and, for mutual authentication purposes, the login user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using passwords, biometrics, and smart cards have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we propose a new robust and effective password-based remote user authentication scheme using smart cards. Our scheme is efficient because it uses only an efficient one-way hash function and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme efficiently and correctly supports the password change phase, which is always performed locally without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overhead, security, and the features it provides. PMID:24892078
A robust and effective smart-card-based remote user authentication mechanism using hash function.
Das, Ashok Kumar; Odelu, Vanga; Goswami, Adrijit
2014-01-01
In a remote user authentication scheme, a remote server verifies whether a login user is genuine and trustworthy, and, for mutual authentication purposes, the login user likewise validates whether the remote server is genuine and trustworthy. Several remote user authentication schemes using passwords, biometrics, and smart cards have been proposed in the literature. However, most schemes proposed in the literature are either computationally expensive or insecure against several known attacks. In this paper, we propose a new robust and effective password-based remote user authentication scheme using smart cards. Our scheme is efficient because it uses only an efficient one-way hash function and bitwise XOR operations. Through rigorous informal and formal security analysis, we show that our scheme is secure against possible known attacks. We perform the simulation for the formal security analysis using the widely accepted AVISPA (Automated Validation of Internet Security Protocols and Applications) tool to ensure that our scheme is secure against passive and active attacks. Furthermore, our scheme efficiently and correctly supports the password change phase, which is always performed locally without contacting the remote server. In addition, our scheme performs significantly better than other existing schemes in terms of communication, computational overhead, security, and the features it provides.
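The sketch below is a toy illustration of the kind of hash-and-XOR operations such smart-card schemes are built from (registration, card personalization, and a fresh login proof). The variable names and message flow are illustrative assumptions, not the authors' actual protocol.

```python
# Toy sketch of hash-and-XOR smart-card authentication primitives.
# Illustrative only; the message flow is an assumption, not the paper's scheme.
import hashlib, os

def h(*parts: bytes) -> bytes:
    """One-way hash over the concatenation of its arguments."""
    return hashlib.sha256(b"||".join(parts)).digest()

def xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

# --- registration (server side) -----------------------------------------
server_secret = os.urandom(32)                 # server master key
identity, password = b"alice", b"correct horse"
r = os.urandom(32)                             # user-chosen random salt
A = h(identity, server_secret)                 # server-derived card secret
card = {"B": xor(A, h(password, r)), "r": r}   # values stored on the smart card

# --- login (card + user side) --------------------------------------------
A_recovered = xor(card["B"], h(password, card["r"]))   # needs the right password
nonce = os.urandom(16)
proof = h(A_recovered, nonce)                  # fresh login proof

# --- verification (server side) -------------------------------------------
expected = h(h(identity, server_secret), nonce)
print("login accepted:", proof == expected)    # True for the correct password
```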
A remark on the GNSS single difference model with common clock scheme for attitude determination
NASA Astrophysics Data System (ADS)
Chen, Wantong
2016-09-01
GNSS-based attitude determination is an important field of study, in which two schemes can be used to construct the actual system: the common clock scheme and the non-common clock scheme. Compared with the non-common clock scheme, the common clock scheme can strongly improve both reliability and accuracy. However, to gain these advantages, specific care must be taken in the implementation. These considerations are discussed, based on how carrier phase measurements are generated in GNSS receivers. A qualitative assessment of potential phase bias contributions is also carried out. Possible technical difficulties are pointed out for the development of single-board multi-antenna GNSS attitude systems with a common clock.
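For context, the between-antenna single-difference carrier phase observation that such systems work with can be written as below (the standard short-baseline textbook form, stated as background rather than the paper's exact model). With a common clock, the differential receiver clock term reduces to a slowly varying line bias d, which is what makes single-difference ambiguity resolution attractive.

```latex
% Between-antenna single-difference carrier phase to satellite s
% (standard short-baseline model, stated as background):
%   \lambda        carrier wavelength
%   \mathbf{e}^{s} unit line-of-sight vector to satellite s
%   \mathbf{b}     baseline vector between the two antennas
%   \Delta N^{s}   integer single-difference ambiguity
%   d              differential receiver/hardware delay (a stable line bias
%                  when both antennas share a common clock)
\lambda\,\Delta\phi^{s}
  = \mathbf{e}^{s}\!\cdot\!\mathbf{b}
  + \lambda\,\Delta N^{s}
  + d
  + \varepsilon^{s}
```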
NASA Astrophysics Data System (ADS)
Chao, Luo
2015-11-01
In this paper, a novel digital secure communication scheme is proposed. Different from the usual secure communication schemes based on chaotic synchronization, the proposed scheme employs asynchronous communication, which avoids the weakness of synchronous systems of being susceptible to environmental interference. Moreover, with respect to transmission errors and data loss during communication, the proposed scheme is able to perform error checking and error correction in real time. In order to guarantee security, a fractional-order complex chaotic system with shifting of the order is utilized to modulate the transmitted signal, which has high nonlinearity and complexity in both the frequency and time domains. The corresponding numerical simulations demonstrate the effectiveness and feasibility of the scheme.
Student Loans Schemes in Mauritius: Experience, Analysis and Scenarios
ERIC Educational Resources Information Center
Mohadeb, Praveen
2006-01-01
This study makes a comprehensive review of the situation of student loans schemes in Mauritius, and makes recommendations, based on best practices, for setting up a national scheme that attempts to avoid weaknesses identified in some of the loans schemes of other countries. It suggests that such a scheme would be cost-effective and beneficial both…
A Hybrid Key Management Scheme for WSNs Based on PPBR and a Tree-Based Path Key Establishment Method
Zhang, Ying; Liang, Jixing; Zheng, Bingxin; Chen, Wei
2016-01-01
With the development of wireless sensor networks (WSNs), in most application scenarios traditional WSNs with static sink nodes will gradually be replaced by Mobile Sinks (MSs), and the corresponding applications require a secure communication environment. Current key management research pays less attention to the security of sensor networks with MSs. This paper proposes a hybrid key management scheme based on Polynomial Pool-based and Basic Random key pre-distribution (PPBR) for use in WSNs with MSs. The scheme takes full advantage of these two kinds of methods to increase the difficulty of cracking the key system. The storage effectiveness and the network resilience can be significantly enhanced as well. A tree-based path key establishment method is introduced to effectively solve the problem of communication link connectivity. Simulation clearly shows that the proposed scheme performs better in terms of network resilience, connectivity and storage effectiveness compared to other widely used schemes. PMID:27070624
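As background, polynomial-pool key pre-distribution builds on symmetric bivariate polynomial shares: each node receives the univariate share f(id, y) of a symmetric polynomial f(x, y) over a finite field, and any two nodes can then compute the pairwise key f(id1, id2) = f(id2, id1). The sketch below is a minimal single-polynomial illustration with a toy field size and degree, not the full pool-based scheme with a mobile sink.

```python
# Minimal illustration of a symmetric bivariate polynomial key share
# (the building block behind polynomial-pool key pre-distribution).
# Toy prime and degree; not the full PPBR scheme.
import random

P = 2**61 - 1          # field modulus (toy choice)
T = 3                  # degree (tolerates up to T colluding compromised nodes)

# Symmetric coefficient matrix a[i][j] = a[j][i] defines f(x, y)
random.seed(7)
a = [[0] * (T + 1) for _ in range(T + 1)]
for i in range(T + 1):
    for j in range(i, T + 1):
        a[i][j] = a[j][i] = random.randrange(P)

def share(node_id):
    """Univariate share g(y) = f(node_id, y), stored on the node as
    its T+1 coefficients."""
    return [sum(a[i][j] * pow(node_id, i, P) for i in range(T + 1)) % P
            for j in range(T + 1)]

def pairwise_key(my_share, peer_id):
    """Evaluate the stored share at the peer's id to get f(node_id, peer_id)."""
    return sum(c * pow(peer_id, j, P) for j, c in enumerate(my_share)) % P

share_5, share_9 = share(5), share(9)
assert pairwise_key(share_5, 9) == pairwise_key(share_9, 5)   # same key
```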
TripSense: A Trust-Based Vehicular Platoon Crowdsensing Scheme with Privacy Preservation in VANETs
Hu, Hao; Lu, Rongxing; Huang, Cheng; Zhang, Zonghua
2016-01-01
In this paper, we propose a trust-based vehicular platoon crowdsensing scheme, named TripSense, in VANET. The proposed TripSense scheme introduces a trust-based system to evaluate vehicles’ sensing abilities and then selects the more capable vehicles in order to improve sensing results accuracy. In addition, the sensing tasks are accomplished by platoon member vehicles and preprocessed by platoon head vehicles before the data are uploaded to server. Hence, it is less time-consuming and more efficient compared with the way where the data are submitted by individual platoon member vehicles. Hence it is more suitable in ephemeral networks like VANET. Moreover, our proposed TripSense scheme integrates unlinkable pseudo-ID techniques to achieve PM vehicle identity privacy, and employs a privacy-preserving sensing vehicle selection scheme without involving the PM vehicle’s trust score to keep its location privacy. Detailed security analysis shows that our proposed TripSense scheme not only achieves desirable privacy requirements but also resists against attacks launched by adversaries. In addition, extensive simulations are conducted to show the correctness and effectiveness of our proposed scheme. PMID:27258287