Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment
NASA Technical Reports Server (NTRS)
Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun
2006-01-01
Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.
48 CFR 9904.418-50 - Techniques for application.
Code of Federal Regulations, 2010 CFR
2010-10-01
... objective's allocation base data shall be excluded from the base used to allocate the pool. (g) Use of... which the pool relates. (c) Change in allocation base. No change in an existing indirect cost pool allocation base is required if the allocation resulting from the existing base does not differ materially...
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
Noise suppression in surface microseismic data
Forghani-Arani, Farnoush; Batzle, Mike; Behura, Jyoti; Willis, Mark; Haines, Seth S.; Davidson, Michael
2012-01-01
We introduce a passive noise suppression technique, based on the τ − p transform. In the τ − p domain, one can separate microseismic events from surface noise based on distinct characteristics that are not visible in the time-offset domain. By applying the inverse τ − p transform to the separated microseismic event, we suppress the surface noise in the data. Our technique significantly improves the signal-to-noise ratios of the microseismic events and is superior to existing techniques for passive noise suppression in the sense that it preserves the waveform.
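To make the workflow in the abstract above concrete, the following is a minimal NumPy sketch of a generic slant-stack (τ-p) filtering pass: forward transform, mute of the high-slowness band where surface waves map, and adjoint transform back to the time-offset domain. It is not the authors' implementation; the synthetic gather, slowness range, and mute threshold are illustrative assumptions.

```python
import numpy as np

def taup_forward(data, dt, offsets, slownesses):
    """Linear slant stack: taup[tau, p] = sum over x of data[tau + p*x, x]."""
    nt = data.shape[0]
    t = np.arange(nt) * dt
    taup = np.zeros((nt, len(slownesses)))
    for ip, p in enumerate(slownesses):
        for ix, x in enumerate(offsets):
            taup[:, ip] += np.interp(t + p * x, t, data[:, ix], left=0.0, right=0.0)
    return taup

def taup_adjoint(taup, dt, offsets, slownesses):
    """Adjoint slant stack: data[t, x] += taup[t - p*x, p]."""
    nt = taup.shape[0]
    t = np.arange(nt) * dt
    data = np.zeros((nt, len(offsets)))
    for ip, p in enumerate(slownesses):
        for ix, x in enumerate(offsets):
            data[:, ix] += np.interp(t - p * x, t, taup[:, ip], left=0.0, right=0.0)
    return data

# Synthetic gather: a fast (near-flat) microseismic arrival plus slow, steeply
# dipping surface-wave noise; all numbers below are illustrative only.
dt, offsets = 0.001, np.arange(0.0, 500.0, 10.0)
t = np.arange(1000) * dt
data = np.zeros((len(t), len(offsets)))
for ix, x in enumerate(offsets):
    data[int((0.3 + 1e-4 * x) / dt), ix] += 1.0   # microseismic event, slowness 0.1 ms/m
    data[int((0.1 + 8e-4 * x) / dt), ix] += 2.0   # surface noise, slowness 0.8 ms/m

slownesses = np.linspace(0.0, 1e-3, 101)
m = taup_forward(data, dt, offsets, slownesses)
m[:, slownesses > 4e-4] = 0.0                     # mute the surface-noise band in tau-p
denoised = taup_adjoint(m, dt, offsets, slownesses) / len(slownesses)
```

Because the adjoint only approximates the inverse τ-p transform, a least-squares or sparse inversion of the transform would be needed to preserve waveforms as strictly as the abstract claims.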
Semantics driven approach for knowledge acquisition from EMRs.
Perera, Sujan; Henson, Cory; Thirunarayan, Krishnaprasad; Sheth, Amit; Nair, Suhas
2014-03-01
Semantic computing technologies have matured to be applicable to many critical domains such as national security, life sciences, and health care. However, the key to their success is the availability of a rich domain knowledge base. The creation and refinement of domain knowledge bases pose difficult challenges. The existing knowledge bases in the health care domain are rich in taxonomic relationships, but they lack nontaxonomic (domain) relationships. In this paper, we describe a semiautomatic technique for enriching existing domain knowledge bases with causal relationships gleaned from Electronic Medical Records (EMR) data. We determine missing causal relationships between domain concepts by validating domain knowledge against EMR data sources and leveraging semantic-based techniques to derive plausible relationships that can rectify knowledge gaps. Our evaluation demonstrates that semantic techniques can be employed to improve the efficiency of knowledge acquisition.
NASA Technical Reports Server (NTRS)
Djorgovski, George
1993-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
NASA Technical Reports Server (NTRS)
Djorgovski, Stanislav
1992-01-01
The existing and forthcoming data bases from NASA missions contain an abundance of information whose complexity cannot be efficiently tapped with simple statistical techniques. Powerful multivariate statistical methods already exist which can be used to harness much of the richness of these data. Automatic classification techniques have been developed to solve the problem of identifying known types of objects in multiparameter data sets, in addition to leading to the discovery of new physical phenomena and classes of objects. We propose an exploratory study and integration of promising techniques in the development of a general and modular classification/analysis system for very large data bases, which would enhance and optimize data management and the use of human research resources.
Conditional Random Field-Based Offline Map Matching for Indoor Environments
Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram
2016-01-01
In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm. PMID:27537892
Conditional Random Field-Based Offline Map Matching for Indoor Environments.
Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram
2016-08-16
In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm.
A ROle-Oriented Filtering (ROOF) approach for collaborative recommendation
NASA Astrophysics Data System (ADS)
Ghani, Imran; Jeong, Seung Ryul
2016-09-01
In collaborative filtering (CF) recommender systems, existing techniques frequently focus on determining similarities among users' historical interests. This generally refers to situations in which each user normally plays a single role and his/her taste remains consistent over the long term. However, we note that existing techniques have not been significantly employed in a role-oriented context. This is especially so in situations where users may change their roles over time or play multiple roles simultaneously, while still expecting to access relevant information resources accordingly. Such systems include enterprise architecture management systems, e-commerce sites or journal management systems. In scenarios involving existing techniques, each user needs to build up very different profiles (preferences and interests) based on multiple roles which change over time. Should this not occur to a satisfactory degree, their previous information will either be lost or not utilised at all. To limit the occurrence of such issues, we propose a ROle-Oriented Filtering (ROOF) approach focusing on the manner in which multiple user profiles are obtained and maintained over time. We conducted a number of experiments using an enterprise architecture management scenario. In so doing, we observed that the ROOF approach performs better in comparison with other existing collaborative filtering-based techniques.
Steganography based on pixel intensity value decomposition
NASA Astrophysics Data System (ADS)
Abdulla, Alan Anwar; Sellahewa, Harin; Jassim, Sabah A.
2014-05-01
This paper focuses on steganography based on pixel intensity value decomposition. A number of existing schemes such as binary, Fibonacci, Prime, Natural, Lucas, and Catalan-Fibonacci (CF) are evaluated in terms of payload capacity and stego quality. A new technique based on a specific representation is proposed to decompose pixel intensity values into 16 (virtual) bit-planes suitable for embedding purposes. The proposed decomposition has a desirable property whereby the sum of all bit-planes does not exceed the maximum pixel intensity value, i.e. 255. Experimental results demonstrate that the proposed technique offers an effective compromise between payload capacity and stego quality of existing embedding techniques based on pixel intensity value decomposition. Its capacity is equal to that of binary and Lucas, while it offers a higher capacity than Fibonacci, Prime, Natural, and CF when the secret bits are embedded in 1st Least Significant Bit (LSB). When the secret bits are embedded in higher bit-planes, i.e., 2nd LSB to 8th Most Significant Bit (MSB), the proposed scheme has more capacity than Natural numbers based embedding. However, from the 6th bit-plane onwards, the proposed scheme offers better stego quality. In general, the proposed decomposition scheme has less effect in terms of quality on pixel value when compared to most existing pixel intensity value decomposition techniques when embedding messages in higher bit-planes.
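As a concrete illustration of pixel intensity value decomposition, the sketch below embeds secret bits in the lowest plane of a Fibonacci (Zeckendorf) decomposition, one of the reference schemes the abstract evaluates. The paper's own 16-virtual-bit-plane representation is not reproduced; the weight list, pixel values, and secret bits are illustrative assumptions.

```python
import numpy as np

# Fibonacci weights used as "virtual bit-plane" weights (illustrative; the paper's
# proposed 16-plane decomposition is not specified here).
FIB = [1, 2, 3, 5, 8, 13, 21, 34, 55, 89, 144, 233]   # covers 0..255 via greedy sums

def fib_decompose(v):
    """Greedy Zeckendorf-style decomposition of an 8-bit value into Fibonacci planes."""
    bits = [0] * len(FIB)
    for i in range(len(FIB) - 1, -1, -1):
        if FIB[i] <= v:
            bits[i], v = 1, v - FIB[i]
    return bits

def fib_compose(bits):
    return sum(w for w, b in zip(FIB, bits) if b)

def embed_lsb(pixel, secret_bit):
    """Embed one secret bit in the lowest Fibonacci plane of a pixel value."""
    bits = fib_decompose(int(pixel))
    bits[0] = secret_bit
    return np.uint8(min(fib_compose(bits), 255))

# Usage: hide four secret bits in four cover pixels (all values are arbitrary).
cover = np.array([37, 120, 200, 64], dtype=np.uint8)
stego = np.array([embed_lsb(p, b) for p, b in zip(cover, [1, 0, 1, 1])])
```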
NASA Astrophysics Data System (ADS)
Zhao, Jin; Han-Ming, Zhang; Bin, Yan; Lei, Li; Lin-Yuan, Wang; Ai-Long, Cai
2016-03-01
Sparse-view x-ray computed tomography (CT) imaging is an interesting topic in the CT field and can efficiently decrease radiation dose. Compared with spatial reconstruction, a Fourier-based algorithm has advantages in reconstruction speed and memory usage. A novel Fourier-based iterative reconstruction technique that utilizes the non-uniform fast Fourier transform (NUFFT) is presented in this work along with advanced total variation (TV) regularization for fan-beam sparse-view CT. The proposition of a selective matrix contributes to improve reconstruction quality. The new method employs the NUFFT and its adjoint to iterate back and forth between the Fourier and image space. The performance of the proposed algorithm is demonstrated through a series of digital simulations and experimental phantom studies. Results of the proposed algorithm are compared with those of existing TV-regularized techniques based on the compressed sensing method, as well as the basic algebraic reconstruction technique. Compared with the existing TV-regularized techniques, the proposed Fourier-based technique significantly improves the convergence rate and reduces memory allocation. Project supported by the National High Technology Research and Development Program of China (Grant No. 2012AA011603) and the National Natural Science Foundation of China (Grant No. 61372172).
NLPIR: A Theoretical Framework for Applying Natural Language Processing to Information Retrieval.
ERIC Educational Resources Information Center
Zhou, Lina; Zhang, Dongsong
2003-01-01
Proposes a theoretical framework called NLPIR that integrates natural language processing (NLP) into information retrieval (IR) based on the assumption that there exists representation distance between queries and documents. Discusses problems in traditional keyword-based IR, including relevance, and describes some existing NLP techniques.…
Efficient morse decompositions of vector fields.
Chen, Guoning; Mischaikow, Konstantin; Laramee, Robert S; Zhang, Eugene
2008-01-01
Existing topology-based vector field analysis techniques rely on the ability to extract the individual trajectories such as fixed points, periodic orbits, and separatrices that are sensitive to noise and errors introduced by simulation and interpolation. This can make such vector field analysis unsuitable for rigorous interpretations. We advocate the use of Morse decompositions, which are robust with respect to perturbations, to encode the topological structures of a vector field in the form of a directed graph, called a Morse connection graph (MCG). While an MCG exists for every vector field, it need not be unique. Previous techniques for computing MCGs, while fast, are overly conservative and usually result in MCGs that are too coarse to be useful for the applications. To address this issue, we present a new technique for performing Morse decomposition based on the concept of tau-maps, which typically provides finer MCGs than existing techniques. Furthermore, the choice of tau provides a natural tradeoff between the fineness of the MCGs and the computational costs. We provide efficient implementations of Morse decomposition based on tau-maps, which include the use of forward and backward mapping techniques and an adaptive approach in constructing better approximations of the images of the triangles in the meshes used for simulation. Furthermore, we propose the use of spatial tau-maps in addition to the original temporal tau-maps. These techniques provide additional trade-offs between the quality of the MCGs and the speed of computation. We demonstrate the utility of our technique with various examples in the plane and on surfaces including engine simulation data sets.
Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout RH; Stewart-Knox, Barbara J; Mathers, John C
2018-01-01
Background To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. Objective The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. Methods The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype–based, and intake+phenotype+gene–based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Results Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. Conclusions The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and was the first internet-based personalized nutrition intervention to use such a framework remotely. Trial Registration ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1) PMID:29631993
Determining the semantic similarities among Gene Ontology terms.
Taha, Kamal
2013-05-01
We present in this paper novel techniques that determine the semantic relationships among Gene Ontology (GO) terms. We implemented these techniques in a prototype system called GoSE, which resides between user application and GO database. Given a set S of GO terms, GoSE would return another set S' of GO terms, where each term in S' is semantically related to each term in S. Most current research is focused on determining the semantic similarities among GO terms based solely on their IDs and proximity to one another in the GO graph structure, while overlooking the contexts of the terms, which may lead to erroneous results. The context of a GO term T is the set of other terms, whose existence in the GO graph structure is dependent on T. We propose novel techniques that determine the contexts of terms based on the concept of existence dependency. We present a stack-based sort-merge algorithm employing these techniques for determining the semantic similarities among GO terms. We evaluated GoSE experimentally and compared it with three existing methods. The results of measuring the semantic similarities among genes in KEGG and Pfam pathways retrieved from the DBGET and Sanger Pfam databases, respectively, have shown that our method outperforms the other three methods in recall and precision.
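For orientation, the snippet below computes a generic ancestor-overlap (Jaccard) similarity between two GO terms over a tiny is_a hierarchy. This is a common baseline rather than GoSE's context/existence-dependency measure, and the hard-coded term IDs and parent links are illustrative only.

```python
# Generic ancestor-overlap similarity for two GO terms, given an is_a parent map.
# Not GoSE's measure; the terms and links below are illustrative.
parents = {
    "GO:0006915": {"GO:0012501"},   # apoptotic process is_a programmed cell death
    "GO:0012501": {"GO:0008219"},   # programmed cell death is_a cell death
    "GO:0008219": {"GO:0009987"},   # cell death is_a cellular process
    "GO:0009987": set(),
}

def ancestors(term):
    """All terms reachable by walking is_a edges upward, plus the term itself."""
    seen, stack = set(), [term]
    while stack:
        t = stack.pop()
        for p in parents.get(t, ()):
            if p not in seen:
                seen.add(p)
                stack.append(p)
    return seen | {term}

def jaccard_similarity(t1, t2):
    a1, a2 = ancestors(t1), ancestors(t2)
    return len(a1 & a2) / len(a1 | a2)

print(jaccard_similarity("GO:0006915", "GO:0008219"))   # 0.5 for this toy hierarchy
```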
The Optimization of In-Memory Space Partitioning Trees for Cache Utilization
NASA Astrophysics Data System (ADS)
Yeo, Myung Ho; Min, Young Soo; Bok, Kyoung Soo; Yoo, Jae Soo
In this paper, a novel cache-conscious indexing technique based on space partitioning trees is proposed. Many researchers have recently investigated efficient cache-conscious indexing techniques that improve the retrieval performance of in-memory database management systems. However, most studies considered data partitioning and targeted fast information retrieval. Existing data partitioning-based index structures significantly degrade performance due to the redundant accesses of overlapped spaces. In particular, R-tree-based index structures suffer from the propagation of MBR (Minimum Bounding Rectangle) information when data are updated frequently. In this paper, we propose an in-memory space partitioning index structure for optimal cache utilization. The proposed index structure is compared with the existing index structures in terms of update performance, insertion performance and cache-utilization rate in a variety of environments. The results demonstrate that the proposed index structure offers better performance than existing index structures.
Graph-based real-time fault diagnostics
NASA Technical Reports Server (NTRS)
Padalkar, S.; Karsai, G.; Sztipanovits, J.
1988-01-01
A real-time fault detection and diagnosis capability is absolutely crucial in the design of large-scale space systems. Some of the existing AI-based fault diagnostic techniques like expert systems and qualitative modelling are frequently ill-suited for this purpose. Expert systems are often inadequately structured, difficult to validate and suffer from knowledge acquisition bottlenecks. Qualitative modelling techniques sometimes generate a large number of failure source alternatives, thus hampering speedy diagnosis. In this paper we present a graph-based technique which is well suited for real-time fault diagnosis, structured knowledge representation and acquisition, and testing and validation. A Hierarchical Fault Model of the system to be diagnosed is developed. At each level of hierarchy, there exist fault propagation digraphs denoting causal relations between failure modes of subsystems. The edges of such a digraph are weighted with fault propagation time intervals. Efficient and restartable graph algorithms are used for on-line speedy identification of failure source components.
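The sketch below illustrates the basic idea of such a fault propagation digraph: edges carry propagation-time intervals, and a failure mode is a candidate source if the cumulative intervals along some path can explain every observed alarm time. It is not the paper's hierarchical algorithm; the graph, timings, and alarm values are hypothetical, and the search assumes an acyclic digraph.

```python
from collections import defaultdict

# Fault propagation digraph: edge u -> v labelled with a propagation-time
# interval [lo, hi] in seconds. All nodes and numbers are hypothetical.
edges = {
    ("pump_fail", "low_pressure"): (1.0, 3.0),
    ("low_pressure", "engine_hot"): (2.0, 5.0),
    ("sensor_fault", "engine_hot"): (0.5, 1.0),
}
graph = defaultdict(list)
for (u, v), (lo, hi) in edges.items():
    graph[u].append((v, lo, hi))

def propagation_windows(src, dst, t_lo=0.0, t_hi=0.0):
    """All cumulative [lo, hi] delay windows along paths src -> dst (acyclic graph assumed)."""
    if src == dst:
        return [(t_lo, t_hi)]
    out = []
    for nxt, lo, hi in graph.get(src, []):
        out += propagation_windows(nxt, dst, t_lo + lo, t_hi + hi)
    return out

def consistent_sources(alarms):
    """Failure modes whose propagation windows can explain all observed alarm times."""
    candidates = []
    for src in graph:
        if all(any(lo <= t_obs <= hi for lo, hi in propagation_windows(src, alarm))
               for alarm, t_obs in alarms.items()):
            candidates.append(src)
    return candidates

# An "engine_hot" alarm 4 s after the reference time is consistent with "pump_fail"
# (cumulative window 3..8 s) and "low_pressure" (2..5 s), but not "sensor_fault" (0.5..1 s).
print(consistent_sources({"engine_hot": 4.0}))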
A Survey on Anomaly Based Host Intrusion Detection System
NASA Astrophysics Data System (ADS)
Jose, Shijoe; Malathi, D.; Reddy, Bharath; Jayaseeli, Dorathi
2018-04-01
An intrusion detection system (IDS) is hardware, software, or a combination of the two that monitors network or system activities to detect signs of malicious behaviour. In computer security, designing a robust intrusion detection system is one of the most fundamental and important problems. The primary function of such a system is to detect intrusions in a timely manner and to alert the system administrator when one is found. Anomaly detection is an important problem that has been researched within diverse research areas and application domains. This survey tries to provide a structured and comprehensive overview of the research on anomaly detection. Each existing anomaly detection technique has relative strengths and weaknesses. The current state of experimental practice in the field of anomaly-based intrusion detection is reviewed, together with recent studies in this area. This survey provides a study of existing anomaly detection techniques and of how the techniques used in one area can be applied in another application domain.
A New Adaptive Framework for Collaborative Filtering Prediction
Almosallam, Ibrahim A.; Shang, Yi
2010-01-01
Collaborative filtering is one of the most successful techniques for recommendation systems and has been used in many commercial services provided by major companies including Amazon, TiVo and Netflix. In this paper we focus on memory-based collaborative filtering (CF). Existing CF techniques work well on dense data but poorly on sparse data. To address this weakness, we propose to use z-scores instead of explicit ratings and introduce a mechanism that adaptively combines global statistics with item-based values based on data density level. We present a new adaptive framework that encapsulates various CF algorithms and the relationships among them. An adaptive CF predictor is developed that can self adapt from user-based to item-based to hybrid methods based on the amount of available ratings. Our experimental results show that the new predictor consistently obtained more accurate predictions than existing CF methods, with the most significant improvement on sparse data sets. When applied to the Netflix Challenge data set, our method performed better than existing CF and singular value decomposition (SVD) methods and achieved 4.67% improvement over Netflix’s system. PMID:21572924
A New Adaptive Framework for Collaborative Filtering Prediction.
Almosallam, Ibrahim A; Shang, Yi
2008-06-01
Collaborative filtering is one of the most successful techniques for recommendation systems and has been used in many commercial services provided by major companies including Amazon, TiVo and Netflix. In this paper we focus on memory-based collaborative filtering (CF). Existing CF techniques work well on dense data but poorly on sparse data. To address this weakness, we propose to use z-scores instead of explicit ratings and introduce a mechanism that adaptively combines global statistics with item-based values based on data density level. We present a new adaptive framework that encapsulates various CF algorithms and the relationships among them. An adaptive CF predictor is developed that can self adapt from user-based to item-based to hybrid methods based on the amount of available ratings. Our experimental results show that the new predictor consistently obtained more accurate predictions than existing CF methods, with the most significant improvement on sparse data sets. When applied to the Netflix Challenge data set, our method performed better than existing CF and singular value decomposition (SVD) methods and achieved 4.67% improvement over Netflix's system.
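A minimal sketch of the z-score idea described in the two abstracts above, assuming a small dense toy matrix: each user's ratings are normalized to z-scores, the target item's z-scores from other users are averaged, and the result is mapped back to the target user's scale. The adaptive combination of global and item-based statistics is omitted, and the data are invented.

```python
import numpy as np

def zscore_predict(R, user, item, eps=1e-8):
    """Predict a rating by z-scoring each user's ratings, averaging the item's z-scores
    over other users, and mapping back to the target user's scale.
    R is a users x items matrix with np.nan marking missing ratings."""
    mu = np.nanmean(R, axis=1)            # per-user mean
    sd = np.nanstd(R, axis=1) + eps       # per-user std (eps avoids division by zero)
    Z = (R - mu[:, None]) / sd[:, None]   # z-scored ratings (nan where unrated)
    z_item = Z[:, item]
    others = ~np.isnan(z_item)
    others[user] = False
    if not others.any():                  # fall back to the user's own mean if no data
        return mu[user]
    return mu[user] + sd[user] * np.mean(z_item[others])

# Toy 3-user x 3-item example; nan marks unrated items.
R = np.array([[5.0, 3.0, np.nan],
              [4.0, 2.0, 4.0],
              [2.0, 1.0, 2.0]])
print(zscore_predict(R, user=0, item=2))   # roughly 4.7 for this toy matrix
```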
A fuzzy optimal threshold technique for medical images
NASA Astrophysics Data System (ADS)
Thirupathi Kannan, Balaji; Krishnasamy, Krishnaveni; Pradeep Kumar Kenny, S.
2012-01-01
A new fuzzy-based thresholding method for medical images, especially cervical cytology images with blob and mosaic structures, is proposed in this paper. Many existing thresholding algorithms can segment either blob or mosaic images, but no single algorithm can do both. In this paper, an input cervical cytology image is binarized and preprocessed, and the pixel value with the minimum Fuzzy Gaussian Index is identified as the optimal threshold value and used for segmentation. The proposed technique is tested on various cervical cytology images with blob or mosaic structures, compared with several existing algorithms, and shown to outperform them.
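As a rough illustration, the sketch below scans candidate thresholds and keeps the one minimizing a Gaussian-membership fuzziness index. This generic index is a stand-in for the paper's Fuzzy Gaussian Index, which is not specified in the abstract; the synthetic bimodal data and the sigma value are assumptions.

```python
import numpy as np

def fuzzy_gaussian_index(image, t, sigma=20.0):
    """Fuzziness of threshold t: each pixel's membership to its own class mean is a
    Gaussian; fuzziness is 1 - membership, averaged over the image.
    This is a generic stand-in for the paper's Fuzzy Gaussian Index."""
    lo, hi = image[image <= t], image[image > t]
    if lo.size == 0 or hi.size == 0:
        return np.inf
    index = 0.0
    for region in (lo, hi):
        mu = region.mean()
        membership = np.exp(-((region - mu) ** 2) / (2.0 * sigma ** 2))
        index += np.sum(1.0 - membership)
    return index / image.size

def optimal_threshold(image):
    ts = np.arange(1, 255)
    return ts[np.argmin([fuzzy_gaussian_index(image, t) for t in ts])]

# Toy bimodal "image": dark blobs on a bright background (values are illustrative).
rng = np.random.default_rng(0)
img = np.concatenate([rng.normal(60, 10, 500), rng.normal(180, 15, 500)]).clip(0, 255)
t_opt = optimal_threshold(img)
segmented = img > t_opt
```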
A survey and taxonomy on energy efficient resource allocation techniques for cloud computing systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hameed, Abdul; Khoshkbarforoushha, Alireza; Ranjan, Rajiv
In a cloud computing paradigm, energy efficient allocation of different virtualized ICT resources (servers, storage disks, and networks, and the like) is a complex problem due to the presence of heterogeneous application (e.g., content delivery networks, MapReduce, web applications, and the like) workloads having contentious allocation requirements in terms of ICT resource capacities (e.g., network bandwidth, processing speed, response time, etc.). Several recent papers have tried to address the issue of improving energy efficiency in allocating cloud resources to applications with varying degree of success. However, to the best of our knowledge there is no published literature on this subject that clearly articulates the research problem and provides research taxonomy for succinct classification of existing techniques. Hence, the main aim of this paper is to identify open challenges associated with energy efficient resource allocation. In this regard, the study, first, outlines the problem and existing hardware and software-based techniques available for this purpose. Furthermore, available techniques already presented in the literature are summarized based on the energy-efficient research dimension taxonomy. The advantages and disadvantages of the existing techniques are comprehensively analyzed against the proposed research dimension taxonomy namely: resource adaption policy, objective function, allocation method, allocation operation, and interoperability.
Using Motion Planning to Determine the Existence of an Accessible Route in a CAD Environment
ERIC Educational Resources Information Center
Pan, Xiaoshan; Han, Charles S.; Law, Kincho H.
2010-01-01
We describe an algorithm based on motion-planning techniques to determine the existence of an accessible route through a facility for a wheeled mobility device. The algorithm is based on LaValle's work on rapidly exploring random trees and is enhanced to take into consideration the particularities of the accessible route domain. Specifically, the…
2006-03-31
from existing image steganography and steganalysis techniques, the overall objective of Task (b) is to design and implement audio steganography in...general design of the VoIP steganography algorithm is based on known LSB hiding techniques (used for example in StegHide (http...system. Nasir Memon et. al. described a steganalyzer based on image quality metrics [AMS03]. Basically, the main idea to detect steganography by
Choi, Kihwan; Li, Ruijiang; Nam, Haewon; Xing, Lei
2014-06-21
As a solution to iterative CT image reconstruction, first-order methods are prominent for their large-scale capability and fast convergence rate [Formula: see text]. In practice, the CT system matrix with a large condition number may lead to slow convergence speed despite the theoretically promising upper bound. The aim of this study is to develop a Fourier-based scaling technique to enhance the convergence speed of first-order methods applied to CT image reconstruction. Instead of working in the projection domain, we transform the projection data and construct a data fidelity model in Fourier space. Inspired by the filtered backprojection formalism, the data are appropriately weighted in Fourier space. We formulate an optimization problem based on weighted least-squares in the Fourier space and total-variation (TV) regularization in image space for parallel-beam, fan-beam and cone-beam CT geometry. To achieve the maximum computational speed, the optimization problem is solved using a fast iterative shrinkage-thresholding algorithm with backtracking line search and GPU implementation of projection/backprojection. The performance of the proposed algorithm is demonstrated through a series of digital simulation and experimental phantom studies. The results are compared with the existing TV regularized techniques based on statistics-based weighted least-squares as well as basic algebraic reconstruction technique. The proposed Fourier-based compressed sensing (CS) method significantly improves both the image quality and the convergence rate compared to the existing CS techniques.
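The following sketch shows the kind of FBP-inspired Fourier-space weighting the abstract alludes to: each projection is multiplied by a ramp (|frequency|) weight in the 1D Fourier domain along the detector axis. The paper's actual weighting scheme, system matrix, and FISTA/TV machinery are not reproduced here, and the sinogram is a random placeholder.

```python
import numpy as np

def ramp_weight_projections(sinogram):
    """Apply an FBP-style ramp (|frequency|) weight to each projection in Fourier space.
    sinogram: (n_angles, n_detectors). This illustrates the general idea of Fourier-domain
    weighting, not the authors' exact weighting scheme."""
    n = sinogram.shape[1]
    freqs = np.fft.fftfreq(n)                 # cycles per detector sample
    ramp = np.abs(freqs)                      # |omega| weight
    S = np.fft.fft(sinogram, axis=1)          # 1D FFT along the detector axis
    return np.real(np.fft.ifft(S * ramp, axis=1))

# A weighted least-squares data term in Fourier space then looks like
#   f(x) = 0.5 * || W^(1/2) (F A x - F b) ||^2,
# whose gradient A^T F^H W (F A x - F b) can drive a FISTA-type iteration with TV.
sino = np.random.rand(180, 256)               # placeholder projection data
weighted = ramp_weight_projections(sino)
```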
Locality-Aware CTA Clustering For Modern GPUs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Ang; Song, Shuaiwen; Liu, Weifeng
2017-04-08
In this paper, we proposed a novel clustering technique for tapping into the performance potential of a largely ignored type of locality: inter-CTA locality. We first demonstrated the capability of the existing GPU hardware to exploit such locality, both spatially and temporally, on L1 or L1/Tex unified cache. To verify the potential of this locality, we quantified its existence in a broad spectrum of applications and discussed its sources of origin. Based on these insights, we proposed the concept of CTA-Clustering and its associated software techniques. Finally, we evaluated these techniques on all modern generations of NVIDIA GPU architectures. The experimental results showed that our proposed clustering techniques could significantly improve on-chip cache performance.
Statistical approach for selection of biologically informative genes.
Das, Samarendra; Rai, Anil; Mishra, D C; Rai, Shesh N
2018-05-20
Selection of informative genes from high dimensional gene expression data has emerged as an important research area in genomics. Many gene selection techniques proposed so far are based on either a relevancy or a redundancy measure. Further, the performance of these techniques has been adjudged through post-selection classification accuracy computed with a classifier using the selected genes. This performance metric may be statistically sound but may not be biologically relevant. A statistical approach, i.e. Boot-MRMR, was proposed based on a composite measure of maximum relevance and minimum redundancy, which is both statistically sound and biologically relevant for informative gene selection. For comparative evaluation of the proposed approach, we developed two biologically sufficient criteria, i.e. Gene Set Enrichment with QTL (GSEQ) and a biological similarity score based on Gene Ontology (GO). Further, a systematic and rigorous evaluation of the proposed technique with 12 existing gene selection techniques was carried out using five gene expression datasets. This evaluation was based on a broad spectrum of statistically sound (e.g. subject classification) and biologically relevant (based on QTL and GO) criteria under a multiple criteria decision-making framework. The performance analysis showed that the proposed technique selects informative genes which are more biologically relevant. The proposed technique is also found to be quite competitive with the existing techniques with respect to subject classification and computational time. Our results also showed that under the multiple criteria decision-making setup, the proposed technique is best for informative gene selection over the available alternatives. Based on the proposed approach, an R package, BootMRMR, has been developed and is available at https://cran.r-project.org/web/packages/BootMRMR. This study will provide a practical guide to select statistical techniques for selecting informative genes from high dimensional expression data for breeding and system biology studies. Published by Elsevier B.V.
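To illustrate the relevance/redundancy trade-off underlying Boot-MRMR, the sketch below runs a single greedy mRMR pass using absolute Pearson correlation as a stand-in for the paper's statistics; the bootstrap layer and the GSEQ/GO evaluation criteria are omitted, and the expression matrix is simulated.

```python
import numpy as np

def mrmr_select(X, y, k):
    """Greedy max-relevance / min-redundancy gene selection using absolute Pearson
    correlation as a stand-in for the paper's relevance and redundancy statistics.
    X: samples x genes expression matrix, y: numeric class labels."""
    n_genes = X.shape[1]
    relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1] for j in range(n_genes)])
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        best, best_score = None, -np.inf
        for j in range(n_genes):
            if j in selected:
                continue
            redundancy = np.mean([abs(np.corrcoef(X[:, j], X[:, s])[0, 1]) for s in selected])
            score = relevance[j] - redundancy        # MRMR difference criterion
            if score > best_score:
                best, best_score = j, score
        selected.append(best)
    return selected

# Toy data: 30 samples x 50 genes, two classes (all values simulated).
rng = np.random.default_rng(1)
X = rng.normal(size=(30, 50))
y = rng.integers(0, 2, size=30)
X[:, 3] += 2.0 * y                                   # make gene 3 informative
print(mrmr_select(X, y, k=5))
```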
Macready, Anna L; Fallaize, Rosalind; Butler, Laurie T; Ellis, Judi A; Kuznesof, Sharron; Frewer, Lynn J; Celis-Morales, Carlos; Livingstone, Katherine M; Araújo-Soares, Vera; Fischer, Arnout Rh; Stewart-Knox, Barbara J; Mathers, John C; Lovegrove, Julie A
2018-04-09
To determine the efficacy of behavior change techniques applied in dietary and physical activity intervention studies, it is first necessary to record and describe techniques that have been used during such interventions. Published frameworks used in dietary and smoking cessation interventions undergo continuous development, and most are not adapted for Web-based delivery. The Food4Me study (N=1607) provided the opportunity to use existing frameworks to describe standardized Web-based techniques employed in a large-scale, internet-based intervention to change dietary behavior and physical activity. The aims of this study were (1) to describe techniques embedded in the Food4Me study design and explain the selection rationale and (2) to demonstrate the use of behavior change technique taxonomies, develop standard operating procedures for training, and identify strengths and limitations of the Food4Me framework that will inform its use in future studies. The 6-month randomized controlled trial took place simultaneously in seven European countries, with participants receiving one of four levels of personalized advice (generalized, intake-based, intake+phenotype-based, and intake+phenotype+gene-based). A three-phase approach was taken: (1) existing taxonomies were reviewed and techniques were identified a priori for possible inclusion in the Food4Me study, (2) a standard operating procedure was developed to maintain consistency in the use of methods and techniques across research centers, and (3) the Food4Me behavior change technique framework was reviewed and updated post intervention. An analysis of excluded techniques was also conducted. Of 46 techniques identified a priori as being applicable to Food4Me, 17 were embedded in the intervention design; 11 were from a dietary taxonomy, and 6 from a smoking cessation taxonomy. In addition, the four-category smoking cessation framework structure was adopted for clarity of communication. Smoking cessation texts were adapted for dietary use where necessary. A posteriori, a further 9 techniques were included. Examination of excluded items highlighted the distinction between techniques considered appropriate for face-to-face versus internet-based delivery. The use of existing taxonomies facilitated the description and standardization of techniques used in Food4Me. We recommend that for complex studies of this nature, technique analysis should be conducted a priori to develop standardized procedures and training and reviewed a posteriori to audit the techniques actually adopted. The present framework description makes a valuable contribution to future systematic reviews and meta-analyses that explore technique efficacy and underlying psychological constructs. This was a novel application of the behavior change taxonomies and was the first internet-based personalized nutrition intervention to use such a framework remotely. ClinicalTrials.gov NCT01530139; https://clinicaltrials.gov/ct2/show/NCT01530139 (Archived by WebCite at http://www.webcitation.org/6y8XYUft1). ©Anna L Macready, Rosalind Fallaize, Laurie T Butler, Judi A Ellis, Sharron Kuznesof, Lynn J Frewer, Carlos Celis-Morales, Katherine M Livingstone, Vera Araújo-Soares, Arnout RH Fischer, Barbara J Stewart-Knox, John C Mathers, Julie A Lovegrove. Originally published in JMIR Research Protocols (http://www.researchprotocols.org), 09.04.2018.
Image steganalysis using Artificial Bee Colony algorithm
NASA Astrophysics Data System (ADS)
Sajedi, Hedieh
2017-09-01
Steganography is the science of secure communication where the presence of the communication cannot be detected, while steganalysis is the art of discovering the existence of the secret communication. Processing a huge amount of information usually takes extensive execution time and computational resources. As a result, a preprocessing phase is needed that can reduce the execution time and computational resources required. In this paper, we propose a new feature-based blind steganalysis method for detecting stego images from the cover (clean) images with JPEG format. In this regard, we present a feature selection technique based on an improved Artificial Bee Colony (ABC). The ABC algorithm is inspired by honeybees' social behaviour in their search for perfect food sources. In the proposed method, classifier performance and the dimension of the selected feature vector depend on using wrapper-based methods. The experiments are performed using two large data-sets of JPEG images. Experimental results demonstrate the effectiveness of the proposed steganalysis technique compared to the other existing techniques.
NASA Astrophysics Data System (ADS)
Binol, Hamidullah; Bal, Abdullah; Cukur, Huseyin
2015-10-01
The performance of kernel based techniques depends on the selection of kernel parameters. Therefore, suitable parameter selection is an important problem for many kernel based techniques. This article presents a novel technique to learn the kernel parameters in a kernel Fukunaga-Koontz Transform (KFKT) based classifier. The proposed approach determines the appropriate values of kernel parameters through optimizing an objective function constructed based on the discrimination ability of KFKT. For this purpose we have utilized the differential evolution algorithm (DEA). The new technique overcomes some disadvantages, such as the high time consumption of the traditional cross-validation method, and it can be applied to any type of data. The experiments for target detection applications on hyperspectral images verify the effectiveness of the proposed method.
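A hedged sketch of the general recipe described above: differential evolution searches for kernel parameters that maximize a discrimination-based objective. Here an RBF-kernel SVC and cross-validated accuracy stand in for the KFKT classifier and its discrimination measure, and the dataset, parameter bounds, and iteration budget are arbitrary assumptions.

```python
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Stand-in data and classifier: an RBF-kernel SVC replaces the KFKT classifier, and
# cross-validated accuracy replaces the paper's discrimination-based objective.
X, y = make_classification(n_samples=300, n_features=20, random_state=0)

def objective(params):
    log_gamma, log_C = params
    clf = SVC(kernel="rbf", gamma=10.0 ** log_gamma, C=10.0 ** log_C)
    # Differential evolution minimizes, so return the negative CV accuracy.
    return -cross_val_score(clf, X, y, cv=5).mean()

result = differential_evolution(objective, bounds=[(-4, 1), (-2, 3)],
                                maxiter=20, tol=1e-3, seed=0)
best_gamma, best_C = 10.0 ** result.x[0], 10.0 ** result.x[1]
print(best_gamma, best_C, -result.fun)
```

Compared with a grid search over cross-validation folds, the evolutionary search explores the continuous parameter space directly, which is the advantage the abstract claims over traditional cross-validation-based tuning.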
Scalable graphene production: perspectives and challenges of plasma applications
NASA Astrophysics Data System (ADS)
Levchenko, Igor; Ostrikov, Kostya (Ken); Zheng, Jie; Li, Xingguo; Keidar, Michael; B. K. Teo, Kenneth
2016-05-01
Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling of existing graphene production techniques to the industrial level without compromising its properties is a current challenge. This article focuses on the perspectives and challenges of scalability, equipment, and technological perspectives of the plasma-based techniques which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable for scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in a tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential of scalability for plasma-based technologies, based on the scaling-related process characteristics. Among other processes, a greater yield of 1 g × h-1 m-2 was reached for the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to the neutral-gas based methods. Selected plasma-based techniques show lower energy consumption than in thermal CVD processes, and the ability to produce graphene flakes of various sizes reaching hundreds of square millimetres, and the thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes could potentially lead to better scalability, flexibility and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.
Scalable graphene production: perspectives and challenges of plasma applications.
Levchenko, Igor; Ostrikov, Kostya Ken; Zheng, Jie; Li, Xingguo; Keidar, Michael; B K Teo, Kenneth
2016-05-19
Graphene, a newly discovered and extensively investigated material, has many unique and extraordinary properties which promise major technological advances in fields ranging from electronics to mechanical engineering and food production. Unfortunately, complex techniques and high production costs hinder commonplace applications. Scaling of existing graphene production techniques to the industrial level without compromising its properties is a current challenge. This article focuses on the perspectives and challenges of scalability, equipment, and technological perspectives of the plasma-based techniques which offer many unique possibilities for the synthesis of graphene and graphene-containing products. The plasma-based processes are amenable for scaling and could also be useful to enhance the controllability of the conventional chemical vapour deposition method and some other techniques, and to ensure a good quality of the produced graphene. We examine the unique features of the plasma-enhanced graphene production approaches, including the techniques based on inductively-coupled and arc discharges, in the context of their potential scaling to mass production following the generic scaling approaches applicable to the existing processes and systems. This work analyses a large amount of the recent literature on graphene production by various techniques and summarizes the results in a tabular form to provide a simple and convenient comparison of several available techniques. Our analysis reveals a significant potential of scalability for plasma-based technologies, based on the scaling-related process characteristics. Among other processes, a greater yield of 1 g × h(-1) m(-2) was reached for the arc discharge technology, whereas the other plasma-based techniques show process yields comparable to the neutral-gas based methods. Selected plasma-based techniques show lower energy consumption than in thermal CVD processes, and the ability to produce graphene flakes of various sizes reaching hundreds of square millimetres, and the thickness varying from a monolayer to 10-20 layers. Additional factors such as electrical voltage and current, not available in thermal CVD processes could potentially lead to better scalability, flexibility and control of the plasma-based processes. Advantages and disadvantages of various systems are also considered.
NASA Astrophysics Data System (ADS)
Rajshekhar, G.; Gorthi, Sai Siva; Rastogi, Pramod
2010-04-01
For phase estimation in digital holographic interferometry, a high-order instantaneous moments (HIM) based method was recently developed which relies on piecewise polynomial approximation of phase and subsequent evaluation of the polynomial coefficients using the HIM operator. A crucial step in the method is mapping the polynomial coefficient estimation to single-tone frequency determination for which various techniques exist. The paper presents a comparative analysis of the performance of the HIM operator based method in using different single-tone frequency estimation techniques for phase estimation. The analysis is supplemented by simulation results.
Hardware implementation of hierarchical volume subdivision-based elastic registration.
Dandekar, Omkar; Walimbe, Vivek; Shekhar, Raj
2006-01-01
Real-time, elastic and fully automated 3D image registration is critical to the efficiency and effectiveness of many image-guided diagnostic and treatment procedures relying on multimodality image fusion or serial image comparison. True, real-time performance will make many 3D image registration-based techniques clinically viable. Hierarchical volume subdivision-based image registration techniques are inherently faster than most elastic registration techniques, e.g. free-form deformation (FFD)-based techniques, and are more amenable for achieving real-time performance through hardware acceleration. Our group has previously reported an FPGA-based architecture for accelerating FFD-based image registration. In this article we show how our existing architecture can be adapted to support hierarchical volume subdivision-based image registration. A proof-of-concept implementation of the architecture achieved speedups of 100 for elastic registration against an optimized software implementation on a 3.2 GHz Pentium III Xeon workstation. Due to inherent parallel nature of the hierarchical volume subdivision-based image registration techniques further speedup can be achieved by using several computing modules in parallel.
A Structural and Content-Based Analysis for Web Filtering.
ERIC Educational Resources Information Center
Lee, P. Y.; Hui, S. C.; Fong, A. C. M.
2003-01-01
Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)
Technique for Early Reliability Prediction of Software Components Using Behaviour Models
Ali, Awad; N. A. Jawawi, Dayang; Adham Isa, Mohd; Imran Babar, Muhammad
2016-01-01
Behaviour models are the most commonly used input for predicting the reliability of a software system at the early design stage. A component behaviour model reveals the structure and behaviour of the component during the execution of system-level functionalities. There are various challenges related to component reliability prediction at the early design stage based on behaviour models. For example, most of the current reliability techniques do not provide fine-grained sequential behaviour models of individual components and fail to consider the loop entry and exit points in the reliability computation. Moreover, some of the current techniques do not tackle the problem of operational data unavailability and the lack of analysis results that can be valuable for software architects at the early design stage. This paper proposes a reliability prediction technique that, pragmatically, synthesizes system behaviour in the form of a state machine, given a set of scenarios and corresponding constraints as input. The state machine is utilized as a base for generating the component-relevant operational data. The state machine is also used as a source for identifying the nodes and edges of a component probabilistic dependency graph (CPDG). Based on the CPDG, a stack-based algorithm is used to compute the reliability. The proposed technique is evaluated by a comparison with existing techniques and the application of sensitivity analysis to a robotic wheelchair system as a case study. The results indicate that the proposed technique is more relevant at the early design stage compared to existing works, and can provide a more realistic and meaningful prediction. PMID:27668748
Cost-sensitive AdaBoost algorithm for ordinal regression based on extreme learning machine.
Riccardi, Annalisa; Fernández-Navarro, Francisco; Carloni, Sante
2014-10-01
In this paper, the well known stagewise additive modeling using a multiclass exponential (SAMME) boosting algorithm is extended to address problems where there exists a natural order in the targets using a cost-sensitive approach. The proposed ensemble model uses an extreme learning machine (ELM) model as a base classifier (with the Gaussian kernel and the additional regularization parameter). The closed form of the derived weighted least squares problem is provided, and it is employed to estimate analytically the parameters connecting the hidden layer to the output layer at each iteration of the boosting algorithm. Compared to the state-of-the-art boosting algorithms, in particular those using ELM as base classifier, the suggested technique does not require the generation of a new training dataset at each iteration. The adoption of the weighted least squares formulation of the problem has been presented as an unbiased and alternative approach to the already existing ELM boosting techniques. Moreover, the addition of a cost model for weighting the patterns, according to the order of the targets, enables the classifier to tackle ordinal regression problems further. The proposed method has been validated by an experimental study by comparing it with already existing ensemble methods and ELM techniques for ordinal regression, showing competitive results.
Measurement of water pressure and deformation with time domain reflectometry cables
NASA Astrophysics Data System (ADS)
Dowding, Charles H.; Pierce, Charles E.
1995-05-01
Time domain reflectometry (TDR) techniques can be deployed to measure water pressures and relative dam abutment displacement with an array of coaxial cables either drilled and grouted or retrofitted through existing passages. Application of TDR to dam monitoring requires determination of appropriate cable types and methods to install these cables in existing dams or during new construction. This paper briefly discusses currently applied and developing TDR techniques and describes initial design considerations for TDR-based dam instrumentation. Water pressure at the base of or within the dam can be determined by measuring the water level within a hollow or air-filled coaxial cable. The ability to retrofit existing porous stone-tipped piezometers is an attractive attribute of the TDR system. Measurement of relative lateral movement can be accomplished by monitoring local shearing of a solid polyethylene-filled coaxial cable at the interface of the dam base and foundation materials or along adversely oriented joints. Uplift can be recorded by measuring cable extension as the dam displaces upward off its foundation. Since each monitoring technique requires measurements with different types of coaxial cables, a variety may be installed within the array. Multiplexing of these cables will allow monitoring from a single pulser, and measurements can be recorded on site or remotely via a modem at any time.
Weighted image de-fogging using luminance dark prior
NASA Astrophysics Data System (ADS)
Kansal, Isha; Kasana, Singara Singh
2017-10-01
In this work, the weighted image de-fogging process based upon dark channel prior is modified by using a luminance dark prior. The dark channel prior estimates the transmission by using three colour channels, whereas the luminance dark prior does the same by making use of only the Y component of the YUV colour space. For each pixel in a local patch, the luminance dark prior therefore operates on a single channel rather than the three channels used in the DCP technique, which speeds up the de-fogging process. To estimate the transmission map, a weighted approach based upon a difference prior is used, which mitigates halo artefacts at the time of transmission estimation. The major drawback of the weighted technique is that it does not maintain the constancy of the transmission in a local patch even if there are no significant depth disruptions, due to which the de-fogged image looks over-smooth and has low contrast. Apart from this, in some images the weighted transmission still carries less visible halo artefacts. Therefore, a Gaussian filter is used to blur the estimated weighted transmission map, which enhances the contrast of de-fogged images. In addition to this, a novel approach is proposed to remove the pixels belonging to bright light source(s) during the atmospheric light estimation process, based upon the histogram of the YUV colour space. To show its effectiveness, the proposed technique is compared with existing techniques. This comparison shows that the proposed technique performs better than the existing techniques.
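A rough sketch of the luminance dark prior step, under the assumption that the transmission follows the usual DCP formula t = 1 - ω·dark/A with the dark map taken over the Y channel only; the patch size, ω and the crude atmospheric-light estimate are illustrative, not the paper's exact choices.

```python
import numpy as np
from scipy.ndimage import minimum_filter

# Sketch of transmission estimation with a luminance dark prior: the dark map is a
# patch-wise minimum over the Y (luminance) channel only, then plugged into the
# usual DCP transmission formula t = 1 - omega * dark / A.

def transmission_from_luminance(rgb, patch=15, omega=0.95):
    rgb = rgb.astype(np.float64)
    y = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]  # Y of YUV
    dark = minimum_filter(y, size=patch)          # single-channel dark map
    A = np.percentile(y, 99.9)                    # crude atmospheric light estimate
    return np.clip(1.0 - omega * dark / A, 0.1, 1.0)

foggy = np.random.rand(120, 160, 3)               # stand-in for a foggy image
t = transmission_from_luminance(foggy)
```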
Study of TEC and foF2 with the Help of GPS and Ionosonde Data over Maitri, Antarctica
NASA Astrophysics Data System (ADS)
Khatarkar, Prakash; Bhaware, Purusottam; Mansoori, Azad Ahmad; Kachneria, Varsha; Thakur, Shweta; Gwal, A. K.
The behavior of the ionosphere can be diagnosed by a number of techniques. The common techniques used are the space-based Global Positioning System and the ground-based Ionosonde. We have compared the variability of ionospheric parameters using two different techniques, GPS and Ionosonde, from December 2009 to November 2010 at the Indian base station Maitri (11.45E, 70.45S). The comparison between the measurements of the two techniques was realized through the Total Electron Content (TEC) parameters derived by using different methods. The comparison covered diurnal, seasonal, polar-day, polar-night, and annual variations. From our analysis we found that a strong correlation exists between GPS-derived TEC and Ionosonde-derived foF2 during the day, while during the night the correlation is insignificant. We also found a strong correlation between Ionosonde-derived and GPS-derived TEC. The pattern of variation of the ionospheric parameters derived from the two techniques is strikingly similar, indicating a high degree of synchronization between them. This has practical applicability, as the error in one technique can be estimated by comparison with the other. Keywords: Ionosphere, Ionosonde, GPS, foF2, TEC.
Visibility enhancement of color images using Type-II fuzzy membership function
NASA Astrophysics Data System (ADS)
Singh, Harmandeep; Khehra, Baljit Singh
2018-04-01
Images taken in poor environmental conditions suffer from reduced visibility and obscured detail. Therefore, image enhancement techniques are necessary for improving the significant details of these images. An extensive review has shown that histogram-based enhancement techniques greatly suffer from over/under-enhancement issues, while fuzzy-based enhancement techniques suffer from over/under-saturated pixel problems. In this paper, a novel Type-II fuzzy-based image enhancement technique has been proposed for improving the visibility of images. The Type-II fuzzy logic can automatically extract the local atmospheric light and roughly eliminate the atmospheric veil in local detail enhancement. The proposed technique has been evaluated on 10 well-known weather-degraded color images and is also compared with four well-known existing image enhancement techniques. The experimental results reveal that the proposed technique outperforms the others regarding visible edge ratio, color gradients and number of saturated pixels.
Study of synthesis techniques for insensitive aircraft control systems
NASA Technical Reports Server (NTRS)
Harvey, C. A.; Pope, R. E.
1977-01-01
Insensitive flight control system design criteria were defined in terms of maximizing performance (handling qualities, RMS gust response, transient response, stability margins) over a defined parameter range. Wing load alleviation for the C-5A was chosen as a design problem. The C-5A model was a 79-state, two-control structure with uncertainties assumed to exist in dynamic pressure, structural damping and frequency, and the stability derivative, M sub w. Five new techniques (mismatch estimation, uncertainty weighting, finite dimensional inverse, maximum difficulty, dual Lyapunov) were developed. Six existing techniques (additive noise, minimax, multiplant, sensitivity vector augmentation, state dependent noise, residualization) and the mismatch estimation and uncertainty weighting techniques were synthesized and evaluated on the design example. Evaluation and comparison of these eight techniques indicated that the minimax and the uncertainty weighting techniques were superior to the other six, and of these two, uncertainty weighting has lower computational requirements. Techniques based on the three remaining new concepts appear promising and are recommended for further research.
NASA Astrophysics Data System (ADS)
Avetisyan, H.; Bruna, O.; Holub, J.
2016-11-01
Numerous techniques and algorithms are dedicated to extracting emotions from input data. In our investigation, emotion-detection approaches were classified into the following three types: keyword-based/lexical-based, learning-based, and hybrid. The most commonly used techniques, such as the keyword-spotting method, Support Vector Machines, the Naïve Bayes Classifier, Hidden Markov Models and hybrid algorithms, have achieved impressive results in this area and can reach more than 90% detection accuracy.
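A toy keyword-spotting classifier, in the spirit of the lexical approaches listed above; the lexicon and scoring rule are illustrative assumptions rather than those of any cited system.

```python
# Toy keyword-spotting emotion detector: count lexicon hits per emotion and
# return the emotion with the most matches, or "neutral" if none match.

LEXICON = {
    "joy": {"happy", "glad", "delighted", "love"},
    "anger": {"angry", "furious", "hate", "annoyed"},
    "sadness": {"sad", "unhappy", "miserable", "cry"},
}

def detect_emotion(text):
    tokens = set(text.lower().split())
    scores = {emo: len(tokens & words) for emo, words in LEXICON.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "neutral"

print(detect_emotion("I am so happy and delighted today"))   # -> joy
```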
Sissons, B; Gray, W A; Bater, A; Morrey, D
2007-03-01
The vision of evidence-based medicine is that of experienced clinicians systematically using the best research evidence to meet the individual patient's needs. This vision remains distant from clinical reality, as no complete methodology exists to apply objective, population-based research evidence to the needs of an individual real-world patient. We describe an approach, based on techniques from machine learning, to bridge this gap between evidence and individual patients in oncology. We examine existing proposals for tackling this gap and the relative benefits and challenges of our proposed, k-nearest-neighbour-based, approach.
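A minimal sketch of the k-nearest-neighbour idea, assuming each past patient is summarized by a small numeric feature vector and an observed outcome; the features, outcomes and value of k are illustrative.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Sketch of k-NN evidence matching: represent past patients by feature vectors and
# observed outcomes, then look up the k most similar past patients for a new case.

past_patients = np.array([[62, 1, 3.1], [55, 0, 2.4], [70, 1, 4.0], [48, 0, 1.9]])
outcomes = np.array([0, 1, 0, 1])          # e.g. responded to therapy: yes/no

nn = NearestNeighbors(n_neighbors=3).fit(past_patients)
new_case = np.array([[60, 1, 3.0]])
dist, idx = nn.kneighbors(new_case)
predicted = outcomes[idx[0]].mean()        # fraction of similar patients who responded
```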
Block sparsity-based joint compressed sensing recovery of multi-channel ECG signals.
Singh, Anurag; Dandapat, Samarendra
2017-04-01
In recent years, compressed sensing (CS) has emerged as an effective alternative to conventional wavelet based data compression techniques. This is due to its simple and energy-efficient data reduction procedure, which makes it suitable for resource-constrained wireless body area network (WBAN)-enabled electrocardiogram (ECG) telemonitoring applications. Both spatial and temporal correlations exist simultaneously in multi-channel ECG (MECG) signals. Exploitation of both types of correlations is very important in CS-based ECG telemonitoring systems for better performance. However, most of the existing CS-based works exploit either of the correlations, which results in a suboptimal performance. In this work, within a CS framework, the authors propose to exploit both types of correlations simultaneously using a sparse Bayesian learning-based approach. A spatiotemporal sparse model is employed for joint compression/reconstruction of MECG signals. Discrete wavelet transform domain block sparsity of MECG signals is exploited for simultaneous reconstruction of all the channels. Performance evaluations using the Physikalisch-Technische Bundesanstalt MECG diagnostic database show a significant gain in the diagnostic reconstruction quality of the MECG signals compared with state-of-the-art techniques at a reduced number of measurements. The low measurement requirement may lead to significant savings in the energy cost of existing CS-based WBAN systems.
Innovative research on the group teaching mode based on the LabVIEW virtual environment
NASA Astrophysics Data System (ADS)
Liang, Pei; Huang, Jie; Gong, Hua-ping; Dong, Qian-min; Dong, Yan-yan; Sun, Cai-xia
2017-08-01
This paper discusses the widely existing mismatch between the increasing demand for professional engineers in the electronic science major and the currently outdated teaching mode. Taking the specialized course "Virtual Instrument technique and LABVIEW programming" as an example, we explore a new group-teaching mode based on the Virtual Instrument technique; the specific measures, implementation procedures, and effects of this teaching mode are summarized at the end.
Geometric prepatterning-based tuning of the period doubling onset strain during thin-film wrinkling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saha, Sourabh K.
Wrinkling of thin films is an easy-to-implement and low-cost technique to fabricate stretch-tunable periodic micro and nanoscale structures. However, the tunability of such structures is often limited by the emergence of an undesirable period-doubled mode at high strains. Predictively tuning the onset strain for period doubling via existing techniques requires one to have extensive knowledge about the nonlinear pattern formation behavior. Herein, a geometric prepatterning-based technique is introduced that can be implemented even with limited system knowledge to predictively delay period doubling. The technique comprises prepatterning the film/base bilayer with a sinusoidal pattern that has the same period as the natural period of the system. This technique has been verified via physical and computational experiments on the polydimethylsiloxane (PDMS)/glass bilayer system. It is observed that the onset strain can be increased from the typical value of 20% for flat films to greater than 30% with a modest prepattern aspect ratio (2·amplitude/period) of 0.15. In addition, finite element simulations reveal that (i) the onset strain increases with increasing prepattern amplitude and (ii) the delaying effect can be captured entirely by the prepattern geometry. Therefore, one can implement this technique even with limited system knowledge, such as material properties or film thickness, by simply replicating pre-existing wrinkled patterns to generate prepatterned bilayers. Furthermore, geometric prepatterning is a practical scheme to increase the operating range of stretch-tunable wrinkle-based devices by at least 50%.
Geometric prepatterning-based tuning of the period doubling onset strain during thin-film wrinkling
Saha, Sourabh K.
2017-04-05
Wrinkling of thin films is an easy-to-implement and low-cost technique to fabricate stretch-tunable periodic micro and nanoscale structures. However, the tunability of such structures is often limited by the emergence of an undesirable period-doubled mode at high strains. Predictively tuning the onset strain for period doubling via existing techniques requires one to have extensive knowledge about the nonlinear pattern formation behavior. Herein, a geometric prepatterning-based technique is introduced that can be implemented even with limited system knowledge to predictively delay period doubling. The technique comprises prepatterning the film/base bilayer with a sinusoidal pattern that has the same period as the natural period of the system. This technique has been verified via physical and computational experiments on the polydimethylsiloxane (PDMS)/glass bilayer system. It is observed that the onset strain can be increased from the typical value of 20% for flat films to greater than 30% with a modest prepattern aspect ratio (2·amplitude/period) of 0.15. In addition, finite element simulations reveal that (i) the onset strain increases with increasing prepattern amplitude and (ii) the delaying effect can be captured entirely by the prepattern geometry. Therefore, one can implement this technique even with limited system knowledge, such as material properties or film thickness, by simply replicating pre-existing wrinkled patterns to generate prepatterned bilayers. Furthermore, geometric prepatterning is a practical scheme to increase the operating range of stretch-tunable wrinkle-based devices by at least 50%.
A Different Web-Based Geocoding Service Using Fuzzy Techniques
NASA Astrophysics Data System (ADS)
Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.
2015-12-01
Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps present geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concept of proximity and nearness is not modelled appropriately, and these services search only by address matching based on descriptive data. In addition, there are also some limitations in displaying search results. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating a fuzzy technique with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, search for places based on their location, represent results as non-point features, and display search results based on their priority.
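The fuzzy part of the workflow can be sketched as follows, assuming a simple linear "nearness" membership and a min-operator fuzzy overlay; the membership function, decay distance and distance values are illustrative stand-ins for the paper's fuzzy distance maps.

```python
import numpy as np

# Sketch of fuzzy geocoding: a fuzzy "nearness" membership is computed for each
# candidate location with respect to each reference place, and the resulting fuzzy
# distance maps are combined with a fuzzy overlay (here a simple min operator).

def nearness(dist_m, d0=500.0):
    """Membership in 'near': 1 at distance 0, decaying to 0 beyond ~d0 metres."""
    return np.clip(1.0 - dist_m / d0, 0.0, 1.0)

# distances (metres) from three candidate locations to two named places
dist_to_place = np.array([[120.0, 800.0],
                          [300.0, 250.0],
                          [900.0, 100.0]])

memberships = nearness(dist_to_place)          # one fuzzy distance map per place
overlay = memberships.min(axis=1)              # fuzzy AND across places
ranking = np.argsort(-overlay)                 # candidates ordered by priority
```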
NASA Technical Reports Server (NTRS)
Keuper, H. R.; Peplies, R. W.; Gillooly, R. P.
1977-01-01
The use of machine scanning and/or computer-based techniques to provide greater objectivity in the photomorphic approach was investigated. Photomorphic analysis and its application in regional planning are discussed. Topics included: delineation of photomorphic regions; inadequacies of existing classification systems; tonal and textural characteristics and signature analysis techniques; pattern recognition and Fourier transform analysis; and optical experiments. A bibliography is included.
Empirical likelihood-based confidence intervals for mean medical cost with censored data.
Jeyarajah, Jenny; Qin, Gengsheng
2017-11-10
In this paper, we propose empirical likelihood methods based on influence function and jackknife techniques for constructing confidence intervals for mean medical cost with censored data. We conduct a simulation study to compare the coverage probabilities and interval lengths of our proposed confidence intervals with that of the existing normal approximation-based confidence intervals and bootstrap confidence intervals. The proposed methods have better finite-sample performances than existing methods. Finally, we illustrate our proposed methods with a relevant example. Copyright © 2017 John Wiley & Sons, Ltd.
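For intuition, a much-simplified sketch of a jackknife confidence interval for a mean is given below; it ignores the censoring and the influence-function empirical-likelihood machinery that the paper actually addresses, and the cost values are illustrative.

```python
import numpy as np
from scipy import stats

# Simplified jackknife confidence interval for a mean medical cost (no censoring).

costs = np.array([1200.0, 950.0, 4300.0, 780.0, 2100.0, 3600.0, 1500.0])
n = len(costs)
theta_hat = costs.mean()

# leave-one-out estimates and the jackknife standard error
loo = np.array([np.delete(costs, i).mean() for i in range(n)])
se_jack = np.sqrt((n - 1) / n * np.sum((loo - loo.mean()) ** 2))

z = stats.norm.ppf(0.975)
ci = (theta_hat - z * se_jack, theta_hat + z * se_jack)
print(ci)
```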
Ravi, Logesh; Vairavasundaram, Subramaniyaswamy
2016-01-01
Rapid growth of the web and its applications has created a colossal importance for recommender systems. Being applied in various domains, recommender systems were designed to generate suggestions such as items or services based on user interests. However, recommender systems experience many issues that diminish their effectiveness. Integrating powerful data management techniques into recommender systems can address such issues, and the recommendation quality can be increased significantly. Recent research on recommender systems reveals the idea of utilizing social network data to enhance traditional recommender systems with better prediction and improved accuracy. This paper expresses views on social network data based recommender systems by considering the usage of various recommendation algorithms, functionalities of systems, different types of interfaces, filtering techniques, and artificial intelligence techniques. After examining the depths of objectives, methodologies, and data sources of the existing models, the paper helps anyone interested in the development of travel recommendation systems and facilitates future research directions. We have also proposed a location recommendation system based on a social pertinent trust walker (SPTW) and compared the results with the existing baseline random walk models. Later, we have enhanced the SPTW model for group-of-users recommendations. The results obtained from the experiments have been presented.
Propagation-based x-ray phase contrast imaging using an iterative phase diversity technique
NASA Astrophysics Data System (ADS)
Carroll, Aidan J.; van Riessen, Grant A.; Balaur, Eugeniu; Dolbnya, Igor P.; Tran, Giang N.; Peele, Andrew G.
2018-03-01
Through the use of a phase diversity technique, we demonstrate a near-field in-line x-ray phase contrast algorithm that provides improved object reconstruction when compared to our previous iterative methods for a homogeneous sample. Like our previous methods, the new technique uses the sample refractive index distribution during the reconstruction process. The technique complements existing monochromatic and polychromatic methods and is useful in situations where experimental phase contrast data is affected by noise.
MEMS-based power generation techniques for implantable biosensing applications.
Lueke, Jonathan; Moussa, Walied A
2011-01-01
Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient.
Gundupalli, Sathish Paulraj; Hait, Subrata; Thakur, Atul
2017-12-01
There has been a significant rise in municipal solid waste (MSW) generation in the last few decades due to rapid urbanization and industrialization. Due to the lack of source segregation practice, a need for automated segregation of recyclables from MSW exists in the developing countries. This paper reports a thermal imaging based system for classifying useful recyclables from simulated MSW sample. Experimental results have demonstrated the possibility to use thermal imaging technique for classification and a robotic system for sorting of recyclables in a single process step. The reported classification system yields an accuracy in the range of 85-96% and is comparable with the existing single-material recyclable classification techniques. We believe that the reported thermal imaging based system can emerge as a viable and inexpensive large-scale classification-cum-sorting technology in recycling plants for processing MSW in developing countries. Copyright © 2017 Elsevier Ltd. All rights reserved.
A Survey of UML Based Regression Testing
NASA Astrophysics Data System (ADS)
Fahad, Muhammad; Nadeem, Aamer
Regression testing is the process of ensuring software quality by analyzing whether changed parts behave as intended, and unchanged parts are not affected by the modifications. Since it is a costly process, many techniques have been proposed in the research literature that guide testers in building a regression test suite from an existing test suite at minimum cost. In this paper, we discuss the advantages and drawbacks of using UML diagrams for regression testing and show that a UML model helps in identifying changes for regression test selection effectively. We survey the existing UML based regression testing techniques and provide an analysis matrix to give a quick insight into prominent features of the literature. We discuss open research issues, such as managing and reducing the size of the regression test suite and prioritizing test cases under tight schedules and resources, that remain to be addressed for UML based regression testing.
NASA Technical Reports Server (NTRS)
Sheffner, E. J.; Hlavka, C. A.; Bauer, E. M.
1984-01-01
Two techniques have been developed for the mapping and area estimation of small grains in California from Landsat digital data. The two techniques are Band Ratio Thresholding, a semi-automated version of a manual procedure, and LCLS, a layered classification technique which can be fully automated and is based on established clustering and classification technology. Preliminary evaluation results indicate that the two techniques have potential for providing map products which can be incorporated into existing inventory procedures and automated alternatives to traditional inventory techniques and those which currently employ Landsat imagery.
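The Band Ratio Thresholding idea can be sketched as a per-pixel band ratio compared against an analyst-chosen threshold; the band choice, threshold and pixel size below are illustrative assumptions, not the study's settings.

```python
import numpy as np

# Sketch of band ratio thresholding: compute a per-pixel ratio of two Landsat bands,
# flag pixels above a threshold as small grains, and convert the count to area.

band_nir = np.random.rand(200, 200).astype(np.float32)     # stand-in for a NIR band
band_red = np.random.rand(200, 200).astype(np.float32) + 0.1

ratio = band_nir / band_red
small_grains = ratio > 1.3                                  # threshold chosen by analyst

pixel_area_ha = (30 * 30) / 10_000.0                        # assumed 30 m pixels
area_estimate_ha = small_grains.sum() * pixel_area_ha
```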
Identification of Extraterrestrial Microbiology
NASA Technical Reports Server (NTRS)
Flynn, Michael; Rasky, Daniel J. (Technical Monitor)
1998-01-01
Many of the key questions addressed in the field of Astrobiology are based upon the assumption that life exists, or at one time existed, in locations throughout the universe. However, this assumption is just that, an assumption. No definitive proof exists. On Earth, life has been found to exist in many diverse environments. We believe that this tendency towards diversity supports the assumption that life could exist throughout the universe. This paper provides a summary of several innovative techniques for the detection of extraterrestrial life forms. The primary questions addressed are: does life currently exist beyond Earth and, if it does, is that life evolutionarily related to life on Earth?
2014-01-01
The linear algebraic concept of subspace plays a significant role in recent techniques of spectrum estimation. In this article, the authors have utilized the noise subspace concept for finding hidden periodicities in DNA sequences. With the vast growth of genomic sequences, the demand to accurately identify the protein-coding regions in DNA is rising. Several techniques of DNA feature extraction involving various cross fields have come up in the recent past, among which the application of digital signal processing tools is of prime importance. It is known that coding segments have a 3-base periodicity, while non-coding regions do not have this unique feature. One of the most important spectrum analysis techniques based on the concept of subspace is the least-norm method. The least-norm estimator developed in this paper shows sharp period-3 peaks in coding regions, completely eliminating background noise. A comparison of the proposed method with the existing sliding discrete Fourier transform (SDFT) method, popularly known as the modified periodogram method, has been drawn on several genes from various organisms, and the results show that the proposed method provides a better and more effective approach to gene prediction. Resolution, quality factor, sensitivity, specificity, miss rate, and wrong rate are used to establish the superiority of the least-norm gene prediction method over the existing method. PMID:24386895
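For reference, the baseline period-3 measure that the paper compares against (a sliding DFT / modified periodogram at frequency 2π/3 over binary indicator sequences) can be sketched as below; the window length and sequence are illustrative, and the paper's least-norm estimator replaces this periodogram step.

```python
import numpy as np

# Sketch of the period-3 sliding DFT measure: map the DNA string to four binary
# indicator sequences, evaluate the DFT at frequency 2*pi/3 over a sliding window,
# and sum the squared magnitudes; coding regions show elevated scores.

def period3_spectrum(seq, window=60):
    seq = seq.upper()
    indicators = {b: np.array([1.0 if c == b else 0.0 for c in seq]) for b in "ACGT"}
    w = np.exp(-2j * np.pi / 3 * np.arange(window))
    scores = []
    for start in range(len(seq) - window + 1):
        s = sum(abs(np.dot(indicators[b][start:start + window], w)) ** 2 for b in "ACGT")
        scores.append(s)
    return np.array(scores)

scores = period3_spectrum("ATGGCGGCAGCGGCAATG" * 10)
```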
Usability-driven pruning of large ontologies: the case of SNOMED CT.
López-García, Pablo; Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan
2012-06-01
To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Graph-traversal heuristics provided high coverage (71-96% of terms in the test sets of discharge summaries) at the expense of subset size (17-51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24-55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available.
Sim, Kok Swee; NorHisham, Syafiq
2016-11-01
A technique based on a linear Least Squares Regression (LSR) model is applied to estimate the signal-to-noise ratio (SNR) of scanning electron microscope (SEM) images. In order to test the accuracy of this technique on SNR estimation, a number of SEM images are initially corrupted with white noise. The autocorrelation functions (ACF) of the original and the corrupted SEM images are formed to serve as the reference point to estimate the SNR value of the corrupted image. The LSR technique is then compared with three previous existing techniques known as nearest neighbourhood, first-order interpolation, and the combination of both nearest neighbourhood and first-order interpolation. The actual and the estimated SNR values of all these techniques are then calculated for comparison purposes. It is shown that the LSR technique is able to attain the highest accuracy compared to the other three existing techniques, as the absolute difference between the actual and the estimated SNR value is relatively small. SCANNING 38:771-782, 2016. © 2016 Wiley Periodicals, Inc.
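The general idea behind ACF-based SNR estimation with linear regression can be sketched as follows: the noise-free zero-shift value of the normalized autocorrelation is extrapolated by a line fitted to the first few nonzero shifts, and the SNR follows from the gap between the measured and extrapolated values. The number of lags, the 1-D row-wise ACF and the test image are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np

# Sketch of ACF-based SNR estimation: fit a line (linear least squares) to the
# normalized autocorrelation at small nonzero lags, extrapolate it to lag 0 to get
# the noise-free value, and convert the gap to an SNR (as a variance ratio).

def estimate_snr(image, n_lags=5):
    img = image.astype(np.float64)
    img = img - img.mean()
    row_len = img.shape[1]
    # 1-D ACF along rows, averaged over rows, normalized at lag 0
    acf = np.array([np.mean(img[:, :row_len - k] * img[:, k:]) for k in range(n_lags + 1)])
    acf = acf / acf[0]
    lags = np.arange(1, n_lags + 1)
    slope, intercept = np.polyfit(lags, acf[1:], 1)   # linear LSR over lags 1..n
    rho_nf0 = min(max(intercept, 1e-6), 0.999999)     # extrapolated noise-free ACF(0)
    return rho_nf0 / (1.0 - rho_nf0)                  # signal variance / noise variance

noisy = np.random.rand(64, 64) + 0.2 * np.random.randn(64, 64)
print(estimate_snr(noisy))
```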
Use of Formative Classroom Assessment Techniques in a Project Management Course
ERIC Educational Resources Information Center
Purcell, Bernice M.
2014-01-01
Formative assessment is considered to be an evaluation technique that informs the instructor of the level of student learning, giving evidence when it may be necessary for the instructor to make a change in delivery based upon the results. Several theories of formative assessment exist, all which propound the importance of feedback to the student.…
ERIC Educational Resources Information Center
Arnquist, Isaac J.; Beussman, Douglas J.
2007-01-01
Biological mass spectrometry is an important analytical technique in drug discovery, proteomics, and research at the biology-chemistry interface. Currently, few hands-on opportunities exist for undergraduate students to learn about this technique. With the 2002 Nobel Prize being awarded, in part, for the development of biological mass…
Global and Local Existence for the Dissipative Critical SQG Equation with Small Oscillations
NASA Astrophysics Data System (ADS)
Lazar, Omar
2015-09-01
This article is devoted to the study of the critical dissipative surface quasi-geostrophic (SQG) equation in . For any initial data belonging to the space , we show that the critical (SQG) equation has at least one global weak solution in time for all 1/4 ≤ s ≤ 1/2 and at least one local weak solution in time for all 0 < s < 1/4. The proof of the global existence is based on a new energy inequality which improves the one obtained in Lazar (Commun Math Phys 322:73-93, 2013), whereas the local existence uses more refined energy estimates based on Besov space techniques.
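For reference, the critical dissipative SQG equation discussed here is usually written as follows (the notation may differ slightly from the paper's):

```latex
% Standard form of the critical dissipative SQG equation on the plane.
\[
\begin{aligned}
  \partial_t \theta + u \cdot \nabla \theta + \Lambda \theta &= 0,
  \qquad \Lambda = (-\Delta)^{1/2},\\
  u &= \nabla^{\perp} \Lambda^{-1} \theta = \mathcal{R}^{\perp}\theta .
\end{aligned}
\]
```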
Geometric pre-patterning based tuning of the period doubling onset strain during thin film wrinkling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saha, Sourabh K.
Wrinkling of supported thin films is an easy-to-implement and low-cost fabrication technique for generation of stretch-tunable periodic micro and nano-scale structures. However, the tunability of such structures is often limited by the emergence of an undesirable period doubled mode at high strains. Predictively tuning the onset strain for period doubling via existing techniques requires one to have extensive knowledge about the nonlinear pattern formation behavior. Herein, a geometric pre-patterning based technique is introduced to delay the onset of period doubling that can be implemented to predictively tune the onset strain even with limited system knowledge. The technique comprises pre-patterning the film/base bilayer with a sinusoidal pattern that has the same period as the natural wrinkle period of the system. The effectiveness of this technique has been verified via physical and computational experiments on the polydimethylsiloxane/glass bilayer system. It is observed that the period doubling onset strain can be increased from the typical value of 20% for flat films to greater than 30% with a modest pre-pattern aspect ratio (2∙amplitude/period) of 0.15. In addition, finite element simulations reveal that (i) the onset strain can be increased up to a limit by increasing the amplitude of the pre-patterns and (ii) the delaying effect can be captured entirely by the pre-pattern geometry. As a result, one can implement this technique even with limited system knowledge, such as material properties or film thickness, by simply replicating pre-existing wrinkled patterns to generate prepatterned bilayers. Thus, geometric pre-patterning is a practical scheme to suppress period doubling that can increase the operating range of stretch-tunable wrinkle-based devices by at least 50%.
From Phenomena to Objects: Segmentation of Fuzzy Objects and its Application to Oceanic Eddies
NASA Astrophysics Data System (ADS)
Wu, Qingling
A challenging image analysis problem that has received limited attention to date is the isolation of fuzzy objects---i.e. those with inherently indeterminate boundaries---from continuous field data. This dissertation seeks to bridge the gap between, on the one hand, the recognized need for Object-Based Image Analysis of fuzzy remotely sensed features, and on the other, the optimization of existing image segmentation techniques for the extraction of more discretely bounded features. Using mesoscale oceanic eddies as a case study of a fuzzy object class evident in Sea Surface Height Anomaly (SSHA) imagery, the dissertation demonstrates firstly, that the widely used region-growing and watershed segmentation techniques can be optimized and made comparable in the absence of ground truth data using the principle of parsimony. However, they both have significant shortcomings, with the region growing procedure creating contour polygons that do not follow the shape of eddies while the watershed technique frequently subdivides eddies or groups together separate eddy objects. Secondly, it was determined that these problems can be remedied by using a novel Non-Euclidian Voronoi (NEV) tessellation technique. NEV is effective in isolating the extrema associated with eddies in SSHA data while using a non-Euclidian cost-distance based procedure (based on cumulative gradients in ocean height) to define the boundaries between fuzzy objects. Using this procedure as the first stage in isolating candidate eddy objects, a novel "region-shrinking" multicriteria eddy identification algorithm was developed that includes consideration of shape and vorticity. Eddies identified by this region-shrinking technique compare favorably with those identified by existing techniques, while simplifying and improving existing automated eddy detection algorithms. However, it also tends to find a larger number of eddies as a result of its ability to separate what other techniques identify as connected eddies. The research presented here is of significance not only to eddy research in oceanography, but also to other areas of Earth System Science for which the automated detection of features lacking rigid boundary definitions is of importance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dolen, James; Harris, Philip; Marzani, Simone
Here, we explore the scale-dependence and correlations of jet substructure observables to improve upon existing techniques in the identification of highly Lorentz-boosted objects. Modified observables are designed to remove correlations from existing theoretically well-understood observables, providing practical advantages for experimental measurements and searches for new phenomena. We study such observables in W jet tagging and provide recommendations for observables based on considerations beyond signal and background efficiencies.
2005-07-21
or solution-based methods such as spin casting or drop casting, self-assembly, Langmuir-Blodgett techniques, or electrochemical methods. Molecules containing a perylene diimide core have also been proposed for this purpose and, in many situations, the molecules remain soluble in the absence of other ionic species.
NASA Astrophysics Data System (ADS)
Woolfitt, Adrian R.; Boyer, Anne E.; Quinn, Conrad P.; Hoffmaster, Alex R.; Kozel, Thomas R.; de, Barun K.; Gallegos, Maribel; Moura, Hercules; Pirkle, James L.; Barr, John R.
A range of mass spectrometry-based techniques have been used to identify, characterize and differentiate Bacillus anthracis, both in culture for forensic applications and for diagnosis during infection. This range of techniques could usefully be considered to exist as a continuum, based on the degrees of specificity involved. We show two examples here, a whole-organism fingerprinting method and a high-specificity assay for one unique protein, anthrax lethal factor.
Kruskal-Wallis-based computationally efficient feature selection for face recognition.
Ali Khan, Sajid; Hussain, Ayyaz; Basit, Abdul; Akram, Sheeraz
2014-01-01
Face recognition and its applications have attained much greater importance in today's technological world. Most of the existing work has used frontal face images to classify a face image; however, these techniques fail when applied to real-world face images. The proposed technique effectively extracts the prominent facial features. Most of the features are redundant and do not contribute to representing the face. In order to eliminate those redundant features, a computationally efficient algorithm is used to select the more discriminative face features. The extracted features are then passed to the classification step. In the classification step, different classifiers are ensembled to enhance the recognition accuracy rate, as a single classifier is unable to achieve high accuracy. Experiments are performed on standard face database images, and the results are compared with existing techniques.
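The Kruskal-Wallis-based selection step can be sketched by ranking features by their H statistic across classes and keeping the top-ranked ones; the synthetic data and the number of retained features are illustrative assumptions.

```python
import numpy as np
from scipy.stats import kruskal

# Sketch of Kruskal-Wallis feature selection: compute the H statistic of every
# extracted feature across the face classes and keep only the most discriminative ones.

rng = np.random.default_rng(1)
X = rng.normal(size=(90, 40))                 # 90 face images x 40 extracted features
y = np.repeat([0, 1, 2], 30)                  # three subjects
X[y == 1, :5] += 1.5                          # make the first 5 features informative

h_stats = np.array([kruskal(*(X[y == c, j] for c in np.unique(y))).statistic
                    for j in range(X.shape[1])])
selected = np.argsort(-h_stats)[:10]          # keep the 10 most discriminative features
X_reduced = X[:, selected]
```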
A structural informatics approach to mine kinase knowledge bases.
Brooijmans, Natasja; Mobilio, Dominick; Walker, Gary; Nilakantan, Ramaswamy; Denny, Rajiah A; Feyfant, Eric; Diller, David; Bikker, Jack; Humblet, Christine
2010-03-01
In this paper, we describe a combination of structural informatics approaches developed to mine data extracted from existing structure knowledge bases (Protein Data Bank and the GVK database) with a focus on kinase ATP-binding site data. In contrast to existing systems that retrieve and analyze protein structures, our techniques are centered on a database of ligand-bound geometries in relation to residues lining the binding site and transparent access to ligand-based SAR data. We illustrate the systems in the context of the Abelson kinase and related inhibitor structures. 2009 Elsevier Ltd. All rights reserved.
Image-based corrosion recognition for ship steel structures
NASA Astrophysics Data System (ADS)
Ma, Yucong; Yang, Yang; Yao, Yuan; Li, Shengyuan; Zhao, Xuefeng
2018-03-01
Ship structures are inevitably subjected to corrosion in service. Existing image-based methods are influenced by noise in images because they recognize corrosion by extracting features. In this paper, a novel method of image-based corrosion recognition for ship steel structures is proposed. The method utilizes convolutional neural networks (CNN) and is not affected by noise in images. A CNN used to recognize corrosion was designed by fine-tuning an existing CNN architecture and trained on datasets built from a large number of images. Combining the trained CNN classifier with a sliding window technique, the corrosion zone in an image can be recognized.
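The sliding-window stage can be sketched as below, with a stub standing in for the fine-tuned CNN patch classifier; the window size, stride and the stub's decision rule are illustrative assumptions.

```python
import numpy as np

# Sketch of the sliding-window stage: scan the image with a fixed-size window, pass
# each patch to a trained patch classifier (a stub here), and collect the positions
# of corrosion-positive windows.

def classify_patch(patch):
    """Stand-in for the trained CNN: flag patches whose mean intensity is low."""
    return float(patch.mean() < 0.4)

def sliding_window_map(image, win=64, stride=32):
    h, w = image.shape[:2]
    hits = []
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            if classify_patch(image[y:y + win, x:x + win]) > 0.5:
                hits.append((y, x))
    return hits

hull_image = np.random.rand(256, 384)          # stand-in for a ship-structure photo
corroded_windows = sliding_window_map(hull_image)
```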
Expanding access and choice for health care consumers through tax reform.
Butler, S; Kendall, D B
1999-01-01
A refundable tax credit for the uninsured would complement the existing job-based health insurance system while letting people keep their job-based coverage if they wish. Among the wide variety of design options for a tax credit, policy and political analysis does not reveal an obvious choice, but a tax credit based on a percentage of spending may have a slight advantage. Congress should give states maximum flexibility to use existing funding sources to supplement the value of a federal tax credit and encourage the use of techniques to create stable insurance pools.
Cellular-based preemption system
NASA Technical Reports Server (NTRS)
Bachelder, Aaron D. (Inventor)
2011-01-01
A cellular-based preemption system that uses existing cellular infrastructure to transmit preemption related data to allow safe passage of emergency vehicles through one or more intersections. A cellular unit in an emergency vehicle is used to generate position reports that are transmitted to the one or more intersections during an emergency response. Based on this position data, the one or more intersections calculate an estimated time of arrival (ETA) of the emergency vehicle, and transmit preemption commands to traffic signals at the intersections based on the calculated ETA. Additional techniques may be used for refining the position reports, ETA calculations, and the like. Such techniques include, without limitation, statistical preemption, map-matching, dead-reckoning, augmented navigation, and/or preemption optimization techniques, all of which are described in further detail in the above-referenced patent applications.
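The ETA-based decision at an intersection can be sketched as follows; the lead time, field names and report values are illustrative assumptions rather than details from the patent.

```python
# Sketch of an ETA-based preemption decision: an intersection receives periodic
# position reports from the emergency vehicle, estimates arrival time from distance
# and speed, and issues a preemption command once the ETA drops below a lead time.

PREEMPT_LEAD_TIME_S = 20.0

def eta_seconds(distance_m, speed_mps):
    return float("inf") if speed_mps <= 0 else distance_m / speed_mps

def should_preempt(report):
    eta = eta_seconds(report["distance_to_intersection_m"], report["speed_mps"])
    return eta <= PREEMPT_LEAD_TIME_S

report = {"distance_to_intersection_m": 350.0, "speed_mps": 18.0}  # ~19.4 s out
print(should_preempt(report))    # True -> send the preemption command to the signal
```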
Test techniques for evaluating flight displays
NASA Technical Reports Server (NTRS)
Haworth, Loran A.; Newman, Richard L.
1993-01-01
The rapid development of graphics technology allows for greater flexibility in aircraft displays, but display evaluation techniques have not kept pace. Historically, display evaluation has been based on subjective opinion and not on the actual aircraft/pilot performance. Existing electronic display specifications and evaluation techniques are reviewed. A display rating technique analogous to handling qualities ratings was developed and is recommended for future evaluations. The choice of evaluation pilots is also discussed and the use of a limited number of trained evaluators is recommended over the use of a large number of operational pilots.
Tu, Haohua; Zhao, Youbo; Liu, Yuan; Liu, Yuan-Zhi; Boppart, Stephen
2014-08-25
Optical sources in the visible region immediately adjacent to the near-infrared biological optical window are preferred in imaging techniques such as spectroscopic optical coherence tomography of endogenous absorptive molecules and two-photon fluorescence microscopy of intrinsic fluorophores. However, existing sources based on fiber supercontinuum generation are known to have high relative intensity noise and low spectral coherence, which may degrade imaging performance. Here we compare the optical noise and pulse compressibility of three high-power fiber Cherenkov radiation sources developed recently, and evaluate their potential to replace the existing supercontinuum sources in these imaging techniques.
Neutron-based nonintrusive inspection techniques
NASA Astrophysics Data System (ADS)
Gozani, Tsahi
1997-02-01
Non-intrusive inspection of large objects such as trucks, sea-going shipping containers, air cargo containers and pallets is gaining attention as a vital tool in combating terrorism, drug smuggling and other violations of international and national transportation and Customs laws. Neutrons are the preferred probing radiation when material specificity is required, which is most often the case. Great strides have been made in neutron-based inspection techniques. Fast and thermal neutrons, whether in steady state or in microsecond, or even nanosecond, pulses are being employed to interrogate, at high speeds, for explosives, drugs, chemical agents, and nuclear and many other smuggled materials. Existing neutron techniques will be compared and their current status reported.
MEMS-Based Power Generation Techniques for Implantable Biosensing Applications
Lueke, Jonathan; Moussa, Walied A.
2011-01-01
Implantable biosensing is attractive for both medical monitoring and diagnostic applications. It is possible to monitor phenomena such as physical loads on joints or implants, vital signs, or osseointegration in vivo and in real time. Microelectromechanical (MEMS)-based generation techniques can allow for the autonomous operation of implantable biosensors by generating electrical power to replace or supplement existing battery-based power systems. By supplementing existing battery-based power systems for implantable biosensors, the operational lifetime of the sensor is increased. In addition, the potential for a greater amount of available power allows additional components to be added to the biosensing module, such as computational and wireless components, improving functionality and performance of the biosensor. Photovoltaic, thermovoltaic, micro fuel cell, electrostatic, electromagnetic, and piezoelectric based generation schemes are evaluated in this paper for applicability for implantable biosensing. MEMS-based generation techniques that harvest ambient energy, such as vibration, are much better suited for implantable biosensing applications than fuel-based approaches, producing up to milliwatts of electrical power. High power density MEMS-based approaches, such as piezoelectric and electromagnetic schemes, allow for supplemental and replacement power schemes for biosensing applications to improve device capabilities and performance. In addition, this may allow for the biosensor to be further miniaturized, reducing the need for relatively large batteries with respect to device size. This would cause the implanted biosensor to be less invasive, increasing the quality of care received by the patient. PMID:22319362
Unisys' experience in software quality and productivity management of an existing system
NASA Technical Reports Server (NTRS)
Munson, John B.
1988-01-01
A summary of Quality Improvement techniques, implementation, and results in the maintenance, management, and modification of large software systems for the Space Shuttle Program's ground-based systems is provided.
[Aging explosive detection using terahertz time-domain spectroscopy].
Meng, Kun; Li, Ze-ren; Liu, Qiao
2011-05-01
Detecting the aging state of stockpiled explosives is essential for research on their performance, safety and stability. Existing aging explosive detection techniques, such as scanning microscopy, Fourier transform infrared spectroscopy, and gas chromatography-mass spectrometry, are either unable to determine whether the explosive is aging or unable to image the structural change of the molecule. In the present paper, using density functional theory (DFT), the changes in the absorption spectrum after explosive aging were calculated, from which we can clearly find the difference in spectrum between the explosive molecule and the aged one in the terahertz band. The terahertz time-domain spectroscopy (THz-TDS) system, as well as its spectral resolution and measurement range, are analyzed. Combined with the existing experimental results and the essential characteristics of the terahertz wave, the application of the THz-TDS technique to the detection of aging explosives was demonstrated from the aspects of feasibility, accuracy and practicability. On this basis, the authors propose a new method for aging explosive detection using the terahertz time-domain spectroscopy technique.
Arif, Anmar; Wang, Zhaoyu; Wang, Jianhui; ...
2017-05-02
Load modeling has significant impact on power system studies. This paper presents a review on load modeling and identification techniques. Load models can be classified into two broad categories: static and dynamic models, while there are two types of approaches to identify model parameters: measurement-based and component-based. Load modeling has received more attention in recent years because of the renewable integration, demand-side management, and smart metering devices. However, the commonly used load models are outdated, and cannot represent emerging loads. There is a need to systematically review existing load modeling techniques and suggest future research directions to meet the increasing interests from industry and academia. In this study, we provide a thorough survey on the academic research progress and industry practices, and highlight existing issues and new trends in load modeling.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Arif, Anmar; Wang, Zhaoyu; Wang, Jianhui
Load modeling has significant impact on power system studies. This paper presents a review on load modeling and identification techniques. Load models can be classified into two broad categories: static and dynamic models, while there are two types of approaches to identify model parameters: measurement-based and component-based. Load modeling has received more attention in recent years because of the renewable integration, demand-side management, and smart metering devices. However, the commonly used load models are outdated, and cannot represent emerging loads. There is a need to systematically review existing load modeling techniques and suggest future research directions to meet the increasing interests from industry and academia. In this study, we provide a thorough survey on the academic research progress and industry practices, and highlight existing issues and new trends in load modeling.
Communication: Electron ionization of DNA bases.
Rahman, M A; Krishnakumar, E
2016-04-28
No reliable experimental data exist for the partial and total electron ionization cross sections of DNA bases, which are crucial for modeling radiation damage in the genetic material of living cells. We have measured a complete set of absolute partial electron ionization cross sections up to 500 eV for DNA bases for the first time by using the relative flow technique. These partial cross sections are summed to obtain total ion cross sections for all four bases and are compared with the existing theoretical calculations and the only set of measured absolute cross sections. Our measurements clearly resolve the existing discrepancy between the theoretical and experimental results, thereby providing for the first time reliable numbers for partial and total ion cross sections for these molecules. The results of the fragmentation analysis of adenine support the theory of its formation in space.
Ravi, Logesh; Vairavasundaram, Subramaniyaswamy
2016-01-01
Rapid growth of the web and its applications has created a colossal importance for recommender systems. Being applied in various domains, recommender systems were designed to generate suggestions such as items or services based on user interests. However, recommender systems experience many issues that diminish their effectiveness. Integrating powerful data management techniques into recommender systems can address such issues, and the recommendation quality can be increased significantly. Recent research on recommender systems reveals the idea of utilizing social network data to enhance traditional recommender systems with better prediction and improved accuracy. This paper expresses views on social network data based recommender systems by considering the usage of various recommendation algorithms, functionalities of systems, different types of interfaces, filtering techniques, and artificial intelligence techniques. After examining the depths of objectives, methodologies, and data sources of the existing models, the paper helps anyone interested in the development of travel recommendation systems and facilitates future research directions. We have also proposed a location recommendation system based on a social pertinent trust walker (SPTW) and compared the results with the existing baseline random walk models. Later, we have enhanced the SPTW model for group-of-users recommendations. The results obtained from the experiments have been presented. PMID:27069468
Various approaches and tools exist to estimate local and regional PM2.5 impacts from a single emissions source, ranging from simple screening techniques to Gaussian based dispersion models and complex grid-based Eulerian photochemical transport models. These approache...
Fechter, Dominik; Storch, Ilse
2014-01-01
Due to legislative protection, many species, including large carnivores, are currently recolonizing Europe. To address the impending human-wildlife conflicts in advance, predictive habitat models can be used to determine potentially suitable habitat and areas likely to be recolonized. As field data are often limited, quantitative rule based models or the extrapolation of results from other studies are often the techniques of choice. Using the wolf (Canis lupus) in Germany as a model for habitat generalists, we developed a habitat model based on the location and extent of twelve existing wolf home ranges in Eastern Germany, current knowledge on wolf biology, different habitat modeling techniques and various input data to analyze ten different input parameter sets and address the following questions: (1) How do a priori assumptions and different input data or habitat modeling techniques affect the abundance and distribution of potentially suitable wolf habitat and the number of wolf packs in Germany? (2) In a synthesis across input parameter sets, what areas are predicted to be most suitable? (3) Are existing wolf pack home ranges in Eastern Germany consistent with current knowledge on wolf biology and habitat relationships? Our results indicate that depending on which assumptions on habitat relationships are applied in the model and which modeling techniques are chosen, the amount of potentially suitable habitat estimated varies greatly. Depending on a priori assumptions, Germany could accommodate between 154 and 1769 wolf packs. The locations of the existing wolf pack home ranges in Eastern Germany indicate that wolves are able to adapt to areas densely populated by humans, but are limited to areas with low road densities. Our analysis suggests that predictive habitat maps in general, should be interpreted with caution and illustrates the risk for habitat modelers to concentrate on only one selection of habitat factors or modeling technique. PMID:25029506
USDA-ARS?s Scientific Manuscript database
The invasive brown marmorated stink bug, Halyomorpha halys (Stål), has become a serious pest in mid-Atlantic apple orchards. Because no decision support tools exist for H. halys management, calendar-based insecticide applications have been the only successful technique for mitigating H. halys injur...
NASA Astrophysics Data System (ADS)
Demigha, Souâd.
2016-03-01
The paper presents a Case-Based Reasoning Tool for Breast Cancer Knowledge Management to improve breast cancer screening. To develop this tool, we combine concepts and techniques of Case-Based Reasoning (CBR) and Data Mining (DM). Physicians and radiologists ground their diagnosis on their expertise (past experience) based on clinical cases. Case-Based Reasoning is the process of solving new problems based on the solutions of similar past problems structured as cases, and it is well suited for medical use. On the other hand, existing traditional Hospital Information Systems (HIS), Radiological Information Systems (RIS) and Picture Archiving and Communication Systems (PACS) do not allow medical information to be managed efficiently because of its complexity and heterogeneity. Data Mining is the process of mining information from a data set and transforming it into an understandable structure for further use. Combining CBR with Data Mining techniques will facilitate the diagnosis and decision-making of medical experts.
Usability-driven pruning of large ontologies: the case of SNOMED CT
Boeker, Martin; Illarramendi, Arantza; Schulz, Stefan
2012-01-01
Objectives To study ontology modularization techniques when applied to SNOMED CT in a scenario in which no previous corpus of information exists and to examine if frequency-based filtering using MEDLINE can reduce subset size without discarding relevant concepts. Materials and Methods Subsets were first extracted using four graph-traversal heuristics and one logic-based technique, and were subsequently filtered with frequency information from MEDLINE. Twenty manually coded discharge summaries from cardiology patients were used as signatures and test sets. The coverage, size, and precision of extracted subsets were measured. Results Graph-traversal heuristics provided high coverage (71–96% of terms in the test sets of discharge summaries) at the expense of subset size (17–51% of the size of SNOMED CT). Pre-computed subsets and logic-based techniques extracted small subsets (1%), but coverage was limited (24–55%). Filtering reduced the size of large subsets to 10% while still providing 80% coverage. Discussion Extracting subsets to annotate discharge summaries is challenging when no previous corpus exists. Ontology modularization provides valuable techniques, but the resulting modules grow as signatures spread across subhierarchies, yielding a very low precision. Conclusion Graph-traversal strategies and frequency data from an authoritative source can prune large biomedical ontologies and produce useful subsets that still exhibit acceptable coverage. However, a clinical corpus closer to the specific use case is preferred when available. PMID:22268217
Compressed ECG biometric: a fast, secured and efficient method for identification of CVD patient.
Sufi, Fahim; Khalil, Ibrahim; Mahmood, Abdun
2011-12-01
Adoption of compression technology is often required for wireless cardiovascular monitoring, due to the enormous size of Electrocardiography (ECG) signals and the limited bandwidth of the Internet. However, compressed ECG must be decompressed before performing human identification using present research on ECG-based biometric techniques. This additional step of decompression creates a significant processing delay for the identification task. This becomes an obvious burden on a system if it needs to be done for a trillion compressed ECGs per hour by a hospital. Even though a hospital might be able to build an expensive infrastructure to tame the exuberant processing, for small intermediate nodes in a multihop network, identification preceded by decompression is prohibitive. In this paper, we report a technique by which a person can be identified directly from his or her compressed ECG. This technique completely obviates the step of decompression and therefore makes biometric identification less demanding for the smaller nodes in a multihop network. The biometric template created by this new technique is smaller than existing ECG-based biometric templates as well as other forms of biometric templates such as face, finger and retina (up to 8302 times smaller than a face template and 9 times smaller than an existing ECG-based biometric template). The smaller template size substantially reduces the one-to-many matching time for biometric recognition, resulting in a faster biometric authentication mechanism.
Characterization of Orbital Debris via Hyper-Velocity Ground-Based Tests
NASA Technical Reports Server (NTRS)
Cowardin, Heather
2016-01-01
The purpose of the DebriSat project is to replicate a hyper-velocity fragmentation event using modern-day spacecraft materials and construction techniques in order to improve the existing DoD and NASA breakup models.
Mousavi Kahaki, Seyed Mostafa; Nordin, Md Jan; Ashtari, Amir H.; J. Zahra, Sophia
2016-01-01
A spatially invariant feature matching method is proposed. Deformation effects, such as affine and homography transformations, change the local information within the image and can result in ambiguous local information pertaining to image points. A new method based on dissimilarity values, which measure the dissimilarity of features along a path using eigenvector properties, is proposed. Evidence shows that existing matching techniques using similarity metrics—such as normalized cross-correlation, squared sum of intensity differences and correlation coefficient—are insufficient for achieving adequate results under different image deformations. Thus, new descriptor similarity metrics based on normalized eigenvector correlation and signal directional differences, which are robust under local variation of the image information, are proposed to establish an efficient feature matching technique. The method proposed in this study measures the dissimilarity in the signal frequency along the path between two features. Moreover, these dissimilarity values are accumulated in a 2D dissimilarity space, allowing accurate corresponding features to be extracted based on the cumulative space using a voting strategy. This method can be used in image registration applications, as it overcomes the limitations of the existing approaches. The output results demonstrate that the proposed technique outperforms the other methods when evaluated on a standard dataset in terms of precision-recall and corner correspondence. PMID:26985996
NASA Astrophysics Data System (ADS)
Hafezalkotob, Arian; Hafezalkotob, Ashkan
2017-06-01
A target-based MADM method covers beneficial and non-beneficial attributes as well as target values for some attributes. Such techniques are considered the comprehensive form of MADM approaches, and they can also be used in traditional decision-making problems in which only beneficial and non-beneficial attributes exist. In many practical selection problems, some attributes have given target values, and the values of the decision matrix and the target-based attributes can be provided as intervals. Some target-based decision-making methods have recently been developed; however, a research gap exists in the area of MADM techniques with target-based attributes under uncertainty of information. We extend the MULTIMOORA method for solving practical material selection problems in which material properties and their target values are given as interval numbers. We employ various concepts of interval computations to reduce degeneration of uncertain data. In this regard, we use interval arithmetic and introduce an innovative formula for the distance between interval numbers to create an interval target-based normalization technique. Furthermore, we use a pairwise preference matrix based on the concept of degree of preference of interval numbers to calculate the maximum, minimum, and ranking of these numbers. Two decision-making problems regarding biomaterials selection of hip and knee prostheses are discussed. Preference degree-based ranking lists for the subordinate parts of the extended MULTIMOORA method are generated by calculating the relative degrees of preference for the arranged assessment values of the biomaterials. The resulting rankings are compared with the outcomes of other target-based models in the literature.
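As an illustration of the interval arithmetic such a method relies on, the sketch below (Python) shows one plausible way to measure the distance between two interval numbers and to score a rating against a target interval. The representation, the distance formula and all numbers are assumptions made for illustration, not the formulas introduced in the paper.

```python
# Minimal sketch of interval arithmetic for target-based normalization.
# An interval number is represented as (lower, upper). The distance formula
# below (midpoint difference plus half-width difference) is an illustrative
# assumption, not necessarily the formula introduced in the paper.

def interval_distance(a, b):
    """Distance between interval numbers a = (aL, aU) and b = (bL, bU)."""
    mid = abs((a[0] + a[1]) / 2 - (b[0] + b[1]) / 2)
    half_width = abs((a[1] - a[0]) / 2 - (b[1] - b[0]) / 2)
    return mid + half_width

def target_based_closeness(x, target, worst):
    """Normalize an interval rating x against a target value:
    1.0 when x coincides with the target, 0.0 at the worst observed rating."""
    d_target = interval_distance(x, target)
    d_range = interval_distance(worst, target)
    return 1.0 - d_target / d_range if d_range > 0 else 1.0

# Example: elastic modulus of a candidate biomaterial, given as an interval,
# scored against a target modulus (all values are made up).
candidate = (95.0, 110.0)   # GPa
target = (100.0, 100.0)     # ideal (crisp) target
worst = (10.0, 20.0)        # worst rating in the decision matrix
print(target_based_closeness(candidate, target, worst))
```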
A performance model for GPUs with caches
Dao, Thanh Tuan; Kim, Jungwon; Seo, Sangmin; ...
2014-06-24
To exploit the abundant computational power of the world's fastest supercomputers, an even workload distribution to the typically heterogeneous compute devices is necessary. While relatively accurate performance models exist for conventional CPUs, accurate performance estimation models for modern GPUs do not exist. This paper presents two accurate models for modern GPUs: a sampling-based linear model, and a model based on machine-learning (ML) techniques which improves the accuracy of the linear model and is applicable to modern GPUs with and without caches. We first construct the sampling-based linear model to predict the runtime of an arbitrary OpenCL kernel. Based on an analysis of NVIDIA GPUs' scheduling policies we determine the earliest sampling points that allow an accurate estimation. The linear model cannot capture well the significant effects that memory coalescing or caching as implemented in modern GPUs have on performance. We therefore propose a model based on ML techniques that takes several compiler-generated statistics about the kernel as well as the GPU's hardware performance counters as additional inputs to obtain a more accurate runtime performance estimation for modern GPUs. We demonstrate the effectiveness and broad applicability of the model by applying it to three different NVIDIA GPU architectures and one AMD GPU architecture. On an extensive set of OpenCL benchmarks, on average, the proposed model estimates the runtime performance with less than 7 percent error for a second-generation GTX 280 with no on-chip caches and less than 5 percent for the Fermi-based GTX 580 with hardware caches. On the Kepler-based GTX 680, the linear model has an error of less than 10 percent. On an AMD GPU architecture, Radeon HD 6970, the model estimates with an error rate of 8 percent. As a result, the proposed technique outperforms existing models by a factor of 5 to 6 in terms of accuracy.
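The sampling-based linear model can be pictured as fitting measured kernel runtime against the amount of launched work from a few small sample runs and extrapolating to the full launch. The sketch below is a generic least-squares illustration of that idea with made-up measurements; it is not the authors' calibrated model.

```python
import numpy as np

# Hypothetical sample runs: number of work-groups launched and measured runtime (ms).
work_groups = np.array([64, 128, 256, 512], dtype=float)
runtime_ms = np.array([1.1, 2.0, 3.9, 7.6])

# Fit runtime ~ a * work_groups + b by ordinary least squares.
A = np.vstack([work_groups, np.ones_like(work_groups)]).T
(a, b), *_ = np.linalg.lstsq(A, runtime_ms, rcond=None)

# Extrapolate to the full problem size to estimate total kernel runtime.
full_launch = 16384
print(f"predicted runtime: {a * full_launch + b:.1f} ms")
```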
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gyüre, B.; Márkus, B. G.; Bernáth, B.
2015-09-15
We present a novel method to determine the resonant frequency and quality factor of microwave resonators which is faster, more stable, and conceptually simpler than existing techniques. The microwave resonator is pumped with microwave radiation at a frequency away from its resonance. It then emits an exponentially decaying radiation at its eigen-frequency when the excitation is rapidly switched off. The emitted microwave signal is down-converted with a microwave mixer, digitized, and its Fourier transformation (FT) directly yields the resonance curve in a single shot. Being an FT-based method, this technique possesses the Fellgett (multiplex) and Connes (accuracy) advantages and it conceptually mimics that of pulsed nuclear magnetic resonance. We also establish a novel benchmark to compare the accuracy of the different approaches to microwave resonator measurements. This shows that the present method has similar accuracy to the existing ones, which are based on sweeping or modulating the frequency of the microwave radiation.
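Conceptually, the measurement amounts to Fourier-transforming an exponentially decaying sinusoid: the peak position gives the resonant frequency and the linewidth gives the quality factor. A minimal numerical illustration with synthetic data and assumed parameters follows; the half-maximum width used for Q here is only a crude stand-in for a proper Lorentzian fit.

```python
import numpy as np

# Synthetic down-converted signal: exponentially decaying sinusoid (assumed parameters).
fs = 1.0e6                     # digitizer sampling rate (Hz)
t = np.arange(0, 5e-3, 1 / fs)
f0, tau = 37.0e3, 0.4e-3       # eigen-frequency (Hz) and decay time (s)
signal = np.exp(-t / tau) * np.cos(2 * np.pi * f0 * t)

# Single-shot FT: the magnitude spectrum is the resonance curve.
spectrum = np.abs(np.fft.rfft(signal))
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

peak = np.argmax(spectrum)
half = spectrum[peak] / 2
fwhm = freqs[spectrum >= half][-1] - freqs[spectrum >= half][0]  # crude width estimate
print(f"f0 = {freqs[peak]:.0f} Hz, Q (rough) = {freqs[peak] / fwhm:.0f}")
```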
Towards microscale electrohydrodynamic three-dimensional printing
NASA Astrophysics Data System (ADS)
He, Jiankang; Xu, Fangyuan; Cao, Yi; Liu, Yaxiong; Li, Dichen
2016-02-01
It is challenging for the existing three-dimensional (3D) printing techniques to fabricate high-resolution 3D microstructures with low costs and high efficiency. In this work we present a solvent-based electrohydrodynamic 3D printing technique that allows fabrication of microscale structures like single walls, crossed walls, lattice and concentric circles. Process parameters were optimized to deposit tiny 3D patterns with a wall width smaller than 10 μm and a high aspect ratio of about 60. Tight bonding among neighbour layers could be achieved with a smooth lateral surface. In comparison with the existing microscale 3D printing techniques, the presented method is low-cost, highly efficient and applicable to multiple polymers. It is envisioned that this simple microscale 3D printing strategy might provide an alternative and innovative way for application in MEMS, biosensor and flexible electronics.
System Identification of Mistuned Bladed Disks from Traveling Wave Response Measurements
NASA Technical Reports Server (NTRS)
Feiner, D. M.; Griffin, J. H.; Jones, K. W.; Kenyon, J. A.; Mehmed, O.; Kurkov, A. P.
2003-01-01
A new approach to modal analysis is presented. By applying this technique to bladed disk system identification methods, one can determine the mistuning in a rotor based on its response to a traveling wave excitation. This allows system identification to be performed under rotating conditions, and thus expands the applicability of existing mistuning identification techniques from integrally bladed rotors to conventional bladed disks.
[Psychological debriefing and post-immediate psychotherapeutic intervention].
Prieto, Nathalie; Cheucle, Eric; Meylan, Françoise
2010-01-01
Psychological debriefing is a controversial treatment technique. Many variants of this treatment exist, based on apparently indisputable conclusions that assess only the individual traumatic effect and neglect the collective impact, which was the original reason for the creation of this technique. It is therefore essential to examine how debriefings are conducted, their indications, their limits and the psychodynamic processes at play.
The Future of Pharmaceutical Manufacturing Sciences.
Rantanen, Jukka; Khinast, Johannes
2015-11-01
The entire pharmaceutical sector is in urgent need of both innovative technological solutions and fundamental scientific work, enabling the production of highly engineered drug products. Commercial-scale manufacturing of complex drug delivery systems (DDSs) using the existing technologies is challenging. This review covers important elements of manufacturing sciences, beginning with risk management strategies and design of experiments (DoE) techniques. Experimental techniques should, where possible, be supported by computational approaches. In that regard, state-of-the-art mechanistic process modeling techniques are described in detail. Implementation of materials science tools paves the way to molecular-based processing of future DDSs. A snapshot of some of the existing tools is presented. Additionally, general engineering principles are discussed, covering process measurement and process control solutions. The last part of the review addresses future manufacturing solutions, covering continuous processing and, specifically, hot-melt processing and printing-based technologies. Finally, challenges related to implementing these technologies as a part of future health care systems are discussed. © 2015 The Authors. Journal of Pharmaceutical Sciences published by Wiley Periodicals, Inc. and the American Pharmacists Association. J Pharm Sci 104:3612-3638, 2015. PMID:26280993
The development of additive manufacturing technique for nickel-base alloys: A review
NASA Astrophysics Data System (ADS)
Zadi-Maad, Ahmad; Basuki, Arif
2018-04-01
Nickel-base alloys are attractive because of their excellent mechanical properties and high resistance to creep deformation, corrosion, and oxidation. However, it is hard to control the performance of this material when casting or forging. In recent years, the additive manufacturing (AM) process has been implemented to replace the conventional directional solidification process for the production of nickel-base alloys. Due to its potentially lower cost and flexible manufacturing process, AM is considered a substitute for the existing techniques. This paper provides a comprehensive review of previous work on AM techniques for Ni-base alloys while highlighting current challenges and methods of solving them. The properties of conventionally manufactured Ni-base alloys are also compared with those of the AM-fabricated alloys. The mechanical properties obtained from tensile, hardness and fatigue tests are included, along with discussions of the effect of post-treatment processes. Recommendations for further work are also provided.
The workload book: Assessment of operator workload to engineering systems
NASA Technical Reports Server (NTRS)
Gopher, D.
1983-01-01
The structure of, and initial work performed toward, a handbook for workload analysis directed at the operational community of engineers and human factors psychologists are described. The goal, when complete, is to make accessible to such individuals the results of theoretically based research that are of practical interest and utility in the analysis and prediction of operator workload in advanced and existing systems. In addition, the results of a laboratory study focused on the development of a subjective rating technique for workload based on psychophysical scaling techniques are described.
Huang, Gui-Wen; Xiao, Hong-Mei; Fu, Shao-Yun
2014-08-07
Here a facile, green and efficient printing-filtration-press (PFP) technique is reported for room-temperature (RT) mass production of low-cost, environmentally friendly, high-performance paper-based electronic circuits. The as-prepared silver nanowires (Ag-NWs) are uniformly deposited at RT on a pre-printed paper substrate to form high-quality circuits via vacuum filtration and pressing. The PFP circuits exhibit better electrical properties and bending stability than flexible circuits made by existing techniques. Furthermore, practical applications of the PFP circuits are demonstrated.
Fourier-Mellin moment-based intertwining map for image encryption
NASA Astrophysics Data System (ADS)
Kaur, Manjit; Kumar, Vijay
2018-03-01
In this paper, a robust image encryption technique that utilizes Fourier-Mellin moments and an intertwining logistic map is proposed. The Fourier-Mellin moment-based intertwining logistic map has been designed to overcome the issue of low sensitivity to the input image. A Multi-objective Non-Dominated Sorting Genetic Algorithm (NSGA-II) based on Reinforcement Learning (MNSGA-RL) has been used to optimize the required parameters of the intertwining logistic map. Fourier-Mellin moments are used to make the secret keys more secure. Thereafter, permutation and diffusion operations are carried out on the input image using the secret keys. The performance of the proposed image encryption technique has been evaluated on five well-known benchmark images and compared with seven well-known existing encryption techniques. The experimental results reveal that the proposed technique outperforms the others in terms of entropy, correlation analysis, unified average changing intensity and number of pixels change rate. The simulation results reveal that the proposed technique provides a high level of security and robustness against various types of attacks.
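The sketch below illustrates the generic permutation-diffusion pattern that chaotic-map image ciphers of this kind follow, using a plain logistic map as a stand-in keystream generator; the paper's intertwining logistic map, Fourier-Mellin key derivation and NSGA-II parameter tuning are not reproduced here.

```python
import numpy as np

def logistic_keystream(x0, r, n):
    """Generate n chaotic values from a plain logistic map (a stand-in for the
    intertwining logistic map used in the paper)."""
    xs = np.empty(n)
    x = x0
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

def encrypt(image, x0=0.3141, r=3.9999):
    flat = image.flatten().astype(np.uint8)
    ks = logistic_keystream(x0, r, flat.size)
    # Permutation: reorder pixels by the ranking of the chaotic sequence.
    perm = np.argsort(ks)
    permuted = flat[perm]
    # Diffusion: XOR with a byte stream derived from the same sequence.
    key_bytes = (ks * 255).astype(np.uint8)
    return np.bitwise_xor(permuted, key_bytes).reshape(image.shape), perm

image = np.random.randint(0, 256, (8, 8), dtype=np.uint8)  # toy "image"
cipher, perm = encrypt(image)
print(cipher)
```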
Communication: Electron ionization of DNA bases
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rahman, M. A.; Krishnakumar, E., E-mail: ekkumar@tifr.res.in
2016-04-28
No reliable experimental data exist for the partial and total electron ionization cross sections of DNA bases, which are crucial for modeling radiation damage in the genetic material of living cells. We have measured a complete set of absolute partial electron ionization cross sections up to 500 eV for DNA bases for the first time by using the relative flow technique. These partial cross sections are summed to obtain total ion cross sections for all four bases and are compared with the existing theoretical calculations and the only set of measured absolute cross sections. Our measurements clearly resolve the existing discrepancy between the theoretical and experimental results, thereby providing for the first time reliable numbers for partial and total ion cross sections for these molecules. The results on fragmentation analysis of adenine support the theory of its formation in space.
Green Infrastructure Implementation Strategy for the Town of Franklin, Massachusetts
The report outlines best techniques for the Town, based on land uses and physical constraints, experience with the implementation of existing practices, and the findings of recently completed reviews of current programs and practices.
Code of Federal Regulations, 2012 CFR
2012-07-01
...) National Emissions Standards for Hazardous Air Pollutants: Reinforced Plastic Composites Production... to incorporation of pollution-prevention control techniques, existing facilities may base the average...
The Node Acquisition and Integration Technique: A Node-Link Based Teaching/Learning Strategy.
ERIC Educational Resources Information Center
Diekhoff, George M.
This paper presents the results of three experiments conducted in connection with development of a node-link based teaching/learning strategy. In experiment 1, subjects were instructed to either define concepts selected from a unit of introductory psychology or to describe the relationships existing between pairs of concepts. The cognitive…
Mobile Formative Assessment Tool Based on Data Mining Techniques for Supporting Web-Based Learning
ERIC Educational Resources Information Center
Chen, Chih-Ming; Chen, Ming-Chuan
2009-01-01
Current trends clearly indicate that online learning has become an important learning mode. However, no effective assessment mechanism for learning performance yet exists for e-learning systems. Learning performance assessment aims to evaluate what learners learned during the learning process. Traditional summative evaluation only considers final…
Career Goal-Based E-Learning Recommendation Using Enhanced Collaborative Filtering and PrefixSpan
ERIC Educational Resources Information Center
Ma, Xueying; Ye, Lu
2018-01-01
This article describes how e-learning recommender systems nowadays have applied different kinds of techniques to recommend personalized learning content for users based on their preference, goals, interests and background information. However, the cold-start problem which exists in traditional recommendation algorithms are still left over in…
A quantitative investigation of the fracture pump-in/flowback test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plahn, S.V.; Nolte, K.G.; Miska, S.
1995-12-31
Fracture closure pressure is an important parameter for fracture treatment design and evaluation. The pump-in/flowback (PIFB) test is frequently used to estimate its magnitude. The test is attractive because bottomhole pressures during flowback develop a distinct and repeatable signature. This is in contrast to the pump-in/shut-in test where strong indications of fracture closure are rarely seen. Various techniques exist for extracting closure pressure from the flowback pressure response. Unfortunately, these procedures give different estimates for closure pressure and their theoretical bases are not well established. We present results that place the PIFB test on a more solid foundation. A numerical model is used to simulate the PIFB test and glean physical mechanisms contributing to the response. Based on our simulation results, we propose an interpretation procedure which gives better estimates for closure pressure than existing techniques.
Application of pattern recognition techniques to crime analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bender, C.F.; Cox, L.A. Jr.; Chappell, G.A.
1976-08-15
The initial goal was to evaluate the capabilities of current pattern recognition techniques when applied to existing computerized crime data. Performance was to be evaluated both in terms of the system's capability to predict crimes and to optimize police manpower allocation. A relation was sought to predict the crime's susceptibility to solution, based on knowledge of the crime type, location, time, etc. The preliminary results of this work are discussed. They indicate that automatic crime analysis involving pattern recognition techniques is feasible, and that efforts to determine optimum variables and techniques are warranted.
3D shape measurement of moving object with FFT-based spatial matching
NASA Astrophysics Data System (ADS)
Guo, Qinghua; Ruan, Yuxi; Xi, Jiangtao; Song, Limei; Zhu, Xinjun; Yu, Yanguang; Tong, Jun
2018-03-01
This work presents a new technique for 3D shape measurement of a moving object in translational motion, which finds applications in online inspection, quality control, etc. A low-complexity 1D fast Fourier transform (FFT)-based spatial matching approach is devised to obtain accurate object displacement estimates, and it is combined with single-shot fringe pattern profilometry (FPP) techniques to achieve high measurement performance from multiple captured images through coherent combining. The proposed technique overcomes some limitations of existing ones. Specifically, the placement of marks on the object surface and synchronization between projector and camera are not needed, the velocity of the moving object is not required to be constant, and there is no restriction on the movement trajectory. Both simulation and experimental results demonstrate the effectiveness of the proposed technique.
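At its core, FFT-based spatial matching estimates the shift between two 1D signals from the peak of their cross-correlation, computed cheaply in the frequency domain. A minimal sketch with synthetic signals is shown below; it makes no attempt to reproduce the full fringe-projection pipeline.

```python
import numpy as np

def estimate_shift(ref, moved):
    """Estimate the integer displacement between two 1D signals from the peak
    of their circular cross-correlation, computed in the frequency domain."""
    n = len(ref)
    corr = np.fft.ifft(np.fft.fft(moved) * np.conj(np.fft.fft(ref)))
    shift = int(np.argmax(np.abs(corr)))
    return shift if shift <= n // 2 else shift - n  # map to a signed shift

rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 512)
reference = np.sin(5 * x) + 0.3 * rng.standard_normal(x.size)  # fringe-like row
moving = np.roll(reference, 37)                                # displaced by 37 samples
print(estimate_shift(reference, moving))                       # -> 37
```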
Erickson, Brandon J; Cvetanovich, Gregory L; Frank, Rachel M; Bach, Bernard R; Cohen, Mark S; Bush-Joseph, Charles A; Cole, Brian J; Romeo, Anthony A
2016-11-01
Ulnar collateral ligament reconstruction (UCLR) has become a common procedure performed in overhead-throwing athletes of many athletic levels. The purpose of this study was to determine whether clinical outcomes and return-to-sport (RTS) rates differ among patients undergoing UCLR based on graft choice, surgical technique, athletic competition level, handedness, and treatment of the ulnar nerve. We hypothesized that no differences would exist in clinical outcomes or RTS rates between technique, graft choice, or other variables. Cohort study; Level of evidence, 3. All patients who underwent UCLR from January 1, 2004 through December 31, 2014 at a single institution were identified. Charts were reviewed to determine patient age, sex, date of surgery, sport played, handedness, athletic level, surgical technique, graft type, and complications. Patients were contacted via telephone to obtain the RTS rate, Conway-Jobe score, Timmerman-Andrews score, and Kerlan-Jobe Orthopaedic Clinic (KJOC) Shoulder and Elbow score. Eighty-five patients (mean age at surgery, 19.3 ± 4.7 years; 92% male; 78% right hand-dominant) underwent UCLR between 2004 and 2014 and were available for follow-up. Overall, 87% were baseball pitchers, 49.4% were college athletes, and 41.2% were high school athletes. No significant difference existed between the docking and double-docking techniques, graft choice, handedness, sex, activity level, and treatment of the ulnar nerve with regard to clinical outcomes, RTS, or subsequent surgeries (all P > .05). More complications were seen in the docking technique compared with the double-docking technique ( P = .036). Hamstring autograft was used more commonly with the docking technique ( P = .023) while allograft was used more commonly with the double-docking technique ( P = .0006). Both the docking and double-docking techniques produce excellent clinical outcomes in patients undergoing UCLR. No difference in outcome scores was seen between surgical technique or graft type. The double-docking technique had fewer complications than the docking technique.
The Effects of Practice-Based Training on Graduate Teaching Assistants’ Classroom Practices
Becker, Erin A.; Easlon, Erin J.; Potter, Sarah C.; Guzman-Alvarez, Alberto; Spear, Jensen M.; Facciotti, Marc T.; Igo, Michele M.; Singer, Mitchell; Pagliarulo, Christopher
2017-01-01
Evidence-based teaching is a highly complex skill, requiring repeated cycles of deliberate practice and feedback to master. Despite existing well-characterized frameworks for practice-based training in K–12 teacher education, the major principles of these frameworks have not yet been transferred to instructor development in higher educational contexts, including training of graduate teaching assistants (GTAs). We sought to determine whether a practice-based training program could help GTAs learn and use evidence-based teaching methods in their classrooms. We implemented a weekly training program for introductory biology GTAs that included structured drills of techniques selected to enhance student practice, logic development, and accountability and reduce apprehension. These elements were selected based on their previous characterization as dimensions of active learning. GTAs received regular performance feedback based on classroom observations. To quantify use of target techniques and levels of student participation, we collected and coded 160 h of video footage. We investigated the relationship between frequency of GTA implementation of target techniques and student exam scores; however, we observed no significant relationship. Although GTAs adopted and used many of the target techniques with high frequency, techniques that enforced student participation were not stably adopted, and their use was unresponsive to formal feedback. We also found that techniques discussed in training, but not practiced, were not used at quantifiable frequencies, further supporting the importance of practice-based training for influencing instructional practices. PMID:29146664
Efficient Translation of LTL Formulae into Büchi Automata
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Lerda, Flavio
2001-01-01
Model checking is a fully automated technique for checking that a system satisfies a set of required properties. With explicit-state model checkers, properties are typically defined in linear-time temporal logic (LTL), and are translated into Büchi automata in order to be checked. This report presents how we have combined and improved existing techniques to obtain an efficient LTL to Büchi automata translator. In particular, we optimize the core of existing tableau-based approaches to generate significantly smaller automata. Our approach has been implemented and is being released as part of the Java PathFinder software (JPF), an explicit state model checker under development at the NASA Ames Research Center.
Image Analysis Technique for Material Behavior Evaluation in Civil Structures.
Speranzini, Emanuela; Marsili, Roberto; Moretti, Michele; Rossi, Gianluca
2017-07-08
The article presents a hybrid monitoring technique for the measurement of the deformation field. The goal is to obtain information about crack propagation in existing structures, for the purpose of monitoring their state of health. The measurement technique is based on the capture and analysis of a digital image set. Special markers that can be removed without damaging existing structures, such as historical masonry, were applied to the surface of the structures. The digital image analysis was done using software specifically designed in Matlab to track the markers and determine the evolution of the deformation state. The method can be used in any type of structure but is particularly suitable when it is necessary not to damage the surface of the structures. A series of experiments carried out on masonry walls of the Oliverian Museum (Pesaro, Italy) and Palazzo Silvi (Perugia, Italy) allowed validation of the procedure by comparing the results with those derived from traditional measuring techniques. PMID:28773129
Low-grade fibrosarcoma of the anterior skull base: endoscopic resection and repair.
Kuhn, Frederick A; Javer, Amin R
2003-01-01
Fibrosarcomas of the paranasal sinuses and skull base are uncommon tumors. Traditionally, "open approach" surgery has remained the treatment of choice for these tumors. A 49-year-old man underwent resection of a right anterior skull base fibrosarcoma using the endoscopic approach. Close follow-up using both endoscopic and imaging methods over a period of four years has revealed a well-healed skull base with no evidence of recurrence. Significant resistance to using such a technique for malignant diseases of the head and neck exists at present, but results from advanced centers continue to show that this may be a technique worth mastering and improving on.
Design and Evaluation of Fusion Approach for Combining Brain and Gaze Inputs for Target Selection
Évain, Andéol; Argelaguet, Ferran; Casiez, Géry; Roussel, Nicolas; Lécuyer, Anatole
2016-01-01
Gaze-based interfaces and Brain-Computer Interfaces (BCIs) allow for hands-free human–computer interaction. In this paper, we investigate the combination of gaze and BCIs. We propose a novel selection technique for 2D target acquisition based on input fusion. This new approach combines the probabilistic models for each input in order to better estimate the intent of the user. We evaluated its performance against the existing gaze and brain–computer interaction techniques. Twelve participants took part in our study, in which they had to search for and select 2D targets with each of the evaluated techniques. Our fusion-based hybrid interaction technique was found to be more reliable than the previous gaze and BCI hybrid interaction techniques for 10 of the 12 participants, while being 29% faster on average. However, similarly to what has been observed in hybrid gaze-and-speech interaction, the gaze-only interaction technique still provides the best performance. Our results should encourage the use of input fusion, as opposed to sequential interaction, in order to design better hybrid interfaces. PMID:27774048
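The fusion step can be caricatured as combining, for each candidate target, the probability assigned by the gaze model with the probability assigned by the BCI classifier under an independence assumption. The sketch below shows that normalized-product fusion with made-up numbers; the authors' actual probabilistic models are richer than this.

```python
import numpy as np

# Hypothetical posterior probabilities over four on-screen targets.
p_gaze = np.array([0.55, 0.25, 0.15, 0.05])   # from the gaze model
p_bci = np.array([0.30, 0.40, 0.20, 0.10])    # from the brain-computer interface

# Naive-Bayes style fusion: multiply per-target probabilities and renormalize.
p_fused = p_gaze * p_bci
p_fused /= p_fused.sum()

print("fused distribution:", np.round(p_fused, 3))
print("selected target index:", int(np.argmax(p_fused)))
```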
Polarization-based and specular-reflection-based noncontact latent fingerprint imaging and lifting
NASA Astrophysics Data System (ADS)
Lin, Shih-Schön; Yemelyanov, Konstantin M.; Pugh, Edward N., Jr.; Engheta, Nader
2006-09-01
In forensic science the finger marks left unintentionally by people at a crime scene are referred to as latent fingerprints. Most existing techniques to detect and lift latent fingerprints require application of a certain material directly onto the exhibit. The chemical and physical processing applied to the fingerprint potentially degrades or prevents further forensic testing on the same evidence sample. Many existing methods also have deleterious side effects. We introduce a method to detect and extract latent fingerprint images without applying any powder or chemicals on the object. Our method is based on the optical phenomena of polarization and specular reflection together with the physiology of fingerprint formation. The recovered image quality is comparable to existing methods. In some cases, such as the sticky side of tape, our method shows unique advantages.
Hosseini, Marjan; Kerachian, Reza
2017-09-01
This paper presents a new methodology for analyzing the spatiotemporal variability of water table levels and redesigning a groundwater level monitoring network (GLMN) using the Bayesian Maximum Entropy (BME) technique and a multi-criteria decision-making approach based on ordered weighted averaging (OWA). The spatial sampling is determined using a hexagonal gridding pattern and a new method, which is proposed to assign a removal priority number to each pre-existing station. To design the temporal sampling, a new approach is also applied to account for uncertainty caused by lack of information. In this approach, different time lag values are tested against another source of information, namely the simulation results of a numerical groundwater flow model. Furthermore, to incorporate the existing uncertainties in the available monitoring data, the flexibility of the BME interpolation technique is exploited by applying soft data, improving the accuracy of the calculations. To examine the methodology, it is applied to the Dehgolan plain in northwestern Iran. Based on the results, a configuration of 33 monitoring stations on a regular hexagonal grid of side length 3600 m is proposed, in which the time lag between samples is equal to 5 weeks. Since the variance estimation errors of the BME method are almost identical for the redesigned and existing networks, the redesigned monitoring network is more cost-effective and efficient than the existing monitoring network with 52 stations and a monthly sampling frequency.
Community Detection for Correlation Matrices
NASA Astrophysics Data System (ADS)
MacMahon, Mel; Garlaschelli, Diego
2015-04-01
A challenging problem in the study of complex systems is that of resolving, without prior information, the emergent, mesoscopic organization determined by groups of units whose dynamical activity is more strongly correlated internally than with the rest of the system. The existing techniques to filter correlations are not explicitly oriented towards identifying such modules and can suffer from an unavoidable information loss. A promising alternative is that of employing community detection techniques developed in network theory. Unfortunately, this approach has focused predominantly on replacing network data with correlation matrices, a procedure that we show to be intrinsically biased because of its inconsistency with the null hypotheses underlying the existing algorithms. Here, we introduce, via a consistent redefinition of null models based on random matrix theory, the appropriate correlation-based counterparts of the most popular community detection techniques. Our methods can filter out both unit-specific noise and system-wide dependencies, and the resulting communities are internally correlated and mutually anticorrelated. We also implement multiresolution and multifrequency approaches revealing hierarchically nested subcommunities with "hard" cores and "soft" peripheries. We apply our techniques to several financial time series and identify mesoscopic groups of stocks which are irreducible to a standard, sectorial taxonomy; detect "soft stocks" that alternate between communities; and discuss implications for portfolio optimization and risk management.
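A highly simplified caricature of the random-matrix ingredient is sketched below: separate structured eigenmodes of the correlation matrix from the noise bulk using the Marchenko-Pastur bound, set aside the global mode, and group series by their loading on the leading remaining eigenvector. This toy is for orientation only; the paper's contribution is a consistent, correlation-specific redefinition of modularity, which is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 500, 20                                   # observations, time series (toy sizes)
market = rng.standard_normal((T, 1))             # global mode shared by all series
returns = rng.standard_normal((T, N)) + market
returns[:, :10] += 0.8 * rng.standard_normal((T, 1))   # planted community 1
returns[:, 10:] += 0.8 * rng.standard_normal((T, 1))   # planted community 2

C = np.corrcoef(returns, rowvar=False)
evals, evecs = np.linalg.eigh(C)
order = np.argsort(evals)[::-1]                  # sort eigenpairs, largest first
evals, evecs = evals[order], evecs[:, order]

# Marchenko-Pastur upper edge: eigenvalues above it carry genuine structure.
lam_plus = (1.0 + np.sqrt(N / T)) ** 2
n_struct = int(np.sum(evals > lam_plus))

# Set aside the global (market) mode and split series by the sign of their
# loading on the leading remaining structured eigenvector.
group_mode = evecs[:, 1] if n_struct > 1 else evecs[:, 0]
labels = (group_mode > 0).astype(int)
print("structured eigenvalues above the MP edge:", n_struct)
print("community labels:", labels)
```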
Instrumentation development for drug detection on the breath
DOT National Transportation Integrated Search
1972-09-01
Based on a survey of candidate analytical methods, mass spectrometry was identified as a promising technique for drug detection on the breath. To demonstrate its capabilities, an existing laboratory mass spectrometer was modified by the addition of a...
Robust volcano plot: identification of differential metabolites in the presence of outliers.
Kumar, Nishith; Hoque, Md Aminul; Sugimoto, Masahiro
2018-04-11
The identification of differential metabolites in metabolomics is still a big challenge and plays a prominent role in metabolomics data analyses. Metabolomics datasets often contain outliers because of analytical, experimental, and biological ambiguity, but the currently available differential metabolite identification techniques are sensitive to outliers. We propose a kernel weight based outlier-robust volcano plot for identifying differential metabolites from noisy metabolomics datasets. Two numerical experiments are used to evaluate the performance of the proposed technique against nine existing techniques, including the t-test and the Kruskal-Wallis test. Artificially generated data with outliers reveal that the proposed method results in a lower misclassification error rate and a greater area under the receiver operating characteristic curve compared with existing methods. An experimentally measured breast cancer dataset to which outliers were artificially added reveals that our proposed method produces only two non-overlapping differential metabolites whereas the other nine methods produced between seven and 57 non-overlapping differential metabolites. Our data analyses show that the performance of the proposed differential metabolite identification technique is better than that of existing methods. Thus, the proposed method can contribute to analysis of metabolomics data with outliers. The R package and user manual of the proposed method are available at https://github.com/nishithkumarpaul/Rvolcano .
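For orientation, a conventional (non-robust) volcano plot pairs each metabolite's log fold change with the -log10 p-value of a two-sample test and flags metabolites that clear thresholds on both axes; the paper's method replaces these summaries with kernel-weight-based, outlier-robust counterparts. The sketch below shows only the conventional baseline on synthetic data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_metab, n_ctrl, n_case = 50, 12, 12
control = rng.lognormal(mean=1.0, sigma=0.3, size=(n_metab, n_ctrl))
case = rng.lognormal(mean=1.0, sigma=0.3, size=(n_metab, n_case))
case[:5] *= 2.5                                   # five truly altered metabolites

log2_fc = np.log2(case.mean(axis=1) / control.mean(axis=1))
p = stats.ttest_ind(case, control, axis=1).pvalue
neg_log_p = -np.log10(p)

# A metabolite is flagged "differential" if it clears both axes of the plot.
flagged = np.where((np.abs(log2_fc) > 1.0) & (neg_log_p > -np.log10(0.05)))[0]
print("flagged metabolites:", flagged)
```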
Evidence-based surgery: barriers, solutions, and the role of evidence synthesis.
Garas, George; Ibrahim, Amel; Ashrafian, Hutan; Ahmed, Kamran; Patel, Vanash; Okabayashi, Koji; Skapinakis, Petros; Darzi, Ara; Athanasiou, Thanos
2012-08-01
Surgery is a rapidly evolving field, making the rigorous testing of emerging innovations vital. However, most surgical research fails to employ randomized controlled trials (RCTs) and has particularly been based on low-quality study designs. Subsequently, the analysis of data through meta-analysis and evidence synthesis is particularly difficult. Through a systematic review of the literature, this article explores the barriers to achieving a strong evidence base in surgery and offers potential solutions to overcome the barriers. Many barriers exist to evidence-based surgical research. They include enabling factors, such as funding, time, infrastructure, patient preference, ethical issues, and additionally barriers associated with specific attributes related to researchers, methodologies, or interventions. Novel evidence synthesis techniques in surgery are discussed, including graphics synthesis, treatment networks, and network meta-analyses that help overcome many of the limitations associated with existing techniques. They offer the opportunity to assess gaps and quantitatively present inconsistencies within the existing evidence of RCTs. Poorly or inadequately performed RCTs and meta-analyses can give rise to incorrect results and thus fail to inform clinical practice or revise policy. The above barriers can be overcome by providing academic leadership and good organizational support to ensure that adequate personnel, resources, and funding are allocated to the researcher. Training in research methodology and data interpretation can ensure that trials are conducted correctly and evidence is adequately synthesized and disseminated. The ultimate goal of overcoming the barriers to evidence-based surgery includes the improved quality of patient care in addition to enhanced patient outcomes.
Extending enterprise architecture modelling with business goals and requirements
NASA Astrophysics Data System (ADS)
Engelsman, Wilco; Quartel, Dick; Jonkers, Henk; van Sinderen, Marten
2011-02-01
The methods for enterprise architecture (EA), such as The Open Group Architecture Framework, acknowledge the importance of requirements modelling in the development of EAs. Modelling support is needed to specify, document, communicate and reason about goals and requirements. The current modelling techniques for EA focus on the products, services, processes and applications of an enterprise. In addition, techniques may be provided to describe structured requirements lists and use cases. Little support is available however for modelling the underlying motivation of EAs in terms of stakeholder concerns and the high-level goals that address these concerns. This article describes a language that supports the modelling of this motivation. The definition of the language is based on existing work on high-level goal and requirements modelling and is aligned with an existing standard for enterprise modelling: the ArchiMate language. Furthermore, the article illustrates how EA can benefit from analysis techniques from the requirements engineering domain.
Advances in paper-based sample pretreatment for point-of-care testing.
Tang, Rui Hua; Yang, Hui; Choi, Jane Ru; Gong, Yan; Feng, Shang Sheng; Pingguan-Murphy, Belinda; Huang, Qing Sheng; Shi, Jun Ling; Mei, Qi Bing; Xu, Feng
2017-06-01
In recent years, paper-based point-of-care testing (POCT) has been widely used in medical diagnostics, food safety and environmental monitoring. However, a high-cost, time-consuming and equipment-dependent sample pretreatment technique is generally required for raw sample processing, which is impractical for low-resource and disease-endemic areas. Therefore, there is an escalating demand for a cost-effective, simple and portable pretreatment technique to be coupled with the commonly used paper-based assays (e.g. lateral flow assays) in POCT. In this review, we focus on the importance of using paper as a platform for sample pretreatment. We first discuss the beneficial use of paper for sample pretreatment, including sample collection and storage, separation, extraction, and concentration. We highlight the working principle and fabrication of each sample pretreatment device, the existing challenges and the future perspectives for developing paper-based sample pretreatment techniques.
Existence of a coupled system of fractional differential equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ibrahim, Rabha W.; Siri, Zailan
2015-10-22
We establish the existence and uniqueness of solutions for a fractional coupled system containing Schrödinger equations. Such a system appears in quantum mechanics. We confirm that the fractional system under consideration admits a global solution in appropriate functional spaces, and the solution is shown to be unique. The method is based on analytic techniques of fixed point theory. The fractional differential operator considered is the Riemann-Liouville differential operator.
Thinking outside the ROCs: Designing decorrelated taggers (DDT) for jet substructure
Dolen, James; Harris, Philip; Marzani, Simone; ...
2016-05-26
Here, we explore the scale-dependence and correlations of jet substructure observables to improve upon existing techniques in the identification of highly Lorentz-boosted objects. Modified observables are designed to remove correlations from existing theoretically well-understood observables, providing practical advantages for experimental measurements and searches for new phenomena. We study such observables in W jet tagging and provide recommendations for observables based on considerations beyond signal and background efficiencies.
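The decorrelation idea can be illustrated generically: measure how a tagging observable drifts with a scaling variable in background events, then subtract the fitted trend so that cutting on the observable no longer sculpts the background. The sketch below uses made-up variables and a simple linear profile; it is not the specific observable transformation defined in the paper.

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical background jets: a scaling variable rho and a substructure
# observable tau21 that drifts linearly with rho plus noise.
rho = rng.uniform(-6.0, -1.0, 20000)
tau21 = 0.6 + 0.08 * rho + rng.normal(0, 0.05, rho.size)

# Fit the background profile <tau21>(rho) with a straight line.
slope, intercept = np.polyfit(rho, tau21, 1)

# The decorrelated observable removes the fitted rho dependence.
tau21_ddt = tau21 - slope * (rho - rho.mean())

print(f"fitted slope: {slope:.3f}")
print("correlation before:", np.corrcoef(rho, tau21)[0, 1].round(3))
print("correlation after: ", np.corrcoef(rho, tau21_ddt)[0, 1].round(3))
```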
Human tracking over camera networks: a review
NASA Astrophysics Data System (ADS)
Hou, Li; Wan, Wanggen; Hwang, Jenq-Neng; Muhammad, Rizwan; Yang, Mingyang; Han, Kang
2017-12-01
In recent years, automated human tracking over camera networks has become essential for video surveillance. The task of tracking humans over camera networks is not only inherently challenging due to changing human appearance, but also has enormous potential for a wide range of practical applications, ranging from security surveillance to retail and health care. This review paper surveys the most widely used techniques and recent advances in human tracking over camera networks. Two important functional modules are addressed: human tracking within a camera and human tracking across non-overlapping cameras. The core techniques of human tracking within a camera are discussed in terms of two aspects, i.e., generative trackers and discriminative trackers. The core techniques of human tracking across non-overlapping cameras are then discussed in terms of human re-identification, camera-link model-based tracking and graph model-based tracking. Our survey aims to address existing problems, challenges, and future research directions based on analysis of the current progress in human tracking techniques over camera networks.
Analysis of a Knowledge-Management-Based Process of Transferring Project Management Skills
ERIC Educational Resources Information Center
Ioi, Toshihiro; Ono, Masakazu; Ishii, Kota; Kato, Kazuhiko
2012-01-01
Purpose: The purpose of this paper is to propose a method for the transfer of knowledge and skills in project management (PM) based on techniques in knowledge management (KM). Design/methodology/approach: The literature contains studies on methods to extract experiential knowledge in PM, but few studies exist that focus on methods to convert…
Raman sorting and identification of single living micro-organisms with optical tweezers
NASA Astrophysics Data System (ADS)
Xie, Changan; Chen, De; Li, Yong-Qing
2005-07-01
We report on a novel technique for sorting and identification of single biological cells and food-borne bacteria based on laser tweezers and Raman spectroscopy (LTRS). With this technique, biological cells of different physiological states in a sample chamber were identified by their Raman spectral signatures and then they were selectively manipulated into a clean collection chamber with optical tweezers through a microchannel. As an example, we sorted the live and dead yeast cells into the collection chamber and validated this with a standard staining technique. We also demonstrated that bacteria existing in spoiled foods could be discriminated from a variety of food particles based on their characteristic Raman spectra and then isolated with laser manipulation. This label-free LTRS sorting technique may find broad applications in microbiology and rapid examination of food-borne diseases.
NASA Astrophysics Data System (ADS)
Dovetta, Simone
2018-04-01
We investigate the existence of stationary solutions for the nonlinear Schrödinger equation on compact metric graphs. In the L2-subcritical setting, we prove the existence of an infinite number of such solutions, for every value of the mass. In the critical regime, the existence of infinitely many solutions is established if the mass is lower than a threshold value, while global minimizers of the NLS energy exist if and only if the mass is lower or equal to the threshold. Moreover, the relation between this threshold and the topology of the graph is characterized. The investigation is based on variational techniques and some new versions of Gagliardo-Nirenberg inequalities.
NASA Astrophysics Data System (ADS)
Liao, S.; Chen, L.; Li, J.; Xiong, W.; Wu, Q.
2015-07-01
Existing spatiotemporal databases support spatiotemporal aggregation queries over massive moving-object datasets. Due to the large amount of data and the single-threaded processing method, the query speed cannot meet application requirements. On the other hand, query efficiency is more sensitive to spatial variation than to temporal variation. In this paper, we propose a spatiotemporal aggregation query method using a multi-threaded parallel technique based on regional division and implement it on the server. Concretely, we divide the spatiotemporal domain into several spatiotemporal cubes, compute the spatiotemporal aggregation on all cubes using multi-threaded parallel processing, and then integrate the query results. Tests and analysis on real datasets show that this method improves the query speed significantly.
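The described approach amounts to partitioning observations into spatiotemporal cubes, aggregating the cubes in parallel, and merging the partial results. A generic Python sketch with a thread pool is given below; for pure-Python counting a process pool or a compiled kernel would normally be preferred because of the interpreter lock, so this is only a structural illustration.

```python
from collections import Counter
from concurrent.futures import ThreadPoolExecutor
import random

random.seed(0)
# Toy moving-object records: (x, y, t) in arbitrary units.
records = [(random.uniform(0, 100), random.uniform(0, 100), random.uniform(0, 24))
           for _ in range(100_000)]

CELL, T_SLOT = 10.0, 6.0     # spatial cell size and temporal slot length

def cube_key(rec):
    x, y, t = rec
    return (int(x // CELL), int(y // CELL), int(t // T_SLOT))

def aggregate(chunk):
    """Count records per spatiotemporal cube for one chunk of the data."""
    return Counter(cube_key(r) for r in chunk)

# Split the records into chunks, aggregate the chunks in parallel, then merge.
chunks = [records[i::4] for i in range(4)]
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(aggregate, chunks))

total = Counter()
for part in partials:
    total.update(part)
print("busiest cube:", total.most_common(1))
```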
Radar fall detection using principal component analysis
NASA Astrophysics Data System (ADS)
Jokanovic, Branka; Amin, Moeness; Ahmad, Fauzia; Boashash, Boualem
2016-05-01
Falls are a major cause of fatal and nonfatal injuries in people aged 65 years and older. Radar has the potential to become one of the leading technologies for fall detection, thereby enabling the elderly to live independently. Existing techniques for fall detection using radar are based on manual feature extraction and require significant parameter tuning in order to provide successful detections. In this paper, we employ principal component analysis for fall detection, wherein eigenimages of observed motions are used for classification. Using real data, we demonstrate that the PCA-based technique provides a performance improvement over conventional feature extraction methods.
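A bare-bones version of the classification step might look like the following: vectorize each time-frequency image, project onto the leading principal components (eigenimages) learned from training motions, and classify a new observation by its nearest class centroid in that subspace. All data below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic "time-frequency images" (flattened), two motion classes.
n_per_class, dim = 40, 32 * 32
falls = rng.normal(1.0, 1.0, (n_per_class, dim))
non_falls = rng.normal(-1.0, 1.0, (n_per_class, dim))
X = np.vstack([falls, non_falls])
y = np.array([1] * n_per_class + [0] * n_per_class)

# PCA via SVD of the mean-centered data; keep the top components (eigenimages).
mean = X.mean(axis=0)
U, S, Vt = np.linalg.svd(X - mean, full_matrices=False)
components = Vt[:10]                      # top-10 eigenimages
scores = (X - mean) @ components.T

# Nearest-centroid classification in the reduced space.
centroids = {c: scores[y == c].mean(axis=0) for c in (0, 1)}
test = rng.normal(1.0, 1.0, dim)          # a new, fall-like observation
z = (test - mean) @ components.T
pred = min(centroids, key=lambda c: np.linalg.norm(z - centroids[c]))
print("predicted class (1 = fall):", pred)
```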
Metamodels for Computer-Based Engineering Design: Survey and Recommendations
NASA Technical Reports Server (NTRS)
Simpson, Timothy W.; Peplinski, Jesse; Koch, Patrick N.; Allen, Janet K.
1997-01-01
The use of statistical techniques to build approximations of expensive computer analysis codes pervades much of today's engineering design. These statistical approximations, or metamodels, are used to replace the actual expensive computer analyses, facilitating multidisciplinary, multiobjective optimization and concept exploration. In this paper we review several of these techniques, including design of experiments, response surface methodology, Taguchi methods, neural networks, inductive learning, and kriging. We survey their existing applications in engineering design and then address the dangers of applying traditional statistical techniques to approximate deterministic computer analysis codes. We conclude with recommendations for the appropriate use of statistical approximation techniques in given situations and how common pitfalls can be avoided.
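As a concrete example of the simplest of these techniques, a second-order response-surface metamodel can be fitted to a handful of expensive analysis runs by least squares and then queried in place of the original code. The sketch below uses a cheap analytic function as a stand-in for the expensive analysis.

```python
import numpy as np

rng = np.random.default_rng(4)

def expensive_analysis(x1, x2):
    """Stand-in for an expensive computer analysis code."""
    return (x1 - 0.3) ** 2 + 2.0 * x2 ** 2 + 0.5 * x1 * x2

# Design of experiments: a small random sample of the design space.
X = rng.uniform(-1, 1, (15, 2))
y = expensive_analysis(X[:, 0], X[:, 1])

# Second-order polynomial basis: 1, x1, x2, x1^2, x2^2, x1*x2.
def basis(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])

coef, *_ = np.linalg.lstsq(basis(X), y, rcond=None)

# The metamodel now replaces the expensive code during design exploration.
query = np.array([[0.25, -0.1]])
print("metamodel:", (basis(query) @ coef).item())
print("truth:    ", expensive_analysis(0.25, -0.1))
```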
A neural network simulation package in CLIPS
NASA Technical Reports Server (NTRS)
Bhatnagar, Himanshu; Krolak, Patrick D.; Mcgee, Brenda J.; Coleman, John
1990-01-01
The intrinsic similarity between the firing of a rule and the firing of a neuron has been captured in this research to provide a neural network development system within an existing production system (CLIPS). A very important by-product of this research has been the emergence of an integrated technique for using rule-based systems in conjunction with neural networks to solve complex problems. The system provides a tool kit for integrated use of the two techniques and is also extendible to accommodate other AI techniques such as semantic networks, connectionist networks, and even Petri nets. This integrated technique can be very useful in solving complex AI problems.
Non-invasive acoustic-based monitoring of uranium in solution and H/D ratio
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pantea, Cristian; Beedle, Christopher Craig; Sinha, Dipen N.
The primary objective of this project is to adapt existing non-invasive acoustic techniques (Swept-Frequency Acoustic Interferometry and Gaussian-pulse acoustic technique) for the purpose of demonstrating the ability to quantify U or H/D ratios in solution. Furthermore, a successful demonstration will provide an easily implemented, low cost, and non-invasive method for remote and unattended uranium mass measurements for International Atomic Energy Agency (IAEA).
Report on Non-invasive acoustic monitoring of D2O concentration Oct 31 2017
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pantea, Cristian; Sinha, Dipen N.; Lakis, Rollin Evan
There is an urgent need for real-time monitoring of the hydrogen/deuterium ratio (H/D) for heavy water production monitoring. Based upon published literature, sound speed is sensitive to the deuterium content of heavy water and can be measured using existing acoustic methods to determine the deuterium concentration in heavy water solutions. We plan to adapt existing non-invasive acoustic techniques (Swept-Frequency Acoustic Interferometry and Gaussian-pulse acoustic technique) for the purpose of quantifying H/D ratios in solution. A successful demonstration will provide an easily implemented, low cost, and non-invasive method for remote and unattended H/D ratio measurements with a resolution of less than 0.2% vol.
Comparison of existing digital image analysis systems for the analysis of Thematic Mapper data
NASA Technical Reports Server (NTRS)
Likens, W. C.; Wrigley, R. C.
1984-01-01
Most existing image analysis systems were designed with the Landsat Multi-Spectral Scanner in mind, leaving open the question of whether or not these systems could adequately process Thematic Mapper data. In this report, both hardware and software systems have been evaluated for compatibility with TM data. Lack of spectral analysis capability was not found to be a problem, though techniques for spatial filtering and texture varied. Computer processing speed and data storage of currently existing mini-computer based systems may be less than adequate. Upgrading to more powerful hardware may be required for many TM applications.
Oakley, Paul A.; Harrison, Donald D.; Harrison, Deed E.; Haas, Jason W.
2005-01-01
BACKGROUND Although practice protocols exist for SMT and functional rehabilitation, no practice protocols exist for structural rehabilitation. Traditional chiropractic practice guidelines have been limited to acute and chronic pain treatment, with limited inclusion of functional and exclusion of structural rehabilitation procedures. OBJECTIVE (1) To derive an evidence-based practice protocol for structural rehabilitation from publications on Clinical Biomechanics of Posture (CBP®) methods, and (2) to compare the evidence for Diversified, SMT, and CBP®. METHODS Clinical control trials utilizing CBP® methods and spinal manipulative therapy (SMT) were obtained from searches in Mantis, CINAHL, and Index Medicus. Using data from SMT review articles, the evidence for Diversified Technique (as taught in chiropractic colleges), SMT, and CBP® was rated and compared. RESULTS From the evidence of clinical control trials on SMT and CBP®, there is very little evidence to support Diversified (our rating = 18), as taught in chiropractic colleges, for the treatment of pain subjects, while CBP® (our rating = 46) and SMT for neck pain (our rating = 58) and low back pain (our rating = 202) have evidence-based support. CONCLUSIONS While CBP® Technique has approximately as much evidence-based support as SMT for neck pain, CBP® has more evidence to support its methods than the Diversified technique taught in chiropractic colleges, but not as much as SMT for low back pain. The evolution of chiropractic specialization has occurred, and doctors providing structural-based chiropractic care require protocol guidelines for patient quality assurance and standardization. A structural rehabilitation protocol was developed based on evidence from CBP® publications. PMID:17549209
NASA software specification and evaluation system design, part 2
NASA Technical Reports Server (NTRS)
1976-01-01
A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for the software specification language and the data base verifier are presented.
Monopulse azimuth measurement in the ATC Radar Beacon System
DOT National Transportation Integrated Search
1971-12-01
A review is made of the application of sum-difference beam : techniques to the ATC Radar Beacon System. A detailed error analysis : is presented for the case of a monopulse azimuth measurement based : on the existing beacon antenna with a modified fe...
Curveslam: Utilizing Higher Level Structure In Stereo Vision-Based Navigation
2012-01-01
consider their application to SLAM. The work of [31] [32] develops a spline-based SLAM framework, but this is only for application to LIDAR-based SLAM ... Existing approaches to visual Simultaneous Localization and Mapping (SLAM) typically utilize points as visual feature primitives to represent landmarks ... regions of interest. Further, previous SLAM techniques that propose the use of higher level structures often place constraints on the environment, such as
2013-10-29
...based on contextual information, 3) develop vision-based techniques for learning of contextual information, and detection and identification of ... that takes into account many possible contexts. The probability distributions of these contexts will be learned from existing databases on common sense
Quantum logic gates based on ballistic transport in graphene
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dragoman, Daniela; Academy of Romanian Scientists, Splaiul Independentei 54, 050094 Bucharest; Dragoman, Mircea, E-mail: mircea.dragoman@imt.ro
2016-03-07
The paper presents various configurations for the implementation of graphene-based Hadamard, C-phase, controlled-NOT, and Toffoli gates working at room temperature. These logic gates, essential for any quantum computing algorithm, involve ballistic graphene devices for qubit generation and processing and can be fabricated using existing nanolithographical techniques. All quantum gate configurations are based on the very large mean-free-paths of carriers in graphene at room temperature.
Yamamoto, F; Yamamoto, M
2004-07-01
We previously developed a PCR-based DNA fingerprinting technique named the Methylation Sensitive (MS)-AFLP method, which permits comparative genome-wide scanning of methylation status with a manageable number of fingerprinting experiments. The technique uses the methylation sensitive restriction enzyme NotI in the context of the existing Amplified Fragment Length Polymorphism (AFLP) method. Here we report the successful conversion of this gel electrophoresis-based DNA fingerprinting technique into a DNA microarray hybridization technique (DNA Microarray MS-AFLP). By performing a total of 30 (15 x 2 reciprocal labeling) DNA Microarray MS-AFLP hybridization experiments on genomic DNA from two breast and three prostate cancer cell lines in all pairwise combinations, and Southern hybridization experiments using more than 100 different probes, we have demonstrated that the DNA Microarray MS-AFLP is a reliable method for genetic and epigenetic analyses. No statistically significant differences were observed in the number of differences between the breast-prostate hybridization experiments and the breast-breast or prostate-prostate comparisons.
Using cognitive task analysis to develop simulation-based training for medical tasks.
Cannon-Bowers, Jan; Bowers, Clint; Stout, Renee; Ricci, Katrina; Hildabrand, Annette
2013-10-01
Pressures to increase the efficacy and effectiveness of medical training are causing the Department of Defense to investigate the use of simulation technologies. This article describes a comprehensive cognitive task analysis technique that can be used to simultaneously generate training requirements, performance metrics, scenario requirements, and simulator/simulation requirements for medical tasks. On the basis of a variety of existing techniques, we developed a scenario-based approach that asks experts to perform the targeted task multiple times, with each pass probing a different dimension of the training development process. In contrast to many cognitive task analysis approaches, we argue that our technique can be highly cost effective because it is designed to accomplish multiple goals. The technique was pilot tested with expert instructors from a large military medical training command. These instructors were employed to generate requirements for two selected combat casualty care tasks-cricothyroidotomy and hemorrhage control. Results indicated that the technique is feasible to use and generates usable data to inform simulation-based training system design. Reprint & Copyright © 2013 Association of Military Surgeons of the U.S.
Havens: Explicit Reliable Memory Regions for HPC Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hukerikar, Saurabh; Engelmann, Christian
2016-01-01
Supporting error resilience in future exascale-class supercomputing systems is a critical challenge. Due to transistor scaling trends and increasing memory density, scientific simulations are expected to experience more interruptions caused by transient errors in the system memory. Existing hardware-based detection and recovery techniques will be inadequate to manage the presence of high memory fault rates. In this paper we propose a partial memory protection scheme based on region-based memory management. We define the concept of regions called havens that provide fault protection for program objects. We provide reliability for the regions through a software-based parity protection mechanism. Our approach enables critical program objects to be placed in these havens. The fault coverage provided by our approach is application agnostic, unlike algorithm-based fault tolerance techniques.
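A generic illustration of software parity protection over a haven-like memory region, assuming an erasure-style XOR parity scheme; this is intuition only, not the paper's implementation or API.

```python
# Illustrative sketch of XOR-parity protection over a "haven"-like memory region.
# This is a generic parity scheme for intuition only, not the paper's implementation.
import numpy as np

def make_parity(blocks: np.ndarray) -> np.ndarray:
    """blocks: (n_blocks, block_len) uint8 array; returns the XOR parity block."""
    parity = np.zeros(blocks.shape[1], dtype=np.uint8)
    for b in blocks:
        parity ^= b
    return parity

def reconstruct(blocks: np.ndarray, parity: np.ndarray, lost: int) -> np.ndarray:
    """Recover the block at index `lost` from the surviving blocks and the parity."""
    recovered = parity.copy()
    for i, b in enumerate(blocks):
        if i != lost:          # only the surviving blocks are used
            recovered ^= b
    return recovered

rng = np.random.default_rng(0)
data = rng.integers(0, 256, size=(4, 8), dtype=np.uint8)  # 4 protected objects
p = make_parity(data)
assert np.array_equal(reconstruct(data, p, lost=2), data[2])
```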
Gehrmann, Sebastian; Dernoncourt, Franck; Li, Yeran; Carlson, Eric T; Wu, Joy T; Welt, Jonathan; Foote, John; Moseley, Edward T; Grant, David W; Tyler, Patrick D; Celi, Leo A
2018-01-01
In secondary analysis of electronic health records, a crucial task consists in correctly identifying the patient cohort under investigation. In many cases, the most valuable and relevant information for an accurate classification of medical conditions exist only in clinical narratives. Therefore, it is necessary to use natural language processing (NLP) techniques to extract and evaluate these narratives. The most commonly used approach to this problem relies on extracting a number of clinician-defined medical concepts from text and using machine learning techniques to identify whether a particular patient has a certain condition. However, recent advances in deep learning and NLP enable models to learn a rich representation of (medical) language. Convolutional neural networks (CNN) for text classification can augment the existing techniques by leveraging the representation of language to learn which phrases in a text are relevant for a given medical condition. In this work, we compare concept extraction based methods with CNNs and other commonly used models in NLP in ten phenotyping tasks using 1,610 discharge summaries from the MIMIC-III database. We show that CNNs outperform concept extraction based methods in almost all of the tasks, with an improvement in F1-score of up to 26 and up to 7 percentage points in area under the ROC curve (AUC). We additionally assess the interpretability of both approaches by presenting and evaluating methods that calculate and extract the most salient phrases for a prediction. The results indicate that CNNs are a valid alternative to existing approaches in patient phenotyping and cohort identification, and should be further investigated. Moreover, the deep learning approach presented in this paper can be used to assist clinicians during chart review or support the extraction of billing codes from text by identifying and highlighting relevant phrases for various medical conditions.
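A hedged Keras sketch of the kind of 1-D CNN text classifier the paper evaluates; the vocabulary size, filter settings, and the commented training call (with hypothetical `padded_token_ids` and `labels`) are arbitrary placeholders rather than the authors' configuration.

```python
# Illustrative 1-D CNN text classifier of the general kind evaluated in the paper
# (vocabulary size and filter settings are arbitrary placeholders).
import tensorflow as tf

vocab_size, embed_dim = 20000, 100

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, embed_dim),
    tf.keras.layers.Conv1D(filters=128, kernel_size=5, activation="relu"),
    tf.keras.layers.GlobalMaxPooling1D(),           # strongest phrase response
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(1, activation="sigmoid")  # phenotype present / absent
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC()])
# model.fit(padded_token_ids, labels, validation_split=0.1, epochs=5)
```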
Kumar, Rajnish; Mishra, Bharat Kumar; Lahiri, Tapobrata; Kumar, Gautam; Kumar, Nilesh; Gupta, Rahul; Pal, Manoj Kumar
2017-06-01
Online retrieval of homologous nucleotide sequences through existing alignment techniques is a common practice against a given database of sequences. The salient point of these techniques is their dependence on local alignment techniques and scoring matrices, whose reliability is limited by computational complexity and accuracy. To this end, this work offers a novel way of numerically representing genes which can further help in dividing the data space into smaller partitions, aiding the formation of a search tree. In this context, this paper introduces a 36-dimensional Periodicity Count Value (PCV) which is representative of a particular nucleotide sequence and is created through adaptation of the stochastic model of Kolekar et al. (American Institute of Physics 1298:307-312, 2010. doi: 10.1063/1.3516320). The PCV construct uses information on physicochemical properties of nucleotides and their positional distribution pattern within a gene. It is observed that the PCV representation of genes reduces the computational cost of calculating distances between a pair of genes while remaining consistent with the existing methods. The validity of the PCV-based method was further tested through its use in molecular phylogeny constructs in comparison with those using existing sequence alignment methods.
pyNSMC: A Python Module for Null-Space Monte Carlo Uncertainty Analysis
NASA Astrophysics Data System (ADS)
White, J.; Brakefield, L. K.
2015-12-01
The null-space Monte Carlo technique is a non-linear uncertainty analysis technique that is well-suited to high-dimensional inverse problems. While the technique is powerful, the existing workflow for completing null-space Monte Carlo is cumbersome, requiring the use of multiple command-line utilities, several sets of intermediate files and even a text editor. pyNSMC is an open-source python module that automates the workflow of null-space Monte Carlo uncertainty analyses. The module is fully compatible with the PEST and PEST++ software suites and leverages existing functionality of pyEMU, a python framework for linear-based uncertainty analyses. pyNSMC greatly simplifies the existing workflow for null-space Monte Carlo by taking advantage of object oriented design facilities in python. The core of pyNSMC is the ensemble class, which draws and stores realized random vectors and also provides functionality for exporting and visualizing results. By relieving users of the tedium associated with file handling and command line utility execution, pyNSMC instead focuses the user on the important steps and assumptions of null-space Monte Carlo analysis. Furthermore, pyNSMC facilitates learning through flow charts and results visualization, which are available at many points in the algorithm. The ease-of-use of the pyNSMC workflow is compared to the existing workflow for null-space Monte Carlo for a synthetic groundwater model with hundreds of estimable parameters.
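A conceptual numpy sketch of the null-space Monte Carlo idea that pyNSMC automates (perturbing calibrated parameters only along directions the observations cannot constrain); it does not use the pyNSMC, PEST, or pyEMU APIs, and the Jacobian and parameter values are synthetic stand-ins.

```python
# Conceptual sketch of the null-space Monte Carlo idea (not the pyNSMC API):
# perturb calibrated parameters only along directions the observations cannot "see".
import numpy as np

rng = np.random.default_rng(42)
n_obs, n_par = 30, 100                       # ill-posed: more parameters than data
J = rng.normal(size=(n_obs, n_par))          # stand-in for the model Jacobian
p_cal = rng.normal(size=n_par)               # stand-in for calibrated parameters

# Split parameter space into solution space and (approximate) null space via SVD.
U, s, Vt = np.linalg.svd(J, full_matrices=True)
n_sol = np.sum(s > 1e-8 * s[0])              # numerically significant directions
V_null = Vt[n_sol:].T                        # columns span the null space

def realization():
    """Draw one parameter set that fits the data about as well as the calibration."""
    xi = rng.normal(size=V_null.shape[1])
    return p_cal + V_null @ xi

ens = np.array([realization() for _ in range(200)])
# The perturbations barely change the simulated observations:
print(np.max(np.abs(J @ (ens - p_cal).T)))   # ~0 up to round-off
```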
A review of techniques to determine alternative selection in design for remanufacturing
NASA Astrophysics Data System (ADS)
Noor, A. Z. Mohamed; Fauadi, M. H. F. Md; Jafar, F. A.; Mohamad, N. R.; Yunos, A. S. Mohd
2017-10-01
This paper discusses the techniques used for optimization in manufacturing systems. Although the problem domain is focused on sustainable manufacturing, techniques used to optimize general manufacturing systems were also discussed. Important aspects of Design for Remanufacturing (DFReM) considered include indexes, weighted average, grey decision making and Fuzzy TOPSIS. A limitation of the existing techniques is that most of them depend heavily on the decision maker's perspective. Different experts may have different understandings and eventually scale things differently. Therefore, the objective of this paper is to determine the available techniques and identify their lacking features. Once all the techniques have been reviewed, a new technique will be devised to counter the shortcomings of the discussed techniques. This paper shows that a hybrid computation of Fuzzy Analytic Hierarchy Process (AHP) and Artificial Neural Network (ANN) is suitable and fills the gap left by the discussed techniques.
Tropospheric wet refractivity tomography using multiplicative algebraic reconstruction technique
NASA Astrophysics Data System (ADS)
Xiaoying, Wang; Ziqiang, Dai; Enhong, Zhang; Fuyang, K. E.; Yunchang, Cao; Lianchun, Song
2014-01-01
Algebraic reconstruction techniques (ART) have been successfully used to reconstruct the total electron content (TEC) of the ionosphere and in recent years have tentatively been used in tropospheric wet refractivity and water vapor tomography with ground-based GNSS technology. Previous research on ART used in tropospheric water vapor tomography focused on the convergence and relaxation parameters of various algebraic reconstruction techniques and rarely discussed the impact of Gaussian constraints and the initial field on the iteration results. The existing accuracy evaluation parameters calculated from slant wet delay can only evaluate the resultant precision of the voxels penetrated by slant paths and cannot evaluate that of the voxels not penetrated by any slant path. The paper proposes two new statistical parameters, Bias and RMS, calculated from the wet refractivity of all voxels, to address the deficiencies of the existing evaluation parameters, and then discusses the effect of the Gaussian constraints and initial field on the convergence and tomography results of the multiplicative algebraic reconstruction technique (MART) used to reconstruct the 4D tropospheric wet refractivity field using a simulation method.
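A minimal sketch of a generic MART iteration for a linear system A x = b; the toy geometry and the relaxation parameter are placeholders and do not reproduce the paper's tomographic setup, constraints, or initial field.

```python
# Minimal multiplicative ART (MART) sketch for a linear tomography system A x = b,
# with a relaxation parameter lam; the geometry and data here are toy placeholders.
import numpy as np

def mart(A, b, n_iter=50, lam=0.5, x0=None):
    """A: (n_rays, n_voxels) path-length matrix, b: observed slant integrals."""
    n_rays, n_vox = A.shape
    x = np.ones(n_vox) if x0 is None else x0.astype(float).copy()
    for _ in range(n_iter):
        for i in range(n_rays):
            ai = A[i]
            proj = ai @ x
            if proj <= 0:
                continue
            # multiplicative update; exponent scaled by the ray weights
            x *= (b[i] / proj) ** (lam * ai / ai.max())
    return x

# Toy example: 4 voxels, 6 rays
rng = np.random.default_rng(1)
A = rng.uniform(0, 1, size=(6, 4))
x_true = np.array([1.0, 2.0, 3.0, 4.0])
b = A @ x_true
print(mart(A, b, n_iter=200))   # approaches x_true for this consistent toy system
```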
ERIC Educational Resources Information Center
Educational Innovators Press, Tucson, AZ.
This booklet contains five papers which examine the activities, successes, and pitfalls encountered by educators who are introducing accountability techniques into instructional programs where they did not exist in the past. The papers are based on actual programs and offer possible solutions in the areas considered, which are 1) performance…
A new storage-ring light source
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chao, Alex
2015-06-01
A recently proposed technique in storage ring accelerators is applied to provide potential high-power sources of photon radiation. The technique is based on the steady-state microbunching (SSMB) mechanism. As examples of this application, one may consider a high-power DUV photon source for research in atomic and molecular physics or a high-power EUV radiation source for industrial lithography. A less challenging proof-of-principle test to produce IR radiation using an existing storage ring is also considered.
Factory approach can streamline patient accounting.
Rands, J; Muench, M
1991-08-01
Although they may seem fundamentally different, similarities exist between operations of factories and healthcare organizations' business offices. As a result, a patient accounting approach based on manufacturing firms' management techniques may help smooth healthcare business processes. Receivables performance management incorporates the Japanese techniques of "just-in-time" and total quality management to reduce unbilled accounts and information backlog and accelerate payment. A preliminary diagnostic assessment of a patient accounting process helps identify bottlenecks and set priorities for work flow.
Alió Del Barrio, Jorge L; Vargas, Verónica; Al-Shymali, Olena; Alió, Jorge L
2017-01-01
Small Incision Lenticule Extraction (SMILE) is a flap-free intrastromal technique for the correction of myopia and myopic astigmatism. To date, this technique lacks automated centration and cyclotorsion control, so several concerns have been raised regarding its capability to correct moderate or high levels of astigmatism. The objective of this paper is to review the reported SMILE outcomes for the correction of myopic astigmatism associated with a cylinder over 0.75 D, and its comparison with the outcomes reported with the excimer laser-based corneal refractive surgery techniques. A total of five studies clearly reporting SMILE astigmatic outcomes were identified. SMILE shows acceptable outcomes for the correction of myopic astigmatism, although a general agreement exists about the superiority of the excimer laser-based techniques for low to moderate levels of astigmatism. Manual correction of the static cyclotorsion should be adopted for any SMILE astigmatic correction over 0.75 D.
Hybrid registration of PET/CT in thoracic region with pre-filtering PET sinogram
NASA Astrophysics Data System (ADS)
Mokri, S. S.; Saripan, M. I.; Marhaban, M. H.; Nordin, A. J.; Hashim, S.
2015-11-01
The integration of physiological (PET) and anatomical (CT) images in cancer delineation requires an accurate spatial registration technique. Although a hybrid PET/CT scanner is used to co-register these images, significant misregistrations exist due to patient and respiratory/cardiac motions. This paper proposes a hybrid feature-intensity based registration technique for hybrid PET/CT scanners. First, the simulated PET sinogram was filtered with a 3D hybrid mean-median filter before reconstructing the image. Features were then derived from the segmented structures (lung, heart and tumor) from both images. The registration was performed based on modified multi-modality demon registration with a multiresolution scheme. Apart from visually observable improvements, the proposed registration technique increased the normalized mutual information (NMI) index between the PET/CT images after registration. All nine tested datasets show larger improvements in the mutual information (MI) index than the free form deformation (FFD) registration technique, with the highest MI increase being 25%.
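A short sketch of how a normalized mutual information (NMI) index of the kind used to score the registration can be estimated from a joint intensity histogram; the bin count and test images are arbitrary placeholders.

```python
# Sketch of the normalized mutual information (NMI) index used to score registration
# quality: NMI = (H(A) + H(B)) / H(A, B), estimated from a joint intensity histogram.
import numpy as np

def nmi(img_a, img_b, bins=32):
    hist_2d, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    p_ab = hist_2d / hist_2d.sum()
    p_a = p_ab.sum(axis=1)
    p_b = p_ab.sum(axis=0)

    def entropy(p):
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    return (entropy(p_a) + entropy(p_b)) / entropy(p_ab.ravel())

rng = np.random.default_rng(0)
a = rng.random((64, 64))
print(nmi(a, a))                       # identical images give the maximum (2 here)
print(nmi(a, rng.random((64, 64))))    # independent images give values near 1
```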
Observations of the Geometry of Horizon-Based Optical Navigation
NASA Technical Reports Server (NTRS)
Christian, John; Robinson, Shane
2016-01-01
NASA's Orion Project has sparked a renewed interest in horizon-based optical navigation(OPNAV) techniques for spacecraft in the Earth-Moon system. Some approaches have begun to explore the geometry of horizon-based OPNAV and exploit the fact that it is a conic section problem. Therefore, the present paper focuses more deeply on understanding and leveraging the various geometric interpretations of horizon-based OPNAV. These results provide valuable insight into the fundamental workings of OPNAV solution methods, their convergence properties, and associated estimate covariance. Most importantly, the geometry and transformations uncovered in this paper lead to a simple and non-iterative solution to the generic horizon-based OPNAV problem. This represents a significant theoretical advancement over existing methods. Thus, we find that a clear understanding of geometric relationships is central to the prudent design, use, and operation of horizon-based OPNAV techniques.
Erickson, Brandon J.; Cvetanovich, Gregory L.; Frank, Rachel M.; Bach, Bernard R.; Cohen, Mark S.; Bush-Joseph, Charles A.; Cole, Brian J.; Romeo, Anthony A.
2016-01-01
Background: Ulnar collateral ligament reconstruction (UCLR) has become a common procedure performed in overhead-throwing athletes of many athletic levels. Purpose/Hypothesis: The purpose of this study was to determine whether clinical outcomes and return-to-sport (RTS) rates differ among patients undergoing UCLR based on graft choice, surgical technique, athletic competition level, handedness, and treatment of the ulnar nerve. We hypothesized that no differences would exist in clinical outcomes or RTS rates between technique, graft choice, or other variables. Study Design: Cohort study; Level of evidence, 3. Methods: All patients who underwent UCLR from January 1, 2004 through December 31, 2014 at a single institution were identified. Charts were reviewed to determine patient age, sex, date of surgery, sport played, handedness, athletic level, surgical technique, graft type, and complications. Patients were contacted via telephone to obtain the RTS rate, Conway-Jobe score, Timmerman-Andrews score, and Kerlan-Jobe Orthopaedic Clinic (KJOC) Shoulder and Elbow score. Results: Eighty-five patients (mean age at surgery, 19.3 ± 4.7 years; 92% male; 78% right hand–dominant) underwent UCLR between 2004 and 2014 and were available for follow-up. Overall, 87% were baseball pitchers, 49.4% were college athletes, and 41.2% were high school athletes. No significant difference existed between the docking and double-docking techniques, graft choice, handedness, sex, activity level, and treatment of the ulnar nerve with regard to clinical outcomes, RTS, or subsequent surgeries (all P > .05). More complications were seen in the docking technique compared with the double-docking technique (P = .036). Hamstring autograft was used more commonly with the docking technique (P = .023) while allograft was used more commonly with the double-docking technique (P = .0006). Conclusion: Both the docking and double-docking techniques produce excellent clinical outcomes in patients undergoing UCLR. No difference in outcome scores was seen between surgical technique or graft type. The double-docking technique had fewer complications than the docking technique. PMID:27896290
A simple low cost latent fingerprint sensor based on deflectometry and WFT analysis
NASA Astrophysics Data System (ADS)
Dhanotia, Jitendra; Chatterjee, Amit; Bhatia, Vimal; Prakash, Shashi
2018-02-01
In criminal investigations, latent fingerprints are one of the most significant forms of evidence and among the most commonly used forensic investigation tools worldwide. Existing non-contact latent fingerprint detection systems are bulky, expensive and require an environment which is shock and vibration resistant, thereby limiting their usability outside the laboratory. In this article, a compact, full field, low cost technique for profiling of fingerprints using deflectometry is proposed. Using inexpensive mobile phone screen based structured illumination, and a windowed Fourier transform (WFT) based phase retrieval mechanism, the 2D and 3D phase plots reconstruct the profile information of the fingerprint. The phase information is also used to confirm a match between two fingerprints in real time. Since the proposed technique is non-interferometric, the measurements are least affected by environmental perturbations. Using the proposed technique, a portable sensor capable of field deployment has been realized.
Validation of Regression-Based Myogenic Correction Techniques for Scalp and Source-Localized EEG
McMenamin, Brenton W.; Shackman, Alexander J.; Maxwell, Jeffrey S.; Greischar, Lawrence L.; Davidson, Richard J.
2008-01-01
EEG and EEG source-estimation are susceptible to electromyographic artifacts (EMG) generated by the cranial muscles. EMG can mask genuine effects or masquerade as a legitimate effect - even in low frequencies, such as alpha (8–13Hz). Although regression-based correction has been used previously, only cursory attempts at validation exist and the utility for source-localized data is unknown. To address this, EEG was recorded from 17 participants while neurogenic and myogenic activity were factorially varied. We assessed the sensitivity and specificity of four regression-based techniques: between-subjects, between-subjects using difference-scores, within-subjects condition-wise, and within-subject epoch-wise on the scalp and in data modeled using the LORETA algorithm. Although within-subject epoch-wise showed superior performance on the scalp, no technique succeeded in the source-space. Aside from validating the novel epoch-wise methods on the scalp, we highlight methods requiring further development. PMID:19298626
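A simplified sketch of generic regression-based artifact correction (fit each EEG channel on an EMG reference and subtract the fitted part); it is not any one of the four specific variants compared in the paper, and the synthetic signals are placeholders.

```python
# Generic regression-based correction sketch: remove the part of each EEG channel
# that is linearly predictable from an EMG reference channel.  This is a simplified
# illustration, not one of the four specific variants compared in the paper.
import numpy as np

def regress_out(eeg, emg):
    """eeg: (n_samples, n_channels); emg: (n_samples,) reference; returns corrected EEG."""
    X = np.column_stack([np.ones_like(emg), emg])    # intercept + EMG regressor
    beta, *_ = np.linalg.lstsq(X, eeg, rcond=None)   # per-channel least squares
    return eeg - X @ beta                            # subtract the EMG-explained part

rng = np.random.default_rng(0)
n = 5000
emg = rng.normal(size=n)
true_eeg = np.sin(np.linspace(0, 40 * np.pi, n))[:, None]   # one clean channel
contaminated = true_eeg + 0.8 * emg[:, None]
cleaned = regress_out(contaminated, emg)
print(np.corrcoef(cleaned[:, 0], true_eeg[:, 0])[0, 1])     # close to 1
```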
An ANN-Based Smart Tomographic Reconstructor in a Dynamic Environment
de Cos Juez, Francisco J.; Lasheras, Fernando Sánchez; Roqueñí, Nieves; Osborn, James
2012-01-01
In astronomy, the light emitted by an object travels through the vacuum of space and then the turbulent atmosphere before arriving at a ground based telescope. By passing through the atmosphere a series of turbulent layers modify the light's wave-front in such a way that Adaptive Optics reconstruction techniques are needed to improve the image quality. A novel reconstruction technique based on Artificial Neural Networks (ANN) is proposed. The network is designed to use the local tilts of the wave-front measured by a Shack Hartmann Wave-front Sensor (SHWFS) as inputs and estimate the turbulence in terms of Zernike coefficients. The ANN used is a Multi-Layer Perceptron (MLP) trained with simulated data with one turbulent layer changing in altitude. The reconstructor was tested using three different atmospheric profiles and compared with two existing reconstruction techniques: Least Squares type Matrix Vector Multiplication (LS) and Learn and Apply (L + A). PMID:23012524
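A hedged scikit-learn sketch of the reconstructor idea: an MLP mapping Shack-Hartmann slope measurements to Zernike coefficients. The random linear forward model used to simulate training data and all sizes are placeholders, not the authors' simulation.

```python
# Sketch of the ANN reconstructor idea: a multi-layer perceptron mapping Shack-Hartmann
# slope measurements to Zernike coefficients.  The random linear "forward model" used to
# generate training data is a placeholder, not a real wave-front sensor model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
n_slopes, n_zernike = 32, 10                      # 16 sub-apertures x 2 slopes, 10 modes
G = rng.normal(size=(n_slopes, n_zernike))        # placeholder slopes-per-mode matrix

def simulate(n):
    z = rng.normal(size=(n, n_zernike))                    # random turbulence modes
    s = z @ G.T + 0.01 * rng.normal(size=(n, n_slopes))    # noisy slope measurements
    return s, z

s_train, z_train = simulate(5000)
net = MLPRegressor(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
net.fit(s_train, z_train)

s_test, z_test = simulate(200)
print(net.score(s_test, z_test))                  # high R^2 on this toy linear problem
```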
Miao, Qing; Zhang, Mingming; Wang, Congzhe; Li, Hongsheng
2018-01-01
This review aims to compare existing robot-assisted ankle rehabilitation techniques in terms of robot design. Included studies mainly consist of selected papers in two published reviews involving a variety of robot-assisted ankle rehabilitation techniques. A free search was also made in Google Scholar and Scopus by using the keywords "ankle∗," "robot∗," and ("rehabilitat∗" or "treat∗"). The search is limited to English-language articles published between January 1980 and September 2016. Results show that existing robot-assisted ankle rehabilitation techniques can be classified into wearable exoskeleton and platform-based devices. Platform-based devices are mostly developed for the treatment of a variety of ankle musculoskeletal and neurological injuries, while wearable ones focus more on ankle-related gait training. In terms of robot design, comparative analysis indicates that an ideal ankle rehabilitation robot should have a rotation center aligned with the ankle joint, an appropriate workspace, and adequate actuation torque, no matter how many degrees of freedom (DOFs) it has. Single-DOF ankle robots are mostly developed for specific applications, while multi-DOF devices are more suitable for comprehensive ankle rehabilitation exercises. Other factors including posture adjustability and sensing functions should also be considered to promote related clinical applications. An ankle rehabilitation robot with reconfigurability to maximize its functions will be a new research point towards optimal design, especially on parallel mechanisms.
NASA Astrophysics Data System (ADS)
Moon, H.; Kim, C.; Lee, W.
2016-06-01
Regarding spatial location positioning, indoor positioning techniques based on wireless communication technologies such as Wi-Fi, beacon, UWB and Bluetooth have been widely developed across the world. These techniques mainly focus on detecting the spatial location of customers using fixed wireless APs and unique tags in the indoor environment. In addition, existing detection equipment and techniques that use ultrasound or sound to detect buried persons and identify their survival status can cause secondary damage to the collapsed debris and endanger rescuers, and checking for buried persons can take time. Collapsed disaster sites should be treated as both outdoor and indoor environments, because empty spaces exist under the collapsed debris. In order to detect buried persons in these empty spaces, we collect Wi-Fi signals from their mobile phones. The Wi-Fi signal basically provides a 2-D location. However, since buried persons also have a Z value corresponding to burial depth, we additionally collect barometer sensor data from their mobile phones in order to measure Z values under varying weather conditions. In particular, for quick access to the disaster area, a drone (UAV; Unmanned Aerial Vehicle) system equipped with a wireless detection module was introduced. Using this framework, this study aims to provide rescuers with effective rescue information by calculating the 3-D location of buried persons based on wireless and barometer sensor fusion.
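A small sketch of the barometric part of the idea: converting a phone barometer reading to a height offset relative to a surface reference pressure using the standard hypsometric formula. The reference-station setup and all numbers are illustrative assumptions, not the study's procedure.

```python
# Sketch of estimating a burial-depth (Z) offset from a phone barometer reading using
# the hypsometric formula; p_ref would come from a surface reference station so that
# weather-driven pressure changes cancel.  Values below are illustrative only.
import math

def pressure_altitude_m(p_hpa, p_ref_hpa, t_ref_c=15.0):
    """Height of the sensor relative to the reference pressure level (negative = below)."""
    t_ref_k = t_ref_c + 273.15
    L = 0.0065        # K/m, temperature lapse rate
    g = 9.80665       # m/s^2
    R = 287.053       # J/(kg K), specific gas constant of dry air
    return (t_ref_k / L) * (1.0 - (p_hpa / p_ref_hpa) ** (R * L / g))

# A phone under debris reads slightly higher pressure than the surface reference:
print(pressure_altitude_m(1014.2, 1013.6))   # roughly -5 m below the reference level
```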
Multipath Routing of Fragmented Data Transfer in a Smart Grid Environment
NASA Astrophysics Data System (ADS)
Borgohain, Tuhin; Borgohain, Amardeep; Borgohain, Rajdeep; Sanyal, Sugata
2015-02-01
The purpose of this paper is to present a general survey of the existing communication modes inside a smart grid, the existing security loopholes, and their countermeasures. We then suggest a detailed countermeasure, building upon the Jigsaw based secure data transfer [8], for enhanced security of the data flow inside the communication system of a smart grid. The paper has been written without considering any issues of interoperability between the various security techniques inside a smart grid.
Non-Conventional Techniques for the Study of Phase Transitions in NiTi-Based Alloys
NASA Astrophysics Data System (ADS)
Nespoli, Adelaide; Villa, Elena; Passaretti, Francesca; Albertini, Franca; Cabassi, Riccardo; Pasquale, Massimo; Sasso, Carlo Paolo; Coïsson, Marco
2014-07-01
Differential scanning calorimetry and electrical resistance measurements are the two most common techniques for studying the phase transition path and temperatures of shape memory alloys (SMA) in the stress-free condition. Besides, it is well known that internal friction measurements are also useful for this purpose. There are indeed some further techniques which are seldom used for the basic characterization of SMA transitions: dilatometric analysis, magnetic measurements, and Seebeck coefficient study. In this work, we discuss the suitability of these techniques for the study of NiTi-based phase transitions. Measurements were conducted on several Ni50-xTi50Cux samples ranging from 3 to 10 at.% in Cu content, fully annealed at 850 °C for 1 h in vacuum and quenched in water at room temperature. Results show that all these techniques are sensitive to the phase transition, and they provide significant information about the existence of intermediate phases.
Multiscale corner detection and classification using local properties and semantic patterns
NASA Astrophysics Data System (ADS)
Gallo, Giovanni; Giuoco, Alessandro L.
2002-05-01
A new technique to detect, localize and classify corners in digital closed curves is proposed. The technique is based on correct estimation of the support region for each point. We compute multiscale curvature to detect and localize corners. As a further step, with the aid of some local features, it is possible to classify corners into seven distinct types. Classification is performed using a set of rules which describe corners according to preset semantic patterns. Compared with existing techniques, the proposed approach belongs to the family of algorithms that try to explain the curve, instead of simply labeling it.
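A hedged sketch of curvature-based corner detection on a closed digital curve at a single smoothing scale; the paper's support-region estimation and seven-type semantic classification rules are not reproduced, and the threshold and test curve are arbitrary.

```python
# Sketch of curvature-based corner detection on a closed digital curve: smooth the
# coordinates at a given scale, compute curvature, and keep local curvature maxima.
# The multiscale support-region estimation and the semantic classification rules of
# the paper are not reproduced here.
import numpy as np
from scipy.ndimage import gaussian_filter1d

def corner_candidates(x, y, sigma=3.0, thresh=0.05):
    # Periodic smoothing because the curve is closed.
    xs = gaussian_filter1d(x.astype(float), sigma, mode="wrap")
    ys = gaussian_filter1d(y.astype(float), sigma, mode="wrap")
    dx, dy = np.gradient(xs), np.gradient(ys)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    kappa = (dx * ddy - dy * ddx) / (dx**2 + dy**2) ** 1.5
    k = np.abs(kappa)
    # local maxima above a curvature threshold
    is_max = (k > np.roll(k, 1)) & (k >= np.roll(k, -1)) & (k > thresh)
    return np.flatnonzero(is_max)

# Test curve: a unit square traced point by point (corners at indices 0, 50, 100, 150).
side = np.linspace(0, 1, 50, endpoint=False)
square_x = np.concatenate([side, np.ones(50), 1 - side, np.zeros(50)])
square_y = np.concatenate([np.zeros(50), side, np.ones(50), 1 - side])
print(corner_candidates(square_x, square_y))   # indices cluster near the four corners
```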
ECG-derived respiration based on iterated Hilbert transform and Hilbert vibration decomposition.
Sharma, Hemant; Sharma, K K
2018-06-01
Monitoring of respiration using the electrocardiogram (ECG) is desirable for the simultaneous study of cardiac activity and respiration in terms of comfort, mobility, and cost of the healthcare system. This paper proposes a new approach for deriving the respiration from single-lead ECG based on the iterated Hilbert transform (IHT) and the Hilbert vibration decomposition (HVD). The ECG signal is first decomposed into multicomponent sinusoidal signals using the IHT technique. Afterward, the lower order amplitude components obtained from the IHT are filtered using the HVD to extract the respiration information. Experiments are performed on the Fantasia and Apnea-ECG datasets. The performance of the proposed ECG-derived respiration (EDR) approach is compared with existing techniques including principal component analysis (PCA), R-peak amplitudes (RPA), respiratory sinus arrhythmia (RSA), slopes of the QRS complex, and the R-wave angle. The proposed technique showed higher median correlation values (first and third quartiles) of 0.699 (0.55, 0.82) and 0.57 (0.40, 0.73) for the Fantasia and Apnea-ECG datasets, respectively. The proposed algorithm also provided the lowest mean absolute error and average percentage error computed from the EDR and reference (recorded) respiration signals for the Fantasia and Apnea-ECG datasets: 1.27 and 9.3%, and 1.35 and 10.2%, respectively. In the experiments performed over different age groups of the Fantasia dataset, the proposed algorithm provided effective results in the younger population and outperformed the existing techniques in the case of elderly subjects. The proposed EDR technique has advantages over existing techniques in terms of better agreement in the respiratory rates and, specifically, it removes the extra step of detecting fiducial points in the ECG for the estimation of respiration, which makes the process effective and less complex. The above performance results, obtained from two different datasets, validate that the proposed approach can be used for monitoring of respiration using single-lead ECG.
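A simplified illustration of extracting respiration from the amplitude modulation of an ECG-like signal using a Hilbert-transform envelope and a respiratory-band filter; this conveys only the general EDR idea, not the paper's IHT + HVD pipeline, and the synthetic carrier is a stand-in for real ECG.

```python
# Simplified illustration of ECG-derived respiration via amplitude modulation: take the
# Hilbert-transform envelope and band-pass it in the respiratory band.  This shows only
# the general idea; the paper's actual IHT + HVD decomposition is more involved.
import numpy as np
from scipy.signal import hilbert, butter, filtfilt

fs = 250.0                                   # Hz, assumed sampling rate
t = np.arange(0, 60, 1 / fs)
resp = 0.2 * np.sin(2 * np.pi * 0.25 * t)    # 15 breaths/min "true" respiration
ecg_like = (1 + resp) * np.cos(2 * np.pi * 10.0 * t)   # toy AM carrier standing in for ECG

envelope = np.abs(hilbert(ecg_like))         # recovers the amplitude modulation 1 + resp
b, a = butter(2, [0.1, 0.5], btype="bandpass", fs=fs)
edr = filtfilt(b, a, envelope)               # ECG-derived respiration estimate

print(np.corrcoef(edr, resp)[0, 1])          # close to 1 for this toy signal
```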
Prediction of drug synergy in cancer using ensemble-based machine learning techniques
NASA Astrophysics Data System (ADS)
Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder
2018-04-01
Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents. It can be developed as a pre-processing tool for therapeutic successes. Examination of different drug-drug interactions can be done via the drug synergy score. This requires efficient regression-based machine learning approaches to minimize prediction errors. Numerous machine learning techniques such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to meet this requirement. However, these techniques individually do not provide significant accuracy in the drug synergy score. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques have been implemented on the drug synergy data. Based on the accuracy of each model, four techniques with high accuracy are selected to develop the ensemble-based machine learning model. These models are Random Forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning (GFS.GCCL), Adaptive-Network-Based Fuzzy Inference System (ANFIS) and Dynamic Evolving Neural-Fuzzy Inference System (DENFIS). Ensembling is achieved by evaluating a biased weighted aggregation (i.e. giving more weight to the model with a higher prediction score) of the data predicted by the selected models. The proposed and existing machine learning techniques have been evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms the others in terms of accuracy, root mean square error and coefficient of correlation.
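A hedged sketch of accuracy-weighted (biased) ensembling of regressors; generic scikit-learn models stand in for the paper's Random Forest, GFS.GCCL, ANFIS and DENFIS members, and the synthetic dataset is a placeholder for the drug synergy scores.

```python
# Sketch of the biased (accuracy-weighted) ensembling idea: weight each base regressor's
# prediction by its validation score.  Generic sklearn models stand in for the paper's
# Random Forest / GFS.GCCL / ANFIS / DENFIS members; the data are synthetic placeholders.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Ridge
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=600, n_features=20, noise=10.0, random_state=0)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

models = [RandomForestRegressor(random_state=0),
          Ridge(alpha=1.0),
          MLPRegressor(hidden_layer_sizes=(50,), max_iter=1000, random_state=0)]

scores = []
for m in models:
    m.fit(X_tr, y_tr)
    scores.append(max(m.score(X_val, y_val), 0.0))     # validation R^2, floored at 0

weights = np.array(scores) / np.sum(scores)             # more weight to better models
ensemble_pred = sum(w * m.predict(X_val) for w, m in zip(weights, models))
print(dict(zip(["RF", "Ridge", "MLP"], np.round(weights, 3))))
```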
USDA-ARS?s Scientific Manuscript database
Streambank stabilization techniques are often implemented to reduce sediment loads from unstable streambanks. Process-based models can predict sediment yields with stabilization scenarios prior to implementation. However, a framework does not exist on how to effectively utilize these models to evalu...
DOT National Transportation Integrated Search
2000-01-01
The ability to visualize data has grown immensely as the speed and functionality of Geographic Information Systems (GIS) have increased. Now, with modeling software and GIS, planners are able to view a prediction of the future traffic demands in thei...
Through-wall image enhancement using fuzzy and QR decomposition.
Riaz, Muhammad Mohsin; Ghafoor, Abdul
2014-01-01
A QR decomposition and fuzzy logic based scheme is proposed for through-wall image enhancement. QR decomposition is less complex compared to singular value decomposition. The fuzzy inference engine assigns weights to different overlapping subspaces. Quantitative measures and visual inspection are used to analyze the existing and proposed techniques.
Zeiler, Frederick A; Donnelly, Joseph; Calviello, Leanne; Menon, David K; Smielewski, Peter; Czosnyka, Marek
2017-12-01
The purpose of this study was to perform a systematic, scoping review of commonly described intermittent/semi-intermittent autoregulation measurement techniques in adult traumatic brain injury (TBI). Nine separate systematic reviews were conducted for each intermittent technique: computed tomographic perfusion (CTP)/Xenon-CT (Xe-CT), positron emission tomography (PET), magnetic resonance imaging (MRI), arteriovenous difference in oxygen (AVDO2) technique, thigh cuff deflation technique (TCDT), transient hyperemic response test (THRT), orthostatic hypotension test (OHT), mean flow index (Mx), and transfer function autoregulation index (TF-ARI). MEDLINE®, BIOSIS, EMBASE, Global Health, Scopus, Cochrane Library (inception to December 2016), and reference lists of relevant articles were searched. A two tier filter of references was conducted. The total number of articles utilizing each of the nine searched techniques for intermittent/semi-intermittent autoregulation techniques in adult TBI were: CTP/Xe-CT (10), PET (6), MRI (0), AVDO2 (10), ARI-based TCDT (9), THRT (6), OHT (3), Mx (17), and TF-ARI (6). The premise behind all of the intermittent techniques is manipulation of systemic blood pressure/blood volume via either chemical (such as vasopressors) or mechanical (such as thigh cuffs or carotid compression) means. Exceptionally, Mx and TF-ARI are based on spontaneous fluctuations of cerebral perfusion pressure (CPP) or mean arterial pressure (MAP). The method for assessing the cerebral circulation during these manipulations varies, with both imaging-based techniques and TCD utilized. Despite the limited literature for intermittent/semi-intermittent techniques in adult TBI (minus Mx), it is important to acknowledge the availability of such tests. They have provided fundamental insight into human autoregulatory capacity, leading to the development of continuous and more commonly applied techniques in the intensive care unit (ICU). Numerous methods of intermittent/semi-intermittent pressure autoregulation assessment in adult TBI exist, including: CTP/Xe-CT, PET, AVDO2 technique, TCDT-based ARI, THRT, OHT, Mx, and TF-ARI. MRI-based techniques in adult TBI are yet to be described, with the main focus of MRI techniques on metabolic-based cerebrovascular reactivity (CVR) and not pressure-based autoregulation.
Atmospheric observations for STS-1 landing
NASA Technical Reports Server (NTRS)
Turner, R. E.; Arnold, J. E.; Wilson, G. S.
1981-01-01
A summary of synoptic weather conditions existing over the western United States is given for the time of shuttle descent into Edwards Air Force Base, California. The techniques and methods used to furnish synoptic atmospheric data at the surface and aloft for flight verification of the STS-1 orbiter during its descent into Edwards Air Force Base are specified. Examples of the upper level data set are given.
MEMS-based platforms for mechanical manipulation and characterization of cells
NASA Astrophysics Data System (ADS)
Pan, Peng; Wang, Wenhui; Ru, Changhai; Sun, Yu; Liu, Xinyu
2017-12-01
Mechanical manipulation and characterization of single cells are important experimental techniques in biological and medical research. Because of the microscale sizes and highly fragile structures of cells, conventional cell manipulation and characterization techniques are not accurate and/or efficient enough or even cannot meet the more and more demanding needs in different types of cell-based studies. To this end, novel microelectromechanical systems (MEMS)-based technologies have been developed to improve the accuracy, efficiency, and consistency of various cell manipulation and characterization tasks, and enable new types of cell research. This article summarizes existing MEMS-based platforms developed for cell mechanical manipulation and characterization, highlights their specific design considerations making them suitable for their designated tasks, and discuss their advantages and limitations. In closing, an outlook into future trends is also provided.
NASA Astrophysics Data System (ADS)
Salcedo-Sanz, S.
2016-10-01
Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on modeling nonlinear physics processes have been proposed and applied with success. Nonlinear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from specific modeling of a real phenomenon, and also their novelty in terms of comparison with alternative existing algorithms for optimization. We first review important concepts on optimization problems, search spaces and problem difficulty. Then, the usefulness of heuristic and meta-heuristic approaches in facing hard optimization problems is introduced, and some of the main existing classical versions of these algorithms are reviewed. The mathematical framework of different nonlinear physics processes is then introduced as a preparatory step to review in detail the most important meta-heuristics based on them. A discussion of the novelty of these approaches, their main computational implementation and design issues, and the evaluation of a novel meta-heuristic based on Strange Attractors mutation is carried out to complete the review of these techniques. We also describe some of the most important application areas, in a broad sense, of meta-heuristics, and describe freely accessible software frameworks which can be used to ease the implementation of these algorithms.
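To make the class of methods concrete, here is a minimal sketch of one classical physics-inspired meta-heuristic, simulated annealing (modeling the slow cooling of a physical system), minimizing the 2-D Rastrigin function; it illustrates the family in general and is not an algorithm taken from the review.

```python
# Concrete example of a physics-inspired meta-heuristic: simulated annealing, which
# mimics the slow cooling of a physical system.  Shown only to illustrate the class of
# methods the review covers, here minimizing the 2-D Rastrigin test function.
import math
import random

def rastrigin(x, y):
    return 20 + x * x + y * y - 10 * (math.cos(2 * math.pi * x) + math.cos(2 * math.pi * y))

random.seed(0)
x, y = random.uniform(-5, 5), random.uniform(-5, 5)
best = (x, y, rastrigin(x, y))
T = 5.0
for step in range(20000):
    T *= 0.9995                                      # geometric cooling schedule
    nx, ny = x + random.gauss(0, 0.5), y + random.gauss(0, 0.5)
    delta = rastrigin(nx, ny) - rastrigin(x, y)
    if delta < 0 or random.random() < math.exp(-delta / T):   # Metropolis acceptance
        x, y = nx, ny
        if rastrigin(x, y) < best[2]:
            best = (x, y, rastrigin(x, y))
print(best)    # the global minimum of Rastrigin is at (0, 0) with value 0
```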
Simultaneous computation of jet turbulence and noise
NASA Technical Reports Server (NTRS)
Berman, C. H.; Ramos, J. I.
1989-01-01
The existing flow computation methods, wave computation techniques, and theories based on noise source models are reviewed in order to assess the capabilities of numerical techniques to compute jet turbulence noise and understand the physical mechanisms governing it over a range of subsonic and supersonic nozzle exit conditions. In particular, attention is given to (1) methods for extrapolating near field information, obtained from flow computations, to the acoustic far field and (2) the numerical solution of the time-dependent Lilley equation.
Extending radiative transfer models by use of Bayes rule. [in atmospheric science
NASA Technical Reports Server (NTRS)
Whitney, C.
1977-01-01
This paper presents a procedure that extends some existing radiative transfer modeling techniques to problems in atmospheric science where curvature and layering of the medium and dynamic range and angular resolution of the signal are important. Example problems include twilight and limb scan simulations. Techniques that are extended include successive orders of scattering, matrix operator, doubling, Gauss-Seidel iteration, discrete ordinates and spherical harmonics. The procedure for extending them is based on Bayes' rule from probability theory.
NASA Astrophysics Data System (ADS)
Doležel, Jiří; Novák, Drahomír; Petrů, Jan
2017-09-01
Transportation routes for oversize and excessive loads are currently planned so as to ensure the transit of the vehicle through critical points on the road. Critical points are level intersections of roads, bridges, etc. This article presents a comprehensive procedure to determine the reliability and load-bearing capacity level of existing bridges on highways and roads using advanced methods of reliability analysis based on Monte Carlo type simulation techniques in combination with nonlinear finite element method analysis. The safety index is considered as the main criterion of the reliability level of existing structures, and the index is described in current structural design standards, e.g. ISO and Eurocode. An example is given of a single-span slab bridge made of precast prestressed concrete girders, currently 60 years old, whose load-bearing capacity is determined for the ultimate limit state and the serviceability limit state. The structure's design load capacity was estimated by a fully probabilistic nonlinear FEM analysis using the Latin Hypercube Sampling (LHS) simulation technique. Load-bearing capacity values based on the fully probabilistic analysis are compared with the load-bearing capacity levels estimated by deterministic methods for a critical section of the most loaded girders.
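A minimal sketch of Latin Hypercube Sampling as used to drive such probabilistic analyses: one stratified sample per interval in each dimension, randomly paired across dimensions. The three random variables and their ranges are illustrative assumptions, not the bridge study's inputs.

```python
# Minimal Latin Hypercube Sampling sketch: one stratified sample per interval in each
# dimension, with the strata randomly paired across dimensions.  The three "random
# variables" and their ranges below are illustrative, not the bridge study's inputs.
import numpy as np

def latin_hypercube(n_samples, bounds, seed=None):
    rng = np.random.default_rng(seed)
    d = len(bounds)
    u = np.empty((n_samples, d))
    for j in range(d):
        strata = rng.permutation(n_samples)             # which stratum each sample uses
        u[:, j] = (strata + rng.random(n_samples)) / n_samples
    lo, hi = np.array(bounds).T
    return lo + u * (hi - lo)

# e.g. concrete strength (MPa), prestress loss (%), traffic load factor (-)
samples = latin_hypercube(100, [(30, 50), (5, 20), (0.8, 1.2)], seed=0)
print(samples.shape, samples.min(axis=0), samples.max(axis=0))
```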
Research Using Government Data Sets: An Underutilised Resource
ERIC Educational Resources Information Center
Knipe, Sally
2011-01-01
The use of existing data for education research activities can be a valuable resource. Improvement in statistical analysis and data management and retrieval techniques, as well as access to government data bases, has expanded opportunities for researchers seeking to investigate issues that are institutional in nature, such as participation…
Development and Application of a Model of Fallout Shelter Stay Times.
1978-12-29
post-attack environment is a disaster, and that human response to a nuclear disaster is an extrapolation of human response to natural disasters...Soviet reaction to a nuclear disaster. This technique is not limited to fallout shelter studies. If an appropriate data base exists, subjects such as
A Theory of Term Importance in Automatic Text Analysis.
ERIC Educational Resources Information Center
Salton, G.; And Others
Most existing automatic content analysis and indexing techniques are based on word frequency characteristics applied largely in an ad hoc manner. Contradictory requirements arise in this connection, in that terms exhibiting high occurrence frequencies in individual documents are often useful for high recall performance (to retrieve many relevant…
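A small sketch of frequency-based term weighting in the spirit described above (terms frequent within a document but rare across the collection receive high weights, i.e. tf-idf); this is a standard formulation, not necessarily the exact weighting the report derives.

```python
# Sketch of frequency-based term weighting: terms frequent within a document but rare
# across the collection get high weights (tf-idf).  A standard formulation for
# illustration, not necessarily the report's own weighting.
import math
from collections import Counter

docs = ["retrieval of relevant documents",
        "automatic text analysis and indexing",
        "automatic indexing weights terms by frequency"]

tokenized = [d.split() for d in docs]
df = Counter(term for doc in tokenized for term in set(doc))   # document frequency
N = len(docs)

def tfidf(doc_tokens):
    tf = Counter(doc_tokens)
    return {t: tf[t] * math.log(N / df[t]) for t in tf}

for d in tokenized:
    print(tfidf(d))   # shared terms like "automatic" get lower weight than rare ones
```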
Equal Employment Legislation: Alternative Means of Compliance.
ERIC Educational Resources Information Center
Daum, Jeffrey W.
Alternative means of compliance available to organizations to bring their manpower uses into line with existing equal employment legislation are discussed in this paper. The first area addressed concerns the classical approach to selection and placement based on testing methods. The second area discussed reviews various nontesting techniques, such…
Lance A. Vickers; Thomas R. Fox; David L. Loftis; David A. Boucugnani
2013-01-01
The difficulty of achieving reliable oak (Quercus spp.) regeneration is well documented. Application of silvicultural techniques to facilitate oak regeneration largely depends on current regeneration potential. A computer model to assess regeneration potential based on existing advanced reproduction in Appalachian hardwoods was developed by David...
Web image retrieval using an effective topic and content-based technique
NASA Astrophysics Data System (ADS)
Lee, Ching-Cheng; Prabhakara, Rashmi
2005-03-01
There has been an exponential growth in the amount of image data available on the World Wide Web since the early development of the Internet. With such a large amount of information and images available, and given their usefulness, an effective image retrieval system is greatly needed. In this paper, we present an effective approach with both image matching and indexing techniques that improves on existing integrated image retrieval methods. This technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. In the first phase, topic-based image retrieval is performed by using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. This technique consists of a focused crawler that allows the user to enter not only the keyword for the topic-based search but also the scope in which the user wants to find the images. In the second phase, we use query-by-example specification to perform a low-level content-based image match in order to retrieve smaller and relatively closer results for the example image. From this, information related to the image features is automatically extracted from the query image. The main objective of our approach is to develop a functional image search and indexing technique and to demonstrate that better retrieval results can be achieved.
Flood Detection/Monitoring Using Adjustable Histogram Equalization Technique
Riaz, Muhammad Mohsin; Ghafoor, Abdul
2014-01-01
A flood monitoring technique using adjustable histogram equalization is proposed. The technique overcomes the limitations (overenhancement, artifacts, and unnatural look) of the existing technique by adjusting the contrast of images. The proposed technique takes pre- and post-event images and applies different processing steps to generate a flood map without user interaction. The resultant flood maps can be used for flood monitoring and detection. Simulation results show that the proposed technique provides better output quality compared to the state-of-the-art existing technique. PMID:24558332
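A hedged sketch of histogram equalization with an adjustable clip limit, which tempers the over-enhancement mentioned above; it illustrates the general idea only and is not the paper's exact adjustment or flood-mapping pipeline.

```python
# Sketch of "adjustable" histogram equalization on an 8-bit image: the histogram is
# clipped at an adjustable limit before building the equalization look-up table, which
# tempers over-enhancement.  This illustrates the idea only, not the paper's exact method.
import numpy as np

def adjustable_hist_eq(img, clip_fraction=0.01):
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    clip = clip_fraction * img.size
    excess = np.sum(np.maximum(hist - clip, 0))
    hist = np.minimum(hist, clip) + excess / 256.0       # clip and redistribute
    cdf = np.cumsum(hist)
    lut = np.round(255.0 * (cdf - cdf[0]) / (cdf[-1] - cdf[0])).astype(np.uint8)
    return lut[img]

rng = np.random.default_rng(0)
pre = rng.integers(60, 120, size=(128, 128), dtype=np.uint8)   # low-contrast "pre" image
post = adjustable_hist_eq(pre, clip_fraction=0.005)
print(pre.std(), post.std())   # contrast increases, but less than plain equalization would give
```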
Li, Hongfei; Jiang, Haijun; Hu, Cheng
2016-03-01
In this paper, we investigate a class of memristor-based BAM neural networks with time-varying delays. Under the framework of Filippov solutions, boundedness and ultimate boundedness of solutions of memristor-based BAM neural networks are guaranteed by the chain rule and inequality techniques. Moreover, a new method involving a Yoshizawa-like theorem is favorably employed to establish the existence of a periodic solution. By applying the theory of set-valued maps and functional differential inclusions, an available Lyapunov functional and some new testable algebraic criteria are derived for ensuring the uniqueness and global exponential stability of the periodic solution of memristor-based BAM neural networks. The obtained results expand and complement some previous work on memristor-based BAM neural networks. Finally, a numerical example is provided to show the applicability and effectiveness of our theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.
Multidirectional four-dimensional shape measurement system
NASA Astrophysics Data System (ADS)
Lenar, Janusz; Sitnik, Robert; Witkowski, Marcin
2012-03-01
Currently, many different scanning techniques are used for 3D imaging of the human body. Most existing systems are based on static registration of internal structures using MRI or CT techniques, as well as 3D scanning of the outer surface of the human body by laser triangulation or structured light methods. On the other hand, there is an existing mature 4D method based on tracking the positions of retro-reflective markers attached to the human body over time. There are two main drawbacks of this solution: markers are attached to the skin (no real skeleton movement is registered) and it gives (x, y, z, t) coordinates only at those points (not for the whole surface). In this paper we present a novel multidirectional structured light measurement system that is capable of measuring the 3D shape of the human body surface at frequencies reaching 60 Hz. The developed system consists of two spectrally separated and hardware-synchronized 4D measurement heads. The principle of the measurement is based on single frame analysis. The projected frame is composed of a sine-modulated intensity pattern and a special stripe allowing absolute phase measurement. Several different geometrical set-ups will be proposed depending on the type of movements that are to be registered.
Non-destructive evaluation of laboratory scale hydraulic fracturing using acoustic emission
NASA Astrophysics Data System (ADS)
Hampton, Jesse Clay
The primary objective of this research is to develop techniques to characterize hydraulic fractures and fracturing processes using acoustic emission monitoring based on laboratory scale hydraulic fracturing experiments. Individual microcrack AE source characterization is performed to understand the failure mechanisms associated with small failures along pre-existing discontinuities and grain boundaries. Individual microcrack analysis methods include moment tensor inversion techniques to elucidate the mode of failure, crack slip and crack normal direction vectors, and relative volumetric deformation of an individual microcrack. Differentiation between individual microcrack analysis and AE cloud based techniques is studied in efforts to refine discrete fracture network (DFN) creation and regional damage quantification of densely fractured media. Regional damage estimations from combinations of individual microcrack analyses and AE cloud density plotting are used to investigate the usefulness of weighting cloud based AE analysis techniques with microcrack source data. Two granite types were used in several sample configurations including multi-block systems. Laboratory hydraulic fracturing was performed with sample sizes ranging from 15 x 15 x 25 cm3 to 30 x 30 x 25 cm3 in both unconfined and true-triaxially confined stress states using different types of materials. Hydraulic fracture testing in rock block systems containing a large natural fracture was investigated in terms of AE response throughout fracture interactions. Investigations of differing scale analyses showed the usefulness of individual microcrack characterization as well as DFN and cloud based techniques. Individual microcrack characterization weighting cloud based techniques correlated well with post-test damage evaluations.
Principles and techniques in the design of ADMS+. [advanced data-base management system
NASA Technical Reports Server (NTRS)
Roussopoulos, Nick; Kang, Hyunchul
1986-01-01
ADMS+/- is an advanced data base management system whose architecture integrates the ADMS+ mainframe data base system with a large number of work station data base systems, designated ADMS-; no communications exist between these work stations. The use of this system radically decreases the response time of locally processed queries, since the work station runs in a single-user mode, and no dynamic security checking is required for the downloaded portion of the data base. The deferred update strategy used reduces overhead due to update synchronization in message traffic.
NASA Astrophysics Data System (ADS)
Sternberg, Oren; Bednarski, Valerie R.; Perez, Israel; Wheeland, Sara; Rockway, John D.
2016-09-01
Non-invasive optical techniques pertaining to the remote sensing of power quality disturbances (PQD) are part of an emerging technology field typically dominated by radio frequency (RF) and invasive-based techniques. Algorithms and methods to analyze and address PQD such as probabilistic neural networks and fully informed particle swarms have been explored in industry and academia. Such methods are tuned to work with RF equipment and electronics in existing power grids. As both commercial and defense assets are heavily power-dependent, understanding electrical transients and failure events using non-invasive detection techniques is crucial. In this paper we correlate power quality empirical models to the observed optical response. We also empirically demonstrate a first-order approach to map household, office and commercial equipment PQD to user functions and stress levels. We employ a physics-based image and signal processing approach, which demonstrates measured non-invasive (remote sensing) techniques to detect and map the base frequency associated with the power source to the various PQD on a calibrated source.
Nonlinear secret image sharing scheme.
Shin, Sang-Ho; Lee, Gil-Je; Yoo, Kee-Young
2014-01-01
Over the past decade, most secret image sharing schemes have been proposed using Shamir's technique, which is based on linear combination polynomial arithmetic. Although secret image sharing schemes based on Shamir's technique are efficient and scalable for various environments, there exists a security threat such as the Tompa-Woll attack. Renvall and Ding proposed a new secret sharing technique based on nonlinear combination polynomial arithmetic in order to address this threat, but it is hard to apply to secret image sharing. In this paper, we propose a (t, n)-threshold nonlinear secret image sharing scheme with a steganography concept. In order to achieve a suitable and secure secret image sharing scheme, we adapt a modified LSB embedding technique with the XOR Boolean algebra operation, define a new variable m, and change the range of the prime p in the sharing procedure. In order to evaluate the efficiency and security of the proposed scheme, we use the embedding capacity and PSNR. As a result, the average PSNR and embedding capacity are 44.78 dB and 1.74t⌈log2 m⌉ bits per pixel (bpp), respectively.
NASA Technical Reports Server (NTRS)
Zapata, R. N.; Humphris, R. R.; Henderson, K. C.
1974-01-01
Based on the premises that (1) magnetic suspension techniques can play a useful role in large-scale aerodynamic testing and (2) superconductor technology offers the only practical hope for building large-scale magnetic suspensions, an all-superconductor three-component magnetic suspension and balance facility was built as a prototype and was tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities have been made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.
Report of the panel on international programs
NASA Technical Reports Server (NTRS)
Anderson, Allen Joel; Fuchs, Karl W.; Ganeka, Yasuhiro; Gaur, Vinod; Green, Andrew A.; Siegfried, W.; Lambert, Anthony; Rais, Jacub; Reighber, Christopher; Seeger, Herman
1991-01-01
The panel recommends that NASA participate and take an active role in the continuous monitoring of existing regional networks, the realization of high resolution geopotential and topographic missions, the establishment of interconnection of the reference frames as defined by different space techniques, the development and implementation of automation for all ground-to-space observing systems, calibration and validation experiments for measuring techniques and data, the establishment of international space-based networks for real-time transmission of high density space data in standardized formats, tracking and support for non-NASA missions, and the extension of state-of-the-art observing and analysis techniques to developing nations.
NASA Technical Reports Server (NTRS)
Zapata, R. N.; Humphris, R. R.; Henderson, K. C.
1975-01-01
Based on the premises that magnetic suspension techniques can play a useful role in large scale aerodynamic testing, and that superconductor technology offers the only practical hope for building large scale magnetic suspensions, an all-superconductor 3-component magnetic suspension and balance facility was built as a prototype and tested successfully. Quantitative extrapolations of design and performance characteristics of this prototype system to larger systems compatible with existing and planned high Reynolds number facilities at Langley Research Center were made and show that this experimental technique should be particularly attractive when used in conjunction with large cryogenic wind tunnels.
Advanced techniques for determining long term compatibility of materials with propellants
NASA Technical Reports Server (NTRS)
Green, R. L.
1972-01-01
The search for advanced measurement techniques for determining long term compatibility of materials with propellants was conducted in several parts. A comprehensive survey of the existing measurement and testing technology for determining material-propellant interactions was performed. Selections were made from those existing techniques that were determined to meet, or could be made to meet, the requirements. Areas of refinement or change were recommended for the improvement of others. Investigations were also performed to determine the feasibility and advantages of developing and using new techniques to achieve significant improvements over existing ones. The most interesting demonstration was that of a new technique, the volatile metal chelate analysis. Rivaling neutron activation analysis in terms of sensitivity and specificity, the volatile metal chelate technique was fully demonstrated.
Explosive detection technology
NASA Astrophysics Data System (ADS)
Doremus, Steven; Crownover, Robin
2017-05-01
The continuing proliferation of improvised explosive devices is an omnipresent threat to civilians and members of the military and law enforcement around the world. The ability to accurately and quickly detect explosive materials from a distance would be an extremely valuable tool for mitigating the risk posed by these devices. A variety of techniques exist that are capable of accurately identifying explosive compounds, but an effective standoff technique is yet to be realized. Most of the methods being investigated to fill this gap in capabilities are laser based. Raman spectroscopy is one such technique that has been demonstrated to be effective at a distance. Spatially Offset Raman Spectroscopy (SORS) is a technique capable of identifying chemical compounds inside containers, which could be used to detect hidden explosive devices. Coherent Anti-Stokes Raman Spectroscopy (CARS) utilizes a coherent pair of lasers to excite a sample, greatly increasing the response of the sample while decreasing the strength of the lasers being used, which significantly mitigates the eye-safety issue that typically hinders laser-based detection methods. Time-gating techniques are also being developed to improve data collection from Raman techniques, which are often hindered by fluorescence of the test sample in addition to atmospheric, substrate, and contaminant responses. Ultraviolet-based techniques have also shown significant promise through greatly improved signal strength from the excitation of resonances in many explosive compounds. Raman spectroscopy, which identifies compounds based on their molecular response, can be coupled with Laser Induced Breakdown Spectroscopy (LIBS), capable of characterizing a sample's atomic composition using a single laser.
Reliability analysis of a robotic system using hybridized technique
NASA Astrophysics Data System (ADS)
Kumar, Naveen; Komal; Lather, J. S.
2017-09-01
In this manuscript, the reliability of a robotic system has been analyzed using the available data (containing vagueness, uncertainty, etc.). Quantification of the involved uncertainties is done through data fuzzification using triangular fuzzy numbers with known spreads as suggested by system experts. With fuzzified data, if the existing fuzzy lambda-tau (FLT) technique is employed, then the computed reliability parameters have a wide range of predictions. Therefore, the decision-maker cannot suggest any specific and influential managerial strategy to prevent unexpected failures and consequently to improve complex system performance. To overcome this problem, the present study utilizes a hybridized technique. With this technique, fuzzy set theory is utilized to quantify uncertainties, a fault tree is utilized for the system modeling, the lambda-tau method is utilized to formulate mathematical expressions for failure/repair rates of the system, and a genetic algorithm is utilized to solve the established nonlinear programming problem. Different reliability parameters of a robotic system are computed and the results are compared with the existing technique. The components of the robotic system follow an exponential distribution, i.e., constant failure rates. Sensitivity analysis is also performed and the impact on system mean time between failures (MTBF) is addressed by varying other reliability parameters. Based on the analysis, some influential suggestions are given to improve the system performance.
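The alpha-cut propagation at the heart of the fuzzification step can be sketched in a few lines. The Python fragment below is a minimal illustration only: it propagates a hypothetical triangular fuzzy failure rate through MTBF = 1/lambda using alpha-cuts, and does not reproduce the paper's fault-tree, lambda-tau, or genetic-algorithm stages; all numeric values are invented for the example.

    import numpy as np

    def tfn_alpha_cut(low, mode, high, alpha):
        # Interval of a triangular fuzzy number at membership level alpha.
        return low + alpha * (mode - low), high - alpha * (high - mode)

    def fuzzy_mtbf(failure_rate_tfn, alphas=np.linspace(0.0, 1.0, 11)):
        # Propagate a fuzzy failure rate (per hour) through MTBF = 1 / lambda.
        cuts = []
        for a in alphas:
            lo, hi = tfn_alpha_cut(*failure_rate_tfn, a)
            cuts.append((a, 1.0 / hi, 1.0 / lo))  # taking reciprocals flips the interval
        return cuts

    # Hypothetical failure rate of 1e-3 per hour with +/-20% spread.
    for a, lo, hi in fuzzy_mtbf((0.8e-3, 1.0e-3, 1.2e-3)):
        print(f"alpha={a:.1f}: MTBF in [{lo:.0f}, {hi:.0f}] hours")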
Characterization of agricultural land using singular value decomposition
NASA Astrophysics Data System (ADS)
Herries, Graham M.; Danaher, Sean; Selige, Thomas
1995-11-01
A method is defined and tested for the characterization of agricultural land from multi-spectral imagery, based on singular value decomposition (SVD) and key vector analysis. The SVD technique, which bears a close resemblance to multivariate statistical techniques, has previously been successfully applied to problems of signal extraction for marine data and forestry species classification. In this study the SVD technique is used as a classifier for agricultural regions, using airborne Daedalus ATM data with 1 m resolution. The specific region chosen is an experimental research farm in Bavaria, Germany. This farm has a large number of crops within a very small region and hence is not amenable to existing techniques. There are a number of other significant factors which render existing techniques such as the maximum likelihood algorithm less suitable for this area. These include a very dynamic terrain and a tessellated pattern of soil differences, which together cause large variations in the growth characteristics of the crops. The SVD technique is applied to this data set using a multi-stage classification approach, removing unwanted land-cover classes one step at a time. Typical classification accuracies for SVD are of the order of 85-100%. Preliminary results indicate that it is a fast and efficient classifier with the ability to differentiate between crop types such as wheat, rye, potatoes and clover. The results of characterizing 3 sub-classes of Winter Wheat are also shown.
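As a rough illustration of how an SVD basis can act as a classifier, the sketch below assigns a pixel spectrum to the class whose SVD subspace reconstructs it with the smallest error. This is a generic subspace classifier written for this summary, not the key-vector formulation used in the paper, and the choice of k components is arbitrary.

    import numpy as np

    def fit_class_subspaces(training, k=3):
        # training: dict mapping class label -> (n_samples, n_bands) array of spectra.
        bases = {}
        for label, X in training.items():
            mean = X.mean(axis=0)
            _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
            bases[label] = (mean, Vt[:k])          # keep the top-k right singular vectors
        return bases

    def classify(spectrum, bases):
        # Assign the class whose subspace reconstructs the spectrum with least error.
        errors = {}
        for label, (mean, Vt) in bases.items():
            centred = spectrum - mean
            recon = Vt.T @ (Vt @ centred)
            errors[label] = np.linalg.norm(centred - recon)
        return min(errors, key=errors.get)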
Orbiter Entry Aeroheating Working Group Viscous CFD Boundary Layer Transition Trailblazer Solutions
NASA Technical Reports Server (NTRS)
Wood, William A.; Erickson, David W.; Greene, Francis A.
2007-01-01
Boundary layer transition correlations for the Shuttle Orbiter have previously been developed utilizing a two-layer boundary layer prediction technique. The particular two-layer technique that was used is limited to Mach numbers less than 20. To allow assessments at Mach numbers greater than 20, it is proposed to use viscous CFD to predict the boundary layer properties. This report addresses whether the existing Orbiter entry aeroheating viscous CFD solutions, which were originally intended to be used for heat transfer rate predictions, adequately resolve boundary layer edge properties, and whether the existing two-layer results could be leveraged to reduce the number of needed CFD solutions. The boundary layer edge parameters from viscous CFD solutions are extracted along the wind-side centerline of the Space Shuttle Orbiter at reentry conditions, and are compared with results from the two-layer boundary layer prediction technique. The differences between the viscous CFD and two-layer prediction techniques vary between Mach 6 and 18 flight conditions and Mach 6 wind tunnel conditions, and there is not a straightforward scaling between the viscous CFD and two-layer values. Therefore, it is not possible to leverage the existing two-layer Orbiter flight boundary layer data set as a substitute for a viscous CFD data set; but viscous CFD solutions at the current grid resolution are sufficient to produce a boundary layer data set suitable for applying edge-based boundary layer transition correlations.
NASA Technical Reports Server (NTRS)
Hong, Jaesub; Allen, Branden; Grindlay, Jonathan; Barthelmy, Scott D.
2016-01-01
Wide-field (greater than or approximately equal to 100 degrees squared) hard X-ray coded-aperture telescopes with high angular resolution (less than or approximately equal to 2 arcminutes) will enable a wide range of time domain astrophysics. For instance, transient sources such as gamma-ray bursts can be precisely localized without the assistance of secondary focusing X-ray telescopes to enable rapid followup studies. On the other hand, high angular resolution in coded-aperture imaging introduces a new challenge in handling the systematic uncertainty: the average photon count per pixel is often too small to establish a proper background pattern or model the systematic uncertainty in a timescale over which the model remains invariant. We introduce two new techniques to improve detection sensitivity, which are designed for, but not limited to, a high-resolution coded-aperture system: a self-background modeling scheme which utilizes continuous scan or dithering operations, and a Poisson-statistics based probabilistic approach to evaluate the significance of source detection without background subtraction. We illustrate these new imaging analysis techniques for a high-resolution coded-aperture telescope using the data acquired by the wide-field hard X-ray telescope ProtoEXIST2 during a high-altitude balloon flight in fall 2012. We review the imaging sensitivity of ProtoEXIST2 during the flight, and demonstrate the performance of the new techniques using our balloon flight data in comparison with a simulated ideal Poisson background.
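The Poisson-statistics based significance test mentioned above can be illustrated with a short fragment. This is a generic sketch, not the ProtoEXIST2 analysis code: it scores how unlikely the observed counts in a sky pixel are under an assumed background expectation, without subtracting that background from the data.

    from scipy.stats import norm, poisson

    def detection_significance(observed_counts, expected_background):
        # P(N >= observed | background only) and its Gaussian-equivalent sigma.
        p_value = poisson.sf(observed_counts - 1, expected_background)
        sigma = norm.isf(p_value)
        return p_value, sigma

    # Example: 12 counts where the background model predicts 4.
    print(detection_significance(12, 4.0))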
Consistent detection and identification of individuals in a large camera network
NASA Astrophysics Data System (ADS)
Colombo, Alberto; Leung, Valerie; Orwell, James; Velastin, Sergio A.
2007-10-01
In the wake of an increasing number of terrorist attacks, counter-terrorism measures are now a main focus of many research programmes. An important issue for the police is the ability to track individuals and groups reliably through underground stations, and in the case of post-event analysis, to be able to ascertain whether specific individuals have been at the station previously. While many motion detection and tracking algorithms exist, their reliable deployment in a large network is still ongoing research. Specifically, to track individuals through multiple views, on multiple levels and between levels, consistent detection and labelling of individuals is crucial. In view of these issues, we have developed a change detection algorithm that works reliably in the presence of periodic movements, e.g. escalators and scrolling advertisements, as well as a content-based retrieval technique for identification. The change detection technique automatically extracts periodically varying elements in the scene using Fourier analysis, and constructs a Markov model for the process. Training is performed online, and no manual intervention is required, making this system suitable for deployment in large networks. Experiments on real data show significant improvement over existing techniques. The content-based retrieval technique uses MPEG-7 descriptors to identify individuals. Given the environment under which the system operates, i.e. at relatively low resolution, this approach is suitable for short timescales. For longer timescales, other forms of identification such as gait or, if the resolution allows, face recognition will be required.
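A simplified version of the Fourier step can be written as follows. The sketch flags pixels whose temporal spectrum is dominated by a single non-zero frequency (e.g. escalator steps or scrolling advertisements); it is an illustration under assumed array shapes and thresholds, and it omits the Markov model that the paper builds on top of this analysis.

    import numpy as np

    def periodic_pixel_mask(frames, power_ratio=0.5):
        # frames: (t, h, w) grayscale stack. Returns a boolean mask of strongly periodic pixels.
        stack = frames - frames.mean(axis=0)
        spectrum = np.abs(np.fft.rfft(stack, axis=0)) ** 2
        spectrum = spectrum[1:]                      # drop the DC component
        total = spectrum.sum(axis=0) + 1e-12
        dominant = spectrum.max(axis=0)
        return (dominant / total) > power_ratio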
[Female genital surgery, G-spot amplification techniques--state of the science].
Bachelet, J-T; Mojallal, A; Boucher, F
2014-10-01
The G-spot amplification is a procedure of "functional" intimate surgery consisting of a temporary physical increase of the size and sensitivity of the G-spot with a filler injected into the septum between the bladder and the vagina's anterior wall, in order to increase the frequency and importance of female orgasm during vaginal penetration. This surgical technique is based on the existence of an eponymous anatomical area described by Dr Gräfenberg in 1950, said to be responsible, upon stimulation, for a systematic orgasm distinct from the clitoral orgasm, referring to the vaginal orgasm as described by Freud in 1905. The purpose of this article is to review the scientific basis of the G-spot, whose very existence is currently a debated topic, and to discuss the role of G-spot amplification surgery. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Parallel seed-based approach to multiple protein structure similarities detection
Chapuis, Guillaume; Le Boudic-Jamin, Mathilde; Andonov, Rumen; ...
2015-01-01
Finding similarities between protein structures is a crucial task in molecular biology. Most of the existing tools require proteins to be aligned in an order-preserving way and only find single alignments even when multiple similar regions exist. We propose a new seed-based approach that discovers multiple pairs of similar regions. Its computational complexity is polynomial and it comes with a quality guarantee: the returned alignments have both root mean squared deviations (coordinate-based as well as internal-distances based) lower than a given threshold, if such alignments exist. We do not require the alignments to be order preserving (i.e., we consider nonsequential alignments), which makes our algorithm suitable for detecting similar domains when comparing multidomain proteins as well as for detecting structural repetitions within a single protein. Because the search space for nonsequential alignments is much larger than for sequential ones, the computational burden is addressed by extensive use of parallel computing techniques: a coarse-grain level parallelism making use of available CPU cores for computation and a fine-grain level parallelism exploiting bit-level concurrency as well as vector instructions.
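The coordinate-based RMSD criterion used as the quality guarantee can be made concrete with standard Kabsch superposition. The fragment below is a textbook implementation included only to illustrate the criterion; it is not taken from the authors' parallel seed-based code.

    import numpy as np

    def rmsd_after_superposition(P, Q):
        # P, Q: (n, 3) arrays of matched atom coordinates. RMSD after optimal rotation/translation.
        P0, Q0 = P - P.mean(axis=0), Q - Q.mean(axis=0)
        U, _, Vt = np.linalg.svd(P0.T @ Q0)          # Kabsch: SVD of the covariance matrix
        d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        return np.sqrt(np.mean(np.sum((P0 @ R.T - Q0) ** 2, axis=1)))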
The Effects of Practice-Based Training on Graduate Teaching Assistants' Classroom Practices.
Becker, Erin A; Easlon, Erin J; Potter, Sarah C; Guzman-Alvarez, Alberto; Spear, Jensen M; Facciotti, Marc T; Igo, Michele M; Singer, Mitchell; Pagliarulo, Christopher
2017-01-01
Evidence-based teaching is a highly complex skill, requiring repeated cycles of deliberate practice and feedback to master. Despite existing well-characterized frameworks for practice-based training in K-12 teacher education, the major principles of these frameworks have not yet been transferred to instructor development in higher educational contexts, including training of graduate teaching assistants (GTAs). We sought to determine whether a practice-based training program could help GTAs learn and use evidence-based teaching methods in their classrooms. We implemented a weekly training program for introductory biology GTAs that included structured drills of techniques selected to enhance student practice, logic development, and accountability and reduce apprehension. These elements were selected based on their previous characterization as dimensions of active learning. GTAs received regular performance feedback based on classroom observations. To quantify use of target techniques and levels of student participation, we collected and coded 160 h of video footage. We investigated the relationship between frequency of GTA implementation of target techniques and student exam scores; however, we observed no significant relationship. Although GTAs adopted and used many of the target techniques with high frequency, techniques that enforced student participation were not stably adopted, and their use was unresponsive to formal feedback. We also found that techniques discussed in training, but not practiced, were not used at quantifiable frequencies, further supporting the importance of practice-based training for influencing instructional practices. © 2017 E. A. Becker et al. CBE—Life Sciences Education © 2017 The American Society for Cell Biology. This article is distributed by The American Society for Cell Biology under license from the author(s). It is available to the public under an Attribution–Noncommercial–Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
Effective evaluation of privacy protection techniques in visible and thermal imagery
NASA Astrophysics Data System (ADS)
Nawaz, Tahir; Berg, Amanda; Ferryman, James; Ahlberg, Jörgen; Felsberg, Michael
2017-09-01
Privacy protection may be defined as replacing the original content in an image region with (less intrusive) content whose target appearance information has been modified to make it less recognizable by applying a privacy protection technique. The development of privacy protection techniques needs to be complemented with an established objective evaluation method to facilitate their assessment and comparison. Generally, existing evaluation methods rely on the use of subjective judgments or assume a specific target type in image data and use target detection and recognition accuracies to assess privacy protection. An annotation-free evaluation method that is neither subjective nor assumes a specific target type is proposed. It assesses two key aspects of privacy protection: "protection" and "utility." Protection is quantified as an appearance similarity, and utility is measured as a structural similarity between the original and privacy-protected image regions. We performed extensive experimentation using six challenging datasets (having 12 video sequences), including a new dataset (having six sequences) that contains visible and thermal imagery. The new dataset is made available online for the community. We demonstrate the effectiveness of the proposed method by evaluating six image-based privacy protection techniques and also show comparisons of the proposed method over existing methods.
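A toy version of the two measures can be sketched with off-the-shelf image metrics. The fragment below uses SSIM as a stand-in for the utility (structural similarity) term and a histogram intersection as a crude appearance similarity for the protection term; both are simplifications assumed for illustration and differ from the measures defined in the paper. 8-bit grayscale regions are assumed.

    import numpy as np
    from skimage.metrics import structural_similarity

    def evaluate_privacy(original, protected, bins=32):
        # original, protected: 2-D uint8 arrays covering the same image region.
        utility = structural_similarity(original, protected, data_range=255)
        h1, _ = np.histogram(original, bins=bins, range=(0, 255))
        h2, _ = np.histogram(protected, bins=bins, range=(0, 255))
        h1 = h1 / max(h1.sum(), 1)
        h2 = h2 / max(h2.sum(), 1)
        appearance_similarity = np.minimum(h1, h2).sum()
        return {"utility": utility, "protection": 1.0 - appearance_similarity}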
Ooi, Chia Huey; Chetty, Madhu; Teng, Shyh Wei
2006-06-23
Due to the large number of genes in a typical microarray dataset, feature selection looks set to play an important role in reducing noise and computational cost in gene expression-based tissue classification while improving accuracy at the same time. Surprisingly, this does not appear to be the case for all multiclass microarray datasets. The reason is that many feature selection techniques applied on microarray datasets are either rank-based and hence do not take into account correlations between genes, or are wrapper-based, which require high computational cost, and often yield difficult-to-reproduce results. In studies where correlations between genes are considered, attempts to establish the merit of the proposed techniques are hampered by evaluation procedures which are less than meticulous, resulting in overly optimistic estimates of accuracy. We present two realistically evaluated correlation-based feature selection techniques which incorporate, in addition to the two existing criteria involved in forming a predictor set (relevance and redundancy), a third criterion called the degree of differential prioritization (DDP). DDP functions as a parameter to strike the balance between relevance and redundancy, providing our techniques with the novel ability to differentially prioritize the optimization of relevance against redundancy (and vice versa). This ability proves useful in producing optimal classification accuracy while using reasonably small predictor set sizes for nine well-known multiclass microarray datasets. For multiclass microarray datasets, especially the GCM and NCI60 datasets, DDP enables our filter-based techniques to produce accuracies better than those reported in previous studies which employed similarly realistic evaluation procedures.
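The role of the DDP parameter can be pictured with a small greedy filter. The sketch below is a generic relevance/redundancy selector with an exponent alpha playing the DDP-like balancing role; the scoring functions are simple stand-ins (BSS/WSS relevance, mean absolute correlation redundancy) and do not reproduce the published criterion.

    import numpy as np

    def ddp_select(X, y, n_features=10, alpha=0.5):
        # X: (samples, genes); y: integer class labels. Larger alpha favours relevance over low redundancy.
        X, y = np.asarray(X, float), np.asarray(y)
        classes = np.unique(y)
        grand = X.mean(axis=0)
        bss = sum((X[y == c].mean(axis=0) - grand) ** 2 * (y == c).sum() for c in classes)
        wss = sum(((X[y == c] - X[y == c].mean(axis=0)) ** 2).sum(axis=0) for c in classes) + 1e-12
        relevance = bss / wss                         # between-class over within-class scatter
        corr = np.abs(np.corrcoef(X.T))               # gene-gene correlation magnitudes
        selected = [int(np.argmax(relevance))]
        while len(selected) < n_features:
            redundancy = corr[selected].mean(axis=0)  # mean correlation with already-selected genes
            score = relevance ** alpha / (redundancy + 1e-12) ** (1 - alpha)
            score[selected] = -np.inf
            selected.append(int(np.argmax(score)))
        return selected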
An Abstraction-Based Data Model for Information Retrieval
NASA Astrophysics Data System (ADS)
McAllister, Richard A.; Angryk, Rafal A.
Language ontologies provide an avenue for automated lexical analysis that may be used to supplement existing information retrieval methods. This paper presents a method of information retrieval that takes advantage of WordNet, a lexical database, to generate paths of abstraction, and uses them as the basis for an inverted index structure to be used in the retrieval of documents from an indexed corpus. We present this method as an entree to a line of research on using ontologies to perform word-sense disambiguation and improve the precision of existing information retrieval techniques.
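A path of abstraction of the kind described above can be generated directly from WordNet with NLTK. The snippet is a minimal illustration (it assumes the WordNet corpus has been downloaded via nltk.download('wordnet') and takes only the first noun sense); the paper's inverted index built over such paths is not reproduced.

    from nltk.corpus import wordnet as wn

    def abstraction_path(word):
        # Hypernym chain from the first noun sense of `word` up to its most abstract ancestor.
        synsets = wn.synsets(word, pos=wn.NOUN)
        if not synsets:
            return []
        path = synsets[0].hypernym_paths()[0]        # one root-to-sense path
        return [s.name() for s in reversed(path)]    # most specific concept first

    print(abstraction_path("dog"))  # e.g. ['dog.n.01', 'canine.n.02', ..., 'entity.n.01']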
Numerical studies of the Bethe-Salpeter equation for a two-fermion bound state
NASA Astrophysics Data System (ADS)
de Paula, W.; Frederico, T.; Salmè, G.; Viviani, M.
2018-03-01
Some recent advances on the solution of the Bethe-Salpeter equation (BSE) for a two-fermion bound system directly in Minkowski space are presented. The calculations are based on the expression of the Bethe-Salpeter amplitude in terms of the so-called Nakanishi integral representation and on the light-front projection (i.e. the integration over the light-front variable k^- = k^0 - k^3). The latter technique allows for the analytically exact treatment of the singularities plaguing the two-fermion BSE in Minkowski space. The good agreement observed between our results and those obtained using other existing numerical methods, based on both Minkowski and Euclidean space techniques, fully corroborates our analytical treatment.
NASA Astrophysics Data System (ADS)
Arai, Hiroyuki; Miyagawa, Isao; Koike, Hideki; Haseyama, Miki
We propose a novel technique for estimating the number of people in a video sequence; it has the advantages of being stable even in crowded situations and needing no ground-truth data. By quantitatively analyzing the geometrical relationships between image pixels and the volumes they intersect in the real world, the foreground image directly indicates the number of people. Because foreground detection is possible even in crowded situations, the proposed method can be applied in such situations. Moreover, it can estimate the number of people in an a priori manner, so it needs no ground-truth data, unlike existing feature-based estimation techniques. Experiments show the validity of the proposed method.
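The core counting idea can be sketched as a weighted sum over the foreground mask. The weights and the nominal per-person volume below are hypothetical placeholders; the paper derives the per-pixel weights from the camera geometry, which is not reproduced here.

    import numpy as np

    def estimate_count(foreground_mask, pixel_volume_weights, person_volume=0.07):
        # foreground_mask: boolean (h, w); pixel_volume_weights: per-pixel intersection volume in m^3.
        occupied_volume = np.sum(pixel_volume_weights[foreground_mask])
        return occupied_volume / person_volume       # assumed average body volume of ~0.07 m^3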
Generalizing Backtrack-Free Search: A Framework for Search-Free Constraint Satisfaction
NASA Technical Reports Server (NTRS)
Jonsson, Ari K.; Frank, Jeremy
2000-01-01
Tractable classes of constraint satisfaction problems are of great importance in artificial intelligence. Identifying and taking advantage of such classes can significantly speed up constraint problem solving. In addition, tractable classes are utilized in applications where strict worst-case performance guarantees are required, such as constraint-based plan execution. In this work, we present a formal framework for search-free (backtrack-free) constraint satisfaction. The framework is based on general procedures, rather than specific propagation techniques, and thus generalizes existing techniques in this area. We also relate search-free problem solving to the notion of decision sets and use the result to provide a constructive criterion that is sufficient to guarantee search-free problem solving.
Allen, Edwin B; Walls, Richard T; Reilly, Frank D
2008-02-01
This study investigated the effects of interactive instructional techniques in a web-based peripheral nervous system (PNS) component of a first year medical school human anatomy course. Existing data from 9 years of instruction involving 856 students were used to determine (1) the effect of web-based interactive instructional techniques on written exam item performance and (2) differences between student opinions of the benefit level of five different types of interactive learning objects used. The interactive learning objects included Patient Case studies, review Games, Simulated Interactive Patients (SIP), Flashcards, and unit Quizzes. Exam item analysis scores were found to be significantly higher (p < 0.05) for students receiving the instructional treatment incorporating the web-based interactive learning objects than for students not receiving this treatment. Questionnaires using a five-point Likert scale were analysed to determine student opinion ratings of the interactive learning objects. Students reported favorably on the benefit level of all learning objects. Students rated the benefit level of the Simulated Interactive Patients (SIP) highest, and this rating was significantly higher (p < 0.05) than all other learning objects. This study suggests that web-based interactive instructional techniques improve student exam performance. Students indicated a strong acceptance of Simulated Interactive Patient learning objects.
Self-Alignment MEMS IMU Method Based on the Rotation Modulation Technique on a Swing Base
Chen, Zhiyong; Yang, Haotian; Wang, Chengbin; Lin, Zhihui; Guo, Meifeng
2018-01-01
The micro-electro-mechanical-system (MEMS) inertial measurement unit (IMU) has been widely used in the field of inertial navigation due to its small size, low cost, and light weight, but aligning MEMS IMUs remains a challenge for researchers. MEMS IMUs have been conventionally aligned on a static base, requiring other sensors, such as magnetometers or satellites, to provide auxiliary information, which limits its application range to some extent. Therefore, improving the alignment accuracy of MEMS IMU as much as possible under swing conditions is of considerable value. This paper proposes an alignment method based on the rotation modulation technique (RMT), which is completely self-aligned, unlike the existing alignment techniques. The effect of the inertial sensor errors is mitigated by rotating the IMU. Then, inertial frame-based alignment using the rotation modulation technique (RMT-IFBA) achieved coarse alignment on the swing base. The strong tracking filter (STF) further improved the alignment accuracy. The performance of the proposed method was validated with a physical experiment, and the results of the alignment showed that the standard deviations of pitch, roll, and heading angle were 0.0140°, 0.0097°, and 0.91°, respectively, which verified the practicality and efficacy of the proposed method for the self-alignment of the MEMS IMU on a swing base. PMID:29649150
When Using the Mean is Meaningless: Examples from Probability Theory and Cardiology.
ERIC Educational Resources Information Center
Liebovitch, Larry S.; Todorov, Angelo T.; Wood, Mark A.; Ellenbogen, Kenneth A.
This chapter describes how the mean of fractal processes does not exist and is not a meaningful measure of some data. It discusses how important it is to stay open to the possibility that sometimes analytic techniques fail to satisfy some assumptions on which the mean is based. (KHR)
A System for English Vocabulary Acquisition Based on Code-Switching
ERIC Educational Resources Information Center
Mazur, Michal; Karolczak, Krzysztof; Rzepka, Rafal; Araki, Kenji
2016-01-01
Vocabulary plays an important part in second language learning and there are many existing techniques to facilitate word acquisition. One of these methods is code-switching, or mixing the vocabulary of two languages in one sentence. In this paper the authors propose an experimental system for computer-assisted English vocabulary learning in…
Classifying Correlation Matrices into Relatively Homogeneous Subgroups: A Cluster Analytic Approach
ERIC Educational Resources Information Center
Cheung, Mike W.-L.; Chan, Wai
2005-01-01
Researchers are becoming interested in combining meta-analytic techniques and structural equation modeling to test theoretical models from a pool of studies. Most existing procedures are based on the assumption that all correlation matrices are homogeneous. Few studies have addressed what the next step should be when studies being analyzed are…
ERIC Educational Resources Information Center
Mattern, Krista D.; Marini, Jessica P.; Shaw, Emily J.
2015-01-01
Throughout the college retention literature, there is a recurring theme that students leave college for a variety of reasons making retention a difficult phenomenon to model. In the current study, cluster analysis techniques were employed to investigate whether multiple empirically based profiles of nonreturning students existed to more fully…
Organizational Training across Cultures: Variations in Practices and Attitudes
ERIC Educational Resources Information Center
Hassi, Abderrahman; Storti, Giovanna
2011-01-01
Purpose: The purpose of this paper is to provide a synthesis based on a review of the existing literature with respect to the variations in training practices and attitudes across national cultures. Design/methodology/approach: A content analysis technique was adopted with a comparative cross-cultural management perspective as a backdrop to…
Reliable Record Matching for a College Admissions System.
ERIC Educational Resources Information Center
Fitt, Paul D.
Prospective student data, supplied by various national college testing and student search services, can be matched with existing student records in a college admissions database. Instead of relying on one unique record identifier, such as the student's social security number, a technique has been developed that is based on a number of common data…
Evidence-Based Behavioral Treatment of Dog Phobia with Young Children: Two Case Examples
ERIC Educational Resources Information Center
May, Anna C.; Rudy, Brittany M.; Davis, Thompson E., III; Matson, Johnny L.
2013-01-01
Specific phobias are among the most common anxiety disorders, especially in children. Unfortunately, a paucity of literature exists regarding the treatment of specific phobia in young children, despite the knowledge that traditional techniques (i.e., cognitive-behavioral therapy [CBT]) may not be practical. Therefore, the purpose of this article…
Acquiring Software Design Schemas: A Machine Learning Perspective
NASA Technical Reports Server (NTRS)
Harandi, Mehdi T.; Lee, Hing-Yan
1991-01-01
In this paper, we describe an approach based on machine learning that acquires software design schemas from design cases of existing applications. An overview of the technique, design representation, and acquisition system is presented. The paper also addresses issues associated with generalizing common features, such as biases. The generalization process is illustrated using an example.
Hoogendoorn, Mark; Szolovits, Peter; Moons, Leon M G; Numans, Mattijs E
2016-05-01
Machine learning techniques can be used to extract predictive models for diseases from electronic medical records (EMRs). However, the nature of EMRs makes it difficult to apply off-the-shelf machine learning techniques while still exploiting the rich content of the EMRs. In this paper, we explore the usage of a range of natural language processing (NLP) techniques to extract valuable predictors from uncoded consultation notes and study whether they can help to improve predictive performance. We study a number of existing techniques for the extraction of predictors from the consultation notes, namely a bag-of-words based approach and topic modeling. In addition, we develop a dedicated technique to match the uncoded consultation notes with a medical ontology. We apply these techniques as an extension to an existing pipeline to extract predictors from EMRs. We evaluate them in the context of predictive modeling for colorectal cancer (CRC), a disease known to be difficult to diagnose before performing an endoscopy. Our results show that we are able to extract useful information from the consultation notes. The predictive performance of the ontology-based extraction method moves significantly beyond the benchmark of age and gender alone (area under the receiver operating characteristic curve (AUC) of 0.870 versus 0.831). We also observe more accurate predictive models by adding features derived from processing the consultation notes compared to solely using coded data (AUC of 0.896 versus 0.882), although the difference is not significant. The features extracted from the notes are shown to be equally predictive (i.e. there is no significant difference in performance) compared to the coded data of the consultations. It is possible to extract useful predictors from uncoded consultation notes that improve predictive performance. Techniques linking text to concepts in medical ontologies to derive these predictors are shown to perform best for predicting CRC in our EMR dataset. Copyright © 2016 Elsevier B.V. All rights reserved.
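As a minimal illustration of the bag-of-words route (only one of the techniques studied, and not the ontology-matching method that performed best), a generic scikit-learn pipeline might look like the following; the notes and labels are invented toy data.

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    notes = ["rectal bleeding and weight loss", "routine check, no complaints"]  # toy consultation notes
    labels = [1, 0]                                                              # 1 = later CRC diagnosis

    model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), LogisticRegression(max_iter=1000))
    model.fit(notes, labels)
    print(model.predict_proba(["patient reports blood in stool"])[0, 1])         # predicted CRC risk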
Husak, Gregory J.; Michaelsen, Joel; Kyriakidis, P.; Verdin, James P.; Funk, Chris; Galu, Gideon
2011-01-01
Probabilistic forecasts are produced from a variety of outlets to help predict rainfall, and other meteorological events, for periods of 1 month or more. Such forecasts are expressed as probabilities of a rainfall event, e.g. being in the upper, middle, or lower third of the relevant distribution of rainfall in the region. The impact of these forecasts on the expectation for the event is not always clear or easily conveyed. This article proposes a technique based on Monte Carlo simulation for adjusting existing climatologic statistical parameters to match forecast information, resulting in new parameters defining the probability of events for the forecast interval. The resulting parameters are shown to approximate the forecasts with reasonable accuracy. To show the value of the technique as an application for seasonal rainfall, it is applied to the consensus forecast developed for the Greater Horn of Africa for the 2009 March-April-May season. An alternative, analytical approach is also proposed and discussed in comparison with the first, simulation-based technique.
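One way to picture the simulation-based adjustment is a brute-force search over scaled climatological parameters until the simulated tercile probabilities match the forecast. The fragment below assumes a gamma rainfall climatology and is a simplified stand-in for the published procedure; the forecast triple is an invented example.

    import numpy as np
    from scipy.stats import gamma

    def adjust_to_forecast(shape, scale, forecast=(0.40, 0.35, 0.25), n=20000, seed=0):
        # forecast: probabilities of the (above, near, below) normal thirds of the climatology.
        rng = np.random.default_rng(seed)
        t_low, t_high = gamma.ppf([1 / 3, 2 / 3], shape, scale=scale)  # climatological tercile bounds
        best, best_err = (shape, scale), np.inf
        for s in np.linspace(0.7 * shape, 1.3 * shape, 25):
            for sc in np.linspace(0.7 * scale, 1.3 * scale, 25):
                sample = rng.gamma(s, sc, n)
                probs = ((sample > t_high).mean(),
                         ((sample >= t_low) & (sample <= t_high)).mean(),
                         (sample < t_low).mean())
                err = sum((p - f) ** 2 for p, f in zip(probs, forecast))
                if err < best_err:
                    best, best_err = (s, sc), err
        return best  # adjusted (shape, scale) whose terciles best match the forecast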
NASA Astrophysics Data System (ADS)
Tong, Minh Q.; Hasan, M. Monirul; Gregory, Patrick D.; Shah, Jasmine; Park, B. Hyle; Hirota, Koji; Liu, Junze; Choi, Andy; Low, Karen; Nam, Jin
2017-02-01
We demonstrate a computationally-efficient optical coherence elastography (OCE) method based on fringe washout. By introducing ultrasound in alternating depth profiles, we can obtain information on the mechanical properties of a sample within the acquisition of a single image. This can be achieved by simply comparing the intensity in adjacent depth profiles in order to quantify the degree of fringe washout. Phantom agar samples with various densities were measured and quantified by our OCE technique, and a correlation to Young's modulus measurements by atomic force microscopy (AFM) was observed. Knee cartilage samples of monoiodoacetate-induced arthritis (MIA) rat models were utilized to replicate cartilage damage, and our proposed OCE technique, along with intensity and birefringence analyses and AFM measurements, was applied. The results indicate that the correlations between our OCE technique and techniques such as polarization-sensitive OCT, AFM Young's modulus measurements, and histology were promising. Our OCE is applicable to any existing OCT system and is demonstrated to be computationally-efficient.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zeng, L., E-mail: zeng@fusion.gat.com; Doyle, E. J.; Rhodes, T. L.
2016-11-15
A new model-based technique for fast estimation of the pedestal electron density gradient has been developed. The technique uses ordinary mode polarization profile reflectometer time delay data and does not require direct profile inversion. Because of its simple data processing, the technique can be readily implemented via a Field-Programmable Gate Array, so as to provide a real-time density gradient estimate, suitable for use in plasma control systems such as envisioned for ITER, and possibly for DIII-D and Experimental Advanced Superconducting Tokamak. The method is based on a simple edge plasma model with a linear pedestal density gradient and low scrape-off-layer density. By measuring reflectometer time delays for three adjacent frequencies, the pedestal density gradient can be estimated analytically via the new approach. Using existing DIII-D profile reflectometer data, the estimated density gradients obtained from the new technique are found to be in good agreement with the actual density gradients for a number of dynamic DIII-D plasma conditions.
An efficient and accurate molecular alignment and docking technique using ab initio quality scoring
Füsti-Molnár, László; Merz, Kenneth M.
2008-01-01
An accurate and efficient molecular alignment technique is presented based on first principle electronic structure calculations. This new scheme maximizes quantum similarity matrices in the relative orientation of the molecules and uses Fourier transform techniques for two purposes. First, building up the numerical representation of true ab initio electronic densities and their Coulomb potentials is accelerated by the previously described Fourier transform Coulomb method. Second, the Fourier convolution technique is applied for accelerating optimizations in the translational coordinates. In order to avoid any interpolation error, the necessary analytical formulas are derived for the transformation of the ab initio wavefunctions in rotational coordinates. The results of our first implementation for a small test set are analyzed in detail and compared with published results of the literature. A new way of refinement of existing shape based alignments is also proposed by using Fourier convolutions of ab initio or other approximate electron densities. This new alignment technique is generally applicable for overlap, Coulomb, kinetic energy, etc., quantum similarity measures and can be extended to a genuine docking solution with ab initio scoring. PMID:18624561
Training and certification in endobronchial ultrasound-guided transbronchial needle aspiration
Konge, Lars; Nayahangan, Leizl Joy; Clementsen, Paul Frost
2017-01-01
Endobronchial ultrasound-guided transbronchial needle aspiration (EBUS-TBNA) plays a key role in the staging of lung cancer, which is crucial for allocation to surgical treatment. EBUS-TBNA is a complicated procedure, and simulation-based training is helpful in the first part of the long learning curve prior to performing the procedure on actual patients. New trainees should follow a structured training programme consisting of training on simulators to proficiency, as assessed with a validated test, followed by supervised practice on patients. Simulation-based training is superior to the traditional apprenticeship model and is recommended in the newest guidelines. EBUS-TBNA and oesophageal ultrasound-guided fine needle aspiration (EUS-FNA or EUS-B-FNA) are complementary to each other, and the combined techniques are superior to either technique alone. It is logical to learn and to perform the two techniques in combination; however, for lung cancer staging only EBUS-TBNA simulators currently exist, but hopefully in the future simulation-based training in EUS will be possible. PMID:28840013
Characterization of Orbital Debris Via Hyper-Velocity Ground-Based Tests
NASA Technical Reports Server (NTRS)
Cowardin, Heather
2015-01-01
The objective is to replicate a hyper-velocity fragmentation event using modern-day spacecraft materials and construction techniques to improve the existing DoD and NASA breakup models. DebriSat is intended to be representative of modern LEO satellites. Major design decisions were reviewed and approved by Aerospace subject matter experts from different disciplines. DebriSat includes 7 major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing designs of flight hardware and fabricated with the same materials. A key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), supporting the development of the DoD and NASA satellite breakup models was conducted at AEDC in 1992. Breakup models based on SOCIT have supported many applications and matched on-orbit events reasonably well over the years.
Miyake, S; Lucas-Miyake, M
1989-01-01
This article will describe a marketing model for the development of a role for occupational therapy in the industrial market. Health promotion activities are used as a means to diversify existing revenue bases by establishing new referral sources in industry. The technique of need satisfaction (selling or marketing one's services to a customer based on needs expressed by the customer) is reviewed, and implementation of this approach is described from two settings, one in psychiatry and the other in rehabilitation.
Biggs, Jason D.; Voll, Judith A.; Mukamel, Shaul
2012-01-01
Two types of diagrammatic approaches for the design and simulation of nonlinear optical experiments (closed-time path loops based on the wave function and double-sided Feynman diagrams for the density matrix) are presented and compared. We give guidelines for the assignment of relevant pathways and provide rules for the interpretation of existing nonlinear experiments in carotenoids. PMID:22753822
NASA Astrophysics Data System (ADS)
Raza, Syed Ali; Zaighum, Isma; Shah, Nida
2018-02-01
This paper examines the relationship between economic policy uncertainty and the equity premium in G7 countries over a period of monthly data from January 1989 to December 2015 using a novel technique, namely quantile-on-quantile (QQ) regression, proposed by Sim and Zhou (2015). Based on the QQ approach, we estimate how the quantiles of economic policy uncertainty affect the quantiles of the equity premium. Thus, it provides a comprehensive insight into the overall dependence structure between the equity premium and economic policy uncertainty as compared to traditional techniques like OLS or quantile regression. Overall, our empirical evidence suggests the existence of a negative association between the equity premium and EPU predominantly in all G7 countries, especially in the extreme low and extreme high tails. However, differences exist among countries and across different quantiles of EPU and the equity premium within each country. The existence of this heterogeneity among countries is due to differences in terms of dependency on economic policy, other stock markets, and the linkages with other countries' equity markets.
The Microstructure and Physical Properties of Incinerated Paper-Cullet-Clay Ceramics
NASA Astrophysics Data System (ADS)
Sahar, M. R.; Hamzah, K.; Rohani, M. S.; Samah, K. A.; Razi, M. M.
A series of ceramics based on (x) incinerated recycled paper - (80-x) cullet - 20 kaolin clay (where 10 ≤ x ≤ 45 wt%) has successfully been made by the slip casting technique followed by sintering at 1000 °C. The actual composition of the ceramic is analyzed using Energy Dispersive X-Ray analysis (EDAX) while the phase existence is determined using the X-Ray Diffraction (XRD) technique. Their microstructural morphology is observed under a Scanning Electron Microscope (SEM) and the physical properties are measured in terms of thermal shrinkage and hardness. It is found that the ceramics contain mostly silica and that the phases are dominated by the existence of quartz (SiO2), wollastonite (CaSiO3) and anorthite (Ca(Al2Si2O8)). The SEM micrographs show that the morphology is dominated by a granular structure, which becomes smoother as the cullet level is further increased. It is also found that the thermal shrinkage is in the range of 18% to 6.5% while the hardness is in the range of 152 MPa to 1.463 GPa, depending on composition.
NASA Astrophysics Data System (ADS)
Abbasi, Madiha; Imran Baig, Mirza; Shafique Shaikh, Muhammad
2013-12-01
At present, OTDR-based techniques have become standard practice for measuring the chromatic dispersion distribution along an optical fiber transmission link. A constructive measurement technique is offered in this paper, in which a four-wavelength bidirectional optical time domain reflectometer (OTDR) is used to compute the chromatic dispersion distribution along an optical fiber transmission system. To improve the correction factor, a novel formulation has been developed, which leads to an enhanced and better-defined measurement. The experimental results obtained are in good agreement.
Evaluation of automobiles with alternative fuels utilizing multicriteria techniques
NASA Astrophysics Data System (ADS)
Brey, J. J.; Contreras, I.; Carazo, A. F.; Brey, R.; Hernández-Díaz, A. G.; Castro, A.
This work applies the non-parametric technique of Data Envelopment Analysis (DEA) to conduct a multicriteria comparison of some existing and under-development technologies in the automotive sector. The results indicate that some of the technologies under development, such as hydrogen fuel cell vehicles, can be classified as efficient when evaluated as a function of environmental and economic criteria, with greater importance being given to the environmental criteria. The article also demonstrates the need to improve the hydrogen-based technology, in comparison with the others, in aspects such as vehicle sale costs and fuel price.
Magnetic resonance imaging of the subthalamic nucleus for deep brain stimulation.
Chandran, Arjun S; Bynevelt, Michael; Lind, Christopher R P
2016-01-01
The subthalamic nucleus (STN) is one of the most important stereotactic targets in neurosurgery, and its accurate imaging is crucial. With improving MRI sequences there is impetus for direct targeting of the STN. High-quality, distortion-free images are paramount. Image reconstruction techniques appear to show the greatest promise in balancing the issue of geometrical distortion and STN edge detection. Existing spin echo- and susceptibility-based MRI sequences are compared with new image reconstruction methods. Quantitative susceptibility mapping is the most promising technique for stereotactic imaging of the STN.
Retrieval of the atmospheric compounds using a spectral optical thickness information
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ioltukhovski, A.A.
A spectral inversion technique for the retrieval of atmospheric gas and aerosol contents is proposed. This technique is based upon the preliminary measurement or retrieval of the spectral optical thickness. The existence of a priori information about the spectral cross sections for some of the atmospheric components allows retrieval of the relative contents of these components in the atmosphere. A method of smooth filtration makes it possible to estimate the contents of atmospheric aerosols with known cross sections and to filter out other aerosols; this is done independently of their relative contribution to the optical thickness.
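When the component cross sections are known a priori, the retrieval reduces to a linear inverse problem. The sketch below poses it as a non-negative least-squares fit of column amounts to the measured spectral optical thickness; this is the generic linear view, not the smooth-filtration method of the paper.

    import numpy as np
    from scipy.optimize import nnls

    def retrieve_columns(tau_measured, cross_sections):
        # tau_measured: (n_wavelengths,) optical thickness spectrum.
        # cross_sections: (n_wavelengths, n_components) known spectral cross sections.
        amounts, residual = nnls(cross_sections, tau_measured)
        return amounts  # non-negative relative contents of each component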
Development of advanced avionics systems applicable to terminal-configured vehicles
NASA Technical Reports Server (NTRS)
Heimbold, R. L.; Lee, H. P.; Leffler, M. F.
1980-01-01
A technique to add the time constraint to the automatic descent feature of the existing L-1011 aircraft Flight Management System (FMS) was developed. Software modifications were incorporated in the FMS computer program and the results checked by lab simulation and on a series of eleven test flights. An arrival time dispersion (2 sigma) of 19 seconds was achieved. The 4-D descent technique can be integrated with the time-based metering method of air traffic control. Substantial reductions in delays at today's busy airports should result.
Assimilation of Spatially Sparse In Situ Soil Moisture Networks into a Continuous Model Domain
NASA Astrophysics Data System (ADS)
Gruber, A.; Crow, W. T.; Dorigo, W. A.
2018-02-01
Growth in the availability of near-real-time soil moisture observations from ground-based networks has spurred interest in the assimilation of these observations into land surface models via a two-dimensional data assimilation system. However, the design of such systems is currently hampered by our ignorance concerning the spatial structure of error afflicting ground and model-based soil moisture estimates. Here we apply newly developed triple collocation techniques to provide the spatial error information required to fully parameterize a two-dimensional (2-D) data assimilation system designed to assimilate spatially sparse observations acquired from existing ground-based soil moisture networks into a spatially continuous Antecedent Precipitation Index (API) model for operational agricultural drought monitoring. Over the contiguous United States (CONUS), the posterior uncertainty of surface soil moisture estimates associated with this 2-D system is compared to that obtained from the 1-D assimilation of remote sensing retrievals to assess the value of ground-based observations to constrain a surface soil moisture analysis. Results demonstrate that a fourfold increase in existing CONUS ground station density is needed for ground network observations to provide a level of skill comparable to that provided by existing satellite-based surface soil moisture retrievals.
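The covariance-based triple collocation estimator that underlies such error characterization can be written compactly. The fragment below is the standard textbook form for three collocated, independently erred estimates of the same soil moisture signal; the spatial extension developed in the paper is not reproduced, and negative estimates (which can occur with short samples or violated assumptions) are left to the caller to handle.

    import numpy as np

    def triple_collocation_errors(x, y, z):
        # x, y, z: 1-D arrays of collocated soil moisture estimates from three independent sources.
        c = np.cov(np.vstack([x, y, z]))
        err_x = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
        err_y = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
        err_z = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
        return err_x, err_y, err_z  # error variances, each in its own data space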
Toward a view-oriented approach for aligning RDF-based biomedical repositories.
Anguita, A; García-Remesal, M; de la Iglesia, D; Graf, N; Maojo, V
2015-01-01
This article is part of the Focus Theme of METHODS of Information in Medicine on "Managing Interoperability and Complexity in Health Systems". The need for complementary access to multiple RDF databases has fostered new lines of research, but also entailed new challenges due to data representation disparities. While several approaches for RDF-based database integration have been proposed, those focused on schema alignment have become the most widely adopted. All state-of-the-art solutions for aligning RDF-based sources resort to a simple technique inherited from legacy relational database integration methods. This technique - known as element-to-element (e2e) mappings - is based on establishing 1:1 mappings between single primitive elements - e.g. concepts, attributes, relationships, etc. - belonging to the source and target schemas. However, due to the intrinsic nature of RDF - a representation language based on defining tuples < subject, predicate, object > -, one may find RDF elements whose semantics vary dramatically when combined into a view involving other RDF elements - i.e. they depend on their context. The latter cannot be adequately represented in the target schema by resorting to the traditional e2e approach. These approaches fail to properly address this issue without explicitly modifying the target ontology, thus lacking the required expressiveness for properly reflecting the intended semantics in the alignment information. To enhance existing RDF schema alignment techniques by providing a mechanism to properly represent elements with context-dependent semantics, thus enabling users to perform more expressive alignments, including scenarios that cannot be adequately addressed by the existing approaches. Instead of establishing 1:1 correspondences between single primitive elements of the schemas, we propose adopting a view-based approach. The latter is targeted at establishing mapping relationships between RDF subgraphs - that can be regarded as the equivalent of views in traditional databases -, rather than between single schema elements. This approach enables users to represent scenarios defined by context-dependent RDF elements that cannot be properly represented when adopting the currently existing approaches. We developed a software tool implementing our view-based strategy. Our tool is currently being used in the context of the European Commission funded p-medicine project, targeted at creating a technological framework to integrate clinical and genomic data to facilitate the development of personalized drugs and therapies for cancer, based on the genetic profile of the patient. We used our tool to integrate different RDF-based databases - including different repositories of clinical trials and DICOM images - using the Health Data Ontology Trunk (HDOT) ontology as the target schema. The importance of database integration methods and tools in the context of biomedical research has been widely recognized. Modern research in this area - e.g. identification of disease biomarkers, or design of personalized therapies - heavily relies on the availability of a technical framework to enable researchers to uniformly access disparate repositories. We present a method and a tool that implement a novel alignment method specifically designed to support and enhance the integration of RDF-based data sources at schema (metadata) level. 
This approach provides an increased level of expressiveness compared to other existing solutions, and allows solving heterogeneity scenarios that cannot be properly represented using other state-of-the-art techniques.
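To make the contrast with element-to-element mappings concrete, the following sketch (Python with rdflib) expresses a single view-based mapping as a SPARQL CONSTRUCT query: a whole source subgraph, whose combined meaning has no single counterpart element, is mapped to one class of a target ontology. All IRIs, class names and the tiny data set are invented for illustration and are not taken from the p-medicine project or HDOT.

# Minimal sketch of a view-based mapping: a SPARQL CONSTRUCT query maps a whole
# RDF subgraph in the source schema to a single concept of a hypothetical
# target ontology, rather than mapping element-to-element.
from rdflib import Graph

SOURCE_DATA = """
@prefix src: <http://example.org/source#> .
src:pat1 a src:Patient ;
         src:hasFinding src:f1 .
src:f1   a src:TumourFinding ;
         src:location src:Breast .
"""

# The "view": a source subgraph whose combined semantics ("patient with a
# tumour located in the breast") has no single counterpart element in the source.
VIEW_MAPPING = """
PREFIX src: <http://example.org/source#>
PREFIX tgt: <http://example.org/target#>
CONSTRUCT { ?p a tgt:BreastCancerPatient . }
WHERE {
  ?p a src:Patient ;
     src:hasFinding ?f .
  ?f a src:TumourFinding ;
     src:location src:Breast .
}
"""

source = Graph()
source.parse(data=SOURCE_DATA, format="turtle")

target = Graph()
for triple in source.query(VIEW_MAPPING):   # a CONSTRUCT query yields target triples
    target.add(triple)

print(target.serialize(format="turtle"))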
NASA Astrophysics Data System (ADS)
Liu, Hong; Nodine, Calvin F.
1996-07-01
This paper presents a generalized image contrast enhancement technique, which equalizes the perceived brightness distribution based on the Heinemann contrast discrimination model. It is based on the mathematically proven existence of a unique solution to a nonlinear equation, and is formulated with easily tunable parameters. The model uses a two-step log-log representation of luminance contrast between targets and surround in a luminous background setting. The algorithm consists of two nonlinear gray scale mapping functions that have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of the gray-level distribution of the given image, and can be uniquely determined once the previous three are set. Tests have been carried out to demonstrate the effectiveness of the algorithm for increasing the overall contrast of radiology images. The traditional histogram equalization can be reinterpreted as an image enhancement technique based on the knowledge of human contrast perception. In fact, it is a special case of the proposed algorithm.
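Since the abstract notes that classical histogram equalization is a special case of the proposed mapping, a minimal sketch of that special case is given below (Python/NumPy); the Heinemann-based mapping itself requires the seven model parameters, which are not reproduced here.

# Plain histogram equalization, the special case mentioned in the abstract.
import numpy as np

def histogram_equalize(img, levels=256):
    """Remap gray levels so the cumulative histogram becomes approximately linear."""
    hist, _ = np.histogram(img.ravel(), bins=levels, range=(0, levels))
    cdf = hist.cumsum().astype(np.float64)
    cdf /= cdf[-1]                              # normalize to [0, 1]
    mapping = np.round((levels - 1) * cdf)      # new gray level for each old level
    return mapping[img].astype(np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    low_contrast = rng.integers(100, 140, size=(64, 64))   # narrow gray-level range
    enhanced = histogram_equalize(low_contrast)
    print(low_contrast.min(), low_contrast.max(), "->", enhanced.min(), enhanced.max())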
Novel secret key generation techniques using memristor devices
NASA Astrophysics Data System (ADS)
Abunahla, Heba; Shehada, Dina; Yeun, Chan Yeob; Mohammad, Baker; Jaoude, Maguy Abi
2016-02-01
This paper proposes novel secret key generation techniques using memristor devices. The approach depends on using the initial profile of a memristor as a master key. In addition, session keys are generated using the master key and other specified parameters. In contrast to existing memristor-based security approaches, the proposed development is cost effective and power efficient since the operation can be achieved with a single device rather than a crossbar structure. An algorithm is suggested and demonstrated using a physics-based Matlab model. It is shown that the generated keys can have dynamic size, which provides perfect security. Moreover, the proposed encryption and decryption technique using the memristor-based generated keys outperforms Triple Data Encryption Standard (3DES) and Advanced Encryption Standard (AES) in terms of processing time. This paper is enriched by providing characterization results of a fabricated microscale Al/TiO2/Al memristor prototype in order to prove the concept of the proposed approach and study the impacts of process variations. The work proposed in this paper is a milestone towards System On Chip (SOC) memristor-based security.
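The following sketch illustrates only the general idea of deriving variable-length session keys from a master key plus session parameters; it uses a generic HMAC-based expansion as a stand-in, since the paper's actual algorithm derives the master key from a memristor's physical initial profile via a physics-based model.

# Hedged sketch: the master key here is a fake byte string standing in for the
# measured memristor initial profile; the HMAC counter-mode expansion is a
# generic stand-in, not the authors' algorithm.
import hmac
import hashlib
import os

def session_key(master_key: bytes, session_params: bytes, length: int = 32) -> bytes:
    """Derive a session key of `length` bytes from the master key and parameters."""
    out = b""
    counter = 0
    while len(out) < length:                      # simple counter-mode expansion
        block = hmac.new(master_key,
                         session_params + counter.to_bytes(4, "big"),
                         hashlib.sha256).digest()
        out += block
        counter += 1
    return out[:length]

master = bytes.fromhex("a3f1c2d4e5968778")        # stand-in for the memristor profile
params = os.urandom(16)                           # per-session nonce / parameters
key = session_key(master, params, length=48)      # keys can have dynamic size
print(key.hex())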
Parametric Model Based On Imputations Techniques for Partly Interval Censored Data
NASA Astrophysics Data System (ADS)
Zyoud, Abdallah; Elfaki, F. A. M.; Hrairi, Meftah
2017-12-01
The term ‘survival analysis’ has been used in a broad sense to describe a collection of statistical procedures for data analysis in which the outcome variable of interest is the time until an event occurs; the time to failure of a specific experimental unit may be censored, which can be right, left, interval, or partly interval censored (PIC) data. In this paper, the analysis was conducted based on a parametric Cox model applied to PIC data. Moreover, several imputation techniques were used: midpoint, left and right point, random, mean, and median. Maximum likelihood estimation was used to obtain the estimated survival function. These estimates were then compared with existing models, namely the Turnbull and Cox models, using clinical trial data (breast cancer data), which demonstrated the validity of the proposed model. The results indicated that the parametric Cox model was superior in terms of estimation of survival functions, likelihood ratio tests, and their p-values. Moreover, among the imputation techniques, the midpoint, random, mean, and median imputations showed better results with respect to estimation of the survival function.
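A minimal sketch of the imputation step for partly interval-censored data is shown below; the toy numbers and the exponential fit at the end are illustrative assumptions, not the paper's data or its parametric Cox model.

# Imputation for partly interval-censored data: exact times are kept, and each
# interval-censored observation (L, R) is replaced by an imputed value, after
# which any standard parametric fit can be applied to the completed sample.
import numpy as np

rng = np.random.default_rng(1)

def impute(left, right, method="midpoint"):
    """Return imputed event times for interval-censored pairs (left < right)."""
    left, right = np.asarray(left, float), np.asarray(right, float)
    if method == "midpoint":
        return (left + right) / 2.0
    if method == "left":
        return left
    if method == "right":
        return right
    if method == "random":
        return rng.uniform(left, right)
    raise ValueError(method)

# toy data: some exact failure times plus some interval-censored ones
exact = np.array([2.1, 3.4, 5.0])
L = np.array([1.0, 2.0, 4.0])
R = np.array([3.0, 6.0, 7.0])

times = np.concatenate([exact, impute(L, R, "midpoint")])
# e.g. exponential MLE on the completed data: rate = n / sum(t)
print("imputed times:", times, " exponential rate:", len(times) / times.sum())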
NASA Technical Reports Server (NTRS)
Yoshikawa, H. H.; Madison, I. B.
1971-01-01
This study was performed in support of the NASA Task B-2 Study Plan for Space Basing. The nature of space-based operations implies that orbital transfer of propellant is a prime consideration. The intent of this report is (1) to report on the findings and recommendations of existing literature on space-based propellant transfer techniques, and (2) to determine possible alternatives to the recommended methods. The reviewed literature recommends, in general, the use of conventional liquid transfer techniques (i.e., pumping) in conjunction with an artificially induced gravitational field. An alternate concept that was studied, the Thermal Bootstrap Transfer Process, is based on the compression of a two-phase fluid with subsequent condensation to a liquid (vapor compression/condensation). This concept utilizes the intrinsic energy capacities of the tanks and propellant by exploiting temperature differentials and available energy differences. The results indicate the thermodynamic feasibility of the Thermal Bootstrap Transfer Process for a specific range of tank sizes, temperatures, fill-factors and receiver tank heat transfer coefficients.
López-Rodríguez, Patricia; Escot-Bocanegra, David; Poyatos-Martínez, David; Weinmann, Frank
2016-01-01
The trend in the last few decades is that current unmanned aerial vehicles are completely made of composite materials rather than metallic, such as carbon-fiber or fiberglass composites. From the electromagnetic point of view, this fact forces engineers and scientists to assess how these materials may affect their radar response or their electronics in terms of electromagnetic compatibility. In order to evaluate this, electromagnetic characterization of different composite materials has become a need. Several techniques exist to perform this characterization, all of them based on the utilization of different sensors for measuring different parameters. In this paper, an implementation of the metal-backed free-space technique, based on the employment of antenna probes, is utilized for the characterization of composite materials that belong to an actual drone. Their extracted properties are compared with those given by a commercial solution, an open-ended coaxial probe (OECP). The discrepancies found between both techniques along with a further evaluation of the methodologies, including measurements with a split-cavity resonator, conclude that the implemented free-space technique provides more reliable results for this kind of composites than the OECP technique. PMID:27347966
NASA Astrophysics Data System (ADS)
Liu, Zhikun; Cao, Zeyuan; Deng, Biwei; Wang, Yuefeng; Shao, Jiayi; Kumar, Prashant; Liu, C. Richard; Wei, Bingqing; Cheng, Gary J.
2014-05-01
Laser-induced photo-chemical synthesis of SnO2 nanotubes has been demonstrated by employing a nanoporous polycarbonate membrane as a template. The SnO2 nanotube diameter can be controlled by the nanoporous template while the nanotube length can be tuned by laser parameters and reaction duration. The microstructure characterization of the nanotubes indicates that they consist of mesoporous structures with sub 5 nm size nanocrystals connected by the twinning structure. The application of SnO2 nanotubes as an anode material in lithium ion batteries has also been explored, and they exhibited high capacity and excellent cyclic stability. The laser based emerging technique for scalable production of crystalline metal oxide nanotubes in a matter of seconds is remarkable. The compliance of the laser based technique with the existing technologies would lead to mass production of novel nanomaterials that would be suitable for several emerging applications. Electronic supplementary information (ESI) available. See DOI: 10.1039/c3nr06444a
Image smoothing and enhancement via min/max curvature flow
NASA Astrophysics Data System (ADS)
Malladi, Ravikanth; Sethian, James A.
1996-03-01
We present a class of PDE-based algorithms suitable for a wide range of image processing applications. The techniques are applicable to both salt-and-pepper gray-scale noise and full-image continuous noise present in black and white images, gray-scale images, texture images and color images. At the core, the techniques rely on a level set formulation of evolving curves and surfaces and the viscosity in profile evolution. Essentially, the method consists of moving the isointensity contours in an image under curvature dependent speed laws to achieve enhancement. Compared to existing techniques, our approach has several distinct advantages. First, it contains only one enhancement parameter, which in most cases is automatically chosen. Second, the scheme automatically stops smoothing at some optimal point; continued application of the scheme produces no further change. Third, the method is one of the fastest possible schemes based on a curvature-controlled approach.
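For illustration, a bare-bones curvature-flow smoothing step is sketched below (Python/NumPy); it implements plain curvature-dependent motion of the isointensity contours and omits the min/max speed switching that gives the published scheme its automatic stopping behavior.

# Plain curvature flow: dI/dt = kappa * |grad I|, stepped explicitly.
import numpy as np

def curvature_speed(I, eps=1e-8):
    """Return kappa * |grad I|, the speed of the isointensity contours."""
    Iy, Ix = np.gradient(I)
    Iyy, Iyx = np.gradient(Iy)
    Ixy, Ixx = np.gradient(Ix)
    num = Ixx * Iy**2 - 2.0 * Ix * Iy * Ixy + Iyy * Ix**2
    return num / (Ix**2 + Iy**2 + eps)

def curvature_smooth(I, steps=30, dt=0.1):
    """Explicit time stepping of the curvature flow (smooths noise, preserves contours)."""
    I = I.astype(np.float64).copy()
    for _ in range(steps):
        I += dt * curvature_speed(I)
    return I

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64)); clean[20:44, 20:44] = 1.0
    noisy = clean + 0.1 * rng.standard_normal(clean.shape)
    smoothed = curvature_smooth(noisy)
    print("mean abs error:", np.abs(noisy - clean).mean(), "->", np.abs(smoothed - clean).mean())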
Chen, Shyi-Ming; Manalu, Gandhi Maruli Tua; Pan, Jeng-Shyang; Liu, Hsiang-Chuan
2013-06-01
In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and particle swarm optimization (PSO) techniques. First, we fuzzify the historical training data of the main factor and the secondary factor, respectively, to form two-factors second-order fuzzy logical relationships. Then, we group the two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, we obtain the optimal weighting vector for each fuzzy-trend logical relationship group by using PSO techniques to perform the forecasting. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index and the NTD/USD exchange rates. The experimental results show that the proposed method gets better forecasting performance than the existing methods.
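As a rough illustration of the optimization step, a generic particle swarm optimizer over a weighting vector is sketched below; the toy objective stands in for the forecast error that the paper computes from the fuzzy-trend logical relationship groups, and all PSO constants are common default choices rather than values from the paper.

# Basic PSO over a weighting vector in [0, 1]^dim.
import numpy as np

def pso(objective, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Minimize `objective` with standard global-best particle swarm optimization."""
    rng = np.random.default_rng(seed)
    x = rng.random((n_particles, dim))                 # candidate weighting vectors
    v = np.zeros_like(x)
    pbest = x.copy()
    pbest_f = np.array([objective(p) for p in x])
    gbest = pbest[pbest_f.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, 0.0, 1.0)
        f = np.array([objective(p) for p in x])
        better = f < pbest_f
        pbest[better], pbest_f[better] = x[better], f[better]
        gbest = pbest[pbest_f.argmin()].copy()
    return gbest, pbest_f.min()

# toy stand-in for the forecast-error objective of one fuzzy-trend group
ideal = np.array([0.5, 0.3, 0.2])
objective = lambda p: float(np.sum((p / (p.sum() + 1e-12) - ideal) ** 2))
weights, err = pso(objective, dim=3)
print(weights / weights.sum(), err)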
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pharhizgar, K.D.; Lunce, S.E.
1994-12-31
Development of knowledge-based technological acquisition techniques and customers' information profiles are known as assimilative integrated discovery systems (AIDS) in modern organizations. These systems have access through processing to both deep and broad domains of information in modern societies. Through these systems organizations and individuals can predict future trend probabilities and events concerning their customers. AIDSs are new techniques that produce new information which informants can use without the help of the knowledge sources, because of the existence of highly sophisticated computerized networks. This paper analyzes the danger and side effects of the misuse of information through illegal, unethical and immoral access to the database in an integrated and assimilative information system as described above. Cognivistic mapping, pragmatistic informational design gathering, and holistic classifiable and distributive techniques are potentially abusive systems whose outputs can be easily misused by businesses when researching the firm's customers.
Chen, Z; Ngo, H H; Guo, W S; Listowski, A; O'Halloran, K; Thompson, M; Muthukaruppan, M
2012-11-01
This paper aims to put forward several management alternatives regarding the application of recycled water for household laundry in Sydney. Based on different recycled water treatment techniques such as microfiltration (MF), granular activated carbon (GAC) or reverse osmosis (RO), and types of washing machines (WMs), five alternatives were proposed as follows: (1) do nothing scenario; (2) MF+existing WMs; (3) MF+new WMs; (4) MF-GAC+existing WMs; and (5) MF-RO+existing WMs. Accordingly, a comprehensive quantitative assessment of the trade-off among a variety of issues (e.g., engineering feasibility, initial cost, energy consumption, supply flexibility and water savings) was performed over the alternatives. This was achieved by a computer-based multi-criteria analysis (MCA) using rank order weight generation together with the preference ranking organization method for enrichment evaluation (PROMETHEE) outranking technique. In particular, the 10,000 combinations of weights generated via Monte Carlo simulation significantly reduced the errors associated with a single, manually fixed set of weights, owing to the approach's objectivity and high efficiency. To illustrate the methodology, a case study on Rouse Hill Development Area (RHDA), Sydney, Australia was carried out afterwards. The study was concluded by highlighting the feasibility of using highly treated recycled water for existing and new washing machines. This could provide powerful guidance for sustainable water reuse management in the long term. However, more detailed field trials and investigations are still needed to effectively understand, predict and manage the impact of selected recycled water for new end use alternatives. Copyright © 2012 Elsevier B.V. All rights reserved.
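A compact sketch of the decision machinery is given below: criterion weights are drawn at random (a Dirichlet draw stands in for rank-order weight generation) and PROMETHEE II net flows rank the alternatives for each draw. The alternative names mirror the paper's scenarios, but the criteria and scores are invented.

# PROMETHEE II with the "usual" preference function, repeated over Monte Carlo weights.
import numpy as np

def promethee_net_flows(scores, weights):
    """scores[i, j] = performance of alternative i on criterion j (higher = better)."""
    n = scores.shape[0]
    phi = np.zeros(n)
    for i in range(n):
        for k in range(n):
            if i == k:
                continue
            pref_ik = weights @ (scores[i] > scores[k])   # weighted preference of i over k
            pref_ki = weights @ (scores[k] > scores[i])
            phi[i] += (pref_ik - pref_ki) / (n - 1)       # net outranking flow
    return phi

alternatives = ["do nothing", "MF+existing WM", "MF+new WM", "MF-GAC", "MF-RO"]
scores = np.array([                      # rows: alternatives, cols: 4 toy criteria
    [0.2, 0.9, 0.1, 0.1],
    [0.6, 0.6, 0.5, 0.6],
    [0.7, 0.4, 0.6, 0.7],
    [0.8, 0.3, 0.7, 0.8],
    [0.9, 0.1, 0.9, 0.9],
])

rng = np.random.default_rng(0)
wins = np.zeros(len(alternatives))
for _ in range(10000):                   # Monte Carlo over weight combinations
    w = rng.dirichlet(np.ones(scores.shape[1]))
    wins[np.argmax(promethee_net_flows(scores, w))] += 1
print(dict(zip(alternatives, wins / wins.sum())))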
Three-dimensional hybrid grid generation using advancing front techniques
NASA Technical Reports Server (NTRS)
Steinbrenner, John P.; Noack, Ralph W.
1995-01-01
A new 3-dimensional hybrid grid generation technique has been developed, based on ideas of advancing fronts for both structured and unstructured grids. In this approach, structured grids are first generated independently around individual components of the geometry. Fronts are initialized on these structured grids, and advanced outward so that new cells are extracted directly from the structured grids. Employing typical advancing front techniques, cells are rejected if they intersect the existing front or fail other criteria. When no more viable structured cells exist, further cells are advanced in an unstructured manner to close off the overall domain, resulting in a grid of 'hybrid' form. There are two primary advantages to the hybrid formulation. First, generating blocks with limited regard to topology eliminates the bottleneck encountered when a multiple block system is used to fully encapsulate a domain. Individual blocks may be generated free of external constraints, which will significantly reduce the generation time. Secondly, grid points near the body (presumably with high aspect ratio) will still maintain a structured (non-triangular or tetrahedral) character, thereby maximizing grid quality and solution accuracy near the surface.
Pulse-shape discrimination techniques for the COBRA double beta-decay experiment at LNGS
NASA Astrophysics Data System (ADS)
Zatschler, S.; COBRA Collaboration
2017-09-01
In modern elementary particle physics several questions arise from the fact that neutrino oscillation experiments have found neutrinos to be massive. Among them is the so far unknown nature of neutrinos: either they act as so-called Majorana particles, where one cannot distinguish between particle and antiparticle, or they are Dirac particles like all the other fermions in the Standard Model. The study of neutrinoless double beta-decay (0νββ-decay), where the lepton number conservation is violated by two units, could answer the question regarding the underlying nature of neutrinos and might also shed light on the mechanism responsible for the mass generation. So far there is no experimental evidence for the existence of 0νββ-decay, hence, existing experiments have to be improved and novel techniques should be explored. One of the next-generation experiments dedicated to the search for this ultra-rare decay is the COBRA experiment. This article gives an overview of techniques to identify and reject background based on pulse-shape discrimination.
Two-step tunneling technique of deep brain stimulation extension wires-a description.
Fontaine, Denys; Vandersteen, Clair; Saleh, Christian; von Langsdorff, Daniel; Poissonnet, Gilles
2013-12-01
While a significant body of literature exists on the intracranial part of deep brain stimulation surgery, the equally important second part of the intervention related to the subcutaneous tunneling of deep brain stimulation extension wires is rarely described. The tunneling strategy can consist of a single passage of the extension wires from the frontal incision site to the subclavicular area, or of a two-step approach that adds a retro-auricular counter-incision. Each technique harbors the risk of intraoperative and postoperative complications. At our center, we perform a two-step tunneling procedure that we developed based on a cadaveric study. In 125 consecutive patients operated since 2002, we did not encounter any complication related to our tunneling method. Insufficient data exist to fully evaluate the advantages and disadvantages of each tunneling technique. It is of critical importance that authors detail their tunneling modus operandi and report the presence or absence of complications. This gathered data pool may help to formulate definitive conclusions on the safest method for subcutaneous tunneling of extension wires in deep brain stimulation.
Efficient clustering aggregation based on data fragments.
Wu, Ou; Hu, Weiming; Maybank, Stephen J; Zhu, Mingliang; Li, Bing
2012-06-01
Clustering aggregation, known as clustering ensembles, has emerged as a powerful technique for combining different clustering results to obtain a single better clustering. Existing clustering aggregation algorithms are applied directly to data points, in what is referred to as the point-based approach. The algorithms are inefficient if the number of data points is large. We define an efficient approach for clustering aggregation based on data fragments. In this fragment-based approach, a data fragment is any subset of the data that is not split by any of the clustering results. To establish the theoretical bases of the proposed approach, we prove that clustering aggregation can be performed directly on data fragments under two widely used goodness measures for clustering aggregation taken from the literature. Three new clustering aggregation algorithms are described. The experimental results obtained using several public data sets show that the new algorithms have lower computational complexity than three well-known existing point-based clustering aggregation algorithms (Agglomerative, Furthest, and LocalSearch); nevertheless, the new algorithms do not sacrifice the accuracy.
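The core idea, computing fragments once and aggregating on them instead of on points, can be sketched in a few lines; the three toy base clusterings below are illustrative only, and the actual aggregation objective (e.g. the goodness measures discussed in the paper) is not reproduced.

# A fragment is a maximal set of points receiving the same label in every base
# clustering, so aggregation can operate on far fewer units than data points.
from collections import defaultdict

def fragments(labelings):
    """labelings: list of label lists, one per base clustering, all of length n.
    Returns a list of fragments, each a list of point indices."""
    groups = defaultdict(list)
    n = len(labelings[0])
    for point in range(n):
        signature = tuple(labels[point] for labels in labelings)  # label tuple
        groups[signature].append(point)
    return list(groups.values())

# three base clusterings of 8 points
clustering_1 = [0, 0, 0, 1, 1, 1, 2, 2]
clustering_2 = [0, 0, 1, 1, 1, 1, 2, 2]
clustering_3 = [0, 0, 0, 0, 1, 1, 2, 2]

frags = fragments([clustering_1, clustering_2, clustering_3])
print(frags)   # e.g. [[0, 1], [2], [3], [4, 5], [6, 7]]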
Biochar-based nano-composites for the decontamination of wastewater: A review.
Tan, Xiao-Fei; Liu, Yun-Guo; Gu, Yan-Ling; Xu, Yan; Zeng, Guang-Ming; Hu, Xin-Jiang; Liu, Shao-Bo; Wang, Xin; Liu, Si-Mian; Li, Jiang
2016-07-01
Synthesizing biochar-based nano-composites yields new materials that combine the advantages of biochar with those of nano-materials. The resulting composites usually exhibit great improvement in functional groups, pore properties, surface active sites, catalytic degradation ability and ease of separation. These composites have excellent abilities to adsorb a range of contaminants from aqueous solutions. Particularly, catalytic material-coated biochar can exert simultaneous adsorption and catalytic degradation functions for organic contaminant removal. Synthesizing biochar-based nano-composites has become an important practice for expanding the environmental applications of biochar and nanotechnology. This paper aims to review and summarize the various synthesis techniques for biochar-based nano-composites and their effects on the decontamination of wastewater. The characteristics and advantages of existing synthesis methods are summarized and discussed. Application of biochar-based nano-composites for the removal of different contaminants and the underlying mechanisms are reviewed. Furthermore, knowledge gaps that exist in the fabrication and application of biochar-based nano-composites are also identified. Copyright © 2016 Elsevier Ltd. All rights reserved.
Atmospheric Fluorescence Yield
NASA Technical Reports Server (NTRS)
Adams, James H., Jr.; Christl, M. J.; Fountain, W. F.; Gregory, J. C.; Martens, K.; Sokolsky, P.; Whitaker, Ann F. (Technical Monitor)
2001-01-01
Several existing and planned experiments estimate the energies of ultra-high energy cosmic rays from air showers using the atmospheric fluorescence from these showers. Accurate knowledge of the conversion from atmospheric fluorescence to energy loss by ionizing particles in the atmosphere is key to this technique. In this paper we discuss a small balloon-borne instrument to make the first in situ measurements versus altitude of the atmospheric fluorescence yield. The instrument can also be used in the lab to investigate the dependence of the fluorescence yield in air on temperature, pressure and the concentrations of other gases that are present in the atmosphere. The results can be used to explore environmental effects on, and improve the accuracy of, cosmic ray energy measurements for existing ground-based experiments and future space-based experiments.
Handling nonnormality and variance heterogeneity for quantitative sublethal toxicity tests.
Ritz, Christian; Van der Vliet, Leana
2009-09-01
The advantages of using regression-based techniques to derive endpoints from environmental toxicity data are clear, and slowly, this superior analytical technique is gaining acceptance. As use of regression-based analysis becomes more widespread, some of the associated nuances and potential problems come into sharper focus. Looking at data sets that cover a broad spectrum of standard test species, we noticed that some model fits to data failed to meet two key assumptions, variance homogeneity and normality, that are necessary for correct statistical analysis via regression-based techniques. Failure to meet these assumptions often is caused by reduced variance at the concentrations showing severe adverse effects. Although commonly used with linear regression analysis, transformation of the response variable only is not appropriate when fitting data using nonlinear regression techniques. Through analysis of sample data sets, including Lemna minor, Eisenia andrei (terrestrial earthworm), and algae, we show that both the so-called Box-Cox transformation and use of the Poisson distribution can help to correct variance heterogeneity and nonnormality and so allow nonlinear regression analysis to be implemented. Both the Box-Cox transformation and the Poisson distribution can be readily implemented into existing protocols for statistical analysis. By correcting for nonnormality and variance heterogeneity, these two statistical tools can be used to encourage the transition to regression-based analysis and the depreciation of less-desirable and less-flexible analytical techniques, such as linear interpolation.
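A hedged sketch of the "transform both sides" use of the Box-Cox transformation before a nonlinear fit is shown below; the three-parameter log-logistic curve, the fixed lambda of 0.5 and the simulated dose-response data are illustrative assumptions, not the paper's protocol (in practice lambda would itself be estimated).

# Transform-both-sides Box-Cox: the same power transform is applied to the
# observed responses and to the model predictions, so a constant-variance
# least-squares fit becomes appropriate despite heteroscedastic raw data.
import numpy as np
from scipy.optimize import least_squares

def boxcox(y, lam):
    return (y**lam - 1.0) / lam if lam != 0 else np.log(y)

def loglogistic(dose, top, ec50, slope):
    return top / (1.0 + (dose / ec50) ** slope)

def fit_transform_both_sides(dose, response, lam=0.5):
    """Least squares on Box-Cox-transformed responses AND predictions."""
    def residuals(theta):
        pred = loglogistic(dose, *theta)
        return boxcox(response, lam) - boxcox(pred, lam)
    x0 = [response.max(), np.median(dose), 1.0]
    bounds = ([1e-6, 1e-6, 0.1], [1e4, 1e3, 10.0])
    return least_squares(residuals, x0=x0, bounds=bounds).x

rng = np.random.default_rng(2)
dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])
truth = loglogistic(dose, top=100.0, ec50=2.0, slope=1.2)
response = truth * np.exp(0.1 * rng.standard_normal(dose.size))  # spread shrinks as the response drops
print(fit_transform_both_sides(dose, response))   # roughly [100, 2, 1.2]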
Techniques in teaching statistics : linking research production and research use.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Martinez-Moyano, I.; Smith, A.; Univ. of Massachusetts at Boston
In the spirit of closing the 'research-practice gap,' the authors extend evidence-based principles to statistics instruction in social science graduate education. The authors employ a Delphi method to survey experienced statistics instructors to identify teaching techniques to overcome the challenges inherent in teaching statistics to students enrolled in practitioner-oriented master's degree programs. Among the teaching techniques identified as essential are using real-life examples, requiring data collection exercises, and emphasizing interpretation rather than results. Building on existing research, preliminary interviews, and the findings from the study, the authors develop a model describing antecedents to the strength of the link between research and practice.
Off-diagonal expansion quantum Monte Carlo
NASA Astrophysics Data System (ADS)
Albash, Tameem; Wagenbreth, Gene; Hen, Itay
2017-12-01
We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.
Some fuzzy techniques for staff selection process: A survey
NASA Astrophysics Data System (ADS)
Md Saad, R.; Ahmad, M. Z.; Abu, M. S.; Jusoh, M. S.
2013-04-01
With a high level of business competition, it is vital to have flexible staff who are able to adapt themselves to work circumstances. However, the staff selection process is not an easy task to solve, even when it is tackled in a simplified version containing only a single criterion and a homogeneous skill. When multiple criteria and various skills are involved, the problem becomes much more complicated. In addition, there is some information that cannot be measured precisely. This is patently obvious when dealing with opinions, thoughts, feelings, beliefs, etc. One possible tool to handle this issue is fuzzy set theory. Therefore, the objective of this paper is to review the existing fuzzy techniques for solving the staff selection process. It classifies several existing research methods and identifies areas where there is a gap and a need for further research. Finally, this paper concludes by suggesting new ideas for future research based on the gaps identified.
Structural Optimization of a Knuckle with Consideration of Stiffness and Durability Requirements
Kim, Geun-Yeon
2014-01-01
The automobile's knuckle is connected to parts of the steering system and the suspension system, and it is used for adjusting the direction of rotation through its attachment to the wheel. This study changes the existing material from GCD450 to Al6082M and recommends a lightweight design of the knuckle, obtained through optimal design techniques, to be installed in small cars. Six shape design variables were selected for the optimization of the knuckle, and criteria relevant to stiffness and durability were considered as the design requirements during the optimization process. A metamodel-based optimization method that uses kriging interpolation was applied. The result shows that all constraints for stiffness and durability are satisfied using Al6082M, while reducing the weight of the knuckle by 60% compared to that of the existing GCD450. PMID:24995359
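A minimal sketch of metamodel-based optimization with a kriging (Gaussian process) surrogate is given below; the two design variables and the analytic "mass" objective are stand-ins for the knuckle's six shape variables and its finite-element stiffness/durability responses.

# Kriging surrogate optimization: fit a Gaussian process to a few expensive
# samples, then search the cheap surrogate for a promising design.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_simulation(x):            # pretend FE analysis: a mass-like objective
    return (x[:, 0] - 0.3) ** 2 + (x[:, 1] - 0.7) ** 2

rng = np.random.default_rng(0)
X_train = rng.random((12, 2))                      # sampled design variables in [0, 1]^2
y_train = expensive_simulation(X_train)

surrogate = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
surrogate.fit(X_train, y_train)

candidates = rng.random((5000, 2))                 # cheap to evaluate on the surrogate
pred = surrogate.predict(candidates)
best = candidates[np.argmin(pred)]
print("surrogate optimum near", best)              # should approach (0.3, 0.7)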
Web Navigation Sequences Automation in Modern Websites
NASA Astrophysics Data System (ADS)
Montoto, Paula; Pan, Alberto; Raposo, Juan; Bellas, Fernando; López, Javier
Most of today's web sources are designed to be used by humans, but they do not provide suitable interfaces for software programs. That is why a growing interest has arisen in so-called web automation applications, which are widely used for different purposes such as B2B integration, automated testing of web applications, or technology and business watch. Previous proposals assume models for generating and reproducing navigation sequences that are not able to correctly deal with new websites using technologies such as AJAX: on one hand, existing systems only allow recording simple navigation actions and, on the other hand, they are unable to detect the end of the effects caused by a user action. In this paper, we propose a set of new techniques to record and execute web navigation sequences able to deal with all the complexity existing in AJAX-based websites. We also present an exhaustive evaluation of the proposed techniques that shows very promising results.
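As a rough illustration of the two difficulties the paper addresses, replaying a recorded action and detecting when its AJAX effects have finished, the Selenium-based sketch below waits both for the element produced by the request and for pending XMLHttpRequests to drain; the URL, element selectors and the jQuery-based check are assumptions, and this is not the system proposed in the paper.

# Replay a recorded action and wait until its asynchronous effects are done.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Firefox()
driver.get("https://example.org/search")            # hypothetical AJAX-based page

# replay a recorded user action: type a query and click the search button
driver.find_element(By.ID, "query").send_keys("composite materials")
driver.find_element(By.ID, "search-button").click()

wait = WebDriverWait(driver, timeout=15)
# 1) wait until an element produced by the AJAX call is present
wait.until(EC.presence_of_element_located((By.CSS_SELECTOR, "#results .item")))
# 2) additionally wait until no XMLHttpRequests are pending (works on jQuery pages)
wait.until(lambda d: d.execute_script("return window.jQuery ? jQuery.active === 0 : true"))

print(driver.find_element(By.CSS_SELECTOR, "#results .item").text)
driver.quit()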
Off-diagonal expansion quantum Monte Carlo.
Albash, Tameem; Wagenbreth, Gene; Hen, Itay
2017-12-01
We propose a Monte Carlo algorithm designed to simulate quantum as well as classical systems at equilibrium, bridging the algorithmic gap between quantum and classical thermal simulation algorithms. The method is based on a decomposition of the quantum partition function that can be viewed as a series expansion about its classical part. We argue that the algorithm not only provides a theoretical advancement in the field of quantum Monte Carlo simulations, but is optimally suited to tackle quantum many-body systems that exhibit a range of behaviors from "fully quantum" to "fully classical," in contrast to many existing methods. We demonstrate the advantages, sometimes by orders of magnitude, of the technique by comparing it against existing state-of-the-art schemes such as path integral quantum Monte Carlo and stochastic series expansion. We also illustrate how our method allows for the unification of quantum and classical thermal parallel tempering techniques into a single algorithm and discuss its practical significance.
Fundamental limits of reconstruction-based superresolution algorithms under local translation.
Lin, Zhouchen; Shum, Heung-Yeung
2004-01-01
Superresolution is a technique that can produce images of a higher resolution than that of the originally captured ones. Nevertheless, improvement in resolution using such a technique is very limited in practice. This makes it significant to study the problem: "Do fundamental limits exist for superresolution?" In this paper, we focus on a major class of superresolution algorithms, called the reconstruction-based algorithms, which compute high-resolution images by simulating the image formation process. Assuming local translation among low-resolution images, this paper is the first attempt to determine the explicit limits of reconstruction-based algorithms, under both real and synthetic conditions. Based on the perturbation theory of linear systems, we obtain the superresolution limits from the conditioning analysis of the coefficient matrix. Moreover, we determine the number of low-resolution images that are sufficient to achieve the limit. Both real and synthetic experiments are carried out to verify our analysis.
Sen, Novonil; Kundu, Tribikram
2018-07-01
Estimating the location of an acoustic source in a structure is an important step towards passive structural health monitoring. Techniques for localizing an acoustic source in isotropic structures are well developed in the literature. Development of similar techniques for anisotropic structures, however, has gained attention only in the recent years and has a scope of further improvement. Most of the existing techniques for anisotropic structures either assume a straight line wave propagation path between the source and an ultrasonic sensor or require the material properties to be known. This study considers different shapes of the wave front generated during an acoustic event and develops a methodology to localize the acoustic source in an anisotropic plate from those wave front shapes. An elliptical wave front shape-based technique was developed first, followed by the development of a parametric curve-based technique for non-elliptical wave front shapes. The source coordinates are obtained by minimizing an objective function. The proposed methodology does not assume a straight line wave propagation path and can predict the source location without any knowledge of the elastic properties of the material. A numerical study presented here illustrates how the proposed methodology can accurately estimate the source coordinates. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Kougioumtzoglou, Ioannis A.; dos Santos, Ketson R. M.; Comerford, Liam
2017-09-01
Various system identification techniques exist in the literature that can handle non-stationary measured time-histories, or cases of incomplete data, or address systems following a fractional calculus modeling. However, there are not many (if any) techniques that can address all three aforementioned challenges simultaneously in a consistent manner. In this paper, a novel multiple-input/single-output (MISO) system identification technique is developed for parameter identification of nonlinear and time-variant oscillators with fractional derivative terms subject to incomplete non-stationary data. The technique utilizes a representation of the nonlinear restoring forces as a set of parallel linear sub-systems. In this regard, the oscillator is transformed into an equivalent MISO system in the wavelet domain. Next, a recently developed L1-norm minimization procedure based on compressive sensing theory is applied for determining the wavelet coefficients of the available incomplete non-stationary input-output (excitation-response) data. Finally, these wavelet coefficients are utilized to determine appropriately defined time- and frequency-dependent wavelet based frequency response functions and related oscillator parameters. Several linear and nonlinear time-variant systems with fractional derivative elements are used as numerical examples to demonstrate the reliability of the technique even in cases of noise corrupted and incomplete data.
A novel methodology for querying web images
NASA Astrophysics Data System (ADS)
Prabhakara, Rashmi; Lee, Ching Cheng
2005-01-01
Ever since the advent of the Internet, there has been an immense growth in the amount of image data available on the World Wide Web. With such a magnitude of image availability, an efficient and effective image retrieval system is required to make use of this information. This research presents an effective image matching and indexing technique that improves on existing integrated image retrieval methods. The proposed technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. The first phase consists of topic-based image retrieval using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. It employs a focused crawler that allows the user to enter not only the keyword for the topic-based search but also the scope in which the user wants to find the images. The second phase uses the query-by-example specification to perform a low-level content-based image match, retrieving a smaller set of results closer to the example image. Information related to the image features is automatically extracted from the query image by the image processing system. A computationally inexpensive technique based on color features is used to perform content-based matching of images. The main goal is to develop a functional image search and indexing system and to demonstrate that better retrieval results can be achieved with this proposed hybrid search technique.
A novel methodology for querying web images
NASA Astrophysics Data System (ADS)
Prabhakara, Rashmi; Lee, Ching Cheng
2004-12-01
Ever since the advent of the Internet, there has been an immense growth in the amount of image data available on the World Wide Web. With such a magnitude of image availability, an efficient and effective image retrieval system is required to make use of this information. This research presents an effective image matching and indexing technique that improves on existing integrated image retrieval methods. The proposed technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. The first phase consists of topic-based image retrieval using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. It employs a focused crawler that allows the user to enter not only the keyword for the topic-based search but also the scope in which the user wants to find the images. The second phase uses the query-by-example specification to perform a low-level content-based image match, retrieving a smaller set of results closer to the example image. Information related to the image features is automatically extracted from the query image by the image processing system. A computationally inexpensive technique based on color features is used to perform content-based matching of images. The main goal is to develop a functional image search and indexing system and to demonstrate that better retrieval results can be achieved with this proposed hybrid search technique.
Current state of the art of vision based SLAM
NASA Astrophysics Data System (ADS)
Muhammad, Naveed; Fofi, David; Ainouz, Samia
2009-02-01
The ability of a robot to localise itself and simultaneously build a map of its environment (Simultaneous Localisation and Mapping or SLAM) is a fundamental characteristic required for autonomous operation of the robot. Vision Sensors are very attractive for application in SLAM because of their rich sensory output and cost effectiveness. Different issues are involved in the problem of vision based SLAM and many different approaches exist in order to solve these issues. This paper gives a classification of state-of-the-art vision based SLAM techniques in terms of (i) imaging systems used for performing SLAM which include single cameras, stereo pairs, multiple camera rigs and catadioptric sensors, (ii) features extracted from the environment in order to perform SLAM which include point features and line/edge features, (iii) initialisation of landmarks which can either be delayed or undelayed, (iv) SLAM techniques used which include Extended Kalman Filtering, Particle Filtering, biologically inspired techniques like RatSLAM, and other techniques like Local Bundle Adjustment, and (v) use of wheel odometry information. The paper also presents the implementation and analysis of stereo pair based EKF SLAM for synthetic data. Results prove the technique to work successfully in the presence of considerable amounts of sensor noise. We believe that state of the art presented in the paper can serve as a basis for future research in the area of vision based SLAM. It will permit further research in the area to be carried out in an efficient and application specific way.
Endoscopic skull base training using 3D printed models with pre-existing pathology.
Narayanan, Vairavan; Narayanan, Prepageran; Rajagopalan, Raman; Karuppiah, Ravindran; Rahman, Zainal Ariff Abdul; Wormald, Peter-John; Van Hasselt, Charles Andrew; Waran, Vicknes
2015-03-01
Endoscopic base of skull surgery has been growing in acceptance in the recent past due to improvements in visualisation and micro instrumentation as well as the surgical maturing of early endoscopic skull base practitioners. Unfortunately, these demanding procedures have a steep learning curve. A physical simulation that is able to reproduce the complex anatomy of the anterior skull base provides very useful means of learning the necessary skills in a safe and effective environment. This paper aims to assess the ease of learning endoscopic skull base exposure and drilling techniques using an anatomically accurate physical model with a pre-existing pathology (i.e., basilar invagination) created from actual patient data. Five models of a patient with platy-basia and basilar invagination were created from the original MRI and CT imaging data of a patient. The models were used as part of a training workshop for ENT surgeons with varying degrees of experience in endoscopic base of skull surgery, from trainees to experienced consultants. The surgeons were given a list of key steps to achieve in exposing and drilling the skull base using the simulation model. They were then asked to list the level of difficulty of learning these steps using the model. The participants found the models suitable for learning registration, navigation and skull base drilling techniques. All participants also found the deep structures to be accurately represented spatially as confirmed by the navigation system. These models allow structured simulation to be conducted in a workshop environment where surgeons and trainees can practice to perform complex procedures in a controlled fashion under the supervision of experts.
Modeling of ETL-Processes and Processed Information in Clinical Data Warehousing.
Tute, Erik; Steiner, Jochen
2018-01-01
Literature describes a big potential for reuse of clinical patient data, and a clinical data warehouse (CDWH) is a means for that. Our objective is to support management and maintenance of the processes that extract, transform and load (ETL) data into CDWHs, and to ease reuse of metadata between regular IT-management, the CDWH and secondary data users, by providing a modeling approach. An expert survey and a literature review were carried out to find requirements and existing modeling techniques. An ETL-modeling technique was developed by extending existing modeling techniques, and was evaluated by exemplarily modeling an existing ETL-process and by a second expert survey. Nine experts participated in the first survey. The literature review yielded 15 included publications. Six existing modeling techniques were identified. A modeling technique extending 3LGM2 and combining it with openEHR information models was developed and evaluated. Seven experts participated in the evaluation. The developed approach can help in management and maintenance of ETL-processes and could serve as an interface between regular IT-management, the CDWH and secondary data users.
Cheboyina, Sreekhar; Wyandt, Christy M
2008-07-09
A novel freeze pelletization technique was evaluated for the preparation of wax-based sustained release matrix pellets. Pellets containing water-soluble drugs were successfully prepared using a variety of waxes. The drug release significantly depended on the wax type used and the aqueous drug solubility. The drug release decreased as the hydrophobicity of wax increased and the drug release increased as the aqueous drug solubility increased. In glyceryl monostearate (GMS) pellets, drug release rate decreased as the loading of theophylline increased. On the contrary, the release rate increased as the drug loading of diltiazem HCl increased in Precirol pellets. Theophylline at low drug loads existed in a dissolved state in GMS pellets and the release followed desorption kinetics. At higher loads, theophylline existed in a crystalline state and the release followed dissolution-controlled constant release for all the waxes studied. However, with the addition of increasing amounts of Brij 76, theophylline release rate increased and the release mechanism shifted to diffusion-controlled square root time kinetics. But the release of diltiazem HCl from Precirol pellets at all drug loads, followed diffusion-controlled square root time kinetics. Therefore, pellets capable of providing a variety of release profiles for different drugs can be prepared using this freeze pelletization technique by suitably modifying the pellet forming matrix compositions.
Rahman, Mohd Nasrull Abdol; Mohamad, Siti Shafika
2017-01-01
Computer work is associated with musculoskeletal disorders (MSDs). Several methods have been developed to assess computer work risk factors related to MSDs. This review aims to give an overview of the current pen-and-paper-based observational methods available for assessing ergonomic risk factors of computer work. We searched an electronic database for materials from 1992 until 2015. The selected methods focused on computer work, pen-and-paper observational methods, office risk factors and musculoskeletal disorders. This review was developed to assess the risk factors, reliability and validity of pen-and-paper observational methods associated with computer work. Two evaluators independently carried out this review. Seven observational methods used to assess exposure to office risk factors for work-related musculoskeletal disorders were identified. The risk factors covered by the current pen-and-paper-based observational tools were postures, office components, force and repetition. Of the seven methods, only five had been tested for reliability; they were proven to be reliable and were rated as moderate to good. For validity, only four of the seven methods were tested, and the results were moderate. Many observational tools already exist, but no single tool appears to cover all of the risk factors, including working posture, office components, force, repetition and the office environment, at office workstations and in computer work. Although the most important factor in developing a tool is proper validation of the exposure assessment techniques, not all of the existing observational methods have been tested for reliability and validity. Furthermore, this review could provide researchers with ways to improve pen-and-paper-based observational methods for assessing ergonomic risk factors of computer work.
Development of Instrumental Techniques for Color Assessment of Camouflage Patterns
ERIC Educational Resources Information Center
Fang, Gang
2012-01-01
Camouflage fabrics are produced on a large scale for use in the US military and other applications. One of the highest volume camouflage fabrics is known as the Universal Camouflage Pattern (UCP) which is produced for the US Department of Defense. At present, no standard measurement-based color quality control method exists for camouflage…
A simulation-based evaluation of methods for inferring linear barriers to gene flow
Christopher Blair; Dana E. Weigel; Matthew Balazik; Annika T. H. Keeley; Faith M. Walker; Erin Landguth; Sam Cushman; Melanie Murphy; Lisette Waits; Niko Balkenhol
2012-01-01
Different analytical techniques used on the same data set may lead to different conclusions about the existence and strength of genetic structure. Therefore, reliable interpretation of the results from different methods depends on the efficacy and reliability of different statistical methods. In this paper, we evaluated the performance of multiple analytical methods to...
Predictive Cache Modeling and Analysis
2011-11-01
metaheuristic/bin-packing algorithm to optimize task placement based on task communication characterization. Our previous work on task allocation showed... Cache Miss Minimization Technology: To efficiently explore combinations and discover nearly-optimal task-assignment algorithms, we extended our... it was possible to use our algorithmic techniques to decrease network bandwidth consumption by ~25%. In this effort, we adapted these existing
ERIC Educational Resources Information Center
Sniderman, Jhase A.; Roffey, Darren M.; Lee, Richard; Papineau, Gabrielle D.; Miles, Isabelle H.; Wai, Eugene K.; Kingwell, Stephen P.
2017-01-01
Background: Evidence-based treatments for adult back pain have long been confirmed, with research continuing to narrow down the scope of recommended practices. However, a tension exists between research-driven treatments and unsubstantiated modalities and techniques promoted to the public. This disparity in knowledge translation, which results in…
ERIC Educational Resources Information Center
Micco, Mary; Popp, Rich
Techniques for building a world-wide information infrastructure by reverse engineering existing databases to link them in a hierarchical system of subject clusters to create an integrated database are explored. The controlled vocabulary of the Library of Congress Subject Headings is used to ensure consistency and group similar items. Each database…
Floating-point system quantization errors in digital control systems
NASA Technical Reports Server (NTRS)
Phillips, C. L.; Vallely, D. P.
1978-01-01
This paper considers digital controllers (filters) operating in floating-point arithmetic in either open-loop or closed-loop systems. A quantization error analysis technique is developed, and is implemented by a digital computer program that is based on a digital simulation of the system. The program can be integrated into existing digital simulations of a system.
Cost Analysis of Instructional Technology.
ERIC Educational Resources Information Center
Johnson, F. Craig; Dietrich, John E.
Although some serious limitations in the cost analysis technique do exist, the need for cost data in decision making is so great that every effort should be made to obtain accurate estimates. This paper discusses the several issues which arise when an attempt is made to make quality, trade-off, or scope decisions based on cost data. Three methods…
Reducing Router Forwarding Table Size Using Aggregation and Caching
ERIC Educational Resources Information Center
Liu, Yaoqing
2013-01-01
The fast growth of global routing table size has been causing concerns that the Forwarding Information Base (FIB) will not be able to fit in existing routers' expensive line-card memory, and upgrades will lead to a higher cost for network operators and customers. FIB Aggregation, a technique that merges multiple FIB entries into one, is probably…
Application of an Optimal Tuner Selection Approach for On-Board Self-Tuning Engine Models
NASA Technical Reports Server (NTRS)
Simon, Donald L.; Armstrong, Jeffrey B.; Garg, Sanjay
2012-01-01
An enhanced design methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented in this paper. It specifically addresses the under-determined estimation problem, in which there are more unknown parameters than available sensor measurements. This work builds upon an existing technique for systematically selecting a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. While the existing technique was optimized for open-loop engine operation at a fixed design point, in this paper an alternative formulation is presented that enables the technique to be optimized for an engine operating under closed-loop control throughout the flight envelope. The theoretical Kalman filter mean squared estimation error at a steady-state closed-loop operating point is derived, and the tuner selection approach applied to minimize this error is discussed. A technique for constructing a globally optimal tuning parameter vector, which enables full-envelope application of the technology, is also presented, along with design steps for adjusting the dynamic response of the Kalman filter state estimates. Results from the application of the technique to linear and nonlinear aircraft engine simulations are presented and compared to the conventional approach of tuner selection. The new methodology is shown to yield a significant improvement in on-line Kalman filter estimation accuracy.
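To make the under-determined estimation setting concrete, the sketch below runs a steady-state Kalman filter on a reduced tuner vector that is mapped into the full health-parameter space through a fixed tuner-to-parameter matrix; all matrices and noise levels are toy values, and the offline optimization that actually selects that matrix is not reproduced.

# With fewer sensors than health parameters, a reduced tuner vector q is
# estimated and Vstar (chosen offline) maps it back to the parameter space.
import numpy as np

rng = np.random.default_rng(0)

n_params, n_tuners, n_meas = 6, 3, 3               # more parameters than sensors
Vstar = rng.random((n_params, n_tuners))           # stand-in for the optimized selection
H_full = rng.random((n_meas, n_params))            # sensor sensitivity to parameters
H = H_full @ Vstar                                 # measurement model for the tuners
Q = 1e-4 * np.eye(n_tuners)                        # process noise covariance
R = 1e-2 * np.eye(n_meas)                          # measurement noise covariance

q_hat = np.zeros(n_tuners)                         # tuner estimate
P = np.eye(n_tuners)

truth = np.array([0.02, -0.01, 0.015, 0.0, 0.01, -0.02])   # true parameter shifts
for _ in range(200):                               # tuners modeled as slowly varying constants
    P = P + Q                                      # predict
    z = H_full @ truth + rng.multivariate_normal(np.zeros(n_meas), R)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    q_hat = q_hat + K @ (z - H @ q_hat)            # update
    P = (np.eye(n_tuners) - K @ H) @ P

print("estimated parameter shifts:", Vstar @ q_hat)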
Jang, Jae-Wook; Yun, Jaesung; Mohaisen, Aziz; Woo, Jiyoung; Kim, Huy Kang
2016-01-01
Mass-market mobile security threats have increased recently due to the growth of mobile technologies and the popularity of mobile devices. Accordingly, techniques have been introduced for identifying, classifying, and defending against mobile threats utilizing static, dynamic, on-device, and off-device techniques. Static techniques are easy to evade, while dynamic techniques are expensive. On-device techniques are prone to evasion, while off-device techniques need to be always online. To address some of those shortcomings, we introduce Andro-profiler, a hybrid behavior-based analysis and classification system for mobile malware. Andro-profiler's main goals are efficiency, scalability, and accuracy. To that end, Andro-profiler classifies malware by exploiting behavior profiles extracted from integrated system logs, including system calls. Andro-profiler executes a malicious application on an emulator in order to generate the integrated system logs, and creates human-readable behavior profiles by analyzing them. By comparing the behavior profile of a malicious application with the representative behavior profile of each malware family using a weighted similarity matching technique, Andro-profiler detects it and classifies it into a malware family. The experimental results demonstrate that Andro-profiler is scalable, performs well in detecting and classifying malware with accuracy greater than 98%, outperforms the existing state-of-the-art work, and is capable of identifying 0-day mobile malware samples.
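A toy version of the weighted similarity matching step is sketched below: behavior profiles are treated as count vectors over a few behavior types and compared to per-family representative profiles with a weighted cosine score; the feature names, weights and thresholds are invented and do not come from Andro-profiler.

# Weighted cosine similarity between a sample's behavior profile and each
# malware family's representative profile; the best match above a threshold wins.
import numpy as np

features = ["send_sms", "read_contacts", "open_socket", "write_file", "get_location"]
weights = np.array([3.0, 2.0, 1.5, 1.0, 2.0])      # behaviors weighted by assumed severity

family_profiles = {
    "SMS-Trojan":  np.array([40,  5,  2,  1,  0]),
    "Spyware":     np.array([ 1, 30,  8,  4, 25]),
    "Benign-like": np.array([ 0,  1,  3, 10,  1]),
}

def weighted_cosine(a, b, w):
    aw, bw = a * np.sqrt(w), b * np.sqrt(w)        # weight the inner product by w
    return float(aw @ bw / (np.linalg.norm(aw) * np.linalg.norm(bw) + 1e-12))

def classify(profile, threshold=0.8):
    scores = {fam: weighted_cosine(profile, rep, weights)
              for fam, rep in family_profiles.items()}
    best = max(scores, key=scores.get)
    return (best if scores[best] >= threshold else "unknown"), scores

sample = np.array([35, 4, 1, 0, 1])                # observed behavior counts
print(classify(sample))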
A new method for QRS detection in ECG signals using QRS-preserving filtering techniques.
Sharma, Tanushree; Sharma, Kamalesh K
2018-03-28
Detection of QRS complexes in ECG signals is required for various purposes such as determination of heart rate, feature extraction and classification. The problem of automatic QRS detection in ECG signals is complicated by the presence of noise spectrally overlapping with the QRS frequency range. As a solution to this problem, we propose the use of least-squares-optimisation-based smoothing techniques that suppress the noise peaks in the ECG while preserving the QRS complexes. We also propose a novel nonlinear transformation technique that is applied after the smoothing operations, which equalises the QRS amplitudes without boosting the suppressed noise peaks. After these preprocessing operations, the R-peaks can finally be detected with high accuracy. The proposed technique has a low computational load and, therefore, it can be used for real-time QRS detection in a wearable device such as a Holter monitor or for fast offline QRS detection. The offline and real-time versions of the proposed technique have been evaluated on the standard MIT-BIH database. The offline implementation is found to perform better than state-of-the-art techniques based on wavelet transforms, empirical mode decomposition, etc. and the real-time implementation also shows improved performance over existing real-time QRS detection techniques.
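A hedged sketch of such a pipeline is given below, using Savitzky-Golay filtering as the local least-squares smoother, a saturating transform in place of the paper's amplitude-equalising nonlinearity, and standard peak picking; the window lengths and thresholds are illustrative choices, not the published parameters.

# Smoothing -> baseline removal -> saturating amplitude equalisation -> peak picking.
import numpy as np
from scipy.signal import savgol_filter, find_peaks

def detect_r_peaks(ecg, fs):
    # 1) QRS-preserving smoothing: short local least-squares polynomial fits
    smooth = savgol_filter(ecg, window_length=int(0.05 * fs) | 1, polyorder=3)
    # 2) remove baseline wander with a much longer least-squares fit
    baseline = savgol_filter(smooth, window_length=int(0.6 * fs) | 1, polyorder=2)
    detrended = smooth - baseline
    # 3) saturating nonlinearity: equalises large beat amplitudes, keeps small peaks small
    scale = np.percentile(np.abs(detrended), 99.5) + 1e-9
    equalised = np.tanh(detrended / scale)
    # 4) peak picking with a physiological refractory distance (~250 ms)
    peaks, _ = find_peaks(equalised, height=0.5, distance=int(0.25 * fs))
    return peaks

if __name__ == "__main__":
    fs = 360                                        # MIT-BIH sampling rate
    ecg = np.zeros(10 * fs)
    ecg[::int(0.8 * fs)] = 1.0                      # crude beat spikes every 0.8 s
    ecg += 0.05 * np.random.default_rng(0).standard_normal(ecg.size)
    print(detect_r_peaks(ecg, fs))                  # ~13 peaks, ~288 samples apart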
Quigley, Elizabeth A; Tokay, Barbara A; Jewell, Sarah T; Marchetti, Michael A; Halpern, Allan C
2015-08-01
Photographs are invaluable dermatologic diagnostic, management, research, teaching, and documentation tools. Digital Imaging and Communications in Medicine (DICOM) standards exist for many types of digital medical images, but there are no DICOM standards for camera-acquired dermatologic images to date. To identify and describe existing or proposed technology and technique standards for camera-acquired dermatologic images in the scientific literature. Systematic searches of the PubMed, EMBASE, and Cochrane databases were performed in January 2013 using photography and digital imaging, standardization, and medical specialty and medical illustration search terms and augmented by a gray literature search of 14 websites using Google. Two reviewers independently screened titles of 7371 unique publications, followed by 3 sequential full-text reviews, leading to the selection of 49 publications with the most recent (1985-2013) or detailed description of technology or technique standards related to the acquisition or use of images of skin disease (or related conditions). No universally accepted existing technology or technique standards for camera-based digital images in dermatology were identified. Recommendations are summarized for technology imaging standards, including spatial resolution, color resolution, reproduction (magnification) ratios, postacquisition image processing, color calibration, compression, output, archiving and storage, and security during storage and transmission. Recommendations are also summarized for technique imaging standards, including environmental conditions (lighting, background, and camera position), patient pose and standard view sets, and patient consent, privacy, and confidentiality. Proposed standards for specific-use cases in total body photography, teledermatology, and dermoscopy are described. The literature is replete with descriptions of obtaining photographs of skin disease, but universal imaging standards have not been developed, validated, and adopted to date. Dermatologic imaging is evolving without defined standards for camera-acquired images, leading to variable image quality and limited exchangeability. The development and adoption of universal technology and technique standards may first emerge in scenarios when image use is most associated with a defined clinical benefit.
Support Vector Machine Based on Adaptive Acceleration Particle Swarm Optimization
Abdulameer, Mohammed Hasan; Othman, Zulaiha Ali
2014-01-01
Existing face recognition methods utilize particle swarm optimizer (PSO) and opposition based particle swarm optimizer (OPSO) to optimize the parameters of SVM. However, the utilization of random values in the velocity calculation decreases the performance of these techniques; that is, during the velocity computation, we normally use random values for the acceleration coefficients and this creates randomness in the solution. To address this problem, an adaptive acceleration particle swarm optimization (AAPSO) technique is proposed. To evaluate our proposed method, we employ both face and iris recognition based on AAPSO with SVM (AAPSO-SVM). In the face and iris recognition systems, performance is evaluated using two human face databases, YALE and CASIA, and the UBiris dataset. In this method, we initially perform feature extraction and then recognition on the extracted features. In the recognition process, the extracted features are used for SVM training and testing. During the training and testing, the SVM parameters are optimized with the AAPSO technique, and in AAPSO, the acceleration coefficients are computed using the particle fitness values. The parameters in SVM, which are optimized by AAPSO, perform efficiently for both face and iris recognition. A comparative analysis between our proposed AAPSO-SVM and the PSO-SVM technique is presented. PMID:24790584
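A minimal sketch of the key idea, fitness-driven acceleration coefficients replacing the random ones in the PSO velocity update, is given below; the specific coefficient formula is a hypothetical stand-in since the abstract does not state it, and a minimisation problem with positive fitness values is assumed.

```python
import numpy as np

def aapso_velocity_update(v, x, pbest, gbest, fit, fit_pbest, fit_gbest, w=0.7):
    """Hedged AAPSO-style update: acceleration coefficients are computed from
    particle fitness values rather than drawn at random (illustrative formula)."""
    c1 = fit_pbest / (fit + fit_pbest + 1e-12)  # hypothetical adaptive coefficient
    c2 = fit_gbest / (fit + fit_gbest + 1e-12)  # hypothetical adaptive coefficient
    return w * v + c1 * (pbest - x) + c2 * (gbest - x)
```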
Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns
NASA Technical Reports Server (NTRS)
Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.
2006-01-01
Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of a formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to the component-based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition effort. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component compositions. We use code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.
Image processing via level set curvature flow
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malladi, R.; Sethian, J.A.
We present a controlled image smoothing and enhancement method based on a curvature flow interpretation of the geometric heat equation. Compared to existing techniques, the model has several distinct advantages. (i) It contains just one enhancement parameter. (ii) The scheme naturally inherits a stopping criterion from the image; continued application of the scheme produces no further change. (iii) The method is one of the fastest possible schemes based on a curvature-controlled approach. 15 ref., 6 figs.
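For readers who want the update in code form, the following Python fragment performs one explicit step of the curvature-controlled smoothing, I_t = kappa |grad I|, using central differences; the time step and the convergence note are illustrative assumptions rather than the authors' numerical scheme.

```python
import numpy as np

def curvature_flow_step(I, dt=0.1, eps=1e-8):
    """One explicit step of curvature (geometric heat) flow on an image:
    I_t = kappa * |grad I|, written with central differences (sketch only)."""
    I = I.astype(float)
    Iy, Ix = np.gradient(I)        # axis 0 = rows (y), axis 1 = cols (x)
    Iyy, _ = np.gradient(Iy)
    Ixy, Ixx = np.gradient(Ix)
    num = Ixx * Iy**2 - 2.0 * Ix * Iy * Ixy + Iyy * Ix**2
    den = Ix**2 + Iy**2 + eps
    return I + dt * num / den

# Repeated application smooths noise while preserving edges; iteration can be
# stopped once successive updates change the image negligibly, mirroring the
# natural stopping behaviour described in the abstract.
```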
NASA Technical Reports Server (NTRS)
Ito, K.; Teglas, R.
1984-01-01
The numerical scheme based on the Legendre-tau approximation is proposed to approximate the feedback solution to the linear quadratic optimal control problem for hereditary differential systems. The convergence property is established using Trotter ideas. The method yields very good approximations at low orders and provides an approximation technique for computing closed-loop eigenvalues of the feedback system. A comparison with existing methods (based on averaging and spline approximations) is made.
NASA Technical Reports Server (NTRS)
Ito, Kazufumi; Teglas, Russell
1987-01-01
The numerical scheme based on the Legendre-tau approximation is proposed to approximate the feedback solution to the linear quadratic optimal control problem for hereditary differential systems. The convergence property is established using Trotter ideas. The method yields very good approximations at low orders and provides an approximation technique for computing closed-loop eigenvalues of the feedback system. A comparison with existing methods (based on averaging and spline approximations) is made.
NASA Astrophysics Data System (ADS)
Bezmaternykh, P. V.; Nikolaev, D. P.; Arlazarov, V. L.
2018-04-01
Textual block rectification, or slant correction, is an important stage of document image processing in OCR systems. This paper considers existing methods and introduces an approach for constructing such algorithms based on Fast Hough Transform analysis. A quality measurement technique is proposed, and results are shown for both printed and handwritten textual block processing as part of an industrial system for identity document recognition on mobile devices.
Fourier-based classification of protein secondary structures.
Shu, Jian-Jun; Yong, Kian Yan
2017-04-15
The correct prediction of protein secondary structures is one of the key issues in predicting the correct protein folded shape, which is used for determining gene function. Existing methods make use of amino acid properties as indices to classify protein secondary structures, but are faced with a significant number of misclassifications. The paper presents a technique for the classification of protein secondary structures based on protein "signal-plotting" and the use of the Fourier technique for digital signal processing. New indices are proposed to classify protein secondary structures by analyzing hydrophobicity profiles. The approach is simple and straightforward. Results show that more types of protein secondary structures can be classified by means of these newly proposed indices. Copyright © 2017 Elsevier Inc. All rights reserved.
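As a rough illustration of the "signal-plotting" plus Fourier idea, the sketch below converts a sequence into a hydrophobicity signal (using the standard Kyte-Doolittle scale) and measures the spectral power near the roughly 3.6-residue alpha-helix period; the band limits and the single index are assumptions for illustration, not the paper's indices.

```python
import numpy as np

# Kyte-Doolittle hydrophobicity scale (standard published values).
KD = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "C": 2.5, "Q": -3.5,
      "E": -3.5, "G": -0.4, "H": -3.2, "I": 4.5, "L": 3.8, "K": -3.9,
      "M": 1.9, "F": 2.8, "P": -1.6, "S": -0.8, "T": -0.7, "W": -0.9,
      "Y": -1.3, "V": 4.2}

def helix_periodicity_index(seq):
    """Hedged sketch: fraction of spectral power of the hydrophobicity 'signal'
    that falls near the ~3.6-residue alpha-helix period (illustrative index)."""
    h = np.array([KD[a] for a in seq.upper()])
    h = h - h.mean()
    freqs = np.fft.rfftfreq(len(h), d=1.0)           # cycles per residue
    power = np.abs(np.fft.rfft(h)) ** 2
    helix_band = (freqs > 1 / 4.0) & (freqs < 1 / 3.2)
    return power[helix_band].sum() / (power.sum() + 1e-12)
```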
Neighborhood graph and learning discriminative distance functions for clinical decision support.
Tsymbal, Alexey; Zhou, Shaohua Kevin; Huber, Martin
2009-01-01
There are two essential reasons for the slow progress in the acceptance of clinical case retrieval and similarity search-based decision support systems: the special complexity of clinical data, which makes it difficult to define a meaningful and effective distance function on them, and the lack of transparency and explanation ability in many existing clinical case retrieval decision support systems. In this paper, we try to address these two problems by introducing a novel technique for visualizing inter-patient similarity based on a node-link representation with neighborhood graphs and by considering two techniques for learning discriminative distance functions that help to combine the power of strong "black box" learners with the transparency of case retrieval and nearest neighbor classification.
Site-directed nucleases: a paradigm shift in predictable, knowledge-based plant breeding.
Podevin, Nancy; Davies, Howard V; Hartung, Frank; Nogué, Fabien; Casacuberta, Josep M
2013-06-01
Conventional plant breeding exploits existing genetic variability and introduces new variability by mutagenesis. This has proven highly successful in securing food supplies for an ever-growing human population. The use of genetically modified plants is a complementary approach but all plant breeding techniques have limitations. Here, we discuss how the recent evolution of targeted mutagenesis and DNA insertion techniques based on tailor-made site-directed nucleases (SDNs) provides opportunities to overcome such limitations. Plant breeding companies are exploiting SDNs to develop a new generation of crops with new and improved traits. Nevertheless, some technical limitations as well as significant uncertainties on the regulatory status of SDNs may challenge their use for commercial plant breeding. Copyright © 2013 Elsevier Ltd. All rights reserved.
NASA Technical Reports Server (NTRS)
Haste, Deepak; Ghoshal, Sudipto; Johnson, Stephen B.; Moore, Craig
2018-01-01
This paper describes the theory and considerations in the application of model-based techniques to assimilate information from disjoint knowledge sources for performing NASA's Fault Management (FM)-related activities using the TEAMS® toolset. FM consists of the operational mitigation of existing and impending spacecraft failures. NASA's FM directives have both design-phase and operational-phase goals. This paper highlights recent studies by QSI and DST of the capabilities required in the TEAMS® toolset for conducting FM activities with the aim of reducing operating costs, increasing autonomy, and conforming to time schedules. These studies use and extend the analytic capabilities of QSI's TEAMS® toolset to conduct a range of FM activities within a centralized platform.
Laser-Based Slam with Efficient Occupancy Likelihood Map Learning for Dynamic Indoor Scenes
NASA Astrophysics Data System (ADS)
Li, Li; Yao, Jian; Xie, Renping; Tu, Jinge; Feng, Chen
2016-06-01
Location-Based Services (LBS) have attracted growing attention in recent years, especially in indoor environments. The fundamental technique of LBS is map building for unknown environments, a technique also known as simultaneous localization and mapping (SLAM) in the robotics community. In this paper, we propose a novel approach for SLAM in dynamic indoor scenes based on a 2D laser scanner mounted on a mobile Unmanned Ground Vehicle (UGV) with the help of a grid-based occupancy likelihood map. Instead of applying scan matching to two adjacent scans, we propose to match the current scan with the occupancy likelihood map learned from all previous scans at multiple scales to avoid the accumulation of matching errors. Because the points in a scan are acquired sequentially rather than simultaneously, scan distortion unavoidably exists to varying extents. To compensate for the scan distortion caused by the motion of the UGV, we propose to integrate the velocity of the laser range finder (LRF) into the scan matching optimization framework. Besides, to reduce as much as possible the effect of dynamic objects, such as walking pedestrians, that often exist in indoor scenes, we propose a new occupancy likelihood map learning strategy that increases or decreases the probability of each occupancy grid cell after each scan matching. Experimental results in several challenging indoor scenes demonstrate that our proposed approach is capable of providing high-precision SLAM results.
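The map learning strategy can be pictured as a log-odds grid whose cells are strengthened when observed occupied and weakened when observed free, so transient objects such as pedestrians fade away over time. The sketch below is a hedged, generic log-odds implementation of that increase/decrease idea; the hit/miss probabilities and class interface are illustrative assumptions.

```python
import numpy as np

class OccupancyLikelihoodMap:
    """Hedged sketch of an 'increase/decrease after each scan match' grid."""
    def __init__(self, shape, hit=0.9, miss=0.3, clip=10.0):
        self.logodds = np.zeros(shape)
        self.l_hit = np.log(hit / (1.0 - hit))     # positive increment
        self.l_miss = np.log(miss / (1.0 - miss))  # negative increment
        self.clip = clip

    def update(self, hit_cells, free_cells):
        # hit_cells / free_cells: integer arrays of (row, col) indices obtained
        # from a scan after it has been matched against the current map.
        self.logodds[tuple(hit_cells.T)] += self.l_hit
        self.logodds[tuple(free_cells.T)] += self.l_miss
        np.clip(self.logodds, -self.clip, self.clip, out=self.logodds)

    def probability(self):
        return 1.0 / (1.0 + np.exp(-self.logodds))
```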
Review of passive-blind detection in digital video forgery based on sensing and imaging techniques
NASA Astrophysics Data System (ADS)
Tao, Junjie; Jia, Lili; You, Ying
2016-01-01
Advances in digital video compression and IP communication technologies have raised new issues and challenges concerning the integrity and authenticity of surveillance videos. It is important that the system ensures that, once recorded, the video cannot be altered, so that the audit trail remains intact for evidential purposes. This paper gives an overview of passive techniques of Digital Video Forensics which are based on intrinsic fingerprints inherent in digital surveillance videos. We performed a thorough survey of the literature on video manipulation detection methods that accomplish blind authentication without referring to any auxiliary information. We present a review of various existing methods in the literature; much more work needs to be done in this field of video forensics based on video data analysis and observation of surveillance systems.
Weighted spline based integration for reconstruction of freeform wavefront.
Pant, Kamal K; Burada, Dali R; Bichra, Mohamed; Ghosh, Amitava; Khan, Gufran S; Sinzinger, Stefan; Shakher, Chandra
2018-02-10
In the present work, a spline-based integration technique for the reconstruction of a freeform wavefront from slope data has been implemented. The slope data of a freeform surface contain noise from the machining process, and that noise introduces reconstruction error. We have proposed a weighted cubic spline based least-squares integration method (WCSLI) for the faithful reconstruction of a wavefront from noisy slope data. In the proposed method, the measured slope data are fitted to a piecewise polynomial. The fitted coefficients are determined using a smoothing cubic spline fitting method. The smoothing parameter locally assigns relative weight to the fitted slope data. The fitted slope data are then integrated using the standard least-squares technique to reconstruct the freeform wavefront. Simulation studies show improved results using the proposed technique compared to the existing cubic spline-based integration (CSLI) and Southwell methods. The proposed reconstruction method has been experimentally applied to a subaperture stitching-based measurement of a freeform wavefront using a scanning Shack-Hartmann sensor. Boundary artifacts are minimal in WCSLI, which improves the subaperture stitching accuracy and demonstrates an improved Shack-Hartmann sensor for freeform metrology applications.
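A hedged one-dimensional illustration of the WCSLI idea is shown below: noisy slope samples are fitted with a weighted smoothing cubic spline and the fitted slopes are then integrated to recover the profile. The real method operates on two-dimensional slope maps with a least-squares integrator; the function name and the smoothing parameter here are assumptions.

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

def reconstruct_profile_from_slopes(x, slope, weights, smooth=None):
    """Hedged 1-D sketch: weighted smoothing cubic spline fit of noisy slope
    data, then integration of the fitted slopes to get the wavefront profile."""
    spl = UnivariateSpline(x, slope, w=weights, k=3, s=smooth)
    profile = spl.antiderivative()(x)
    return profile - profile[0]   # fix the integration constant at the first sample
```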
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cui, Yonggang
In implementation of nuclear safeguards, many different techniques are being used to monitor operation of nuclear facilities and safeguard nuclear materials, ranging from radiation detectors, flow monitors, video surveillance, satellite imagers, digital seals to open source search and reports of onsite inspections/verifications. Each technique measures one or more unique properties related to nuclear materials or operation processes. Because these data sets have no or loose correlations, it could be beneficial to analyze the data sets together to improve the effectiveness and efficiency of safeguards processes. Advanced visualization techniques and machine-learning based multi-modality analysis could be effective tools in such integrated analysis. In this project, we will conduct a survey of existing visualization and analysis techniques for multi-source data and assess their potential values in nuclear safeguards.
Nonlinear ultrasonic fatigue crack detection using a single piezoelectric transducer
NASA Astrophysics Data System (ADS)
An, Yun-Kyu; Lee, Dong Jun
2016-04-01
This paper proposes a new nonlinear ultrasonic technique for fatigue crack detection using a single piezoelectric transducer (PZT). The proposed technique identifies a fatigue crack using linear (α) and nonlinear (β) parameters obtained from only a single PZT mounted on a target structure. Based on the different physical characteristics of α and β, a fatigue crack-induced feature can be effectively isolated from the inherent nonlinearity of the target structure and data acquisition system. The proposed technique requires a much simpler test setup and lower processing costs than existing nonlinear ultrasonic techniques, yet is fast and powerful. To validate the proposed technique, a real fatigue crack is created in an aluminum plate, and then false positive and false negative tests are carried out under varying temperature conditions. The experimental results reveal that the fatigue crack is successfully detected, and no false positive alarm is indicated.
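The abstract does not define α and β explicitly; one common convention, assumed in the sketch below, is to take the fundamental amplitude A1 as the linear parameter and a relative nonlinear parameter proportional to A2/A1² from the second harmonic of the received signal.

```python
import numpy as np

def linear_nonlinear_params(signal, fs, f0):
    """Hedged sketch: estimate the fundamental amplitude A1 (linear parameter)
    and a relative nonlinear parameter ~ A2 / A1**2 from the received spectrum;
    f0 is the excitation frequency, fs the sampling rate."""
    spec = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    a1 = spec[np.argmin(np.abs(freqs - f0))]        # fundamental amplitude
    a2 = spec[np.argmin(np.abs(freqs - 2 * f0))]    # second-harmonic amplitude
    return a1, a2 / (a1 ** 2 + 1e-12)
```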
Multicast for savings in cache-based video distribution
NASA Astrophysics Data System (ADS)
Griwodz, Carsten; Zink, Michael; Liepert, Michael; On, Giwon; Steinmetz, Ralf
1999-12-01
Internet video-on-demand (VoD) today streams videos directly from server to clients, because re-distribution is not established yet. Intranet solutions exist but are typically managed centrally. Caching may overcome these management needs; however, existing web caching strategies are not applicable because they were designed for different conditions. We propose movie distribution by means of caching, and study the feasibility from the service providers' point of view. We introduce the combination of our reliable multicast protocol LCRTP for caching hierarchies with our enhancement to the patching technique for bandwidth-friendly True VoD, which does not depend on network resource guarantees.
Advanced Computational Techniques for Hypersonic Propulsion
NASA Technical Reports Server (NTRS)
Povinelli, Louis A.
1996-01-01
CFD has played a major role in the resurgence of hypersonic flight, on the premise that numerical methods will allow us to perform simulations at conditions for which no ground test capability exists. Validation of CFD methods is being established using the experimental data base available, which is below Mach 8. It is important, however, to realize the limitations involved in the extrapolation process as well as the deficiencies that exist in numerical methods at the present time. Current features of CFD codes are examined for application to propulsion system components. The shortcomings in simulation and modeling are identified and discussed.
An intelligent content discovery technique for health portal content management.
De Silva, Daswin; Burstein, Frada
2014-04-23
Continuous content management of health information portals is a feature vital for their sustainability and widespread acceptance. The knowledge and experience of a domain expert is essential for content management in the health domain. The rate of generation of online health resources is exponential, and thereby manual examination for relevance to a specific topic and audience is a formidable challenge for domain experts. Intelligent content discovery for effective content management is a less researched topic. An existing expert-endorsed content repository can provide the necessary leverage to automatically identify relevant resources and evaluate qualitative metrics. This paper reports on the design research towards an intelligent technique for automated content discovery and ranking for health information portals. The proposed technique aims to improve the efficiency of the current, mostly manual process of portal content management by utilising an existing expert-endorsed content repository as a supporting base and a benchmark to evaluate the suitability of new content. A model for content management was established based on a field study of potential users. The proposed technique is integral to this content management model and executes in several phases (i.e., query construction, content search, text analytics and fuzzy multi-criteria ranking). The construction of multi-dimensional search queries with input from Wordnet, the use of multi-word and single-word terms as representative semantics for text analytics and the use of fuzzy multi-criteria ranking for subjective evaluation of quality metrics are original contributions reported in this paper. The feasibility of the proposed technique was examined with experiments conducted on an actual health information portal, the BCKOnline portal. Both intermediary and final results generated by the technique are presented in the paper, and these help to establish the benefits of the technique and its contribution towards effective content management. The prevalence of large numbers of online health resources is a key obstacle for domain experts involved in content management of health information portals and websites. The proposed technique has proven successful at the search and identification of resources and the measurement of their relevance. It can be used to support the domain expert in content management and thereby ensure the health portal is up-to-date and current.
Kearney, Philip E; Carson, Howie J; Collins, Dave
2018-05-01
This paper explores the approaches adopted by high-level field athletics coaches when attempting to refine an athlete's already well-established technique (long and triple jump and javelin throwing). Six coaches, who had all coached multiple athletes to multiple major championships, took part in semi-structured interviews focused upon a recent example of technique refinement. Data were analysed using a thematic content analysis. The coaching tools reported were generally consistent with those advised by the existing literature, focusing on attaining "buy-in", utilising part-practice, restoring movement automaticity and securing performance under pressure. Five of the six coaches reported using a systematic sequence of stages to implement the refinement, although the number and content of these stages varied between them. Notably, however, there were no formal sources of knowledge (e.g., coach education or training) provided to inform coaches' decision making. Instead, coaches' decisions were largely based on experience both within and outside the sporting domain. Data offer a useful stimulus for reflection amongst sport practitioners confronted by the problem of technique refinement. Certainly the limited awareness of existing guidelines on technique refinement expressed by the coaches emphasises a need for further collaborative work by researchers and coach educators to disseminate best practice.
Behavior based safety. A different way of looking at an old problem.
Haney, L; Anderson, M
1999-09-01
1. The occupational and environmental health nurse role in behavioral safety initiatives can vary to include: serving as a leader, change agent, collaborator with safety professionals, consultant, team participant, educator, coach, and supporter to employees and management. 2. Behavior based safety and health initiatives add to existing knowledge and techniques for improving the health and safety of workers. 3. Behavior based safety relies on employee involvement and places a strong emphasis on observation, measurement, feedback, positive reinforcement, and evaluation. It focuses on identification of system improvements and prevention.
Optimum filter-based discrimination of neutrons and gamma rays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amiri, Moslem; Prenosil, Vaclav; Cvachovec, Frantisek
2015-07-01
An optimum filter-based method for discrimination of neutrons and gamma rays in a mixed radiation field is presented. The existing filter-based implementations of discriminators require sample pulse responses in advance of the experiment run to build the filter coefficients, which makes them less practical. Our novel technique creates the coefficients during the experiment and improves their quality gradually. Applied to several sets of mixed neutron and photon signals obtained through different digitizers using a stilbene scintillator, this approach is analyzed and its discrimination quality is measured. (authors)
An integrated approach for updating cadastral maps in Pakistan using satellite remote sensing data
NASA Astrophysics Data System (ADS)
Ali, Zahir; Tuladhar, Arbind; Zevenbergen, Jaap
2012-08-01
Updating cadastral information is crucial for recording land ownership and property division changes in a timely manner. In most cases, the existing cadastral maps do not provide up-to-date information on land parcel boundaries. Such a situation demands that all the cadastral data and parcel boundary information in these maps be updated in a timely fashion. The existing techniques for acquiring cadastral information are discipline-oriented, rooted in fields such as geodesy, surveying, and photogrammetry. All these techniques require substantial manpower, time, and cost when they are carried out separately. There is a need to integrate these techniques for acquiring cadastral information to update the existing cadastral data and (re)produce cadastral maps in an efficient manner. To reduce the time and cost involved in cadastral data acquisition, this study develops an integrated approach combining global positioning system (GPS) data, remote sensing (RS) imagery, and existing cadastral maps. For this purpose, the panchromatic image with 0.6 m spatial resolution and the corresponding multi-spectral image with 2.4 m spatial resolution and 3 spectral bands from the QuickBird satellite were used. A digital elevation model (DEM) was extracted from SPOT-5 stereopairs, and some ground control points (GCPs) were also used for ortho-rectifying the QuickBird images. After ortho-rectifying these images and registering the multi-spectral image to the panchromatic image, the two were fused to obtain good-quality multi-spectral images of the two study areas with 0.6 m spatial resolution. Cadastral parcel boundaries were then identified on the QuickBird images of the two study areas via visual interpretation using a participatory-GIS (PGIS) technique. The study regions are the urban and rural areas of Peshawar and Swabi districts in the Khyber Pakhtunkhwa province of Pakistan. The results are updated cadastral maps containing rich cadastral information, which can be used to update the existing cadastral data with less time and cost.
3D shape measurement of automotive glass by using a fringe reflection technique
NASA Astrophysics Data System (ADS)
Skydan, O. A.; Lalor, M. J.; Burton, D. R.
2007-01-01
In the automotive and glass making industries, there is a need to accurately measure the 3D shapes of reflective surfaces to speed up and ensure product development and manufacturing quality using non-contact techniques. This paper describes a technique for the measurement of non-full-field reflective surfaces of automotive glass using a fringe reflection technique. The physical properties of the measured surfaces do not allow us to apply the optical geometries used in existing techniques for surface measurement based upon direct fringe pattern illumination. However, this surface reflectivity can be used to implement similar ideas from existing techniques in a new, improved method. In other words, the reflective surface can be used as a mirror to reflect illuminated fringe patterns onto a screen behind. It has been found that, when implementing the reflective fringe technique, the phase-shift distribution depends not only on the height of the object but also on the slope at each measurement point. This requires solving differential equations to find the surface slope and height distributions in the x and y directions and developing additional height reconstruction algorithms. The main focus has been on developing a mathematical model of the optical sub-system and discussing ways for its practical implementation, including calibration procedures. A number of implemented image processing algorithms for system calibration and data analysis are discussed, and two experimental results are given for automotive glass surfaces with different shapes and defects. The proposed technique showed the ability to provide accurate non-destructive measurement of 3D shapes of reflective automotive glass surfaces and can be used as a key element for a glass shape quality control system on-line or in a laboratory environment.
Driving profile modeling and recognition based on soft computing approach.
Wahab, Abdul; Quek, Chai; Tan, Chin Keong; Takeda, Kazuya
2009-04-01
Advancements in biometrics-based authentication have led to its increasing prominence and are being incorporated into everyday tasks. Existing vehicle security systems rely only on alarms or smart card as forms of protection. A biometric driver recognition system utilizing driving behaviors is a highly novel and personalized approach and could be incorporated into existing vehicle security system to form a multimodal identification system and offer a greater degree of multilevel protection. In this paper, detailed studies have been conducted to model individual driving behavior in order to identify features that may be efficiently and effectively used to profile each driver. Feature extraction techniques based on Gaussian mixture models (GMMs) are proposed and implemented. Features extracted from the accelerator and brake pedal pressure were then used as inputs to a fuzzy neural network (FNN) system to ascertain the identity of the driver. Two fuzzy neural networks, namely, the evolving fuzzy neural network (EFuNN) and the adaptive network-based fuzzy inference system (ANFIS), are used to demonstrate the viability of the two proposed feature extraction techniques. The performances were compared against an artificial neural network (NN) implementation using the multilayer perceptron (MLP) network and a statistical method based on the GMM. Extensive testing was conducted and the results show great potential in the use of the FNN for real-time driver identification and verification. In addition, the profiling of driver behaviors has numerous other potential applications for use by law enforcement and companies dealing with buses and truck drivers.
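A minimal sketch of the GMM-based feature extraction step, assuming pedal-pressure samples are available as arrays, is given below; the number of mixture components and the use of the flattened mixture parameters as the driver profile are illustrative choices, not the paper's exact configuration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def gmm_driver_features(accel_pedal, brake_pedal, n_components=4):
    """Hedged sketch: fit a small Gaussian mixture to accelerator/brake pedal
    pressure samples and use the flattened mixture parameters as a driver
    'profile' feature vector (stand-in for the paper's GMM feature extraction)."""
    X = np.column_stack([accel_pedal, brake_pedal])
    gmm = GaussianMixture(n_components=n_components, covariance_type="diag",
                          random_state=0).fit(X)
    return np.concatenate([gmm.weights_.ravel(),
                           gmm.means_.ravel(),
                           gmm.covariances_.ravel()])
```

The resulting feature vector could then be fed to a classifier such as a fuzzy neural network, as the abstract describes.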
Realising the knowledge spiral in healthcare: the role of data mining and knowledge management.
Wickramasinghe, Nilmini; Bali, Rajeev K; Gibbons, M Chris; Schaffer, Jonathan
2008-01-01
Knowledge Management (KM) is an emerging business approach aimed at solving current problems, such as competitiveness and the need to innovate, that are faced by businesses today. The premise for the need for KM is based on a paradigm shift in the business environment where knowledge is central to organizational performance. Organizations trying to embrace KM have many tools, techniques and strategies at their disposal. A vital technique in KM is data mining, which enables critical knowledge to be gained from the analysis of large amounts of data and information. The healthcare industry is a very information-rich industry. The collection of data and information permeates most, if not all, areas of this industry; however, the healthcare industry has yet to fully embrace KM, let alone the newly evolving techniques of data mining. In this paper, we demonstrate the ubiquitous benefits of data mining and KM to healthcare by highlighting their potential to enable and facilitate superior clinical practice and administrative management. Specifically, we show how data mining can realize the knowledge spiral by effecting the four key transformations identified by Nonaka of turning: (1) existing explicit knowledge to new explicit knowledge, (2) existing explicit knowledge to new tacit knowledge, (3) existing tacit knowledge to new explicit knowledge and (4) existing tacit knowledge to new tacit knowledge. This is done through the establishment of theoretical models that respectively identify the function of the knowledge spiral and the powers of data mining, both exploratory and predictive, in the knowledge discovery process. Our models are then applied to a healthcare data set to demonstrate the potential of this approach as well as the implications of such an approach for the clinical and administrative aspects of healthcare. Further, we demonstrate how these techniques can help hospitals address the six healthcare quality dimensions identified by the Committee for Quality Healthcare.
Problem-based learning biotechnology courses in chemical engineering.
Glatz, Charles E; Gonzalez, Ramon; Huba, Mary E; Mallapragada, Surya K; Narasimhan, Balaji; Reilly, Peter J; Saunders, Kevin P; Shanks, Jacqueline V
2006-01-01
We have developed a series of upper undergraduate/graduate lecture and laboratory courses on biotechnological topics to supplement existing biochemical engineering, bioseparations, and biomedical engineering lecture courses. The laboratory courses are based on problem-based learning techniques, featuring two- and three-person teams, journaling, and performance rubrics for guidance and assessment. Participants initially have found them to be difficult, since they had little experience with problem-based learning. To increase enrollment, we are combining the laboratory courses into 2-credit groupings and allowing students to substitute one of them for the second of our 2-credit chemical engineering unit operations laboratory courses.
Implementing a Reliability Centered Maintenance Program at NASA's Kennedy Space Center
NASA Technical Reports Server (NTRS)
Tuttle, Raymond E.; Pete, Robert R.
1998-01-01
Maintenance practices have long focused on time based "preventive maintenance" techniques. Components were changed out and parts replaced based on how long they had been in place instead of what condition they were in. A reliability centered maintenance (RCM) program seeks to offer equal or greater reliability at decreased cost by ensuring only applicable, effective maintenance is performed and by, in large part, replacing time-based maintenance with condition-based maintenance. A significant portion of this program involved introducing non-intrusive technologies, such as vibration analysis, oil analysis and I/R cameras, to an existing labor force and management team.
Scala Roles: Reusable Object Collaborations in a Library
NASA Astrophysics Data System (ADS)
Pradel, Michael; Odersky, Martin
Purely class-based implementations of object-oriented software are often inappropriate for reuse. In contrast, the notion of objects playing roles in a collaboration has been proven to be a valuable reuse abstraction. However, existing solutions to enable role-based programming tend to require vast extensions of the underlying programming language, and thus are difficult to use in everyday work. We present a programming technique, based on dynamic proxies, that allows an object's type to be augmented at runtime while preserving strong static type safety. It enables role-based implementations that lead to more reuse and better separation of concerns.
Prior knowledge based mining functional modules from Yeast PPI networks with gene ontology
2010-01-01
Background In the literature, there are fruitful algorithmic approaches for identifying functional modules in protein-protein interaction (PPI) networks. Because of the accumulation of large-scale interaction data from multiple organisms and of interactions not yet recorded in existing PPI databases, there is still a pressing need to design novel computational techniques that can correctly and scalably analyze interaction data sets. Indeed, there are a number of large-scale biological data sets providing indirect evidence for protein-protein interaction relationships. Results The main aim of this paper is to present a prior-knowledge-based mining strategy to identify functional modules from PPI networks with the aid of Gene Ontology. A higher similarity value in Gene Ontology means that two gene products are more functionally related to each other, so it is better to group such gene products into one functional module. We study (i) how to encode the functional pairs into the existing PPI networks; and (ii) how to use these functional pairs as pairwise constraints to supervise the existing functional module identification algorithms. A topology-based modularity metric and complex annotations in MIPS are used to evaluate the functional modules identified by these two approaches. Conclusions The experimental results on Yeast PPI networks and GO have shown that the prior-knowledge-based learning methods perform better than the existing algorithms. PMID:21172053
Evaluation of a rule-based compositing technique for Landsat-5 TM and Landsat-7 ETM+ images
NASA Astrophysics Data System (ADS)
Lück, W.; van Niekerk, A.
2016-05-01
Image compositing is a multi-objective optimization process. Its goal is to produce a seamless, cloud- and artefact-free artificial image. This is achieved by aggregating image observations and by replacing poor and cloudy data with good observations from imagery acquired within the timeframe of interest. The compositing process aims to minimise the visual artefacts which could result from differing radiometric properties caused by atmospheric conditions, phenologic patterns and land cover changes. It has the following requirements: (1) the image composite must be cloud free, which requires the detection of clouds and shadows, and (2) the image composite must be seamless, minimising artefacts visible across inter-image seams. This study proposes a new rule-based compositing technique (RBC) that combines the strengths of several existing methods. A quantitative and qualitative evaluation is made of the RBC technique by comparing it to the maximum NDVI (MaxNDVI), minimum red (MinRed) and maximum ratio (MaxRatio) compositing techniques. A total of 174 Landsat TM and ETM+ images, covering three study sites and three different timeframes for each site, are used in the evaluation. A new set of quantitative and qualitative evaluation techniques for compositing quality measurement was developed and showed that the RBC technique outperformed all other techniques, with the MaxRatio, MaxNDVI and MinRed techniques in order of performance from best to worst.
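For context, the MaxNDVI baseline against which RBC is compared can be sketched in a few lines: for each pixel, keep the date whose cloud-free observation has the highest NDVI. The array layout and mask convention below are assumptions made for illustration only.

```python
import numpy as np

def max_ndvi_composite(red_stack, nir_stack, cloud_mask_stack):
    """Hedged sketch of a MaxNDVI composite.
    Inputs are (dates, rows, cols) arrays; cloud_mask_stack is True where the
    observation is cloud/shadow and must not be selected."""
    ndvi = (nir_stack - red_stack) / (nir_stack + red_stack + 1e-6)
    ndvi = np.where(cloud_mask_stack, -np.inf, ndvi)   # masked obs never win
    best = np.argmax(ndvi, axis=0)                     # winning date per pixel
    rows, cols = np.indices(best.shape)
    return red_stack[best, rows, cols], nir_stack[best, rows, cols]
```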
Promoter Sequences Prediction Using Relational Association Rule Mining
Czibula, Gabriela; Bocicor, Maria-Iuliana; Czibula, Istvan Gergely
2012-01-01
In this paper we are approaching, from a computational perspective, the problem of promoter sequence prediction, an important problem within the field of bioinformatics. As the conditions for a DNA sequence to function as a promoter are not known, machine learning based classification models are still being developed to approach the problem of promoter identification in DNA. We are proposing a classification model based on relational association rule mining. Relational association rules are a particular type of association rules and describe numerical orderings between attributes that commonly occur over a data set. Our classifier is based on the discovery of relational association rules for predicting whether or not a DNA sequence contains a promoter region. An experimental evaluation of the proposed model and a comparison with similar existing approaches is provided. The obtained results show that our classifier outperforms the existing techniques for identifying promoter sequences, confirming the potential of our proposal. PMID:22563233
Sim, K S; Kiani, M A; Nia, M E; Tso, C P
2014-01-01
A new technique based on cubic spline interpolation with Savitzky-Golay noise reduction filtering is designed to estimate the signal-to-noise ratio of scanning electron microscopy (SEM) images. This approach is found to give better results when compared with two existing techniques: nearest neighbourhood and first-order interpolation. When applied to evaluate the quality of SEM images, noise can be eliminated efficiently with an optimal choice of scan rate from real-time SEM images, without generating corruption or increasing scanning time. © 2013 The Authors Journal of Microscopy © 2013 Royal Microscopical Society.
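A hedged sketch of one way such an SNR estimate can be formed on a single scan line is given below: the zero-lag autocorrelation carries signal-plus-noise power, and a noise-free zero-lag value is extrapolated from neighbouring lags with a cubic spline after Savitzky-Golay smoothing. Window sizes and lag ranges are illustrative, and this is not claimed to be the authors' exact procedure.

```python
import numpy as np
from scipy.signal import savgol_filter
from scipy.interpolate import CubicSpline

def estimate_snr(image_row):
    """Hedged sketch: autocorrelation-based SNR estimate for one SEM scan line."""
    x = image_row.astype(float) - image_row.mean()
    r = np.correlate(x, x, mode="full")[len(x) - 1:] / len(x)   # one-sided ACF
    lags = np.arange(len(r))
    r_smooth = savgol_filter(r, window_length=9, polyorder=3)
    # Fit lags 1..8 (excluding the noise-inflated lag 0) and extrapolate to 0.
    spline = CubicSpline(lags[1:9], r_smooth[1:9])
    r0_noise_free = float(spline(0))
    noise_power = max(r[0] - r0_noise_free, 1e-12)
    return r0_noise_free / noise_power
```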
Rank-based decompositions of morphological templates.
Sussner, P; Ritter, G X
2000-01-01
Methods for matrix decomposition have found numerous applications in image processing, in particular for the problem of template decomposition. Since existing matrix decomposition techniques are mainly concerned with the linear domain, we consider it timely to investigate matrix decomposition techniques in the nonlinear domain with applications in image processing. The mathematical basis for these investigations is the new theory of rank within minimax algebra. Thus far, only minimax decompositions of rank 1 and rank 2 matrices into outer product expansions are known to the image processing community. We derive a heuristic algorithm for the decomposition of matrices having arbitrary rank.
History Matters: Incremental Ontology Reasoning Using Modules
NASA Astrophysics Data System (ADS)
Cuenca Grau, Bernardo; Halaschek-Wiener, Christian; Kazakov, Yevgeny
The development of ontologies involves continuous but relatively small modifications. Existing ontology reasoners, however, do not take advantage of the similarities between different versions of an ontology. In this paper, we propose a technique for incremental reasoning—that is, reasoning that reuses information obtained from previous versions of an ontology—based on the notion of a module. Our technique does not depend on a particular reasoning calculus and thus can be used in combination with any reasoner. We have applied our results to incremental classification of OWL DL ontologies and found significant improvement over regular classification time on a set of real-world ontologies.
NASA Astrophysics Data System (ADS)
Li, Dong-xia; Ye, Qian-wen
An out-of-band radiation suppression algorithm must be used for the broadband aeronautical communication system so as not to interfere with the operation of existing systems in the aviation L-band. After a brief introduction of the broadband aeronautical multi-carrier communication (B-AMC) system model, several sidelobe suppression techniques for orthogonal frequency division multiplexing (OFDM) systems are presented and analyzed in this paper in order to find a suitable algorithm for the B-AMC system. Simulation results show that raised-cosine windowing can effectively suppress the out-of-band radiation of the B-AMC system.
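As an illustration of the windowing approach found effective in the simulations, the sketch below builds a raised-cosine (Tukey-style) time window for one OFDM symbol; the cyclic-prefix and roll-off lengths are assumed parameters, and overlapping the roll-off regions of adjacent symbols is what smooths symbol transitions and lowers the out-of-band sidelobes.

```python
import numpy as np

def raised_cosine_window(n_fft, n_cp, n_roll):
    """Hedged sketch: raised-cosine window of length n_fft + n_cp + n_roll
    with n_roll roll-off samples at each edge (n_roll > 0 assumed)."""
    n = n_fft + n_cp + n_roll
    w = np.ones(n)
    t = np.arange(n_roll)
    ramp = 0.5 * (1.0 - np.cos(np.pi * (t + 0.5) / n_roll))
    w[:n_roll] = ramp          # rising edge
    w[-n_roll:] = ramp[::-1]   # falling edge, overlapped with the next symbol
    return w
```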
Failure warning of hydrous sandstone based on electroencephalogram technique
NASA Astrophysics Data System (ADS)
Tao, Kai; Zheng, Wei
2018-06-01
Sandstone is a type of rock mass that widely exists in nature. Moisture is an important factor that leads to sandstone structural failure. The major failure assessment methods for hydrous sandstone at present cannot satisfy real-time and portability requirements, and especially lack a warning function. In this study, acoustic emission (AE) and computed tomography (CT) techniques are combined for real-time failure assessment of hydrous sandstone. Eight visual warning colors are screened according to different failure states, and an electroencephalogram (EEG) experiment is conducted to demonstrate their diverse excitations of the human brain's concentration.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ravelo Arias, S. I.; Ramírez Muñoz, D.; Cardoso, S.
2015-06-15
This work presents a measurement technique to obtain the correct value of the four elements in a resistive Wheatstone bridge without the need to separate the physical connections existing between them. Two electronic solutions are presented, one based on a source-and-measure unit and one using discrete electronic components. The proposed technique makes it possible to know the mismatch or the tolerance between the bridge resistive elements and then to accept or reject the bridge in terms of its related common-mode rejection. Experimental results were taken on various Wheatstone resistive bridges (discrete and magnetoresistive integrated bridges), validating the proposed measurement technique, especially when the bridge is micro-fabricated and there is no physical way to separate one resistive element from the others.
DNA-based techniques for authentication of processed food and food supplements.
Lo, Yat-Tung; Shaw, Pang-Chui
2018-02-01
Authentication of food or food supplements with medicinal values is important to avoid adverse toxic effects, provide consumer rights, as well as for certification purpose. Compared to morphological and spectrometric techniques, molecular authentication is found to be accurate, sensitive and reliable. However, DNA degradation and inclusion of inhibitors may lead to failure in PCR amplification. This paper reviews on the existing DNA extraction and PCR protocols, and the use of small size DNA markers with sufficient discriminative power for molecular authentication. Various emerging new molecular techniques such as isothermal amplification for on-site diagnosis, next-generation sequencing for high-throughput species identification, high resolution melting analysis for quick species differentiation, DNA array techniques for rapid detection and quantitative determination in food products are also discussed. Copyright © 2017 Elsevier Ltd. All rights reserved.
Nguyen, Thanh; Bui, Vy; Lam, Van; Raub, Christopher B; Chang, Lin-Ching; Nehmetallah, George
2017-06-26
We propose a fully automatic technique to obtain aberration-free quantitative phase imaging in digital holographic microscopy (DHM) based on deep learning. Traditional DHM solves the phase aberration compensation problem by manually detecting the background for quantitative measurement. This is a drawback for real-time implementation and for dynamic processes such as cell migration phenomena. A recent automatic aberration compensation approach using principal component analysis (PCA) in DHM avoids human intervention regardless of the cells' motion. However, it corrects spherical/elliptical aberration only and disregards higher-order aberrations. Traditional image segmentation techniques can be employed to spatially detect cell locations. Ideally, automatic image segmentation techniques make real-time measurement possible. However, existing automatic unsupervised segmentation techniques have poor performance when applied to DHM phase images because of aberrations and speckle noise. In this paper, we propose a novel method that combines a supervised deep learning technique using a convolutional neural network (CNN) with Zernike polynomial fitting (ZPF). The deep learning CNN is implemented to perform automatic background region detection, which allows ZPF to compute the self-conjugated phase to compensate for most aberrations.
Geuna, S
2000-11-20
Quantitative morphology of the nervous system has undergone great developments over recent years, and several new technical procedures have been devised and applied successfully to neuromorphological research. However, a lively debate has arisen on some issues, and a great deal of confusion appears to exist that is definitely responsible for the slow spread of the new techniques among scientists. One such element of confusion is related to uncertainty about the meaning, implications, and advantages of the design-based sampling strategy that characterize the new techniques. In this article, to help remove this uncertainty, morphoquantitative methods are described and contrasted on the basis of the inferential paradigm of the sampling strategy: design-based vs model-based. Moreover, some recommendations are made to help scientists judge the appropriateness of a method used for a given study in relation to its specific goals. Finally, the use of the term stereology to label, more or less expressly, only some methods is critically discussed. Copyright 2000 Wiley-Liss, Inc.
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as (1) measurement and computational errors, (2) uncertainties in the conceptual model and model-parameter estimates, and (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency in detecting contaminants and providing early warning. We apply existing and newly proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly developed optimization technique based on coupling the Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
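A minimal sketch of coupling a global Particle Swarm search with a gradient-based least-squares refinement, in the spirit of the hybrid optimization described above, is shown below; it is not the MADS implementation, and SciPy's bounded trust-region solver stands in for a pure Levenberg-Marquardt step.

```python
import numpy as np
from scipy.optimize import least_squares

def pso_then_lm(residuals, bounds, n_particles=20, n_iter=50, seed=0):
    """Hedged sketch: global PSO search followed by local least-squares
    refinement of the swarm's best parameter estimate (illustrative only)."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    x = rng.uniform(lo, hi, size=(n_particles, lo.size))
    v = np.zeros_like(x)
    cost = lambda p: 0.5 * np.sum(np.asarray(residuals(p)) ** 2)
    pbest, pbest_cost = x.copy(), np.array([cost(p) for p in x])
    gbest = pbest[np.argmin(pbest_cost)].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random(x.shape), rng.random(x.shape)
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        c = np.array([cost(p) for p in x])
        better = c < pbest_cost
        pbest[better], pbest_cost[better] = x[better], c[better]
        gbest = pbest[np.argmin(pbest_cost)].copy()
    # Gradient-based refinement starting from the swarm's best estimate.
    return least_squares(residuals, gbest, bounds=(lo, hi), method="trf")
```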
NASA Astrophysics Data System (ADS)
Kim, Saejoon
2018-01-01
We consider the problem of low-volatility portfolio selection, which has been the subject of extensive research in the field of portfolio selection. To improve the currently existing techniques that rely purely on past information to select low-volatility portfolios, this paper investigates the use of time series regression techniques that make forecasts of future volatility to select the portfolios. In particular, for the first time, the utility of support vector regression and its enhancements as portfolio selection techniques is demonstrated. It is shown that our regression-based portfolio selection provides attractive outperformance compared to the benchmark index and the portfolio defined by a well-known strategy on the data sets of the S&P 500 and the KOSPI 200.
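A hedged sketch of the regression step, using a support vector regressor to forecast next-period volatility from lagged realised volatilities, is given below; the lag count and SVR hyperparameters are illustrative assumptions, and the least-volatile assets by forecast would then be kept for the portfolio.

```python
import numpy as np
from sklearn.svm import SVR

def forecast_volatility(past_vol, lags=5):
    """Hedged sketch: predict next-period volatility from the previous `lags`
    realised volatilities of one asset using support vector regression."""
    past_vol = np.asarray(past_vol, dtype=float)
    X = np.array([past_vol[i:i + lags] for i in range(len(past_vol) - lags)])
    y = past_vol[lags:]
    model = SVR(kernel="rbf", C=10.0, epsilon=0.001).fit(X, y)
    return float(model.predict(past_vol[-lags:].reshape(1, -1))[0])
```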
Numerical Simulation of Delamination Growth in Composite Materials
NASA Technical Reports Server (NTRS)
Camanho, P. P.; Davila, C. G.; Ambur, D. R.
2001-01-01
The use of decohesion elements for the simulation of delamination in composite materials is reviewed. The test methods available to measure the interfacial fracture toughness used in the formulation of decohesion elements are described initially. After a brief presentation of the virtual crack closure technique, the technique most widely used to simulate delamination growth, the formulation of interfacial decohesion elements is described. Problems related with decohesion element constitutive equations, mixed-mode crack growth, element numerical integration and solution procedures are discussed. Based on these investigations, it is concluded that the use of interfacial decohesion elements is a promising technique that avoids the need for a pre-existing crack and pre-defined crack paths, and that these elements can be used to simulate both delamination onset and growth.
Joint temporal density measurements for two-photon state characterization.
Kuzucu, Onur; Wong, Franco N C; Kurimura, Sunao; Tovstonog, Sergey
2008-10-10
We demonstrate a technique for characterizing two-photon quantum states based on joint temporal correlation measurements using time-resolved single-photon detection by femtosecond up-conversion. We measure for the first time the joint temporal density of a two-photon entangled state, showing clearly the time anticorrelation of the coincident-frequency entangled photon pair generated by ultrafast spontaneous parametric down-conversion under extended phase-matching conditions. The new technique enables us to manipulate the frequency entanglement by varying the down-conversion pump bandwidth to produce a nearly unentangled two-photon state that is expected to yield a heralded single-photon state with a purity of 0.88. The time-domain correlation technique complements existing frequency-domain measurement methods for a more complete characterization of photonic entanglement.
Neufeld, E; Chavannes, N; Samaras, T; Kuster, N
2007-08-07
The modeling of thermal effects, often based on the Pennes Bioheat Equation, is becoming increasingly popular. The FDTD technique commonly used in this context suffers considerably from staircasing errors at boundaries. A new conformal technique is proposed that can easily be integrated into existing implementations without requiring a special update scheme. It scales fluxes at interfaces with factors derived from the local surface normal. The new scheme is validated using an analytical solution, and an error analysis is performed to understand its behavior. The new scheme behaves considerably better than the standard scheme. Furthermore, in contrast to the standard scheme, it is possible to obtain with it more accurate solutions by increasing the grid resolution.
Behavior-based multi-robot collaboration for autonomous construction tasks
NASA Technical Reports Server (NTRS)
Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew
2005-01-01
The Robot Construction Crew (RCC) is a heterogeneous multi-robot system for autonomous construction of a structure through assembly of long components. The two-robot team demonstrates component placement into an existing structure in a realistic environment. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. A behavior-based architecture provides adaptability. The RCC approach minimizes computation, power, communication, and sensing for applicability to space-related construction efforts, but the techniques are applicable to terrestrial construction tasks.
Behavior-Based Multi-Robot Collaboration for Autonomous Construction Tasks
NASA Technical Reports Server (NTRS)
Stroupe, Ashley; Huntsberger, Terry; Okon, Avi; Aghazarian, Hrand; Robinson, Matthew
2005-01-01
We present a heterogeneous multi-robot system for autonomous construction of a structure through assembly of long components. Placement of a component within an existing structure in a realistic environment is demonstrated on a two-robot team. The task requires component acquisition, cooperative transport, and cooperative precision manipulation. For adaptability, the system is designed as a behavior-based architecture. For applicability to space-related construction efforts, computation, power, communication, and sensing are minimized, though the techniques developed are also applicable to terrestrial construction tasks.
Compressing random microstructures via stochastic Wang tilings.
Novák, Jan; Kučerová, Anna; Zeman, Jan
2012-10-01
This Rapid Communication presents a stochastic Wang tiling-based technique to compress or reconstruct disordered microstructures on the basis of given spatial statistics. Unlike the existing approaches based on a single unit cell, it utilizes a finite set of tiles assembled by a stochastic tiling algorithm, thereby allowing long-range orientation orders to be reproduced accurately in a computationally efficient manner. Although the basic features of the method are demonstrated for a two-dimensional particulate suspension, the present framework is fully extensible to generic multidimensional media.
Summary of vulnerability related technologies based on machine learning
NASA Astrophysics Data System (ADS)
Zhao, Lei; Chen, Zhihao; Jia, Qiong
2018-04-01
As the scale of information systems increases by orders of magnitude, the complexity of system software grows accordingly. The interaction of vulnerabilities across the design, development, deployment and implementation stages greatly increases the risk of the entire information system being attacked successfully. Considering the limitations and lags of the existing mainstream security vulnerability detection techniques, this paper summarizes the development and current status of related technologies based on machine learning methods, which are applied to deal with massive and irregular data and to handle security vulnerabilities.
Prediction of quantitative intrathoracic fluid volume to diagnose pulmonary oedema using LabVIEW.
Urooj, Shabana; Khan, M; Ansari, A Q; Lay-Ekuakille, Aimé; Salhan, Ashok K
2012-01-01
Pulmonary oedema is a life-threatening disease that requires special attention in the area of research and clinical diagnosis. Computer-based techniques are rarely used to quantify the intrathoracic fluid volume (IFV) for diagnostic purposes. This paper discusses a software program developed to detect and diagnose pulmonary oedema using LabVIEW. The software runs on anthropometric dimensions and physiological parameters, mainly transthoracic electrical impedance (TEI). This technique is accurate and faster than existing manual techniques. The LabVIEW software was used to compute the parameters required to quantify IFV. An equation relating per cent control and IFV was obtained. The results of predicted TEI and measured TEI were compared with previously reported data to validate the developed program. It was found that the predicted values of TEI obtained from the computer-based technique were much closer to the measured values of TEI. Six new subjects were enrolled to measure and predict transthoracic impedance and hence to quantify IFV. A similar difference was also observed in the measured and predicted values of TEI for the new subjects.
Higgs, Gary
2006-04-01
Despite recent U.K. Government commitments to encourage public participation in environmental decision making, those exercises conducted to date have been largely confined to 'traditional' modes of participation such as the dissemination of information and encouraging feedback on proposals through, for example, questionnaires or surveys. It is the premise of this paper that participative approaches using IT-based methods, based on combined geographical information systems (GIS) and multi-criteria evaluation techniques that could involve the public in the decision-making process, have the potential to build consensus and reduce disputes and conflicts such as those arising from the siting of different types of waste facilities. The potential of these techniques is documented through a review of the existing literature in order to highlight the opportunities and challenges facing decision makers in increasing the involvement of the public at different stages of the waste facility management process. It is concluded that there are important lessons to be learned by researchers, consultants, managers and decision makers if barriers hindering the wider use of such techniques are to be overcome.
ERIC Educational Resources Information Center
Manning, S.; Dix, A.
2008-01-01
There is anecdotal evidence that a significant number of students studying computing related courses at degree level have difficulty with sub-GCE mathematics. Testing of students' skills is often performed using diagnostic tests and a number of computer-based diagnostic tests exist, which work, essentially, by testing one specific diagnostic skill…
Mee-Sook Kim; Ned B. Klopfenstein; Geral I. McDonald; Kathiravetpillai Arumuganathan
2001-01-01
For assessments of intraspecific mating using flow cytometry and fluorescence microscopy, two compatible basidiospore-derived isolates were selected from each of four parental basidiomata of North American Biological Species (NABS) X. The nuclear status in NABS X varied with basidiospore-derived isolates. Nuclei within basidiospore-derived isolates existed as haploids...
[Present status and sustainable development of Dendrobium officinale industry].
Wu, Yunqin; Si, Jinping
2010-08-01
To understand the present status and characteristics of the Dendrobium officinale industry and to provide a rationale for its sustainable development, the main existing problems are analyzed and suggestions for sustainable development are proposed, based on the literature and an on-site investigation of the main D. officinale-producing enterprises and markets. More than 10 provinces and regions are involved in production, centered on Zhejiang and Yunnan provinces, which differ from each other in development pattern: Yunnan adopts a "company plus farmer households" mode, whereas Zhejiang mainly employs a mode in which a leading company establishes a production base combining production, processing, and marketing. The Zhejiang mode is characterized by high technology, high investment, high risk, and high return. The existence of non-genuine species, stagnation in the development and application of varieties and techniques for quality control, and a narrow marketing channel are the key problems limiting sustainable development of the industry. The key to sustainable development of the industry is to establish a technological alliance to speed up the development of common techniques and the application of integrated innovations, to strengthen self-discipline and the monitoring of production, and to expand the sales market.
NMF-Based Image Quality Assessment Using Extreme Learning Machine.
Wang, Shuigen; Deng, Chenwei; Lin, Weisi; Huang, Guang-Bin; Zhao, Baojun
2017-01-01
Numerous state-of-the-art perceptual image quality assessment (IQA) algorithms share a common two-stage process: distortion description followed by distortion effects pooling. In the first stage, the distortion descriptors or measurements are expected to be effective representatives of human visual variations, while the second stage should express well the relationship between the quality descriptors and perceived visual quality. However, most of the existing quality descriptors (e.g., luminance, contrast, and gradient) do not seem to be consistent with human perception, and the effects pooling is often done in ad-hoc ways. In this paper, we propose a novel full-reference IQA metric. It applies non-negative matrix factorization (NMF) to measure image degradations by making use of the parts-based representation of NMF. In addition, a new machine learning technique [extreme learning machine (ELM)] is employed to address the limitations of the existing pooling techniques. Compared with neural networks and support vector regression, ELM can achieve higher learning accuracy with faster learning speed. Extensive experimental results demonstrate that the proposed metric has better performance and lower computational complexity in comparison with the relevant state-of-the-art approaches.
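A minimal sketch of the two ingredients named above, with all feature and parameter choices hypothetical: NMF encodings of paired reference and distorted patches are compared to form a degradation descriptor, and an extreme learning machine (random hidden layer plus least-squares output weights) maps descriptors to quality scores. This is not the authors' exact metric.

    import numpy as np
    from sklearn.decomposition import NMF

    def nmf_descriptor(ref_patches, dist_patches, n_components=8, seed=0):
        """Distortion descriptor: change in NMF encodings between paired reference and
        distorted patches (rows = flattened, non-negative patches at the same locations)."""
        model = NMF(n_components=n_components, init="random", random_state=seed, max_iter=500)
        h_ref = model.fit_transform(ref_patches)    # encode reference patches
        h_dist = model.transform(dist_patches)      # encode distorted patches with same basis
        return np.abs(h_ref - h_dist).mean(axis=0)  # per-component degradation

    class ELMRegressor:
        """Extreme learning machine: random hidden layer, ridge-solved output weights."""
        def __init__(self, n_hidden=100, seed=0, reg=1e-3):
            self.rng, self.n_hidden, self.reg = np.random.default_rng(seed), n_hidden, reg
        def fit(self, X, y):
            self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
            self.b = self.rng.normal(size=self.n_hidden)
            H = np.tanh(X @ self.W + self.b)
            self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(self.n_hidden), H.T @ y)
            return self
        def predict(self, X):
            return np.tanh(X @ self.W + self.b) @ self.beta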
NASA Technical Reports Server (NTRS)
Erickson, Gary E.
2007-01-01
An overview is given of selected measurement techniques used in the NASA Langley Research Center (NASA LaRC) Unitary Plan Wind Tunnel (UPWT) to determine the aerodynamic characteristics of aerospace vehicles operating at supersonic speeds. A broad definition of a measurement technique is adopted in this paper and is any qualitative or quantitative experimental approach that provides information leading to the improved understanding of the supersonic aerodynamic characteristics. On-surface and off-surface measurement techniques used to obtain discrete (point) and global (field) measurements and planar and global flow visualizations are described, and examples of all methods are included. The discussion is limited to recent experiences in the UPWT and is, therefore, not an exhaustive review of existing experimental techniques. The diversity and high quality of the measurement techniques and the resultant data illustrate the capabilities of a ground-based experimental facility and the key role that it plays in the advancement of our understanding, prediction, and control of supersonic aerodynamics.
NASA Astrophysics Data System (ADS)
Mashayekhi, Mohammad Jalali; Behdinan, Kamran
2017-10-01
The increasing demand to minimize undesired vibration and noise levels in several high-tech industries has generated a renewed interest in vibration transfer path analysis. Analyzing vibration transfer paths within a system is of crucial importance in designing an effective vibration isolation strategy. Most of the existing vibration transfer path analysis techniques are empirical and are suitable for diagnosis and troubleshooting purposes. The lack of an analytical transfer path analysis that can be used in the design stage is the main motivation behind this research. In this paper, an analytical transfer path analysis based on the four-pole theory is proposed for multi-energy-domain systems. The bond graph modeling technique, an effective approach to modeling multi-energy-domain systems, is used to develop the system model. An electro-mechanical system is used as a benchmark example to elucidate the effectiveness of the proposed technique, and an algorithm to obtain the equivalent four-pole representation of a dynamical system from the corresponding bond graph model is also presented.
Tone-Based Command of Deep Space Probes using Ground Antennas
NASA Technical Reports Server (NTRS)
Bokulic, Robert S.; Jensen, J. Robert
2008-01-01
A document discusses a technique for enabling the reception of spacecraft commands at received signal levels as much as three orders of magnitude below those of current deep space systems. Tone-based commanding deals with the reception of commands that are sent in the form of precise frequency offsets using an open-loop receiver. The key elements of this technique are an ultrastable oscillator and open-loop receiver onboard the spacecraft, both of which are part of the existing New Horizons (Pluto flyby) communications system design. This enables possible flight experimentation for tone-based commanding during the long cruise of the spacecraft to Pluto. In this technique, it is also necessary to accurately remove Doppler shift from the uplink signal presented to the spacecraft. A signal processor in the spacecraft performs a discrete Fourier transform on the received signal to determine the frequency of the received signal. Due to the long-term drift in the oscillators and orbit prediction model, the system is likely to be implemented differentially, where changes in the uplink frequency convey the command information.
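A minimal sketch of the onboard processing step described above, assuming hypothetical sample rates and tone spacing: a DFT of each symbol's baseband samples locates the peak bin, and commands are decoded from the change in tone frequency between consecutive symbols (the differential implementation mentioned in the abstract).

    import numpy as np

    FS = 1000.0        # baseband sample rate [Hz] (hypothetical)
    TONE_STEP = 10.0   # frequency offset per symbol value [Hz] (hypothetical)

    def tone_frequency(samples):
        """Estimate the dominant tone frequency of one symbol via an FFT peak."""
        spectrum = np.abs(np.fft.rfft(samples * np.hanning(len(samples))))
        return np.fft.rfftfreq(len(samples), 1.0 / FS)[np.argmax(spectrum)]

    def decode_differential(symbols):
        """Decode commands as changes in uplink tone frequency between symbols."""
        freqs = [tone_frequency(s) for s in symbols]
        return [int(round((f1 - f0) / TONE_STEP)) for f0, f1 in zip(freqs, freqs[1:])]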
NASA Astrophysics Data System (ADS)
Gill, Douglas M.; Rasras, Mahmoud; Tu, Kun-Yii; Chen, Young-Kai; White, Alice E.; Patel, Sanjay S.; Carothers, Daniel; Pomerene, Andrew; Kamocsai, Robert; Beattie, James; Kopa, Anthony; Apsel, Alyssa; Beals, Mark; Mitchel, Jurgen; Liu, Jifeng; Kimerling, Lionel C.
2008-02-01
Integrating electronic and photonic functions onto a single silicon-based chip using techniques compatible with mass-production CMOS electronics will enable new design paradigms for existing system architectures and open new opportunities for electro-optic applications with the potential to dramatically change the management, cost, footprint, weight, and power consumption of today's communication systems. While broadband analog system applications represent a smaller volume market than that for digital data transmission, there are significant deployments of analog electro-optic systems for commercial and military applications. Broadband linear modulation is a critical building block in optical analog signal processing and also could have significant applications in digital communication systems. Recently, broadband electro-optic modulators on a silicon platform have been demonstrated based on the plasma dispersion effect. The use of the plasma dispersion effect within a CMOS compatible waveguide creates new challenges and opportunities for analog signal processing since the index and propagation loss change within the waveguide during modulation. We will review the current status of silicon-based electrooptic modulators and also linearization techniques for optical modulation.
Microstructural Effects on Initiation Behavior in HMX
NASA Astrophysics Data System (ADS)
Molek, Christopher; Welle, Eric; Hardin, Barrett; Vitarelli, Jim; Wixom, Ryan; Samuels, Philip
Understanding the role microstructure plays on ignition and growth behavior has been the subject of a significant body of research within the detonation physics community. The pursuit of this understanding is important because safety and performance characteristics have been shown to strongly correlate to particle morphology. Historical studies have often correlated bulk powder characteristics to the performance or safety characteristics of pressed materials. We believe that a clearer and more relevant correlation is made between the pressed microstructure and the observed detonation behavior. This type of assessment is possible, as techniques now exist for the quantification of the pressed microstructures. Our talk will report on experimental efforts that correlate directly measured microstructural characteristics to initiation threshold behavior of HMX based materials. The internal microstructures were revealed using an argon ion cross-sectioning technique. This technique enabled the quantification of density and interface area of the pores within the pressed bed using methods of stereology. These bed characteristics are compared to the initiation threshold behavior of three HMX based materials using an electric gun based test method. Finally, a comparison of experimental threshold data to supporting theoretical efforts will be made.
Vision-based obstacle recognition system for automated lawn mower robot development
NASA Astrophysics Data System (ADS)
Mohd Zin, Zalhan; Ibrahim, Ratnawati
2011-06-01
Digital image processing (DIP) techniques have recently been widely used in various types of applications. Classification and recognition of a specific object using a vision system involve challenging tasks in the fields of image processing and artificial intelligence. The ability and efficiency of a vision system to capture and process images are very important for any intelligent system such as an autonomous robot. This paper gives attention to the development of a vision system that could contribute to the development of an automated vision-based lawn mower robot. The work involves the implementation of DIP techniques to detect and recognize three different types of obstacles that usually exist on a football field. The focus was on the study of different types and sizes of obstacles, the development of a vision-based obstacle recognition system, and the evaluation of the system's performance. Image processing techniques such as image filtering, segmentation, enhancement and edge detection have been applied in the system. The results have shown that the developed system is able to detect and recognize various types of obstacles on a football field with a recognition rate of more than 80%.
Classification of air quality using fuzzy synthetic multiplication.
Abdullah, Lazim; Khalid, Noor Dalina
2012-11-01
Proper identification of the environment's air quality based on limited observations is an essential task to meet the goals of environmental management. Various classification methods have been used to estimate changes in air quality status and health. However, discrepancies frequently arise from the lack of a clear distinction between air quality classes, the uncertainty in the quality criteria employed, and the vagueness or fuzziness embedded in the decision-making output values. Owing to this inherent imprecision, difficulties always exist in some conventional methodologies when describing integrated air quality conditions with respect to various pollutants. Therefore, this paper presents two fuzzy synthetic multiplication techniques to establish a classification of air quality. The fuzzy multiplication technique employs the max-min operations for "or" and "and" in executing the fuzzy arithmetic operations. A set of air pollutant data (carbon monoxide, sulfur dioxide, nitrogen dioxide, ozone, and particulate matter (PM10)) collected from a network of 51 stations in the Klang Valley and East Malaysia (Sabah and Sarawak) was utilized in this evaluation. The two fuzzy multiplication techniques consistently classified Malaysia's air quality as "good." The findings indicate that the techniques may have successfully harmonized inherent discrepancies and interpreted complex conditions. It was demonstrated that fuzzy synthetic multiplication techniques are quite appropriate for air quality management.
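A minimal numerical sketch of fuzzy synthetic evaluation with the max-min composition referred to above; the membership matrix and pollutant weights below are purely illustrative, not values from the study.

    import numpy as np

    # Rows: pollutants (CO, SO2, NO2, O3, PM10); columns: classes (good, moderate, unhealthy).
    # Membership degrees and weights are hypothetical illustrations only.
    R = np.array([[0.7, 0.3, 0.0],
                  [0.6, 0.4, 0.0],
                  [0.5, 0.4, 0.1],
                  [0.8, 0.2, 0.0],
                  [0.4, 0.5, 0.1]])
    w = np.array([0.15, 0.2, 0.2, 0.15, 0.3])   # pollutant weights, sum to 1

    # Max-min composition: b_j = max_i min(w_i, r_ij)
    b = np.max(np.minimum(w[:, None], R), axis=0)
    label = ["good", "moderate", "unhealthy"][int(np.argmax(b))]
    print(b, label)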
Cell Membrane Coating Nanotechnology.
Fang, Ronnie H; Kroll, Ashley V; Gao, Weiwei; Zhang, Liangfang
2018-06-01
Nanoparticle-based therapeutic, prevention, and detection modalities have the potential to greatly impact how diseases are diagnosed and managed in the clinic. With the wide range of nanomaterials available, the rational design of nanocarriers on an application-specific basis has become increasingly commonplace. Here, a comprehensive overview is provided on an emerging platform: cell-membrane-coating nanotechnology. As a fundamental unit of biology, cells carry out a wide range of functions, including the remarkable ability to interface and interact with their surrounding environment. Instead of attempting to replicate such functions via synthetic techniques, researchers are now directly leveraging naturally derived cell membranes as a means of bestowing nanoparticles with enhanced biointerfacing capabilities. This top-down technique is facile, highly generalizable, and has the potential to greatly augment existing nanocarriers. Further, the introduction of a natural membrane substrate onto nanoparticle surfaces has enabled additional applications beyond those traditionally associated with nanomedicine. Despite its relative youth, there exists an impressive body of literature on cell membrane coating, which is covered here in detail. Overall, there is still significant room for development, as researchers continue to refine existing workflows while finding new and exciting applications that can take advantage of this developing technology. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Structural design using equilibrium programming formulations
NASA Technical Reports Server (NTRS)
Scotti, Stephen J.
1995-01-01
Solutions to increasingly larger structural optimization problems are desired. However, computational resources are strained to meet this need. New methods will be required to solve increasingly larger problems. The present approaches to solving large-scale problems involve approximations for the constraints of structural optimization problems and/or decomposition of the problem into multiple subproblems that can be solved in parallel. An area of game theory, equilibrium programming (also known as noncooperative game theory), can be used to unify these existing approaches from a theoretical point of view (considering the existence and optimality of solutions), and be used as a framework for the development of new methods for solving large-scale optimization problems. Equilibrium programming theory is described, and existing design techniques such as fully stressed design and constraint approximations are shown to fit within its framework. Two new structural design formulations are also derived. The first new formulation is another approximation technique which is a general updating scheme for the sensitivity derivatives of design constraints. The second new formulation uses a substructure-based decomposition of the structure for analysis and sensitivity calculations. Significant computational benefits of the new formulations compared with a conventional method are demonstrated.
Commentary: "re-programming or selecting adult stem cells?".
Trosko, James E
2008-01-01
The recent observations that embryonic stemness-associated genes could assist in the "de-differentiation" of adult skin fibroblast cells to "embryonic-like stem cells", using "somatic cell nuclear transfer" techniques, have been interpreted as indicating a "re-programming" of genes. These reports have demonstrated a "proof of principle" approach to bypass many, but not all, of the ethical, scientific and medical limitations of the "therapeutic cloning" of embryonic stem cells from embryos. However, while the interpretation that a real "re-programming" of all those somatic fibroblastic differentiation genes occurred might be correct, there does exist an alternative hypothesis for these exciting results. Based on the fact that multipotent adult stem cells exist in most, if not all, adult organs, the possibility exists that all these recent "re-programming" results, obtained using the somatic nuclear transfer techniques, were actually the result of transferred rare nuclear material from the adult stem cells residing in the skin of the mouse, monkey and human samples. An examination of the rationale for this challenging hypothesis draws on the "stem cell theory of cancer", as well as on the field of human adult stem cell research.
Methods for artifact detection and removal from scalp EEG: A review.
Islam, Md Kafiul; Rastegarnia, Amir; Yang, Zhi
2016-11-01
Electroencephalography (EEG) is the most popular brain activity recording technique used in wide range of applications. One of the commonly faced problems in EEG recordings is the presence of artifacts that come from sources other than brain and contaminate the acquired signals significantly. Therefore, much research over the past 15 years has focused on identifying ways for handling such artifacts in the preprocessing stage. However, this is still an active area of research as no single existing artifact detection/removal method is complete or universal. This article presents an extensive review of the existing state-of-the-art artifact detection and removal methods from scalp EEG for all potential EEG-based applications and analyses the pros and cons of each method. First, a general overview of the different artifact types that are found in scalp EEG and their effect on particular applications are presented. In addition, the methods are compared based on their ability to remove certain types of artifacts and their suitability in relevant applications (only functional comparison is provided not performance evaluation of methods). Finally, the future direction and expected challenges of current research is discussed. Therefore, this review is expected to be helpful for interested researchers who will develop and/or apply artifact handling algorithm/technique in future for their applications as well as for those willing to improve the existing algorithms or propose a new solution in this particular area of research. Copyright © 2016 Elsevier Masson SAS. All rights reserved.
Shape Sensing Techniques for Continuum Robots in Minimally Invasive Surgery: A Survey.
Shi, Chaoyang; Luo, Xiongbiao; Qi, Peng; Li, Tianliang; Song, Shuang; Najdovski, Zoran; Fukuda, Toshio; Ren, Hongliang
2017-08-01
Continuum robots provide inherent structural compliance with high dexterity to access surgical target sites along tortuous anatomical paths under constrained environments, and they enable complex and delicate operations to be performed through small incisions in minimally invasive surgery. These advantages enable their broad application with minimal trauma and make challenging clinical procedures possible with miniaturized instrumentation and high curvilinear access capabilities. However, their inherently deformable designs make it difficult to realize 3-D intraoperative real-time shape sensing to accurately model their shape. Solutions to this limitation can in turn advance the closely associated techniques of closed-loop control, path planning, human-robot interaction, and surgical manipulation safety in minimally invasive surgery. Although extensive model-based research that relies on kinematics and mechanics has been performed, accurate shape sensing of continuum robots remains challenging, particularly in cases of unknown and dynamic payloads. This survey investigates recent advances in alternative emerging techniques for 3-D shape sensing in this field and focuses on the following categories: fiber-optic-sensor-based, electromagnetic-tracking-based, and intraoperative imaging modality-based shape-reconstruction methods. The limitations of existing technologies and the prospects of new technologies are also discussed.
Lindsay, Kaitlin E; Rühli, Frank J; Deleon, Valerie Burke
2015-06-01
The technique of forensic facial approximation, or reconstruction, is one of many facets of the field of mummy studies. Although far from a rigorous scientific technique, evidence-based visualization of antemortem appearance may supplement radiological, chemical, histological, and epidemiological studies of ancient remains. Published guidelines exist for creating facial approximations, but few approximations are published with documentation of the specific process and references used. Additionally, significant new research has taken place in recent years which helps define best practices in the field. This case study records the facial approximation of a 3,000-year-old ancient Egyptian woman using medical imaging data and the digital sculpting program, ZBrush. It represents a synthesis of current published techniques based on the most solid anatomical and/or statistical evidence. Through this study, it was found that although certain improvements have been made in developing repeatable, evidence-based guidelines for facial approximation, there are many proposed methods still awaiting confirmation from comprehensive studies. This study attempts to assist artists, anthropologists, and forensic investigators working in facial approximation by presenting the recommended methods in a chronological and usable format. © 2015 Wiley Periodicals, Inc.
Molecular Strain Typing of Mycobacterium tuberculosis: a Review of Frequently Used Methods.
Ei, Phyu Win; Aung, Wah Wah; Lee, Jong Seok; Choi, Go Eun; Chang, Chulhun L
2016-11-01
Tuberculosis, caused by the bacterium Mycobacterium tuberculosis, remains one of the most serious global health problems. Molecular typing of M. tuberculosis has been used for various epidemiologic purposes as well as for clinical management. Currently, many techniques are available to type M. tuberculosis. Choosing the most appropriate technique in accordance with the existing laboratory conditions and the specific features of the geographic region is important. Insertion sequence IS6110-based restriction fragment length polymorphism (RFLP) analysis is considered the gold standard for the molecular epidemiologic investigations of tuberculosis. However, other polymerase chain reaction-based methods such as spacer oligonucleotide typing (spoligotyping), which detects 43 spacer sequence-interspersing direct repeats (DRs) in the genomic DR region; mycobacterial interspersed repetitive units-variable number tandem repeats, (MIRU-VNTR), which determines the number and size of tandem repetitive DNA sequences; repetitive-sequence-based PCR (rep-PCR), which provides high-throughput genotypic fingerprinting of multiple Mycobacterium species; and the recently developed genome-based whole genome sequencing methods demonstrate similar discriminatory power and greater convenience. This review focuses on techniques frequently used for the molecular typing of M. tuberculosis and discusses their general aspects and applications.
Glioma grading using cell nuclei morphologic features in digital pathology images
NASA Astrophysics Data System (ADS)
Reza, Syed M. S.; Iftekharuddin, Khan M.
2016-03-01
This work proposes a computationally efficient cell nuclei morphologic feature analysis technique to characterize brain gliomas in tissue slide images. Our contributions are two-fold: 1) obtain an optimized cell nuclei segmentation method based on the pros and cons of the existing techniques in the literature, and 2) extract representative features by k-means clustering of nuclei morphologic features including area, perimeter, eccentricity, and major axis length. This clustering-based representative feature extraction avoids the shortcomings of extensive tile [1] [2] and nuclear score [3] based methods for brain glioma grading in pathology images. A multilayer perceptron (MLP) is used to classify the extracted features into two tumor types: glioblastoma multiforme (GBM) and low grade glioma (LGG). Quantitative scores such as precision, recall, and accuracy are obtained using 66 clinical patients' images from The Cancer Genome Atlas (TCGA) [4] dataset. An average accuracy of ~94% from 10-fold cross-validation confirms the efficacy of the proposed method.
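A minimal sketch of the pipeline as described: k-means clustering of per-nucleus morphologic features produces a fixed-length representative vector per image, which an MLP then classifies as GBM or LGG. The cluster count, network size, and data handling here are hypothetical choices, not the paper's settings.

    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.neural_network import MLPClassifier

    def image_feature(nuclei_feats, k=4, seed=0):
        """nuclei_feats: (n_nuclei, 4) array of [area, perimeter, eccentricity, major axis].
        Returns the k-means centroids, sorted by area and flattened into one vector."""
        km = KMeans(n_clusters=k, random_state=seed, n_init=10).fit(nuclei_feats)
        centers = km.cluster_centers_[np.argsort(km.cluster_centers_[:, 0])]
        return centers.ravel()

    # Hypothetical usage: X_imgs is a list of per-image nuclei feature arrays,
    # y holds labels (0 = LGG, 1 = GBM).
    # X = np.vstack([image_feature(f) for f in X_imgs])
    # clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=2000).fit(X, y)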
NASA Technical Reports Server (NTRS)
Scott, D. W.
1994-01-01
This report describes efforts to use digital motion video compression technology to develop a highly portable device that would convert 1990-91 era IBM-compatible and/or Macintosh notebook computers into full-color, motion-video capable multimedia training systems. An architecture was conceived that would permit direct conversion of existing laser-disk-based multimedia courses with little or no reauthoring. The project did not physically demonstrate certain critical video keying techniques, but their implementation should be feasible. This investigation of digital motion video has spawned two significant spaceflight projects at MSFC: one to downlink multiple high-quality video signals from Spacelab, and the other to uplink videoconference-quality video in real time and high-quality video off-line, plus investigate interactive, multimedia-based techniques for enhancing onboard science operations. Other airborne or spaceborne spinoffs are possible.
A multimodal biometric authentication system based on 2D and 3D palmprint features
NASA Astrophysics Data System (ADS)
Aggithaya, Vivek K.; Zhang, David; Luo, Nan
2008-03-01
This paper presents a new personal authentication system that simultaneously exploits 2D and 3D palmprint features. Here, we aim to improve the accuracy and robustness of existing palmprint authentication systems using 3D palmprint features. The proposed system uses an active stereo technique, structured light, to simultaneously capture a 3D image (range data) of the palm and a registered intensity image. A surface-curvature-based method is employed to extract features from the 3D palmprint, and a Gabor-feature-based competitive coding scheme is used for the 2D representation. We analyze these representations individually and combine them with a score-level fusion technique. Our experiments on a database of 108 subjects achieve a significant improvement in performance (equal error rate) with the integration of 3D features as compared to the case when 2D palmprint features alone are employed.
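A minimal sketch of the score-level fusion step mentioned above: the 2D (competitive-code) and 3D (surface-curvature) matcher scores are min-max normalized and combined with a weighted sum. The weight is a hypothetical value, not the one used in the paper; in practice it would be tuned on a development set, e.g. to minimize the equal error rate.

    import numpy as np

    def minmax(scores):
        s = np.asarray(scores, dtype=float)
        return (s - s.min()) / (s.max() - s.min() + 1e-12)

    def fuse(scores_2d, scores_3d, w=0.6):
        """Weighted-sum score-level fusion of normalized 2D and 3D matcher scores."""
        return w * minmax(scores_2d) + (1.0 - w) * minmax(scores_3d)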
A novel murmur-based heart sound feature extraction technique using envelope-morphological analysis
NASA Astrophysics Data System (ADS)
Yao, Hao-Dong; Ma, Jia-Li; Fu, Bin-Bin; Wang, Hai-Yang; Dong, Ming-Chui
2015-07-01
Auscultation of heart sound (HS) signals serves as an important primary approach to diagnose cardiovascular diseases (CVDs) for centuries. Confronting the intrinsic drawbacks of traditional HS auscultation, computer-aided automatic HS auscultation based on feature extraction technique has witnessed explosive development. Yet, most existing HS feature extraction methods adopt acoustic or time-frequency features which exhibit poor relationship with diagnostic information, thus restricting the performance of further interpretation and analysis. Tackling such a bottleneck problem, this paper innovatively proposes a novel murmur-based HS feature extraction method since murmurs contain massive pathological information and are regarded as the first indications of pathological occurrences of heart valves. Adapting discrete wavelet transform (DWT) and Shannon envelope, the envelope-morphological characteristics of murmurs are obtained and three features are extracted accordingly. Validated by discriminating normal HS and 5 various abnormal HS signals with extracted features, the proposed method provides an attractive candidate in automatic HS auscultation.
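A minimal sketch, under assumed parameter choices, of the envelope-morphological idea described above: a discrete wavelet transform band-limits the heart-sound segment, a Shannon energy envelope is computed, and a few simple envelope statistics serve as murmur features. The specific wavelet, decomposition level, and feature definitions are assumptions, not the paper's.

    import numpy as np
    import pywt

    def shannon_envelope(x, win=64):
        """Shannon energy envelope: -x^2 log x^2, smoothed with a moving average."""
        x = x / (np.max(np.abs(x)) + 1e-12)
        e = -x**2 * np.log(x**2 + 1e-12)
        return np.convolve(e, np.ones(win) / win, mode="same")

    def murmur_features(hs, wavelet="db6", level=4):
        """Hypothetical envelope-morphological features of a heart-sound segment."""
        coeffs = pywt.wavedec(hs, wavelet, level=level)
        coeffs = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]  # keep approximation band
        band = pywt.waverec(coeffs, wavelet)[: len(hs)]
        env = shannon_envelope(band)
        return np.array([env.max(), env.mean(), (env > 0.5 * env.max()).sum() / len(env)])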
Adaptive proxy map server for efficient vector spatial data rendering
NASA Astrophysics Data System (ADS)
Sayar, Ahmet
2013-01-01
The rapid transmission of vector map data over the Internet is becoming a bottleneck of spatial data delivery and visualization in web-based environments because of increasing data volumes and limited network bandwidth. In order to improve both the transmission and rendering performance of vector spatial data over the Internet, we propose a proxy map server enabling parallel vector data fetching as well as caching to improve the performance of web-based map servers in a dynamic environment. The proxy map server is placed seamlessly anywhere between the client and the final services, intercepting users' requests. It employs an efficient parallelization technique based on spatial proximity and data density in cases where distributed replicas exist for the same spatial data. The effectiveness of the proposed technique is demonstrated at the end of the article through the application of creating map images enriched with earthquake seismic data records.
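A minimal sketch of the proxy idea, with hypothetical service URLs and a naive partitioning: the requested bounding box is split into spatially proximate sub-regions, each sub-request is fetched in parallel from one of the available replicas, and responses are cached on the proxy so repeated requests are served locally. The request format and partitioning rule are assumptions, not the paper's implementation.

    import concurrent.futures as cf
    import functools
    import urllib.request

    REPLICAS = ["http://wfs-a.example.org", "http://wfs-b.example.org"]  # hypothetical

    def split_bbox(bbox, n=2):
        """Split (minx, miny, maxx, maxy) into n x n sub-boxes."""
        minx, miny, maxx, maxy = bbox
        dx, dy = (maxx - minx) / n, (maxy - miny) / n
        return [(minx + i * dx, miny + j * dy, minx + (i + 1) * dx, miny + (j + 1) * dy)
                for i in range(n) for j in range(n)]

    @functools.lru_cache(maxsize=256)          # simple proxy-side cache keyed by sub-box
    def fetch(sub_bbox, replica):
        url = f"{replica}/wfs?request=GetFeature&bbox={','.join(map(str, sub_bbox))}"
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read()

    def proxy_request(bbox):
        parts = split_bbox(bbox)
        with cf.ThreadPoolExecutor(max_workers=len(parts)) as pool:
            futures = [pool.submit(fetch, p, REPLICAS[i % len(REPLICAS)])
                       for i, p in enumerate(parts)]
            return [f.result() for f in futures]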
Flag-based detection of weak gas signatures in long-wave infrared hyperspectral image sequences
NASA Astrophysics Data System (ADS)
Marrinan, Timothy; Beveridge, J. Ross; Draper, Bruce; Kirby, Michael; Peterson, Chris
2016-05-01
We present a flag manifold based method for detecting chemical plumes in long-wave infrared hyperspectral movies. The method encodes temporal and spatial information related to a hyperspectral pixel into a flag, or nested sequence of linear subspaces. The technique used to create the flags pushes information about the background clutter, ambient conditions, and potential chemical agents into the leading elements of the flags. Exploiting this temporal information allows for a detection algorithm that is sensitive to the presence of weak signals. This method is compared to existing techniques qualitatively on real data and quantitatively on synthetic data to show that the flag-based algorithm consistently performs better on data when the SINRdB is low, and beats the ACE and MF algorithms in probability of detection for low probabilities of false alarm even when the SINRdB is high.
Hard exudates segmentation based on learned initial seeds and iterative graph cut.
Kusakunniran, Worapan; Wu, Qiang; Ritthipravat, Panrasee; Zhang, Jian
2018-05-01
(Background and Objective): The occurrence of hard exudates is one of the early signs of diabetic retinopathy, which is one of the leading causes of blindness. Many patients with diabetic retinopathy lose their vision because of late detection of the disease. Thus, this paper proposes a novel method for the automatic segmentation of hard exudates in retinal images. (Methods): The existing methods are based on either supervised or unsupervised learning techniques. In addition, the learned segmentation models may often cause missed detection and/or false detection of hard exudates, due to the lack of rich characteristics, the intra-variations, and the similarity with other components in the retinal image. Thus, in this paper, supervised learning based on the multilayer perceptron (MLP) is only used to identify initial seeds with high confidence of being hard exudates. Then, the segmentation is finalized by unsupervised learning based on the iterative graph cut (GC) using clusters of initial seeds. Also, in order to reduce color intra-variations of hard exudates in different retinal images, color transfer (CT) is applied to normalize their color information in the pre-processing step. (Results): The experiments and comparisons with other existing methods are based on two well-known datasets, e_ophtha EX and DIARETDB1. The proposed method outperforms the other existing methods in the literature, with a pixel-level sensitivity of 0.891 for the DIARETDB1 dataset and 0.564 for the e_ophtha EX dataset. Cross-dataset validation, where the training process is performed on one dataset and the testing process is performed on another dataset, is also evaluated in this paper in order to illustrate the robustness of the proposed method. (Conclusions): This newly proposed method integrates supervised and unsupervised learning based techniques. It achieves improved performance when compared with the existing methods in the literature. The robustness of the proposed method in the cross-dataset scenario could enhance its practical usage; that is, the trained model could be more practical for unseen data in real-world situations, especially when the capturing environments of training and testing images are not the same. Copyright © 2018 Elsevier B.V. All rights reserved.
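A minimal sketch of the seed-then-refine idea, with OpenCV's grabCut standing in for the paper's iterative graph cut and a per-pixel colour MLP standing in for its trained seed classifier; the thresholds and features are hypothetical.

    import numpy as np
    import cv2
    from sklearn.neural_network import MLPClassifier

    # mlp = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000).fit(pixel_rgb, labels)
    # (assumed pre-trained on labelled exudate / non-exudate pixel colours)

    def segment_exudates(img_bgr, mlp, thr=0.95):
        """Seeds from an MLP on per-pixel colour, finalized with grabCut (a graph-cut
        stand-in for the paper's iterative GC). img_bgr must be an 8-bit 3-channel image."""
        h, w = img_bgr.shape[:2]
        prob = mlp.predict_proba(img_bgr.reshape(-1, 3).astype(float))[:, 1].reshape(h, w)
        mask = np.full((h, w), cv2.GC_PR_BGD, np.uint8)
        mask[prob > thr] = cv2.GC_FGD               # high-confidence exudate seeds
        mask[prob < 0.05] = cv2.GC_BGD              # confident background
        bgd, fgd = np.zeros((1, 65), np.float64), np.zeros((1, 65), np.float64)
        cv2.grabCut(img_bgr, mask, None, bgd, fgd, 5, cv2.GC_INIT_WITH_MASK)
        return np.isin(mask, (cv2.GC_FGD, cv2.GC_PR_FGD))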
ANALYSIS OF RADON MITIGATION TECHNIQUES USED IN EXISTING U.S. HOUSES
This paper reviews the full range of techniques that have been installed in existing US houses for the purpose of reducing indoor radon concentrations resulting from soil gas entry. The review addresses the performance, installation and operating costs, applicability, mechanisms,...
NASA Astrophysics Data System (ADS)
Tamboli, Prakash Kumar; Duttagupta, Siddhartha P.; Roy, Kallol
2015-08-01
The paper deals with dynamic compensation of delayed Self-Powered Flux Detectors (SPFDs) using a discrete-time H∞ filtering method for improving the response of SPFDs with significant delayed components, such as Platinum and Vanadium SPFDs. We also present a comparative study between the Linear Matrix Inequality (LMI) based H∞ filtering and Algebraic Riccati Equation (ARE) based Kalman filtering methods with respect to their delay compensation capabilities. Finally, a recursive H∞ filter based on an adaptive fading memory technique is proposed, which provides improved performance over existing methods. The existing delay compensation algorithms do not account for the rate of change of the signal when determining the filter gain and therefore add significant noise during the delay compensation process. The proposed adaptive fading memory H∞ filter minimizes the overall noise very effectively while keeping the response time to a minimum. The recursive algorithm is easy to implement in real time as compared to the LMI (or ARE) based solutions.
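A minimal sketch of the adaptive fading memory idea in a Kalman-style delay-compensation filter (not the paper's H∞ formulation): the filter tracks the delayed detector signal and its rate, inflates the prediction covariance by a fading factor when the innovation is large, and reconstructs the prompt flux through an assumed first-order detector lag. The models, tuning values, and adaptation rule are all assumptions.

    import numpy as np

    def compensate_spnd(z, tau=5.0, dt=1.0, q=1e-6, r=1e-3, lam_max=1.2):
        """Fading-memory filter for a delayed detector signal z (1-D array).
        Prompt flux is reconstructed as level + tau * rate (first-order lag assumed)."""
        F = np.array([[1.0, dt], [0.0, 1.0]])            # level / rate dynamics
        H = np.array([[1.0, 0.0]])
        Q = q * np.array([[dt**3 / 3, dt**2 / 2], [dt**2 / 2, dt]])
        x, P = np.array([z[0], 0.0]), np.eye(2)
        prompt = []
        for zk in z:
            innov = zk - (H @ (F @ x))[0]
            s = (H @ (F @ P @ F.T) @ H.T)[0, 0] + r
            lam = lam_max if innov**2 > 9.0 * s else 1.0  # fade memory on large innovations
            P = lam * (F @ P @ F.T) + Q                   # inflated prediction covariance
            x = F @ x
            K = (P @ H.T) / ((H @ P @ H.T)[0, 0] + r)
            x = x + K[:, 0] * (zk - (H @ x)[0])
            P = (np.eye(2) - K @ H) @ P
            prompt.append(x[0] + tau * x[1])              # invert the first-order lag
        return np.array(prompt)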
NASA Astrophysics Data System (ADS)
Pappalardo, Francesco; Pennisi, Marzio
2016-07-01
Fibrosis represents a process where an excessive tissue formation in an organ follows the failure of a physiological reparative or reactive process. Mathematical and computational techniques may be used to improve the understanding of the mechanisms that lead to the disease and to test potential new treatments that may directly or indirectly have positive effects against fibrosis [1]. In this scenario, Ben Amar and Bianca [2] give us a broad picture of the existing mathematical and computational tools that have been used to model fibrotic processes at the molecular, cellular, and tissue levels. Among such techniques, agent based models (ABM) can give a valuable contribution in the understanding and better management of fibrotic diseases.
A Design of Product Collaborative Online Configuration Model
NASA Astrophysics Data System (ADS)
Wang, Xiaoguo; Zheng, Jin; Zeng, Qian
According to the actual needs of mass customization, product personalization and collaborative design, this paper analyzes and studies the working mechanism of module-based product configuration technology and puts forward an information model of a modular product family. Combining case-based reasoning (CBR) and constraint satisfaction problem (CSP) solving techniques, we design and study an algorithm for product configuration and analyze its time complexity. Taking a car chassis as the application object, we provide a prototype system for online configuration. Taking advantage of this system, designers can make appropriate changes to existing programs in accordance with demand. This will accelerate all aspects of product development and shorten the product cycle. The system will also provide strong technical support for enterprises to improve their market competitiveness.
Analyzing Activity Behavior and Movement in a Naturalistic Environment using Smart Home Techniques
Cook, Diane J.; Schmitter-Edgecombe, Maureen; Dawadi, Prafulla
2015-01-01
One of the many services that intelligent systems can provide is the ability to analyze the impact of different medical conditions on daily behavior. In this study we use smart home and wearable sensors to collect data while (n=84) older adults perform complex activities of daily living. We analyze the data using machine learning techniques and reveal that differences between healthy older adults and adults with Parkinson disease not only exist in their activity patterns, but that these differences can be automatically recognized. Our machine learning classifiers reach an accuracy of 0.97 with an AUC value of 0.97 in distinguishing these groups. Our permutation-based testing confirms that the sensor-based differences between these groups are statistically significant. PMID:26259225
Analyzing Activity Behavior and Movement in a Naturalistic Environment Using Smart Home Techniques.
Cook, Diane J; Schmitter-Edgecombe, Maureen; Dawadi, Prafulla
2015-11-01
One of the many services that intelligent systems can provide is the ability to analyze the impact of different medical conditions on daily behavior. In this study, we use smart home and wearable sensors to collect data, while ( n = 84) older adults perform complex activities of daily living. We analyze the data using machine learning techniques and reveal that differences between healthy older adults and adults with Parkinson disease not only exist in their activity patterns, but that these differences can be automatically recognized. Our machine learning classifiers reach an accuracy of 0.97 with an area under the ROC curve value of 0.97 in distinguishing these groups. Our permutation-based testing confirms that the sensor-based differences between these groups are statistically significant.
Recent Advances in Paper-Based Sensors
Liana, Devi D.; Raguse, Burkhard; Gooding, J. Justin; Chow, Edith
2012-01-01
Paper-based sensors are a new alternative technology for fabricating simple, low-cost, portable and disposable analytical devices for many application areas including clinical diagnosis, food quality control and environmental monitoring. The unique properties of paper which allow passive liquid transport and compatibility with chemicals/biochemicals are the main advantages of using paper as a sensing platform. Depending on the main goal to be achieved in paper-based sensors, the fabrication methods and the analysis techniques can be tuned to fulfill the needs of the end-user. Current paper-based sensors are focused on microfluidic delivery of solution to the detection site whereas more advanced designs involve complex 3-D geometries based on the same microfluidic principles. Although paper-based sensors are very promising, they still suffer from certain limitations such as accuracy and sensitivity. However, it is anticipated that in the future, with advances in fabrication and analytical techniques, that there will be more new and innovative developments in paper-based sensors. These sensors could better meet the current objectives of a viable low-cost and portable device in addition to offering high sensitivity and selectivity, and multiple analyte discrimination. This paper is a review of recent advances in paper-based sensors and covers the following topics: existing fabrication techniques, analytical methods and application areas. Finally, the present challenges and future outlooks are discussed. PMID:23112667
Ibrahim, El-Sayed H; Stojanovska, Jadranka; Hassanein, Azza; Duvernoy, Claire; Croisille, Pierre; Pop-Busui, Rodica; Swanson, Scott D
2018-05-16
Cardiac MRI tagging is a valuable technique for evaluating regional heart function. Currently, there are a number of different techniques for analyzing the tagged images. Specifically, k-space-based analysis techniques have been shown to be much faster than image-based techniques, and harmonic-phase (HARP) and sine-wave modeling (SinMod) stand as two well-known techniques of the former group that are frequently used in clinical studies. In this study, we compared HARP and SinMod and studied inter-observer variability between the two techniques for evaluating myocardial strain and apical-to-base torsion in a numerical phantom, nine healthy controls, and thirty diabetic patients. Based on the ground-truth numerical phantom measurements (strain = -20% and rotation angle = -4.4°), HARP and SinMod resulted in overestimation (in absolute value terms) of strain by 1% and 5% (strain values), and of rotation angle by 0.4° and 2.0°, respectively. For the in-vivo results, global strain and torsion ranges were -10.6 to -35.3% and 1.8-12.7°/cm in patients, and -17.8 to -32.7% and 1.8-12.3°/cm in volunteers. On average, SinMod overestimated strain measurements by 5.7% and 5.9% (strain values) in the patients and volunteers, respectively, compared to HARP, and overestimated torsion measurements by 2.9°/cm and 2.5°/cm in the patients and volunteers, respectively, compared to HARP. Location-wise, the ranges for basal, mid-ventricular, and apical strain in patients (volunteers) were -8.4 to -31.5% (-11.6 to -33.3%), -6.3 to -37.2% (-17.8 to -33.3%), and -5.2 to -38.4% (-20.0 to -33.2%), respectively. SinMod overestimated strain in the basal, mid-ventricular, and apical slices by 4.7% (5.7%), 5.9% (5.5%), and 8.9% (6.8%), respectively, compared to HARP in the patients (volunteers). Nevertheless, there was good correlation between the HARP and SinMod measurements. Finally, there were no significant strain or torsion measurement differences between patients and volunteers. There was good inter-observer agreement, as all measurement differences lay within the Bland-Altman ± 2 standard-deviation (SD) difference limits. In conclusion, despite the consistency of the results by either HARP or SinMod and the acceptable agreement of the strain and torsion patterns generated by both techniques, SinMod systematically overestimated the measurements compared to HARP. Under current operating conditions, the measurements from HARP and SinMod cannot be used interchangeably. Copyright © 2017. Published by Elsevier Inc.
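A minimal sketch of the agreement check described above: the Bland-Altman bias and ± 2 SD limits are computed for paired HARP and SinMod strain values. The numbers below are illustrative, not data from the study.

    import numpy as np

    def bland_altman(harp, sinmod):
        """Return the bias and +/- 2 SD limits of agreement between paired measurements."""
        harp, sinmod = np.asarray(harp, float), np.asarray(sinmod, float)
        diff = sinmod - harp
        bias, sd = diff.mean(), diff.std(ddof=1)
        return bias, bias - 2 * sd, bias + 2 * sd

    # Hypothetical paired global strain values (%) from the two techniques:
    harp = [-20.1, -18.4, -22.3, -25.0, -19.7]
    sinmod = [-25.8, -24.0, -28.1, -30.6, -25.2]
    print(bland_altman(harp, sinmod))   # bias of about -5.6%: SinMod gives larger strain magnitudes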
NASA Astrophysics Data System (ADS)
Harman, Philip V.; Flack, Julien; Fox, Simon; Dowley, Mark
2002-05-01
The conversion of existing 2D images to 3D is proving commercially viable and fulfills the growing need for high-quality stereoscopic images. This approach is particularly effective when creating content for the new generation of autostereoscopic displays that require multiple stereo images. The dominant technique for such content conversion is to develop a depth map for each frame of 2D material. The use of a depth map as part of the 2D to 3D conversion process has a number of desirable characteristics: 1. The resolution of the depth map may be lower than that of the associated 2D image. 2. It can be highly compressed. 3. 2D compatibility is maintained. 4. Real-time generation of stereo, or multiple stereo pairs, is possible. The main disadvantage has been the laborious nature of the manual conversion techniques used to create depth maps from existing 2D images, which results in a slow and costly process. An alternative, highly productive technique has been developed based upon the use of Machine Learning Algorithms (MLAs). This paper describes the application of MLAs to the generation of depth maps and presents the results of the commercial application of this approach.
Global Design Optimization for Aerodynamics and Rocket Propulsion Components
NASA Technical Reports Server (NTRS)
Shyy, Wei; Papila, Nilay; Vaidyanathan, Rajkumar; Tucker, Kevin; Turner, James E. (Technical Monitor)
2000-01-01
Modern computational and experimental tools for aerodynamics and propulsion applications have matured to a stage where they can provide substantial insight into engineering processes involving fluid flows, and can be fruitfully utilized to help improve the design of practical devices. In particular, rapid and continuous development in aerospace engineering demands that new design concepts be regularly proposed to meet goals for increased performance, robustness and safety while concurrently decreasing cost. To date, the majority of the effort in design optimization of fluid dynamics has relied on gradient-based search algorithms. Global optimization methods can utilize the information collected from various sources and by different tools. These methods offer multi-criterion optimization, handle the existence of multiple design points and trade-offs via insight into the entire design space, can easily perform tasks in parallel, and are often effective in filtering the noise intrinsic to numerical and experimental data. However, a successful application of the global optimization method needs to address issues related to data requirements with an increase in the number of design variables, and methods for predicting the model performance. In this article, we review recent progress made in establishing suitable global optimization techniques employing neural network and polynomial-based response surface methodologies. Issues addressed include techniques for construction of the response surface, design of experiment techniques for supplying information in an economical manner, optimization procedures and multi-level techniques, and assessment of relative performance between polynomials and neural networks. Examples drawn from wing aerodynamics, turbulent diffuser flows, gas-gas injectors, and supersonic turbines are employed to help demonstrate the issues involved in an engineering design context. Both the usefulness of the existing knowledge to aid current design practices and the need for future research are identified.
Wind-Tunnel Investigations of Blunt-Body Drag Reduction Using Forebody Surface Roughness
NASA Technical Reports Server (NTRS)
Whitmore, Stephen A.; Sprague, Stephanie; Naughton, Jonathan W.; Curry, Robert E. (Technical Monitor)
2001-01-01
This paper presents results of wind-tunnel tests that demonstrate a novel drag reduction technique for blunt-based vehicles. For these tests, the forebody roughness of a blunt-based model was modified using micromachined surface overlays. As forebody roughness increases, the boundary layer at the model aft thickens and reduces the shearing effect of the external flow on the separated flow behind the base region, resulting in reduced base drag. For vehicle configurations with large base drag, existing data predict that a small increment in forebody friction drag will result in a relatively large decrease in base drag. If the added increment in forebody skin drag is optimized with respect to base drag, reducing the total drag of the configuration is possible. The wind-tunnel test results conclusively demonstrate the existence of a forebody drag-base drag optimal point. The data demonstrate that the base drag coefficient corresponding to the drag minimum lies between 0.225 and 0.275, referenced to the base area. Most importantly, the data show a drag reduction of approximately 15% when the drag optimum is reached. When this drag reduction is scaled to the X-33 base area, drag savings approaching 45,000 N (10,000 lbf) can be realized.
Ben Chaabane, Salim; Fnaiech, Farhat
2014-01-23
Color image segmentation has so far been applied in many areas; hence, many different techniques have recently been developed and proposed. In medical imaging, image segmentation may help doctors follow up the disease of a patient from processed breast cancer images. The main objective of this work is to rebuild and enhance each cell from the three component images provided by an input image. Indeed, from an initial segmentation obtained using statistical features and histogram threshold techniques, the resulting segmentation can accurately represent incomplete and merged cells and enhance them, so that the cells become clear and easy to count, providing real help to doctors. A novel method for color edge extraction based on statistical features and an automatic threshold is presented. The traditional edge detector, based on the first- and second-order neighborhood describing the relationship between the current pixel and its neighbors, is extended to the statistical domain. Hence, color edges in an image are obtained by combining the statistical features and the automatic threshold techniques. Finally, on the obtained color edges with a specific primitive color, a combination rule is used to integrate the edge results over the three color components. Breast cancer cell images were used to evaluate the performance of the proposed method both quantitatively and qualitatively. Hence, a visual and a numerical assessment based on the probability of correct classification (PC), the false classification probability (Pf), and the classification accuracy (Sens(%)) are presented and compared with existing techniques. The proposed method shows its superiority in detecting points that really belong to the cells and also facilitates counting the number of processed cells. Computer simulations highlight that the proposed method substantially enhances the segmented image, with smaller error rates than other existing algorithms under the same settings (patterns and parameters). Moreover, it provides high classification accuracy, reaching 97.94%. Additionally, the segmentation method may be extended to other medical imaging types having similar properties.
Texas two-step: a framework for optimal multi-input single-output deconvolution.
Neelamani, Ramesh; Deffenbaugh, Max; Baraniuk, Richard G
2007-11-01
Multi-input single-output deconvolution (MISO-D) aims to extract a deblurred estimate of a target signal from several blurred and noisy observations. This paper develops a new two-step framework--Texas Two-Step--to solve MISO-D problems with known blurs. Texas Two-Step first reduces the MISO-D problem to a related single-input single-output deconvolution (SISO-D) problem by invoking the concept of sufficient statistics (SSs) and then solves the simpler SISO-D problem using an appropriate technique. The two-step framework enables new MISO-D techniques (both optimal and suboptimal) based on the rich suite of existing SISO-D techniques. In fact, the properties of SSs imply that a MISO-D algorithm is mean-squared-error optimal if and only if it can be rearranged to conform to the Texas Two-Step framework. Using this insight, we construct new wavelet- and curvelet-based MISO-D algorithms with asymptotically optimal performance. Simulated and real data experiments verify that the framework is indeed effective.
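A minimal sketch of the two-step idea under simplifying assumptions (known blurs, equal-variance white Gaussian noise in every channel): step one combines the observations into a sufficient statistic by matched filtering in the Fourier domain, which yields an equivalent SISO problem; step two applies any SISO deconvolver, here a plain Wiener-style regularized inverse rather than the paper's wavelet- or curvelet-based estimators.

    import numpy as np

    def texas_two_step(observations, blurs, noise_var=1e-2, reg=1e-2):
        """MISO deconvolution sketch: (1) reduce to SISO via matched-filter combining
        (a sufficient statistic for equal white-noise channels), (2) Wiener-deconvolve.
        observations, blurs: lists of 1-D arrays; blurs are zero-padded to the signal length."""
        Ys = [np.fft.fft(y) for y in observations]
        Hs = [np.fft.fft(h, n=len(y)) for h, y in zip(blurs, observations)]
        # Step 1: combined observation and effective blur of the equivalent SISO problem
        Y_eq = sum(np.conj(H) * Y for H, Y in zip(Hs, Ys))
        H_eq = sum(np.abs(H) ** 2 for H in Hs)
        # Step 2: any SISO deconvolver; here a simple regularized (Wiener-style) inverse
        X_hat = Y_eq / (H_eq + reg * noise_var)
        return np.real(np.fft.ifft(X_hat))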
A force-based, parallel assay for the quantification of protein-DNA interactions.
Limmer, Katja; Pippig, Diana A; Aschenbrenner, Daniela; Gaub, Hermann E
2014-01-01
Analysis of transcription factor binding to DNA sequences is of utmost importance to understand the intricate regulatory mechanisms that underlie gene expression. Several techniques exist that quantify DNA-protein affinity, but they are either very time-consuming or suffer from possible misinterpretation due to complicated algorithms or approximations like many high-throughput techniques. We present a more direct method to quantify DNA-protein interaction in a force-based assay. In contrast to single-molecule force spectroscopy, our technique, the Molecular Force Assay (MFA), parallelizes force measurements so that it can test one or multiple proteins against several DNA sequences in a single experiment. The interaction strength is quantified by comparison to the well-defined rupture stability of different DNA duplexes. As a proof-of-principle, we measured the interaction of the zinc finger construct Zif268/NRE against six different DNA constructs. We could show the specificity of our approach and quantify the strength of the protein-DNA interaction.
Generalized image contrast enhancement technique based on Heinemann contrast discrimination model
NASA Astrophysics Data System (ADS)
Liu, Hong; Nodine, Calvin F.
1994-03-01
This paper presents a generalized image contrast enhancement technique which equalizes perceived brightness based on the Heinemann contrast discrimination model. This is a modified algorithm which presents an improvement over the previous study by Mokrane in its mathematically proven existence of a unique solution and in its easily tunable parameterization. The model uses a log-log representation of contrast luminosity between targets and the surround in a fixed luminosity background setting. The algorithm consists of two nonlinear gray-scale mapping functions which have seven parameters, two of which are adjustable Heinemann constants. Another parameter is the background gray level. The remaining four parameters are nonlinear functions of gray scale distribution of the image, and can be uniquely determined once the previous three are given. Tests have been carried out to examine the effectiveness of the algorithm for increasing the overall contrast of images. It can be demonstrated that the generalized algorithm provides better contrast enhancement than histogram equalization. In fact, the histogram equalization technique is a special case of the proposed mapping.
Dense and dynamic 3D selection for game-based virtual environments.
Cashion, Jeffrey; Wingrave, Chadwick; LaViola, Joseph J
2012-04-01
3D object selection is more demanding when 1) objects densely surround the target object, 2) the target object is significantly occluded, and 3) the target object is dynamically changing location. Most 3D selection techniques and guidelines were developed and tested on static or mostly sparse environments. In contrast, games tend to incorporate densely packed and dynamic objects as part of their typical interaction. With the increasing popularity of 3D selection in games using hand gestures or motion controllers, our current understanding of 3D selection needs revision. We present a study that compared four different selection techniques under five different scenarios based on varying object density and motion dynamics. We utilized two existing techniques, Raycasting and SQUAD, and developed two variations of them, Zoom and Expand, using iterative design. Our results indicate that while Raycasting and SQUAD both have weaknesses in terms of speed and accuracy in dense and dynamic environments, by making small modifications to them (i.e., flavoring), we can achieve significant performance increases.
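A minimal sketch of plain Raycasting selection against spherical proxies: the selected object is the closest one whose bounding sphere the pick ray hits. SQUAD and the Zoom/Expand variants refine such a candidate set interactively; they are not reproduced here.

```python
import numpy as np

def raycast_select(origin, direction, centers, radii):
    """Return the index of the nearest sphere hit by the pick ray, or None."""
    origin = np.asarray(origin, dtype=float)
    d = np.asarray(direction, dtype=float)
    d /= np.linalg.norm(d)
    best, best_t = None, np.inf
    for i, (c, r) in enumerate(zip(centers, radii)):
        oc = np.asarray(c, dtype=float) - origin
        t_mid = np.dot(oc, d)                 # distance along the ray to closest approach
        miss2 = np.dot(oc, oc) - t_mid ** 2   # squared perpendicular miss distance
        if miss2 > r * r:
            continue                          # ray misses this sphere
        t_hit = t_mid - np.sqrt(r * r - miss2)
        if 0.0 < t_hit < best_t:
            best, best_t = i, t_hit
    return best
```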
Study of Automated Module Fabrication for Lightweight Solar Blanket Utilization
NASA Technical Reports Server (NTRS)
Gibson, C. E.
1979-01-01
Cost-effective automated techniques for accomplishing the titled purpose, based on existing in-house capability, are described. As a measure of the considered automation, the production of a 50 kilowatt solar array blanket, exclusive of support and deployment structure, within an eight-month fabrication period was used. Solar cells considered for this blanket were 2 x 4 x .02 cm wrap-around cells and 2 x 2 x .005 cm and 3 x 3 x .005 cm standard bar contact thin cells, all with welded contacts. Existing fabrication processes are described, the rationale for each process is given, and the capability for further automation is discussed.
Vibration Based Crack Detection in a Rotating Disk. Part 2; Experimental Results
NASA Technical Reports Server (NTRS)
Gyekenyesi, Andrew L.; Sawicki, Jerzy T.; Martin, Richard E.; Haase, Wayne C.; Baaklini, George
2005-01-01
This paper describes the experimental results concerning the detection of a crack in a rotating disk. The goal was to utilize blade tip clearance and shaft vibration measurements to monitor changes in the system's center of mass and/or blade deformation behaviors. The concept of the approach is based on the fact that the development of a disk crack results in a distorted strain field within the component. As a result, a minute deformation in the disk's geometry as well as a change in the system's center of mass occurs. Here, a notch was used to simulate an actual crack. The vibration based experimental results failed to identify the existence of a notch when utilizing the approach described above, even with a rather large, circumferential notch (1.2 in.) located approximately mid-span on the disk (disk radius = 4.63 in. with notch at r = 2.12 in.). This was somewhat expected, since the finite element based results in Part 1 of this study predicted changes in blade tip clearance as well as center of mass shifts due to a notch to be less than 0.001 in. Therefore, the small changes incurred by the notch could not be differentiated from the mechanical and electrical noise of the rotor system. Although the crack detection technique of interest failed to identify the existence of the notch, the vibration data produced and captured here will be utilized in upcoming studies that will focus on different data mining techniques concerning damage detection in a disk.
Optimization technique for problems with an inequality constraint
NASA Technical Reports Server (NTRS)
Russell, K. J.
1972-01-01
The general technique uses a modified version of an existing method termed the pattern search technique. A new procedure, called the parallel move strategy, permits the pattern search technique to be used with problems involving a constraint.
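A generic coordinate pattern search with an inequality constraint handled by rejecting infeasible trial points, shown only to illustrate the class of method; the report's specific "parallel move strategy" is not reproduced here, and the starting point is assumed feasible.

```python
import numpy as np

def pattern_search(f, g, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    """Minimize f(x) subject to g(x) <= 0 by exploratory coordinate moves."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        improved = False
        for i in range(x.size):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step
                if g(trial) <= 0 and f(trial) < f(x):   # accept feasible improving moves
                    x, improved = trial, True
        if not improved:
            step *= shrink                               # contract the pattern
            if step < tol:
                break
    return x

# Example: minimize (x-3)^2 + (y-2)^2 subject to x + y - 4 <= 0
x_opt = pattern_search(lambda v: (v[0] - 3) ** 2 + (v[1] - 2) ** 2,
                       lambda v: v[0] + v[1] - 4.0, [0.0, 0.0])
```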
Mindfulness Meditation for Fibromyalgia: Mechanistic and Clinical Considerations.
Adler-Neal, Adrienne L; Zeidan, Fadel
2017-09-01
Fibromyalgia is a disorder characterized by widespread pain and a spectrum of psychological comorbidities, rendering treatment difficult and often a financial burden. Fibromyalgia is a complicated chronic pain condition that requires a multimodal therapeutic approach to optimize treatment efficacy. Thus, it has been postulated that mind-body techniques may prove fruitful in treating fibromyalgia. Mindfulness meditation, a behavioral technique premised on non-reactive sensory awareness, attenuates pain and improves mental health outcomes. However, the impact of mindfulness meditation on fibromyalgia-related outcomes has not been comprehensively characterized. The present review delineates the existing evidence supporting the effectiveness and hypothesized mechanisms of mindfulness meditation in treating fibromyalgia-related outcomes. Mindfulness-based interventions premised on cultivating acceptance, non-attachment, and social engagement may be most effective in decreasing fibromyalgia-related pain and psychological symptoms. Mindfulness-based therapies may alleviate fibromyalgia-related outcomes through multiple neural, psychological, and physiological processes. Mindfulness meditation may provide an effective complementary treatment approach for fibromyalgia patients, especially when combined with other reliable techniques (exercise; cognitive behavioral therapy). However, characterizing the specific analgesic mechanisms supporting mindfulness meditation is a critical step to fostering the clinical validity of this technique. Identification of the specific analgesic mechanisms supporting mindfulness-based pain relief could be utilized to better design behavioral interventions to specifically target fibromyalgia-related outcomes.
Cooperative Opportunistic Pressure Based Routing for Underwater Wireless Sensor Networks.
Javaid, Nadeem; Muhammad; Sher, Arshad; Abdul, Wadood; Niaz, Iftikhar Azim; Almogren, Ahmad; Alamri, Atif
2017-03-19
In this paper, three opportunistic pressure based routing techniques for underwater wireless sensor networks (UWSNs) are proposed. The first is the cooperative opportunistic pressure based routing protocol (Co-Hydrocast), the second is the improved Hydrocast (improved-Hydrocast), and the third is the cooperative improved Hydrocast (Co-improved Hydrocast). In order to minimize lengthy routing paths between the source and the destination and to avoid void holes in sparse networks, sensor nodes are deployed at different strategic locations. The deployment of sensor nodes at strategic locations assures maximum monitoring of the network field. To conserve energy consumption and minimize the number of hops, a greedy algorithm is used to transmit data packets from the source to the destination. Moreover, opportunistic routing is also exploited to avoid void regions by making backward transmissions to find a reliable path towards the destination. A relay cooperation mechanism is used for reliable data packet delivery: when the signal-to-noise ratio (SNR) of the received signal is not within the predefined threshold, maximal ratio combining (MRC) is used as a diversity technique to improve the SNR of the received signals at the destination. Extensive simulations validate that our schemes perform better in terms of packet delivery ratio and energy consumption than the existing technique, Hydrocast.
NASA Technical Reports Server (NTRS)
Sidery, T.; Aylott, B.; Christensen, N.; Farr, B.; Farr, W.; Feroz, F.; Gair, J.; Grover, K.; Graff, P.; Hanna, C.;
2014-01-01
The problem of reconstructing the sky position of compact binary coalescences detected via gravitational waves is a central one for future observations with the ground-based network of gravitational-wave laser interferometers, such as Advanced LIGO and Advanced Virgo. Different techniques for sky localization have been independently developed. They can be divided in two broad categories: fully coherent Bayesian techniques, which are high latency and aimed at in-depth studies of all the parameters of a source, including sky position, and "triangulation-based" techniques, which exploit the data products from the search stage of the analysis to provide an almost real-time approximation of the posterior probability density function of the sky location of a detection candidate. These techniques have previously been applied to data collected during the last science runs of gravitational-wave detectors operating in the so-called initial configuration. Here, we develop and analyze methods for assessing the self consistency of parameter estimation methods and carrying out fair comparisons between different algorithms, addressing issues of efficiency and optimality. These methods are general, and can be applied to parameter estimation problems other than sky localization. We apply these methods to two existing sky localization techniques representing the two above-mentioned categories, using a set of simulated inspiral-only signals from compact binary systems with a total mass equal to or less than 20 solar masses and nonspinning components. We compare the relative advantages and costs of the two techniques and show that sky location uncertainties are on average a factor of approximately 20 smaller for fully coherent techniques than for the specific variant of the triangulation-based technique used during the last science runs, at the expense of a factor of approximately 1000 longer processing time.
NASA Astrophysics Data System (ADS)
Sidery, T.; Aylott, B.; Christensen, N.; Farr, B.; Farr, W.; Feroz, F.; Gair, J.; Grover, K.; Graff, P.; Hanna, C.; Kalogera, V.; Mandel, I.; O'Shaughnessy, R.; Pitkin, M.; Price, L.; Raymond, V.; Röver, C.; Singer, L.; van der Sluys, M.; Smith, R. J. E.; Vecchio, A.; Veitch, J.; Vitale, S.
2014-04-01
The problem of reconstructing the sky position of compact binary coalescences detected via gravitational waves is a central one for future observations with the ground-based network of gravitational-wave laser interferometers, such as Advanced LIGO and Advanced Virgo. Different techniques for sky localization have been independently developed. They can be divided in two broad categories: fully coherent Bayesian techniques, which are high latency and aimed at in-depth studies of all the parameters of a source, including sky position, and "triangulation-based" techniques, which exploit the data products from the search stage of the analysis to provide an almost real-time approximation of the posterior probability density function of the sky location of a detection candidate. These techniques have previously been applied to data collected during the last science runs of gravitational-wave detectors operating in the so-called initial configuration. Here, we develop and analyze methods for assessing the self consistency of parameter estimation methods and carrying out fair comparisons between different algorithms, addressing issues of efficiency and optimality. These methods are general, and can be applied to parameter estimation problems other than sky localization. We apply these methods to two existing sky localization techniques representing the two above-mentioned categories, using a set of simulated inspiral-only signals from compact binary systems with a total mass of ≤20M⊙ and nonspinning components. We compare the relative advantages and costs of the two techniques and show that sky location uncertainties are on average a factor ≈20 smaller for fully coherent techniques than for the specific variant of the triangulation-based technique used during the last science runs, at the expense of a factor ≈1000 longer processing time.
Spherical hashing: binary code embedding with hyperspheres.
Heo, Jae-Pil; Lee, Youngwoon; He, Junfeng; Chang, Shih-Fu; Yoon, Sung-Eui
2015-11-01
Many binary code embedding schemes have been actively studied recently, since they can provide efficient similarity search, and compact data representations suitable for handling large scale image databases. Existing binary code embedding techniques encode high-dimensional data by using hyperplane-based hashing functions. In this paper we propose a novel hypersphere-based hashing function, spherical hashing, to map more spatially coherent data points into a binary code compared to hyperplane-based hashing functions. We also propose a new binary code distance function, spherical Hamming distance, tailored for our hypersphere-based binary coding scheme, and design an efficient iterative optimization process to achieve both balanced partitioning for each hash function and independence between hashing functions. Furthermore, we generalize spherical hashing to support various similarity measures defined by kernel functions. Our extensive experiments show that our spherical hashing technique significantly outperforms state-of-the-art techniques based on hyperplanes across various benchmarks with sizes ranging from one to 75 million GIST, BoW, and VLAD descriptors. The performance gains are consistent and large, up to 100 percent improvement over the second best tested method. These results confirm the unique merits of using hyperspheres to encode proximity regions in high-dimensional spaces. Finally, our method is intuitive and easy to implement.
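A minimal sketch of hypersphere-based encoding: bit c of a point is 1 when the point falls inside pivot sphere c. The spherical Hamming distance shown (XOR count divided by common-1 count) follows the paper's definition as we understand it; the iterative learning of pivots and radii for balance and independence is omitted.

```python
import numpy as np

def spherical_encode(X, pivots, radii):
    """X: (n,d) data; pivots: (c,d); radii: (c,). Returns an (n,c) boolean code matrix."""
    dists = np.linalg.norm(X[:, None, :] - pivots[None, :, :], axis=2)
    return dists <= radii[None, :]

def spherical_hamming(a, b):
    """Distance between two boolean codes: |a XOR b| / |a AND b|."""
    common = np.count_nonzero(a & b)
    return np.count_nonzero(a ^ b) / max(common, 1)   # guard against an empty intersection
```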
Sparse alignment for robust tensor learning.
Lai, Zhihui; Wong, Wai Keung; Xu, Yong; Zhao, Cairong; Sun, Mingming
2014-10-01
Multilinear/tensor extensions of manifold learning based algorithms have been widely used in computer vision and pattern recognition. This paper first provides a systematic analysis of the multilinear extensions for the most popular methods by using alignment techniques, thereby obtaining a general tensor alignment framework. From this framework, it is easy to show that the manifold learning based tensor learning methods are intrinsically different from the alignment techniques. Based on the alignment framework, a robust tensor learning method called sparse tensor alignment (STA) is then proposed for unsupervised tensor feature extraction. Different from the existing tensor learning methods, L1- and L2-norms are introduced to enhance the robustness in the alignment step of the STA. The advantage of the proposed technique is that the difficulty in selecting the size of the local neighborhood can be avoided in the manifold learning based tensor feature extraction algorithms. Although STA is an unsupervised learning method, the sparsity encodes the discriminative information in the alignment step and provides the robustness of STA. Extensive experiments on the well-known image databases as well as action and hand gesture databases by encoding object images as tensors demonstrate that the proposed STA algorithm gives the most competitive performance when compared with the tensor-based unsupervised learning methods.
Hierarchical clustering of EMD based interest points for road sign detection
NASA Astrophysics Data System (ADS)
Khan, Jesmin; Bhuiyan, Sharif; Adhami, Reza
2014-04-01
This paper presents an automatic road traffic sign detection and recognition system based on hierarchical clustering of interest points and joint transform correlation. The proposed algorithm consists of the three following stages: interest point detection, clustering of those points, and similarity search. At the first stage, discriminative, rotation- and scale-invariant interest points are selected from the image edges based on the 1-D empirical mode decomposition (EMD). We propose a two-step unsupervised clustering technique, which is adaptive and based on two criteria. In this context, the detected points are initially clustered based on the stable local features related to brightness and color, which are extracted using a Gabor filter. Then points belonging to each partition are reclustered depending on the dispersion of the points in the initial cluster, using a position feature. This two-step hierarchical clustering yields the possible candidate road signs, or regions of interest (ROIs). Finally, a fringe-adjusted joint transform correlation (JTC) technique is used for matching the unknown signs with the existing known reference road signs stored in the database. The presented framework provides a novel way to detect a road sign from natural scenes, and the results demonstrate the efficacy of the proposed technique, which yields a very low false hit rate.
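An illustrative sketch of the two-step grouping of interest points: cluster first on appearance features (here k-means over color/Gabor descriptors), then split each group by the spatial dispersion of the point positions. The EMD-based point detection, the adaptive choice of cluster counts, and the JTC matching stage are not reproduced; the splitting heuristic below is an assumption.

```python
import numpy as np
from sklearn.cluster import KMeans

def two_step_cluster(features, positions, k_appearance=4, max_spread=40.0):
    """features: (n,f) appearance descriptors; positions: (n,2) pixel coordinates."""
    labels = KMeans(n_clusters=k_appearance, n_init=10).fit_predict(features)
    rois = []
    for c in range(k_appearance):
        pts = positions[labels == c]
        if len(pts) == 0:
            continue
        # re-cluster by position when the appearance group is spatially dispersed
        n_sub = max(1, int(np.ceil(pts.std(axis=0).max() / max_spread)))
        sub = KMeans(n_clusters=min(n_sub, len(pts)), n_init=10).fit_predict(pts)
        rois += [pts[sub == s] for s in range(sub.max() + 1)]
    return rois   # each entry is one candidate region of interest
```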
Ghorpade, Uma; Suryawanshi, Mahesh; Shin, Seung Wook; Gurav, Kishor; Patil, Pramod; Pawar, Sambhaji; Hong, Chang Woo; Kim, Jin Hyeok; Kolekar, Sanjay
2014-10-07
Owing to the earth abundance of kesterite, recent progress in chalcogenide-based Cu2ZnSn(Sx,Se1-x)4 (CZTSSe) thin films has drawn prime attention in research and development on thin film solar cells (TFSCs). This review is focused on the current developments in the synthesis of CZTS nanocrystals (NCs) using a hot injection (HI) technique and provides comprehensive discussions on the current status of CZTSSe TFSCs. This article begins with a description of the advantages of nanoparticulate based thin films, then introduces the basics of this technique, and also discusses the corresponding growth mechanism. A brief overview further addresses a series of investigations on developments in HI based CZTSSe NCs using different solvents, ranging from highly toxic ones to environmentally benign materials. A variety of recipes and techniques for NC ink formulation, and thereby the preparation of absorber layers using NC inks, are outlined. The deposition of precursor thin films, post-deposition processes such as sulfurization or selenization treatments, and the fabrication of CZTSSe NC based solar cells and their performances are discussed. Finally, we present concluding remarks and perspectives for further developments in the existing research on CZTSSe based nanoparticulate (NP) TFSCs towards future green technology.
Frequency-Wavenumber (FK)-Based Data Selection in High-Frequency Passive Surface Wave Survey
NASA Astrophysics Data System (ADS)
Cheng, Feng; Xia, Jianghai; Xu, Zongbo; Hu, Yue; Mi, Binbin
2018-04-01
Passive surface wave methods have gained much attention from geophysical and civil engineering communities because of the limited application of traditional seismic surveys in highly populated urban areas. Considering that they can provide high-frequency phase velocity information up to several tens of Hz, the active surface wave survey would be omitted and the amount of field work could be dramatically reduced. However, the measured dispersion energy image in the passive surface wave survey would usually be polluted by a type of "crossed" artifacts at high frequencies. It is common in the bidirectional noise distribution case with a linear receiver array deployed along roads or railways. We review several frequently used passive surface wave methods and derive the underlying physics for the existence of the "crossed" artifacts. We prove that the "crossed" artifacts would cross the true surface wave energy at fixed points in the f-v domain and propose a FK-based data selection technique to attenuate the artifacts in order to retrieve the high-frequency information. Numerical tests further demonstrate the existence of the "crossed" artifacts and indicate that the well-known wave field separation method, FK filter, does not work for the selection of directional noise data. Real-world applications manifest the feasibility of the proposed FK-based technique to improve passive surface wave methods by a priori data selection. Finally, we discuss the applicability of our approach.
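One plausible reading of the f-k selection step, shown as a sketch: transform a noise window recorded on the linear array to the frequency-wavenumber domain and keep only windows dominated by a single propagation direction (energy concentrated in one wavenumber half-plane). The dominance threshold and windowing scheme here are illustrative assumptions, not the paper's exact criterion.

```python
import numpy as np

def fk_direction_ratio(gather, dx):
    """gather: (n_traces, n_samples) noise window from a linear array with trace spacing dx."""
    power = np.abs(np.fft.fft2(gather)) ** 2
    k = np.fft.fftfreq(gather.shape[0], d=dx)           # wavenumber axis (trace dimension)
    pos, neg = power[k > 0].sum(), power[k < 0].sum()
    return max(pos, neg) / (pos + neg + 1e-12)           # 0.5 ~ bidirectional, 1.0 ~ one-sided

def select_windows(windows, dx, threshold=0.8):
    """Keep only time windows whose noise field is dominated by one direction."""
    return [w for w in windows if fk_direction_ratio(w, dx) >= threshold]
```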
NASA Technical Reports Server (NTRS)
Hague, D. S.; Vanderberg, J. D.; Woodbury, N. W.
1974-01-01
A method for rapidly examining the probable applicability of weight estimating formulae to a specific aerospace vehicle design is presented. The Multivariate Analysis Retrieval and Storage System (MARS) is comprised of three computer programs which sequentially operate on the weight and geometry characteristics of past aerospace vehicle designs. Weight and geometric characteristics are stored in a set of data bases which are fully computerized. Additional data bases are readily added to the MARS system, and/or the existing data bases may be easily expanded to include additional vehicles or vehicle characteristics.
Paper-based analytical devices for environmental analysis.
Meredith, Nathan A; Quinn, Casey; Cate, David M; Reilly, Thomas H; Volckens, John; Henry, Charles S
2016-03-21
The field of paper-based microfluidics has experienced rapid growth over the past decade. Microfluidic paper-based analytical devices (μPADs), originally developed for point-of-care medical diagnostics in resource-limited settings, are now being applied in new areas, such as environmental analyses. Low-cost paper sensors show great promise for on-site environmental analysis; the theme of ongoing research complements existing instrumental techniques by providing high spatial and temporal resolution for environmental monitoring. This review highlights recent applications of μPADs for environmental analysis along with technical advances that may enable μPADs to be more widely implemented in field testing.
Research directions in large scale systems and decentralized control
NASA Technical Reports Server (NTRS)
Tenney, R. R.
1980-01-01
Control theory provides a well established framework for dealing with automatic decision problems and a set of techniques for automatic decision making which exploit special structure, but it does not deal well with complexity. The potential exists for combining control theoretic and knowledge based concepts into a unified approach. The elements of control theory are diagrammed, including modern control and large scale systems.
ERIC Educational Resources Information Center
Miller, David; Moran, Teresa
2007-01-01
There are differences of opinion about self-esteem enhancement in the classroom; these differences exist at both conceptual and practical levels. The aim of this study was to ascertain whether techniques employed by primary school teachers as a day-to-day part of their teaching can have measurable effects on the self-esteem of their pupils. Two…
Poor Man's Virtual Camera: Real-Time Simultaneous Matting and Camera Pose Estimation.
Szentandrasi, Istvan; Dubska, Marketa; Zacharias, Michal; Herout, Adam
2016-03-18
Today's film and advertisement production heavily uses computer graphics combined with living actors by chromakeying. The matchmoving process typically takes a considerable manual effort. Semi-automatic matchmoving tools exist as well, but they still work offline and require manual check-up and correction. In this article, we propose an instant matchmoving solution for green screen. It uses a recent technique of planar uniform marker fields. Our technique can be used in indie and professional filmmaking as a cheap and ultramobile virtual camera, and for shot prototyping and storyboard creation. The matchmoving technique based on marker fields of shades of green is very computationally efficient: we developed and present in the article a mobile application running at 33 FPS. Our technique is thus available to anyone with a smartphone at low cost and with easy setup, opening space for new levels of filmmakers' creative expression.
The influence of surface finishing methods on touch-sensitive reactions
NASA Astrophysics Data System (ADS)
Kukhta, M. S.; Sokolov, A. P.; Krauinsh, P. Y.; Kozlova, A. D.; Bouchard, C.
2017-02-01
This paper describes modern technological development trends in jewelry design. In the jewelry industry, new trends associated with the introduction of updated non-traditional materials and finishing techniques are appearing. The existing information-oriented society enhances the visual aesthetics of new jewelry forms, decoration techniques (depth and surface), and the synthesis of different materials, which, all in all, reveal a bias towards positive effects of visual design. Today, the jewelry industry includes not only traditional techniques, but also such improved techniques as computer-assisted design, 3D prototyping and other alternatives that produce an updated level of jewelry material processing. The authors present the specific features of ornamental pattern design, decoration types (depth and surface), and a comparative analysis of different approaches to surface finishing. Identifying the appearance and effect of jewelry is based on the proposed evaluation criteria, and the basis for advanced visual aesthetics is predicated on touch-sensitive responses.
A quantitative investigation of the fracture pump-in/flowback test
DOE Office of Scientific and Technical Information (OSTI.GOV)
Plahn, S.V.; Nolte, K.G.; Thompson, L.G.
1997-02-01
Fracture-closure pressure is an important parameter for fracture treatment design and evaluation. The pump-in/flowback (PIFB) test is frequently used to estimate its magnitude. The test is attractive because bottomhole pressures (BHPs) during flowback develop a distinct and repeatable signature. This is in contrast to the pump-in/shut-in test, where strong indications of fracture closure are rarely seen. Various techniques are used to extract closure pressure from the flowback-pressure response. Unfortunately, these techniques give different estimates for closure pressure, and their theoretical bases are not well established. The authors present results that place the PIFB test on a firmer foundation. A numerical model is used to simulate the PIFB test and glean physical mechanisms contributing to the response. On the basis of their simulation results, they propose interpretation techniques that give better estimates of closure pressure than existing techniques.
Procedural mishaps with trephine-based intraosseous anesthesia.
Small, Joel C; Witherspoon, David E; Regan, John D; Hall, Ellen
2011-01-01
Failure to achieve profound anesthesia during dental treatment can be a significant problem for dental clinicians, especially for endodontic procedures on teeth in the mandibular arch with irreversible pulpitis. A number of supplemental local anesthesia techniques exist, the most effective of which may be the intraosseous injection. Two cases are presented demonstrating the dangers associated with the use of the intraosseous anesthesia technique. While the technique can provide profound anesthesia in otherwise difficult to anesthetize cases, care must be taken during its administration. Both cases show the damage done to the root and overlying bone by the injudicious use of the trephine. It is incumbent on the clinician to fully consider the anatomy in the area prior to insertion of the trephine. Intraosseous anesthesia techniques are a valuable addition to the clinicians' armamentarium. However, careless administration can result in problems of an endodontic or periodontal nature that may be difficult to rectify.
Counterflow Dielectrophoresis for Trypanosome Enrichment and Detection in Blood
NASA Astrophysics Data System (ADS)
Menachery, Anoop; Kremer, Clemens; Wong, Pui E.; Carlsson, Allan; Neale, Steven L.; Barrett, Michael P.; Cooper, Jonathan M.
2012-10-01
Human African trypanosomiasis or sleeping sickness is a deadly disease endemic in sub-Saharan Africa, caused by single-celled protozoan parasites. Although it has been targeted for elimination by 2020, this will only be realized if diagnosis can be improved to enable identification and treatment of afflicted patients. Existing techniques of detection are restricted by their limited field-applicability, sensitivity and capacity for automation. Microfluidic-based technologies offer the potential for highly sensitive automated devices that could achieve detection at the lowest levels of parasitemia and consequently help in the elimination programme. In this work we implement an electrokinetic technique for the separation of trypanosomes from both mouse and human blood. This technique utilises differences in polarisability between the blood cells and trypanosomes to achieve separation through opposed bi-directional movement (cell counterflow). We combine this enrichment technique with an automated image analysis detection algorithm, negating the need for a human operator.
Objective fitting of hemoglobin dynamics in traumatic bruises based on temperature depth profiling
NASA Astrophysics Data System (ADS)
Vidovič, Luka; Milanič, Matija; Majaron, Boris
2014-02-01
Pulsed photothermal radiometry (PPTR) allows noninvasive measurement of laser-induced temperature depth profiles. The obtained profiles provide information on the depth distribution of absorbing chromophores, such as melanin and hemoglobin. We apply this technique to objectively characterize the mass diffusion and decomposition rate of extravasated hemoglobin during the bruise healing process. In the present study, we introduce objective fitting of PPTR data obtained over the course of the bruise healing process. By applying Monte Carlo simulation of laser energy deposition and simulation of the corresponding PPTR signal, quantitative analysis of the underlying bruise healing processes is possible. Introduction of objective fitting enables an objective comparison between the simulated and experimental PPTR signals. In this manner, we avoid reconstruction of laser-induced depth profiles and thus the inherent loss of information in the process. This approach enables us to determine the value of hemoglobin mass diffusivity, which is controversial in the existing literature. Such information will be a valuable addition to existing bruise age determination techniques.
Arefin, Md Shamsul
2012-01-01
This work presents a technique for the chirality (n, m) assignment of semiconducting single wall carbon nanotubes by solving a set of empirical equations of the tight binding model parameters. The empirical equations of the nearest neighbor hopping parameters, relating the term (2n − m) with the first and second optical transition energies of the semiconducting single wall carbon nanotubes, are also proposed. They provide almost the same level of accuracy for lower and higher diameter nanotubes. An algorithm is presented to determine the chiral index (n, m) of any unknown semiconducting tube by solving these empirical equations using values of radial breathing mode frequency and the first or second optical transition energy from resonant Raman spectroscopy. In this paper, the chirality of 55 semiconducting nanotubes is assigned using the first and second optical transition energies. Unlike the existing methods of chirality assignment, this technique does not require graphical comparison or pattern recognition between existing experimental and theoretical Kataura plot. PMID:28348319
NASA Technical Reports Server (NTRS)
Hardy, E. E.; Skaley, J. E.; Phillips, E. S.
1974-01-01
This investigation was undertaken to develop a low-cost, manual technique for enhancing ERTS-1 imagery and preparing it in a suitable format for users with wide and varied interests related to land use and natural resources information. The goals were: to develop enhancement techniques based on concepts and practices extant in the photographic sciences, to provide a means of allowing productive interpretation of the imagery by manual means, to produce a product at low cost, and to provide a product that would have wide applications and be compatible with existing information systems. The cost of preparing the photographically enhanced, enlarged negatives and positives and the diazo materials is about 1 cent per square mile. The cost of creating and mapping a land use classification of twelve use types at a scale of 1:250,000 is only $1 per square mile. The product is understood by users, is economical, and is compatible with existing information systems.
NASA Astrophysics Data System (ADS)
Riah, Zoheir; Sommet, Raphael; Nallatamby, Jean C.; Prigent, Michel; Obregon, Juan
2004-05-01
We present in this paper a set of coherent tools for noise characterization and physics-based analysis of noise in semiconductor devices. This noise toolbox relies on a low frequency noise measurement setup with special high current capabilities thanks to an accurate and original calibration. It relies also on a simulation tool based on the drift diffusion equations and the linear perturbation theory, associated with the Green's function technique. This physics-based noise simulator has been implemented successfully in the Scilab environment and is specifically dedicated to HBTs. Some results are given and compared to those existing in the literature.
Severity-Based Adaptation with Limited Data for ASR to Aid Dysarthric Speakers
Mustafa, Mumtaz Begum; Salim, Siti Salwah; Mohamed, Noraini; Al-Qatab, Bassam; Siong, Chng Eng
2014-01-01
Automatic speech recognition (ASR) is currently used in many assistive technologies, such as helping individuals with speech impairment in their communication ability. One challenge in ASR for speech-impaired individuals is the difficulty in obtaining a good speech database of impaired speakers for building an effective speech acoustic model. Because there are very few existing databases of impaired speech, which are also limited in size, the obvious solution to build a speech acoustic model of impaired speech is by employing adaptation techniques. However, issues that have not been addressed in existing studies in the area of adaptation for speech impairment are as follows: (1) identifying the most effective adaptation technique for impaired speech; and (2) the use of suitable source models to build an effective impaired-speech acoustic model. This research investigates the above-mentioned two issues on dysarthria, a type of speech impairment affecting millions of people. We applied both unimpaired and impaired speech as the source model with well-known adaptation techniques like maximum likelihood linear regression (MLLR) and constrained MLLR (C-MLLR). The recognition accuracy of each impaired speech acoustic model is measured in terms of word error rate (WER), with further assessments, including phoneme insertion, substitution and deletion rates. Unimpaired speech, when combined with limited high-quality speech-impaired data, improves the performance of ASR systems in recognising severely impaired dysarthric speech. The C-MLLR adaptation technique was also found to be better than MLLR in recognising mildly and moderately impaired speech, based on the statistical analysis of the WER. It was found that phoneme substitution was the biggest contributing factor to WER in dysarthric speech for all levels of severity. The results show that speech acoustic models derived from suitable adaptation techniques improve the performance of ASR systems in recognising impaired speech with limited adaptation data. PMID:24466004
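A minimal sketch of the scoring side only: word (or phoneme) error rate computed by edit distance, with the substitution/insertion/deletion counts that the study breaks out. This is standard WER bookkeeping, not the MLLR/C-MLLR adaptation itself.

```python
def wer_counts(reference, hypothesis):
    """reference, hypothesis: lists of words or phonemes. Returns (WER, subs, ins, dels)."""
    R, H = len(reference), len(hypothesis)
    # dp[i][j] = (edit cost, substitutions, insertions, deletions) for ref[:i] vs hyp[:j]
    dp = [[(0, 0, 0, 0)] * (H + 1) for _ in range(R + 1)]
    for i in range(1, R + 1):
        dp[i][0] = (i, 0, 0, i)          # delete everything
    for j in range(1, H + 1):
        dp[0][j] = (j, 0, j, 0)          # insert everything
    for i in range(1, R + 1):
        for j in range(1, H + 1):
            if reference[i - 1] == hypothesis[j - 1]:
                dp[i][j] = dp[i - 1][j - 1]
                continue
            c, s, n, d = dp[i - 1][j - 1]
            sub = (c + 1, s + 1, n, d)
            c, s, n, d = dp[i][j - 1]
            ins = (c + 1, s, n + 1, d)
            c, s, n, d = dp[i - 1][j]
            dele = (c + 1, s, n, d + 1)
            dp[i][j] = min(sub, ins, dele)   # ties resolved by tuple order
    cost, subs, ins_, dels = dp[R][H]
    return cost / max(R, 1), subs, ins_, dels
```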
Directional virtual backbone based data aggregation scheme for Wireless Visual Sensor Networks.
Zhang, Jing; Liu, Shi-Jian; Tsai, Pei-Wei; Zou, Fu-Min; Ji, Xiao-Rong
2018-01-01
Data gathering is a fundamental task in Wireless Visual Sensor Networks (WVSNs). The features of directional antennas and of visual data make WVSNs more complex than conventional Wireless Sensor Networks (WSNs). The virtual backbone is a technique capable of constructing clusters; the version that incorporates the aggregation operation is also referred to as the virtual backbone tree. In most of the existing literature, the main focus is on the efficiency brought by the construction of clusters, while local-balance problems are generally neglected. To fill this gap, the Directional Virtual Backbone based Data Aggregation Scheme (DVBDAS) for WVSNs is proposed in this paper. In addition, a measure called the energy consumption density is proposed for evaluating the adequacy of results in cluster-based construction problems. Moreover, a directional virtual backbone construction scheme is proposed that considers the local-balance factor. Furthermore, the associated network coding mechanism is utilized to construct DVBDAS. Finally, both a theoretical analysis of the proposed DVBDAS and simulations are given for evaluating its performance. The experimental results show that the proposed DVBDAS achieves higher performance in terms of both energy preservation and network lifetime extension than the existing methods.
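A generic greedy sketch of backbone (cluster-head) selection on a sensor graph: repeatedly promote the node that covers the most still-uncovered neighbours, weighted by residual energy. This only illustrates the virtual-backbone idea; the directional, local-balanced construction and network coding of DVBDAS are not reproduced here.

```python
def greedy_backbone(adjacency, energy):
    """adjacency: {node: set of neighbours}; energy: {node: residual energy}."""
    uncovered = set(adjacency)
    backbone = []
    while uncovered:
        candidates = [n for n in adjacency if (adjacency[n] | {n}) & uncovered]
        node = max(candidates,
                   key=lambda n: energy[n] * len((adjacency[n] | {n}) & uncovered))
        backbone.append(node)                    # promote the best-scoring node
        uncovered -= adjacency[node] | {node}    # its closed neighbourhood is now covered
    return backbone
```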
USAF Flight Test Investigation of Focused Sonic Booms: Project Have Bears
NASA Technical Reports Server (NTRS)
Downing, Micah; Zamot, Noel; Moss, Chris; Morin, Daniel; Wolski, Ed; Chung, Sukhwan; Plotkin, Kenneth; Maglieri, Domenic
1996-01-01
Supersonic operations from military aircraft generate sonic booms that can affect people, animals and structures. A substantial experimental data base exists on sonic booms for aircraft in steady flight, and confidence in the predictive techniques has been established. All the focus sonic boom data in existence today were collected during the 60's and 70's as part of the information base for the US Supersonic Transport program and the French Jericho studies for the Concorde. These experiments formed the data base used to develop sonic boom propagation and prediction theories for focusing. There is a renewed interest in high-speed transports for civilian application. Moreover, today's fighter aircraft have better performance capabilities, and supersonic flights are more common during air combat maneuvers. Most of the existing data on focus booms are related to high-speed civil operations such as transitional linear accelerations and mild turns. However, military aircraft operating in training areas perform more drastic maneuvers such as dives and high-g turns. An update and confirmation of USAF prediction capabilities is required to demonstrate the ability to predict and control sonic boom impacts, especially those produced by air combat maneuvers.
NASA Astrophysics Data System (ADS)
Fagbohun, B. J.; Aladejana, O. O.
2016-09-01
A major challenge in most growing urban areas of developing countries without a pre-existing land use plan is the sustainable and efficient management of solid wastes. Siting a landfill is a complicated task because of several environmental regulations. This challenge creates the need to develop efficient strategies for selecting proper waste disposal sites in accordance with all existing environmental regulations. This paper presents a knowledge-based multi-criteria decision analysis using GIS for the selection of a suitable landfill site in Ado-Ekiti, Nigeria. In order to identify suitable sites for a landfill, seven factors - land use/cover, geology, rivers, soil, slope, lineaments and roads - were taken into consideration. Each factor was classified and ranked based on prior knowledge about the area and existing guidelines. Weights for each factor were determined through pair-wise comparison using Saaty's 9-point scale and the analytic hierarchy process (AHP). The integration of factors according to their weights using weighted index overlay analysis revealed that 39.23 km2 within the area is suitable for siting a landfill. The resulting suitable area was classified as high suitability, covering 6.47 km2 (16.49%), moderate suitability, 25.48 km2 (64.95%), and low suitability, 7.28 km2 (18.56%), based on their overall weights.
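A sketch of the weighting and overlay steps: derive factor weights from a Saaty pairwise-comparison matrix via its principal eigenvector, then combine the reclassified factor rasters as a weighted sum. Any example matrix values fed to these functions are illustrative, not those used in the study.

```python
import numpy as np

def ahp_weights(pairwise):
    """pairwise: (n,n) reciprocal comparison matrix on Saaty's 1-9 scale."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    principal = np.real(vecs[:, np.argmax(np.real(vals))])   # Perron eigenvector
    w = np.abs(principal)
    return w / w.sum()                                        # normalized factor weights

def weighted_overlay(layers, weights):
    """layers: equally sized rasters already ranked on a common suitability scale."""
    return sum(w * layer for w, layer in zip(weights, layers))
```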
Gap Analysis and Conservation Network for Freshwater Wetlands in Central Yangtze Ecoregion
Xiaowen, Li; Haijin, Zhuge; Li, Mengdi
2013-01-01
The Central Yangtze Ecoregion contains a large area of internationally important freshwater wetlands and supports a huge number of endangered waterbirds; however, these unique wetlands and the biodiversity they support are under the constant threats of human development pressures, and the prevailing conservation strategies generated based on the local scale cannot adequately be used as guidelines for ecoregion-based conservation initiatives for Central Yangtze at the broad scale. This paper aims at establishing and optimizing an ecological network for freshwater wetland conservation in the Central Yangtze Ecoregion based on large-scale gap analysis. A group of focal species and GIS-based extrapolation technique were employed to identify the potential habitats and conservation gaps, and the optimized conservation network was then established by combining existing protective system and identified conservation gaps. Our results show that only 23.49% of the potential habitats of the focal species have been included in the existing nature reserves in the Central Yangtze Ecoregion. To effectively conserve over 80% of the potential habitats for the focal species by optimizing the existing conservation network for the freshwater wetlands in Central Yangtze Ecoregion, it is necessary to establish new wetland nature reserves in 22 county units across Hubei, Anhui, and Jiangxi provinces. PMID:24062632
Sukhdeo, David S; Nam, Donguk; Kang, Ju-Hyung; Brongersma, Mark L; Saraswat, Krishna C
2015-06-29
Strain engineering has proven to be vital for germanium-based photonics, in particular light emission. However, applying a large permanent biaxial tensile strain to germanium has been a challenge. We present a simple, CMOS-compatible technique to conveniently induce a large, spatially homogenous strain in circular structures patterned within germanium nanomembranes. Our technique works by concentrating and amplifying a pre-existing small strain into a circular region. Biaxial tensile strains as large as 1.11% are observed by Raman spectroscopy and are further confirmed by photoluminescence measurements, which show enhanced and redshifted light emission from the strained germanium. Our technique allows the amount of biaxial strain to be customized lithographically, allowing the bandgaps of different germanium structures to be independently customized in a single mask process.
Artificial intelligence and signal processing for infrastructure assessment
NASA Astrophysics Data System (ADS)
Assaleh, Khaled; Shanableh, Tamer; Yehia, Sherif
2015-04-01
The Ground Penetrating Radar (GPR) is being recognized as an effective nondestructive evaluation technique to improve the inspection process. However, data interpretation and the complexity of the results impose some limitations on the practicality of using this technique. This is mainly due to the need for a trained, experienced person to interpret images obtained by the GPR system. In this paper, an algorithm to classify and assess the condition of infrastructure utilizing image processing and pattern recognition techniques is discussed. Features extracted from a dataset of images of defective and healthy slabs are used to train a computer-vision-based system, while another dataset is used to evaluate the proposed algorithm. Initial results show that the proposed algorithm is able to detect the existence of defects with about a 77% success rate.
Single-Molecule Three-Color FRET with Both Negligible Spectral Overlap and Long Observation Time
Hohng, Sungchul
2010-01-01
Full understanding of complex biological interactions frequently requires multi-color detection capability in doing single-molecule fluorescence resonance energy transfer (FRET) experiments. Existing single-molecule three-color FRET techniques, however, suffer from severe photobleaching of Alexa 488, or its alternative dyes, and have been limitedly used for kinetics studies. In this work, we developed a single-molecule three-color FRET technique based on the Cy3-Cy5-Cy7 dye trio, thus providing enhanced observation time and improved data quality. Because the absorption spectra of three fluorophores are well separated, real-time monitoring of three FRET efficiencies was possible by incorporating the alternating laser excitation (ALEX) technique both in confocal microscopy and in total-internal-reflection fluorescence (TIRF) microscopy. PMID:20808851
Two-dimensional fringe probing of transient liquid temperatures in a mini space.
Xue, Zhenlan; Qiu, Huihe
2011-05-01
A 2D fringe probing transient temperature measurement technique based on photothermal deflection theory was developed. It utilizes the material's refractive-index dependence on the temperature gradient to obtain temperature information from laser deflection. Instead of a single beam, this method applies multiple laser beams to obtain 2D temperature information. The laser fringe was generated with a Mach-Zehnder interferometer. A transient heating experiment was conducted using an electric wire to demonstrate this technique. The temperature field around the heating wire and its variation with time were obtained utilizing the scattered fringe patterns. This technique provides non-invasive 2D temperature measurements with spatial and temporal resolutions of 3.5 μm and 4 ms, respectively. It is possible to achieve a temporal resolution of 500 μs utilizing the existing high-speed camera.
Surgery for obstructed defecation syndrome-is there an ideal technique
Riss, Stefan; Stift, Anton
2015-01-01
Obstructive defecation syndrome (ODS) is a common disorder with a considerable impact on the quality of life of affected patients. Surgery for ODS remains a challenging topic. There exists a great variety of operative techniques to treat patients with ODS. According to the surgeon's preference, the approach can be transanal, transvaginal, transperineal or transabdominal. All techniques have their advantages and disadvantages. Notably, high-level evidence-based studies are significantly lacking in the literature, thus making accurate assessments difficult. Careful patient selection is crucial to achieve optimal functional results. It is mandatory not only to assess defecation disorders but also to evaluate overall pelvic floor symptoms, such as fecal incontinence and urinary disorders, in order to choose an appropriate and tailored strategy. Radiological investigation is essential but may not explain the complaints of every patient. PMID:25574075
Jenson, Susan K.; Trautwein, C.M.
1984-01-01
The application of an unsupervised, spatially dependent clustering technique (AMOEBA) to interpolated raster arrays of stream sediment data has been found to provide useful multivariate geochemical associations for modeling regional polymetallic resource potential. The technique is based on three assumptions regarding the compositional and spatial relationships of stream sediment data and their regional significance. These assumptions are: (1) compositionally separable classes exist and can be statistically distinguished; (2) the classification of multivariate data should minimize the pair probability of misclustering to establish useful compositional associations; and (3) a compositionally defined class represented by three or more contiguous cells within an array is a more important descriptor of a terrane than a class represented by spatial outliers.
The Role of a Physical Analysis Laboratory in a 300 mm IC Development and Manufacturing Centre
NASA Astrophysics Data System (ADS)
Kwakman, L. F. Tz.; Bicais-Lepinay, N.; Courtas, S.; Delille, D.; Juhel, M.; Trouiller, C.; Wyon, C.; de la Bardonnie, M.; Lorut, F.; Ross, R.
2005-09-01
To remain competitive, IC manufacturers have to accelerate the development of the most advanced (CMOS) technology and to deliver high-yielding products with the best cycle times and at competitive pricing. With the increase of technology complexity, the need for physical characterization support also increases; however, many of the existing techniques are no longer adequate to effectively support the 65-45 nm technology node developments. New and improved techniques are definitely needed to better characterize the often marginal processes, but these should not significantly impact fabrication costs or cycle time. Hence, characterization and metrology challenges in state-of-the-art IC manufacturing are both technical and economical in nature. TEM microscopy is needed for high-quality, high-volume analytical support, but several physical and practical hurdles have to be overcome. The success rate of FIB-SEM based failure analysis drops as defects often are too small to be detected and fault isolation becomes more difficult in nano-scale device structures. To remain effective and efficient, SEM and OBIRCH techniques have to be improved or complemented with other more effective methods. Chemical analysis of novel materials and critical interfaces requires improvements in the field of, e.g., SIMS and ToF-SIMS. Techniques that previously were used only sporadically, like EBSD and XRD, have become a 'must' to properly support backend process development. On the bright side, thanks to major technical advances, techniques that previously were practiced only at the laboratory level can now be used effectively for at-line fab metrology: voltage contrast based defectivity control, XPS based gate dielectric metrology and XRD based control of copper metallization processes are practical examples. In this paper, the capabilities and shortcomings of several techniques and corresponding equipment are presented, with practical illustrations of use in our Crolles facilities.
Al-Sadi, A M; Al-Mazroui, S S; Phillips, A J L
2015-08-01
Potting media and organic fertilizers (OFs) are commonly used in agricultural systems. However, there is a lack of studies on the efficiency of culture-based techniques in assessing the level of fungal diversity in these products. A study was conducted to investigate the efficiency of seven culture-based techniques and pyrosequencing for characterizing fungal diversity in potting media and OFs. Fungal diversity was evaluated using serial dilution, direct plating and baiting with carrot slices, potato slices, radish seeds, cucumber seeds and cucumber cotyledons. Identity of all the isolates was confirmed on the basis of the internal transcribed spacer region of the ribosomal RNA (ITS rRNA) sequence data. The direct plating technique was found to be superior over other culture-based techniques in the number of fungal species detected. It was also found to be simple and the least time consuming technique. Comparing the efficiency of direct plating with 454 pyrosequencing revealed that pyrosequencing detected 12 and 15 times more fungal species from potting media and OFs respectively. Analysis revealed that there were differences between potting media and OFs in the dominant phyla, classes, orders, families, genera and species detected. Zygomycota (52%) and Chytridiomycota (60%) were the predominant phyla in potting media and OFs respectively. The superiority of pyrosequencing over cultural methods could be related to the ability to detect obligate fungi, slow growing fungi and fungi that exist at low population densities. The evaluated methods in this study, especially direct plating and pyrosequencing, may be used as tools to help detect and reduce movement of unwanted fungi between countries and regions. © 2015 The Society for Applied Microbiology.
Electromigration kinetics and critical current of Pb-free interconnects
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lu, Minhua; Rosenberg, Robert
2014-04-07
Electromigration kinetics of Pb-free solder bump interconnects have been studied using a single bump parameter sweep technique. By removing bump to bump variations in structure, texture, and composition, the single bump sweep technique has provided both activation energy and power exponents that reflect atomic migration and interface reactions with fewer samples, shorter stress time, and better statistics than standard failure testing procedures. Contact metallurgies based on Cu and Ni have been studied. Critical current, which corresponds to the Blech limit, was found to exist in the Ni metallurgy, but not in the Cu metallurgy. A temperature dependence of critical current was also observed.
Perceptual distortion analysis of color image VQ-based coding
NASA Astrophysics Data System (ADS)
Charrier, Christophe; Knoblauch, Kenneth; Cherifi, Hocine
1997-04-01
It is generally accepted that an RGB color image can be easily encoded by using a gray-scale compression technique on each of the three color planes. Such an approach, however, fails to take into account correlations existing between color planes and perceptual factors. We evaluated several linear and non-linear color spaces, some introduced by the CIE, compressed with the vector quantization technique for minimum perceptual distortion. To study these distortions, we measured the contrast and luminance of the video framebuffer to precisely control color. We then obtained psychophysical judgements to measure how well these methods minimize perceptual distortion in a variety of color spaces.
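As an illustration of the core coding step described above, the following sketch quantizes per-pixel RGB color vectors with a k-means codebook; it is a minimal stand-in that works in RGB only and does not reproduce the paper's perceptually weighted color-space comparison. The codebook size and the use of scikit-learn's KMeans are assumptions.

# Hypothetical sketch of vector-quantization image coding on per-pixel RGB
# color vectors; the perceptual color-space evaluation is not reproduced here.
import numpy as np
from sklearn.cluster import KMeans

def vq_encode(image, n_codewords=64):
    """Quantize an (H, W, 3) float image with a k-means codebook."""
    h, w, _ = image.shape
    pixels = image.reshape(-1, 3)
    km = KMeans(n_clusters=n_codewords, n_init=4, random_state=0).fit(pixels)
    indices = km.predict(pixels)                 # per-pixel codeword index
    return indices.reshape(h, w), km.cluster_centers_

def vq_decode(indices, codebook):
    return codebook[indices]                     # map indices back to colors

# usage: idx, book = vq_encode(img); img_hat = vq_decode(idx, book)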
Sandia Higher Order Elements (SHOE) v 0.5 alpha
DOE Office of Scientific and Technical Information (OSTI.GOV)
2013-09-24
SHOE is research code for characterizing and visualizing higher-order finite elements; it contains a framework for defining classes of interpolation techniques and element shapes; methods for interpolating triangular, quadrilateral, tetrahedral, and hexahedral cells using Lagrange and Legendre polynomial bases of arbitrary order; methods to decompose each element into domains of constant gradient flow (using a polynomial solver to identify critical points); and an isocontouring technique that uses this decomposition to guarantee topological correctness. Please note that this is an alpha release of research software and that some time has passed since it was actively developed; build- and run-time issues likely exist.
Applications of digital image processing techniques to problems of data registration and correlation
NASA Technical Reports Server (NTRS)
Green, W. B.
1978-01-01
An overview is presented of the evolution of the computer configuration at JPL's Image Processing Laboratory (IPL). The development of techniques for the geometric transformation of digital imagery is discussed and consideration is given to automated and semiautomated image registration, and the registration of imaging and nonimaging data. The increasing complexity of image processing tasks at IPL is illustrated with examples of various applications from the planetary program and earth resources activities. It is noted that the registration of existing geocoded data bases with Landsat imagery will continue to be important if the Landsat data is to be of genuine use to the user community.
A method for digital image registration using a mathematical programming technique
NASA Technical Reports Server (NTRS)
Yao, S. S.
1973-01-01
A new algorithm based on a nonlinear programming technique to correct the geometrical distortions of one digital image with respect to another is discussed. This algorithm promises to be superior to existing ones in that it is capable of treating localized differential scaling, translational and rotational errors over the whole image plane. A series of piece-wise 'rubber-sheet' approximations are used, constrained in such a manner that a smooth approximation over the entire image can be obtained. The theoretical derivation is included. The result of using the algorithm to register four channel S065 Apollo IX digitized photography over Imperial Valley, California, is discussed in detail.
Magnetic separation techniques in sample preparation for biological analysis: a review.
He, Jincan; Huang, Meiying; Wang, Dongmei; Zhang, Zhuomin; Li, Gongke
2014-12-01
Sample preparation is a fundamental and essential step in almost all analytical procedures, especially for the analysis of complex samples like biological and environmental samples. In past decades, with the advantages of superparamagnetic properties, good biocompatibility and high binding capacity, functionalized magnetic materials have been widely applied in various processes of sample preparation for biological analysis. In this paper, the recent advancements of magnetic separation techniques based on magnetic materials in the field of sample preparation for biological analysis were reviewed. The strategy of magnetic separation techniques was summarized. The synthesis, stabilization and bio-functionalization of magnetic nanoparticles were reviewed in detail. Characterization of magnetic materials was also summarized. Moreover, the applications of magnetic separation techniques for the enrichment of proteins, nucleic acids, cells and bioactive compounds, and for the immobilization of enzymes, were described. Finally, the existing problems and possible future trends of magnetic separation techniques for biological analysis were proposed. Copyright © 2014 Elsevier B.V. All rights reserved.
This technical guidance document is designed to aid in the selection, design, installation and operation of indoor radon reduction techniques using soil depressurization in existing houses. Its emphasis is on active soil depressurization; i.e., on systems that use a fan to depre...
NASA Astrophysics Data System (ADS)
Chiu, L.; Vongsaard, J.; El-Ghazawi, T.; Weinman, J.; Yang, R.; Kafatos, M.
Due to the poor temporal sampling by satellites, data gaps exist in satellite-derived time series of precipitation. This poses a challenge for assimilating rainfall data into forecast models. To yield a continuous time series, the classic image processing technique of digital image morphing has been used. However, the digital morphing technique was applied manually, which is time consuming. In order to avoid human intervention in the process, an automatic procedure for image morphing is needed for real-time operations. For this purpose, the Genetic Algorithm Based Image Registration Automatic Morphing (GRAM) model was developed and tested in this paper. Specifically, an automatic morphing technique was integrated with a Genetic Algorithm and the Feature Based Image Metamorphosis technique to fill in data gaps between satellite coverage. The technique was tested using NOWRAD data, which are generated from the network of NEXRAD radars. Time series of NOWRAD data from storm Floyd, which occurred over the US eastern region on September 16, 1999, at 00:00, 01:00, 02:00, 03:00, and 04:00 am were used. The GRAM technique was applied to data collected at 00:00 and 04:00 am. These images were also manually morphed. Images at 01:00, 02:00 and 03:00 am were interpolated from the GRAM and manual morphing and compared with the original NOWRAD rain rates. The results show that the GRAM technique outperforms manual morphing. The correlation coefficients between the original images and those generated using manual morphing are 0.905, 0.900, and 0.905 for the images at 01:00, 02:00, and 03:00 am, while the corresponding correlation coefficients based on the GRAM technique are 0.946, 0.911, and 0.913, respectively. Index terms: Remote Sensing, Image Registration, Hydrology, Genetic Algorithm, Morphing, NEXRAD
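The comparison metric quoted above is a spatial correlation between an interpolated image and the reference radar field. A minimal sketch, assuming both fields are NumPy arrays on the same grid, follows.

# Minimal sketch (assumed array inputs): Pearson correlation between an
# interpolated rain field and the reference NOWRAD field.
import numpy as np

def field_correlation(estimate, reference):
    a = estimate.ravel().astype(float)
    b = reference.ravel().astype(float)
    return np.corrcoef(a, b)[0, 1]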
New GMO regulations for old: Determining a new future for EU crop biotechnology.
Davison, John; Ammann, Klaus
2017-01-02
In this review, current EU GMO regulations are subjected to a point-by-point analysis to determine their suitability for agriculture in modern Europe. Our analysis concerns present GMO regulations as well as suggestions for possible new regulations for genome editing and New Breeding Techniques (for which no regulations presently exist). Firstly, the present GMO regulations stem from the early days of recombinant DNA and are not adapted to current scientific understanding on this subject. Scientific understanding of GMOs has changed, and these regulations are now not only unfit for their original purpose, but the purpose itself is no longer scientifically valid. Indeed, they defy scientific, economic, and even common sense. A major EU regulatory preconception is that GM crops are basically different from their parent crops. Thus, the EU regulations are "process based" regulations that discriminate against GMOs simply because they are GMOs. However, current scientific evidence shows a blending of classical crops and their GMO counterparts with no clear demarcation line between them. Canada has a "product based" approach and determines the safety of each new crop variety independently of the process used to obtain it. We advise that the EC re-writes its outdated regulations and moves toward such a product based approach. Secondly, over the last few years new genomic editing techniques (sometimes called New Breeding Techniques) have evolved. These techniques are basically mutagenesis techniques that can generate genomic diversity and have vast potential for crop improvement. They are not GMO based techniques (any more than mutagenesis is a GMO technique), since in many cases no new DNA is introduced. Thus they cannot simply be lumped together with GMOs (as many anti-GMO NGOs would prefer). The EU currently has no regulations to cover these new techniques. In this review, we make suggestions as to how these new gene edited crops may be regulated. The EU is at a turning point where the wrong decision could destroy European agricultural competitiveness for decades to come.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gibson, Adam; Piquette, Kathryn E.; Bergmann, Uwe; ...
2018-02-26
Ancient Egyptian mummies were often covered with an outer casing, panels and masks made from cartonnage: a lightweight material made from linen, plaster, and recycled papyrus held together with adhesive. Egyptologists, papyrologists, and historians aim to recover and read extant text on the papyrus contained within cartonnage layers, but some methods, such as dissolving mummy casings, are destructive. The use of an advanced range of different imaging modalities was investigated to test the feasibility of non-destructive approaches applied to multi-layered papyrus found in ancient Egyptian mummy cartonnage. Eight different techniques were compared by imaging four synthetic phantoms designed to provide robust, well-understood, yet relevant sample standards using modern papyrus and replica inks. The techniques include optical (multispectral imaging with reflection and transillumination, and optical coherence tomography), X-ray (X-ray fluorescence imaging, X-ray fluorescence spectroscopy, X-ray micro computed tomography and phase contrast X-ray) and terahertz-based approaches. Optical imaging techniques were able to detect inks on all four phantoms, but were unable to significantly penetrate papyrus. X-ray-based techniques were sensitive to iron-based inks with excellent penetration but were not able to detect carbon-based inks. However, using terahertz imaging, it was possible to detect carbon-based inks with good penetration but with less sensitivity to iron-based inks. The phantoms allowed reliable and repeatable tests to be made at multiple sites on three continents. Finally, the tests demonstrated that each imaging modality needs to be optimised for this particular application: it is, in general, not sufficient to repurpose an existing device without modification. Furthermore, it is likely that no single imaging technique will be able to robustly detect and enable the reading of text within ancient Egyptian mummy cartonnage. However, by carefully selecting, optimising and combining techniques, text contained within these fragile and rare artefacts may eventually be open to non-destructive imaging, identification, and interpretation.
An Intelligent Content Discovery Technique for Health Portal Content Management
2014-01-01
Background: Continuous content management of health information portals is a feature vital for its sustainability and widespread acceptance. Knowledge and experience of a domain expert is essential for content management in the health domain. The rate of generation of online health resources is exponential, and thereby manual examination for relevance to a specific topic and audience is a formidable challenge for domain experts. Intelligent content discovery for effective content management is a less researched topic. An existing expert-endorsed content repository can provide the necessary leverage to automatically identify relevant resources and evaluate qualitative metrics. Objective: This paper reports on the design research towards an intelligent technique for automated content discovery and ranking for health information portals. The proposed technique aims to improve efficiency of the current mostly manual process of portal content management by utilising an existing expert-endorsed content repository as a supporting base and a benchmark to evaluate the suitability of new content. Methods: A model for content management was established based on a field study of potential users. The proposed technique is integral to this content management model and executes in several phases (ie, query construction, content search, text analytics and fuzzy multi-criteria ranking). The construction of multi-dimensional search queries with input from Wordnet, the use of multi-word and single-word terms as representative semantics for text analytics and the use of fuzzy multi-criteria ranking for subjective evaluation of quality metrics are original contributions reported in this paper. Results: The feasibility of the proposed technique was examined with experiments conducted on an actual health information portal, the BCKOnline portal. Both intermediary and final results generated by the technique are presented in the paper and these help to establish benefits of the technique and its contribution towards effective content management. Conclusions: The prevalence of large numbers of online health resources is a key obstacle for domain experts involved in content management of health information portals and websites. The proposed technique has proven successful at search and identification of resources and the measurement of their relevance. It can be used to support the domain expert in content management and thereby ensure the health portal is up-to-date and current. PMID:25654440
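The fuzzy multi-criteria ranking phase could, in a much-simplified form, look like the sketch below: each candidate resource receives criterion scores in [0, 1] that are fuzzified with a triangular membership function and aggregated with expert weights. The membership parameters and weights are illustrative assumptions, not values from the paper.

# Hypothetical sketch of a fuzzy multi-criteria ranking step; criteria names,
# membership shape and weights are assumed for illustration only.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function on scores x."""
    return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

def rank_resources(scores, weights):
    """scores: (n_resources, n_criteria) in [0, 1]; weights: (n_criteria,)."""
    high = tri(scores, 0.5, 1.0, 1.5)            # membership in "high quality"
    overall = high @ (weights / weights.sum())   # weighted fuzzy aggregation
    return np.argsort(-overall), overall         # ranking (best first) and scores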
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aziz, H. M. Abdul; Ukkusuri, Satish V.
2017-06-29
EPA-MOVES (Motor Vehicle Emission Simulator) is often integrated with traffic simulators to assess emission levels of large-scale urban networks with signalized intersections. High variations in speed profiles exist in the context of congested urban networks with signalized intersections. The traditional average-speed-based emission estimation technique with EPA-MOVES provides faster execution but underestimates the emissions in most cases because it ignores the speed variation at congested networks with signalized intersections. In contrast, the atomic second-by-second speed profile (i.e., the trajectory of each vehicle)-based technique provides accurate emissions at the cost of excessive computational power and time. We addressed this issue by developing a novel method to determine the link-driving-schedules (LDSs) for the EPA-MOVES tool. Our research developed a hierarchical clustering technique with dynamic time warping similarity measures (HC-DTW) to find the LDS for EPA-MOVES that is capable of producing emission estimates better than the average-speed-based technique with execution time faster than the atomic speed profile approach. We applied HC-DTW on sample data from a signalized corridor and found that it can significantly reduce computational time without compromising accuracy. The developed technique can substantially contribute to the EPA-MOVES-based emission estimation process for large-scale urban transportation networks by reducing the computational time with reasonably accurate estimates. This method is highly appropriate for transportation networks with higher variation in speed, such as signalized intersections. Lastly, experimental results show error differences ranging from 2% to 8% for most pollutants except PM10.
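A hedged sketch of the HC-DTW idea follows: pairwise dynamic-time-warping distances between per-vehicle speed profiles feed an average-linkage hierarchical clustering, whose clusters could then be summarized into representative link driving schedules. The distance definition, linkage method and cluster count are assumptions for illustration.

# Illustrative sketch of hierarchical clustering over DTW distances between
# speed profiles; not the authors' exact LDS construction.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def dtw(a, b):
    """Classic O(n*m) dynamic time warping distance between two 1-D profiles."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

def cluster_profiles(profiles, n_clusters=5):
    """profiles: list of 1-D speed arrays (possibly different lengths)."""
    n = len(profiles)
    condensed = [dtw(profiles[i], profiles[j]) for i in range(n) for j in range(i + 1, n)]
    Z = linkage(condensed, method="average")            # average-linkage clustering
    return fcluster(Z, t=n_clusters, criterion="maxclust")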
Schellenberg, Florian; Oberhofer, Katja; Taylor, William R.
2015-01-01
Background. Knowledge of the musculoskeletal loading conditions during strength training is essential for performance monitoring, injury prevention, rehabilitation, and training design. However, measuring muscle forces during exercise performance as a primary determinant of training efficacy and safety has remained challenging. Methods. In this paper we review existing computational techniques to determine muscle forces in the lower limbs during strength exercises in vivo and discuss their potential for uptake into sports training and rehabilitation. Results. Muscle forces during exercise performance have almost exclusively been analysed using so-called forward dynamics simulations, inverse dynamics techniques, or alternative methods. Musculoskeletal models based on forward dynamics analyses have led to considerable new insights into muscular coordination, strength, and power during dynamic ballistic movement activities, resulting in, for example, improved techniques for optimal performance of the squat jump, while quasi-static inverse dynamics optimisation and EMG-driven modelling have helped to provide an understanding of low-speed exercises. Conclusion. The present review introduces the different computational techniques and outlines their advantages and disadvantages for the informed usage by nonexperts. With sufficient validation and widespread application, muscle force calculations during strength exercises in vivo are expected to provide biomechanically based evidence for clinicians and therapists to evaluate and improve training guidelines. PMID:26417378
Schellenberg, Florian; Oberhofer, Katja; Taylor, William R; Lorenzetti, Silvio
2015-01-01
Knowledge of the musculoskeletal loading conditions during strength training is essential for performance monitoring, injury prevention, rehabilitation, and training design. However, measuring muscle forces during exercise performance as a primary determinant of training efficacy and safety has remained challenging. In this paper we review existing computational techniques to determine muscle forces in the lower limbs during strength exercises in vivo and discuss their potential for uptake into sports training and rehabilitation. Muscle forces during exercise performance have almost exclusively been analysed using so-called forward dynamics simulations, inverse dynamics techniques, or alternative methods. Musculoskeletal models based on forward dynamics analyses have led to considerable new insights into muscular coordination, strength, and power during dynamic ballistic movement activities, resulting in, for example, improved techniques for optimal performance of the squat jump, while quasi-static inverse dynamics optimisation and EMG-driven modelling have helped to provide an understanding of low-speed exercises. The present review introduces the different computational techniques and outlines their advantages and disadvantages for the informed usage by nonexperts. With sufficient validation and widespread application, muscle force calculations during strength exercises in vivo are expected to provide biomechanically based evidence for clinicians and therapists to evaluate and improve training guidelines.
Optimization of Surfactant Mixtures and Their Interfacial Behavior for Advanced Oil Recovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somasundaran, Prof. P.
2002-03-04
The objective of this project was to develop a knowledge base that is helpful for the design of improved processes for mobilizing and producing oil left untapped using conventional techniques. The main goal was to develop and evaluate mixtures of new or modified surfactants for improved oil recovery. In this regard, interfacial properties of novel biodegradable n-alkyl pyrrolidones and sugar-based surfactants have been studied systematically. Emphasis was on designing cost-effective processes compatible with existing conditions and operations in addition to ensuring minimal reagent loss.
A Third-Party E-payment Protocol Based on Quantum Multi-proxy Blind Signature
NASA Astrophysics Data System (ADS)
Niu, Xu-Feng; Zhang, Jian-Zhong; Xie, Shu-Cui; Chen, Bu-Qing
2018-05-01
A third-party E-payment protocol is presented in this paper. It is based on a quantum multi-proxy blind signature. By adopting the techniques of quantum key distribution, one-time pad and quantum multi-proxy blind signature, our third-party E-payment system can protect users' anonymity, as traditional E-payment systems do, and also provide unconditional security, which classical E-payment systems cannot offer. Furthermore, compared with existing quantum E-payment systems, the proposed system can support E-payment through third-party platforms.
An E-payment system based on quantum group signature
NASA Astrophysics Data System (ADS)
Xiaojun, Wen
2010-12-01
Security and anonymity are essential to E-payment systems. However, existing E-payment systems will soon be easily broken with the emergence of quantum computers. In this paper, we propose an E-payment system based on quantum group signature. In contrast to classical E-payment systems, our quantum E-payment system can protect not only the users' anonymity but also the inner structure of customer groups. By adopting the techniques of quantum key distribution, one-time pad and quantum group signature, the unconditional security of our E-payment system is guaranteed.
Damage assessment in multilayered MEMS structures under thermal fatigue
NASA Astrophysics Data System (ADS)
Maligno, A. R.; Whalley, D. C.; Silberschmidt, V. V.
2011-07-01
This paper reports on the application of a Physics of Failure (PoF) methodology to assessing the reliability of a micro electro mechanical system (MEMS). Numerical simulations, based on the finite element method (FEM) with a sub-domain approach, were used to examine damage onset due to temperature variations (e.g. yielding of metals, which may lead to thermal fatigue). In this work, remeshing techniques were employed in order to develop a damage tolerance approach based on the assumption that initial flaws exist in the multi-layered structure.
New adhesives and bonding techniques. Why and when?
Scotti, Nicola; Cavalli, Giovanni; Gagliani, Massimo; Breschi, Lorenzo
Nowadays, adhesive dentistry is a fundamental part of daily clinical work. The evolution of adhesive materials and techniques has been based on the need for simplicity in the step-by-step procedures to obtain long-lasting direct and indirect restorations. For this reason, recently introduced universal multimode adhesives represent a simple option for creating a hybrid layer, with or without the use of phosphoric acid application. However, it is important to understand the limitations of this latest generation of adhesive systems as well as how to use them on coronal and radicular dentin. Based on the findings in the literature, universal multimode adhesives have shown promising results, even if the problem of hybrid layer degradation due to the hydrolytic activity of matrix metalloproteinases (MMPs) still exists. Studies are therefore required to help us understand how to reduce this degradation.
Fiber-MZI-based FBG sensor interrogation: comparative study with a CCD spectrometer.
Das, Bhargab; Chandra, Vikash
2016-10-10
We present an experimental comparative study of the two most commonly used fiber Bragg grating (FBG) sensor interrogation techniques: a charge-coupled device (CCD) spectrometer and a fiber Mach-Zehnder interferometer (F-MZI). Although the interferometric interrogation technique is historically known to offer the highest sensitivity measurements, very little information exists regarding how it compares with the current commercially available spectral-characteristics-based interrogation systems. It is experimentally established here that the performance of a modern-day CCD spectrometer interrogator is very close to a F-MZI interrogator with the capability of measuring Bragg wavelength shifts with sub-picometer-level accuracy. The results presented in this research study can further be used as a guideline for choosing between the two FBG sensor interrogator types for small-amplitude dynamic perturbation measurements down to nano-level strain.
NASA Astrophysics Data System (ADS)
Zheng, J.; Zhu, J.; Wang, Z.; Fang, F.; Pain, C. C.; Xiang, J.
2015-06-01
A new anisotropic hr-adaptive mesh technique has been applied to the modelling of multiscale transport phenomena, which is based on a discontinuous Galerkin/control volume discretization on unstructured meshes. Compared with existing air quality models, which are typically based on static structured grids with a local nesting technique, the anisotropic hr-adaptive model has the advantage of being able to adapt the mesh according to the evolving pollutant distribution and flow features. That is, the mesh resolution can be adjusted dynamically to simulate the pollutant transport process accurately and effectively. To illustrate the capability of the anisotropic adaptive unstructured mesh model, three benchmark numerical experiments have been set up for two-dimensional (2-D) transport phenomena. Comparisons have been made between the results obtained using uniform resolution meshes and anisotropic adaptive resolution meshes.
Electron spin resonance of nitrogen-vacancy centers in optically trapped nanodiamonds
Horowitz, Viva R.; Alemán, Benjamín J.; Christle, David J.; Cleland, Andrew N.; Awschalom, David D.
2012-01-01
Using an optical tweezers apparatus, we demonstrate three-dimensional control of nanodiamonds in solution with simultaneous readout of ground-state electron-spin resonance (ESR) transitions in an ensemble of diamond nitrogen-vacancy color centers. Despite the motion and random orientation of nitrogen-vacancy centers suspended in the optical trap, we observe distinct peaks in the measured ESR spectra qualitatively similar to the same measurement in bulk. Accounting for the random dynamics, we model the ESR spectra observed in an externally applied magnetic field to enable dc magnetometry in solution. We estimate the dc magnetic field sensitivity based on variations in ESR line shapes to be approximately . This technique may provide a pathway for spin-based magnetic, electric, and thermal sensing in fluidic environments and biophysical systems inaccessible to existing scanning probe techniques. PMID:22869706
Computer Aided Diagnostic Support System for Skin Cancer: A Review of Techniques and Algorithms
Masood, Ammara; Al-Jumaily, Adel Ali
2013-01-01
Image-based computer aided diagnosis systems have significant potential for screening and early detection of malignant melanoma. We review the state of the art in these systems and examine current practices, problems, and prospects of image acquisition, pre-processing, segmentation, feature extraction and selection, and classification of dermoscopic images. This paper reports statistics and results from the most important implementations reported to date. We compared the performance of several classifiers specifically developed for skin lesion diagnosis and discussed the corresponding findings. Whenever available, indication of various conditions that affect the technique's performance is reported. We suggest a framework for comparative assessment of skin cancer diagnostic models and review the results based on these models. The deficiencies in some of the existing studies are highlighted and suggestions for future research are provided. PMID:24575126
Biomedical named entity extraction: some issues of corpus compatibilities.
Ekbal, Asif; Saha, Sriparna; Sikdar, Utpal Kumar
2013-01-01
Named Entity (NE) extraction is one of the most fundamental and important tasks in biomedical information extraction. It involves identification of certain entities from text and their classification into some predefined categories. In the biomedical community, there is as yet no general consensus regarding named entity (NE) annotation; thus, it is very difficult to compare the existing systems due to corpus incompatibilities. Because of this problem, we also cannot exploit the advantages of using different corpora together. In our present work we address the issues of corpus compatibilities, and use a single objective optimization (SOO) based classifier ensemble technique that uses the search capability of a genetic algorithm (GA) for NE extraction in biomedicine. We hypothesize that the reliability of predictions of each classifier differs among the various output classes. We use Conditional Random Field (CRF) and Support Vector Machine (SVM) frameworks to build a number of models depending upon the various representations of the set of features and/or feature templates. It is to be noted that we tried to extract the features without using any deep domain knowledge and/or resources. In order to assess the challenges of corpus compatibilities, we experiment with the different benchmark datasets and their various combinations. Comparison results with the existing approaches prove the efficacy of the used technique. The GA based ensemble achieves around 2% performance improvement over the individual classifiers. Degradation in performance on the integrated corpus clearly shows the difficulties of the task. In summary, our ensemble based approach attains state-of-the-art performance levels for entity extraction in three different kinds of biomedical datasets. The possible reasons behind the better performance of our approach are (i) the use of a variety of rich features, as described in the Subsection "Features for named entity extraction"; and (ii) the use of a GA based classifier ensemble technique to combine the outputs of multiple classifiers.
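To make the ensemble idea concrete, the sketch below searches per-classifier, per-class voting weights with a toy genetic algorithm whose fitness is plain accuracy on held-out labels. The paper's CRF/SVM base models and its actual GA design and fitness measure are not reproduced; all parameter values here are assumptions.

# Hedged sketch of GA-searched per-class voting weights for a classifier
# ensemble; illustration only, not the paper's exact algorithm.
import numpy as np

rng = np.random.default_rng(0)

def combine(preds, weights, n_classes):
    """preds: (n_clf, n_samples) integer labels; weights: (n_clf, n_classes)."""
    n_clf, n_samples = preds.shape
    votes = np.zeros((n_samples, n_classes))
    for c in range(n_clf):
        votes[np.arange(n_samples), preds[c]] += weights[c, preds[c]]
    return votes.argmax(axis=1)

def ga_weights(preds, y, n_classes, pop=30, gens=40, sigma=0.1):
    shape = (preds.shape[0], n_classes)
    population = rng.random((pop,) + shape)
    for _ in range(gens):
        fitness = np.array([(combine(preds, w, n_classes) == y).mean()
                            for w in population])
        parents = population[np.argsort(-fitness)[: pop // 2]]          # selection
        picks = rng.integers(0, len(parents), (pop - len(parents), 2))
        mask = rng.random((pop - len(parents),) + shape) < 0.5
        children = np.where(mask, parents[picks[:, 0]], parents[picks[:, 1]])  # crossover
        children = np.clip(children + rng.normal(0, sigma, children.shape), 0, 1)  # mutation
        population = np.vstack([parents, children])
    fitness = np.array([(combine(preds, w, n_classes) == y).mean() for w in population])
    return population[np.argmax(fitness)]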
NASA Technical Reports Server (NTRS)
Lih, Shyh-Shiuh; Bar-Cohen, Yoseph; Lee, Hyeong Jae; Takano, Nobuyuki; Bao, Xiaoqi
2013-01-01
An advanced signal processing methodology is being developed to monitor the height of condensed water through the wall of a steel pipe while operating at temperatures as high as 250°. Using existing techniques, a previous study indicated that, when the water height is low or there is disturbance in the environment, the predicted water height may not be accurate. In recent years, the use of autocorrelation and envelope techniques in signal processing has been demonstrated to be a very useful tool for practical applications. In this paper, various signal processing techniques, including autocorrelation, the Hilbert transform, and the Shannon Energy Envelope method, were studied and implemented to determine the water height in the steam pipe. The results have shown that the developed method provides a good capability for monitoring the height under regular conditions. For shallow-water or no-water conditions, an alternative solution is suggested, based on a hybrid method that combines the Hilbert transform (HT) with a high-pass filter and an optimized windowing technique. Further development of the reported methods would provide a powerful tool for the identification of disturbances of the water height inside the pipe.
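In the spirit of the hybrid method described above, the following minimal sketch high-pass filters a one-dimensional ultrasonic echo trace and takes a Hilbert-transform envelope; the filter order, cutoff and sampling values are placeholders, not the authors' settings.

# Minimal sketch, assuming a 1-D echo trace sampled at fs Hz; placeholder cutoff.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def echo_envelope(trace, fs, cutoff_hz=1.0e5):
    b, a = butter(4, cutoff_hz / (fs / 2), btype="highpass")
    filtered = filtfilt(b, a, trace)           # suppress low-frequency drift
    return np.abs(hilbert(filtered))           # analytic-signal envelope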
NASA Astrophysics Data System (ADS)
Wang, L.-P.; Ochoa-Rodríguez, S.; Onof, C.; Willems, P.
2015-09-01
Gauge-based radar rainfall adjustment techniques have been widely used to improve the applicability of radar rainfall estimates to large-scale hydrological modelling. However, their use for urban hydrological applications is limited as they were mostly developed based upon Gaussian approximations and therefore tend to smooth off so-called "singularities" (features of a non-Gaussian field) that can be observed in the fine-scale rainfall structure. Overlooking the singularities could be critical, given that their distribution is highly consistent with that of local extreme magnitudes. This deficiency may cause large errors in the subsequent urban hydrological modelling. To address this limitation and improve the applicability of adjustment techniques at urban scales, a method is proposed herein which incorporates a local singularity analysis into existing adjustment techniques and allows the preservation of the singularity structures throughout the adjustment process. In this paper the proposed singularity analysis is incorporated into the Bayesian merging technique and the performance of the resulting singularity-sensitive method is compared with that of the original Bayesian (non singularity-sensitive) technique and the commonly used mean field bias adjustment. This test is conducted using as case study four storm events observed in the Portobello catchment (53 km2) (Edinburgh, UK) during 2011 and for which radar estimates, dense rain gauge and sewer flow records, as well as a recently calibrated urban drainage model were available. The results suggest that, in general, the proposed singularity-sensitive method can effectively preserve the non-normality in local rainfall structure, while retaining the ability of the original adjustment techniques to generate nearly unbiased estimates. Moreover, the ability of the singularity-sensitive technique to preserve the non-normality in rainfall estimates often leads to better reproduction of the urban drainage system's dynamics, particularly of peak runoff flows.
ERIC Educational Resources Information Center
Gerdin, Göran; Pringle, Richard
2017-01-01
Kirk warns that physical education (PE) exists in a precarious situation as the dominance of the multi-activity sport-techniques model, and its associated problems, threatens the long-term educational survival of PE. Yet he also notes that although the model is problematic it is highly resistant to change. In this paper, we draw on the results of…
[Hygienic and ergonomic analysis of the technology for sinking main and subsidiary mine shafts].
Meniaĭlo, N I; Tyshlek, E G; Gritsenko, V S; Shemiakin, G M
1989-01-01
The labour conditions in mine shafts do not correspond to the existing ergonomic and hygienic norms. Drilling and blasting techniques are the most hazardous in terms of the severity and duration of the factors involved. The normalization of working conditions should be based on the development of innovative technologies that envisage only periodic presence of the workers in the mine shaft area during the work shift.
Fowler, Dawnovise N; Faulkner, Monica
2011-12-01
In this article, meta-analytic techniques are used to examine existing intervention studies (n = 11) to determine their effects on substance abuse among female samples of intimate partner abuse (IPA) survivors. This research serves as a starting point for greater attention in research and practice to the implementation of evidence-based, integrated services to address co-occurring substance abuse and IPA victimization among women as major intersecting public health problems. The results show greater effects in three main areas. First, greater effect sizes exist in studies where larger numbers of women experienced current IPA. Second, studies with a lower mean age also showed greater effect sizes than studies with a higher mean age. Lastly, studies with smaller sample sizes have greater effects. This research helps to facilitate cohesion in the knowledge base on this topic, and the findings of this meta-analysis, in particular, contribute needed information to gaps in the literature on the level of promise of existing interventions to impact substance abuse in this underserved population. Published by Elsevier Inc.
Speed scanning system based on solid-state microchip laser for architectural planning
NASA Astrophysics Data System (ADS)
Redka, Dmitriy; Grishkanich, Alexsandr S.; Kolmakov, Egor; Tsvetkov, Konstantin
2017-10-01
According to the current great interest concerning Large-Scale Metrology applications in many different fields of manufacturing industry, technologies and techniques for dimensional measurement have recently shown a substantial improvement. Ease-of-use, logistic and economic issues, as well as metrological performance, are assuming a more and more important role among system requirements. The project plans to conduct experimental studies aimed at identifying the impact of the application of the basic laws of microlasers as radiators on the linear-angular characteristics of existing measurement systems. The system consists of a distributed network-based layout, whose modularity allows it to fit differently sized and shaped working volumes by adequately increasing the number of sensing units. Differently from existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data elaboration capabilities, in order to share the overall computational load.
Coordinate measuring system based on microchip lasers for reverse prototyping
NASA Astrophysics Data System (ADS)
Iakovlev, Alexey; Grishkanich, Alexsandr S.; Redka, Dmitriy; Tsvetkov, Konstantin
2017-02-01
According to the current great interest concerning Large-Scale Metrology applications in many different fields of manufacturing industry, technologies and techniques for dimensional measurement have recently shown a substantial improvement. Ease-of-use, logistic and economic issues, as well as metrological performance, are assuming a more and more important role among system requirements. The project plans to conduct experimental studies aimed at identifying the impact of the application of the basic laws of chip and microlasers as radiators on the linear-angular characteristics of existing measurement systems. The system consists of a distributed network-based layout, whose modularity allows it to fit differently sized and shaped working volumes by adequately increasing the number of sensing units. Differently from existing spatially distributed metrological instruments, the remote sensor devices are intended to provide embedded data elaboration capabilities, in order to share the overall computational load.
Adaptive Batch Mode Active Learning.
Chakraborty, Shayok; Balasubramanian, Vineeth; Panchanathan, Sethuraman
2015-08-01
Active learning techniques have gained popularity to reduce human effort in labeling data instances for inducing a classifier. When faced with large amounts of unlabeled data, such algorithms automatically identify the exemplar and representative instances to be selected for manual annotation. More recently, there have been attempts toward a batch mode form of active learning, where a batch of data points is simultaneously selected from an unlabeled set. Real-world applications require adaptive approaches for batch selection in active learning, depending on the complexity of the data stream in question. However, the existing work in this field has primarily focused on static or heuristic batch size selection. In this paper, we propose two novel optimization-based frameworks for adaptive batch mode active learning (BMAL), where the batch size as well as the selection criteria are combined in a single formulation. We exploit gradient-descent-based optimization strategies as well as properties of submodular functions to derive the adaptive BMAL algorithms. The solution procedures have the same computational complexity as existing state-of-the-art static BMAL techniques. Our empirical results on the widely used VidTIMIT and the mobile biometric (MOBIO) data sets portray the efficacy of the proposed frameworks and also certify the potential of these approaches in being used for real-world biometric recognition applications.
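As a simplified stand-in for the batch selection step, the sketch below greedily trades classifier uncertainty against redundancy with points already in the batch. The paper's gradient-descent and submodular formulations, and its adaptive batch sizing, are not reproduced, and the weighting factor is an assumption.

# Illustrative greedy batch selection balancing uncertainty and diversity;
# not the adaptive BMAL optimisation described in the abstract.
import numpy as np

def select_batch(probs, features, batch_size, redundancy_weight=0.5):
    """probs: (n, n_classes) predicted probabilities; features: (n, d)."""
    uncertainty = 1.0 - probs.max(axis=1)                     # least-confidence score
    feat = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-9)
    chosen = []
    for _ in range(batch_size):
        if chosen:
            redundancy = (feat @ feat[chosen].T).max(axis=1)  # cosine similarity to batch
        else:
            redundancy = np.zeros(len(feat))
        score = uncertainty - redundancy_weight * redundancy
        score[chosen] = -np.inf                               # never pick twice
        chosen.append(int(score.argmax()))
    return chosen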
A frame selective dynamic programming approach for noise robust pitch estimation.
Yarra, Chiranjeevi; Deshmukh, Om D; Ghosh, Prasanta Kumar
2018-04-01
The principles of the existing pitch estimation techniques are often different and complementary in nature. In this work, a frame selective dynamic programming (FSDP) method is proposed which exploits the complementary characteristics of two existing methods, namely, sub-harmonic to harmonic ratio (SHR) and sawtooth-wave inspired pitch estimator (SWIPE). Using variants of SHR and SWIPE, the proposed FSDP method classifies all the voiced frames into two classes: the first class consists of the frames where a confidence score maximization criterion is used for pitch estimation, while for the second class, a dynamic programming (DP) based approach is proposed. Experiments are performed on speech signals separately from KEELE, CSLU, and PaulBaghsaw corpora under clean and additive white Gaussian noise at 20, 10, 5, and 0 dB SNR conditions using four baseline schemes including SHR, SWIPE, and two DP based techniques. The pitch estimation performance of FSDP, when averaged over all SNRs, is found to be better than those of the baseline schemes suggesting the benefit of applying smoothness constraint using DP in selected frames in the proposed FSDP scheme. The VuV classification error from FSDP is also found to be lower than that from all four baseline schemes in almost all SNR conditions on three corpora.
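A minimal sketch of the dynamic-programming step is given below: one pitch candidate is chosen per frame so that candidate confidence stays high while frame-to-frame jumps stay small. Candidate generation from the SHR and SWIPE variants is assumed to have happened upstream, and the jump weight is an illustrative value.

# Hedged sketch of DP-based pitch track smoothing over per-frame candidates.
import numpy as np

def dp_pitch_track(candidates, confidences, jump_weight=0.01):
    """candidates, confidences: (n_frames, n_cand) arrays."""
    n_frames, n_cand = candidates.shape
    cost = -np.log(confidences + 1e-9)                 # prefer confident candidates
    back = np.zeros((n_frames, n_cand), dtype=int)
    total = cost[0].copy()
    for t in range(1, n_frames):
        trans = jump_weight * np.abs(candidates[t][None, :] - candidates[t - 1][:, None])
        step = total[:, None] + trans                  # (prev_cand, cur_cand)
        back[t] = step.argmin(axis=0)
        total = step.min(axis=0) + cost[t]
    path = [int(total.argmin())]                       # backtrack best path
    for t in range(n_frames - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return candidates[np.arange(n_frames), np.array(path[::-1])]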
Ji, Hongwei; He, Jiangping; Yang, Xin; Deklerck, Rudi; Cornelis, Jan
2013-05-01
In this paper, we present an autocontext model (ACM)-based automatic liver segmentation algorithm, which combines ACM, multi-atlases, and mean-shift techniques to segment the liver from 3-D CT images. Our algorithm is a learning-based method and can be divided into two stages. At the first stage, i.e., the training stage, ACM is performed to learn a sequence of classifiers in each atlas space (based on each atlas and other aligned atlases). With the use of multiple atlases, multiple sequences of ACM-based classifiers are obtained. At the second stage, i.e., the segmentation stage, the test image will be segmented in each atlas space by applying each sequence of ACM-based classifiers. The final segmentation result will be obtained by fusing segmentation results from all atlas spaces via a multi-classifier fusion technique. In particular, in order to speed up segmentation, given a test image, we first use an improved mean-shift algorithm to perform over-segmentation and then implement region-based image labeling instead of the original inefficient pixel-based image labeling. The proposed method is evaluated on the datasets of the MICCAI 2007 liver segmentation challenge. The experimental results show that the average volume overlap error and the average surface distance achieved by our method are 8.3% and 1.5 mm, respectively, which are comparable to the results reported in the existing state-of-the-art work on liver segmentation.
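The final multi-classifier fusion can be illustrated, in a much-reduced form, by per-voxel majority voting over the binary masks obtained in each atlas space; the sketch below assumes those masks are already available and does not reproduce the ACM classifiers or the mean-shift over-segmentation.

# Simplified sketch of label fusion by per-voxel majority vote (assumed inputs).
import numpy as np

def fuse_masks(masks):
    """masks: list of (Z, Y, X) binary arrays, one per atlas space."""
    stacked = np.stack(masks).astype(np.uint8)
    return (stacked.mean(axis=0) >= 0.5).astype(np.uint8)   # majority vote per voxel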
Cha, Shi-Cho; Yeh, Kuo-Hui; Chen, Jyun-Fu
2017-10-14
Bluetooth Low Energy (BLE) has emerged as one of the most promising technologies to enable the Internet-of-Things (IoT) paradigm. In BLE-based IoT applications, e.g., wearables-oriented service applications, the Bluetooth MAC addresses of devices will be swapped for device pairings. The random address technique is adopted to prevent malicious users from tracking the victim's devices with stationary Bluetooth MAC addresses and accordingly the device privacy can be preserved. However, there exists a tradeoff between privacy and security in the random address technique. That is, when device pairing is launched and one device cannot actually identify another one with addresses, it provides an opportunity for malicious users to break the system security via impersonation attacks. Hence, using random addresses may lead to higher security risks. In this study, we point out the potential risk of using random address technique and then present critical security requirements for BLE-based IoT applications. To fulfill the claimed requirements, we present a privacy-aware mechanism, which is based on elliptic curve cryptography, for secure communication and access-control among BLE-based IoT objects. Moreover, to ensure the security of smartphone application associated with BLE-based IoT objects, we construct a Smart Contract-based Investigation Report Management framework (SCIRM) which enables smartphone application users to obtain security inspection reports of BLE-based applications of interest with smart contracts.
Wind Gust Measurement Techniques-From Traditional Anemometry to New Possibilities.
Suomi, Irene; Vihma, Timo
2018-04-23
Information on wind gusts is needed for assessment of wind-induced damage and risks to safety. The measurement of wind gust speed requires a high temporal resolution of the anemometer system, because the gust is defined as a short-duration (seconds) maximum of the fluctuating wind speed. Until the digitalization of wind measurements in the 1990s, wind gust measurements suffered from limited recording and data processing resources. Therefore, the majority of continuous wind gust records extend back at most about 30 years. Although the response characteristics of anemometer systems are good enough today, the traditional measurement techniques at weather stations based on cup and sonic anemometers are limited to heights and regions that the supporting structures can reach. Therefore, existing measurements are mainly concentrated over densely populated land areas, whereas from remote locations, such as the marine Arctic, wind gust information is available only from sparse coastal locations. Recent developments of wind gust measurement techniques based on turbulence measurements from research aircraft and from Doppler lidar can potentially provide new information from heights and locations unreachable by traditional measurement techniques. Moreover, fast-developing measurement methods based on Unmanned Aircraft Systems (UASs) may add to better coverage of wind gust measurements in the future. In this paper, we provide an overview of the history and the current status of anemometry from the perspective of wind gusts. Furthermore, a discussion on the potential future directions of wind gust measurement techniques is provided.
Magnetic particle-mediated magnetoreception
Shaw, Jeremy; Boyd, Alastair; House, Michael; Woodward, Robert; Mathes, Falko; Cowin, Gary; Saunders, Martin; Baer, Boris
2015-01-01
Behavioural studies underpin the weight of experimental evidence for the existence of a magnetic sense in animals. In contrast, studies aimed at understanding the mechanistic basis of magnetoreception by determining the anatomical location, structure and function of sensory cells have been inconclusive. In this review, studies attempting to demonstrate the existence of a magnetoreceptor based on the principles of the magnetite hypothesis are examined. Specific attention is given to the range of techniques, and main animal model systems that have been used in the search for magnetite particulates. Anatomical location/cell rarity and composition are identified as two key obstacles that must be addressed in order to make progress in locating and characterizing a magnetite-based magnetoreceptor cell. Avenues for further study are suggested, including the need for novel experimental, correlative, multimodal and multidisciplinary approaches. The aim of this review is to inspire new efforts towards understanding the cellular basis of magnetoreception in animals, which will in turn inform a new era of behavioural research based on first principles. PMID:26333810
Comparison of public peak detection algorithms for MALDI mass spectrometry data analysis.
Yang, Chao; He, Zengyou; Yu, Weichuan
2009-01-06
In mass spectrometry (MS) based proteomic data analysis, peak detection is an essential step for subsequent analysis. Recently, there has been significant progress in the development of various peak detection algorithms. However, neither a comprehensive survey nor an experimental comparison of these algorithms is yet available. The main objective of this paper is to provide such a survey and to compare the performance of single-spectrum-based peak detection methods. In general, we can decompose a peak detection procedure into three consecutive parts: smoothing, baseline correction and peak finding. We first categorize existing peak detection algorithms according to the techniques used in the different phases. Such a categorization reveals the differences and similarities among existing peak detection algorithms. Then, we choose five typical peak detection algorithms to conduct a comprehensive experimental study using both simulation data and real MALDI MS data. The results of the comparison show that the continuous wavelet-based algorithm provides the best average performance.
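The three-part decomposition described above (smoothing, baseline correction, peak finding) can be sketched as follows; the window sizes, the morphological-opening baseline and the signal-to-noise threshold are illustrative assumptions rather than settings from any of the compared algorithms.

# Minimal sketch of a smoothing / baseline-correction / peak-finding pipeline.
import numpy as np
from scipy.ndimage import uniform_filter1d, grey_opening
from scipy.signal import find_peaks

def detect_peaks(intensity, smooth_win=11, baseline_win=201, snr=3.0):
    smoothed = uniform_filter1d(intensity.astype(float), size=smooth_win)  # smoothing
    baseline = grey_opening(smoothed, size=baseline_win)                   # rough baseline
    corrected = smoothed - baseline                                        # baseline correction
    noise = np.median(np.abs(corrected - np.median(corrected)))            # robust noise scale
    peaks, _ = find_peaks(corrected, height=snr * noise)                   # peak finding
    return peaks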
[Application of electronic fence technology based on GIS in Oncomelania hupensis snail monitoring].
Zhi-Hua, Chen; Yi-Sheng, Zhu; Zhi-Qiang, Xue; Xue-Bing, Li; Yi-Min, Ding; Li-Jun, Bi; Kai-Min, Gao; You, Zhang
2017-07-27
To study the application of the Geographic Information System (GIS) based electronic fence technique in Oncomelania hupensis snail monitoring, electronic fences were set around the historical and existing snail environments on the electronic map, the information about snail monitoring and control was linked to each electronic fence, and a snail monitoring information system was established on this basis. The monitoring information was input through computers and smartphones. Electronic fences around the historical and existing snail environments were set on the electronic map (Baidu map), and the snail monitoring information system and a smartphone APP were established. The monitoring information was input and uploaded in real time, and the snail monitoring information was displayed in real time on the Baidu map. By using the electronic fence technology based on GIS, a unique "environment electronic archive" for each monitored snail environment can be established on the electronic map, and real-time, dynamic monitoring and visual management can be realized.
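The core "electronic fence" check can be illustrated with a plain ray-casting point-in-polygon test that flags whether a reported survey coordinate falls inside a fenced snail environment; this is a hypothetical sketch and not the GIS platform's actual API.

# Hypothetical sketch: ray-casting test for a point against a fence polygon.
def inside_fence(point, fence):
    """point: (x, y); fence: list of (x, y) polygon vertices."""
    x, y = point
    inside = False
    n = len(fence)
    for i in range(n):
        x1, y1 = fence[i]
        x2, y2 = fence[(i + 1) % n]
        if (y1 > y) != (y2 > y):                         # edge straddles the ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside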
DebriSat: The New Hypervelocity Impact Test for Satellite Breakup Fragment Characterization
NASA Technical Reports Server (NTRS)
Cowardin, Heather
2015-01-01
DebriSat is intended to replicate a hypervelocity fragmentation event using modern spacecraft materials and construction techniques in order to improve the existing DoD and NASA breakup models, and to be representative of modern LEO satellites. Major design decisions were reviewed and approved by Aerospace subject matter experts from different disciplines. DebriSat includes seven major subsystems: attitude determination and control system (ADCS), command and data handling (C&DH), electrical power system (EPS), payload, propulsion, telemetry tracking and command (TT&C), and thermal management. To reduce cost, most components are emulated based on existing flight hardware designs and fabricated from the same materials. A key laboratory-based test, the Satellite Orbital debris Characterization Impact Test (SOCIT), supporting the development of the DoD and NASA satellite breakup models, was conducted at AEDC in 1992. Breakup models based on SOCIT have supported many applications and have matched on-orbit events reasonably well over the years.
A method to classify schizophrenia using inter-task spatial correlations of functional brain images.
Michael, Andrew M; Calhoun, Vince D; Andreasen, Nancy C; Baum, Stefi A
2008-01-01
The clinical heterogeneity of schizophrenia (scz) and the overlap of self-reported and observed symptoms with other mental disorders make its diagnosis a difficult task. At present no laboratory-based or image-based diagnostic tool for scz exists, and such tools are desired to support existing methods for more precise diagnosis. Functional magnetic resonance imaging (fMRI) is currently employed to identify and correlate cognitive processes related to scz and its symptoms. Fusion of multiple fMRI tasks that probe different cognitive processes may help to better understand the hidden networks of this complex disorder. In this paper we utilize three different fMRI tasks and introduce an approach to classify subjects based on inter-task spatial correlations of brain activation. The technique was applied to groups of patients and controls and its validity was checked with the leave-one-out method. We show that the classification rate increases when information from multiple tasks is combined.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Mary A.; Tangyunyong, Paiboon; Cole, Edward I.
2016-01-14
Laser-based failure analysis techniques demonstrate the ability to quickly and non-intrusively screen deep ultraviolet light-emitting diodes (LEDs) for electrically-active defects. In particular, two laser-based techniques, light-induced voltage alteration and thermally-induced voltage alteration, generate applied voltage maps (AVMs) that provide information on electrically-active defect behavior including turn-on bias, density, and spatial location. Here, multiple commercial LEDs were examined and found to have dark defect signals in the AVM indicating a site of reduced resistance or leakage through the diode. The existence of the dark defect signals in the AVM correlates strongly with an increased forward-bias leakage current. This increased leakage is not present in devices without AVM signals. Transmission electron microscopy analysis of a dark defect signal site revealed a dislocation cluster through the pn junction. The cluster included an open core dislocation. Even though LEDs with few dark AVM defect signals did not correlate strongly with power loss, direct association between increased open core dislocation densities and reduced LED device performance has been presented elsewhere [M. W. Moseley et al., J. Appl. Phys. 117, 095301 (2015)].
Chaudhary, Prem Prashant; Brablcová, Lenka; Buriánková, Iva; Rulík, Martin
2013-09-01
Methanogenic archaeal communities existing in freshwater sediments are responsible for approximately 50 % of the total global emission of methane. This process contributes significantly to global warming and, hence, necessitates interventional control measures to limit its emission. Unfortunately, the diversity and functional interactions of methanogenic populations occurring in these habitats are yet to be fully characterized. Considering several disadvantages of conventional culture-based methodologies, in recent years impetus has been given to molecular biology approaches for determining the community structure of freshwater sedimentary methanogenic archaea. 16S rRNA and methyl coenzyme M reductase (mcrA) gene-based cloning techniques are the first choice for this purpose. In addition, electrophoresis-based (denaturing gradient gel electrophoresis, temperature gradient gel electrophoresis, and terminal restriction fragment length polymorphism) and quantitative real-time polymerase chain reaction techniques have also found extensive applications. These techniques are highly sensitive, rapid, and reliable as compared to traditional culture-dependent approaches. Molecular diversity studies revealed the dominance of the orders Methanomicrobiales and Methanosarcinales of methanogens in freshwater sediments. The present review discusses in detail the status of the diversity of methanogens and the molecular approaches applied in this area of research.
A Study of Feature Combination for Vehicle Detection Based on Image Processing
2014-01-01
Video analytics play a critical role in most recent traffic monitoring and driver assistance systems. In this context, the correct detection and classification of surrounding vehicles through image analysis has been the focus of extensive research in recent years. Most of the work reported on image-based vehicle verification makes use of supervised classification approaches and resorts to techniques such as histograms of oriented gradients (HOG), principal component analysis (PCA), and Gabor filters, among others. Unfortunately, existing approaches are lacking in two respects: first, comparison between methods using a common body of work has not been addressed; second, no study of the combination potential of popular features for vehicle classification has been reported. In this study the performance of the different techniques is first reviewed and compared using a common public database. Then, the combination capabilities of these techniques are explored and a methodology is presented for the fusion of classifiers built upon them, taking into account also the vehicle pose. The study unveils the limitations of single-feature-based classification and makes clear that fusion of classifiers is highly beneficial for vehicle verification. PMID:24672299
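A minimal sketch of feature-specific classification followed by a simple probability-averaging fusion is given below. The random image patches, the choice of HOG plus PCA features, and the averaging rule are illustrative assumptions, not the database, pose handling, or fusion methodology of the study.

```python
# Sketch of single-feature classifiers (HOG and PCA-reduced pixels) fused
# by averaging class probabilities. Data are random placeholders.
import numpy as np
from skimage.feature import hog
from sklearn.decomposition import PCA
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)
images = rng.random((200, 64, 64))           # stand-in 64x64 grayscale patches
labels = rng.integers(0, 2, 200)             # 1 = vehicle, 0 = non-vehicle

hog_feats = np.array([hog(img, orientations=9, pixels_per_cell=(8, 8),
                          cells_per_block=(2, 2)) for img in images])
pix_feats = PCA(n_components=30).fit_transform(images.reshape(len(images), -1))

Xh_tr, Xh_te, Xp_tr, Xp_te, y_tr, y_te = train_test_split(
    hog_feats, pix_feats, labels, test_size=0.3, random_state=0)

clf_hog = SVC(probability=True).fit(Xh_tr, y_tr)
clf_pca = SVC(probability=True).fit(Xp_tr, y_tr)

# Fusion by averaging class probabilities of the two feature-specific models.
p_fused = 0.5 * (clf_hog.predict_proba(Xh_te)[:, 1] +
                 clf_pca.predict_proba(Xp_te)[:, 1])
print("fused accuracy:", np.mean((p_fused > 0.5) == y_te))
```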
Yobbi, D.K.
2000-01-01
A nonlinear least-squares regression technique for estimating ground-water flow model parameters was applied to an existing model of the regional aquifer system underlying west-central Florida. The regression technique minimizes the differences between measured and simulated water levels. Regression statistics, including parameter sensitivities and correlations, were calculated for the reported parameter values in the existing model. Optimal parameter values for selected hydrologic variables of interest were estimated by nonlinear regression. Optimal parameter estimates range from about 0.01 times to about 140 times the reported values. Independently estimating all parameters by nonlinear regression was impossible, given the existing zonation structure and number of observations, because of parameter insensitivity and correlation. Although the model yields parameter values similar to those estimated by other methods and reproduces the measured water levels reasonably accurately, a simpler parameter structure should be considered. Some possible ways of improving model calibration are to: (1) modify the defined parameter-zonation structure by omitting and/or combining parameters to be estimated; (2) carefully eliminate observation data based on evidence that they are likely to be biased; (3) collect additional water-level data; (4) assign values to insensitive parameters; and (5) estimate the most sensitive parameters first and then, using the optimized values for these parameters, estimate the entire data set.
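The core idea, adjusting parameters so that simulated water levels match measurements and inspecting sensitivities and correlations at the optimum, can be sketched as below. The two-parameter "aquifer model" is a hypothetical stand-in for the actual regional flow model, and the data are synthetic.

```python
# Minimal sketch of nonlinear least-squares calibration of a toy head model.
import numpy as np
from scipy.optimize import least_squares

def simulate_heads(params, x):
    """Hypothetical head distribution controlled by transmissivity-like
    and recharge-like parameters (illustration only)."""
    log_T, recharge = params
    return 50.0 - recharge * x ** 2 / np.exp(log_T)

x_obs = np.linspace(0.0, 10.0, 25)                       # observation locations
true_params = np.array([2.0, 0.8])
measured = simulate_heads(true_params, x_obs) \
           + np.random.default_rng(2).normal(0, 0.2, x_obs.size)

def residuals(params):
    # Differences between measured and simulated water levels.
    return measured - simulate_heads(params, x_obs)

fit = least_squares(residuals, x0=np.array([1.0, 0.5]))
print("estimated parameters:", fit.x)

# The Jacobian at the optimum gives the sensitivities used to compute
# parameter correlations and judge which parameters are identifiable.
J = fit.jac
cov = np.linalg.inv(J.T @ J) * np.sum(fit.fun ** 2) / (x_obs.size - len(fit.x))
print("approximate parameter correlation:",
      cov[0, 1] / np.sqrt(cov[0, 0] * cov[1, 1]))
```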
Overview of innovative remediation of emerging contaminants
NASA Astrophysics Data System (ADS)
Keller, A. A.; Adeleye, A. S.; Huang, Y.; Garner, K.
2015-12-01
The application of nanotechnology in drinking water treatment and pollution cleanup is promising, as demonstrated by a number of field-based (pilot and full scale) and bench scale studies. A number of reviews exist for these nanotechnology-based applications; but to better illustrate its importance and guide its development, a direct comparison between traditional treatment technologies and emerging approaches using nanotechnology is needed. In this review, the performances of traditional technologies and nanotechnology for water treatment and environmental remediation were compared with the goal of providing an up-to-date reference on the state of treatment techniques for researchers, industry, and policy makers. Pollutants were categorized into broad classes, and the most cost-effective techniques (traditional and nanotechnology-based) in each category reported in the literature were compared. Where information was available, cost and environmental implications of both technologies were also compared. Traditional treatment technologies were found to currently offer the most cost-effective choices for removal of several common pollutants from drinking water and polluted sites. Nano-based techniques may, however, become important under complicated remediation conditions and for meeting increasingly stringent water quality standards, especially for removal of emerging pollutants and low levels of contaminants. Challenges facing the environmental application of nanotechnology and potential solutions are also discussed.
PyMVPA: A Python toolbox for multivariate pattern analysis of fMRI data
Hanke, Michael; Halchenko, Yaroslav O.; Sederberg, Per B.; Hanson, Stephen José; Haxby, James V.; Pollmann, Stefan
2009-01-01
Decoding patterns of neural activity onto cognitive states is one of the central goals of functional brain imaging. Standard univariate fMRI analysis methods, which correlate cognitive and perceptual function with the blood oxygenation-level dependent (BOLD) signal, have proven successful in identifying anatomical regions based on signal increases during cognitive and perceptual tasks. Recently, researchers have begun to explore new multivariate techniques that have proven to be more flexible, more reliable, and more sensitive than standard univariate analysis. Drawing on the field of statistical learning theory, these new classifier-based analysis techniques possess explanatory power that could provide new insights into the functional properties of the brain. However, unlike the wealth of software packages for univariate analyses, there are few packages that facilitate multivariate pattern classification analyses of fMRI data. Here we introduce a Python-based, cross-platform, and open-source software toolbox, called PyMVPA, for the application of classifier-based analysis techniques to fMRI datasets. PyMVPA makes use of Python's ability to access libraries written in a large variety of programming languages and computing environments to interface with the wealth of existing machine-learning packages. We present the framework in this paper and provide illustrative examples on its usage, features, and programmability. PMID:19184561
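The classifier-based analysis style the toolbox supports can be illustrated with a generic scikit-learn pipeline; this sketch deliberately does not use PyMVPA's own API, and the array shapes, labels, and effect size are synthetic placeholders.

```python
# Generic illustration of cross-validated multivariate decoding of a
# cognitive state from voxel patterns (scikit-learn stand-in, not PyMVPA).
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score, StratifiedKFold

rng = np.random.default_rng(3)
n_trials, n_voxels = 120, 500
X = rng.normal(size=(n_trials, n_voxels))        # one pattern per trial
y = rng.integers(0, 2, n_trials)                 # two experimental conditions
X[y == 1, :20] += 0.5                            # weak multivariate signal

decoder = make_pipeline(StandardScaler(), LinearSVC())
scores = cross_val_score(decoder, X, y,
                         cv=StratifiedKFold(n_splits=5, shuffle=True, random_state=0))
print("cross-validated decoding accuracy: %.2f +/- %.2f"
      % (scores.mean(), scores.std()))
```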
Application of decision rules for empowering of Indonesian telematics services SMEs
NASA Astrophysics Data System (ADS)
Tosida, E. T.; Hairlangga, O.; Amirudin, F.; Ridwanah, M.
2018-03-01
The independence of the telematics field is one of Indonesia's visions for 2024. One effort to achieve it is to empower SMEs in the telematics field. Such empowerment needs a practical, data-centred mechanism, for instance through the National Economic Census (Susenas) database. Based on Susenas, decision rules for determining the provision of assistance to telematics SMEs can be formulated by generating a rule base through classification techniques. The CART-based decision rule model performs better than the C4.5 and ID3 models. The model's high performance is also in line with the regulations applied by the government; this is one of the strengths of the research, because the resulting model is consistent with existing conditions in Indonesia. The rule bases generated by the three classification techniques differ, and the CART rules match the realization of activities in the Ministry of Cooperatives and SMEs. So far, the government has had difficulty in referencing data related to the empowerment of telematics service SMEs. Therefore, the findings of this research can be used as an alternative decision support system for programs that empower SMEs in telematics.
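Deriving a CART-style rule base from census-like attributes can be sketched as follows. The feature names, synthetic records, and eligibility label are hypothetical stand-ins for the Susenas attributes actually used; scikit-learn's Gini-impurity trees follow the CART algorithm.

```python
# Sketch of deriving CART-style decision rules for assistance eligibility.
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(4)
feature_names = ["employees", "annual_revenue", "years_operating", "has_internet"]
X = np.column_stack([
    rng.integers(1, 50, 300),          # employees
    rng.integers(10, 5000, 300),       # annual revenue (placeholder units)
    rng.integers(0, 20, 300),          # years operating
    rng.integers(0, 2, 300),           # has internet access (0/1)
])
# Placeholder label: whether the SME received empowerment assistance.
y = ((X[:, 0] < 10) & (X[:, 1] < 1000)).astype(int)

tree = DecisionTreeClassifier(criterion="gini", max_depth=3, random_state=0).fit(X, y)
print(export_text(tree, feature_names=feature_names))  # human-readable rule base
```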
Progress in EEG-Based Brain Robot Interaction Systems
Li, Mengfan; Niu, Linwei; Xian, Bin; Zeng, Ming; Chen, Genshe
2017-01-01
The most popular noninvasive Brain Robot Interaction (BRI) technology uses the electroencephalogram- (EEG-) based Brain Computer Interface (BCI), to serve as an additional communication channel, for robot control via brainwaves. This technology is promising for elderly or disabled patient assistance with daily life. The key issue of a BRI system is to identify human mental activities, by decoding brainwaves, acquired with an EEG device. Compared with other BCI applications, such as word speller, the development of these applications may be more challenging since control of robot systems via brainwaves must consider surrounding environment feedback in real-time, robot mechanical kinematics, and dynamics, as well as robot control architecture and behavior. This article reviews the major techniques needed for developing BRI systems. In this review article, we first briefly introduce the background and development of mind-controlled robot technologies. Second, we discuss the EEG-based brain signal models with respect to generating principles, evoking mechanisms, and experimental paradigms. Subsequently, we review in detail commonly used methods for decoding brain signals, namely, preprocessing, feature extraction, and feature classification, and summarize several typical application examples. Next, we describe a few BRI applications, including wheelchairs, manipulators, drones, and humanoid robots with respect to synchronous and asynchronous BCI-based techniques. Finally, we address some existing problems and challenges with future BRI techniques. PMID:28484488
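The preprocessing, feature extraction, and classification stages reviewed above can be compressed into a small sketch. The signals, channel count, frequency band, and classifier below are illustrative assumptions for a two-class motor-imagery-style decoder, not a specific BRI system from the review.

```python
# Compressed sketch: band-pass filtering -> log band-power features -> LDA.
import numpy as np
from scipy.signal import butter, filtfilt, welch
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

fs, n_trials, n_channels, n_samples = 250, 80, 8, 500
rng = np.random.default_rng(5)
eeg = rng.normal(size=(n_trials, n_channels, n_samples))   # synthetic trials
labels = rng.integers(0, 2, n_trials)

# 1) Preprocessing: band-pass filter to the mu/beta range (8-30 Hz).
b, a = butter(4, [8 / (fs / 2), 30 / (fs / 2)], btype="band")
eeg_filt = filtfilt(b, a, eeg, axis=-1)

# 2) Feature extraction: log band power per channel via Welch's method.
freqs, psd = welch(eeg_filt, fs=fs, nperseg=250, axis=-1)
band = (freqs >= 8) & (freqs <= 30)
features = np.log(psd[:, :, band].mean(axis=-1))            # (trials, channels)

# 3) Classification: linear discriminant analysis with cross-validation.
scores = cross_val_score(LinearDiscriminantAnalysis(), features, labels, cv=5)
print("decoding accuracy: %.2f" % scores.mean())
```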
High-throughput electrical characterization for robust overlay lithography control
NASA Astrophysics Data System (ADS)
Devender, Devender; Shen, Xumin; Duggan, Mark; Singh, Sunil; Rullan, Jonathan; Choo, Jae; Mehta, Sohan; Tang, Teck Jung; Reidy, Sean; Holt, Jonathan; Kim, Hyung Woo; Fox, Robert; Sohn, D. K.
2017-03-01
Realizing sensitive, high-throughput, and robust overlay measurement is a challenge at the current 14 nm node and in advanced upcoming nodes, with the transition to 300 mm and upcoming 450 mm semiconductor manufacturing, where slight deviations in overlay have a significant impact on reliability and yield [1]. The exponentially increasing number of critical masks in multi-patterning litho-etch, litho-etch (LELE) and subsequent LELELE semiconductor processes requires even tighter overlay specifications [2]. Here, we discuss limitations of current image- and diffraction-based overlay measurement techniques in meeting these stringent processing requirements due to sensitivity, throughput, and low contrast [3]. We demonstrate a new electrical measurement based technique where resistance is measured for a macro with intentional misalignment between two layers. Overlay is quantified by a parabolic fitting model to resistance, where the minima and inflection points are extracted to characterize overlay control and process window, respectively. Analyses using transmission electron microscopy show good correlation between actual overlay performance and overlay obtained from fitting. Additionally, excellent correlation of overlay from electrical measurements to existing image- and diffraction-based techniques is found. We also discuss challenges of integrating the electrical measurement based approach into semiconductor manufacturing from a Back End of Line (BEOL) perspective. Our findings open up a new pathway for obtaining simultaneous overlay as well as process window and margins from a robust, high-throughput electrical measurement approach.
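The parabolic-fit step can be illustrated as follows: resistance measured on macros with intentional misalignment is fitted with a quadratic, and the location of the parabola's minimum is read off as the overlay estimate. The resistance data, curvature, and sign convention are synthetic placeholders; the inflection-point/process-window extraction is not reproduced.

```python
# Sketch of extracting overlay from a parabolic fit of resistance vs offset.
import numpy as np

programmed_offset_nm = np.arange(-20, 21, 4.0)          # intentional misalignment
true_overlay_nm = 3.2                                    # unknown process overlay
rng = np.random.default_rng(6)
resistance = (100.0 + 0.05 * (programmed_offset_nm + true_overlay_nm) ** 2
              + rng.normal(0, 0.3, programmed_offset_nm.size))

# Quadratic fit R(x) = a x^2 + b x + c; the minimum lies at x = -b / (2a).
a, b, c = np.polyfit(programmed_offset_nm, resistance, deg=2)
overlay_estimate = -b / (2 * a)
# Sign of the estimate relative to the process overlay depends on how the
# intentional misalignment is programmed into the macro.
print("fitted parabola minimum: %.2f nm" % overlay_estimate)
```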
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Tingguang; Xia, Shuang, E-mail: xs@shu.edu.cn; Li, Hui
Grain boundary engineering was carried out on an aging-treated nickel-based Alloy 690 with precipitated carbides at grain boundaries. The electron backscatter diffraction technique was used to investigate the grain boundary networks. Results show that, compared with the solution-annealed samples, the aging-treated samples with pre-existing carbides at grain boundaries need a longer duration or higher temperature during annealing after low-strain tensile deformation to form a high proportion of low-Σ coincidence site lattice grain boundaries (more than 75%). The reason is that primary recrystallization is inhibited or retarded because the pre-existing carbides act as barriers to grain boundary migration. - Highlights: • Study of GBE as a function of pre-existing GB carbides, tensile strain and annealing • Recrystallization in GBE is inhibited or retarded by the pre-existing carbides. • Retained carbides after annealing show the original GB positions. • More than 80% of special GBs were formed after the modification of GBE processing. • Multiple twinning during recrystallization is the key process of GBE.
A framework for graph-based synthesis, analysis, and visualization of HPC cluster job data.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mayo, Jackson R.; Kegelmeyer, W. Philip, Jr.; Wong, Matthew H.
The monitoring and system analysis of high performance computing (HPC) clusters is of increasing importance to the HPC community. Analysis of HPC job data can be used to characterize system usage and to diagnose and examine failure modes and their effects. This analysis is not straightforward, however, due to the complex relationships that exist between jobs. These relationships are based on a number of factors, including shared compute nodes between jobs, proximity of jobs in time, etc. Graph-based techniques represent an approach that is particularly well suited to this problem, and provide an effective technique for discovering important relationships in job queuing and execution data. The efficacy of these techniques is rooted in the use of a semantic graph as a knowledge representation tool. In a semantic graph, job data, represented in a combination of numerical and textual forms, can be flexibly processed into edges, with corresponding weights, expressing relationships between jobs, nodes, users, and other relevant entities. This graph-based representation permits formal manipulation by a number of analysis algorithms. This report presents a methodology and software implementation that leverages semantic graph-based techniques for the system-level monitoring and analysis of HPC clusters based on job queuing and execution data. Ontology development and graph synthesis are discussed with respect to the domain of HPC job data. The framework developed automates the synthesis of graphs from a database of job information. It also provides a front end, enabling visualization of the synthesized graphs. Additionally, an analysis engine is incorporated that provides performance analysis, graph-based clustering, and failure prediction capabilities for HPC systems.
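A toy version of the graph-synthesis idea is sketched below with NetworkX: job records become vertices, and weighted edges encode relationships such as shared compute nodes and temporal proximity. The records and the weighting scheme are placeholders, not the ontology developed in the report.

```python
# Toy sketch: synthesize a weighted job-relationship graph from job records.
import networkx as nx

jobs = [
    {"id": "job1", "user": "alice", "nodes": {"n01", "n02"}, "start": 0,   "end": 100},
    {"id": "job2", "user": "bob",   "nodes": {"n02", "n03"}, "start": 50,  "end": 150},
    {"id": "job3", "user": "alice", "nodes": {"n07"},        "start": 400, "end": 450},
]

G = nx.Graph()
for j in jobs:
    G.add_node(j["id"], user=j["user"])

for i in range(len(jobs)):
    for k in range(i + 1, len(jobs)):
        a, b = jobs[i], jobs[k]
        shared = len(a["nodes"] & b["nodes"])                   # shared compute nodes
        overlap = max(0, min(a["end"], b["end"]) - max(a["start"], b["start"]))
        weight = shared + 0.01 * overlap                        # ad hoc combination
        if weight > 0:
            G.add_edge(a["id"], b["id"], weight=weight)

# The weighted graph can now be handed to clustering or community-detection
# algorithms to find groups of related jobs.
print(list(G.edges(data=True)))
```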
Wan, Yong; Otsuna, Hideo; Holman, Holly A; Bagley, Brig; Ito, Masayoshi; Lewis, A Kelsey; Colasanto, Mary; Kardon, Gabrielle; Ito, Kei; Hansen, Charles
2017-05-26
Image segmentation and registration techniques have enabled biologists to place large amounts of volume data from fluorescence microscopy, morphed three-dimensionally, onto a common spatial frame. Existing tools built on volume visualization pipelines for single channel or red-green-blue (RGB) channels have become inadequate for the new challenges of fluorescence microscopy. For a three-dimensional atlas of the insect nervous system, hundreds of volume channels are rendered simultaneously, whereas fluorescence intensity values from each channel need to be preserved for versatile adjustment and analysis. Although several existing tools have incorporated support of multichannel data using various strategies, the lack of a flexible design has made true many-channel visualization and analysis unavailable. The most common practice for many-channel volume data presentation is still converting and rendering pseudosurfaces, which are inaccurate for both qualitative and quantitative evaluations. Here, we present an alternative design strategy that accommodates the visualization and analysis of about 100 volume channels, each of which can be interactively adjusted, selected, and segmented using freehand tools. Our multichannel visualization includes a multilevel streaming pipeline plus a triple-buffer compositing technique. Our method also preserves original fluorescence intensity values on graphics hardware, a crucial feature that allows graphics-processing-unit (GPU)-based processing for interactive data analysis, such as freehand segmentation. We have implemented the design strategies as a thorough restructuring of our original tool, FluoRender. The redesign of FluoRender not only maintains the existing multichannel capabilities for a greatly extended number of volume channels, but also enables new analysis functions for many-channel data from emerging biomedical-imaging techniques.
A cross-correlation-based estimate of the galaxy luminosity function
NASA Astrophysics Data System (ADS)
van Daalen, Marcel P.; White, Martin
2018-06-01
We extend existing methods for using cross-correlations to derive redshift distributions for photometric galaxies, without using photometric redshifts. The model presented in this paper simultaneously yields highly accurate and unbiased redshift distributions and, for the first time, redshift-dependent luminosity functions, using only clustering information and the apparent magnitudes of the galaxies as input. In contrast to many existing techniques for recovering unbiased redshift distributions, the output of our method is not degenerate with the galaxy bias b(z), which is achieved by modelling the shape of the luminosity bias. We successfully apply our method to a mock galaxy survey and discuss improvements to be made before applying our model to real data.
Anomaly Detection in Power Quality at Data Centers
NASA Technical Reports Server (NTRS)
Grichine, Art; Solano, Wanda M.
2015-01-01
The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
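The Gaussian set-point idea can be sketched as below: under the normality assumption, alarm thresholds are placed a fixed number of standard deviations from the historical mean, and new readings outside that band are flagged. The data and the 3-sigma multiplier are illustrative assumptions, not values deployed in the SCADA system.

```python
# Sketch of Gaussian set-point computation for power-quality anomaly flags.
import numpy as np

rng = np.random.default_rng(7)
voltage_history = rng.normal(480.0, 2.0, 10_000)      # historical phase voltage (V)

mu, sigma = voltage_history.mean(), voltage_history.std()
k = 3.0                                                # sigma multiplier (placeholder)
low_setpoint, high_setpoint = mu - k * sigma, mu + k * sigma

def is_anomalous(sample: float) -> bool:
    """Flag a reading that falls outside the Gaussian set-point band."""
    return sample < low_setpoint or sample > high_setpoint

print(low_setpoint, high_setpoint, is_anomalous(492.0))
```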
On degenerate coupled transport processes in porous media with memory phenomena
NASA Astrophysics Data System (ADS)
Beneš, Michal; Pažanin, Igor
2018-06-01
In this paper we prove the existence of weak solutions to degenerate parabolic systems arising from the fully coupled moisture movement, solute transport of dissolved species and heat transfer through porous materials. Physically relevant mixed Dirichlet-Neumann boundary conditions and initial conditions are considered. Existence of a global weak solution of the problem is proved by means of semidiscretization in time, proving the necessary uniform estimates and passing to the limit from discrete approximations. Degeneration occurs in the nonlinear transport coefficients, which are not assumed to be bounded below and above by positive constants. Degeneracies in the transport coefficients are overcome by proving suitable a priori $L^{\infty}$-estimates based on the De Giorgi and Moser iteration technique.
Analysis and Design of a Fiber-optic Probe for DNA Sensors Final Report CRADA No. TSB-1147-95
DOE Office of Scientific and Technical Information (OSTI.GOV)
Molau, Nicole; Vail, Curtis
In 1995, a challenge in the field of genetics dealt with the acquisition of efficient DNA sequencing techniques for reading the 3 billion base-pairs that comprised the human genome. AccuPhotonics, Inc. proposed to develop and manufacture a state-of-the-art near-field scanning optical microscopy (NSOM) fiber-optic probe that was expected to increase probe efficiency by two orders of magnitude over the existing state-of-the-art and to improve resolution to 10Å. The detailed design calculation and optimization of electrical properties of the fiber-optic probe tip geometry would be performed at LLNL, using existing finite-difference time-domain (FDTD) electromagnetic (EM) codes.
Management of Dynamic Biomedical Terminologies: Current Status and Future Challenges
Dos Reis, J. C.; Pruski, C.
2015-01-01
Objectives: Controlled terminologies and their dependent artefacts provide a consensual understanding of a domain while reducing ambiguities and enabling reasoning. However, the evolution of a domain's knowledge directly impacts these terminologies and generates inconsistencies in the underlying biomedical information systems. In this article, we review existing work addressing the dynamic aspect of terminologies as well as their effects on mappings and semantic annotations. Methods: We investigate approaches related to the identification, characterization and propagation of changes in terminologies, mappings and semantic annotations, including techniques to update their content. Results and conclusions: Based on the explored issues and existing methods, we outline open research challenges requiring investigation in the near future. PMID:26293859
Intravital hybrid optical-optoacoustic microscopy based on fiber-Bragg interferometry
NASA Astrophysics Data System (ADS)
Shnaiderman, Rami; Wissmeyer, Georg; Seeger, Markus; Estrada, Hector; Ntziachristos, Vasilis
2018-02-01
Optoacoustic microscopy (OAM) has enabled high-resolution, label-free imaging of tissues at depths not achievable with purely optical microscopy. However, widespread implementation of OAM in existing epi-illumination microscopy setups is often constrained by the performance and size of the commonly used piezoelectric ultrasound detectors. In this work, we introduce a novel acoustic detector based on a π-phase-shifted fiber Bragg grating (π-FBG) interferometer embedded inside an ellipsoidal acoustic cavity. The cavity enables seamless integration of epi-illumination OAM into existing microscopy setups by decoupling the acoustic and optical paths between the microscope objective and the sample. The cavity also acts as an acoustic condenser, boosting the sensitivity of the π-FBG and enabling a cost-effective CW-laser interrogation technique. We characterize the sensor's sensitivity and bandwidth and demonstrate hybrid OAM and second-harmonic imaging of phantoms and mouse tissue in vivo.
Ergonomics and simulation-based approach in improving facility layout
NASA Astrophysics Data System (ADS)
Abad, Jocelyn D.
2018-02-01
The use of simulation-based techniques in facility layout has been a popular choice in industry due to their convenience and efficient generation of results. Nevertheless, the solutions generated are not capable of addressing delays due to workers' health and safety, which significantly impact overall operational efficiency. It is, therefore, critical to incorporate ergonomics in facility design. In this study, workstation analysis was incorporated into a Promodel simulation to improve the facility layout of a garment manufacturer. To test the effectiveness of the method, the existing and improved facility designs were measured using comprehensive risk level, efficiency, and productivity. Results indicated that the improved facility layout generated a decrease in comprehensive risk level and rapid upper limb assessment score, a 78% increase in efficiency, and a 194% increase in productivity compared to the existing design, proving that the approach is effective in attaining overall facility design improvement.
A Description for Rock Joint Roughness Based on Terrestrial Laser Scanner and Image Analysis
Ge, Yunfeng; Tang, Huiming; Eldin, M. A. M Ez; Chen, Pengyu; Wang, Liangqing; Wang, Jinge
2015-01-01
Shear behavior of a rock mass greatly depends upon the rock joint roughness, which is generally characterized by anisotropy, a scale effect and an interval effect. A new index able to capture all three features, namely the brightness area percentage (BAP), is presented to express roughness based on synthetic illumination of a digital terrain model derived from a terrestrial laser scanner (TLS). Since only the small facets facing against the shear direction contribute to resistance during shear failure, these planes are recognized through image processing by taking advantage of the fact that they appear brighter than other facets under the same light source. A comparison with existing roughness indices and two case studies are presented to test the performance of the BAP description. The results reveal that the rock joint roughness estimated by the presented description agrees well with existing roughness methods and displays wider applicability. PMID:26585247
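A simplified sketch of the brightness-area idea follows: facet normals are computed on a gridded surface, the surface is illuminated from the direction opposing shear, and the fraction of bright facets is reported. The synthetic surface, light direction, and brightness threshold are placeholders for the TLS-based workflow described in the paper.

```python
# Simplified brightness-area computation on a synthetic gridded surface.
import numpy as np

rng = np.random.default_rng(8)
dx = 1.0                                              # grid spacing (mm)
z = rng.normal(0, 0.5, (200, 200)).cumsum(axis=1)     # synthetic rough surface

# Facet normals from the surface gradients: n ~ (-dz/dx, -dz/dy, 1).
dz_dy, dz_dx = np.gradient(z, dx)
normals = np.dstack([-dz_dx, -dz_dy, np.ones_like(z)])
normals /= np.linalg.norm(normals, axis=2, keepdims=True)

# Light shines horizontally against the shear direction (here, -x).
light = np.array([-1.0, 0.0, 0.0])
brightness = np.clip(normals @ light, 0.0, None)      # Lambertian shading

bap = 100.0 * np.mean(brightness > 0.1)               # brightness area percentage
print("BAP for this shear direction: %.1f %%" % bap)
```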
A dynamic programming approach to estimate the capacity value of energy storage
Sioshansi, Ramteen; Madaeni, Seyed Hossein; Denholm, Paul
2013-09-17
Here, we present a method to estimate the capacity value of storage. Our method uses a dynamic program to model the effect of power system outages on the operation and state of charge of storage in subsequent periods. We combine the optimized dispatch from the dynamic program with estimated system loss of load probabilities to compute a probability distribution for the state of charge of storage in each period. This probability distribution can be used as a forced outage rate for storage in standard reliability-based capacity value estimation methods. Our proposed method has the advantage over existing approximations that it explicitly captures the effect of system shortage events on the state of charge of storage in subsequent periods. We also use a numerical case study, based on five utility systems in the U.S., to demonstrate our technique and compare it to existing approximation methods.
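A toy backward-induction dynamic program over a discretized state of charge is sketched below to illustrate how per-period outage probabilities can shape dispatch decisions. The horizon, probabilities, storage size, and charging cost are placeholder values, and the formulation is a simplification, not the model applied to the five utility systems in the paper.

```python
# Toy backward-induction DP over discretized state of charge with
# period-dependent outage probabilities (illustrative simplification).
import numpy as np

T = 24                                     # hourly periods
capacity = 4                               # storage size in discrete energy units
p_outage = np.full(T, 0.02)
p_outage[17:21] = 0.15                     # higher loss-of-load risk in the evening peak
charge_cost = 0.01                         # small penalty per unit charged (placeholder)

V = np.zeros((T + 1, capacity + 1))        # value-to-go; terminal value = 0
policy = np.zeros((T, capacity + 1), dtype=int)

for t in range(T - 1, -1, -1):
    for s in range(capacity + 1):
        best_val, best_a = -np.inf, 0
        # action a: -1 = charge one unit, 0 = hold, +1 = discharge one unit
        for a in (-1, 0, 1):
            s_next = s - a
            if not 0 <= s_next <= capacity:
                continue
            served = 1.0 if a == 1 else 0.0            # energy served if an outage occurs
            val = p_outage[t] * served - charge_cost * (a == -1) + V[t + 1, s_next]
            if val > best_val:
                best_val, best_a = val, a
        V[t, s], policy[t, s] = best_val, best_a

print("expected outage energy served starting from a full store:", V[0, capacity])
```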
Estimating population size of Pygoscelid Penguins from TM data
NASA Technical Reports Server (NTRS)
Olson, Charles E., Jr.; Schwaller, Mathew R.; Dahmer, Paul A.
1987-01-01
A step was made toward a continent-wide estimate of penguin populations. The results indicate that Thematic Mapper data can be used to identify penguin rookeries due to the unique reflectance properties of guano. Strong correlations exist between nesting populations and the rookery area occupied by the birds. These correlations allow estimation of the number of nesting pairs in colonies. The success of the remote sensing and biometric analyses leads one to believe that a continent-wide estimate of penguin populations is possible based on a timely sample employing ground-based and remote sensing techniques. Satellite remote sensing along the coastline may well locate previously undiscovered penguin nesting sites, or locate rookeries which have been assumed to exist for over half a century but never located. Observations that penguins are one of the most sensitive elements in the complex of Southern Ocean ecosystems motivated this study.
Crystal growth from the vapor phase experiment MA-085
NASA Technical Reports Server (NTRS)
Wiedemeir, H.; Sadeek, H.; Klaessig, F. C.; Norek, M.
1976-01-01
Three vapor transport experiments on multicomponent systems were performed during the Apollo Soyuz mission to determine the effects of microgravity forces on crystal morphology and mass transport rates. The mixed systems used germanium selenide, tellurium, germanium tetraiodide (transport agent), germanium monosulfide, germanium tetrachloride (transport agent), and argon (inert atmosphere). The materials were enclosed in evacuated sealed ampoules of fused silica and were transported in a temperature gradient of the multipurpose electric furnace onboard the Apollo Soyuz spacecraft. Preliminary evaluation of 2 systems shows improved quality of space grown crystals in terms of growth morphology and bulk perfection. This conclusion is based on a direct comparison of space grown and ground based crystals by means of X-ray diffraction, microscopic, and chemical etching techniques. The observation of greater mass transport rates than predicted for a microgravity environment by existing vapor transport models indicates the existence of nongravity caused transport effects in a reactive solid/gas phase system.
Hogan, Bernie; Melville, Joshua R.; Philips, Gregory Lee; Janulis, Patrick; Contractor, Noshir; Mustanski, Brian S.; Birkett, Michelle
2016-01-01
While much social network data exists online, key network metrics for high-risk populations must still be captured through self-report. This practice has suffered from numerous limitations in workflow and response burden. However, advances in technology, network drawing libraries and databases are making interactive network drawing increasingly feasible. We describe the translation of an analog-based technique for capturing personal networks into a digital framework termed netCanvas that addresses many existing shortcomings such as: 1) complex data entry; 2) extensive interviewer intervention and field setup; 3) difficulties in data reuse; and 4) a lack of dynamic visualizations. We test this implementation within a health behavior study of a high-risk and difficult-to-reach population. We provide a within-subjects comparison between paper and touchscreens. We assert that touchscreen-based social network capture is now a viable alternative for highly sensitive data and social network data entry tasks. PMID:28018995
IGA: A Simplified Introduction and Implementation Details for Finite Element Users
NASA Astrophysics Data System (ADS)
Agrawal, Vishal; Gautam, Sachin S.
2018-05-01
Isogeometric analysis (IGA) is a recently introduced technique that employs the Computer Aided Design (CAD) concept of Non-uniform Rational B-splines (NURBS) to bridge the substantial bottleneck between the CAD and finite element analysis (FEA) fields. The simplified transfer of exact CAD models into the analysis alleviates the issues originating from geometrical discontinuities and thus significantly reduces the design-to-analysis time in comparison to the traditional FEA technique. Since its origination, research in the field of IGA has accelerated, and the technique has been applied to various problems. However, the employment of CAD tools in the area of FEA creates the need to adapt the existing implementation procedure to the framework of IGA. The usage of IGA also requires in-depth knowledge of both the CAD and FEA fields, which can be overwhelming for a beginner in IGA. Hence, in this paper, a simplified introduction and implementation details for incorporating the NURBS-based IGA technique within an existing FEA code are presented. It is shown that, with little modification, the available standard code structure of FEA can be adapted for IGA. For a clear and concise explanation of these modifications, a step-by-step implementation of a benchmark plate with a circular hole under the action of in-plane tension is included.
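The basic ingredient an existing FEA code needs for IGA is the evaluation of NURBS shape functions. The sketch below implements the Cox-de Boor recursion for B-spline basis functions and their rational counterparts; the knot vector, degree, and weights are illustrative placeholders, not the paper's benchmark problem.

```python
# Minimal Cox-de Boor evaluation of B-spline and NURBS basis functions.
import numpy as np

def bspline_basis(i, p, xi, knots):
    """Value of the i-th B-spline basis function of degree p at xi."""
    if p == 0:
        return 1.0 if knots[i] <= xi < knots[i + 1] else 0.0
    left = 0.0
    if knots[i + p] != knots[i]:
        left = (xi - knots[i]) / (knots[i + p] - knots[i]) \
               * bspline_basis(i, p - 1, xi, knots)
    right = 0.0
    if knots[i + p + 1] != knots[i + 1]:
        right = (knots[i + p + 1] - xi) / (knots[i + p + 1] - knots[i + 1]) \
                * bspline_basis(i + 1, p - 1, xi, knots)
    return left + right

def nurbs_basis(xi, p, knots, weights):
    """Rational basis functions R_i(xi) obtained by weighting the B-splines."""
    n = len(weights)
    N = np.array([bspline_basis(i, p, xi, knots) for i in range(n)])
    return weights * N / np.sum(weights * N)

knots = np.array([0, 0, 0, 0.5, 1, 1, 1], dtype=float)   # open knot vector, p = 2
weights = np.array([1.0, 0.8, 0.8, 1.0])                 # control-point weights
print(nurbs_basis(0.25, 2, knots, weights))               # sums to 1 (partition of unity)
```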
NASA Astrophysics Data System (ADS)
Romanchuk, V. A.; Lukashenko, V. V.
2018-05-01
A technique for operating the control system of a computing cluster based on neurocomputers is proposed. Particular attention is paid to the method of choosing the structure of the computing cluster, because existing methods are not effective for this specialized hardware base: neurocomputers are highly parallel computing devices with an architecture that differs from the von Neumann architecture. An algorithm developed for choosing the computational structure of a cloud cluster is described, starting from the direction of data transfer in the program's control-flow graph and its adjacency matrix.
Developing stereo image based robot control system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Suprijadi; Pambudi, I. R.; Woran, M.
Applications of image processing have been developed in various fields and for various purposes. In the last decade, image-based systems have grown rapidly with the increasing performance of hardware and microprocessors. Many fields of science and technology have used these methods, especially medicine and instrumentation. New stereovision techniques that produce three-dimensional images or video are very interesting, but have few applications in control systems. A stereo image pair contains pixel disparity information that does not exist in a single image. In this research, we propose a new method for a wheeled robot control system using stereovision. The results show that the robot moves automatically based on stereovision captures.
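The disparity cue that a stereo-based controller can act on is sketched below using OpenCV's block matcher. The image file names, camera baseline, focal length, and the simple steering rule are hypothetical assumptions; the authors' actual control logic is not reproduced.

```python
# Sketch: disparity -> depth from a stereo pair, then a reactive steering rule.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)      # placeholder file names
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # OpenCV stores 16*d

focal_px, baseline_m = 700.0, 0.12                        # assumed camera parameters
valid = disparity > 0
depth_m = np.zeros_like(disparity)
depth_m[valid] = focal_px * baseline_m / disparity[valid]

# Simple reactive rule: turn if the closest obstacle in the central band is near.
center = depth_m[:, depth_m.shape[1] // 3: 2 * depth_m.shape[1] // 3]
closest = center[center > 0].min() if np.any(center > 0) else np.inf
command = "turn" if closest < 0.5 else "forward"
print("closest obstacle: %.2f m -> %s" % (closest, command))
```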
Locally-Based Kernel PLS Smoothing to Non-Parametric Regression Curve Fitting
NASA Technical Reports Server (NTRS)
Rosipal, Roman; Trejo, Leonard J.; Wheeler, Kevin; Korsmeyer, David (Technical Monitor)
2002-01-01
We present a novel smoothing approach to non-parametric regression curve fitting. It is based on kernel partial least squares (PLS) regression in a reproducing kernel Hilbert space. Our concern is to apply the methodology to smoothing experimental data where some level of knowledge about the approximate shape, local inhomogeneities, or points where the desired function changes its curvature is known a priori or can be derived from the observed noisy data. We propose locally-based kernel PLS regression that extends the previous kernel PLS methodology by incorporating this knowledge. We compare our approach with existing smoothing splines, hybrid adaptive splines and wavelet shrinkage techniques on two generated data sets.
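The general kernel PLS smoothing idea (not the locally-based extension proposed in the report) can be illustrated with a compact NIPALS-style implementation on a Gaussian kernel. The kernel width, number of components, and the synthetic data are placeholders.

```python
# Compact kernel PLS smoother: fitted values of noisy responses on training x.
import numpy as np

def kernel_pls_fit(x, y, n_components=4, gamma=50.0):
    """Return smoothed (fitted) values of y using NIPALS-style kernel PLS."""
    n = len(x)
    K = np.exp(-gamma * (x[:, None] - x[None, :]) ** 2)     # Gaussian Gram matrix
    J = np.eye(n) - np.ones((n, n)) / n
    K = J @ K @ J                                            # centre in feature space
    y_res, T = y - y.mean(), []
    for _ in range(n_components):
        t = K @ y_res                                        # score direction
        t /= np.linalg.norm(t)
        T.append(t)
        P = np.eye(n) - np.outer(t, t)                       # deflate K and y
        K = P @ K @ P
        y_res = y_res - t * (t @ y_res)
    T = np.column_stack(T)
    # Scores are orthonormal, so regression of y on them is a projection.
    return y.mean() + T @ (T.T @ (y - y.mean()))

rng = np.random.default_rng(9)
x = np.sort(rng.uniform(0, 1, 120))
y = np.sin(4 * np.pi * x) + rng.normal(0, 0.3, x.size)       # noisy curve
y_smooth = kernel_pls_fit(x, y)
print("residual RMS after smoothing:", np.sqrt(np.mean((y - y_smooth) ** 2)))
```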