Sample records for collaborative filtering techniques

  1. Weighted hybrid technique for recommender system

    NASA Astrophysics Data System (ADS)

    Suriati, S.; Dwiastuti, Meisyarah; Tulus, T.

    2017-12-01

    Recommender systems have become very popular and play an important role in information systems and web pages today. A recommender system tries to predict which items a user may like based on his activity on the system. There are some familiar techniques for building a recommender system, such as content-based filtering and collaborative filtering. Content-based filtering does not involve human opinions in making the prediction, while collaborative filtering does, so collaborative filtering can predict more accurately. However, collaborative filtering cannot give predictions for items that have never been rated by any user. In order to cover the drawbacks of each approach with the advantages of the other, both approaches can be combined in what is known as a hybrid technique. The hybrid technique used in this work is the weighted technique, in which the prediction score is a linear combination of the scores produced by the combined techniques. The purpose of this work is to show how a weighted hybrid technique combining content-based filtering and item-based collaborative filtering can work in a movie recommender system, and to compare the performance when both approaches are combined and when each approach works alone. Three experiments were done in this work, combining both techniques with different parameters. The results show that the weighted hybrid technique does not substantially boost performance, but it helps to give prediction scores for unrated movies that cannot be recommended by collaborative filtering alone.
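
    The weighted combination described in this record can be sketched in a few lines. The component scores, weights, and fallback rule below are illustrative placeholders, not values from the paper:

```python
# Minimal sketch of a weighted hybrid recommender: the final score is a
# linear combination of a content-based score and a collaborative score.
# The weights and scores here are invented for illustration.

def weighted_hybrid(content_score, collab_score, w_content=0.4, w_collab=0.6):
    """Linear combination of two prediction scores.

    When the collaborative score is unavailable (item never rated by
    anyone), fall back to the content-based score alone.
    """
    if collab_score is None:
        return content_score
    return w_content * content_score + w_collab * collab_score

# A rated movie gets a blended score; an unrated one falls back to the
# content-based score, which is the cold-start benefit the paper notes.
print(weighted_hybrid(3.0, 4.0))   # 0.4*3.0 + 0.6*4.0
print(weighted_hybrid(3.0, None))
```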

  2. Collaborative filtering to improve navigation of large radiology knowledge resources.

    PubMed

    Kahn, Charles E

    2005-06-01

    Collaborative filtering is a knowledge-discovery technique that can help guide readers to items of potential interest based on the experience of prior users. This study sought to determine the impact of collaborative filtering on navigation of a large, Web-based radiology knowledge resource. Collaborative filtering was applied to a collection of 1,168 radiology hypertext documents available via the Internet. An item-based collaborative filtering algorithm identified each document's six most closely related documents based on 248,304 page views in an 18-day period. Documents were amended to include links to their related documents, and use was analyzed over the next 5 days. The mean number of documents viewed per visit increased from 1.57 to 1.74 (P < 0.0001). Collaborative filtering can increase a radiology information resource's utilization and can improve its usefulness and ease of navigation. The technique holds promise for improving navigation of large Internet-based radiology knowledge resources.
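
    The item-based step described here, linking each document to its most frequently co-viewed neighbours, can be sketched as follows. The session data and document names are invented for illustration:

```python
# Sketch of item-based collaborative filtering over page views: documents
# viewed in the same visit are treated as related, and each document is
# linked to its most frequently co-viewed neighbours.
from collections import Counter
from itertools import combinations

sessions = [  # hypothetical per-visit page-view lists
    ["chest-xray", "pneumonia", "ct-basics"],
    ["chest-xray", "pneumonia"],
    ["mri-knee", "ct-basics", "chest-xray"],
]

co_views = Counter()
for docs in sessions:
    for a, b in combinations(sorted(set(docs)), 2):
        co_views[(a, b)] += 1  # store both directions for easy lookup
        co_views[(b, a)] += 1

def related(doc, k=2):
    """Return the k documents most often co-viewed with `doc`."""
    scores = Counter({b: n for (a, b), n in co_views.items() if a == doc})
    return [b for b, _ in scores.most_common(k)]

print(related("chest-xray"))
```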

  3. Hybrid context aware recommender systems

    NASA Astrophysics Data System (ADS)

    Jain, Rajshree; Tyagi, Jaya; Singh, Sandeep Kumar; Alam, Taj

    2017-10-01

    Recommender systems and context awareness are currently vital fields of research. Most hybrid recommendation systems combine content-based and collaborative filtering techniques, whereas this work combines context and collaborative filtering. The paper presents a hybrid context-aware recommender system for books and movies that gives recommendations based on the user context as well as user or item similarity. It also addresses the issue of dimensionality reduction using weighted pre-filtering based on dynamically entered user context and context preferences. This unique step helps to reduce the size of the dataset for collaborative filtering. Bias-subtracted collaborative filtering is used so as to consider the relative ratings of a particular user rather than the absolute values. Cosine similarity is used as the metric to determine the similarity between users or items. The unknown ratings are calculated and evaluated using MSE (mean squared error) on test and train datasets. The overall process has helped to personalize recommendations and give more accurate results with reduced complexity in collaborative filtering.
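
    The bias-subtraction and cosine-similarity steps mentioned in this record can be sketched as follows; the rating matrix is fabricated for illustration:

```python
# Illustrative sketch of bias-subtracted collaborative filtering: each
# user's mean rating is subtracted before computing cosine similarity,
# so only relative preferences matter, not the user's rating scale.
import numpy as np

ratings = np.array([  # rows: users, columns: items (invented values)
    [5.0, 3.0, 4.0],
    [4.0, 2.0, 3.0],
    [1.0, 5.0, 2.0],
])

means = ratings.mean(axis=1, keepdims=True)
centered = ratings - means  # bias-subtracted ratings

def cosine(u, v):
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Users 0 and 1 rate identically up to a constant offset, so after bias
# subtraction their cosine similarity is 1; user 2 has opposite taste.
print(cosine(centered[0], centered[1]))
print(cosine(centered[0], centered[2]))
```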

  4. A ROle-Oriented Filtering (ROOF) approach for collaborative recommendation

    NASA Astrophysics Data System (ADS)

    Ghani, Imran; Jeong, Seung Ryul

    2016-09-01

    In collaborative filtering (CF) recommender systems, existing techniques frequently focus on determining similarities among users' historical interests. This generally refers to situations in which each user normally plays a single role and his/her taste remains consistent over the long term. However, we note that existing techniques have not been significantly employed in a role-oriented context. This is especially so in situations where users may change their roles over time or play multiple roles simultaneously, while still expecting to access relevant information resources accordingly. Such systems include enterprise architecture management systems, e-commerce sites or journal management systems. In scenarios involving existing techniques, each user needs to build up very different profiles (preferences and interests) based on multiple roles which change over time. Should this not occur to a satisfactory degree, their previous information will either be lost or not utilised at all. To limit the occurrence of such issues, we propose a ROle-Oriented Filtering (ROOF) approach focusing on the manner in which multiple user profiles are obtained and maintained over time. We conducted a number of experiments using an enterprise architecture management scenario. In so doing, we observed that the ROOF approach performs better in comparison with other existing collaborative filtering-based techniques.

  5. urCF: An Approach to Integrating User Reviews into Memory-Based Collaborative Filtering

    ERIC Educational Resources Information Center

    Zhang, Zhenxue

    2013-01-01

    Blessed by the Internet age, many online retailers (e.g., Amazon.com) have deployed recommender systems to help their customers identify products that may be of interest to them, in order to improve cross-selling and enhance customer loyalty. Collaborative Filtering (CF) is the most successful technique among different approaches to generating…

  6. Measuring User Similarity Using Electric Circuit Analysis: Application to Collaborative Filtering

    PubMed Central

    Yang, Joonhyuk; Kim, Jinwook; Kim, Wonjoon; Kim, Young Hwan

    2012-01-01

    We propose a new technique of measuring user similarity in collaborative filtering using electric circuit analysis. Electric circuit analysis is used to measure the potential differences between nodes on an electric circuit. In this paper, by applying this method to transaction networks comprising users and items, i.e., the user–item matrix, and by using the full information about the relationship structure of users from the perspective of item adoption, we overcome the limitations of one-to-one similarity calculation approaches, such as the Pearson correlation, Tanimoto coefficient, and Hamming distance, in collaborative filtering. We found that electric circuit analysis can be successfully incorporated into recommender systems and has the potential to significantly enhance predictability, especially when combined with user-based collaborative filtering. We also propose four types of hybrid algorithms that combine the Pearson correlation method and electric circuit analysis. One of the algorithms exceeds the performance of traditional collaborative filtering by up to 37.5%. This work opens new opportunities for interdisciplinary research between physics and computer science and the development of new recommendation systems. PMID:23145095

  7. Measuring user similarity using electric circuit analysis: application to collaborative filtering.

    PubMed

    Yang, Joonhyuk; Kim, Jinwook; Kim, Wonjoon; Kim, Young Hwan

    2012-01-01

    We propose a new technique of measuring user similarity in collaborative filtering using electric circuit analysis. Electric circuit analysis is used to measure the potential differences between nodes on an electric circuit. In this paper, by applying this method to transaction networks comprising users and items, i.e., the user-item matrix, and by using the full information about the relationship structure of users from the perspective of item adoption, we overcome the limitations of one-to-one similarity calculation approaches, such as the Pearson correlation, Tanimoto coefficient, and Hamming distance, in collaborative filtering. We found that electric circuit analysis can be successfully incorporated into recommender systems and has the potential to significantly enhance predictability, especially when combined with user-based collaborative filtering. We also propose four types of hybrid algorithms that combine the Pearson correlation method and electric circuit analysis. One of the algorithms exceeds the performance of traditional collaborative filtering by up to 37.5%. This work opens new opportunities for interdisciplinary research between physics and computer science and the development of new recommendation systems.
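
    One concrete way to realise the circuit analogy in these two records is the effective-resistance (resistance-distance) computation via the pseudoinverse of the graph Laplacian. This is a generic sketch of that idea on a toy user-item network, not the authors' implementation; node names and adoptions are invented:

```python
# Treat the user-item bipartite graph as a resistor network (each
# adoption is a unit resistor) and use the effective resistance between
# user nodes as a dissimilarity: the more connection paths two users
# share, the lower their resistance.
import numpy as np

users, items = ["u1", "u2", "u3"], ["i1", "i2"]
adoption = {("u1", "i1"), ("u2", "i1"), ("u2", "i2"), ("u3", "i2")}

nodes = users + items
idx = {n: i for i, n in enumerate(nodes)}
n = len(nodes)

A = np.zeros((n, n))
for u, i in adoption:
    A[idx[u], idx[i]] = A[idx[i], idx[u]] = 1.0

L = np.diag(A.sum(axis=1)) - A  # graph Laplacian
Lp = np.linalg.pinv(L)          # Moore-Penrose pseudoinverse

def resistance(a, b):
    """Effective resistance between two nodes of the network."""
    i, j = idx[a], idx[b]
    return Lp[i, i] + Lp[j, j] - 2 * Lp[i, j]

# u1 and u2 adopted a common item, so their effective resistance is
# smaller than that between u1 and u3, who share nothing directly.
print(resistance("u1", "u2"), resistance("u1", "u3"))
```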

  8. Towards a Collaborative Filtering Approach to Medication Reconciliation

    PubMed Central

    Hasan, Sharique; Duncan, George T.; Neill, Daniel B.; Padman, Rema

    2008-01-01

    A physician’s prescribing decisions depend on knowledge of the patient’s medication list. This knowledge is often incomplete, and errors or omissions could result in adverse outcomes. To address this problem, the Joint Commission recommends medication reconciliation for creating a more accurate list of a patient’s medications. In this paper, we develop techniques for automatic detection of omissions in medication lists, identifying drugs that the patient may be taking but are not on the patient’s medication list. Our key insight is that this problem is analogous to the collaborative filtering framework increasingly used by online retailers to recommend relevant products to customers. The collaborative filtering approach enables a variety of solution techniques, including nearest neighbor and co-occurrence approaches. We evaluate the effectiveness of these approaches using medication data from a long-term care center in the Eastern US. Preliminary results suggest that this framework may become a valuable tool for medication reconciliation. PMID:18998834

  9. Towards a collaborative filtering approach to medication reconciliation.

    PubMed

    Hasan, Sharique; Duncan, George T; Neill, Daniel B; Padman, Rema

    2008-11-06

    A physician's prescribing decisions depend on knowledge of the patient's medication list. This knowledge is often incomplete, and errors or omissions could result in adverse outcomes. To address this problem, the Joint Commission recommends medication reconciliation for creating a more accurate list of a patient's medications. In this paper, we develop techniques for automatic detection of omissions in medication lists, identifying drugs that the patient may be taking but are not on the patient's medication list. Our key insight is that this problem is analogous to the collaborative filtering framework increasingly used by online retailers to recommend relevant products to customers. The collaborative filtering approach enables a variety of solution techniques, including nearest neighbor and co-occurrence approaches. We evaluate the effectiveness of these approaches using medication data from a long-term care center in the Eastern US. Preliminary results suggest that this framework may become a valuable tool for medication reconciliation.
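
    The co-occurrence idea from these two records can be sketched as follows: drugs that frequently co-occur with the drugs already on a patient's list are candidate omissions. The medication histories below are fabricated for illustration:

```python
# Sketch of a co-occurrence approach to omission detection: score each
# drug not on the current list by how often it appears in historical
# lists alongside drugs that are on the current list.
from collections import Counter

histories = [  # invented medication lists of past patients
    {"metformin", "lisinopril", "aspirin"},
    {"metformin", "lisinopril"},
    {"metformin", "aspirin"},
]

def candidate_omissions(current_list, k=1):
    """Rank drugs absent from the list by co-occurrence with its drugs."""
    scores = Counter()
    for h in histories:
        overlap = len(current_list & h)
        for drug in h - current_list:
            scores[drug] += overlap
    return [d for d, _ in scores.most_common(k)]

# For a list containing only metformin, aspirin and lisinopril each
# co-occur with it twice, so both surface as candidate omissions.
print(candidate_omissions({"metformin"}, k=2))
```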

  10. Image Recommendation Algorithm Using Feature-Based Collaborative Filtering

    NASA Astrophysics Data System (ADS)

    Kim, Deok-Hwan

    As the multimedia contents market continues its rapid expansion, the amount of image contents used in mobile phone services, digital libraries, and catalog service is increasing remarkably. In spite of this rapid growth, users experience high levels of frustration when searching for the desired image. Even though new images are profitable to the service providers, traditional collaborative filtering methods cannot recommend them. To solve this problem, in this paper, we propose feature-based collaborative filtering (FBCF) method to reflect the user's most recent preference by representing his purchase sequence in the visual feature space. The proposed approach represents the images that have been purchased in the past as the feature clusters in the multi-dimensional feature space and then selects neighbors by using an inter-cluster distance function between their feature clusters. Various experiments using real image data demonstrate that the proposed approach provides a higher quality recommendation and better performance than do typical collaborative filtering and content-based filtering techniques.

  11. Collaborative Filtering Based on Sequential Extraction of User-Item Clusters

    NASA Astrophysics Data System (ADS)

    Honda, Katsuhiro; Notsu, Akira; Ichihashi, Hidetomo

    Collaborative filtering is a computational realization of “word-of-mouth” in a network community, in which the items preferred by “neighbors” are recommended. This paper proposes a new item-selection model for extracting user-item clusters from rectangular relation matrices, in which mutual relations between users and items are denoted by an alternative process of “liking or not”. A technique for sequential co-cluster extraction from rectangular relational data is given by combining the structural balancing-based user-item clustering method with a sequential fuzzy cluster extraction approach. The technique is then applied to the collaborative filtering problem, in which some items may be shared by several user clusters.

  12. Collaborative filtering on a family of biological targets.

    PubMed

    Erhan, Dumitru; L'heureux, Pierre-Jean; Yue, Shi Yi; Bengio, Yoshua

    2006-01-01

    Building a QSAR model of a new biological target for which few screening data are available is a statistical challenge. However, the new target may be part of a bigger family, for which we have more screening data. Collaborative filtering or, more generally, multi-task learning, is a machine learning approach that improves the generalization performance of an algorithm by using information from related tasks as an inductive bias. We use collaborative filtering techniques for building predictive models that link multiple targets to multiple examples. The more commonalities between the targets, the better the multi-target model that can be built. We show an example of a multi-target neural network that can use family information to produce a predictive model of an undersampled target. We evaluate JRank, a kernel-based method designed for collaborative filtering. We show their performance on compound prioritization for an HTS campaign and the underlying shared representation between targets. JRank outperformed the neural network both in the single- and multi-target models.

  13. Social Collaborative Filtering by Trust.

    PubMed

    Yang, Bo; Lei, Yu; Liu, Jiming; Li, Wenjie

    2017-08-01

    Recommender systems are used to accurately and actively provide users with potentially interesting information or services. Collaborative filtering is a widely adopted approach to recommendation, but sparse data and cold-start users are often barriers to providing high-quality recommendations. To address such issues, we propose a novel method that improves the performance of collaborative filtering recommendations by integrating sparse rating data given by users with the sparse social trust network among these same users. This is a model-based method that adopts a matrix factorization technique to map users into low-dimensional latent feature spaces in terms of their trust relationships, aiming to more accurately reflect users' reciprocal influence on the formation of their own opinions and to learn better preferential patterns of users for high-quality recommendations. We use four large-scale datasets to show that the proposed method performs much better, especially for cold-start users, than state-of-the-art recommendation algorithms for social collaborative filtering based on trust.

  14. A New Adaptive Framework for Collaborative Filtering Prediction

    PubMed Central

    Almosallam, Ibrahim A.; Shang, Yi

    2010-01-01

    Collaborative filtering is one of the most successful techniques for recommendation systems and has been used in many commercial services provided by major companies including Amazon, TiVo and Netflix. In this paper we focus on memory-based collaborative filtering (CF). Existing CF techniques work well on dense data but poorly on sparse data. To address this weakness, we propose to use z-scores instead of explicit ratings and introduce a mechanism that adaptively combines global statistics with item-based values based on data density level. We present a new adaptive framework that encapsulates various CF algorithms and the relationships among them. An adaptive CF predictor is developed that can self adapt from user-based to item-based to hybrid methods based on the amount of available ratings. Our experimental results show that the new predictor consistently obtained more accurate predictions than existing CF methods, with the most significant improvement on sparse data sets. When applied to the Netflix Challenge data set, our method performed better than existing CF and singular value decomposition (SVD) methods and achieved 4.67% improvement over Netflix’s system. PMID:21572924

  15. A New Adaptive Framework for Collaborative Filtering Prediction.

    PubMed

    Almosallam, Ibrahim A; Shang, Yi

    2008-06-01

    Collaborative filtering is one of the most successful techniques for recommendation systems and has been used in many commercial services provided by major companies including Amazon, TiVo and Netflix. In this paper we focus on memory-based collaborative filtering (CF). Existing CF techniques work well on dense data but poorly on sparse data. To address this weakness, we propose to use z-scores instead of explicit ratings and introduce a mechanism that adaptively combines global statistics with item-based values based on data density level. We present a new adaptive framework that encapsulates various CF algorithms and the relationships among them. An adaptive CF predictor is developed that can self adapt from user-based to item-based to hybrid methods based on the amount of available ratings. Our experimental results show that the new predictor consistently obtained more accurate predictions than existing CF methods, with the most significant improvement on sparse data sets. When applied to the Netflix Challenge data set, our method performed better than existing CF and singular value decomposition (SVD) methods and achieved 4.67% improvement over Netflix's system.
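
    The z-score substitution described in these two records can be sketched as follows; the ratings are invented:

```python
# Sketch of the z-score idea: replace a user's raw ratings with
# (rating - user mean) / user std, so users with different rating
# scales become directly comparable.
import statistics

def z_scores(ratings):
    """Population z-scores of one user's rating vector."""
    mean = statistics.mean(ratings)
    std = statistics.pstdev(ratings)
    return [(r - mean) / std for r in ratings]

# A harsh rater and a generous rater with the same relative taste
# produce identical z-scores.
print(z_scores([1, 2, 3]))
print(z_scores([3, 4, 5]))
```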

  16. Leveraging Collaborative Filtering to Accelerate Rare Disease Diagnosis

    PubMed Central

    Shen, Feichen; Liu, Sijia; Wang, Yanshan; Wang, Liwei; Afzal, Naveed; Liu, Hongfang

    2017-01-01

    In the USA, rare diseases are defined as those affecting fewer than 200,000 patients at any given time. Patients with rare diseases are frequently misdiagnosed or undiagnosed, which may be due to the lack of knowledge and experience of care providers. We hypothesize that patients’ phenotypic information available in electronic medical records (EMR) can be leveraged to accelerate disease diagnosis, based on the intuition that providers need to document associated phenotypic information to support the diagnosis decision, especially for rare diseases. In this study, we proposed a collaborative filtering system enriched with natural language processing and semantic techniques to assist rare disease diagnosis based on phenotypic characterization. Specifically, we leveraged four similarity measurements with two neighborhood algorithms on a large unstructured 2010-2015 Mayo Clinic patient cohort and evaluated the different approaches. Preliminary results demonstrated that collaborative filtering with phenotypic information is able to stratify patients with relatively similar rare diseases. PMID:29854225

  17. Leveraging Collaborative Filtering to Accelerate Rare Disease Diagnosis.

    PubMed

    Shen, Feichen; Liu, Sijia; Wang, Yanshan; Wang, Liwei; Afzal, Naveed; Liu, Hongfang

    2017-01-01

    In the USA, rare diseases are defined as those affecting fewer than 200,000 patients at any given time. Patients with rare diseases are frequently misdiagnosed or undiagnosed, which may be due to the lack of knowledge and experience of care providers. We hypothesize that patients' phenotypic information available in electronic medical records (EMR) can be leveraged to accelerate disease diagnosis, based on the intuition that providers need to document associated phenotypic information to support the diagnosis decision, especially for rare diseases. In this study, we proposed a collaborative filtering system enriched with natural language processing and semantic techniques to assist rare disease diagnosis based on phenotypic characterization. Specifically, we leveraged four similarity measurements with two neighborhood algorithms on a large unstructured 2010-2015 Mayo Clinic patient cohort and evaluated the different approaches. Preliminary results demonstrated that collaborative filtering with phenotypic information is able to stratify patients with relatively similar rare diseases.

  18. A PageRank-based reputation model for personalised manufacturing service recommendation

    NASA Astrophysics Data System (ADS)

    Zhang, W. Y.; Zhang, S.; Guo, S. S.

    2017-05-01

    The number of manufacturing services for cross-enterprise business collaborations is increasing rapidly because of the explosive growth of Web service technologies. This trend demands intelligent and robust models to address information overload in order to enable efficient discovery of manufacturing services. In this paper, we present a personalised manufacturing service recommendation approach, which combines a PageRank-based reputation model and a collaborative filtering technique in a unified framework for recommending the right manufacturing services to an active service user for supply chain deployment. The novel aspect of this research is adapting the PageRank algorithm to a network of service-oriented multi-echelon supply chain in order to determine both user reputation and service reputation. In addition, it explores the use of these methods in alleviating data sparsity and cold start problems that hinder traditional collaborative filtering techniques. A case study is conducted to validate the practicality and effectiveness of the proposed approach in recommending the right manufacturing services to active service users.
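
    The PageRank computation underlying the reputation model in this record can be sketched with a simple power iteration; the tiny link graph below is hypothetical and not taken from the paper:

```python
# Sketch of PageRank by power iteration: reputation flows along the
# links of the (here invented) service network until it stabilises.
import numpy as np

# Column-stochastic link matrix: entry (i, j) is the probability of
# moving from node j to node i.
M = np.array([
    [0.0, 0.5, 1.0],
    [0.5, 0.0, 0.0],
    [0.5, 0.5, 0.0],
])

def pagerank(M, damping=0.85, iters=100):
    n = M.shape[0]
    r = np.full(n, 1.0 / n)  # start from a uniform distribution
    for _ in range(iters):
        r = (1 - damping) / n + damping * (M @ r)
    return r

r = pagerank(M)
print(r, r.sum())  # the rank vector remains a probability distribution
```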

  19. Advances in Collaborative Filtering

    NASA Astrophysics Data System (ADS)

    Koren, Yehuda; Bell, Robert

    The collaborative filtering (CF) approach to recommenders has recently enjoyed much interest and progress. The fact that it played a central role within the recently completed Netflix competition has contributed to its popularity. This chapter surveys the recent progress in the field. Matrix factorization techniques, which became a first choice for implementing CF, are described together with recent innovations. We also describe several extensions that bring competitive accuracy into neighborhood methods, which used to dominate the field. The chapter demonstrates how to utilize temporal models and implicit feedback to extend the models' accuracy. In passing, we include detailed descriptions of some of the central methods developed for tackling the challenge of the Netflix Prize competition.
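
    A minimal matrix-factorization sketch in the spirit of the techniques this chapter surveys: learn user and item latent factors by stochastic gradient descent on observed ratings. The data, dimensions, and hyperparameters are illustrative, not from the chapter:

```python
# Matrix factorization for CF: approximate each observed rating r_ui by
# the dot product of a user factor P[u] and an item factor Q[i],
# learned by SGD with L2 regularization.
import numpy as np

rng = np.random.default_rng(0)
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0)]  # (u, i, r)
n_users, n_items, k = 2, 3, 2

P = 0.1 * rng.standard_normal((n_users, k))  # user latent factors
Q = 0.1 * rng.standard_normal((n_items, k))  # item latent factors

lr, reg = 0.05, 0.01
for _ in range(200):
    for u, i, r in ratings:
        err = r - P[u] @ Q[i]
        P[u] += lr * (err * Q[i] - reg * P[u])
        Q[i] += lr * (err * P[u] - reg * Q[i])

# After training, the reconstructed known ratings approach the
# observed values.
print(float(P[0] @ Q[0]))
```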

  20. Shopping For Danger: E-commerce techniques applied to collaboration in cyber security

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bruce, Joseph R.; Fink, Glenn A.

    Collaboration among cyber security analysts is essential to a successful protection strategy on the Internet today, but it is uncommonly practiced or encouraged in operating environments. Barriers to productive collaboration often include data sensitivity, time and effort to communicate, institutional policy, and protection of domain knowledge. We propose an ambient collaboration framework, Vulcan, designed to remove the barriers of time and effort and mitigate the others. Vulcan automates data collection, collaborative filtering, and asynchronous dissemination, eliminating the effort implied by explicit collaboration among peers. We instrumented two analytic applications and performed a mock analysis session to build a dataset and test the output of the system.

  1. Similarity from multi-dimensional scaling: solving the accuracy and diversity dilemma in information filtering.

    PubMed

    Zeng, Wei; Zeng, An; Liu, Hao; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2014-01-01

    Recommender systems are designed to assist individual users in navigating through the rapidly growing amount of information. One of the most successful recommendation techniques is collaborative filtering, which has been extensively investigated and has already found wide application in e-commerce. One of the challenges in this algorithm is how to accurately quantify the similarities of user pairs and item pairs. In this paper, we employ the multidimensional scaling (MDS) method to measure the similarities between nodes in user-item bipartite networks. The MDS method can extract the essential similarity information from the networks by smoothing out noise, which provides a graphical display of the structure of the networks. With the similarity measured from MDS, we find that the item-based collaborative filtering algorithm can outperform diffusion-based recommendation algorithms. Moreover, we show that this method tends to recommend unpopular items and increase the global diversification of the networks in the long term.
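
    Classical MDS, the family of embedding this record builds on, can be sketched via double centering of a squared-distance matrix. This is a generic sketch on toy distances, not the paper's bipartite-network pipeline:

```python
# Classical multidimensional scaling (MDS): given a pairwise distance
# matrix, recover low-dimensional coordinates via eigendecomposition of
# the double-centred squared-distance matrix.
import numpy as np

def classical_mds(D, dims=2):
    """Embed points from a distance matrix via double centering."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    B = -0.5 * J @ (D ** 2) @ J          # Gram matrix of centred points
    w, V = np.linalg.eigh(B)             # eigenvalues in ascending order
    w, V = w[::-1][:dims], V[:, ::-1][:, :dims]
    return V * np.sqrt(np.maximum(w, 0))

# Three points on a line at 0, 1, 2: the embedding reproduces the
# original pairwise distances.
D = np.array([[0.0, 1.0, 2.0],
              [1.0, 0.0, 1.0],
              [2.0, 1.0, 0.0]])
X = classical_mds(D, dims=1)
print(np.linalg.norm(X[0] - X[2]))
```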

  2. Evaluation of Keyphrase Extraction Algorithm and Tiling Process for a Document/Resource Recommender within E-Learning Environments

    ERIC Educational Resources Information Center

    Mangina, Eleni; Kilbride, John

    2008-01-01

    The research presented in this paper is an examination of the applicability of IUI techniques in an online e-learning environment. In particular we make use of user modeling techniques, information retrieval and extraction mechanisms and collaborative filtering methods. The domains of e-learning, web-based training and instruction and intelligent…

  3. Similarity from Multi-Dimensional Scaling: Solving the Accuracy and Diversity Dilemma in Information Filtering

    PubMed Central

    Zeng, Wei; Zeng, An; Liu, Hao; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2014-01-01

    Recommender systems are designed to assist individual users in navigating through the rapidly growing amount of information. One of the most successful recommendation techniques is collaborative filtering, which has been extensively investigated and has already found wide application in e-commerce. One of the challenges in this algorithm is how to accurately quantify the similarities of user pairs and item pairs. In this paper, we employ the multidimensional scaling (MDS) method to measure the similarities between nodes in user-item bipartite networks. The MDS method can extract the essential similarity information from the networks by smoothing out noise, which provides a graphical display of the structure of the networks. With the similarity measured from MDS, we find that the item-based collaborative filtering algorithm can outperform diffusion-based recommendation algorithms. Moreover, we show that this method tends to recommend unpopular items and increase the global diversification of the networks in the long term. PMID:25343243

  4. The recommender system for virtual items in MMORPGs based on a novel collaborative filtering approach

    NASA Astrophysics Data System (ADS)

    Li, S. G.; Shi, L.

    2014-10-01

    The recommendation system for virtual items in massive multiplayer online role-playing games (MMORPGs) has aroused the interest of researchers. Of the many approaches to constructing a recommender system, collaborative filtering (CF) has been the most successful. However, traditional CFs just lure customers into the purchasing action and overlook customers' satisfaction; moreover, these techniques always suffer from low accuracy under cold-start conditions. Therefore, a novel collaborative filtering (NCF) method is proposed to identify like-minded customers according to the preference similarity coefficient (PSC), which captures the correlation between the similarity of customers' characteristics and the similarity of customers' satisfaction levels with the product. Furthermore, the analytic hierarchy process (AHP) is used to determine the relative importance of each customer characteristic, and the improved ant colony optimisation (IACO) is adopted to generate the expression of the PSC. The IACO creates solutions using the Markov random walk model, which can accelerate the convergence of the algorithm and prevent prematurity. For a target customer whose neighbours can be found, the NCF can predict his satisfaction level towards the suggested products and recommend the acceptable ones. Under cold-start conditions, the NCF will generate the recommendation list by excluding items that other customers prefer.

  5. Career Goal-Based E-Learning Recommendation Using Enhanced Collaborative Filtering and PrefixSpan

    ERIC Educational Resources Information Center

    Ma, Xueying; Ye, Lu

    2018-01-01

    This article describes how e-learning recommender systems nowadays have applied different kinds of techniques to recommend personalized learning content for users based on their preference, goals, interests and background information. However, the cold-start problem which exists in traditional recommendation algorithms are still left over in…

  6. What Do You Recommend? Implementation and Analyses of Collaborative Information Filtering of Web Resources for Education.

    ERIC Educational Resources Information Center

    Recker, Mimi M.; Walker, Andrew; Lawless, Kimberly

    2003-01-01

    Examines results from one pilot study and two empirical studies of a collaborative filtering system applied in higher education settings. Explains the use of collaborative filtering in electronic commerce and suggests it can be adapted to education to help find useful Web resources and to bring people together with similar interests and beliefs.…

  7. Measuring Learner's Performance in E-Learning Recommender Systems

    ERIC Educational Resources Information Center

    Ghauth, Khairil Imran; Abdullah, Nor Aniza

    2010-01-01

    A recommender system is a piece of software that helps users to identify the most interesting and relevant learning items from a large number of items. Recommender systems may be based on collaborative filtering (by user ratings), content-based filtering (by keywords), and hybrid filtering (by both collaborative and content-based filtering).…

  8. Mississippi State University Center for Air Sea Technology. FY93 and FY 94 Research Program in Navy Ocean Modeling and Prediction

    DTIC Science & Technology

    1994-09-30

    relational versus object oriented DBMS, knowledge discovery, data models, metadata, data filtering, clustering techniques, and synthetic data. A secondary...The first was the investigation of AI/ES applications (knowledge discovery, data mining, and clustering). Here CAST collaborated with Dr. Fred Petry...knowledge discovery system based on clustering techniques; implemented an on-line data browser to the DBMS; completed preliminary efforts to apply object

  9. A Hybrid Approach using Collaborative filtering and Content based Filtering for Recommender System

    NASA Astrophysics Data System (ADS)

    Geetha, G.; Safa, M.; Fancy, C.; Saranya, D.

    2018-04-01

    In today’s digital world, it has become an irksome task to find content of one's liking in the endless variety of content being consumed, such as books, videos, articles, and movies. At the same time, there has been emerging growth among digital content providers, who want to engage as many users on their service as possible for the maximum time. This gave birth to the recommender system, wherein content providers recommend content to users according to the users’ taste and liking. In this paper we propose a movie recommendation system. Movie recommendation is important in our social life due to features such as suggesting a set of movies to users based on their interests or on the popularity of the movies. The proposed system has the ability to recommend movies to new users as well as to existing users. It mines movie databases to collect all the important information, such as popularity and attractiveness, required for recommendation. We use content-based and collaborative filtering, and also hybrid filtering, which combines the results of these two techniques, to construct a system that provides more precise movie recommendations.
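
    The hybrid scheme this record describes (and the weighted hybrid technique of record 1 above) reduces to a linear combination of two prediction scores. A minimal sketch, assuming comparable score scales and a hypothetical weight `alpha`; the cold-start fallback to the content score is one common way to cover items that collaborative filtering cannot score, not necessarily the authors' exact formulation:

```python
def weighted_hybrid(cf_score, cb_score, alpha=0.7):
    """Linear combination of a collaborative filtering (CF) score and a
    content-based (CB) score; alpha weights the CF score.

    If the CF score is unavailable (e.g., an item no user has rated),
    fall back to the content-based score alone, so cold-start items can
    still be recommended."""
    if cf_score is None:
        return cb_score
    return alpha * cf_score + (1 - alpha) * cb_score
```

    With `alpha = 0.5`, an item scored 4.0 by CF and 2.0 by content filtering receives 3.0, while an item with no CF score simply keeps its content score.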

  10. A collaborative filtering recommendation algorithm based on weighted SimRank and social trust

    NASA Astrophysics Data System (ADS)

    Su, Chang; Zhang, Butao

    2017-05-01

    Collaborative filtering is one of the most widely used recommendation technologies, but the data sparsity and cold-start problems of collaborative filtering algorithms are difficult to solve effectively. To alleviate the data sparsity problem, firstly, a weighted improved SimRank algorithm is proposed to compute the rating similarity between users in the rating data set. The improved SimRank can find more nearest neighbors for target users by exploiting the transitivity of rating similarity. Then, we build a trust network and introduce the calculation of trust degree from the trust relationship data set. Finally, we combine rating similarity and trust into a comprehensive similarity in order to find more appropriate nearest neighbors for the target user. Experimental results show that the proposed algorithm effectively improves the recommendation precision of the collaborative filtering algorithm.
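
    The comprehensive similarity described above blends rating similarity with a trust degree taken from the trust network. A hedged sketch, assuming trust is stored as a nested dict of direct trust values and propagated over at most one intermediate user; the function names and the blending weight are illustrative, not the paper's exact formulation:

```python
def trust_degree(trust, a, b):
    """Direct trust from a to b if present; otherwise the best
    one-intermediate propagation (product of the two direct trusts)."""
    if b in trust.get(a, {}):
        return trust[a][b]
    best = 0.0
    for mid, t1 in trust.get(a, {}).items():
        t2 = trust.get(mid, {}).get(b, 0.0)
        best = max(best, t1 * t2)
    return best

def comprehensive_similarity(rating_sim, trust_val, alpha=0.6):
    """Linear blend of rating similarity and trust degree."""
    return alpha * rating_sim + (1 - alpha) * trust_val
```

    A user with no co-rated items can still become a neighbor through a nonzero propagated trust, which is how the trust network compensates for rating sparsity.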

  11. Automatic detection of omissions in medication lists

    PubMed Central

    Duncan, George T; Neill, Daniel B; Padman, Rema

    2011-01-01

    Objective Evidence suggests that the medication lists of patients are often incomplete and could negatively affect patient outcomes. In this article, the authors propose the application of collaborative filtering methods to the medication reconciliation task. Given a current medication list for a patient, the authors employ collaborative filtering approaches to predict drugs the patient could be taking but are missing from their observed list. Design The collaborative filtering approach presented in this paper emerges from the insight that an omission in a medication list is analogous to an item a consumer might purchase from a product list. Online retailers use collaborative filtering to recommend relevant products using retrospective purchase data. In this article, the authors argue that patient information in electronic medical records, combined with artificial intelligence methods, can enhance medication reconciliation. The authors formulate the detection of omissions in medication lists as a collaborative filtering problem. Detection of omissions is accomplished using several machine-learning approaches. The effectiveness of these approaches is evaluated using medication data from three long-term care centers. The authors also propose several decision-theoretic extensions to the methodology for incorporating medical knowledge into recommendations. Results Results show that collaborative filtering identifies the missing drug in the top-10 list about 40–50% of the time and the therapeutic class of the missing drug 50%–65% of the time at the three clinics in this study. Conclusion Results suggest that collaborative filtering can be a valuable tool for reconciling medication lists, complementing currently recommended process-driven approaches. However, a one-size-fits-all approach is not optimal, and consideration should be given to context (eg, types of patients and drug regimens) and consequence (eg, the impact of omission on outcomes). PMID:21447497

  12. Automatic detection of omissions in medication lists.

    PubMed

    Hasan, Sharique; Duncan, George T; Neill, Daniel B; Padman, Rema

    2011-01-01

    Evidence suggests that the medication lists of patients are often incomplete and could negatively affect patient outcomes. In this article, the authors propose the application of collaborative filtering methods to the medication reconciliation task. Given a current medication list for a patient, the authors employ collaborative filtering approaches to predict drugs the patient could be taking but are missing from their observed list. The collaborative filtering approach presented in this paper emerges from the insight that an omission in a medication list is analogous to an item a consumer might purchase from a product list. Online retailers use collaborative filtering to recommend relevant products using retrospective purchase data. In this article, the authors argue that patient information in electronic medical records, combined with artificial intelligence methods, can enhance medication reconciliation. The authors formulate the detection of omissions in medication lists as a collaborative filtering problem. Detection of omissions is accomplished using several machine-learning approaches. The effectiveness of these approaches is evaluated using medication data from three long-term care centers. The authors also propose several decision-theoretic extensions to the methodology for incorporating medical knowledge into recommendations. Results show that collaborative filtering identifies the missing drug in the top-10 list about 40-50% of the time and the therapeutic class of the missing drug 50%-65% of the time at the three clinics in this study. Results suggest that collaborative filtering can be a valuable tool for reconciling medication lists, complementing currently recommended process-driven approaches. However, a one-size-fits-all approach is not optimal, and consideration should be given to context (eg, types of patients and drug regimens) and consequence (eg, the impact of omission on outcomes).
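
    The analogy the authors draw, that a missing drug is like a product a shopper is likely to buy next, can be illustrated with a simple item-to-item co-occurrence recommender. This is a simplified sketch of the general idea, not the machine-learning models evaluated in the paper; the drug names in the usage example are placeholders:

```python
from collections import Counter
from itertools import combinations

def cooccurrence_model(med_lists):
    """Count how often each pair of drugs appears together across
    historical medication lists."""
    pair_counts = Counter()
    for meds in med_lists:
        for a, b in combinations(sorted(set(meds)), 2):
            pair_counts[(a, b)] += 1
    return pair_counts

def rank_omissions(current_list, pair_counts, top_k=10):
    """Score every drug absent from the current list by its total
    co-occurrence with the drugs that are present, and return the
    top-k candidates as possible omissions."""
    scores = Counter()
    current = set(current_list)
    for (a, b), c in pair_counts.items():
        if a in current and b not in current:
            scores[b] += c
        elif b in current and a not in current:
            scores[a] += c
    return [drug for drug, _ in scores.most_common(top_k)]
```

    Given histories where metformin usually co-occurs with lisinopril, a list containing only metformin would rank lisinopril first among candidate omissions.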

  13. A 3D ultrasound scanner: real time filtering and rendering algorithms.

    PubMed

    Cifarelli, D; Ruggiero, C; Brusacà, M; Mazzarella, M

    1997-01-01

    The work described here has been carried out within a collaborative project between DIST and ESAOTE BIOMEDICA aiming to set up a new ultrasonic scanner performing 3D reconstruction. A system is being set up to process and display 3D ultrasonic data in a fast, economical and user-friendly way to help the physician during diagnosis. A comparison is presented among several algorithms for digital filtering, data segmentation and rendering for real-time, PC-based, three-dimensional reconstruction from B-mode ultrasonic biomedical images. Several digital filtering algorithms have been compared with respect to processing time and final image quality. Three-dimensional data segmentation and rendering techniques have been evaluated with special reference to user-friendly features for foreseeable applications and to reconstruction speed.

  14. Collaborative filtering recommendation model based on fuzzy clustering algorithm

    NASA Astrophysics Data System (ADS)

    Yang, Ye; Zhang, Yunhua

    2018-05-01

    As one of the most widely used algorithms in recommender systems, the collaborative filtering algorithm faces two serious problems: data sparsity and poor recommendation quality in a big data environment. In traditional clustering analysis, each object is strictly assigned to one of several classes and the boundary of this division is very clear. However, for most objects in real life, there is no strict definition of the form and attributes of their class. Concerning the problems above, this paper proposes to improve the traditional collaborative filtering model through a hybrid optimization of a latent semantic algorithm and a fuzzy clustering algorithm, cooperating with the collaborative filtering algorithm. The fuzzy clustering algorithm is introduced to cluster item attribute information fuzzily, so that each item belongs to different item categories with different membership degrees. This increases the density of the data, effectively reduces its sparsity, and addresses the low accuracy that results from inaccurate similarity calculation. Finally, this paper carries out an empirical analysis on the MovieLens dataset and compares the proposed algorithm with the traditional user-based collaborative filtering algorithm, showing greatly improved recommendation accuracy.

  15. Improved collaborative filtering recommendation algorithm of similarity measure

    NASA Astrophysics Data System (ADS)

    Zhang, Baofu; Yuan, Baoping

    2017-05-01

    The collaborative filtering recommendation algorithm is one of the most widely used algorithms in personalized recommender systems. The key is to find the nearest-neighbor set of the active user by using a similarity measure. However, traditional similarity measures mainly focus on similarity over the users' common rated items, but ignore the relationship between the common rated items and all the items a user rates. Moreover, because the rating matrix is very sparse, the traditional collaborative filtering recommendation algorithm is not highly efficient. In order to obtain better accuracy, based on consideration of the common preference between users, the difference in rating scales, and the scores of common items, this paper presents an improved similarity measure, and, based on this measure, proposes a collaborative filtering recommendation algorithm with improved similarity. Experimental results show that the algorithm can effectively improve the quality of recommendation, thus alleviating the impact of data sparseness.
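
    One way to realize the kind of improvement this abstract describes, penalizing similarity that rests on only a small shared fraction of each user's ratings, is to damp the Pearson correlation over co-rated items by the Jaccard overlap of the two rated-item sets. A sketch under that assumption, not necessarily the authors' exact measure:

```python
import math

def improved_similarity(ratings_u, ratings_v):
    """Pearson correlation on co-rated items, damped by the Jaccard
    overlap of the two users' rated-item sets, so that users sharing
    only a tiny fraction of their ratings get a lower similarity."""
    common = set(ratings_u) & set(ratings_v)
    if len(common) < 2:
        return 0.0
    mu_u = sum(ratings_u[i] for i in common) / len(common)
    mu_v = sum(ratings_v[i] for i in common) / len(common)
    num = sum((ratings_u[i] - mu_u) * (ratings_v[i] - mu_v) for i in common)
    den = math.sqrt(sum((ratings_u[i] - mu_u) ** 2 for i in common)) * \
          math.sqrt(sum((ratings_v[i] - mu_v) ** 2 for i in common))
    if den == 0:
        return 0.0
    jaccard = len(common) / len(set(ratings_u) | set(ratings_v))
    return (num / den) * jaccard
```

    Two users with identically shaped ratings over all their items score 1.0, while the same agreement over only part of one user's history is scaled down by the overlap fraction.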

  16. A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence.

    PubMed

    Alphy, Anna; Prabakaran, S

    2015-01-01

    In modern days, to enrich e-business, websites are personalized for each user by understanding their interests and behavior. The main challenges of online usage data are information overload and its dynamic nature. To address these issues, this paper proposes WebBluegillRecom-annealing, a dynamic recommender system that uses web usage mining techniques in tandem with software agents to provide users with dynamic recommendations that can be used for customizing a website. The proposed WebBluegillRecom-annealing dynamic recommender uses swarm intelligence derived from the foraging behavior of the bluegill fish. It overcomes information overload by handling the dynamic behaviors of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, F1 measure, and scalability than the traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in the recommendations.

  17. Can Dissimilar Users Contribute to Accuracy and Diversity of Personalized Recommendation?

    NASA Astrophysics Data System (ADS)

    Zeng, Wei; Shang, Ming-Sheng; Zhang, Qian-Ming; Lü, Linyuan; Zhou, Tao

    Recommender systems are becoming a popular and important set of personalization techniques that assist individual users in navigating through the rapidly growing amount of information. A good recommender system should be able not only to find the objects preferred by users, but also to help users discover their personalized tastes. The former corresponds to high accuracy of the recommendation, the latter to high diversity. A big challenge is to design an algorithm that provides both highly accurate and diverse recommendations. Traditional recommendation algorithms only take into account the contributions of similar users; thus, they tend to recommend popular items, ignoring the diversity of recommendations. In this paper, we propose a recommendation algorithm that considers the effects of both similar and dissimilar users under the framework of collaborative filtering. Extensive analyses on three datasets, namely MovieLens, Netflix and Amazon, show that our method performs much better than the standard collaborative filtering algorithm in both accuracy and diversity.
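
    A minimal way to let dissimilar users contribute, in the spirit of this abstract: give negatively correlated neighbors a scaled negative weight, so items they like are pushed down in the ranking. The function and the scaling parameter `lam` are assumptions for illustration, not the paper's algorithm:

```python
def predict(user, item, ratings, sims, lam=0.8):
    """Weighted score using both similar (sim > 0) and dissimilar
    (sim < 0) users.  Dissimilar users contribute with weight lam,
    and an item a dissimilar user rates highly is *penalized*."""
    num = den = 0.0
    for other, r in ratings.items():
        if other == user or item not in r:
            continue
        s = sims.get((user, other), 0.0)
        w = s if s > 0 else lam * s     # negative weight penalizes
        num += w * r[item]
        den += abs(w)
    return num / den if den else 0.0
```

    Setting `lam = 0` recovers the standard similar-users-only collaborative filtering score, so `lam` directly controls how much the dissimilar users diversify the ranking.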

  18. A Dynamic Recommender System for Improved Web Usage Mining and CRM Using Swarm Intelligence

    PubMed Central

    Alphy, Anna; Prabakaran, S.

    2015-01-01

    In modern days, to enrich e-business, websites are personalized for each user by understanding their interests and behavior. The main challenges of online usage data are information overload and its dynamic nature. To address these issues, this paper proposes WebBluegillRecom-annealing, a dynamic recommender system that uses web usage mining techniques in tandem with software agents to provide users with dynamic recommendations that can be used for customizing a website. The proposed WebBluegillRecom-annealing dynamic recommender uses swarm intelligence derived from the foraging behavior of the bluegill fish. It overcomes information overload by handling the dynamic behaviors of users. Our dynamic recommender system was compared against traditional collaborative filtering systems. The results show that the proposed system has higher precision, coverage, F1 measure, and scalability than the traditional collaborative filtering systems. Moreover, the recommendations given by our system overcome the overspecialization problem by including variety in the recommendations. PMID:26229978

  19. Efficient OCT Image Enhancement Based on Collaborative Shock Filtering

    PubMed Central

    2018-01-01

    Efficient enhancement of noisy optical coherence tomography (OCT) images is a key task for interpreting them correctly. In this paper, to better enhance the details and layered structures of human retina images, we propose a collaborative shock filtering approach for OCT image denoising and enhancement. The noisy OCT image is first denoised by a collaborative filtering method with a new similarity measure, and the denoised image is then sharpened by a shock-type filtering for edge and detail enhancement. For dim OCT images, in order to improve image contrast for the detection of tiny lesions, a gamma transformation is first used to enhance the images within proper gray levels. The proposed method, integrating image smoothing and sharpening simultaneously, obtains better visual results in experiments. PMID:29599954

  20. Efficient OCT Image Enhancement Based on Collaborative Shock Filtering.

    PubMed

    Liu, Guohua; Wang, Ziyu; Mu, Guoying; Li, Peijin

    2018-01-01

    Efficient enhancement of noisy optical coherence tomography (OCT) images is a key task for interpreting them correctly. In this paper, to better enhance the details and layered structures of human retina images, we propose a collaborative shock filtering approach for OCT image denoising and enhancement. The noisy OCT image is first denoised by a collaborative filtering method with a new similarity measure, and the denoised image is then sharpened by a shock-type filtering for edge and detail enhancement. For dim OCT images, in order to improve image contrast for the detection of tiny lesions, a gamma transformation is first used to enhance the images within proper gray levels. The proposed method, integrating image smoothing and sharpening simultaneously, obtains better visual results in experiments.

  1. Enhancing collaborative filtering by user interest expansion via personalized ranking.

    PubMed

    Liu, Qi; Chen, Enhong; Xiong, Hui; Ding, Chris H Q; Chen, Jian

    2012-02-01

    Recommender systems suggest a few items from many possible choices to users by understanding their past behaviors. In these systems, user behaviors are influenced by the users' hidden interests. Learning to leverage information about user interests is often critical for making better recommendations. However, existing collaborative-filtering-based recommender systems usually focus on exploiting information about the user's interaction with the system; information about latent user interests is largely underexplored. To that end, inspired by topic models, in this paper we propose a novel collaborative-filtering-based recommender system built on user interest expansion via personalized ranking, named iExpand. The goal is to build an item-oriented, model-based collaborative filtering framework. The iExpand method introduces a three-layer user-interests-item representation scheme, which leads to more accurate ranking results with less computation cost and helps in understanding the interactions among users, items, and user interests. Moreover, iExpand strategically deals with many issues that exist in traditional collaborative filtering approaches, such as the overspecialization problem and the cold-start problem. Finally, we evaluate iExpand on three benchmark data sets; experimental results show that iExpand leads to better ranking performance than state-of-the-art methods by a significant margin.

  2. A filter-mediated communication model for design collaboration in building construction.

    PubMed

    Lee, Jaewook; Jeong, Yongwook; Oh, Minho; Hong, Seung Wan

    2014-01-01

    Multidisciplinary collaboration is an important aspect of modern engineering activities, arising from the growing complexity of artifacts whose design and construction require knowledge and skills that exceed the capacities of any one professional. However, current collaboration in the architecture, engineering, and construction industries often fails due to lack of shared understanding between different participants and limitations of their supporting tools. To achieve a high level of shared understanding, this study proposes a filter-mediated communication model. In the proposed model, participants retain their own data in the form most appropriate for their needs with domain-specific filters that transform the neutral representations into semantically rich ones, as needed by the participants. Conversely, the filters can translate semantically rich, domain-specific data into a neutral representation that can be accessed by other domain-specific filters. To validate the feasibility of the proposed model, we computationally implement the filter mechanism and apply it to a hypothetical test case. The result acknowledges that the filter mechanism can let the participants know ahead of time what will be the implications of their proposed actions, as seen from other participants' points of view.

  3. Collaborative Information Filtering in Cooperative Communities.

    ERIC Educational Resources Information Center

    Okamoto, T.; Miyahara, K.

    1998-01-01

    The purpose of this study was to develop an information filtering system which collects, classifies, selects, and stores various kinds of information found through the Internet. A collaborative form of information gathering was examined and a model was built and implemented in the Internet information space. (AEF)

  4. New similarity of triangular fuzzy number and its application.

    PubMed

    Zhang, Xixiang; Ma, Weimin; Chen, Liping

    2014-01-01

    The similarity of triangular fuzzy numbers is an important metric for their application. Several approaches exist to measure the similarity of triangular fuzzy numbers; however, some of them tend to yield values that are too large. To make the similarity well distributed, a new method, SIAM (Shape's Indifferent Area and Midpoint), is put forward to measure triangular fuzzy numbers, taking the shape's indifferent area and the midpoints of two triangular fuzzy numbers into consideration. Comparison with other similarity measurements shows the effectiveness of the proposed method. It is then applied to collaborative filtering recommendation to measure users' similarity. A collaborative filtering case is used to illustrate users' similarity based on the cloud model and on triangular fuzzy numbers; the result indicates that users' similarity based on triangular fuzzy numbers achieves better discrimination. Finally, a simulated collaborative filtering recommendation system is developed that uses the cloud model and triangular fuzzy numbers to express users' comprehensive evaluations of items; the results show that the accuracy of collaborative filtering recommendation based on triangular fuzzy numbers is higher.
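
    A toy similarity for triangular fuzzy numbers `(a, b, c)` that combines a midpoint term and an area term, in the spirit of (but not identical to) the SIAM measure; it assumes numbers normalized to [0, 1]:

```python
def tfn_similarity(x, y):
    """Similarity of two triangular fuzzy numbers (a, b, c), combining
    midpoint distance and support-area difference.  This is a
    simplified illustration of a shape/midpoint measure, not the
    paper's exact SIAM formula."""
    mx = (x[0] + 2 * x[1] + x[2]) / 4.0   # weighted midpoint
    my = (y[0] + 2 * y[1] + y[2]) / 4.0
    area_x = (x[2] - x[0]) / 2.0          # area under the membership triangle
    area_y = (y[2] - y[0]) / 2.0
    return (1 - abs(mx - my)) * (1 - abs(area_x - area_y))
```

    Identical numbers score 1.0; shifting the midpoint or widening the support each lowers the similarity, which is what keeps the measure well distributed rather than uniformly large.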

  5. Integrated approach for automatic target recognition using a network of collaborative sensors.

    PubMed

    Mahalanobis, Abhijit; Van Nevel, Alan

    2006-10-01

    We introduce what is believed to be a novel concept by which several sensors with automatic target recognition (ATR) capability collaborate to recognize objects. Such an approach would be suitable for netted systems in which the sensors and platforms can coordinate to optimize end-to-end performance. We use correlation filtering techniques to facilitate the development of the concept, although other ATR algorithms may be easily substituted. Essentially, a self-configuring geometry of netted platforms is proposed that positions the sensors optimally with respect to each other, and takes into account the interactions among the sensor, the recognition algorithms, and the classes of the objects to be recognized. We show how such a paradigm optimizes overall performance, and illustrate the collaborative ATR scheme for recognizing targets in synthetic aperture radar imagery by using viewing position as a sensor parameter.

  6. An Invitation to Open Innovation in Malaria Drug Discovery: 47 Quality Starting Points from the TCAMS.

    PubMed

    Calderón, Félix; Barros, David; Bueno, José María; Coterón, José Miguel; Fernández, Esther; Gamo, Francisco Javier; Lavandera, José Luís; León, María Luisa; Macdonald, Simon J F; Mallo, Araceli; Manzano, Pilar; Porras, Esther; Fiandor, José María; Castro, Julia

    2011-10-13

    In 2010, GlaxoSmithKline published the structures of 13533 chemical starting points for antimalarial lead identification. By using an agglomerative structural clustering technique followed by computational filters such as antimalarial activity, physicochemical properties, and dissimilarity to known antimalarial structures, we have identified 47 starting points for lead optimization. Their structures are provided. We invite potential collaborators to work with us to discover new clinical candidates.

  7. Ultrasound Image Despeckling Using Stochastic Distance-Based BM3D.

    PubMed

    Santos, Cid A N; Martins, Diego L N; Mascarenhas, Nelson D A

    2017-06-01

    Ultrasound image despeckling is an important research field, since it can improve the interpretability of one of the main categories of medical imaging. Many techniques have been tried over the years for ultrasound despeckling, and more recently, a great deal of attention has been focused on patch-based methods, such as non-local means and block-matching collaborative filtering (BM3D). A common idea in these recent methods is the measure of distance between patches, originally proposed as the Euclidean distance, for filtering additive white Gaussian noise. In this paper, we derive new stochastic distances for the Fisher-Tippett distribution, based on well-known statistical divergences, and use them as patch distance measures in a modified version of the BM3D algorithm for despeckling log-compressed ultrasound images. State-of-the-art results in filtering simulated, synthetic, and real ultrasound images confirm the potential of the proposed approach.

  8. A Filter-Mediated Communication Model for Design Collaboration in Building Construction

    PubMed Central

    Oh, Minho

    2014-01-01

    Multidisciplinary collaboration is an important aspect of modern engineering activities, arising from the growing complexity of artifacts whose design and construction require knowledge and skills that exceed the capacities of any one professional. However, current collaboration in the architecture, engineering, and construction industries often fails due to lack of shared understanding between different participants and limitations of their supporting tools. To achieve a high level of shared understanding, this study proposes a filter-mediated communication model. In the proposed model, participants retain their own data in the form most appropriate for their needs with domain-specific filters that transform the neutral representations into semantically rich ones, as needed by the participants. Conversely, the filters can translate semantically rich, domain-specific data into a neutral representation that can be accessed by other domain-specific filters. To validate the feasibility of the proposed model, we computationally implement the filter mechanism and apply it to a hypothetical test case. The result acknowledges that the filter mechanism can let the participants know ahead of time what will be the implications of their proposed actions, as seen from other participants' points of view. PMID:25309958

  9. High-Performance Monitoring Architecture for Large-Scale Distributed Systems Using Event Filtering

    NASA Technical Reports Server (NTRS)

    Maly, K.

    1998-01-01

    Monitoring is an essential process to observe and improve the reliability and the performance of large-scale distributed (LSD) systems. In an LSD environment, a large number of events are generated by the system components during execution or interaction with external objects (e.g. users or processes). Monitoring such events is necessary for observing the run-time behavior of LSD systems and providing the status information required for debugging, tuning and managing such applications. However, correlated events are generated concurrently and may be distributed across various locations in the application environment, which complicates the management decision process and thereby makes monitoring LSD systems an intricate task. We propose a scalable high-performance monitoring architecture for LSD systems to detect and classify interesting local and global events and disseminate the monitoring information to the corresponding end-point management applications, such as debugging and reactive control tools, to improve application performance and reliability. A large volume of events may be generated due to the extensive demands of the monitoring applications and the high interaction of LSD systems. The monitoring architecture employs a high-performance event filtering mechanism to efficiently process the large volume of event traffic generated by LSD systems and to minimize the intrusiveness of the monitoring process by reducing the event traffic flow in the system and distributing the monitoring computation. Our architecture also supports dynamic and flexible reconfiguration of the monitoring mechanism via its instrumentation and subscription components. As a case study, we show how our monitoring architecture can be utilized to improve the reliability and the performance of the Interactive Remote Instruction (IRI) system, which is a large-scale distributed system for collaborative distance learning.
The filtering mechanism represents an intrinsic component integrated with the monitoring architecture to reduce the volume of event traffic flow in the system, and thereby reduce the intrusiveness of the monitoring process. We are developing an event filtering architecture to efficiently process the large volume of event traffic generated by LSD systems (such as distributed interactive applications). This filtering architecture is used to monitor a collaborative distance-learning application to obtain debugging and feedback information. Our architecture supports the dynamic (re)configuration and optimization of event filters in large-scale distributed systems. Our work represents a major contribution by (1) surveying and evaluating existing event filtering mechanisms for monitoring LSD systems and (2) devising an integrated, scalable, high-performance event filtering architecture that spans several key application domains, presenting techniques to improve functionality, performance and scalability. This paper describes the primary characteristics and challenges of developing high-performance event filtering for monitoring LSD systems. We survey existing event filtering mechanisms and explain the key characteristics of each technique. In addition, we discuss limitations of existing event filtering mechanisms and outline how our architecture improves key aspects of event filtering.

  10. Image denoising by sparse 3-D transform-domain collaborative filtering.

    PubMed

    Dabov, Kostadin; Foi, Alessandro; Katkovnik, Vladimir; Egiazarian, Karen

    2007-08-01

    We propose a novel image denoising strategy based on an enhanced sparse representation in transform domain. The enhancement of the sparsity is achieved by grouping similar 2-D image fragments (e.g., blocks) into 3-D data arrays which we call "groups." Collaborative filtering is a special procedure developed to deal with these 3-D groups. We realize it using the three successive steps: 3-D transformation of a group, shrinkage of the transform spectrum, and inverse 3-D transformation. The result is a 3-D estimate that consists of the jointly filtered grouped image blocks. By attenuating the noise, the collaborative filtering reveals even the finest details shared by grouped blocks and, at the same time, it preserves the essential unique features of each individual block. The filtered blocks are then returned to their original positions. Because these blocks are overlapping, for each pixel, we obtain many different estimates which need to be combined. Aggregation is a particular averaging procedure which is exploited to take advantage of this redundancy. A significant improvement is obtained by a specially developed collaborative Wiener filtering. An algorithm based on this novel denoising strategy and its efficient implementation are presented in full detail; an extension to color-image denoising is also developed. The experimental results demonstrate that this computationally scalable algorithm achieves state-of-the-art denoising performance in terms of both peak signal-to-noise ratio and subjective visual quality.
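
    The core shrinkage step described above — transform a group of mutually similar samples jointly, discard small coefficients, transform back — can be shown in a deliberately tiny form. The following is a one-dimensional, two-point Haar toy, not BM3D itself: real BM3D groups 2-D patches into 3-D stacks and applies separable 3-D transforms with aggregation and Wiener refinement.

```python
import math

def collaborative_hard_threshold(group, thr):
    """Toy version of collaborative hard thresholding: pair up similar
    samples, take a 2-point Haar average/difference transform, zero
    differences smaller than thr (treated as noise the grouped samples
    do not share), then invert the transform."""
    out = list(group)
    for i in range(0, len(out) - 1, 2):
        a = (out[i] + out[i + 1]) / math.sqrt(2)   # average coefficient
        d = (out[i] - out[i + 1]) / math.sqrt(2)   # difference coefficient
        if abs(d) < thr:                           # noise-like difference: discard
            d = 0.0
        out[i], out[i + 1] = (a + d) / math.sqrt(2), (a - d) / math.sqrt(2)
    return out
```

    Two noisy observations of the same underlying value, e.g. 10.0 and 10.2, collapse to their joint estimate 10.1, which is exactly the "finest details shared by grouped blocks survive, unshared noise does not" behavior the abstract describes.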

  11. Empirical comparison of local structural similarity indices for collaborative-filtering-based recommender systems

    NASA Astrophysics Data System (ADS)

    Zhang, Qian-Ming; Shang, Ming-Sheng; Zeng, Wei; Chen, Yong; Lü, Linyuan

    2010-08-01

    Collaborative filtering is one of the most successful recommendation techniques; it can effectively predict the possible future likes of users based on their past preferences. The key problem of this method is how to define the similarity between users. A standard approach uses the correlation between the ratings that two users give to a set of objects, such as the Cosine index and the Pearson correlation coefficient. However, the cost of computing this kind of index is relatively high, making it impractical for huge systems. To solve this problem, in this paper we introduce six local-structure-based similarity indices and compare their performance with the above two benchmark indices. Experimental results on two data sets demonstrate that the structure-based similarity indices overall outperform the Pearson correlation coefficient. When the data is dense, the structure-based indices perform competitively with the Cosine index, at lower computational complexity; when the data is sparse, they give even better results than the Cosine index.
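
    Local structural indices of this kind operate on the user-object bipartite graph and need only set operations on neighbor lists, which is what makes them cheap. Two illustrative examples (the paper's exact set of six indices may differ from these):

```python
import math

def cn_similarity(collected, u, v):
    """Common-neighbors index: the number of objects both users
    have collected.  Ignores rating values entirely."""
    return len(set(collected[u]) & set(collected[v]))

def salton_similarity(collected, u, v):
    """Salton (cosine-on-sets) index: common neighbors normalized by
    the geometric mean of the two users' degrees, so very active users
    are not automatically similar to everyone."""
    cn = cn_similarity(collected, u, v)
    deg = math.sqrt(len(collected[u]) * len(collected[v]))
    return cn / deg if deg else 0.0
```

    Because each index touches only the two users' collected-object sets, the cost per pair is linear in their degrees, rather than in the number of co-rated items weighted by rating arithmetic as with Pearson.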

  12. Coarse cluster enhancing collaborative recommendation for social network systems

    NASA Astrophysics Data System (ADS)

    Zhao, Yao-Dong; Cai, Shi-Min; Tang, Ming; Shang, Min-Sheng

    2017-10-01

    Traditional collaborative-filtering-based recommender systems for social network systems place very high demands on time complexity, since they compute similarities for all pairs of users from resource usages and annotation actions, which strongly limits recommendation speed. In this paper, to overcome this drawback, we propose a novel approach, coarse clustering, which rapidly partitions similar users and their associated items, and use it to enhance user-based collaborative filtering, yielding a fast collaborative user model for social tagging systems. Experimental results on the Delicious dataset show that the proposed model reduces processing time by more than 90% and improves accuracy relative to ordinary user-based collaborative filtering, and that it is robust to the choice of initial parameter. Most importantly, the proposed model can be conveniently extended with additional user information (e.g., profiles) and practically applied to large-scale social network systems to increase recommendation speed without loss of accuracy.
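    The coarse-cluster idea, partitioning users first and then running user-based collaborative filtering only inside each partition, can be sketched as follows. The partition rule here (assign each user to the cluster of their most-used item) is a stand-in for the paper's method, chosen only to show how the number of similarity computations drops:

```python
import numpy as np
from collections import defaultdict

# Toy usage matrix: rows are users, columns are items (e.g. bookmarks).
R = np.array([[1, 1, 0, 0, 0],
              [1, 1, 1, 0, 0],
              [0, 0, 0, 1, 1],
              [0, 0, 1, 1, 1]], dtype=float)

# Coarse partition: assign each user to the cluster of their most-used item.
clusters = defaultdict(list)
for u, row in enumerate(R):
    clusters[int(np.argmax(row))].append(u)

def cluster_similarities(R, members):
    """Cosine user-user similarities computed only among one cluster's members."""
    sub = R[members]
    norms = np.linalg.norm(sub, axis=1)
    return (sub @ sub.T) / np.outer(norms, norms)

sims_by_cluster = {c: cluster_similarities(R, m) for c, m in clusters.items()}

# Only within-cluster pairs are computed, instead of all user pairs.
pairs = sum(len(m) * (len(m) - 1) // 2 for m in clusters.values())
print(pairs, "of", R.shape[0] * (R.shape[0] - 1) // 2, "user pairs computed")
```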

  13. Model-Based Collaborative Filtering Analysis of Student Response Data: Machine-Learning Item Response Theory

    ERIC Educational Resources Information Center

    Bergner, Yoav; Droschler, Stefan; Kortemeyer, Gerd; Rayyan, Saif; Seaton, Daniel; Pritchard, David E.

    2012-01-01

    We apply collaborative filtering (CF) to dichotomously scored student response data (right, wrong, or no interaction), finding optimal parameters for each student and item based on cross-validated prediction accuracy. The approach is naturally suited to comparing different models, both unidimensional and multidimensional in ability, including a…

  14. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection

    PubMed Central

    Wang, Peng; Yang, Jing; Zhang, Jianpei

    2018-01-01

    A new collaborative filtering recommendation strategy is proposed for the privacy and security issues that arise in location services. In this strategy, every user builds position profiles from his or her daily position data, which are preprocessed with a density clustering method. Density prioritization is then used to choose similar user groups as service request responders, and the neighboring users in the chosen groups recommend appropriate location services through a collaborative filtering recommendation algorithm. Two filtering algorithms are designed, based on position-profile similarity and position-point similarity measures, respectively. At the same time, homomorphic encryption is used to transfer location data for effective protection of privacy and security. A real location dataset was used to test the proposed strategy, and the results show that the strategy provides better location service while protecting users’ privacy. PMID:29751670

  15. A Strategy toward Collaborative Filter Recommended Location Service for Privacy Protection.

    PubMed

    Wang, Peng; Yang, Jing; Zhang, Jianpei

    2018-05-11

    A new collaborative filtering recommendation strategy is proposed for the privacy and security issues that arise in location services. In this strategy, every user builds position profiles from his or her daily position data, which are preprocessed with a density clustering method. Density prioritization is then used to choose similar user groups as service request responders, and the neighboring users in the chosen groups recommend appropriate location services through a collaborative filtering recommendation algorithm. Two filtering algorithms are designed, based on position-profile similarity and position-point similarity measures, respectively. At the same time, homomorphic encryption is used to transfer location data for effective protection of privacy and security. A real location dataset was used to test the proposed strategy, and the results show that the strategy provides better location service while protecting users' privacy.

  16. Improved Collaborative Filtering Algorithm via Information Transformation

    NASA Astrophysics Data System (ADS)

    Liu, Jian-Guo; Wang, Bing-Hong; Guo, Qiang

    In this paper, we propose a spreading activation approach for collaborative filtering (SA-CF). By using an opinion spreading process, the similarity between any pair of users can be obtained. The algorithm has remarkably higher accuracy than standard collaborative filtering based on the Pearson correlation. Furthermore, we introduce a free parameter β to regulate the contributions of objects to user-user correlations. The numerical results indicate that decreasing the influence of popular objects can further improve accuracy and personalization. We argue that a better algorithm should require less computation and, at the same time, deliver higher accuracy. Accordingly, we further propose an algorithm that involves only the top-N similar neighbors of each target user, which has both lower computational complexity and higher accuracy.
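    The final top-N variant described above, predicting from only the N most similar neighbors who rated the target item, can be sketched as follows. Plain cosine similarity is used here as a stand-in for the opinion-spreading similarity of SA-CF:

```python
import numpy as np

def predict(R, u, i, n_neighbors=2):
    """Predict user u's rating of item i from the top-N most similar raters only."""
    norms = np.linalg.norm(R, axis=1)
    sims = (R @ R[u]) / (norms * norms[u])   # cosine user-user similarity
    sims[u] = -np.inf                        # exclude the target user
    rated = np.where(R[:, i] > 0)[0]         # users who actually rated item i
    top = rated[np.argsort(sims[rated])[::-1][:n_neighbors]]
    weights = sims[top]
    return float(weights @ R[top, i] / weights.sum())

R = np.array([[5, 3, 0],
              [4, 3, 4],
              [5, 4, 5],
              [1, 5, 2]], dtype=float)
pred = predict(R, 0, 2)   # estimate user 0's unknown rating of item 2
print(round(pred, 2))
```

    Restricting the weighted average to the top-N neighbors both reduces the computation per prediction and discards the weakly similar users that tend to add noise.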

  17. Using Collaborative Filtering to Support College Students' Use of Online Forum for English Learning

    ERIC Educational Resources Information Center

    Wang, Pei-Yu; Yang, Hui-Chun

    2012-01-01

    This study examined the impact of collaborative filtering (the so-called recommender) on college students' use of an online forum for English learning. The forum was created with an open-source software, Drupal, and its extended recommender module. This study was guided by three main questions: 1) Is there any difference in online behaviors…

  18. Collaborative Filtering Recommendation on Users' Interest Sequences.

    PubMed

    Cheng, Weijie; Yin, Guisheng; Dong, Yuxin; Dong, Hongbin; Zhang, Wansong

    2016-01-01

    As an important factor for improving recommendations, time information has been introduced to model users' dynamic preferences in many papers. However, the sequence of users' behavior is rarely studied in recommender systems. Because of users' unique behavior evolution patterns and personalized interest transitions among items, users' similarity in the sequential dimension should be introduced to further distinguish users' preferences and interests. In this paper, we propose a new collaborative filtering recommendation method based on users' interest sequences (IS), which order users' ratings and other online behaviors by the timestamps at which they occurred. The method extracts the semantics hidden in the interest sequences through the length of users' longest common sub-IS (LCSIS) and the count of users' total common sub-IS (ACSIS). These semantics are then utilized to obtain users' IS-based similarities and, further, to refine the similarities acquired from traditional collaborative filtering approaches. With these updated similarities, the transition characteristics and dynamic evolution patterns of users' preferences are taken into account. The proposed method was compared with state-of-the-art time-aware collaborative filtering algorithms on the MovieLens, Flixster and Ciao datasets. The experimental results validate that the proposed recommendation method is effective and outperforms several existing algorithms in rating prediction accuracy.

  19. Collaborative Filtering Recommendation on Users’ Interest Sequences

    PubMed Central

    Cheng, Weijie; Yin, Guisheng; Dong, Yuxin; Dong, Hongbin; Zhang, Wansong

    2016-01-01

    As an important factor for improving recommendations, time information has been introduced to model users’ dynamic preferences in many papers. However, the sequence of users’ behavior is rarely studied in recommender systems. Because of users’ unique behavior evolution patterns and personalized interest transitions among items, users’ similarity in the sequential dimension should be introduced to further distinguish users’ preferences and interests. In this paper, we propose a new collaborative filtering recommendation method based on users’ interest sequences (IS), which order users’ ratings and other online behaviors by the timestamps at which they occurred. The method extracts the semantics hidden in the interest sequences through the length of users’ longest common sub-IS (LCSIS) and the count of users’ total common sub-IS (ACSIS). These semantics are then utilized to obtain users’ IS-based similarities and, further, to refine the similarities acquired from traditional collaborative filtering approaches. With these updated similarities, the transition characteristics and dynamic evolution patterns of users’ preferences are taken into account. The proposed method was compared with state-of-the-art time-aware collaborative filtering algorithms on the MovieLens, Flixster and Ciao datasets. The experimental results validate that the proposed recommendation method is effective and outperforms several existing algorithms in rating prediction accuracy. PMID:27195787
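    The core ingredient of the method, the longest common sub-sequence of two users' interest sequences (LCSIS), can be computed with standard dynamic programming. The sketch below is illustrative; the normalisation of the similarity is an assumption, not the paper's exact formula:

```python
def lcs_length(a, b):
    """Length of the longest common subsequence of two interest sequences."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a)):
        for j in range(len(b)):
            if a[i] == b[j]:
                dp[i + 1][j + 1] = dp[i][j] + 1     # items match: extend the LCS
            else:
                dp[i + 1][j + 1] = max(dp[i][j + 1], dp[i + 1][j])
    return dp[len(a)][len(b)]

# Items ordered by the timestamps of each user's ratings:
u1 = ["m1", "m3", "m2", "m5"]
u2 = ["m1", "m2", "m4", "m5"]
sim = lcs_length(u1, u2) / max(len(u1), len(u2))   # assumed normalisation
print(sim)
```

    Here both users consumed m1, m2 and m5 in the same relative order, so the sequential similarity is 3/4 even though their full rating vectors differ.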

  20. Singular value decomposition for collaborative filtering on a GPU

    NASA Astrophysics Data System (ADS)

    Kato, Kimikazu; Hosino, Tikara

    2010-06-01

    Collaborative filtering predicts customers' unknown preferences from their known preferences. In collaborative filtering computations, a singular value decomposition (SVD) is needed to reduce the size of a large matrix so that the burden of the subsequent computation is decreased. In this application, SVD means a rough approximate factorization of a given matrix into smaller matrices. Webb (a.k.a. Simon Funk) presented an effective algorithm to compute such an SVD for the open competition known as the "Netflix Prize". The algorithm uses an iterative method in which the approximation error improves at each step. We give a GPU version of Webb's algorithm, implemented in CUDA, and show by experiment that it is efficient.
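    Webb's iterative method learns the factor matrices by stochastic gradient descent over the known ratings only, so each pass reduces the approximation error. A CPU sketch of the idea follows; the hyperparameters and toy data are illustrative, and the GPU parallelisation of the paper is omitted:

```python
import numpy as np

def funk_svd(ratings, n_users, n_items, k=2, lr=0.01, reg=0.02, epochs=500):
    """Funk-style approximate SVD: learn factors U, V by SGD on known ratings only."""
    rng = np.random.default_rng(0)
    U = 0.1 * rng.standard_normal((n_users, k))
    V = 0.1 * rng.standard_normal((n_items, k))
    for _ in range(epochs):
        for u, i, r in ratings:
            err = r - U[u] @ V[i]                   # error on one known rating
            U[u] += lr * (err * V[i] - reg * U[u])  # gradient step for the user
            V[i] += lr * (err * U[u] - reg * V[i])  # and for the item factors
    return U, V

# (user, item, rating) triples; unknown entries are simply absent.
ratings = [(0, 0, 5.0), (0, 1, 3.0), (1, 0, 4.0), (1, 2, 1.0), (2, 1, 4.0)]
U, V = funk_svd(ratings, n_users=3, n_items=3)
pred = float(U[0] @ V[0])   # should approach the observed rating of 5.0
print(round(pred, 2))
```

    The rank-k product U Vᵀ then fills in the unrated entries, which is the input to the next phase of the collaborative filtering computation.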

  1. Correction of false memory for associated word lists by collaborating groups.

    PubMed

    Weigold, Arne; Russell, Elizabeth J; Natera, Sara N

    2014-01-01

    Collaborative inhibition is often observed for both correct and false memories. However, research examining the mechanisms by which collaborative inhibition occurs, such as retrieval disruption, reality monitoring, or group filtering, is lacking. In addition, the creation of the nominal groups (i.e., groups artificially formed by combining individuals' recall) necessary for examining collaborative inhibition does not follow statistical best practices. Using the Deese-Roediger-McDermott paradigm, we examined the percentages of correct and false memories in individuals, collaborative interactive groups, and correctly created nominal groups, as well as the processes the collaborative interactive groups used to determine which memories to report. Results showed evidence of the collaborative inhibition effect. In addition, analyses of the collaborative interactive groups' discussions found that these groups wrote down almost all presented words but fewer than half of the nonpresented critical words after discussing them, with nonpresented critical words being stated to the group with lower confidence and rejected by other group members more often. Overall, our findings support the group filtering hypothesis.

  2. Advanced Sine Wave Modulation of Continuous Wave Laser System for Atmospheric CO2 Differential Absorption Measurements

    NASA Technical Reports Server (NTRS)

    Campbell, Joel F.; Lin, Bing; Nehrir, Amin R.

    2014-01-01

    NASA Langley Research Center, in collaboration with ITT Exelis, has been experimenting with a Continuous Wave (CW) laser absorption spectrometer (LAS) as a means of performing atmospheric CO2 column measurements from space to support the Active Sensing of CO2 Emissions over Nights, Days, and Seasons (ASCENDS) mission. Because the range-resolving Intensity Modulated (IM) CW lidar techniques presented here rely on matched filter correlations, autocorrelation properties without side lobes or other artifacts are highly desirable, since the autocorrelation function is critical for the measurements of lidar return powers, laser path lengths, and CO2 column amounts. In this paper, modulation techniques that improve autocorrelation properties are investigated, including sine waves modulated by maximum length (ML) sequences in various hardware configurations. A CW lidar system using sine waves modulated by ML pseudo-random noise codes is described, which uses a time-shifting approach to separate channels and make multiple, simultaneous online/offline differential absorption measurements. Unlike the pure ML sequence, this technique is useful in hardware that is band-pass filtered, as the IM sine wave carrier shifts the main power band. Both amplitude and Phase Shift Keying (PSK) modulated IM carriers are investigated that exhibit perfect autocorrelation properties down to one cycle per code bit. In addition, a method is presented to bandwidth-limit the ML sequence with a Gaussian filter implemented in terms of Jacobi theta functions, which does not seriously degrade the resolution or introduce side lobes, as a means of reducing aliasing and IM carrier bandwidth.
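    The appeal of ML sequences for matched-filter lidar is their ideal periodic autocorrelation: a single peak at zero lag and a flat -1 everywhere else. The sketch below generates a 127-chip ML sequence from the primitive polynomial x^7 + x + 1 and checks this property; the register length and taps are generic choices, not the paper's hardware configuration:

```python
import numpy as np

def ml_sequence(n_bits=7, seed=1):
    """Maximum-length sequence from a Fibonacci LFSR for x^7 + x + 1 (primitive)."""
    state, out = seed, []
    for _ in range((1 << n_bits) - 1):        # one full period: 2^7 - 1 = 127 chips
        out.append(state & 1)
        fb = (state ^ (state >> 1)) & 1       # feedback taps for x^7 + x + 1
        state = (state >> 1) | (fb << (n_bits - 1))
    return np.array(out)

seq = 2 * ml_sequence() - 1                   # map {0, 1} -> {-1, +1}
corr = np.array([int(seq @ np.roll(seq, k)) for k in range(len(seq))])
print(corr[0], corr[1])                       # peak at zero lag, flat elsewhere
```

    The flat off-peak correlation is what makes the matched-filter range estimate free of side lobes for the ideal, unfiltered code.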

  3. Collaborative Filtering for Expansion of Learner's Background Knowledge in Online Language Learning: Does "Top-Down" Processing Improve Vocabulary Proficiency?

    ERIC Educational Resources Information Center

    Yamada, Masanori; Kitamura, Satoshi; Matsukawa, Hideya; Misono, Tadashi; Kitani, Noriko; Yamauchi, Yuhei

    2014-01-01

    In recent years, collaborative filtering, a recommendation algorithm that incorporates a user's data such as interest, has received worldwide attention as an advanced learning support system. However, accurate recommendations along with a user's interest cannot be ideal as an effective learning environment. This study aims to develop and…

  4. A comparative study: classification vs. user-based collaborative filtering for clinical prediction.

    PubMed

    Hao, Fang; Blair, Rachael Hageman

    2016-12-08

    Recommender systems have shown tremendous value for the prediction of personalized item recommendations for individuals in a variety of settings (e.g., marketing, e-commerce, etc.). User-based collaborative filtering is a popular recommender system, which leverages an individual's prior satisfaction with items, as well as the satisfaction of individuals who are "similar". Recently, collaborative filtering based recommender systems have been applied to clinical risk prediction. In these applications, individuals represent patients, and items represent clinical data, including an outcome. Applying recommender systems to a problem of this type requires recasting a supervised learning problem as an unsupervised one. The rationale is that patients with similar clinical features carry a similar disease risk. As the "Big Data" era progresses, approaches of this type are likely to see increasing use as biomedical data continue to grow in both size and complexity (e.g., electronic health records). In the present study, we set out to understand and assess the performance of recommender systems in a controlled yet realistic setting. User-based collaborative filtering recommender systems are compared to logistic regression and random forests with different types of imputation and varying amounts of missingness on four publicly available medical data sets: the National Health and Nutrition Examination Survey (NHANES, 2011-2012, on obesity), the Study to Understand Prognoses Preferences Outcomes and Risks of Treatment (SUPPORT), chronic kidney disease, and dermatology data. We also examined performance using simulated data with observations that are Missing At Random (MAR) or Missing Completely At Random (MCAR) under various degrees of missingness and levels of class imbalance in the response variable. Our results demonstrate that user-based collaborative filtering is consistently inferior to logistic regression and random forests with different imputations on real and simulated data. The results warrant caution in using collaborative filtering for clinical risk prediction when traditional classification is feasible and practical; CF may not be desirable in datasets where classification is an acceptable alternative. We describe some natural "Big Data" applications in which CF would be preferred, and conclude with some insights as to why caution may be warranted in this context.

  5. A content-boosted collaborative filtering algorithm for personalized training in interpretation of radiological imaging.

    PubMed

    Lin, Hongli; Yang, Xuedong; Wang, Weisheng

    2014-08-01

    Devising a method that can select cases based on the performance levels of trainees and the characteristics of cases is essential for developing a personalized training program in radiology education. In this paper, we propose a novel hybrid prediction algorithm called content-boosted collaborative filtering (CBCF) to predict the difficulty level of each case for each trainee. The CBCF utilizes a content-based filtering (CBF) method to enhance the existing trainee-case rating data and then provides final predictions through a collaborative filtering (CF) algorithm. The CBCF algorithm incorporates the advantages of both CBF and CF while inheriting the disadvantages of neither. The CBCF method is compared with the pure CBF and pure CF approaches on three datasets, and the results are evaluated in terms of the MAE metric. Our experimental results show that the CBCF outperforms the pure CBF and CF methods by 13.33% and 12.17%, respectively, in prediction precision. This also suggests that the CBCF can be used in the development of personalized training systems in radiology education.
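    The content-boosted scheme, in which a content-based model first densifies the sparse trainee-case rating matrix and ordinary collaborative filtering then runs on the densified matrix, can be sketched as follows. The column-mean fill used here is only a stand-in for a real content-based predictor:

```python
import numpy as np

# Sparse trainee-case difficulty ratings (0 = unknown).
R = np.array([[4.0, 0.0, 2.0],
              [0.0, 3.0, 1.0],
              [5.0, 4.0, 0.0]])

# Step 1 (content-based stand-in): fill unknown entries with each case's mean
# observed difficulty; the real CBCF uses a content-based model here instead.
counts = np.maximum((R > 0).sum(axis=0), 1)
dense = np.where(R > 0, R, R.sum(axis=0) / counts)

# Step 2 (CF): ordinary user-based collaborative filtering on the dense matrix.
norms = np.linalg.norm(dense, axis=1)
sims = (dense @ dense.T) / np.outer(norms, norms)
target, case = 0, 1                         # predict trainee 0's rating of case 1
weights = np.delete(sims[target], target)
others = np.delete(dense[:, case], target)
pred = float(weights @ others / weights.sum())
print(round(pred, 2))
```

    Because the CF step never sees empty cells, the hybrid avoids the sparsity problem of pure CF while still personalising beyond the pure content-based estimates.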

  6. Layout Study and Application of Mobile App Recommendation Approach Based On Spark Streaming Framework

    NASA Astrophysics Data System (ADS)

    Wang, H. T.; Chen, T. T.; Yan, C.; Pan, H.

    2018-05-01

    For mobile App recommendation, a weighted Slope One algorithm is combined with item-based collaborative filtering to further address the cold-start and data-sparsity problems of the traditional collaborative filtering algorithm. The recommendation algorithm is parallelized on the Spark platform, and the Spark Streaming real-time computing framework is introduced to improve the timeliness of application recommendations.
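    The weighted Slope One component mentioned above predicts a rating from average item-item rating deviations, weighted by how many users co-rated each item pair. A minimal single-node sketch under that standard formulation (the Spark parallelisation is omitted):

```python
from collections import defaultdict

def slope_one_predict(ratings, user, target):
    """Weighted Slope One: predict from average item-item rating deviations."""
    dev, count = defaultdict(float), defaultdict(int)
    for r in ratings.values():            # accumulate deviations toward `target`
        if target in r:
            for item, value in r.items():
                if item != target:
                    dev[item] += r[target] - value
                    count[item] += 1
    num = den = 0.0
    for item, value in ratings[user].items():   # weight by co-rating frequency
        if item in count:
            num += (dev[item] / count[item] + value) * count[item]
            den += count[item]
    return num / den

ratings = {
    "alice": {"a": 5, "b": 3, "c": 2},
    "bob":   {"a": 3, "b": 4},
    "carol": {"b": 2, "c": 5},
}
pred = slope_one_predict(ratings, "bob", "c")
print(round(pred, 2))
```

    Because the deviation and count tables depend only on item pairs, they can be precomputed and updated incrementally, which is what makes the scheme a natural fit for a streaming setting.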

  7. A new method for E-government procurement using collaborative filtering and Bayesian approach.

    PubMed

    Zhang, Shuai; Xi, Chengyu; Wang, Yan; Zhang, Wenyu; Chen, Yanhong

    2013-01-01

    Nowadays, as Internet services increase faster than ever before, government systems are being reinvented as E-government services. Government procurement sectors therefore face challenges brought by the explosion of service information. This paper presents a novel method for E-government procurement (eGP) to search for the optimal procurement scheme (OPS). Item-based collaborative filtering and a Bayesian approach are used to evaluate and select the candidate services and obtain the top-M recommendations, so that the computational load involved can be alleviated. A trapezoidal fuzzy number similarity algorithm is applied to support the item-based collaborative filtering and Bayesian approach, since some of the services' attributes can hardly be expressed as certain, static values but are easily represented as fuzzy values. A prototype system is built and validated with an illustrative example from eGP to confirm the feasibility of our approach.

  8. A New Method for E-Government Procurement Using Collaborative Filtering and Bayesian Approach

    PubMed Central

    Wang, Yan

    2013-01-01

    Nowadays, as Internet services increase faster than ever before, government systems are being reinvented as E-government services. Government procurement sectors therefore face challenges brought by the explosion of service information. This paper presents a novel method for E-government procurement (eGP) to search for the optimal procurement scheme (OPS). Item-based collaborative filtering and a Bayesian approach are used to evaluate and select the candidate services and obtain the top-M recommendations, so that the computational load involved can be alleviated. A trapezoidal fuzzy number similarity algorithm is applied to support the item-based collaborative filtering and Bayesian approach, since some of the services' attributes can hardly be expressed as certain, static values but are easily represented as fuzzy values. A prototype system is built and validated with an illustrative example from eGP to confirm the feasibility of our approach. PMID:24385869

  9. A new collaborative recommendation approach based on users clustering using artificial bee colony algorithm.

    PubMed

    Ju, Chunhua; Xu, Chonghuan

    2013-01-01

    Although there are many good collaborative recommendation methods, it remains a challenge to increase their accuracy and diversity to fulfill users' preferences. In this paper, we propose a novel collaborative filtering recommendation approach based on the K-means clustering algorithm. In the clustering process, we use the artificial bee colony (ABC) algorithm to overcome the local-optimum problem of K-means. After that, we adopt a modified cosine similarity to compute the similarity between users in the same clusters. Finally, we generate recommendation results for the corresponding target users. Detailed numerical analysis on the benchmark MovieLens dataset and a real-world dataset indicates that our new collaborative filtering approach based on user clustering outperforms many other recommendation methods.

  10. A New Collaborative Recommendation Approach Based on Users Clustering Using Artificial Bee Colony Algorithm

    PubMed Central

    Ju, Chunhua

    2013-01-01

    Although there are many good collaborative recommendation methods, it remains a challenge to increase their accuracy and diversity to fulfill users' preferences. In this paper, we propose a novel collaborative filtering recommendation approach based on the K-means clustering algorithm. In the clustering process, we use the artificial bee colony (ABC) algorithm to overcome the local-optimum problem of K-means. After that, we adopt a modified cosine similarity to compute the similarity between users in the same clusters. Finally, we generate recommendation results for the corresponding target users. Detailed numerical analysis on the benchmark MovieLens dataset and a real-world dataset indicates that our new collaborative filtering approach based on user clustering outperforms many other recommendation methods. PMID:24381525

  11. Initial experience using the rigid forceps technique to remove wall-embedded IVC filters.

    PubMed

    Avery, Allan; Stephens, Maximilian; Redmond, Kendal; Harper, John

    2015-06-01

    Severely tilted and embedded inferior vena cava (IVC) filters remain the most challenging IVC filters to remove. Heavy endothelialisation over the filter hook can prevent engagement with standard snare and cone recovery techniques. The rigid forceps technique offers a way to dissect the endothelial cap and reliably retrieve severely tilted and embedded filters. By adopting this technique, failed IVC filter retrieval rates can be significantly reduced and the optimum safety profile offered by temporary filters can be achieved. We present our initial experience with the rigid forceps technique described by Stavropoulos et al. for removing wall-embedded IVC filters. We retrospectively reviewed the medical imaging and patient records of all patients who underwent rigid forceps filter removal over a 22-month period across two tertiary referral institutions. The rigid forceps technique had a success rate of 85% (11/13) for IVC filter removals. All filters in the series showed evidence of filter tilt and embedding of the filter hook into the IVC wall. Average filter tilt from the Z-axis was 19 degrees (range 8-56). Filters in the case series were either Bard G2X (n = 6) or Cook Celect (n = 7). Average filter dwell time was 421 days (range 47-1053). There were no major complications. The rigid forceps technique can be readily emulated and is a safe and effective way to remove severely tilted and embedded IVC filters. The development of this technique at both institutions has increased the successful filter removal rate, with perceived benefits to the safety profile of our IVC filter programme. © 2015 The Royal Australian and New Zealand College of Radiologists.

  12. A highly efficient approach to protein interactome mapping based on collaborative filtering framework.

    PubMed

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-09

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Finely tuned small-scale experiments are very expensive and, despite their high accuracy, inefficient for identifying numerous interactomes. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to personalized recommendation on large, sparse rating matrices, this work implements a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for the task. Under this framework, we model the given data as an interactome weight matrix from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins and carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach significantly outperforms several sophisticated topology-based approaches.

  13. A Highly Efficient Approach to Protein Interactome Mapping Based on Collaborative Filtering Framework

    PubMed Central

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-01

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Finely tuned small-scale experiments are very expensive and, despite their high accuracy, inefficient for identifying numerous interactomes. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to personalized recommendation on large, sparse rating matrices, this work implements a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for the task. Under this framework, we model the given data as an interactome weight matrix from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins and carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach significantly outperforms several sophisticated topology-based approaches. PMID:25572661

  14. A Highly Efficient Approach to Protein Interactome Mapping Based on Collaborative Filtering Framework

    NASA Astrophysics Data System (ADS)

    Luo, Xin; You, Zhuhong; Zhou, Mengchu; Li, Shuai; Leung, Hareton; Xia, Yunni; Zhu, Qingsheng

    2015-01-01

    The comprehensive mapping of protein-protein interactions (PPIs) is highly desired for gaining deep insights into both fundamental cell biology processes and the pathology of diseases. Finely tuned small-scale experiments are very expensive and, despite their high accuracy, inefficient for identifying numerous interactomes. High-throughput screening techniques enable efficient identification of PPIs; yet the desire to further extract useful knowledge from these data leads to the problem of binary interactome mapping. Network topology-based approaches prove to be highly efficient in addressing this problem; however, their performance deteriorates significantly on sparse putative PPI networks. Motivated by the success of collaborative filtering (CF)-based approaches to personalized recommendation on large, sparse rating matrices, this work implements a highly efficient CF-based approach to binary interactome mapping. To achieve this, we first propose a CF framework for the task. Under this framework, we model the given data as an interactome weight matrix from which the feature vectors of the involved proteins are extracted. With them, we design the rescaled cosine coefficient to model the inter-neighborhood similarity among the involved proteins and carry out the mapping process. Experimental results on three large, sparse datasets demonstrate that the proposed approach significantly outperforms several sophisticated topology-based approaches.

  15. Information filtering based on transferring similarity.

    PubMed

    Sun, Duo; Zhou, Tao; Liu, Jian-Guo; Liu, Run-Ran; Jia, Chun-Xiao; Wang, Bing-Hong

    2009-07-01

    In this Brief Report, we propose an index of user similarity, namely the transferring similarity, which involves all high-order similarities between users. Accordingly, we design a modified collaborative filtering algorithm which provides remarkably more accurate predictions than standard collaborative filtering. More interestingly, we find that the algorithmic performance approaches its optimal value when the parameter contained in the definition of transferring similarity gets close to its critical value, below which the series expansion of the transferring similarity converges and above which it diverges. Our study is complementary to the one reported in [E. A. Leicht, P. Holme, and M. E. J. Newman, Phys. Rev. E 73, 026120 (2006)] and is relevant to the missing-link prediction problem.
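    The transferring similarity sums all high-order similarities as a power series in the base similarity matrix S, which converges only while the parameter stays below the critical value set by the largest eigenvalue of S. A small sketch of that series; the base matrix and parameter are illustrative, not the paper's data:

```python
import numpy as np

def transferring_similarity(S, alpha, terms=60):
    """Sum the series S + alpha*S^2 + alpha^2*S^3 + ... of high-order similarities."""
    total, power = np.zeros_like(S), np.eye(len(S))
    for k in range(terms):
        power = power @ S                 # S^(k+1)
        total += (alpha ** k) * power
    return total

S = np.array([[0.0, 0.8, 0.1],
              [0.8, 0.0, 0.5],
              [0.1, 0.5, 0.0]])
lam = max(abs(np.linalg.eigvals(S)))      # series converges only for alpha < 1/lam
T = transferring_similarity(S, alpha=0.5 / lam)
print(T[0, 2] > S[0, 2])                  # indirect paths raise the 0-2 similarity
```

    Users 0 and 2 are weakly similar directly, but the strong 0-1 and 1-2 links raise their transferred similarity through the higher-order terms.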

  16. A generalized model via random walks for information filtering

    NASA Astrophysics Data System (ADS)

    Ren, Zhuo-Ming; Kong, Yixiu; Shang, Ming-Sheng; Zhang, Yi-Cheng

    2016-08-01

    A simple general mechanism could be lurking beneath collaborative filtering and the interdisciplinary physics approaches that have been successfully applied to online E-commerce platforms. Motivated by this idea, we propose a generalized model employing the dynamics of a random walk on bipartite networks. By taking degree information into account, the proposed generalized model can recover collaborative filtering, the interdisciplinary physics approaches, and even numerous extensions of them. Furthermore, we analyze the generalized model with single and hybrid degree information in the random-walk process on bipartite networks, and propose a possible strategy that uses hybrid degree information for objects of different popularity to achieve promising recommendation precision.
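    A two-step random walk on the user-object bipartite network (the mass-diffusion form often used in this literature) can be sketched as follows; this degree-normalised spreading rule is one standard instance of the generalized model, not its full hybrid form:

```python
import numpy as np

# Bipartite adjacency: A[u, o] = 1 if user u has collected object o.
A = np.array([[1, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 1, 1]], dtype=float)

def random_walk_scores(A, u):
    """Two-step random walk: objects -> users -> objects, normalised by degree."""
    k_obj = A.sum(axis=0)               # object degrees
    k_user = A.sum(axis=1)              # user degrees
    f = A[u].copy()                     # unit resource on the user's objects
    to_users = (A / k_obj) @ f          # objects spread resource to their users
    scores = A.T @ (to_users / k_user)  # users spread it back to all objects
    scores[A[u] > 0] = 0                # mask objects the user already has
    return scores

scores = random_walk_scores(A, 0)
print(scores)                           # object 2 is recommended over object 3
```

    Changing where the degree exponents enter these two normalisation steps is exactly the knob the generalized model turns to interpolate between the different known algorithms.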

  17. SU-F-I-73: Surface Dose from KV Diagnostic Beams From An On-Board Imager On a Linac Machine Using Different Imaging Techniques and Filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ali, I; Hossain, S; Syzek, E

    Purpose: To quantitatively investigate the surface dose deposited in patients imaged with a kV on-board imager mounted on a radiotherapy machine using different clinical imaging techniques and filters. Methods: A high-sensitivity photon diode, mounted on top of a phantom setup, is used to measure the surface dose on the central axis and at an off-axis point. The dose is measured for different imaging techniques: AP-Pelvis, AP-Head, AP-Abdomen, AP-Thorax, and Extremity. These measurements are combined with various filtering techniques: no filter (open field), half-fan bowtie (HF), full-fan bowtie (FF), and Cu-plate filters. The relative surface dose for the different imaging and filtering techniques is evaluated quantitatively as the ratio of the dose relative to the Cu-plate filter. Results: The lowest surface dose is deposited with the Cu-plate filter. The highest surface dose results from open fields without a filter and is nearly a factor of 8-30 larger than the corresponding imaging technique with the Cu-plate filter. The AP-Abdomen technique delivers the largest surface dose, nearly 2.7 times larger than the AP-Head technique; the smallest surface dose is obtained with the Extremity imaging technique. Imaging with bowtie filters decreases the surface dose by nearly 33% in comparison with the open field, and the surface doses deposited with the HF and FF bowtie filters agree to within a few percent. Image quality of the radiographic images obtained with the different filtering techniques is similar because the Cu-plate eliminates low-energy photons. The HF and FF bowtie filters generate intensity gradients in the radiographs, which affects image quality in the different imaging techniques. Conclusion: Surface dose from kV imaging decreases significantly with the Cu-plate and bowtie filters compared to imaging without filters using open-field beams. The use of the Cu-plate filter does not affect image quality and may be used as the default in the different imaging techniques.

  18. Collaborative recall in face-to-face and electronic groups.

    PubMed

    Ekeocha, Justina Ohaeri; Brennan, Susan E

    2008-04-01

    When people remember shared experiences, the amount they recall as a collaborating group is less than the amount obtained by pooling their individual memories. We tested the hypothesis that reduced group productivity can be attributed, at least in part, to content filtering, where information is omitted from group products either because individuals fail to retrieve it or choose to withhold it (self-filtering), or because groups reject or fail to incorporate it (group-filtering). Three-person groups viewed a movie clip together and recalled it, first individually, then in face-to-face or electronic groups, and finally individually again. Although both kinds of groups recalled equal amounts, group-filtering occurred more often face-to-face, while self-filtering occurred more often electronically. This suggests that reduced group productivity is due not only to intrapersonal factors stemming from cognitive interference, but also to interpersonal costs of coordinating the group product. Finally, face-to-face group interaction facilitated subsequent individual recall.

  19. Video denoising, deblocking, and enhancement through separable 4-D nonlocal spatiotemporal transforms.

    PubMed

    Maggioni, Matteo; Boracchi, Giacomo; Foi, Alessandro; Egiazarian, Karen

    2012-09-01

    We propose a powerful video filtering algorithm that exploits temporal and spatial redundancy characterizing natural video sequences. The algorithm implements the paradigm of nonlocal grouping and collaborative filtering, where a higher dimensional transform-domain representation of the observations is leveraged to enforce sparsity, and thus regularize the data: 3-D spatiotemporal volumes are constructed by tracking blocks along trajectories defined by the motion vectors. Mutually similar volumes are then grouped together by stacking them along an additional fourth dimension, thus producing a 4-D structure, termed group, where different types of data correlation exist along the different dimensions: local correlation along the two dimensions of the blocks, temporal correlation along the motion trajectories, and nonlocal spatial correlation (i.e., self-similarity) along the fourth dimension of the group. Collaborative filtering is then realized by transforming each group through a decorrelating 4-D separable transform and then by shrinkage and inverse transformation. In this way, the collaborative filtering provides estimates for each volume stacked in the group, which are then returned and adaptively aggregated to their original positions in the video. The proposed filtering procedure addresses several video processing applications, such as denoising, deblocking, and enhancement of both grayscale and color data. Experimental results prove the effectiveness of our method in terms of both subjective and objective visual quality, and show that it outperforms the state of the art in video denoising.
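The grouping-and-shrinkage idea can be illustrated in miniature. The sketch below stacks mutually similar noisy blocks into a 3-D group, applies one decorrelating transform over the whole group (a plain FFT stands in for the paper's separable 4-D transform), hard-thresholds, and inverts; the data and threshold are illustrative assumptions, not the published algorithm:

```python
import numpy as np

rng = np.random.default_rng(0)

# A synthetic "group": 16 mutually similar 8x8 blocks (identical clean
# content plus noise), standing in for the stacked volumes in the paper.
clean = np.outer(np.hanning(8), np.hanning(8))
group_clean = np.stack([clean] * 16)
noisy = group_clean + 0.2 * rng.standard_normal(group_clean.shape)

# Collaborative filtering: decorrelating transform of the whole group,
# then shrinkage (hard thresholding) and inverse transform.
sigma = 0.2                                   # noise std, assumed known here
coeffs = np.fft.fftn(noisy)
thr = 3.0 * sigma * np.sqrt(noisy.size)       # simple 3-sigma threshold
coeffs[np.abs(coeffs) < thr] = 0.0
denoised = np.real(np.fft.ifftn(coeffs))

mse_noisy = np.mean((noisy - group_clean) ** 2)
mse_denoised = np.mean((denoised - group_clean) ** 2)
```

Because the group is highly correlated along the stacking dimension, its energy concentrates in few coefficients, so thresholding removes mostly noise; this is the sparsity-enforcing effect the abstract describes.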

  20. Aggregation Trade Offs in Family Based Recommendations

    NASA Astrophysics Data System (ADS)

    Berkovsky, Shlomo; Freyne, Jill; Coombe, Mac

    Personalized information access tools are frequently based on collaborative filtering recommendation algorithms. Collaborative filtering recommender systems typically suffer from a data sparsity problem, where systems do not have sufficient user data to generate accurate and reliable predictions. Prior research suggested using group-based user data in the collaborative filtering recommendation process to generate group-based predictions and partially resolve the sparsity problem. Although group recommendations are less accurate than personalized recommendations, they are more accurate than general non-personalized recommendations, which are the natural fallback when personalized recommendations cannot be generated. In this work we present initial results of a study that exploits the browsing logs of real families of users gathered in an eHealth portal. The browsing logs allowed us to experimentally compare the accuracy of two group-based recommendation strategies: aggregated group models and aggregated predictions. Our results showed that aggregating individual models into group models resulted in more accurate predictions than aggregating individual predictions into group predictions.
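The two aggregation strategies can be contrasted on toy data. In the hedged sketch below (the ratings, the mean-based group model, and the simple item-based predictor are all illustrative assumptions, not the study's setup), strategy 1 merges member profiles before predicting, while strategy 2 merges member-level predictions:

```python
import numpy as np

# Illustrative data: a small population for learning item-item
# similarities, and a three-member family; item 3 is unrated by every
# family member and is the prediction target.
population = np.array([[5, 3, 4, 4, 1],
                       [4, 2, 5, 5, 1],
                       [1, 5, 2, 1, 4],
                       [2, 4, 1, 2, 5],
                       [5, 1, 4, 5, 2]], dtype=float)
family = np.array([[5, 0, 3, 0, 1],
                   [4, 2, 4, 0, 1],
                   [3, 3, 5, 0, 0]], dtype=float)

# Item-item cosine similarities learned from the population.
norms = np.linalg.norm(population, axis=0)
S = (population.T @ population) / np.outer(norms, norms)

def item_cf_predict(profile, item, S):
    """Item-based CF: similarity-weighted mean of the profile's rated items."""
    rated = profile > 0
    w = S[item, rated]
    return float(w @ profile[rated] / w.sum())

# Strategy 1: aggregate member profiles into one group model, then predict.
group_profile = family.mean(axis=0)
pred_group_model = item_cf_predict(group_profile, 3, S)

# Strategy 2: predict per member, then aggregate the predictions.
pred_aggregated = float(np.mean([item_cf_predict(m, 3, S) for m in family]))
```

Because members rate different item subsets, the two strategies generally produce different scores, which is exactly the trade-off the study measures.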

  1. Using Online Role-Play to Promote Collaborative Argument and Collective Action

    ERIC Educational Resources Information Center

    Doerr-Stevens, Candance; Beach, Richard; Boeser, Elizabeth

    2011-01-01

    This article discusses how students use online role-play to collaborate and change real school policy. Playing different characters in an online role-play, students explore controversial aspects of Internet filtering and adopt a plan to change their school's policy. Through engaging in collaborative argumentation during their role-play, students…

  2. Collaborative Beamfocusing Radio (COBRA)

    NASA Astrophysics Data System (ADS)

    Rode, Jeremy P.; Hsu, Mark J.; Smith, David; Husain, Anis

    2013-05-01

    A Ziva team has recently demonstrated a novel technique called Collaborative Beamfocusing Radios (COBRA), which enables an ad hoc collection of distributed commercial off-the-shelf software-defined radios to coherently align and beamform to a remote radio. COBRA promises to operate even in high-multipath and non-line-of-sight environments, as well as in mobile applications, without resorting to computationally expensive closed-loop techniques that are currently unable to operate with significant movement. COBRA exploits two key technologies to achieve coherent beamforming. The first is Time Reversal (TR), which compensates for multipath and automatically discovers the optimal spatio-temporal matched filter to enable peak signal gains (up to 20 dB) and diffraction-limited focusing at the intended receiver in NLOS and severe multipath environments. The second is time-aligned buffering, which enables TR to synchronize distributed transmitters into a collaborative array. This time-alignment algorithm avoids causality violations through the use of reciprocal buffering. Preserving spatio-temporal reciprocity through the TR capture and retransmission process achieves coherent alignment across multiple radios at ~GHz carriers using only standard quartz oscillators. COBRA has been demonstrated in the lab, aligning two off-the-shelf software-defined radios over the air to an accuracy of better than 2 degrees of carrier alignment at 450 MHz. The COBRA algorithms are lightweight, with computation completing in 5 ms on a smartphone-class microprocessor. COBRA also has low start-up latency, achieving high accuracy from a cold start in 30 ms. The COBRA technique opens up a large number of new capabilities in communications and electronic warfare, including selective spatial jamming, geolocation, and anti-geolocation.
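The time-reversal focusing principle can be sketched in a few lines. Assuming a synthetic multipath impulse response (illustrative only, not COBRA's actual channel model), transmitting the time-reversed channel acts as a matched filter: energy from all paths adds coherently at a single instant, producing the focusing peak.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical dense-multipath channel impulse response between one node
# and the remote radio: random echoes with exponential decay.
h = rng.standard_normal(64) * np.exp(-np.arange(64) / 12.0)

# Time reversal: transmit the time-reversed channel estimate.
probe = h[::-1]

# What the remote radio receives is the channel autocorrelation; by the
# Cauchy-Schwarz inequality its magnitude peaks at zero lag, i.e. all
# paths add coherently at one sample.
received = np.convolve(probe, h)
peak_index = int(np.argmax(np.abs(received)))
```

The peak sits at the zero-lag position (index `len(h) - 1` of the full convolution) with value `sum(h**2)`, the coherent sum of all path energies.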

  3. Inferior vena cava filter retrievals, standard and novel techniques.

    PubMed

    Kuyumcu, Gokhan; Walker, T Gregory

    2016-12-01

    The placement of an inferior vena cava (IVC) filter is a well-established management strategy for patients with venous thromboembolism (VTE) disease in whom anticoagulant therapy is either contraindicated or has failed. IVC filters may also be placed for VTE prophylaxis in certain circumstances. There has been a tremendous growth in placement of retrievable IVC filters in the past decade yet the majority of the devices are not removed. Unretrieved IVC filters have several well-known complications that increase in frequency as the filter dwell time increases. These complications include caval wall penetration, filter fracture or migration, caval thrombosis and an increased risk for lower extremity deep vein thrombosis (DVT). Difficulty is sometimes encountered when attempting to retrieve indwelling filters, mainly because of either abnormal filter positioning or endothelization of filter components that are in contact with the IVC wall, thereby causing the filter to become embedded. The length of time that a filter remains indwelling also impacts the retrieval rate, as increased dwell times are associated with more difficult retrievals. Several techniques for difficult retrievals have been described in the medical literature. These techniques range from modifications of standard retrieval techniques to much more complex interventions. Complications related to complex retrievals are more common than those associated with standard retrieval techniques. The risks of complex filter retrievals should be compared with those of life-long anticoagulation associated with an unretrieved filter, and should be individualized. This article summarizes current techniques for IVC filter retrieval from a clinical point of view, with an emphasis on advanced retrieval techniques.

  4. Inferior vena cava filter retrievals, standard and novel techniques

    PubMed Central

    Walker, T. Gregory

    2016-01-01

    The placement of an inferior vena cava (IVC) filter is a well-established management strategy for patients with venous thromboembolism (VTE) disease in whom anticoagulant therapy is either contraindicated or has failed. IVC filters may also be placed for VTE prophylaxis in certain circumstances. There has been a tremendous growth in placement of retrievable IVC filters in the past decade yet the majority of the devices are not removed. Unretrieved IVC filters have several well-known complications that increase in frequency as the filter dwell time increases. These complications include caval wall penetration, filter fracture or migration, caval thrombosis and an increased risk for lower extremity deep vein thrombosis (DVT). Difficulty is sometimes encountered when attempting to retrieve indwelling filters, mainly because of either abnormal filter positioning or endothelization of filter components that are in contact with the IVC wall, thereby causing the filter to become embedded. The length of time that a filter remains indwelling also impacts the retrieval rate, as increased dwell times are associated with more difficult retrievals. Several techniques for difficult retrievals have been described in the medical literature. These techniques range from modifications of standard retrieval techniques to much more complex interventions. Complications related to complex retrievals are more common than those associated with standard retrieval techniques. The risks of complex filter retrievals should be compared with those of life-long anticoagulation associated with an unretrieved filter, and should be individualized. This article summarizes current techniques for IVC filter retrieval from a clinical point of view, with an emphasis on advanced retrieval techniques. PMID:28123984

  5. Effects of the use of multi-layer filter on radiation exposure and the quality of upper airway radiographs compared to the traditional copper filter.

    PubMed

    Klandima, Somphan; Kruatrachue, Anchalee; Wongtapradit, Lawan; Nithipanya, Narong; Ratanaprakarn, Warangkana

    2014-06-01

    The problem of image quality in a large number of upper-airway-obstruction patients is the superimposition of the airway over the bone of the spine on the AP view. This problem was resolved by increasing kVp (the high-kVp technique) and adding extra radiographic filters (a copper filter) to reduce the sharpness of the bone and increase the clarity of the airway. However, this raises concerns that patients might be receiving an unnecessarily high dose of radiation, as well as questions about the effectiveness of the invented filter compared to the traditional one. The objectives were to evaluate the radiation dose that patients receive with the multi-layer filter compared to no filter, and to compare the image quality of the upper airways between the invented radiographic filter (multi-layer filter) and the traditional filter (copper filter). The attenuation curve of both filter materials was first identified. Both filters were then tested with an Alderson Rando phantom to determine the appropriate exposure. Using the method described, a new type of filter, called the multi-layer filter, was developed for imaging patients. A randomized controlled trial was then performed to compare the effectiveness of the newly developed multi-layer filter with the copper filter. The research was conducted in patients with upper airway obstruction treated at Queen Sirikit National Institute of Child Health from October 2006 to September 2007. A total of 132 patients were divided into two groups: the experimental group used the high-kVp technique with the multi-layer filter, while the control group used the copper filter. A comparison of film interpretation between the multi-layer filter and the copper filter was made by a number of radiologists who were blinded to both the technique and the type of filter used. Patients received less radiation with the high-kVp technique, whether with the copper filter or the multi-layer filter, than with the conventional technique, where no filter is used. Patients received approximately 65.5% less radiation dose using the high-kVp technique with the multi-layer filter compared to the conventional technique, and 25.9% less than using the traditional copper filter. Of the radiologists who participated in this study, 45% reported that the high-kVp technique with the multi-layer filter was better for diagnosing stenosis, or narrowing, of the upper airways; 33% reported that both techniques were equal, while 22% reported that the traditional copper filter allowed better detail of the airway obstruction. These findings showed that the multi-layer filter was comparable to the copper filter in terms of film interpretation. Using the multi-layer filter resulted in patients receiving a lower dose of radiation, with similar film interpretation, compared to the traditional copper filter.

  6. A Novel Technique for Inferior Vena Cava Filter Extraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnston, Edward William, E-mail: ed.johnston@doctors.org.uk; Rowe, Luke Michael Morgan; Brookes, Jocelyn

    Inferior vena cava (IVC) filters are used to protect against pulmonary embolism in high-risk patients. Whilst the insertion of retrievable IVC filters is gaining popularity, a proportion of such devices cannot be removed using standard techniques. We describe a novel approach for IVC filter removal that involves snaring the filter superiorly along with the use of flexible forceps or laser devices to dissect the filter struts from the caval wall. This technique has been used to successfully treat, without complications, three patients in whom standard techniques had failed.

  7. Compressed sensing for rapid late gadolinium enhanced imaging of the left atrium: A preliminary study.

    PubMed

    Kamesh Iyer, Srikant; Tasdizen, Tolga; Burgon, Nathan; Kholmovski, Eugene; Marrouche, Nassir; Adluru, Ganesh; DiBella, Edward

    2016-09-01

    Current late gadolinium enhancement (LGE) imaging of left atrial (LA) scar or fibrosis is relatively slow and requires 5-15 min to acquire an undersampled (R=1.7) 3D navigated dataset. The GeneRalized Autocalibrating Partially Parallel Acquisitions (GRAPPA) based parallel imaging method is the current clinical standard for accelerating 3D LGE imaging of the LA and permits an acceleration factor of ~R=1.7. Two compressed sensing (CS) methods have been developed to achieve higher acceleration factors: a patch-based collaborative filtering technique tested with acceleration factor R~3, and a technique that uses a 3D radial stack-of-stars acquisition pattern (R~1.8) with a 3D total variation constraint. The long reconstruction time of these CS methods makes them unwieldy to use, especially the patch-based collaborative filtering technique. In addition, the effect of CS techniques on the quantification of the percentage of scar/fibrosis is not known. We sought to develop a practical compressed sensing method for imaging the LA at high acceleration factors. In order to develop a clinically viable method with short reconstruction time, a Split Bregman (SB) reconstruction method with 3D total variation (TV) constraints was developed and implemented. The method was tested on 8 atrial fibrillation patients (4 pre-ablation and 4 post-ablation datasets). Blur metric, normalized mean squared error, and peak signal-to-noise ratio were used as metrics to analyze the quality of the reconstructed images. Quantification of the extent of LGE was performed on the undersampled images and compared with the fully sampled images. Quantification of scar from post-ablation datasets and of fibrosis from pre-ablation datasets showed that acceleration factors up to R~3.5 gave good 3D LGE images of the LA wall, using a 3D TV constraint and constrained SB methods. This corresponds to reducing the scan time by half compared to currently used GRAPPA methods. Reconstruction of 3D LGE images using the SB method was over 20 times faster than standard gradient descent methods.
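The Split Bregman scheme with a TV constraint can be sketched in 1-D. The solver below addresses min_u 0.5*||u - f||^2 + lam*||Du||_1, a denoising stand-in for the paper's undersampled 3D reconstruction; the signal and all parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Piecewise-constant test signal plus noise: a 1-D stand-in for the LGE data.
truth = np.concatenate([np.zeros(40), np.ones(40), 0.3 * np.ones(40)])
noisy = truth + 0.15 * rng.standard_normal(truth.size)

def shrink(x, t):
    """Soft thresholding: closed-form solution of the d-subproblem."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def split_bregman_tv(f, lam=0.1, mu=1.0, iters=100):
    """Split Bregman solver for min_u 0.5*||u - f||^2 + lam*||D u||_1."""
    n = f.size
    D = np.eye(n, k=1) - np.eye(n)        # forward differences
    D[-1, :] = 0.0                        # Neumann boundary condition
    A = np.eye(n) + mu * D.T @ D          # normal matrix for the u-update
    d = np.zeros(n)
    b = np.zeros(n)
    u = f.copy()
    for _ in range(iters):
        u = np.linalg.solve(A, f + mu * D.T @ (d - b))   # quadratic u-update
        Du = D @ u
        d = shrink(Du + b, lam / mu)                     # shrinkage d-update
        b = b + Du - d                                   # Bregman update
    return u

denoised = split_bregman_tv(noisy)
```

The speed advantage the paper reports comes from this same splitting: each iteration reduces to one linear solve plus a cheap elementwise shrinkage, rather than a slow gradient descent on the full nonsmooth objective.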

  8. Graphene-Based Filters and Supercapacitors for Space and Aeronautical Applications

    NASA Technical Reports Server (NTRS)

    Calle, Carlos I.

    2015-01-01

    Overview of the capabilities of graphene for selective filters and for energy storage with a general description of the work being done at NASA Kennedy Space Center in collaboration with the University of California Los Angeles for space and aeronautical applications.

  9. Improved genome-scale multi-target virtual screening via a novel collaborative filtering approach to cold-start problem

    PubMed Central

    Lim, Hansaim; Gray, Paul; Xie, Lei; Poleksic, Aleksandar

    2016-01-01

    Conventional one-drug-one-gene approach has been of limited success in modern drug discovery. Polypharmacology, which focuses on searching for multi-targeted drugs to perturb disease-causing networks instead of designing selective ligands to target individual proteins, has emerged as a new drug discovery paradigm. Although many methods for single-target virtual screening have been developed to improve the efficiency of drug discovery, few of these algorithms are designed for polypharmacology. Here, we present a novel theoretical framework and a corresponding algorithm for genome-scale multi-target virtual screening based on the one-class collaborative filtering technique. Our method overcomes the sparseness of the protein-chemical interaction data by means of interaction matrix weighting and dual regularization from both chemicals and proteins. While the statistical foundation behind our method is general enough to encompass genome-wide drug off-target prediction, the program is specifically tailored to find protein targets for new chemicals with little to no available interaction data. We extensively evaluate our method using a number of the most widely accepted gene-specific and cross-gene family benchmarks and demonstrate that our method outperforms other state-of-the-art algorithms for predicting the interaction of new chemicals with multiple proteins. Thus, the proposed algorithm may provide a powerful tool for multi-target drug design. PMID:27958331

  10. Improved genome-scale multi-target virtual screening via a novel collaborative filtering approach to cold-start problem.

    PubMed

    Lim, Hansaim; Gray, Paul; Xie, Lei; Poleksic, Aleksandar

    2016-12-13

    Conventional one-drug-one-gene approach has been of limited success in modern drug discovery. Polypharmacology, which focuses on searching for multi-targeted drugs to perturb disease-causing networks instead of designing selective ligands to target individual proteins, has emerged as a new drug discovery paradigm. Although many methods for single-target virtual screening have been developed to improve the efficiency of drug discovery, few of these algorithms are designed for polypharmacology. Here, we present a novel theoretical framework and a corresponding algorithm for genome-scale multi-target virtual screening based on the one-class collaborative filtering technique. Our method overcomes the sparseness of the protein-chemical interaction data by means of interaction matrix weighting and dual regularization from both chemicals and proteins. While the statistical foundation behind our method is general enough to encompass genome-wide drug off-target prediction, the program is specifically tailored to find protein targets for new chemicals with little to no available interaction data. We extensively evaluate our method using a number of the most widely accepted gene-specific and cross-gene family benchmarks and demonstrate that our method outperforms other state-of-the-art algorithms for predicting the interaction of new chemicals with multiple proteins. Thus, the proposed algorithm may provide a powerful tool for multi-target drug design.
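A hedged sketch of the one-class collaborative filtering idea behind such methods: weighted matrix factorization in which known interactions get full weight and unlabeled pairs a small weight, solved by alternating least squares. For brevity this omits the paper's dual chemical/protein similarity regularization, keeping only plain L2; all data and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary protein-chemical interaction matrix (1 = known interaction).
R = (rng.random((12, 15)) < 0.15).astype(float)

# One-class weighting: observed 1s get full weight; unobserved pairs get a
# small weight w0, since they are unlabeled rather than true negatives.
k, lam, w0 = 4, 0.1, 0.05
P = 0.1 * rng.standard_normal((R.shape[0], k))   # protein factors
C = 0.1 * rng.standard_normal((R.shape[1], k))   # chemical factors
W = np.where(R > 0, 1.0, w0)

for _ in range(30):                               # alternating least squares
    for i in range(R.shape[0]):                   # update each protein factor
        Wi = np.diag(W[i])
        P[i] = np.linalg.solve(C.T @ Wi @ C + lam * np.eye(k),
                               C.T @ Wi @ R[i])
    for j in range(R.shape[1]):                   # update each chemical factor
        Wj = np.diag(W[:, j])
        C[j] = np.linalg.solve(P.T @ Wj @ P + lam * np.eye(k),
                               P.T @ Wj @ R[:, j])

scores = P @ C.T          # predicted interaction propensities
```

For a new chemical with no interactions at all (the cold-start case the paper targets), the plain L2 version has nothing to go on; their dual regularization addresses this by pulling its factor toward those of chemically similar compounds.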

  11. Measuring the Interestingness of Articles in a Limited User Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pon, R; Cardenas, A; Buttler, David

    Search engines, such as Google, assign scores to news articles based on their relevance to a query. However, not all relevant articles for the query may be interesting to a user. For example, if the article is old or yields little new information, the article would be uninteresting. Relevance scores do not take into account what makes an article interesting, which would vary from user to user. Although methods such as collaborative filtering have been shown to be effective in recommendation systems, in a limited user environment there are not enough users to make collaborative filtering effective. A general framework, called iScore, is presented for defining and measuring the "interestingness" of articles, incorporating user feedback. iScore addresses the various aspects of what makes an article interesting, such as topic relevance, uniqueness, freshness, source reputation, and writing style. It employs various methods, such as multiple topic tracking, online parameter selection, language models, clustering, sentiment analysis, and phrase extraction, to measure these features. Due to the varying reasons users hold about why an article is interesting, an online feature selection method in naïve Bayes is also used to improve recommendation results. iScore can outperform traditional IR techniques by as much as 50.7%. iScore and its components are evaluated in the news recommendation task using three datasets from Yahoo! News, actual users, and Digg.

  12. Application of filtering techniques in preprocessing magnetic data

    NASA Astrophysics Data System (ADS)

    Liu, Haijun; Yi, Yongping; Yang, Hongxia; Hu, Guochuang; Liu, Guoming

    2010-08-01

    High-precision magnetic exploration is a popular geophysical technique because of its simplicity and effectiveness. Interpretation in high-precision magnetic exploration is always difficult because of noise and disturbance factors, so an effective preprocessing method is needed to remove the effects of interference before further processing. The common way to do this is filtering, and there are many kinds of filtering methods. In this paper we introduce in detail three popular filtering techniques: regularized filtering, sliding-average filtering, and compensation smoothing filtering. We then designed the workflow of a filtering program based on these techniques and implemented it with DELPHI. To verify it, we applied it to preprocess magnetic data from a certain place in China. Comparing the initial contour map with the filtered contour map clearly shows the effect of our program: the filtered contour map is smooth, and the high-frequency components of the data have been removed. After filtering, we separated useful signals from noise, minor anomalies from major anomalies, and local anomalies from regional anomalies, making it easy to focus on the useful information. Our program can be used to preprocess magnetic data, and the results demonstrate its effectiveness.
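The sliding-average technique named in the abstract is essentially a moving-mean convolution. A small sketch on synthetic profile data (illustrative values, not the survey data from the paper) shows how it separates the smooth regional part from the high-frequency local/noise part:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic magnetic profile: a smooth regional anomaly plus high-frequency
# noise standing in for the disturbance factors.
x = np.linspace(0, 10, 500)
regional = 80.0 * np.exp(-((x - 5.0) ** 2) / 2.0)
observed = regional + 5.0 * rng.standard_normal(x.size)

# Sliding-average filtering: each sample is replaced by the mean over a
# window, suppressing the high-frequency components of the record.
win = 21
kernel = np.ones(win) / win
smoothed = np.convolve(observed, kernel, mode="same")

# The residual isolates the local / high-frequency part of the field.
local = observed - smoothed
```

The `smoothed` curve plays the role of the regional anomaly and `local` the separated noise-plus-local anomaly, mirroring the useful/noisy and regional/local separation described above.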

  13. Importance of Personalized Health-Care Models: A Case Study in Activity Recognition.

    PubMed

    Zdravevski, Eftim; Lameski, Petre; Trajkovik, Vladimir; Pombo, Nuno; Garcia, Nuno

    2018-01-01

    Novel information and communication technologies create possibilities to change the future of health care. Ambient Assisted Living (AAL) is seen as a promising supplement to current care models. The main goal of AAL solutions is to apply ambient intelligence technologies to enable elderly people to continue to live in their preferred environments. Applying trained models from health data is challenging because the personalized environments may differ significantly from the ones that provided the training data. This paper investigates the effects on activity recognition accuracy, using a single accelerometer, of personalized models compared to models built on the general population. In addition, we propose a collaborative filtering based approach which provides a balance between fully personalized models and generic models. The results show that the accuracy could be improved to 95% with fully personalized models, and up to 91.6% with collaborative filtering based models, which is significantly better than common models that exhibit an accuracy of 85.1%. The collaborative filtering approach seems to provide highly personalized models with substantial accuracy, while overcoming the cold-start problem that is common for fully personalized models.
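The middle ground between fully personalized and generic models can be sketched as a similarity-weighted blend. Everything below (the feature summaries, the similarity measure, and the blend weight `w`) is an illustrative assumption, not the paper's exact scheme:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative per-user accelerometer feature summaries (e.g. mean and
# variance per axis) and per-user model probabilities for some activity.
profiles = rng.random((6, 4))        # 6 users x 4 summary features
personal_preds = rng.random(6)       # each user's own-model probability
generic_pred = 0.55                  # population-model probability

def cf_personalized(u, profiles, personal_preds, generic_pred, k=2, w=0.7):
    """Blend the generic model with the predictions of the k users most
    similar to user u: a collaborative-filtering-style middle ground
    between a fully personal and a fully generic model."""
    sims = profiles @ profiles[u] / (
        np.linalg.norm(profiles, axis=1) * np.linalg.norm(profiles[u]))
    sims[u] = -np.inf                         # exclude the user themself
    nbrs = np.argsort(sims)[-k:]              # k nearest neighbours
    neighbour_pred = np.average(personal_preds[nbrs], weights=sims[nbrs])
    return w * neighbour_pred + (1 - w) * generic_pred

p = cf_personalized(0, profiles, personal_preds, generic_pred)
```

For a new user with no labeled data, the neighbour term still yields a usable, partially personalized estimate, which is how such blends sidestep the cold-start problem of fully personalized models.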

  14. Technical Report Series on Global Modeling and Data Assimilation. Volume 16; Filtering Techniques on a Stretched Grid General Circulation Model

    NASA Technical Reports Server (NTRS)

    Takacs, Lawrence L.; Sawyer, William; Suarez, Max J. (Editor); Fox-Rabinowitz, Michael S.

    1999-01-01

    This report documents the techniques used to filter quantities on a stretched-grid general circulation model. Standard high-latitude filtering techniques (e.g., using an FFT (Fast Fourier Transform) to decompose and filter unstable harmonics at selected latitudes) applied on a stretched grid are shown to produce significant distortions of the prognostic state when used to control instabilities near the pole. A new filtering technique is developed which accurately accounts for the non-uniform grid by computing the eigenvectors and eigenfrequencies associated with the stretching. A filter function, constructed to selectively damp those modes whose associated eigenfrequencies exceed some critical value, is used to construct a set of grid-space weights which are shown to filter effectively without distortion. Both offline and GCM (General Circulation Model) experiments are shown using the new filtering technique. Finally, a brief examination is also made of the impact of applying the Shapiro filter on the stretched grid.
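The standard high-latitude FFT filtering that the report improves on can be sketched on a regular latitude-longitude grid. The damping shape below is a simple illustrative choice (not the report's eigenvector-based filter for stretched grids): the zonal mean is left untouched, and zonal wavenumbers whose effective grid spacing shrinks as cos(lat) are damped poleward of a reference latitude:

```python
import numpy as np

def polar_filter(field, lats, ref_lat=45.0):
    """Damp high zonal wavenumbers at high latitudes via FFT, latitude
    circle by latitude circle (illustrative filter shape only)."""
    nlat, nlon = field.shape
    m = np.arange(nlon // 2 + 1)                     # zonal wavenumbers
    out = np.empty_like(field)
    for j, lat in enumerate(lats):
        spec = np.fft.rfft(field[j])
        ratio = np.cos(np.radians(lat)) / np.cos(np.radians(ref_lat))
        # damping: 1 for low m, falling off once m exceeds ratio * nlon/2
        damp = np.minimum(1.0, ratio * (nlon / 2) / np.maximum(m, 1))
        damp[0] = 1.0                                # never damp the zonal mean
        out[j] = np.fft.irfft(spec * damp, n=nlon)
    return out

# Example: a random field on a 19 x 64 grid from the equator to the pole.
rng = np.random.default_rng(0)
lats = np.linspace(0.0, 90.0, 19)
field = rng.standard_normal((19, 64))
filtered = polar_filter(field, lats)
```

Equatorward of the reference latitude the field passes through unchanged, while at the pole only the zonal mean survives; it is exactly this uniform-grid assumption that breaks down on a stretched grid, motivating the report's eigenvector-based alternative.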

  15. Cohort Selection and Management Application Leveraging Standards-based Semantic Interoperability and a Groovy DSL

    PubMed Central

    Bucur, Anca; van Leeuwen, Jasper; Chen, Njin-Zu; Claerhout, Brecht; de Schepper, Kristof; Perez-Rey, David; Paraiso-Medina, Sergio; Alonso-Calvo, Raul; Mehta, Keyur; Krykwinski, Cyril

    2016-01-01

    This paper describes a new Cohort Selection application implemented to support streamlining the definition phase of multi-centric clinical research in oncology. Our approach aims at both ease of use and precision in defining the selection filters expressing the characteristics of the desired population. The application leverages our standards-based Semantic Interoperability Solution and a Groovy DSL to provide high expressiveness in the definition of filters and flexibility in their composition into complex selection graphs including splits and merges. Widely-adopted ontologies such as SNOMED-CT are used to represent the semantics of the data and to express concepts in the application filters, facilitating data sharing and collaboration on joint research questions in large communities of clinical users. The application supports patient data exploration and efficient collaboration in multi-site, heterogeneous and distributed data environments. PMID:27570644

  16. Assessment of Snared-Loop Technique When Standard Retrieval of Inferior Vena Cava Filters Fails

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doody, Orla, E-mail: orla_doody@hotmail.com; Noe, Geertje; Given, Mark F.

    Purpose To identify the success and complications related to a variant technique used to retrieve inferior vena cava filters when the simple snare approach has failed. Methods A retrospective review of all Cook Guenther Tulip filters and Cook Celect filters retrieved between July 2006 and February 2008 was performed. During this period, 130 filter retrievals were attempted. In 33 cases, the standard retrieval technique failed. Retrieval was subsequently attempted with our modified retrieval technique. Results The retrieval was successful in 23 cases (mean dwell time, 171.84 days; range, 5-505 days) and unsuccessful in 10 cases (mean dwell time, 162.2 days; range, 94-360 days). Our filter retrievability rates increased from 74.6% with the standard retrieval method to 92.3% when the snared-loop technique was used. Unsuccessful retrieval was due to significant endothelialization (n = 9) and caval penetration by the filter (n = 1). A single complication occurred in the group, in a patient who developed pulmonary emboli after attempted retrieval. Conclusion The technique we describe increased the retrievability of the two filters studied. Hook endothelialization is the main factor resulting in failed retrieval and continues to be a limitation with these filters.

  17. Novel and Advanced Techniques for Complex IVC Filter Retrieval.

    PubMed

    Daye, Dania; Walker, T Gregory

    2017-04-01

    Inferior vena cava (IVC) filter placement is indicated for the treatment of venous thromboembolism (VTE) in patients with a contraindication to or a failure of anticoagulation. With the advent of retrievable IVC filters and their ease of placement, an increasing number of such filters are being inserted for prophylaxis in patients at high risk for VTE. Available data show that only a small number of these filters are retrieved within the recommended period, if at all, prompting the FDA to issue a statement on the need for their timely removal. With prolonged dwell times, advanced techniques may be needed for filter retrieval in up to 60% of the cases. In this article, we review standard and advanced IVC filter retrieval techniques including single-access, dual-access, and dissection techniques. Complicated filter retrievals carry a non-negligible risk for complications such as filter fragmentation and resultant embolization of filter components, venous pseudoaneurysms or stenoses, and breach of the integrity of the caval wall. Careful pre-retrieval assessment of IVC filter position, of any significant degree of filter tilting, and of hook and/or strut epithelialization and caval wall penetration by filter components should be performed using dedicated cross-sectional imaging for procedural planning. In complex cases, the risk for retrieval complications should be carefully weighed against the risks of leaving the filter permanently indwelling. The decision to remove an embedded IVC filter using advanced techniques should be individualized to each patient and made with caution, based on the patient's age and existing comorbidities.

  18. Günther Tulip inferior vena cava filter retrieval using a bidirectional loop-snare technique.

    PubMed

    Ross, Jordan; Allison, Stephen; Vaidya, Sandeep; Monroe, Eric

    2016-01-01

    Many advanced techniques have been reported in the literature for difficult Günther Tulip filter removal. This report describes a bidirectional loop-snare technique in the setting of fibrin scar formation around the filter leg anchors. The bidirectional loop-snare technique allows for maximal axial tension and alignment for stripping fibrin scar from the filter legs, a commonly encountered complication of prolonged dwell times.

  19. Evaluating Assessment Using N-Dimensional Filtering.

    ERIC Educational Resources Information Center

    Dron, Jon; Boyne, Chris; Mitchell, Richard

    This paper describes the use of the CoFIND (Collaborative Filter in N Dimensions) system to evaluate two assessment styles. CoFIND is a resource database that organizes itself around its users' needs. Learners enter resources, categorize, then rate them using "qualities," aspects of resources which learners find worthwhile, the n…

  20. Geometric Models for Collaborative Search and Filtering

    ERIC Educational Resources Information Center

    Bitton, Ephrat

    2011-01-01

    This dissertation explores the use of geometric and graphical models for a variety of information search and filtering applications. These models serve to provide an intuitive understanding of the problem domains as well as computational efficiencies for our solution approaches. We begin by considering a search and rescue scenario where both…

  1. Collaborative filtering for brain-computer interaction using transfer learning and active class selection.

    PubMed

    Wu, Dongrui; Lance, Brent J; Parsons, Thomas D

    2013-01-01

    Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing.
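    The mean squared difference user-similarity heuristic described above can be sketched in a few lines. This is an illustrative reconstruction, not the authors' code: the function names and calibration vectors are hypothetical, and the real system weights auxiliary neural/physiological samples rather than toy lists.

```python
def msd_similarity(a, b):
    """User similarity from mean squared difference: 1 / (1 + MSD)."""
    msd = sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)
    return 1.0 / (1.0 + msd)

def weighted_training_pool(target_calib, other_users):
    """Pair each auxiliary user's samples with a weight given by their
    similarity to the target user's calibration responses, for use as
    auxiliary training data in transfer learning."""
    pool = []
    for calib, samples in other_users:
        pool.append((msd_similarity(target_calib, calib), samples))
    return pool

# Hypothetical calibration feature vectors for three users.
target_calib = [0.2, 0.5, 0.9]
others = [([0.25, 0.45, 0.85], ["u1_sample1", "u1_sample2"]),  # similar user
          ([0.9, 0.1, 0.2], ["u2_sample1"])]                   # dissimilar user
pool = weighted_training_pool(target_calib, others)
```

In this sketch the similar user's samples receive a weight near 1 and the dissimilar user's near 0.7, so a downstream classifier can discount the less relevant auxiliary data.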

  2. Collaborative Filtering for Brain-Computer Interaction Using Transfer Learning and Active Class Selection

    PubMed Central

    Wu, Dongrui; Lance, Brent J.; Parsons, Thomas D.

    2013-01-01

    Brain-computer interaction (BCI) and physiological computing are terms that refer to using processed neural or physiological signals to influence human interaction with computers, environment, and each other. A major challenge in developing these systems arises from the large individual differences typically seen in the neural/physiological responses. As a result, many researchers use individually-trained recognition algorithms to process this data. In order to minimize time, cost, and barriers to use, there is a need to minimize the amount of individual training data required, or equivalently, to increase the recognition accuracy without increasing the number of user-specific training samples. One promising method for achieving this is collaborative filtering, which combines training data from the individual subject with additional training data from other, similar subjects. This paper describes a successful application of a collaborative filtering approach intended for a BCI system. This approach is based on transfer learning (TL), active class selection (ACS), and a mean squared difference user-similarity heuristic. The resulting BCI system uses neural and physiological signals for automatic task difficulty recognition. TL improves the learning performance by combining a small number of user-specific training samples with a large number of auxiliary training samples from other similar subjects. ACS optimally selects the classes to generate user-specific training samples. Experimental results on 18 subjects, using both k nearest neighbors and support vector machine classifiers, demonstrate that the proposed approach can significantly reduce the number of user-specific training data samples. This collaborative filtering approach will also be generalizable to handling individual differences in many other applications that involve human neural or physiological data, such as affective computing. PMID:23437188

  3. Advanced Techniques for Removal of Retrievable Inferior Vena Cava Filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iliescu, Bogdan; Haskal, Ziv J., E-mail: ziv2@mac.com

    Inferior vena cava (IVC) filters have proven valuable for the prevention of primary or recurrent pulmonary embolism in selected patients with or at high risk for venous thromboembolic disease. Their use has become commonplace, and the numbers implanted increase annually. During the last 3 years, in the United States, the percentage of annually placed optional filters, i.e., filters that can remain as permanent filters or potentially be retrieved, has consistently exceeded that of permanent filters. In parallel, the complications of long- or short-term filtration have become increasingly evident to physicians, regulatory agencies, and the public. Most filter removals are uneventful, with a high degree of success. When routine filter-retrieval techniques prove unsuccessful, progressively more advanced tools and skill sets must be used to enhance filter-retrieval success. These techniques should be used with caution to avoid damage to the filter or cava during IVC filter retrieval. This review describes the complex techniques for filter retrieval, including use of additional snares, guidewires, angioplasty balloons, and mechanical and thermal approaches as well as illustrates their specific application.

  4. Transfemoral Filter Eversion Technique following Unsuccessful Retrieval of Option Inferior Vena Cava Filters: A Single Center Experience.

    PubMed

    Posham, Raghuram; Fischman, Aaron M; Nowakowski, Francis S; Bishay, Vivian L; Biederman, Derek M; Virk, Jaskirat S; Kim, Edward; Patel, Rahul S; Lookstein, Robert A

    2017-06-01

    This report describes the technical feasibility of using the filter eversion technique after unsuccessful retrieval attempts of Option and Option ELITE (Argon Medical Devices, Inc, Athens, Texas) inferior vena cava (IVC) filters. This technique entails the use of endoscopic forceps to evert this specific brand of IVC filter into a sheath inserted into the common femoral vein, in the opposite direction in which the filter is designed to be removed. Filter eversion was attempted in 25 cases with a median dwell time of 134 days (range, 44-2,124 d). Retrieval success was 100% (25/25 cases), with an overall complication rate of 8%. This technique warrants further study. Copyright © 2017 SIR. Published by Elsevier Inc. All rights reserved.

  5. Hybrid employment recommendation algorithm based on Spark

    NASA Astrophysics Data System (ADS)

    Li, Zuoquan; Lin, Yubei; Zhang, Xingming

    2017-08-01

    Aiming at the real-time application of collaborative filtering employment recommendation algorithm (CF), a clustering collaborative filtering recommendation algorithm (CCF) is developed, which applies hierarchical clustering to CF and narrows the query range of neighbour items. In addition, to solve the cold-start problem of content-based recommendation algorithm (CB), a content-based algorithm with users’ information (CBUI) is introduced for job recommendation. Furthermore, a hybrid recommendation algorithm (HRA) which combines CCF and CBUI algorithms is proposed, and implemented on Spark platform. The experimental results show that HRA can overcome the problems of cold start and data sparsity, and achieve good recommendation accuracy and scalability for employment recommendation.
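    The weighted combination at the heart of such a hybrid can be sketched as a linear blend with a cold-start fallback. The mixing weight `alpha` and the fallback rule are illustrative assumptions, not the HRA's actual formulation (which also involves hierarchical clustering and Spark-based parallelism).

```python
def hybrid_score(cf_score, cb_score, alpha=0.7):
    """Linear weighted hybrid: blend collaborative-filtering and
    content-based scores; fall back to the content-based score when the
    CF score is unavailable (e.g. a cold-start item or user)."""
    if cf_score is None:
        return cb_score
    return alpha * cf_score + (1 - alpha) * cb_score

warm = hybrid_score(0.8, 0.4)      # both signals available
cold = hybrid_score(None, 0.4)     # cold start: content-based only
```

The fallback is what lets the hybrid recommend items that collaborative filtering alone could never score.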

  6. Top 20 Collaborative Internet-Based Science Projects of 1998: Characteristics and Comparisons to Exemplary Science Instruction.

    ERIC Educational Resources Information Center

    Berg, Craig A.; Jefson, Cristy

    This paper utilizes the characteristics of model science instruction to identify exemplary Internet-based science collaborations. The filter for attaining "exemplary" status was based on state and national standards-generating initiatives and the corresponding implications for appropriate student activity in science classrooms. Twenty…

  7. A Cross-Domain Collaborative Filtering Algorithm Based on Feature Construction and Locally Weighted Linear Regression

    PubMed Central

    Jiang, Feng; Han, Ji-zhong

    2018-01-01

    Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain. However, previous works cannot evaluate effectively the significance of different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent different auxiliary domains. Thus the weight computation across different domains can be converted as the weight computation across different features. Then we combine the features in the target domain and in the auxiliary domains together and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it can effectively avoid underfitting or overfitting problem occurring in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring the useful knowledge from the auxiliary domains, as compared to many state-of-the-art single-domain or cross-domain CF methods. PMID:29623088
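    The LWLR step can be illustrated for a single one-dimensional query point. The Gaussian kernel and bandwidth `tau` are standard choices for locally weighted regression; FCLWLR applies the same idea to constructed cross-domain features rather than scalars, so this is only a sketch of the regression machinery.

```python
import math

def lwlr_predict(x_query, xs, ys, tau=1.0):
    """Locally Weighted Linear Regression at one query point: weight each
    training pair by a Gaussian kernel of its distance to the query, then
    fit a weighted least-squares line and evaluate it at the query."""
    w = [math.exp(-((x - x_query) ** 2) / (2 * tau ** 2)) for x in xs]
    sw = sum(w)
    swx = sum(wi * x for wi, x in zip(w, xs))
    swy = sum(wi * y for wi, y in zip(w, ys))
    swxx = sum(wi * x * x for wi, x in zip(w, xs))
    swxy = sum(wi * x * y for wi, x, y in zip(w, xs, ys))
    denom = sw * swxx - swx * swx
    if abs(denom) < 1e-12:        # degenerate design: fall back to weighted mean
        return swy / sw
    slope = (sw * swxy - swx * swy) / denom
    intercept = (swy - slope * swx) / sw
    return intercept + slope * x_query

# On exactly linear data the local fit recovers the line.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [2 * x + 1 for x in xs]
pred = lwlr_predict(2.5, xs, ys)
```

Because the fit is recomputed per query with local weights, no global parametric form is assumed, which is the nonparametric property the abstract emphasizes.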

  8. A Cross-Domain Collaborative Filtering Algorithm Based on Feature Construction and Locally Weighted Linear Regression.

    PubMed

    Yu, Xu; Lin, Jun-Yu; Jiang, Feng; Du, Jun-Wei; Han, Ji-Zhong

    2018-01-01

    Cross-domain collaborative filtering (CDCF) solves the sparsity problem by transferring rating knowledge from auxiliary domains. Obviously, different auxiliary domains have different importance to the target domain. However, previous works cannot evaluate effectively the significance of different auxiliary domains. To overcome this drawback, we propose a cross-domain collaborative filtering algorithm based on Feature Construction and Locally Weighted Linear Regression (FCLWLR). We first construct features in different domains and use these features to represent different auxiliary domains. Thus the weight computation across different domains can be converted as the weight computation across different features. Then we combine the features in the target domain and in the auxiliary domains together and convert the cross-domain recommendation problem into a regression problem. Finally, we employ a Locally Weighted Linear Regression (LWLR) model to solve the regression problem. As LWLR is a nonparametric regression method, it can effectively avoid underfitting or overfitting problem occurring in parametric regression methods. We conduct extensive experiments to show that the proposed FCLWLR algorithm is effective in addressing the data sparsity problem by transferring the useful knowledge from the auxiliary domains, as compared to many state-of-the-art single-domain or cross-domain CF methods.

  9. COMPUTING THERAPY FOR PRECISION MEDICINE: COLLABORATIVE FILTERING INTEGRATES AND PREDICTS MULTI-ENTITY INTERACTIONS.

    PubMed

    Regenbogen, Sam; Wilkins, Angela D; Lichtarge, Olivier

    2016-01-01

    Biomedicine produces copious information it cannot fully exploit. Specifically, there is considerable need to integrate knowledge from disparate studies to discover connections across domains. Here, we used a Collaborative Filtering approach, inspired by online recommendation algorithms, in which non-negative matrix factorization (NMF) predicts interactions among chemicals, genes, and diseases only from pairwise information about their interactions. Our approach, applied to matrices derived from the Comparative Toxicogenomics Database, successfully recovered Chemical-Disease, Chemical-Gene, and Disease-Gene networks in 10-fold cross-validation experiments. Additionally, we could predict each of these interaction matrices from the other two. Integrating all three CTD interaction matrices with NMF led to good predictions of STRING, an independent, external network of protein-protein interactions. Finally, this approach could integrate the CTD and STRING interaction data to improve Chemical-Gene cross-validation performance significantly, and, in a time-stamped study, it predicted information added to CTD after a given date, using only data prior to that date. We conclude that collaborative filtering can integrate information across multiple types of biological entities, and that as a first step towards precision medicine it can compute drug repurposing hypotheses.
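    The NMF step can be sketched with the classic Lee-Seung multiplicative updates on a toy interaction matrix. This is a generic illustration: the paper factorizes large CTD-derived matrices, and the rank, iteration count, and initialization below are arbitrary choices.

```python
import random

def matmul(A, B):
    """Multiply two matrices given as lists of rows."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def nmf(V, k, iters=2000, seed=0):
    """Lee-Seung multiplicative updates: factor a non-negative matrix
    V (n x m) into W (n x k) and H (k x m) so that W @ H approximates V.
    Reconstructed entries then serve as interaction predictions."""
    rng = random.Random(seed)
    n, m = len(V), len(V[0])
    W = [[rng.random() + 0.1 for _ in range(k)] for _ in range(n)]
    H = [[rng.random() + 0.1 for _ in range(m)] for _ in range(k)]
    eps = 1e-9
    for _ in range(iters):
        WT = [list(c) for c in zip(*W)]
        num, den = matmul(WT, V), matmul(matmul(WT, W), H)
        H = [[H[i][j] * num[i][j] / (den[i][j] + eps) for j in range(m)]
             for i in range(k)]
        HT = [list(c) for c in zip(*H)]
        num, den = matmul(V, HT), matmul(W, matmul(H, HT))
        W = [[W[i][j] * num[i][j] / (den[i][j] + eps) for j in range(k)]
             for i in range(n)]
    return W, H

# Toy chemical-gene style interaction matrix (rank 2, non-negative).
V = [[1.0, 0.0, 1.0],
     [0.0, 1.0, 0.0],
     [1.0, 0.0, 1.0]]
W, H = nmf(V, k=2)
R = matmul(W, H)  # reconstructed interaction scores
```

The multiplicative form keeps W and H non-negative throughout, which is what makes the factors interpretable as additive parts of the interaction signal.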

  10. COMPUTING THERAPY FOR PRECISION MEDICINE: COLLABORATIVE FILTERING INTEGRATES AND PREDICTS MULTI-ENTITY INTERACTIONS

    PubMed Central

    REGENBOGEN, SAM; WILKINS, ANGELA D.; LICHTARGE, OLIVIER

    2015-01-01

    Biomedicine produces copious information it cannot fully exploit. Specifically, there is considerable need to integrate knowledge from disparate studies to discover connections across domains. Here, we used a Collaborative Filtering approach, inspired by online recommendation algorithms, in which non-negative matrix factorization (NMF) predicts interactions among chemicals, genes, and diseases only from pairwise information about their interactions. Our approach, applied to matrices derived from the Comparative Toxicogenomics Database, successfully recovered Chemical-Disease, Chemical-Gene, and Disease-Gene networks in 10-fold cross-validation experiments. Additionally, we could predict each of these interaction matrices from the other two. Integrating all three CTD interaction matrices with NMF led to good predictions of STRING, an independent, external network of protein-protein interactions. Finally, this approach could integrate the CTD and STRING interaction data to improve Chemical-Gene cross-validation performance significantly, and, in a time-stamped study, it predicted information added to CTD after a given date, using only data prior to that date. We conclude that collaborative filtering can integrate information across multiple types of biological entities, and that as a first step towards precision medicine it can compute drug repurposing hypotheses. PMID:26776170

  11. A Performance Weighted Collaborative Filtering algorithm for personalized radiology education.

    PubMed

    Lin, Hongli; Yang, Xuedong; Wang, Weisheng; Luo, Jiawei

    2014-10-01

    Devising an accurate prediction algorithm that can predict the difficulty level of cases for individuals and then selects suitable cases for them is essential to the development of a personalized training system. In this paper, we propose a novel approach, called Performance Weighted Collaborative Filtering (PWCF), to predict the difficulty level of each case for individuals. The main idea of PWCF is to assign an optimal weight to each rating used for predicting the difficulty level of a target case for a trainee, rather than using an equal weight for all ratings as in traditional collaborative filtering methods. The assigned weight is a function of the performance level of the trainee at which the rating was made. The PWCF method and the traditional method are compared using two datasets. The experimental data are then evaluated by means of the MAE metric. Our experimental results show that PWCF outperforms the traditional methods by 8.12% and 17.05%, respectively, over the two datasets, in terms of prediction precision. This suggests that PWCF is a viable method for the development of personalized training systems in radiology education. Copyright © 2014. Published by Elsevier Inc.
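    The core idea of weighting each rating by the rater's performance level can be sketched as below. The weight function here (similarity times performance) is illustrative only; the paper assigns an optimal weight as a learned function of performance.

```python
def pwcf_predict(neighbors):
    """Predict a case's difficulty for a trainee from neighbors' ratings,
    weighting each rating by similarity times the performance level at
    which the rating was made (an illustrative weight function; PWCF
    assigns an optimal weight)."""
    num = den = 0.0
    for sim, rating, perf in neighbors:
        w = sim * perf
        num += w * rating
        den += w
    return num / den if den else None

# (similarity, difficulty rating, rater's performance level at rating time)
neighbors = [(1.0, 4.0, 0.9), (1.0, 2.0, 0.1)]
pred = pwcf_predict(neighbors)  # dominated by the high-performance rating
```

With equal weights (traditional CF) the two ratings would average to 3.0; the performance weighting pulls the prediction toward the more reliable rater.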

  12. Recommending personally interested contents by text mining, filtering, and interfaces

    DOEpatents

    Xu, Songhua

    2015-10-27

    A personalized content recommendation system includes a client interface device configured to monitor a user's information data stream. A collaborative filter remote from the client interface device generates automated predictions about the interests of the user. A database server stores personal behavioral profiles and user's preferences based on a plurality of monitored past behaviors and an output of the collaborative user personal interest inference engine. A programmed personal content recommendation server filters items in an incoming information stream with the personal behavioral profile and identifies only those items of the incoming information stream that substantially match the personal behavioral profile. The identified personally relevant content is then recommended to the user according to a priority that may take into account the strength of each personal-interest match and the context of the user's information-consumption behavior, as reflected in the user's content consumption mode.
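    The filtering step the patent describes, matching stream items against a stored behavioral profile and recommending by priority, might look like this term-overlap sketch. The Jaccard score and the threshold are stand-ins for the patent's unspecified matching and priority logic, and the names are hypothetical.

```python
def profile_match(profile_terms, item_terms):
    """Jaccard overlap between profile terms and an item's terms; a
    stand-in for the patent's profile-matching step."""
    p, t = set(profile_terms), set(item_terms)
    return len(p & t) / len(p | t) if p or t else 0.0

def recommend(profile_terms, stream, threshold=0.2):
    """Keep stream items whose match clears the threshold, ranked by
    score (a simple version of the recommendation priority)."""
    scored = [(profile_match(profile_terms, terms), item)
              for item, terms in stream]
    return [item for score, item in sorted(scored, reverse=True)
            if score >= threshold]

profile = ["physics", "quantum", "news"]
stream = [("article-a", ["quantum", "computing"]),
          ("article-b", ["sports"])]
picks = recommend(profile, stream)
```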

  13. CCD filter and transform techniques for interference excision

    NASA Technical Reports Server (NTRS)

    Borsuk, G. M.; Dewitt, R. N.

    1976-01-01

    The theoretical and some experimental results of a study aimed at applying CCD filter and transform techniques to the problem of interference excision within communications channels were presented. Adaptive noise (interference) suppression was achieved by the modification of received signals such that they were orthogonal to the recently measured noise field. CCD techniques were examined to develop real-time noise excision processing. They were recursive filters, circulating filter banks, transversal filter banks, an optical implementation of the chirp Z transform, and a CCD analog FFT.

  14. The Evaluation of Forms of Assessment Using N-Dimensional Filtering

    ERIC Educational Resources Information Center

    Dron, Jon; Boyne, Chris; Mitchell, Richard

    2004-01-01

    This paper describes the use of the CoFIND (Collaborative Filter in N Dimensions) system to evaluate two assessment styles. CoFIND is a resource database which organizes itself around its users' needs. Learners enter resources, categorize, then rate them using "qualities," aspects of resources which learners find worthwhile, the n dimensions of…

  15. Miniaturized High-Temperature Superconducting/Dielectric Multilayer Filters for Satellite Communications

    NASA Technical Reports Server (NTRS)

    Miranda, Felix A.

    1997-01-01

    Most communication satellites contain well over a hundred filters in their payload. Current technology in typical satellite multiplexers use dual-mode cavity or dielectric resonator filters that are large (approx. 25 to 125 cu in) and heavy (up to 600 g). As the complexity of future advanced electronic systems for satellite communications increases, even more filters will be needed, requiring filter miniaturization without performance degradation. Such improvements in filter technology will enhance satellite performance. To reduce the size, weight, and cost of the multiplexers without compromising performance, the NASA Lewis Research Center is collaborating with industry to develop a new class of dual-mode multilayer filters consisting of YBa2Cu3O7-delta high-temperature superconducting (HTS) thin films on LaAlO3 substrates.

  16. Wavelet-Based Signal and Image Processing for Target Recognition

    NASA Astrophysics Data System (ADS)

    Sherlock, Barry G.

    2002-11-01

    The PI visited NSWC Dahlgren, VA, for six weeks in May-June 2002 and collaborated with scientists in the G33 TEAMS facility, and with Marilyn Rudzinsky of T44 Technology and Photonic Systems Branch. During this visit the PI also presented six educational seminars to NSWC scientists on various aspects of signal processing. Several items from the grant proposal were completed, including (1) wavelet-based algorithms for interpolation of 1-d signals and 2-d images; (2) Discrete Wavelet Transform domain based algorithms for filtering of image data; (3) wavelet-based smoothing of image sequence data originally obtained for the CRITTIR (Clutter Rejection Involving Temporal Techniques in the Infra-Red) project. The PI visited the University of Stellenbosch, South Africa to collaborate with colleagues Prof. B.M. Herbst and Prof. J. du Preez on the use of wavelet image processing in conjunction with pattern recognition techniques. The University of Stellenbosch has offered the PI partial funding to support a sabbatical visit in Fall 2003, the primary purpose of which is to enable the PI to develop and enhance his expertise in Pattern Recognition. During the first year, the grant supported publication of 3 refereed papers, presentation of 9 seminars and an intensive two-day course on wavelet theory. The grant supported the work of two students who functioned as research assistants.

  17. Multiscale morphological filtering for analysis of noisy and complex images

    NASA Astrophysics Data System (ADS)

    Kher, A.; Mitra, S.

    Images acquired with passive sensing techniques suffer from illumination variations and poor local contrasts that create major difficulties in interpretation and identification tasks. On the other hand, images acquired with active sensing techniques based on monochromatic illumination are degraded with speckle noise. Mathematical morphology offers elegant techniques to handle a wide range of image degradation problems. Unlike linear filters, morphological filters do not blur the edges and hence maintain higher image resolution. Their rich mathematical framework facilitates the design and analysis of these filters as well as their hardware implementation. Morphological filters are easier to implement and are more cost effective and efficient than several conventional linear filters. Morphological filters to remove speckle noise while maintaining high resolution and preserving thin image regions that are particularly vulnerable to speckle noise were developed and applied to SAR imagery. These filters used a combination of linear (one-dimensional) structuring elements in different (typically four) orientations. Although this approach preserves more details than the simple morphological filters using two-dimensional structuring elements, the limited orientations of one-dimensional elements approximate the fine details of the region boundaries. A more robust filter designed recently overcomes the limitation of the fixed orientations. This filter uses a combination of concave and convex structuring elements. Morphological operators are also useful in extracting features from visible and infrared imagery. A multiresolution image pyramid obtained with successive filtering and a subsampling process aids in the removal of the illumination variations and enhances local contrasts. A morphology-based interpolation scheme was also introduced to reduce intensity discontinuities created in any morphological filtering task. 
The generality of morphological filtering techniques in extracting information from a wide variety of images obtained with active and passive sensing techniques is discussed. Such techniques are particularly useful in obtaining more information from fusion of complex images by different sensors such as SAR, visible, and infrared.
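    The directional filtering described above, grayscale openings with 1-D structuring elements in several orientations combined to preserve thin regions, can be sketched on a small image. Boundary handling, the element length of three, and combining the orientations by a pixelwise maximum are simplifying assumptions for illustration.

```python
# 1-D structuring elements (length 3) in four orientations, as offsets.
ELEMENTS = {
    "horizontal": [(0, -1), (0, 0), (0, 1)],
    "vertical":   [(-1, 0), (0, 0), (1, 0)],
    "diag_down":  [(-1, -1), (0, 0), (1, 1)],
    "diag_up":    [(-1, 1), (0, 0), (1, -1)],
}

def _apply(img, offsets, op):
    """Apply min (erosion) or max (dilation) over the element at each pixel."""
    h, w = len(img), len(img[0])
    return [[op(img[i + di][j + dj] for di, dj in offsets
                if 0 <= i + di < h and 0 <= j + dj < w)
             for j in range(w)] for i in range(h)]

def opening(img, offsets):
    """Grayscale opening: erosion followed by dilation."""
    return _apply(_apply(img, offsets, min), offsets, max)

def directional_speckle_filter(img):
    """Max over openings in the four orientations: removes isolated
    bright speckle but keeps thin linear structures."""
    opened = [opening(img, off) for off in ELEMENTS.values()]
    h, w = len(img), len(img[0])
    return [[max(o[i][j] for o in opened) for j in range(w)]
            for i in range(h)]

speckle = [[0] * 5 for _ in range(5)]
speckle[2][2] = 1                 # isolated bright pixel (speckle)
line = [[0] * 5 for _ in range(5)]
for j in (1, 2, 3):
    line[2][j] = 1                # thin horizontal structure
```

An isolated bright pixel is erased by every directional opening, while a thin line survives the opening aligned with it, which is exactly the thin-region-preserving behavior the abstract describes.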

  18. Multiscale Morphological Filtering for Analysis of Noisy and Complex Images

    NASA Technical Reports Server (NTRS)

    Kher, A.; Mitra, S.

    1993-01-01

    Images acquired with passive sensing techniques suffer from illumination variations and poor local contrasts that create major difficulties in interpretation and identification tasks. On the other hand, images acquired with active sensing techniques based on monochromatic illumination are degraded with speckle noise. Mathematical morphology offers elegant techniques to handle a wide range of image degradation problems. Unlike linear filters, morphological filters do not blur the edges and hence maintain higher image resolution. Their rich mathematical framework facilitates the design and analysis of these filters as well as their hardware implementation. Morphological filters are easier to implement and are more cost effective and efficient than several conventional linear filters. Morphological filters to remove speckle noise while maintaining high resolution and preserving thin image regions that are particularly vulnerable to speckle noise were developed and applied to SAR imagery. These filters used a combination of linear (one-dimensional) structuring elements in different (typically four) orientations. Although this approach preserves more details than the simple morphological filters using two-dimensional structuring elements, the limited orientations of one-dimensional elements approximate the fine details of the region boundaries. A more robust filter designed recently overcomes the limitation of the fixed orientations. This filter uses a combination of concave and convex structuring elements. Morphological operators are also useful in extracting features from visible and infrared imagery. A multiresolution image pyramid obtained with successive filtering and a subsampling process aids in the removal of the illumination variations and enhances local contrasts. A morphology-based interpolation scheme was also introduced to reduce intensity discontinuities created in any morphological filtering task. 
The generality of morphological filtering techniques in extracting information from a wide variety of images obtained with active and passive sensing techniques is discussed. Such techniques are particularly useful in obtaining more information from fusion of complex images by different sensors such as SAR, visible, and infrared.

  19. A Structural and Content-Based Analysis for Web Filtering.

    ERIC Educational Resources Information Center

    Lee, P. Y.; Hui, S. C.; Fong, A. C. M.

    2003-01-01

    Presents an analysis of the distinguishing features of pornographic Web pages so that effective filtering techniques can be developed. Surveys the existing techniques for Web content filtering and describes the implementation of a Web content filtering system that uses an artificial neural network. (Author/LRW)

  20. Spectral analysis and filtering techniques in digital spatial data processing

    USGS Publications Warehouse

    Pan, Jeng-Jong

    1989-01-01

    A filter toolbox has been developed at the EROS Data Center, US Geological Survey, for retrieving or removing specified frequency information from two-dimensional digital spatial data. This filter toolbox provides capabilities to compute the power spectrum of a given data set and to design various filters in the frequency domain. Three types of filters are available in the toolbox: point filter, line filter, and area filter. Both the point and line filters employ Gaussian-type notch filters, and the area filter includes the capabilities to perform high-pass, band-pass, low-pass, and wedge filtering techniques. These filters are applied for analyzing satellite multispectral scanner data, airborne visible and infrared imaging spectrometer (AVIRIS) data, gravity data, and digital elevation model (DEM) data. -from Author
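    Frequency-domain filtering of the kind the toolbox performs can be sketched in one dimension with a direct DFT. The toolbox operates on two-dimensional spatial data with notch and wedge filters as well; the O(N^2) transform and low-pass mask below are for illustration only.

```python
import cmath
import math

def dft(x):
    """Direct discrete Fourier transform (O(N^2), illustration only)."""
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * cmath.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    """Inverse DFT."""
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * cmath.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def lowpass(x, cutoff):
    """Zero every frequency bin above `cutoff` (keeping the conjugate-
    symmetric bins so the reconstructed signal stays real)."""
    X = dft(x)
    N = len(X)
    kept = [X[k] if min(k, N - k) <= cutoff else 0 for k in range(N)]
    return [v.real for v in idft(kept)]

# A constant level plus a high-frequency ripple; the ripple sits in
# frequency bin 3 and is removed by a cutoff of 1.
N = 8
signal = [1 + math.cos(2 * math.pi * 3 * n / N) for n in range(N)]
smoothed = lowpass(signal, cutoff=1)
```

A high-pass or band-pass mask differs only in which bins `kept` preserves.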

  1. Edge Preserved Speckle Noise Reduction Using Integrated Fuzzy Filters

    PubMed Central

    Dewal, M. L.; Rohit, Manoj Kumar

    2014-01-01

    Echocardiographic images inherently contain speckle noise, which makes visual reading and analysis quite difficult. The multiplicative speckle noise masks finer details, necessary for diagnosis of abnormalities. A novel speckle reduction technique based on integration of geometric, Wiener, and fuzzy filters is proposed and analyzed in this paper. The denoising applications of fuzzy filters are studied and analyzed along with 26 denoising techniques. It is observed that the geometric filter retains noise and, to address this issue, a Wiener filter is embedded into the geometric filter during the iteration process. The performance of the geometric-Wiener filter is further enhanced using fuzzy filters, and the proposed despeckling techniques are called integrated fuzzy filters. Fuzzy filters based on moving average and median value are employed in the integrated fuzzy filters. The performances of integrated fuzzy filters are tested on echocardiographic images and synthetic images in terms of image quality metrics. It is observed that the performance parameters are highest in case of integrated fuzzy filters in comparison to fuzzy and geometric-fuzzy filters. The clinical validation reveals that the output images obtained using geometric-Wiener, integrated fuzzy, nonlocal means, and details preserving anisotropic diffusion filters are acceptable. The necessary finer details are retained in the denoised echocardiographic images. PMID:27437499
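    The two windowed primitives the integrated fuzzy filters build on, a moving average and a median, can be sketched directly. The paper embeds these (together with Wiener filtering) inside a geometric-filter iteration, which is not reproduced here.

```python
def moving_average_filter(signal, k=1):
    """Mean over a sliding window of 2k+1 samples (shrinks at the edges)."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - k):i + k + 1]
        out.append(sum(window) / len(window))
    return out

def median_filter(signal, k=1):
    """Median over a sliding window of 2k+1 samples; suppresses
    impulsive speckle while keeping edges sharper than the mean."""
    out = []
    for i in range(len(signal)):
        window = sorted(signal[max(0, i - k):i + k + 1])
        out.append(window[len(window) // 2])
    return out

noisy = [0, 0, 9, 0, 0]           # an isolated speckle impulse
```

On this sample the median removes the impulse entirely, while the moving average only spreads it out, which is why median-based variants are favored for speckle.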

  2. Building and Running a Collaborative Internet Filter Is Akin to a Kansas Barn Raising

    ERIC Educational Resources Information Center

    Reddick, Thomas

    2004-01-01

    The Northeast Kansas Library System's filtering project started out as a response to the passage of CIPA, the Children's Internet Protection Act, in January 2001. Originally called "onGuard," it was a service that the Northeast Kansas Library System created for its members. When the Supreme Court ruling did uphold the constitutionality…

  3. Multidimensional student skills with collaborative filtering

    NASA Astrophysics Data System (ADS)

    Bergner, Yoav; Rayyan, Saif; Seaton, Daniel; Pritchard, David E.

    2013-01-01

    Despite the fact that a physics course typically culminates in one final grade for the student, many instructors and researchers believe that there are multiple skills that students acquire to achieve mastery. Assessment validation and data analysis in general may thus benefit from extension to multidimensional ability. This paper introduces an approach for model determination and dimensionality analysis using collaborative filtering (CF), which is related to factor analysis and item response theory (IRT). Model selection is guided by machine learning perspectives, seeking to maximize the accuracy in predicting which students will answer which items correctly. We apply CF to response data for the Mechanics Baseline Test and combine the results with prior analysis using unidimensional IRT.
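    A minimal sketch of the kind of collaborative filtering the paper relates to IRT: factor a binary student-by-item response matrix into low-rank "skill" factors by logistic matrix factorization, then predict which students answer which items correctly. All data, dimensions, and hyperparameters below are synthetic assumptions, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(1)
n_students, n_items, k = 30, 20, 2

# Synthetic ground truth: students and items live in a k-dimensional
# skill space, as in a multidimensional IRT model.
S = rng.standard_normal((n_students, k))
Q = rng.standard_normal((n_items, k))
prob = 1.0 / (1.0 + np.exp(-(S @ Q.T)))
R = (rng.random(prob.shape) < prob).astype(float)   # observed 0/1 responses

# Collaborative filtering: fit low-rank student and item factors by
# gradient descent on the logistic loss over all (student, item) cells.
U = 0.1 * rng.standard_normal((n_students, k))
V = 0.1 * rng.standard_normal((n_items, k))
lr, lam = 1.0, 0.01
for _ in range(1000):
    P = 1.0 / (1.0 + np.exp(-(U @ V.T)))
    G = P - R                                       # logistic-loss gradient
    U, V = (U - lr * (G @ V / n_items + lam * U),
            V - lr * (G.T @ U / n_students + lam * V))

P = 1.0 / (1.0 + np.exp(-(U @ V.T)))
acc = np.mean((P > 0.5) == (R > 0.5))
print("training accuracy:", acc)
```

    The learned row space of U plays the role of the multidimensional ability estimates; choosing k is the dimensionality-analysis question the paper addresses.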

  4. Filtering data from the collaborative initial glaucoma treatment study for improved identification of glaucoma progression.

    PubMed

    Schell, Greggory J; Lavieri, Mariel S; Stein, Joshua D; Musch, David C

    2013-12-21

    Open-angle glaucoma (OAG) is a prevalent, degenerative ocular disease which can lead to blindness without proper clinical management. The tests used to assess disease progression are susceptible to process and measurement noise. The aim of this study was to develop a methodology that accounts for the inherent noise in the data and improves the identification of significant disease progression. Longitudinal observations from the Collaborative Initial Glaucoma Treatment Study (CIGTS) were used to parameterize and validate a Kalman filter model and logistic regression function. The Kalman filter estimates the true value of biomarkers associated with OAG and forecasts future values of these variables. We develop two logistic regression models via generalized estimating equations (GEE) for calculating the probability of experiencing significant OAG progression: one model based on the raw measurements from CIGTS and another based on the Kalman filter estimates of the CIGTS data. Receiver operating characteristic (ROC) curves and associated area under the ROC curve (AUC) estimates are calculated using cross-validation. The logistic regression model developed using Kalman filter estimates as input achieves higher sensitivity and specificity than the model developed using raw measurements: the mean AUC for the Kalman filter-based model is 0.961, while the mean AUC for the raw-measurements model is 0.889. Hence, using the probability function generated via Kalman filter estimates and GEE for logistic regression, we are able to more accurately classify patients and instances as experiencing significant OAG progression. A Kalman filter approach for estimating the true value of OAG biomarkers resulted in data input which improved the accuracy of a logistic regression classification model compared to a model using raw measurements as input. This methodology accounts for process and measurement noise to enable improved discrimination between progression and nonprogression in chronic diseases.
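    The Kalman filter stage can be sketched for a single scalar biomarker. A random-walk state model and all noise variances and data below are illustrative assumptions, not the CIGTS parameterization.

```python
import numpy as np

def kalman_filter(measurements, q=0.05, r=1.0):
    """Scalar Kalman filter: random-walk state with process variance q
    and measurement variance r. Returns filtered state estimates."""
    x, p = measurements[0], 1.0          # initial state estimate and covariance
    estimates = []
    for z in measurements:
        p = p + q                        # predict: uncertainty grows
        k = p / (p + r)                  # Kalman gain
        x = x + k * (z - x)              # update: blend prediction and measurement
        p = (1 - k) * p
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(2)
true_level = np.linspace(20.0, 24.0, 80)          # slowly progressing biomarker
observed = true_level + rng.normal(0, 1.0, 80)    # noisy test results

filtered = kalman_filter(observed)
err_raw = np.mean((observed - true_level) ** 2)
err_kf = np.mean((filtered - true_level) ** 2)
print(err_raw, err_kf)
```

    The denoised series (here `filtered`) is what would be fed into the downstream logistic regression in place of the raw measurements.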

  5. Introducing ADS Labs

    NASA Astrophysics Data System (ADS)

    Accomazzi, Alberto; Henneken, E.; Grant, C. S.; Kurtz, M. J.; Di Milia, G.; Luker, J.; Thompson, D. M.; Bohlen, E.; Murray, S. S.

    2011-05-01

    ADS Labs is a platform that ADS is introducing in order to test and receive feedback from the community on new technologies and prototype services. Currently, ADS Labs features a new interface for abstract searches, faceted filtering of results, visualization of co-authorship networks, article-level recommendations, and a full-text search service. The streamlined abstract search interface provides a simple, one-box search with options for ranking results based on paper relevancy, freshness, number of citations, and downloads. In addition, it provides advanced rankings based on collaborative filtering techniques. The faceted filtering interface allows users to narrow search results based on a particular property or set of properties ("facets"), allowing users to manage large lists and explore the relationship between them. For any set or sub-set of records, the co-authorship network can be visualized in an interactive way, offering a view of the distribution of contributors and their inter-relationships. This provides an immediate way to detect groups and collaborations involved in a particular research field. For a majority of papers in Astronomy, our new interface will provide a list of related articles of potential interest. The recommendations are based on a number of factors, including text similarity, citations, and co-readership information. The new full-text search interface allows users to find all instances of particular words or phrases in the body of the articles in our full-text archive. This includes all of the scanned literature in ADS as well as a select portion of the current astronomical literature, including ApJ, ApJS, AJ, MNRAS, PASP, A&A, and soon additional content from Springer journals. Full-text search results include a list of the matching papers as well as a list of "snippets" of text highlighting the context in which the search terms were found. ADS Labs is available at http://adslabs.org

  6. An effective trust-based recommendation method using a novel graph clustering algorithm

    NASA Astrophysics Data System (ADS)

    Moradi, Parham; Ahmadian, Sajad; Akhlaghian, Fardin

    2015-10-01

    Recommender systems are programs that aim to provide personalized recommendations to users for specific items (e.g. music, books) in online sharing communities or on e-commerce sites. Collaborative filtering methods are important and widely accepted types of recommender systems that generate recommendations based on the ratings of like-minded users. On the other hand, these systems confront several inherent issues, such as the data sparsity and cold start problems, which arise because the available ratings are few relative to the predictions that must be made. Incorporating trust information into collaborative filtering systems is an attractive approach to resolving these problems. In this paper, we present a model-based collaborative filtering method that applies a novel graph clustering algorithm and also considers trust statements. In the proposed method, the problem space is first represented as a graph, and a sparsest-subgraph-finding algorithm is applied to the graph to find the initial cluster centers. Then, the proposed graph clustering algorithm is performed to obtain the appropriate user/item clusters. Finally, the identified clusters are used as a set of neighbors to recommend unseen items to the current active user. Experimental results based on three real-world datasets demonstrate that the proposed method outperforms several state-of-the-art recommender system methods.

  7. E-Learning 3.0 = E-Learning 2.0 + Web 3.0?

    ERIC Educational Resources Information Center

    Hussain, Fehmida

    2012-01-01

    Web 3.0, termed as the semantic web or the web of data is the transformed version of Web 2.0 with technologies and functionalities such as intelligent collaborative filtering, cloud computing, big data, linked data, openness, interoperability and smart mobility. If Web 2.0 is about social networking and mass collaboration between the creator and…

  8. Regenerative particulate filter development

    NASA Technical Reports Server (NTRS)

    Descamp, V. A.; Boex, M. W.; Hussey, M. W.; Larson, T. P.

    1972-01-01

    Development, design, and fabrication of a prototype filter regeneration unit for regenerating clean fluid particle filter elements by using a backflush/jet impingement technique are reported. Development tests were also conducted on a vortex particle separator designed for use in zero gravity environment. A maintainable filter was designed, fabricated and tested that allows filter element replacement without any leakage or spillage of system fluid. Also described are spacecraft fluid system design and filter maintenance techniques with respect to inflight maintenance for the space shuttle and space station.

  9. The use of linear programming techniques to design optimal digital filters for pulse shaping and channel equalization

    NASA Technical Reports Server (NTRS)

    Houts, R. C.; Burlage, D. W.

    1972-01-01

    A time domain technique is developed to design finite-duration impulse response digital filters using linear programming. Two related applications of this technique in data transmission systems are considered. The first is the design of pulse shaping digital filters to generate or detect signaling waveforms transmitted over bandlimited channels that are assumed to have ideal low pass or bandpass characteristics. The second is the design of digital filters to be used as preset equalizers in cascade with channels that have known impulse response characteristics. Example designs are presented which illustrate that excellent waveforms can be generated with frequency-sampling filters and the ease with which digital transversal filters can be designed for preset equalization.
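    The Chebyshev (minimax) flavor of this design problem maps directly onto a linear program. Below is a sketch using `scipy.optimize.linprog` for a length-21 linear-phase lowpass FIR filter; the band edges and grid density are arbitrary illustrative choices, not the paper's examples.

```python
import numpy as np
from scipy.optimize import linprog

M = 10                                   # half-order; filter length is 2M + 1
wp, ws = 0.3 * np.pi, 0.5 * np.pi        # passband / stopband edges (rad/sample)

# Frequency grid over passband (desired gain 1) and stopband (desired gain 0).
w_pass = np.linspace(0, wp, 40)
w_stop = np.linspace(ws, np.pi, 40)
w = np.concatenate([w_pass, w_stop])
d = np.concatenate([np.ones_like(w_pass), np.zeros_like(w_stop)])

# Amplitude response A(w) = a0 + sum_m a_m cos(m w). Minimize the Chebyshev
# error delta subject to |A(w_k) - D(w_k)| <= delta for every grid point.
C = np.cos(np.outer(w, np.arange(M + 1)))            # cosine basis matrix
A_ub = np.block([[C, -np.ones((len(w), 1))],
                 [-C, -np.ones((len(w), 1))]])
b_ub = np.concatenate([d, -d])
c = np.zeros(M + 2); c[-1] = 1.0                     # objective: delta

res = linprog(c, A_ub=A_ub, b_ub=b_ub,
              bounds=[(None, None)] * (M + 1) + [(0, None)])
a = res.x[:-1]
h = np.concatenate([a[:0:-1] / 2, [a[0]], a[1:] / 2])  # symmetric impulse response
print("ripple:", res.fun, "taps:", len(h))
```

    The symmetric tap vector `h` gives exact linear phase, and `res.fun` is the equiripple deviation on both bands.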

  10. Vibrato in Singing Voice: The Link between Source-Filter and Sinusoidal Models

    NASA Astrophysics Data System (ADS)

    Arroabarren, Ixone; Carlosena, Alfonso

    2004-12-01

    The application of inverse filtering techniques for high-quality singing voice analysis/synthesis is discussed. In the context of source-filter models, inverse filtering provides a noninvasive method to extract the voice source, and thus to study voice quality. Although this approach is widely used in speech synthesis, this is not the case in singing voice. Several studies have proved that inverse filtering techniques fail in the case of singing voice, the reasons being unclear. In order to shed light on this problem, we will consider here an additional feature of singing voice, not present in speech: the vibrato. Vibrato has been traditionally studied by sinusoidal modeling. As an alternative, we will introduce here a novel noninteractive source filter model that incorporates the mechanisms of vibrato generation. This model will also allow the comparison of the results produced by inverse filtering techniques and by sinusoidal modeling, as they apply to singing voice and not to speech. In this way, the limitations of these conventional techniques, described in previous literature, will be explained. Both synthetic signals and singer recordings are used to validate and compare the techniques presented in the paper.

  11. Quantitative filter forensics for indoor particle sampling.

    PubMed

    Haaland, D; Siegel, J A

    2017-03-01

    Filter forensics is a promising indoor air investigation technique involving the analysis of dust which has collected on filters in central forced-air heating, ventilation, and air conditioning (HVAC) or portable systems to determine the presence of indoor particle-bound contaminants. In this study, we summarize past filter forensics research to explore what it reveals about the sampling technique and the indoor environment. There are 60 investigations in the literature that have used this sampling technique for a variety of biotic and abiotic contaminants. Many studies identified differences between contaminant concentrations in different buildings using this technique. Based on this literature review, we identified a lack of quantification as a gap in the past literature. Accordingly, we propose an approach to quantitatively link contaminants extracted from HVAC filter dust to time-averaged integrated air concentrations. This quantitative filter forensics approach has great potential to measure indoor air concentrations of a wide variety of particle-bound contaminants. Future studies directly comparing quantitative filter forensics to alternative sampling techniques are required to fully assess this approach, but analysis of past research suggests the enormous possibility of this approach. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
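    The proposed quantitative link can be illustrated as a back-of-envelope mass balance: dividing the contaminant mass extracted from the filter dust by the product of capture efficiency, airflow rate, and runtime yields a time-averaged air concentration. Every number below is hypothetical, not from the study.

```python
# Hypothetical quantitative filter forensics calculation (illustrative values).
mass_extracted_ug = 50.0      # contaminant recovered from the filter (micrograms)
flow_rate_m3_h = 1700.0       # airflow through the HVAC system (m^3/hour)
runtime_h = 720.0             # system runtime over the sampling period (hours)
capture_efficiency = 0.35     # fraction of airborne particles the filter retains

air_volume_m3 = flow_rate_m3_h * runtime_h
concentration_ug_m3 = mass_extracted_ug / (capture_efficiency * air_volume_m3)
print(concentration_ug_m3, "ug/m^3")
```

    In practice the capture efficiency term is the hard part, since it varies with particle size and filter loading, which is one reason the authors call for direct comparisons against alternative sampling techniques.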

  12. Guenter Tulip Filter Retrieval Experience: Predictors of Successful Retrieval

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turba, Ulku Cenk, E-mail: uct5d@virginia.edu; Arslan, Bulent, E-mail: ba6e@virginia.edu; Meuse, Michael, E-mail: mm5tz@virginia.edu

    We report our experience with Guenter Tulip filter placement indications, retrievals, and procedural problems, with emphasis on alternative retrieval techniques. We identified 92 consecutive patients in whom a Guenter Tulip filter was placed and filter removal was attempted. We recorded patient demographic information, filter placement and retrieval indications, procedures, standard and nonstandard filter retrieval techniques, complications, and clinical outcomes. The mean time to retrieval for those who experienced filter strut penetration was statistically significant [F(1,90) = 8.55, p = 0.004]. Filter strut(s) IVC penetration and successful retrieval were found to be statistically significantly associated (p = 0.043). The filter hook-IVC relationship correlated with successful retrieval. A modified guidewire loop technique was applied in 8 of 10 cases where the hook appeared to penetrate the IVC wall and could not be engaged with a loop snare catheter, providing additional technical success in 6 of 8 (75%). The total filter retrieval success therefore increased from 88% to 95%. In conclusion, the Guenter Tulip filter has high retrieval success rates with low rates of complication. Additional maneuvers such as the guidewire loop method can be used to improve retrieval success rates when the filter hook is endothelialized.

  13. Retrievable Inferior Vena Cava Filters in Trauma Patients: Prevalence and Management of Thrombus Within the Filter.

    PubMed

    Pan, Y; Zhao, J; Mei, J; Shao, M; Zhang, J; Wu, H

    2016-12-01

    The incidence of thrombus was investigated within retrievable filters placed in trauma patients with confirmed DVT at the time of retrieval and the optimal treatment for this clinical scenario was assessed. A technique called "filter retrieval with manual negative pressure aspiration thrombectomy" for management of filter thrombus was introduced and assessed. The retrievable filters referred for retrieval between January 2008 and December 2015 were retrospectively reviewed to determine the incidence of filter thrombus on a pre-retrieval cavogram. The clinical outcomes of different managements for thrombus within filters were recorded and analyzed. During the study 764 patients having Aegisy Filters implanted were referred for filter removal, from which 236 cases (134 male patients, mean age 50.2 years) of thrombus within the filter were observed on initial pre-retrieval IVC venogram 12-39 days after insertion (average 16.9 days). The incidence of infra-filter thrombus was 30.9%, and complete occlusion of the filter bearing IVC was seen in 2.4% (18) of cases. Retrieval was attempted in all 121 cases with small clots using a regular snare and sheath technique, and was successful in 120. A total of 116 cases with massive thrombus and IVC occlusion by thrombus were treated by CDT and/or the new retrieval technique. Overall, 213 cases (90.3%) of thrombus in the filter were removed successfully without PE. A small thrombus within the filter can be safely removed without additional management. CDT for reduction of the clot burden in filters was effective and safe. Filter retrieval with manual negative pressure aspiration thrombectomy seemed reasonable and valuable for management of massive thrombus within filters in some patients. Full assessment of the value and safety of this technique requires additional studies. Copyright © 2016 European Society for Vascular Surgery. Published by Elsevier Ltd. All rights reserved.

  14. Image processing and recognition for biological images

    PubMed Central

    Uchida, Seiichi

    2013-01-01

    This paper reviews image processing and pattern recognition techniques, which will be useful to analyze bioimages. Although this paper does not provide their technical details, it will be possible to grasp their main tasks and typical tools to handle the tasks. Image processing is a large research area aimed at improving the visibility of an input image and acquiring valuable information from it. As the main tasks of image processing, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow, and image registration. Image pattern recognition is the technique of classifying an input image into one of a set of predefined classes, and it is also a large research area. This paper overviews its two main modules: the feature extraction module and the classification module. Throughout the paper, it is emphasized that the bioimage is a very difficult target for even state-of-the-art image processing and pattern recognition techniques, owing to noise, deformation, and other factors. This paper is expected to be one tutorial guide to bridge biology and image processing researchers for their further collaboration to tackle such a difficult target. PMID:23560739

  15. Collaborative study of an enzymatic digestion method for the isolation of light filth from ground beef or hamburger.

    PubMed

    Alioto, P; Andreas, M

    1976-01-01

    Collaborative results are presented for a proposed method for light filth extraction from ground beef or hamburger. The method involves enzymatic digestion, wet sieving, and extraction with light mineral oil from 40% isopropanol. Recoveries are good and filter papers are clean. This method has been adopted as official first action.

  16. Wavelet Transform Based Filter to Remove the Notches from Signal Under Harmonic Polluted Environment

    NASA Astrophysics Data System (ADS)

    Das, Sukanta; Ranjan, Vikash

    2017-12-01

    The work proposes to eliminate the notches present in the synchronizing signal required for converter operation, which appear due to the switching of semiconductor devices connected to the system in a harmonic-polluted environment. The disturbances in the signal are suppressed by a novel wavelet-based filtering technique. In the proposed technique, the notches in the signal are detected and eliminated by a wavelet-based multi-rate filter using `Daubechies4' (db4) as the mother wavelet. The computational complexity of the adopted technique is much lower than that of conventional notch filtering techniques. The proposed technique is developed in MATLAB/Simulink and finally validated on a dSPACE-1103 interface. The recovered signal thus obtained is almost free of notches.

  17. A Comparison of Retrievability: Celect versus Option Filter.

    PubMed

    Ryu, Robert K; Desai, Kush; Karp, Jennifer; Gupta, Ramona; Evans, Alan Emerson; Rajeswaran, Shankar; Salem, Riad; Lewandowski, Robert J

    2015-06-01

    To compare the retrievability of 2 potentially retrievable inferior vena cava filter devices. A retrospective, institutional review board-approved study of Celect (Cook, Inc, Bloomington, Indiana) and Option (Rex Medical, Conshohocken, Pennsylvania) filters was conducted over a 33-month period at a single institution. Fluoroscopy time, significant filter tilt, use of adjunctive retrieval technique, and strut perforation in the inferior vena cava were recorded on retrieval. Fisher exact test and Mann-Whitney-Wilcoxon test were used for comparison. There were 99 Celect and 86 Option filters deployed. After an average of 2.09 months (range, 0.3-7.6 mo) and 1.94 months (range, 0.47-9.13 mo), respectively, 59% (n = 58) of patients with Celect filters and 74.7% (n = 65) of patients with Option filters presented for filter retrieval. Retrieval failure rates were 3.4% for Celect filters versus 7.7% for Option filters (P = .45). Median fluoroscopy retrieval times were 4.25 minutes for Celect filters versus 6 minutes for Option filters (P = .006). Adjunctive retrieval techniques were used in 5.4% of Celect filter retrievals versus 18.3% of Option filter retrievals (P = .045). The incidence of significant tilting was 8.9% for Celect filters versus 16.7% for Option filters (P = .27). The incidence of strut perforation was 43% for Celect filters versus 0% for Option filters (P < .0001). Retrieval rates for the Celect and Option filters were not significantly different. However, retrieval of the Option filter required a significantly increased amount of fluoroscopy time compared with the Celect filter, and there was a significantly greater usage of adjunctive retrieval techniques for the Option filter. The Celect filter had a significantly higher rate of strut perforation. Copyright © 2015 SIR. Published by Elsevier Inc. All rights reserved.

  18. Multi-Beam Radio Frequency (RF) Aperture Arrays Using Multiplierless Approximate Fast Fourier Transform (FFT)

    DTIC Science & Technology

    2017-08-01

    filtering, correlation and radio-astronomy. In this report approximate transforms that closely follow the DFT have been studied and found. The approximate...communications, data networks, sensor networks, cognitive radio, radar and beamforming, imaging, filtering, correlation and radio-astronomy. FFTs efficiently...public release; distribution is unlimited. 4.3 Digital Hardware and Design Architectures Collaboration for Astronomy Signal Processing and Electronics

  19. Retrieval of Tip-embedded Inferior Vena Cava Filters by Using the Endobronchial Forceps Technique: Experience at a Single Institution.

    PubMed

    Stavropoulos, S William; Ge, Benjamin H; Mondschein, Jeffrey I; Shlansky-Goldberg, Richard D; Sudheendra, Deepak; Trerotola, Scott O

    2015-06-01

    To evaluate the use of endobronchial forceps to retrieve tip-embedded inferior vena cava (IVC) filters. This institutional review board-approved, HIPAA-compliant retrospective study included 114 patients who presented with tip-embedded IVC filters for removal from January 2005 to April 2014. The included patients consisted of 77 women and 37 men with a mean age of 43 years (range, 18-79 years). Filters were identified as tip embedded by using rotational venography. Rigid bronchoscopy forceps were used to dissect the tip or hook of the filter from the wall of the IVC. The filter was then removed through the sheath by using the endobronchial forceps. Statistical analysis entailed calculating percentages, ranges, and means. The endobronchial forceps technique was used to successfully retrieve 109 of 114 (96%) tip-embedded IVC filters on an intention-to-treat basis. Five failures occurred in four patients in whom the technique was attempted but failed and one patient in whom retrieval was not attempted. Filters were in place for a mean of 465 days (range, 31-2976 days). The filters in this study included 10 Recovery, 33 G2, eight G2X, 11 Eclipse, one OptEase, six Option, 13 Günther Tulip, one ALN, and 31 Celect filters. Three minor complications and one major complication occurred, with no permanent sequelae. The endobronchial forceps technique can be safely used to remove tip-embedded IVC filters. © RSNA, 2014.

  20. Measuring the Interestingness of Articles in a Limited User Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pon, Raymond K.

    Search engines, such as Google, assign scores to news articles based on their relevancy to a query. However, not all relevant articles for the query may be interesting to a user. For example, if the article is old or yields little new information, the article would be uninteresting. Relevancy scores do not take into account what makes an article interesting, which varies from user to user. Although methods such as collaborative filtering have been shown to be effective in recommendation systems, in a limited user environment there are not enough users to make collaborative filtering effective. A general framework, called iScore, is presented for defining and measuring the 'interestingness' of articles, incorporating user feedback. iScore addresses various aspects of what makes an article interesting, such as topic relevancy, uniqueness, freshness, source reputation, and writing style. It employs various methods to measure these features and uses a classifier operating on these features to recommend articles. The basic iScore configuration is shown to improve recommendation results by as much as 20%. In addition to the basic iScore features, additional features are presented to address the deficiencies of existing feature extractors, such as one that tracks multiple topics, called MTT, and a version of the Rocchio algorithm that learns its parameters online as it processes documents, called eRocchio. The inclusion of MTT and eRocchio in iScore is shown to improve iScore recommendation results by as much as 3.1% and 5.6%, respectively. Additionally, in the TREC11 Adaptive Filter Task, eRocchio is shown to be 10% better than the best filter in the last run of the task. In addition to these two major topic relevancy measures, other features are also introduced that employ language models, phrases, clustering, and changes in topics to improve recommendation results. These additional features are shown to improve recommendation results by iScore by up to 14%. Due to the varying reasons users hold for why an article is interesting, an online feature selection method in naive Bayes is also introduced. Online feature selection can improve recommendation results in iScore by up to 18.9%. In summary, iScore in its best configuration can outperform traditional IR techniques by as much as 50.7%. iScore and its components are evaluated in the news recommendation task using three datasets from Yahoo! News, actual users, and Digg, and in the TREC Adaptive Filter task using the Reuters RCV1 corpus.
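    The classic Rocchio relevance-feedback update, which eRocchio adapts to online learning, can be sketched as follows. The weights and toy term vectors are illustrative assumptions; the online parameter-learning part of eRocchio is not reproduced here.

```python
import numpy as np

def rocchio_update(query, relevant, nonrelevant, alpha=1.0, beta=0.75, gamma=0.15):
    """Classic Rocchio relevance feedback: move the profile vector toward
    documents the user liked and away from those they did not."""
    q = alpha * query
    if len(relevant):
        q = q + beta * np.mean(relevant, axis=0)
    if len(nonrelevant):
        q = q - gamma * np.mean(nonrelevant, axis=0)
    return q

# Toy tf-idf-like vectors over a 4-term vocabulary.
profile = np.array([1.0, 0.0, 0.0, 0.0])
liked = np.array([[0.9, 0.8, 0.0, 0.0],
                  [0.7, 0.9, 0.1, 0.0]])
disliked = np.array([[0.0, 0.0, 0.9, 0.8]])

profile = rocchio_update(profile, liked, disliked)
print(profile)
```

    After the update the profile gains weight on terms from liked articles (index 1) and negative weight on terms from disliked ones (index 3), which is the mechanism an interest-tracking recommender exploits.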

  1. Swarm Intelligence for Optimizing Hybridized Smoothing Filter in Image Edge Enhancement

    NASA Astrophysics Data System (ADS)

    Rao, B. Tirumala; Dehuri, S.; Dileep, M.; Vindhya, A.

    In this modern era, image transmission and processing play a major role: it would be impossible to retrieve information from satellite and medical images without the help of image processing techniques. Edge enhancement is an image processing step that enhances the edge contrast of an image or video in an attempt to improve its acutance. Edges are representations of the discontinuities of image intensity functions, and a good edge enhancement technique is essential for processing these discontinuities in an image. The proposed work uses a new idea for edge enhancement based on hybridized smoothing filters and introduces a promising technique for obtaining the best hybrid filter using swarm algorithms (Artificial Bee Colony (ABC), Particle Swarm Optimization (PSO), and Ant Colony Optimization (ACO)) to search for an optimal sequence of filters from among a set of rather simple, representative image processing filters. This paper deals with the analysis of these swarm intelligence techniques through the combination of the hybrid filters generated by the algorithms for image edge enhancement.

  2. Denali, Tulip, and Option Inferior Vena Cava Filter Retrieval: A Single Center Experience.

    PubMed

    Ramaswamy, Raja S; Jun, Emily; van Beek, Darren; Mani, Naganathan; Salter, Amber; Kim, Seung K; Akinwande, Olaguoke

    2018-04-01

    To compare the technical success of filter retrieval in Denali, Tulip, and Option inferior vena cava filters. A retrospective analysis of Denali, Gunther Tulip, and Option IVC filters was conducted. Retrieval failure rates, fluoroscopy time, sedation time, use of advanced retrieval techniques, and filter-related complications that led to retrieval failure were recorded. There were 107 Denali, 43 Option, and 39 Tulip filters deployed and removed with average dwell times of 93.5, 86.0, and 131 days, respectively. Retrieval failure rates were 0.9% for Denali, 11.6% for Option, and 5.1% for Tulip filters (Denali vs. Option p = 0.018; Denali vs. Tulip p = 0.159; Tulip vs. Option p = 0.045). Median fluoroscopy time for filter retrieval was 3.2 min for the Denali filter, 6.75 min for the Option filter, and 4.95 min for the Tulip filter (Denali vs. Option p < 0.01; Denali vs. Tulip p < 0.01; Tulip vs. Option p = 0.67). Advanced retrieval techniques were used in 0.9% of Denali filters, 21.1% in Option filters, and 10.8% in Tulip filters (Denali vs. Option p < 0.01; Denali vs. Tulip p < 0.01; Tulip vs. Option p < 0.01). Filter retrieval failure rates were significantly higher for the Option filter when compared to both the Denali and Tulip filters. Retrieval of the Denali filter required significantly less amount of fluoroscopy time and use of advanced retrieval techniques when compared to both the Option and Tulip filters. The findings of this study indicate easier retrieval of the Denali and Tulip IVC filters when compared to the Option filter.

  3. Speeding Up the Bilateral Filter: A Joint Acceleration Way.

    PubMed

    Dai, Longquan; Yuan, Mengke; Zhang, Xiaopeng

    2016-06-01

    The computational complexity of the brute-force implementation of the bilateral filter (BF) depends on its filter kernel size. To achieve a constant-time BF whose complexity is independent of the kernel size, many techniques have been proposed, such as 2D box filtering, dimension promotion, and the shiftability property. Although each of these techniques suffers from accuracy and efficiency problems, previous algorithm designers typically adopted only one of them when assembling fast implementations, owing to the difficulty of combining them; no joint exploitation of these techniques had been proposed to construct a new cutting-edge implementation that solves these problems. Jointly employing five techniques (kernel truncation, best N-term approximation, as well as the previous 2D box filtering, dimension promotion, and shiftability property), we propose a unified framework to transform a BF with arbitrary spatial and range kernels into a set of 3D box filters that can be computed in linear time. To the best of our knowledge, our algorithm is the first method that can integrate all these acceleration techniques and can therefore draw upon their strong points to overcome their deficiencies. The strength of our method has been corroborated by several carefully designed experiments. In particular, the filtering accuracy is significantly improved without sacrificing efficiency at running time.
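    The brute-force bilateral filter whose kernel-size-dependent cost motivates this work can be sketched directly; each output pixel is a spatial- and range-weighted average of its neighborhood. The parameters and test image are illustrative.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter. Cost per pixel grows with the kernel
    size, which is exactly what constant-time methods eliminate."""
    pad = np.pad(img, radius, mode="edge")
    out = np.empty_like(img)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys**2 + xs**2) / (2 * sigma_s**2))      # spatial kernel
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            range_w = np.exp(-(patch - img[i, j])**2 / (2 * sigma_r**2))
            w = spatial * range_w                              # joint kernel
            out[i, j] = np.sum(w * patch) / np.sum(w)
    return out

# Noisy step edge: the filter smooths flat regions but preserves the edge,
# because pixels across the edge receive near-zero range weight.
rng = np.random.default_rng(3)
step = np.where(np.arange(32)[None, :] < 16, 0.0, 1.0) * np.ones((32, 1))
noisy = step + 0.05 * rng.standard_normal(step.shape)
smoothed = bilateral_filter(noisy)
print(np.std(noisy - step), np.std(smoothed - step))
```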

  4. Symmetric Phase Only Filtering for Improved DPIV Data Processing

    NASA Technical Reports Server (NTRS)

    Wernet, Mark P.

    2006-01-01

    The standard approach in Digital Particle Image Velocimetry (DPIV) data processing is to use Fast Fourier Transforms to obtain the cross-correlation of two single-exposure subregions, where the location of the cross-correlation peak is representative of the most probable particle displacement across the subregion. This standard DPIV processing technique is analogous to Matched Spatial Filtering, a technique commonly used in optical correlators to perform the cross-correlation operation. Phase-only filtering is a well-known variation of Matched Spatial Filtering which, when used to process DPIV image data, yields correlation peaks that are narrower and up to an order of magnitude larger than those obtained using traditional DPIV processing. In addition to possessing desirable correlation-plane features, phase-only filters also provide superior performance in the presence of DC noise in the correlation subregion. When DPIV image subregions contaminated with surface flare light or high background noise levels are processed using phase-only filters, the correlation peak pertaining only to the particle displacement is readily detected above any signal stemming from the DC objects. Tedious image masking or background image subtraction is not required. Both theoretical and experimental analyses of the signal-to-noise ratio performance of the filter functions are presented. In addition, a new Symmetric Phase Only Filtering (SPOF) technique, a variation on the traditional phase-only filtering technique, is described and demonstrated. The SPOF technique exceeds the performance of the traditionally accepted phase-only filtering techniques and is easily implemented in standard DPIV FFT-based correlation processing with no significant computational performance penalty. An "automatic" SPOF algorithm is presented which determines when the SPOF is able to provide better signal-to-noise results than traditional PIV processing. The SPOF-based optical correlation processing approach is presented as a new paradigm for more robust cross-correlation processing of low signal-to-noise ratio DPIV image data.
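
    As a sketch of the core idea (not the paper's implementation), the difference between standard FFT correlation and phase-only filtering can be illustrated in NumPy; the subregion size, the synthetic shift, and all function names below are illustrative:

```python
import numpy as np

def cross_correlate(a, b, phase_only=False):
    """Cross-correlate two subregions via FFT. With phase_only=True the
    spectrum magnitude is discarded (phase-only filtering), which
    sharpens the correlation peak and suppresses DC background terms."""
    spec = np.fft.fft2(a) * np.conj(np.fft.fft2(b))
    if phase_only:
        spec = spec / (np.abs(spec) + 1e-12)  # keep phase, drop magnitude
    return np.fft.fftshift(np.real(np.fft.ifft2(spec)))

def peak_displacement(corr):
    """Displacement from the center of the correlation plane to its peak."""
    iy, ix = np.unravel_index(np.argmax(corr), corr.shape)
    return int(iy) - corr.shape[0] // 2, int(ix) - corr.shape[1] // 2

rng = np.random.default_rng(0)
frame1 = rng.random((64, 64))
frame2 = np.roll(frame1, (3, -2), axis=(0, 1))  # known "particle" shift
corr = cross_correlate(frame2, frame1, phase_only=True)
print(peak_displacement(corr))  # (3, -2)
```

    With `phase_only=True` the correlation plane collapses toward a sharp peak at the particle displacement, which is what makes the technique robust to DC objects such as surface flare light.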

  5. Reference Architecture for MNE 5 Technical System

    DTIC Science & Technology

    2007-05-30

    of being available in most experiments. Core Services: a core set of applications (directories, web portal and collaboration applications, etc.)…classifications, messages (XML, JMS, content level…), metadata filtering, who can initiate services, web browsing, collaboration & messaging, border…audit logging, person and machine, data level, objects, web services, messages…

  6. Application of a modified complementary filtering technique for increased aircraft control system frequency bandwidth in high vibration environment

    NASA Technical Reports Server (NTRS)

    Garren, J. F., Jr.; Niessen, F. R.; Abbott, T. S.; Yenni, K. R.

    1977-01-01

    A modified complementary filtering technique for estimating aircraft roll rate was developed and flown in a research helicopter to determine whether higher gains could be achieved. Use of this technique did, in fact, permit a substantial increase in system frequency bandwidth because, in comparison with first-order filtering, it reduced both noise amplification and control limit-cycle tendencies.
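
    The abstract does not give the filter equations; the following is a generic first-order complementary filter sketch in plain Python, showing the principle of blending an integrated rate measurement (high-pass path) with a noisy attitude measurement (low-pass path). All constants and signals are illustrative, not from the flight experiment:

```python
import math

def complementary_filter(angle_meas, rate_meas, dt, alpha=0.98):
    """Fuse a noisy angle measurement with an integrated rate measurement.
    alpha near 1 trusts the rate sensor at high frequency while the angle
    measurement corrects slow drift; the two paths sum to an all-pass
    response, hence 'complementary'."""
    est = angle_meas[0]
    out = []
    for a, r in zip(angle_meas, rate_meas):
        est = alpha * (est + r * dt) + (1 - alpha) * a
        out.append(est)
    return out

dt, n = 0.01, 1000
truth = [i * dt for i in range(n)]                    # true angle: 1 rad/s ramp
rate = [1.0] * n                                      # clean rate signal
angle = [x + 0.5 * math.sin(50 * x) for x in truth]   # vibration-corrupted angle
est = complementary_filter(angle, rate, dt)
print(abs(est[-1] - truth[-1]) < 0.1)  # True: vibration largely rejected
```

    The high-frequency vibration on the angle channel is attenuated by the factor (1 - alpha), which is the same mechanism that reduced noise amplification and limit-cycle tendencies in the flight system.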

  7. A Comparison of Collaborative and Traditional Instruction in Higher Education

    ERIC Educational Resources Information Center

    Gubera, Chip; Aruguete, Mara S.

    2013-01-01

    Although collaborative instructional techniques have become popular in college courses, it is unclear whether collaborative techniques can replace more traditional instructional methods. We examined the efficacy of collaborative courses (in-class, collaborative activities with no lectures) compared to traditional lecture courses (in-class,…

  8. Video denoising using low rank tensor decomposition

    NASA Astrophysics Data System (ADS)

    Gui, Lihua; Cui, Gaochao; Zhao, Qibin; Wang, Dongsheng; Cichocki, Andrzej; Cao, Jianting

    2017-03-01

    Reducing noise in a video sequence is of vital importance in many real-world applications. One popular method is block-matching collaborative filtering. However, the main drawback of this method is that the noise standard deviation for the whole video sequence must be known in advance. In this paper, we present a tensor-based denoising framework that considers 3D patches instead of 2D patches. By collecting similar 3D patches non-locally, we employ low-rank tensor decomposition for collaborative filtering. Since we specify a non-informative prior over the noise precision parameter, the noise variance can be inferred automatically from the observed video data. Therefore, our method is more practical, as it does not require knowing the noise variance. Experiments on video denoising demonstrate the effectiveness of our proposed method.
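
    The paper's method is a Bayesian low-rank tensor decomposition with automatic noise inference; as a much simpler 2-D analogue, the sketch below shows why stacking similar patches and truncating a factorization suppresses noise. The rank, sizes, and noise level are illustrative:

```python
import numpy as np

def low_rank_denoise(X, rank):
    """Best rank-`rank` approximation of X via truncated SVD. Stacking
    similar patches as columns makes the clean signal low rank, so
    truncation discards mostly the (full-rank) noise."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :rank] * s[:rank]) @ Vt[:rank]

rng = np.random.default_rng(1)
clean = rng.random((50, 3)) @ rng.random((3, 40))     # rank-3 "patch stack"
noisy = clean + 0.1 * rng.standard_normal(clean.shape)
den = low_rank_denoise(noisy, 3)
print(np.linalg.norm(den - clean) < np.linalg.norm(noisy - clean))  # True
```

    The paper replaces this fixed-rank truncation with a probabilistic tensor model in which the effective rank and the noise variance are inferred rather than supplied.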

  9. Error analysis of stochastic gradient descent ranking.

    PubMed

    Chen, Hong; Tang, Yi; Li, Luoqing; Yuan, Yuan; Li, Xuelong; Tang, Yuanyan

    2013-06-01

    Ranking is an important task in machine learning and information retrieval, e.g., in collaborative filtering, recommender systems, drug discovery, etc. A kernel-based stochastic gradient descent algorithm with the least squares loss is proposed for ranking in this paper. The implementation of this algorithm is simple, and an expression for the solution is derived via a sampling operator and an integral operator. An explicit convergence rate for learning a ranking function is given in terms of suitable choices of the step size and the regularization parameter. The analysis technique used here is capacity independent and is novel in the error analysis of ranking learning. Experimental results on real-world data show the effectiveness of the proposed algorithm in ranking tasks, which verifies the theoretical analysis of the ranking error.
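
    As an illustration of stochastic gradient descent ranking with the least squares loss, here is a minimal sketch using a linear scorer in place of the paper's kernel expansion; the step size, regularization parameter, and synthetic data are illustrative:

```python
import numpy as np

def sgd_rank(X, y, steps=20000, eta=0.01, lam=1e-4, seed=0):
    """SGD for ranking with the least squares loss on random pairs:
    minimize (w·(xi - xj) - (yi - yj))^2 + lam*|w|^2. A linear scorer
    stands in for the paper's kernel-based ranking function."""
    rng = np.random.default_rng(seed)
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(steps):
        i, j = rng.integers(n, size=2)          # sample one pair
        d = X[i] - X[j]
        g = 2 * (w @ d - (y[i] - y[j])) * d + 2 * lam * w
        w -= eta * g                            # fixed-step-size update
    return w

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 4))
y = X @ np.array([1.0, -2.0, 0.5, 0.0])         # noiseless relevance scores
w = sgd_rank(X, y)
print(np.corrcoef(X @ w, y)[0, 1] > 0.95)       # learned scores rank like y
```

    The convergence analysis in the paper concerns exactly this kind of scheme: the rate depends on how the step size and regularization parameter are chosen together.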

  10. Design of order statistics filters using feedforward neural networks

    NASA Astrophysics Data System (ADS)

    Maslennikova, Yu. S.; Bochkarev, V. V.

    2016-08-01

    In recent years significant progress has been made in the development of nonlinear data processing techniques. Such techniques are widely used in digital data filtering and image enhancement. Many of the most effective nonlinear filters are based on order statistics; the widely used median filter is the best-known order statistic filter. A generalized form of these filters can be derived based on Lloyd's statistics. Filters based on order statistics have excellent robustness properties in the presence of impulsive noise. In this paper, we present a special approach for the synthesis of order statistics filters using artificial neural networks. Optimal Lloyd's statistics are used for selecting the initial weights of the neural network. The adaptive properties of neural networks provide opportunities to optimize order statistics filters for data with asymmetric distribution functions. Different examples demonstrate the properties and performance of the presented approach.
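
    A minimal sketch makes the generalization concrete: an order-statistic (L-)filter sorts each window and takes a weighted sum of the sorted samples, with the median filter as the special case of a one-hot weight vector; Lloyd-type designs correspond to other weightings. The window size and weights below are illustrative:

```python
import numpy as np

def l_filter(x, weights):
    """Order-statistic (L-)filter: sort each sliding window and take a
    weighted sum of the sorted samples. A one-hot weight at the middle
    position reproduces the median filter."""
    k = len(weights)
    pad = k // 2
    xp = np.pad(x, pad, mode='edge')
    out = np.empty_like(x, dtype=float)
    for i in range(len(x)):
        window = np.sort(xp[i:i + k])   # the window's order statistics
        out[i] = window @ weights
    return out

sig = np.zeros(21)
sig[7] = 100.0                           # impulsive outlier
median_w = np.array([0, 0, 1, 0, 0.0])   # 5-tap median filter
print(l_filter(sig, median_w).max())     # 0.0 -> impulse fully rejected
```

    The paper's neural-network synthesis can be read as learning such a weight vector (initialized from optimal Lloyd's statistics) rather than fixing it by hand.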

  11. Monorail system for percutaneous repositioning of the Greenfield vena caval filter.

    PubMed

    Guthaner, D F; Wyatt, J O; Mehigan, J T; Wright, A M; Breen, J F; Wexler, L

    1990-09-01

    The authors describe a technique for removing or repositioning a malpositioned Greenfield inferior vena caval filter. A "monorail" system was used, in which a wire was passed from the femoral vein through the apical hole in the filter and out the internal jugular vein; the wire was held taut from above and below and thus facilitated repositioning or removal of the filter. The technique was used successfully in two cases.

  12. TH-CD-202-04: Evaluation of Virtual Non-Contrast Images From a Novel Split-Filter Dual-Energy CT Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, J; Szczykutowicz, T; Bayouth, J

    Purpose: To compare the ability of two dual-energy CT techniques, a novel split-filter single-source technique of superior temporal resolution against an established sequential-scan technique, to remove iodine contrast from images with minimal impact on CT number accuracy. Methods: A phantom containing 8 tissue substitute materials and vials of varying iodine concentrations (1.7–20.1 mg I /mL) was imaged using a Siemens Edge CT scanner. Dual-energy virtual non-contrast (VNC) images were generated using the novel split-filter technique, in which a 120kVp spectrum is filtered by tin and gold to create high- and low-energy spectra with < 1 second temporal separation between themore » acquisition of low- and high-energy data. Additionally, VNC images were generated with the sequential-scan technique (80 and 140kVp) for comparison. CT number accuracy was evaluated for all materials at 15, 25, and 35mGy CTDIvol. Results: The spectral separation was greater for the sequential-scan technique than the split-filter technique with dual-energy ratios of 2.18 and 1.26, respectively. Both techniques successfully removed iodine contrast, resulting in mean CT numbers within 60HU of 0HU (split-filter) and 40HU of 0HU (sequential-scan) for all iodine concentrations. Additionally, for iodine vials of varying diameter (2–20 mm) with the same concentration (9.9 mg I /mL), the system accurately detected iodine for all sizes investigated. Both dual-energy techniques resulted in reduced CT numbers for bone materials (by >400HU for the densest bone). Increasing the imaging dose did not improve the CT number accuracy for bone in VNC images. Conclusion: VNC images from the split-filter technique successfully removed iodine contrast. These results demonstrate a potential for improving dose calculation accuracy and reducing patient imaging dose, while achieving superior temporal resolution in comparison sequential scans. 
For both techniques, inaccuracies in CT numbers for bone materials necessitate consideration for radiation therapy treatment planning.« less

  13. Video-signal improvement using comb filtering techniques.

    NASA Technical Reports Server (NTRS)

    Arndt, G. D.; Stuber, F. M.; Panneton, R. J.

    1973-01-01

    Significant improvement in the signal-to-noise performance of television signals has been obtained through the application of comb filtering techniques. This improvement is achieved by removing the inherent redundancy in the television signal through linear prediction and by utilizing the unique noise-rejection characteristics of the receiver comb filter. Theoretical and experimental results describe the signal-to-noise ratio and picture-quality improvement obtained through the use of baseband comb filters and the implementation of a comb network as the loop filter in a phase-lock-loop demodulator. Attention is given to the fact that noise becomes correlated when processed by the receiver comb filter.
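
    A minimal sketch of the redundancy-exploiting idea, assuming a signal that repeats with a known period (as successive television lines largely do): an averaging comb filter passes the periodic component and attenuates uncorrelated noise. The period, tap count, and signals are illustrative, not from the paper:

```python
import numpy as np

def comb_average(x, period, taps):
    """Averaging comb filter: y[n] = mean of x[n + k*period], k=0..taps-1.
    Components periodic in `period` pass unchanged; uncorrelated noise
    power is reduced by roughly a factor of `taps`."""
    y = np.zeros(len(x) - (taps - 1) * period)
    for k in range(taps):
        start = (taps - 1 - k) * period
        y += x[start:start + len(y)]
    return y / taps

rng = np.random.default_rng(3)
period = 50
clean = np.tile(np.sin(2 * np.pi * np.arange(period) / period), 40)
noisy = clean + rng.standard_normal(len(clean))
filt = comb_average(noisy, period, taps=8)
ref = clean[(8 - 1) * period:]
snr_in = np.var(clean) / np.var(noisy - clean)
snr_out = np.var(ref) / np.var(filt - ref)
print(snr_out / snr_in > 4)  # True: roughly 8x noise power reduction
```

    Note the caveat raised in the abstract: after comb filtering, the residual noise is correlated between periods, which matters for any downstream processing.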

  14. A hybrid algorithm for speckle noise reduction of ultrasound images.

    PubMed

    Singh, Karamjeet; Ranade, Sukhjeet Kaur; Singh, Chandan

    2017-09-01

    Medical images are contaminated by multiplicative speckle noise, which significantly reduces the contrast of ultrasound images and creates a negative effect on various image interpretation tasks. In this paper, we propose a hybrid denoising approach which combines both local and nonlocal information in an efficient manner. The proposed hybrid algorithm consists of three stages: in the first stage, local statistics in the form of a guided filter are used to reduce the effect of speckle noise initially. Then, an improved speckle reducing bilateral filter (SRBF) is developed to further reduce the speckle noise from the medical images. Finally, to reconstruct the diffused edges we use an efficient post-processing technique which jointly considers the advantages of both the bilateral and nonlocal means (NLM) filters for the attenuation of speckle noise. The performance of the proposed hybrid algorithm is evaluated on synthetic, simulated and real ultrasound images. The experiments conducted on various test images demonstrate that our proposed hybrid approach outperforms various traditional speckle reduction approaches, including the recently proposed NLM and optimized Bayesian-based NLM. The results of various quantitative and qualitative measures, and visual inspection of denoised synthetic and real ultrasound images, demonstrate that the proposed hybrid algorithm has strong denoising capability and is able to preserve fine image details, such as the edge of a lesion, better than previously developed methods for speckle noise reduction. The denoising and edge-preserving capability of the hybrid algorithm is far better than that of existing traditional and recently proposed speckle reduction (SR) filters. The success of the proposed algorithm would help lay the foundation for hybrid algorithms for denoising of ultrasound images. Copyright © 2017 Elsevier B.V. All rights reserved.

  15. Focus-based filtering + clustering technique for power-law networks with small world phenomenon

    NASA Astrophysics Data System (ADS)

    Boutin, François; Thièvre, Jérôme; Hascoët, Mountaz

    2006-01-01

    Realistic interaction networks usually present two main properties: a power-law degree distribution and a small world behavior. Few nodes are linked to many nodes, and adjacent nodes are likely to share common neighbors. Moreover, the graph structure usually presents a dense core that is difficult to explore with classical filtering and clustering techniques. In this paper, we propose a new filtering technique that accounts for a user focus. This technique extracts a tree-like graph that also has a power-law degree distribution and small world behavior. The resulting structure is easily drawn with classical force-directed drawing algorithms. It is also quickly clustered and displayed as a multi-level silhouette tree (MuSi-Tree) from any user focus. We built a new graph filtering + clustering + drawing API and report a case study.

  16. Multiwavelength absorbance of filter deposits for determination of environmental tobacco smoke and black carbon

    NASA Astrophysics Data System (ADS)

    Lawless, Phil A.; Rodes, Charles E.; Ensor, David S.

    A multiwavelength optical absorption technique has been developed for Teflon filters used for personal exposure sampling with sufficient sensitivity to allow apportionments of environmental tobacco smoke and soot (black) carbon to be made. Measurements on blank filters show that the filter material itself contributes relatively little to the total absorbance and filters from the same lot have similar characteristics; this makes retrospective analysis of filters quite feasible. Using an integrating sphere radiometer and multiple wavelengths to provide specificity, the determination of tobacco smoke and carbon with reasonable accuracy is possible on filters not characterized before exposure. This technique provides a low cost, non-destructive exposure assessment alternative to both standard thermo-gravimetric elemental carbon evaluations on quartz filters and cotinine analyses from urine or saliva samples. The method allows the same sample filter to be used for assessment of mass, carbon, and tobacco smoke without affecting the deposit.
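
    At its core, the apportionment reduces to solving a small linear system: the absorbance at each wavelength is modeled as a weighted sum of the two species' deposits, with species distinguished by their different spectral signatures. The coefficients, wavelengths, and measured values below are illustrative placeholders, not the paper's calibration data:

```python
import numpy as np

# Illustrative specific-absorbance coefficients (per unit deposit mass):
# tobacco smoke absorbs more strongly at short wavelengths, while black
# carbon absorbs nearly uniformly across the visible/near-IR.
K = np.array([[0.9, 1.0],    # absorbance per unit mass at ~370 nm: [ETS, BC]
              [0.2, 1.0]])   # absorbance per unit mass at ~880 nm
measured = np.array([1.1, 0.4])        # absorbances of one exposed filter
masses = np.linalg.solve(K, measured)  # apportioned [ETS, BC] deposits
print(masses)  # [1.  0.2]
```

    With more than two wavelengths the same model becomes an overdetermined system, solved by least squares, which is what gives the multiwavelength method its specificity.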

  17. Intensity transform and Wiener filter in measurement of blood flow in arteriography

    NASA Astrophysics Data System (ADS)

    Nunes, Polyana F.; Franco, Marcelo L. N.; Filho, João. B. D.; Patrocínio, Ana C.

    2015-03-01

    Using the arteriography examination, it is possible to check for anomalies in blood vessels and diseases such as stroke, stenosis, and bleeding, and especially to support the diagnosis of encephalic death in comatose individuals. Encephalic death can be diagnosed only when there is complete interruption of all brain functions, and hence of the blood stream. During the examination, there may be some interference in the sensors, such as environmental factors, poor maintenance of equipment, and patient movement, which can directly affect the noise produced in angiography images. Digital image processing techniques are therefore needed to minimize this noise and improve the pixel count. This paper proposes using a median filter and enhancement techniques based on intensity transformation with the sigmoid function, together with the Wiener filter, to obtain less noisy images. Two filtering techniques were applied to remove noise from the images: one with the median filter and the other with the Wiener filter along with the sigmoid function. For the 14 tests quantified, including 7 encephalic death and 7 other cases, the technique that achieved the most satisfactory quantified pixel counts, while also presenting less noise, was the Wiener filter with the sigmoid function, in this case used with a 0.03 cutoff.
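
    As a sketch of the second pipeline (sigmoid intensity transform followed by Wiener filtering), the following uses a simple local (Lee-style) Wiener filter; the window size, sigmoid gain, and noise levels are illustrative, not the paper's settings:

```python
import numpy as np

def sigmoid_stretch(img, center=0.5, gain=10.0):
    """Sigmoid intensity transform: stretches contrast around `center`."""
    return 1.0 / (1.0 + np.exp(-gain * (img - center)))

def local_wiener(img, k=3, noise_var=None):
    """Pixelwise (Lee-style) Wiener filter with a k-by-k window: shrink
    each pixel toward the local mean by the estimated signal-to-
    (signal+noise) variance ratio."""
    pad = k // 2
    p = np.pad(img, pad, mode='reflect')
    win = np.lib.stride_tricks.sliding_window_view(p, (k, k))
    mu = win.mean(axis=(-1, -2))
    var = win.var(axis=(-1, -2))
    if noise_var is None:
        noise_var = var.mean()            # crude global noise estimate
    gain = np.maximum(var - noise_var, 0) / np.maximum(var, 1e-12)
    return mu + gain * (img - mu)

rng = np.random.default_rng(4)
clean = np.zeros((32, 32))
clean[8:24, 8:24] = 1.0                   # bright "vessel" region
noisy = clean + 0.2 * rng.standard_normal(clean.shape)
out = local_wiener(sigmoid_stretch(noisy), k=5)   # the combined pipeline
mse_noisy = np.mean((noisy - clean) ** 2)
mse_filt = np.mean((local_wiener(noisy, k=5, noise_var=0.04) - clean) ** 2)
print(mse_filt < mse_noisy)  # True: noise reduced, edges partly preserved
```

    The Wiener step adapts to local statistics: flat regions are averaged heavily while high-variance regions (edges) are left closer to the observed values.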

  18. IVC filter retrieval in adolescents: experience in a tertiary pediatric center.

    PubMed

    Guzman, Anthony K; Zahra, Mahmoud; Trerotola, Scott O; Raffini, Leslie J; Itkin, Maxim; Keller, Marc S; Cahill, Anne Marie

    2016-04-01

    Inferior vena cava (IVC) filters are commonly implanted with the intent to prevent life-threatening pulmonary embolism in at-risk patients with contraindications to anticoagulation. Various studies have reported increases in the rate of venous thromboembolism within the pediatric population. The utility and safety of IVC filters in children has not yet been fully defined. To describe the technique and adjunctive maneuvers of IVC filter removal in children, demonstrate its technical success and identify complications. A retrospective 10-year review was performed of 20 children (13 male, 7 female), mean age: 15.1 years (range: 12-19 years), who underwent IVC filter retrieval. Eleven of 20 (55%) were placed in our institution. Electronic medical records were reviewed for filter characteristics, retrieval technique, technical success and complications. The technical success rate was 100%. Placement indications included: deep venous thrombosis with a contraindication to anticoagulation (10/20, 50%), free-floating thrombus (4/20, 20%), post-trauma pulmonary embolism prophylaxis (3/20, 15%) and pre-thrombolysis pulmonary patient (1/20, 5%). The mean implantation period was 63 days (range: 20-270 days). Standard retrieval was performed in 17/20 patients (85%). Adjunctive techniques were performed in 3/20 patients (15%) and included the double-snare technique, balloon assistance and endobronchial forceps retrieval. Median procedure time was 60 min (range: 45-240 min). Pre-retrieval cavogram demonstrated filter tilt in 5/20 patients (25%) with a mean angle of 17° (range: 8-40). Pre-retrieval CT demonstrated strut wall penetration and tip embedment in one patient each. There were two procedure-related complications: IVC mural dissection noted on venography in one patient and snare catheter fracture requiring retrieval in one patient. There were no early or late complications. 
In children, IVC filter retrieval can be performed safely but may be challenging, especially in cases of filter tilt or embedding. Adjunctive techniques may increase filter retrieval rates.

  19. Space Vehicle Pose Estimation via Optical Correlation and Nonlinear Estimation

    NASA Technical Reports Server (NTRS)

    Rakoczy, John M.; Herren, Kenneth A.

    2008-01-01

    A technique for 6-degree-of-freedom (6DOF) pose estimation of space vehicles is being developed. This technique draws upon recent developments in implementing optical correlation measurements in a nonlinear estimator, which relates the optical correlation measurements to the pose states (orientation and position). For the optical correlator, the use of both conjugate filters and binary, phase-only filters in the design of synthetic discriminant function (SDF) filters is explored. A static neural network is trained a priori and used as the nonlinear estimator. New commercial animation and image rendering software is exploited to design the SDF filters and to generate a large filter set with which to train the neural network. The technique is applied to pose estimation for rendezvous and docking of free-flying spacecraft and to terrestrial surface mobility systems for NASA's Vision for Space Exploration. Quantitative pose estimation performance will be reported. Advantages and disadvantages of the implementation of this technique are discussed.

  20. Space Vehicle Pose Estimation via Optical Correlation and Nonlinear Estimation

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Herren, Kenneth

    2007-01-01

    A technique for 6-degree-of-freedom (6DOF) pose estimation of space vehicles is being developed. This technique draws upon recent developments in implementing optical correlation measurements in a nonlinear estimator, which relates the optical correlation measurements to the pose states (orientation and position). For the optical correlator, the use of both conjugate filters and binary, phase-only filters in the design of synthetic discriminant function (SDF) filters is explored. A static neural network is trained a priori and used as the nonlinear estimator. New commercial animation and image rendering software is exploited to design the SDF filters and to generate a large filter set with which to train the neural network. The technique is applied to pose estimation for rendezvous and docking of free-flying spacecraft and to terrestrial surface mobility systems for NASA's Vision for Space Exploration. Quantitative pose estimation performance will be reported. Advantages and disadvantages of the implementation of this technique are discussed.

  1. A class of systolizable IIR digital filters and its design for proper scaling and minimum output roundoff noise

    NASA Technical Reports Server (NTRS)

    Lei, Shaw-Min; Yao, Kung

    1990-01-01

    A class of infinite impulse response (IIR) digital filters with a systolizable structure is proposed and its synthesis is investigated. The systolizable structure consists of pipelineable regular modules with local connections and is suitable for VLSI implementation. It is capable of achieving high performance as well as high throughput. This class of filter structure provides certain degrees of freedom that can be used to obtain some desirable properties for the filter. Techniques for evaluating the internal signal powers and the output roundoff noise of the proposed filter structure are developed. Based upon these techniques, a well-scaled IIR digital filter with minimum output roundoff noise is designed using a local optimization approach. The internal signals at all the nodes of this filter are scaled to unity in the l2-norm sense. Compared to the Rao-Kailath (1984) orthogonal digital filter and the Gray-Markel (1973) normalized-lattice digital filter, this filter has better scaling properties and lower output roundoff noise.

  2. Monitoring by Control Technique - Fabric Filters

    EPA Pesticide Factsheets

    Stationary source emissions monitoring is required to demonstrate that a source is meeting the requirements in Federal or state rules. This page is about fabric filter control techniques used to reduce pollutant emissions.

  3. Evaluation of Improved Pushback Forecasts Derived from Airline Ground Operations Data

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Theis, Georg; Feron, Eric; Clarke, John-Paul

    2003-01-01

    Accurate and timely predictions of airline pushbacks can potentially lead to improved performance of automated decision-support tools for airport surface traffic, thus reducing the variability and average duration of costly airline delays. One factor which affects the realization of these benefits is the level of uncertainty inherent in the turn processes. To characterize this inherent uncertainty, three techniques are developed for predicting time-to-go until pushback as a function of available ground-time; elapsed ground-time; and the status (not-started/in-progress/completed) of individual turn processes (cleaning, fueling, etc.). These techniques are tested against a large and detailed dataset covering approximately 10(exp 4) real-world turn operations obtained through collaboration with Deutsche Lufthansa AG. Even after the dataset is filtered to obtain a sample of turn operations with minimal uncertainty, the standard deviation of forecast error for all three techniques is lower-bounded away from zero, indicating that turn operations have a significant stochastic component. This lower-bound result shows that decision-support tools must be designed to incorporate robust mechanisms for coping with pushback demand stochasticity, rather than treating the pushback demand process as a known deterministic input.

  4. Image processing and recognition for biological images.

    PubMed

    Uchida, Seiichi

    2013-05-01

    This paper reviews image processing and pattern recognition techniques, which will be useful for analyzing bioimages. Although this paper does not provide their technical details, it should make it possible to grasp their main tasks and the typical tools used to handle them. Image processing is a large research area concerned with improving the visibility of an input image and acquiring valuable information from it. As the main tasks of image processing, this paper introduces gray-level transformation, binarization, image filtering, image segmentation, visual object tracking, optical flow and image registration. Image pattern recognition is the technique of classifying an input image into one of a set of predefined classes and is also a large research area. This paper overviews its two main modules, the feature extraction module and the classification module. Throughout the paper, it is emphasized that bioimages are a very difficult target even for state-of-the-art image processing and pattern recognition techniques, due to noise, deformations, etc. This paper is intended as a tutorial guide to bridge biology and image processing researchers for further collaboration on such a difficult target. © 2013 The Author Development, Growth & Differentiation © 2013 Japanese Society of Developmental Biologists.

  5. A Collaborative Location Based Travel Recommendation System through Enhanced Rating Prediction for the Group of Users.

    PubMed

    Ravi, Logesh; Vairavasundaram, Subramaniyaswamy

    2016-01-01

    The rapid growth of the web and its applications has made recommender systems enormously important. Applied in various domains, recommender systems are designed to generate suggestions, such as items or services, based on user interests. However, recommender systems experience many issues that reduce their effectiveness. Integrating powerful data management techniques into recommender systems can address such issues, and the quality of recommendations can be increased significantly. Recent research on recommender systems reveals the idea of utilizing social network data to enhance traditional recommender systems with better prediction and improved accuracy. This paper surveys social network data based recommender systems, considering the usage of various recommendation algorithms, system functionalities, different types of interfaces, filtering techniques, and artificial intelligence techniques. After examining the objectives, methodologies, and data sources of the existing models, the paper will help anyone interested in the development of travel recommendation systems and facilitate future research directions. We also propose a location recommendation system based on the social pertinent trust walker (SPTW) and compare the results with existing baseline random walk models. We then enhance the SPTW model for recommendations to groups of users. The results obtained from the experiments are presented.

  6. Statistical, Graphical, and Learning Methods for Sensing, Surveillance, and Navigation Systems

    DTIC Science & Technology

    2016-06-28

    harsh propagation environments. Conventional filtering techniques fail to provide satisfactory performance in many important nonlinear or non-Gaussian scenarios. In addition, there is a lack of a unified methodology for the design and analysis of different filtering techniques. To address these problems, we have proposed a new filtering methodology called belief condensation (BC). DISTRIBUTION A: Distribution approved for public release

  7. Fundamentals of digital filtering with applications in geophysical prospecting for oil

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesko, A.

    This book is a comprehensive work bringing together the important mathematical foundations and computing techniques for numerical filtering methods. The first two parts of the book introduce the techniques, fundamental theory and applications, while the third part treats specific applications in geophysical prospecting. Discussion is limited to linear filters, but takes in related fields such as correlational and spectral analysis.

  8. Modern Display Technologies for Airborne Applications.

    DTIC Science & Technology

    1983-04-01

    the case of LED head-down direct view displays, this requires that special attention be paid to the optical filtering, the electrical drive/address… effectively attenuates the LED specular reflectance component, the colour and neutral density filtering attenuate the diffuse component and the… filter techniques are planned for use with video, multi-colour and advanced versions of numeric, alphanumeric and graphic displays; this technique

  9. Monitoring by Control Technique - Electrified Filter Bed

    EPA Pesticide Factsheets

    Stationary source emissions monitoring is required to demonstrate that a source is meeting the requirements in Federal or state rules. This page is about electrified filter bed control techniques used to reduce pollutant emissions.

  10. Real-time optical signal processors employing optical feedback: amplitude and phase control.

    PubMed

    Gallagher, N C

    1976-04-01

    The development of real-time coherent optical signal processors has increased the appeal of optical computing techniques in signal processing applications. A major limitation of these real-time systems is the fact that the optical processing material is generally of a phase-only type. The result is that the spatial filters synthesized with these systems must be either phase-only filters or amplitude-only filters. The main concern of this paper is the application of optical feedback techniques to obtain simultaneous and independent amplitude and phase control of the light passing through the system. It is shown that optical feedback techniques may be employed with phase-only spatial filters to obtain this amplitude and phase control. The feedback system with phase-only filters is compared with other feedback systems that employ combinations of phase-only and amplitude-only filters; it is found that the phase-only system is substantially more flexible than the other two systems investigated.

  11. Development and application of new positively charged filters for recovery of bacteriophages from water.

    PubMed Central

    Borrego, J J; Cornax, R; Preston, D R; Farrah, S R; McElhaney, B; Bitton, G

    1991-01-01

    Electronegative and electropositive filters were compared for the recovery of indigenous bacteriophages from water samples, using the VIRADEL technique. Fiber glass and diatomaceous earth filters displayed low adsorption and recovery, but an important increase in the adsorption percentage was observed when the filters were treated with cationic polymers (about 99% adsorption). A new methodology for virus elution was developed in this study, consisting of the slow passage of the eluent through the filter, thus increasing the contact time between the eluent and the virus adsorbed on the filters. The use of this technique allows a maximum recovery of 71.2%, compared with 46.7% phage recovery obtained by the standard elution procedure. High percentages (over 83%) of phage adsorption were obtained with different filters from 1-liter aliquots of the samples, except for Virosorb 1-MDS filters (between 1.6 and 32% phage adsorption). Phage recovery using the slow passage of the eluent depended on the filter type, with recovery ranging between 1.6% for Virosorb 1-MDS filters treated with polyethyleneimine and 103.2% for diatomaceous earth filters treated with 0.1% Nalco. PMID:2059044

  12. A tool for filtering information in complex systems

    PubMed Central

    Tumminello, M.; Aste, T.; Di Matteo, T.; Mantegna, R. N.

    2005-01-01

    We introduce a technique to filter out complex data sets by extracting a subgraph of representative links. Such a filtering can be tuned up to any desired level by controlling the genus of the resulting graph. We show that this technique is especially suitable for correlation-based graphs, giving filtered graphs that preserve the hierarchical organization of the minimum spanning tree but containing a larger amount of information in their internal structure. In particular in the case of planar filtered graphs (genus equal to 0), triangular loops and four-element cliques are formed. The application of this filtering procedure to 100 stocks in the U.S. equity markets shows that such loops and cliques have important and significant relationships with the market structure and properties. PMID:16027373

  13. A tool for filtering information in complex systems.

    PubMed

    Tumminello, M; Aste, T; Di Matteo, T; Mantegna, R N

    2005-07-26

    We introduce a technique to filter out complex data sets by extracting a subgraph of representative links. Such a filtering can be tuned up to any desired level by controlling the genus of the resulting graph. We show that this technique is especially suitable for correlation-based graphs, giving filtered graphs that preserve the hierarchical organization of the minimum spanning tree but containing a larger amount of information in their internal structure. In particular in the case of planar filtered graphs (genus equal to 0), triangular loops and four-element cliques are formed. The application of this filtering procedure to 100 stocks in the U.S. equity markets shows that such loops and cliques have important and significant relationships with the market structure and properties.
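
    The minimum spanning tree whose hierarchy the filtered graphs preserve can be computed directly from a correlation matrix; the sketch below uses Prim's algorithm and the standard correlation-to-distance map (the synthetic "return" data and all names are illustrative, and the paper's planar filtered graphs extend this tree with additional loops and cliques):

```python
import numpy as np

def mst_edges(dist):
    """Prim's algorithm on a full distance matrix: returns the n-1 edges
    of the minimum spanning tree, the hierarchical backbone that planar
    filtered graphs (genus 0) enrich with extra links."""
    n = len(dist)
    in_tree = [0]
    best = dist[0].copy()               # distance of each node to the tree
    parent = np.zeros(n, dtype=int)
    edges = []
    for _ in range(n - 1):
        best[in_tree] = np.inf          # exclude nodes already in the tree
        j = int(np.argmin(best))        # closest outside node
        edges.append((int(parent[j]), j))
        closer = dist[j] < best
        parent[closer] = j
        best = np.minimum(best, dist[j])
        in_tree.append(j)
    return edges

rng = np.random.default_rng(5)
rets = rng.standard_normal((200, 6))
rets[:, 1] = rets[:, 0] + 0.1 * rng.standard_normal(200)  # correlated pair
corr = np.corrcoef(rets, rowvar=False)
dist = np.sqrt(np.maximum(2 * (1 - corr), 0))   # correlation -> distance
tree = mst_edges(dist)
print((0, 1) in tree or (1, 0) in tree)  # True: correlated pair is linked
```

    Raising the allowed genus above 0 admits progressively more edges beyond these n-1, which is how the filtering level is tuned in the paper.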

  14. Frequency tracking and variable bandwidth for line noise filtering without a reference.

    PubMed

    Kelly, John W; Collinger, Jennifer L; Degenhart, Alan D; Siewiorek, Daniel P; Smailagic, Asim; Wang, Wei

    2011-01-01

    This paper presents a method for filtering line noise using an adaptive noise canceling (ANC) technique. The method effectively eliminates the sinusoidal contamination while achieving a narrower bandwidth than typical notch filters, and without relying on the availability of a noise reference signal as ANC methods normally do. A sinusoidal reference is instead digitally generated, and the filter efficiently tracks the power line frequency, which drifts around a known value. The filter's learning rate is also automatically adjusted to achieve faster and more accurate convergence and to control the filter's bandwidth. The discussion and data in this paper focus on electrocorticographic (ECoG) neural signals, but the presented technique is applicable to other recordings.
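
    The core of the approach can be sketched as follows: the "reference" is a digitally generated sine/cosine pair at the nominal line frequency, and a two-weight LMS filter adapts their amplitudes to cancel the interference. The paper's frequency tracking and variable learning rate are omitted here; the sampling rate, amplitudes, and step size are illustrative assumptions.

```python
import numpy as np

# Reference-free line-noise canceller sketch: LMS on a generated
# sine/cosine pair at the (assumed exactly known) 60 Hz line frequency.
fs = 1000.0
t = np.arange(10000) / fs
rng = np.random.default_rng(0)
signal = rng.standard_normal(t.size)                  # stand-in for ECoG
contaminated = signal + 2.0 * np.sin(2 * np.pi * 60.0 * t + 0.7)

ref = np.column_stack([np.sin(2 * np.pi * 60.0 * t),
                       np.cos(2 * np.pi * 60.0 * t)])
w = np.zeros(2)
mu = 0.01                                             # LMS step size
cleaned = np.empty_like(contaminated)
for k in range(t.size):
    y = ref[k] @ w                                    # current interference estimate
    e = contaminated[k] - y                           # error = cleaned sample
    w += 2 * mu * e * ref[k]                          # LMS weight update
    cleaned[k] = e

print(np.var(contaminated[2000:]), np.var(cleaned[2000:]))
```

    After convergence the residual variance approaches the variance of the underlying signal, while only a narrow band around 60 Hz is affected, which is the advantage over a wide notch filter.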

  15. Comparison of Factorization-Based Filtering for Landing Navigation

    NASA Technical Reports Server (NTRS)

    McCabe, James S.; Brown, Aaron J.; DeMars, Kyle J.; Carson, John M., III

    2017-01-01

    This paper develops and analyzes methods for fusing inertial navigation data with external data, such as data obtained from an altimeter and a star camera. The particular filtering techniques are based upon factorized forms of the Kalman filter, specifically the UDU and Cholesky factorizations. The factorized Kalman filters are utilized to ensure numerical stability of the navigation solution. Simulations are carried out to compare the performance of the different approaches along a lunar descent trajectory using inertial and external data sources. It is found that the factorized forms improve upon conventional filtering techniques in terms of ensuring numerical stability for the investigated landing navigation scenario.
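
    The record gives no equations; one standard Cholesky-factor (square-root) measurement update is Potter's algorithm, sketched below for a scalar measurement and checked against the conventional covariance update. Propagating the factor S (with P = S Sᵀ) instead of P itself keeps the covariance symmetric positive definite, which is the numerical-stability property factorized filters provide. The matrices here are illustrative.

```python
import numpy as np

def potter_update(x, S, h, z, r):
    """Potter's square-root measurement update for a scalar measurement
    z = h @ x + v, var(v) = r, with covariance factored as P = S @ S.T."""
    phi = S.T @ h
    a = 1.0 / (phi @ phi + r)                  # inverse innovation variance
    K = a * (S @ phi)                          # Kalman gain (= P h / (h P h + r))
    b = a / (1.0 + np.sqrt(a * r))
    S_new = S - np.outer(b * (S @ phi), phi)   # S (I - b * phi phi^T)
    x_new = x + K * (z - h @ x)
    return x_new, S_new

# tiny consistency check against the conventional covariance update
P = np.array([[4.0, 1.0], [1.0, 2.0]])
S = np.linalg.cholesky(P)
h = np.array([1.0, 0.5]); r = 0.3
x = np.array([0.0, 1.0]); z = 1.2

x1, S1 = potter_update(x, S, h, z, r)
P_conv = P - np.outer(P @ h, P @ h) / (h @ P @ h + r)
print(np.allclose(S1 @ S1.T, P_conv))          # the factored update matches
```

    The UDU form mentioned in the record achieves the same goal with a unit-triangular/diagonal factorization that avoids the square roots.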

  16. SPONGY (SPam ONtoloGY): Email Classification Using Two-Level Dynamic Ontology

    PubMed Central

    2014-01-01

    Email is one of the most common communication methods between people on the Internet. However, the increase of email misuse/abuse has resulted in an increasing volume of spam emails over recent years. An experimental system was designed and implemented with the hypothesis that this method would outperform existing techniques, and the experimental results showed that the proposed ontology-based approach indeed improves spam filtering accuracy significantly. In this paper, two levels of ontology spam filters were implemented: a first-level global ontology filter and a second-level user-customized ontology filter. The global ontology filter filtered about 91% of spam, which is comparable with other methods. The user-customized ontology filter was created based on the specific user's background as well as the filtering mechanism used in creating the global ontology filter. The main contributions of the paper are (1) to introduce an ontology-based multilevel filtering technique that uses both a global ontology and an individual filter for each user to increase spam filtering accuracy, and (2) to create a spam filter in the form of an ontology, which is user-customized, scalable, and modularized, so that it can be embedded into many other systems for better performance. PMID:25254240

  17. Design of coupled mace filters for optical pattern recognition using practical spatial light modulators

    NASA Technical Reports Server (NTRS)

    Rajan, P. K.; Khan, Ajmal

    1993-01-01

    Spatial light modulators (SLMs) are being used in correlation-based optical pattern recognition systems to implement the Fourier domain filters. Currently available SLMs have certain limitations with respect to the realizability of these filters. Therefore, it is necessary to incorporate the SLM constraints in the design of the filters. The design of a SLM-constrained minimum average correlation energy (SLM-MACE) filter using the simulated annealing-based optimization technique was investigated. The SLM-MACE filter was synthesized for three different types of constraints. The performance of the filter was evaluated in terms of its recognition (discrimination) capabilities using computer simulations. The correlation plane characteristics of the SLM-MACE filter were found to be reasonably good. The SLM-MACE filter yielded far better results than the analytical MACE filter implemented on practical SLMs using the constrained magnitude technique. Further, the filter performance was evaluated in the presence of noise in the input test images. This work demonstrated the need to include the SLM constraints in the filter design. Finally, a method is suggested to reduce the computation time required for the synthesis of the SLM-MACE filter.
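
    The annealing idea can be sketched in miniature: each frequency-plane filter pixel is restricted to a binary phase {-1, +1} (a common SLM limitation), and simulated annealing searches for the filter maximizing a peak-to-energy merit of the correlation plane. This is an illustrative stand-in for the SLM-MACE objective, not the paper's exact formulation; the image, size, and schedule are assumptions.

```python
import numpy as np

# Toy simulated-annealing design of an SLM-constrained correlation filter.
rng = np.random.default_rng(6)
F = np.fft.fft2(rng.standard_normal((16, 16)))   # training-image spectrum
H = rng.choice([-1.0, 1.0], size=F.shape)        # binary-phase filter pixels

def merit(H):
    peak = np.abs(np.sum(F * H)) ** 2            # central correlation peak
    energy = np.sum(np.abs(F * H) ** 2)          # correlation-plane energy
    return peak / energy                         # higher = sharper correlation

init = cur = best = merit(H)
T = 1.0                                          # annealing temperature
for step in range(4000):
    i, j = rng.integers(0, 16, size=2)
    H[i, j] *= -1.0                              # propose one pixel flip
    new = merit(H)
    if new >= cur or rng.random() < np.exp((new - cur) / T):
        cur = new                                # accept (always if better)
        best = max(best, cur)
    else:
        H[i, j] *= -1.0                          # reject: undo the flip
    T *= 0.999                                   # cool down

print(init, best)                                # merit before vs after annealing
```

    The actual SLM-MACE design minimizes average correlation energy over a training set subject to fixed peak constraints; the accept/reject structure and the constrained search space are the parts this sketch shares with it.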

  18. SPONGY (SPam ONtoloGY): email classification using two-level dynamic ontology.

    PubMed

    Youn, Seongwook

    2014-01-01

    Email is one of the most common communication methods between people on the Internet. However, the increase of email misuse/abuse has resulted in an increasing volume of spam emails over recent years. An experimental system was designed and implemented with the hypothesis that this method would outperform existing techniques, and the experimental results showed that the proposed ontology-based approach indeed improves spam filtering accuracy significantly. In this paper, two levels of ontology spam filters were implemented: a first-level global ontology filter and a second-level user-customized ontology filter. The global ontology filter filtered about 91% of spam, which is comparable with other methods. The user-customized ontology filter was created based on the specific user's background as well as the filtering mechanism used in creating the global ontology filter. The main contributions of the paper are (1) to introduce an ontology-based multilevel filtering technique that uses both a global ontology and an individual filter for each user to increase spam filtering accuracy, and (2) to create a spam filter in the form of an ontology, which is user-customized, scalable, and modularized, so that it can be embedded into many other systems for better performance.

  19. Contamination control through filtration of microorganisms

    NASA Technical Reports Server (NTRS)

    Stabekis, P. D.; Lyle, R. G.

    1972-01-01

    A description is given of the various kinds of gas and liquid filters used in decontamination and sterilization procedures. Also discussed are filtration mechanisms, characteristics of filter materials, and the factors affecting filter performance. Summaries are included for filter testing and evaluation techniques and the possible application of the filters to spacecraft sterilization.

  20. Miniaturized dielectric waveguide filters

    NASA Astrophysics Data System (ADS)

    Sandhu, Muhammad Y.; Hunter, Ian C.

    2016-10-01

    Design techniques for a new class of integrated monolithic high-permittivity ceramic waveguide filters are presented. These filters enable a size reduction of 50% compared to air-filled transverse electromagnetic filters with the same unloaded Q-factor. Designs for Chebyshev and asymmetric generalised Chebyshev filters and for a diplexer are presented, with experimental results for an 1800 MHz Chebyshev filter and a 1700 MHz generalised Chebyshev filter showing excellent agreement with theory.

  1. Development of a low-power, low-cost front end electronics module for large scale distributed neutrino detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    James J. Beatty

    2008-03-08

    A number of concepts have been presented for distributed neutrino detectors formed of large numbers of autonomous detectors. Examples include the Antarctic Ross Ice Shelf Antenna Neutrino Array (ARIANNA) [Barwick 2006], as well as proposed radio extensions to the IceCube detector at South Pole Station such as AURA and IceRay [Besson 2008]. We have focused on key enabling technical developments required by this class of experiments. The radio Cherenkov signal, generated by the Askaryan mechanism [Askaryan 1962, 1965], is impulsive and coherent up to above 1 GHz. In the frequency domain, the impulsive character of the emission results in a simultaneous increase of the power detected in multiple frequency bands. This multiband triggering approach has proven fruitful, especially as anthropogenic interference often results from narrowband communications signals. A typical distributed experiment of this type consists of a station responsible for the readout of a cluster of antennas either near the surface of the ice or deployed in boreholes. Each antenna is instrumented with a broadband low-noise amplifier, followed by an array of filters to facilitate multi-band coincidence trigger schemes at the antenna level. The power in each band is detected at the output of each band filter, using either square-law diode detectors or log-power detectors developed for the cellular telephone market. The use of multiple antennas per station allows a local coincidence among antennas to be used as the next stage of the trigger. Station triggers can then be combined into an array trigger by comparing timestamps of triggers among stations and identifying space-time clusters of station triggers. Data from each station are buffered and can be requested from the individual stations when a multi-station coincidence occurs. This approach has been successfully used in distributed experiments such as the Pierre Auger Observatory. [Abraham et al. 
2004] We identified the filters as being especially critical. The frequency range of interest, approximately 200 MHz to 1.2 GHz, is a transitional region where the lumped-circuit-element approach taken at low frequencies begins to reach its limitations due to component tolerances, component losses, and parasitic effects. Active circuits can help to mitigate these effects, at the cost of added power consumption that becomes prohibitive for distributed experiments across the band of interest. At higher frequencies, microstrip, stripline, and other microwave techniques come to the fore. We have developed designs and design tools for passive filters extending the high-frequency techniques to the frequency range of interest. Microstrip and stripline techniques are not usually attractive here because of the large physical dimensions of the resulting circuits, but in this application the tradeoff of size against power consumption favors this choice. These techniques are also intrinsically low-cost, as the filter is built into the circuit boards and the cost of components and their assembly onto the board is avoided. The basic element of the filter tree is an impedance-matched wideband diplexer. This consists of a pair of lowpass and highpass filters with a shared cutoff frequency and complementary frequency responses. These are designed by realizing the lowpass filter as a high-order LC filter, which can be implemented as a series of transmission-line segments of varying width; this can be transformed into a CL highpass filter with a complementary frequency response. When the two filters are coupled to a common input, the input impedances of the networks add in parallel to give a constant input impedance as a function of frequency, with power flowing into one leg or the other of the filter pair. 
These filters can be cascaded to divide the band into the frequency ranges of interest; the broadband impedance matching at the inputs makes coupling of successive stages straightforward. These circuits can be produced in quantity at low cost using standard PCB fabrication techniques. We have determined that, to achieve the best performance, the circuits should be built on a low-loss-tangent RF substrate. We are working in cooperation with our colleagues in condensed matter, who also have a need for this capability, to purchase the equipment for in-house fabrication of prototype quantities of these circuits. We plan to continue the work on these filters using internal funds, and to produce and characterize the performance of prototypes. We also participated in deployment of a prototype detector station near McMurdo Station, Antarctica, in collaboration with colleagues at UCLA and UC-Irvine. The prototype station includes a single-board computer, GPS receiver, ADC board, and Iridium satellite modem powered by an omnidirectional solar array. We operated this station in the austral summer of 2006-2007, and used the Iridium SMS mode to transmit the status of the station until the end of the daylight season.
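
    The complementary lowpass/highpass pairing described above can be illustrated with Butterworth prototype responses, whose power responses sum to one at every frequency, so the input power always flows into one leg or the other. The order and crossover frequency below are illustrative, not values from the report.

```python
import numpy as np

# Power complementarity of a Butterworth lowpass/highpass diplexer pair
# sharing the cutoff wc: |H_LP(jw)|^2 + |H_HP(jw)|^2 = 1 for all w.
order = 5
wc = 2 * np.pi * 700e6                 # ~700 MHz crossover (assumed)
w = np.logspace(8, 10.5, 500)          # angular-frequency sweep [rad/s]

lp = 1.0 / (1.0 + (w / wc) ** (2 * order))                      # |H_LP|^2
hp = (w / wc) ** (2 * order) / (1.0 + (w / wc) ** (2 * order))  # |H_HP|^2
print(np.max(np.abs(lp + hp - 1.0)))   # deviation from perfect complementarity
```

    This is the frequency-domain property the stripline LC/CL realization in the record exploits to keep the common-port impedance constant.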

  2. Importance of Adjunct Delivery Techniques to Optimize Deployment Success of Distal Protection Filters During Vein Graft Intervention.

    PubMed

    Kaliyadan, Antony G; Chawla, Harnish; Fischman, David L; Ruggiero, Nicholas; Gannon, Michael; Walinsky, Paul; Savage, Michael P

    2017-02-01

    This study assessed the impact of adjunct delivery techniques on the deployment success of distal protection filters in saphenous vein grafts (SVGs). Despite their proven clinical benefit, distal protection devices are underutilized in SVG interventions. Deployment of distal protection filters can be technically challenging in the presence of complex anatomy. Techniques that facilitate the delivery success of these devices could potentially improve clinical outcomes and promote greater use of distal protection. Outcomes of 105 consecutive SVG interventions with attempted use of a FilterWire distal protection device (Boston Scientific) were reviewed. In patients in whom filter delivery initially failed, the success of attempted redeployment using adjunct delivery techniques was assessed. Two strategies were utilized sequentially: (1) a 0.014" moderate-stiffness hydrophilic guidewire was placed first to function as a parallel buddy wire to support subsequent FilterWire crossing; and (2) if the buddy-wire approach failed, predilation with a 2.0 mm balloon at low pressure was performed followed by reattempted filter delivery. The study population consisted of 80 men and 25 women aged 73 ± 10 years. Mean SVG age was 14 ± 6 years. Complex disease (American College of Cardiology/American Heart Association class B2 or C) was present in 92%. Initial delivery of the FilterWire was successful in 82/105 patients (78.1%). Of the 23 patients with initial failed delivery, 8 (35%) had successful deployment with a buddy wire alone, 7 (30%) had successful deployment with balloon predilation plus buddy wire, 4 (17%) had failed reattempt at deployment despite adjunct maneuvers, and in 4 (17%) no additional attempts at deployment were made at the operator's discretion. Deployment failure was reduced from 21.9% initially to 7.6% after use of adjunct delivery techniques (P<.01). No adverse events were observed with these measures. 
Deployment of distal protection devices can be technically difficult with complex SVG disease. Adjunct delivery techniques are important to optimize deployment success of distal protection filters during SVG intervention.

  3. Fast Collaborative Filtering from Implicit Feedback with Provable Guarantees

    DTIC Science & Technology

    2016-11-22

    (Fragmentary mathematical excerpt: sample-size lower bounds on n₂, n₃, and N in terms of K, ε, d̃ₛ, and σ_K(M₂), of the form n₃ = Ω(K²(10 d̃ₛ² σ_K(M₂)^(5/2) + 2√2 d̃ₛ³ σ_K(M₂)^(3/2))² / ε²) for some constants c₁ and c₂.) The readable portion notes that a drawback of the method of moments is that it will not work when there are only a few users available, such that N < Θ(K²); however, modern recommendation systems typically have many users. Since π_max ≤ 1, the excerpt concludes that N must satisfy the Ω(K²(·)²/ε²) bound above.

  4. [Pediatric surgery 2.0].

    PubMed

    Mesa-Gutiérrez, J C; Bardají, C; Brun, N; Núñez, B; Sánchez, B; Sanvicente, B; Obiols, P; Rigol, S

    2012-04-01

    New tools from the web represent a complete breakthrough in the management of information. The aim of this paper is to present different resources in a friendly way, with apps and examples for the different phases of knowledge management for the paediatric surgeon: search, filtering, reception, classification, sharing, collaborative work, and publication. We are witnessing a real revolution in how knowledge and information are managed. Its main characteristics are immediacy, a social component, growing interaction, and ease of use. Every physician has clinical questions, and the Internet gives us more and more resources to make searches easier. Along with them, we need electronic resources to filter quality information and to ease the transfer of knowledge to clinical practice. Cloud computing is under continuous development and makes it possible to share information among different users and computers. The main feature of Internet apps is their social component, which makes interaction, sharing, and collaborative work possible.

  5. Development of a filter regeneration system for advanced spacecraft fluid systems

    NASA Technical Reports Server (NTRS)

    Behrend, A. F., Jr.; Descamp, V. A.

    1974-01-01

    The development of a filter regeneration system for efficiently cleaning fluid particulate filters is presented. Based on a backflush/jet impingement technique, the regeneration system demonstrated a cleaning efficiency of 98.7 to 100%. The operating principles and design features are discussed with emphasis on the primary system components that include a regenerable filter, vortex particle separator, and zero-g particle trap. Techniques and equipment used for ground and zero-g performance tests are described. Test results and conclusions, as well as possible areas for commercial application, are included.

  6. A Comparative Study of Different Deblurring Methods Using Filters

    NASA Astrophysics Data System (ADS)

    Srimani, P. K.; Kavitha, S.

    2011-12-01

    This paper undertakes the study of restoring Gaussian-blurred images using four deblurring techniques, namely the Wiener filter, the regularized filter, the Lucy-Richardson deconvolution algorithm, and the blind deconvolution algorithm, given information about the point spread function (PSF) of the corrupted blurred image. These are applied to a scanned image of a seven-month fetus in the womb and compared with one another so as to choose the best technique for restoring or deblurring the image. The paper also studies restoration of the blurred image with the regularized filter (RF) when no information about the PSF is available, by applying the same four techniques after estimating a guess of the PSF. The number of iterations and the weight threshold used to choose the best guesses for restoring or deblurring the image with these techniques are determined.
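
    Of the four methods, Wiener deconvolution is the most compact to sketch: in the frequency domain, X̂ = conj(H) / (|H|² + K) · Y, where H is the PSF transfer function and K stands in for the noise-to-signal power ratio. The 1-D toy signal, PSF width, and K below are illustrative assumptions.

```python
import numpy as np

# 1-D Wiener deconvolution sketch with a known Gaussian PSF.
rng = np.random.default_rng(3)
n = 256
x = np.zeros(n); x[60:120] = 1.0; x[150:160] = 2.0          # toy "image" row
psf = np.exp(-0.5 * ((np.arange(n) - n // 2) / 3.0) ** 2)   # Gaussian PSF
psf /= psf.sum()
H = np.fft.fft(np.fft.ifftshift(psf))                       # zero-phase blur
y = np.real(np.fft.ifft(np.fft.fft(x) * H))                 # blurred signal
y += 0.002 * rng.standard_normal(n)                         # measurement noise

K = 1e-3                                                    # regularization term
x_hat = np.real(np.fft.ifft(np.conj(H) / (np.abs(H) ** 2 + K) * np.fft.fft(y)))

print(np.mean((y - x) ** 2), np.mean((x_hat - x) ** 2))     # blurred vs restored MSE
```

    K trades noise amplification against residual blur, which is the same role the weight threshold plays in the iterative methods the paper compares.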

  7. Estimation of single plane unbalance parameters of a rotor-bearing system using Kalman filtering based force estimation technique

    NASA Astrophysics Data System (ADS)

    Shrivastava, Akash; Mohanty, A. R.

    2018-03-01

    This paper proposes a model-based method to estimate single-plane unbalance parameters (amplitude and phase angle) in a rotor using a Kalman filter and a recursive least squares based input force estimation technique. The Kalman filter based input force estimation technique requires a state-space model and response measurements. A modified system equivalent reduction expansion process (SEREP) technique is employed to obtain a reduced-order model of the rotor system so that limited response measurements can be used. The method is demonstrated using numerical simulations on a rotor-disk-bearing system. Results are presented for different measurement sets including displacement, velocity, and rotational response. Effects of measurement noise level, filter parameters (process noise covariance and forgetting factor), and modeling error are also presented, and it is observed that the unbalance parameter estimation is robust with respect to measurement noise.

  8. Cross-Dependency Inference in Multi-Layered Networks: A Collaborative Filtering Perspective.

    PubMed

    Chen, Chen; Tong, Hanghang; Xie, Lei; Ying, Lei; He, Qing

    2017-08-01

    The increasingly connected world has catalyzed the fusion of networks from different domains, which facilitates the emergence of a new network model: multi-layered networks. Examples of such network systems include critical infrastructure networks, biological systems, organization-level collaborations, cross-platform e-commerce, and so forth. One crucial structure that distinguishes multi-layered networks from other network models is their cross-layer dependency, which describes the associations between nodes from different layers. Needless to say, the cross-layer dependency in the network plays an essential role in many data mining applications like system robustness analysis and complex network control. However, it remains a daunting task to know the exact dependency relationships due to noise, limited accessibility, and so forth. In this article, we tackle the cross-layer dependency inference problem by modeling it as a collective collaborative filtering problem. Based on this idea, we propose an effective algorithm, Fascinate, that can reveal unobserved dependencies with linear complexity. Moreover, we derive Fascinate-ZERO, an online variant of Fascinate that can respond to a newly added node in a timely manner by checking its neighborhood dependencies. We perform extensive evaluations on real datasets to substantiate the superiority of our proposed approaches.
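
    The collaborative-filtering view can be sketched generically: treat the observed cross-layer dependency matrix like a rating matrix, fit a low-rank model D ≈ F Gᵀ by gradient descent on the observed entries only, and read off the unobserved ones. This is a hypothetical toy, not the Fascinate algorithm itself (which adds within-layer network regularization and a linear-complexity solver); all sizes and hyperparameters are assumptions.

```python
import numpy as np

# Low-rank completion of a partially observed dependency matrix.
rng = np.random.default_rng(4)
n1, n2, rank = 20, 15, 3
F_true = rng.random((n1, rank)); G_true = rng.random((n2, rank))
D = F_true @ G_true.T                        # ground-truth dependencies
mask = rng.random(D.shape) < 0.7             # 70% of entries observed

F = 0.1 * rng.random((n1, rank)); G = 0.1 * rng.random((n2, rank))
lr, lam = 0.05, 1e-3                         # step size, L2 regularization
for _ in range(3000):
    E = mask * (D - F @ G.T)                 # error on observed entries only
    F += lr * (E @ G - lam * F)              # gradient step on F
    G += lr * (E.T @ F - lam * G)            # gradient step on G

rmse_hidden = np.sqrt(np.mean((D - F @ G.T)[~mask] ** 2))
print(rmse_hidden)                           # error on the unobserved entries
```

    Because the true matrix is low-rank and enough entries are observed, the hidden dependencies are recovered accurately, which is the premise of casting dependency inference as collaborative filtering.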

  9. A collaborative filtering approach for protein-protein docking scoring functions.

    PubMed

    Bourquard, Thomas; Bernauer, Julie; Azé, Jérôme; Poupon, Anne

    2011-04-22

    A protein-protein docking procedure traditionally consists of two successive tasks: a search algorithm generates a large number of candidate conformations mimicking the complex existing in vivo between two proteins, and a scoring function is used to rank them in order to extract a native-like one. We have already shown that, using Voronoi constructions and a well-chosen set of parameters, an accurate scoring function could be designed and optimized. However, to be able to perform large-scale in silico exploration of the interactome, a near-native solution has to be found among the ten best-ranked solutions. This cannot yet be guaranteed by any of the existing scoring functions. In this work, we introduce a new procedure for conformation ranking. We previously developed a set of scoring functions where learning was performed using a genetic algorithm. These functions were used to assign a rank to each possible conformation. We now refine this ranking using different classifiers (decision trees, rules, and support vector machines) in a collaborative filtering scheme. The newly obtained scoring function is evaluated using 10-fold cross-validation and compared to the functions obtained using either genetic algorithms or collaborative filtering taken separately. This new approach was successfully applied to the CAPRI scoring ensembles. We show that for 10 targets out of 12 we are able to find a near-native conformation among the 10 best-ranked solutions. Moreover, for 6 of them, the near-native conformation selected is of high accuracy. Finally, we show that this function dramatically enriches the 100 best-ranking conformations in near-native structures.

  10. A Collaborative Filtering Approach for Protein-Protein Docking Scoring Functions

    PubMed Central

    Bourquard, Thomas; Bernauer, Julie; Azé, Jérôme; Poupon, Anne

    2011-01-01

    A protein-protein docking procedure traditionally consists of two successive tasks: a search algorithm generates a large number of candidate conformations mimicking the complex existing in vivo between two proteins, and a scoring function is used to rank them in order to extract a native-like one. We have already shown that, using Voronoi constructions and a well-chosen set of parameters, an accurate scoring function could be designed and optimized. However, to be able to perform large-scale in silico exploration of the interactome, a near-native solution has to be found among the ten best-ranked solutions. This cannot yet be guaranteed by any of the existing scoring functions. In this work, we introduce a new procedure for conformation ranking. We previously developed a set of scoring functions where learning was performed using a genetic algorithm. These functions were used to assign a rank to each possible conformation. We now refine this ranking using different classifiers (decision trees, rules, and support vector machines) in a collaborative filtering scheme. The newly obtained scoring function is evaluated using 10-fold cross-validation and compared to the functions obtained using either genetic algorithms or collaborative filtering taken separately. This new approach was successfully applied to the CAPRI scoring ensembles. We show that for 10 targets out of 12 we are able to find a near-native conformation among the 10 best-ranked solutions. Moreover, for 6 of them, the near-native conformation selected is of high accuracy. Finally, we show that this function dramatically enriches the 100 best-ranking conformations in near-native structures. PMID:21526112

  11. Optical filters for UV to near IR space applications

    NASA Astrophysics Data System (ADS)

    Begou, T.; Krol, H.; Hecquet, Christophe; Bondet, C.; Lumeau, J.; Grèzes-Besset, C.; Lequime, M.

    2017-11-01

    We present hereafter the results on the fabrication of complex optical filters at the Institut Fresnel in close collaboration with CILAS. Bandpass optical filters dedicated to astronomy and space applications, with central wavelengths ranging from the ultraviolet to the near infrared, were deposited on both sides of glass substrates, with performance in very good agreement with the theoretical designs. For these applications, the required functions are particularly complex, as they must present a very narrow bandwidth as well as a high level of rejection over a broad spectral range. In addition to these demanding optical performances, insensitivity to environmental conditions is necessary. For this purpose, robust solutions with particularly stable performance have to be proposed.

  12. Lumped element filters for electronic warfare systems

    NASA Astrophysics Data System (ADS)

    Morgan, D.; Ragland, R.

    1986-02-01

    Among the increasing demands that future generations of electronic warfare (EW) systems must satisfy is a reduction in equipment size. The present paper is concerned with lumped element filters, which can make a significant contribution to the downsizing of advanced EW systems. Lumped element filter design makes it possible to obtain very small package sizes by utilizing classical low-frequency inductive and capacitive components which are small compared to a wavelength. Cost-effective, temperature-stable devices can be obtained on the basis of new design techniques. Attention is given to aspects of design flexibility, an interdigital filter equivalent circuit diagram, conditions under which the use of lumped element filters can be recommended, construction techniques, a design example, and questions regarding the application of lumped element filters to EW processing systems.

  13. Collaboration spotting for dental science.

    PubMed

    Leonardi, E; Agocs, A; Fragkiskos, S; Kasfikis, N; Le Goff, J M; Cristalli, M P; Luzzi, V; Polimeni, A

    2014-10-06

    The goal of the Collaboration Spotting project is to create an automatic system to collect information about publications and patents related to a given technology, to identify the key players involved, and to highlight collaborations and related technologies. The collected information can be visualized in a web browser as interactive graphical maps showing in an intuitive way the players and their collaborations (Sociogram) and the relations among the technologies (Technogram). We propose to use the system to study technologies related to Dental Science. In order to create a Sociogram, we create a logical filter based on a set of keywords related to the technology under study. This filter is used to extract a list of publications from the Web of Science™ database. The list is validated by an expert in the technology and sent to CERN, where it is inserted in the Collaboration Spotting database. Here, an automatic software system uses the data to generate the final maps. We studied a set of recent technologies related to bone regeneration procedures of oro-maxillo-facial critical size defects, namely the use of Porous HydroxyApatite (HA) as a bone substitute alone (bone graft) or as a tridimensional support (scaffold) for insemination and differentiation ex vivo of Mesenchymal Stem Cells. We produced the Sociograms for these technologies and the resulting maps are now accessible on-line. The Collaboration Spotting system allows the automatic creation of interactive maps to show the current and historical state of research on a specific technology. These maps are an ideal tool both for researchers who want to assess the state-of-the-art in a given technology, and for research organizations who want to evaluate their contribution to the technological development in a given field. We demonstrated that the system can be used for Dental Science and produced the maps for an initial set of technologies in this field. 
We now plan to enlarge the set of mapped technologies in order to make the Collaboration Spotting system a useful reference tool for Dental Science research.

  14. Collaboration Spotting for oral medicine.

    PubMed

    Leonardi, E; Agocs, A; Fragkiskos, S; Kasfikis, N; Le Goff, J M; Cristalli, M P; Luzzi, V; Polimeni, A

    2014-09-01

    The goal of the Collaboration Spotting project is to create an automatic system to collect information about publications and patents related to a given technology, to identify the key players involved, and to highlight collaborations and related technologies. The collected information can be visualized in a web browser as interactive graphical maps showing in an intuitive way the players and their collaborations (Sociogram) and the relations among the technologies (Technogram). We propose to use the system to study technologies related to oral medicine. In order to create a sociogram, we create a logical filter based on a set of keywords related to the technology under study. This filter is used to extract a list of publications from the Web of Science™ database. The list is validated by an expert in the technology and sent to CERN, where it is inserted in the Collaboration Spotting database. Here, an automatic software system uses the data to generate the final maps. We studied a set of recent technologies related to bone regeneration procedures of oro-maxillo-facial critical size defects, namely the use of porous hydroxyapatite (HA) as a bone substitute alone (bone graft) or as a tridimensional support (scaffold) for insemination and differentiation ex vivo of mesenchymal stem cells. We produced the sociograms for these technologies and the resulting maps are now accessible on-line. The Collaboration Spotting system allows the automatic creation of interactive maps to show the current and historical state of research on a specific technology. These maps are an ideal tool both for researchers who want to assess the state-of-the-art in a given technology, and for research organizations who want to evaluate their contribution to the technological development in a given field. We demonstrated that the system can be used in oral medicine and produced the maps for an initial set of technologies in this field. 
We now plan to enlarge the set of mapped technologies in order to make the Collaboration Spotting system a useful reference tool for oral medicine research.

  15. Discrete filtering techniques applied to sequential GPS range measurements

    NASA Technical Reports Server (NTRS)

    Vangraas, Frank

    1987-01-01

    The basic navigation solution is described for position and velocity based on range and delta range (Doppler) measurements from NAVSTAR Global Positioning System satellites. The application of discrete filtering techniques is examined to reduce the white noise distortions on the sequential range measurements. A second-order (position and velocity states) Kalman filter is implemented to obtain smoothed estimates of range by filtering the dynamics of the signal from each satellite separately. Test results using a simulated GPS receiver show a steady-state noise reduction (input noise variance divided by output noise variance) of a factor of four. Recommendations for further noise reduction based on higher order Kalman filters or additional delta range measurements are included.
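
The per-satellite filtering scheme described above can be sketched as a two-state (range, range-rate) Kalman filter; the dynamics model, noise levels, and synthetic range trend below are illustrative assumptions, not the paper's actual parameters.

```python
import numpy as np

def smooth_ranges(z, dt=1.0, meas_var=25.0, accel_var=1e-3):
    """Second-order (range, range-rate) Kalman filter applied to one
    satellite's noisy range sequence z; returns smoothed ranges."""
    F = np.array([[1.0, dt], [0.0, 1.0]])           # constant-velocity dynamics
    H = np.array([[1.0, 0.0]])                      # only range is measured
    Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                              [dt**3 / 2, dt**2]])  # process noise
    R = np.array([[meas_var]])
    x = np.array([z[0], 0.0])
    P = np.diag([meas_var, 1.0])
    out = []
    for zk in z:
        x = F @ x                                   # predict
        P = F @ P @ F.T + Q
        S = H @ P @ H.T + R                         # update
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (zk - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return np.array(out)

# synthetic test: linear range trend plus white measurement noise
rng = np.random.default_rng(0)
truth = 2.0e7 + 800.0 * np.arange(200)              # metres
noisy = truth + rng.normal(0.0, 5.0, truth.size)
smooth = smooth_ranges(noisy, meas_var=25.0)
in_var = np.var(noisy[50:] - truth[50:])            # skip the transient
out_var = np.var(smooth[50:] - truth[50:])
print(in_var / out_var)  # steady-state noise reduction factor
```

The ratio printed at the end is the same figure of merit the abstract reports (input noise variance over output noise variance).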

  16. High-latitude filtering in a global grid-point model using model normal modes. [Fourier filters for synoptic weather forecasting

    NASA Technical Reports Server (NTRS)

    Takacs, L. L.; Kalnay, E.; Navon, I. M.

    1985-01-01

    A normal modes expansion technique is applied to perform high latitude filtering in the GLAS fourth order global shallow water model with orography. The maximum permissible time step in the solution code is controlled by the frequency of the fastest propagating mode, which can be a gravity wave. Numerical methods are defined for filtering the data to identify the number of gravity modes to be included in the computations in order to obtain the appropriate zonal wavenumbers. The performances of the model with and without the filter, and with a time tendency and a prognostic field filter are tested with simulations of the Northern Hemisphere winter. The normal modes expansion technique is shown to leave the Rossby modes intact and permit 3-5 day predictions, a range not possible with the other high-latitude filters.

  17. A tool for filtering information in complex systems

    NASA Astrophysics Data System (ADS)

    Tumminello, M.; Aste, T.; Di Matteo, T.; Mantegna, R. N.

    2005-07-01

    We introduce a technique to filter out complex data sets by extracting a subgraph of representative links. Such a filtering can be tuned up to any desired level by controlling the genus of the resulting graph. We show that this technique is especially suitable for correlation-based graphs, giving filtered graphs that preserve the hierarchical organization of the minimum spanning tree but containing a larger amount of information in their internal structure. In particular in the case of planar filtered graphs (genus equal to 0), triangular loops and four-element cliques are formed. The application of this filtering procedure to 100 stocks in the U.S. equity markets shows that such loops and cliques have important and significant relationships with the market structure and properties. This paper was submitted directly (Track II) to the PNAS office. Abbreviations: MST, minimum spanning tree; PMFG, Planar Maximally Filtered Graph; r-clique, clique of r elements.
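
The hierarchical backbone these filtered graphs preserve is the minimum spanning tree of the correlation-based graph. A minimal sketch of its greedy construction (Kruskal's algorithm; the 4x4 matrix is purely illustrative) is shown below; the PMFG follows the same edge ordering but accepts an edge whenever the growing graph remains planar rather than cycle-free.

```python
import numpy as np

def correlation_mst(C):
    """Kruskal's algorithm on a correlation matrix: visit edges in order
    of decreasing correlation, keeping those that join two components."""
    n = C.shape[0]
    edges = sorted(((C[i, j], i, j) for i in range(n) for j in range(i + 1, n)),
                   reverse=True)
    parent = list(range(n))
    def find(a):                        # union-find with path compression
        while parent[a] != a:
            parent[a] = parent[parent[a]]
            a = parent[a]
        return a
    tree = []
    for w, i, j in edges:
        ri, rj = find(i), find(j)
        if ri != rj:                    # acyclicity test (planarity for the PMFG)
            parent[ri] = rj
            tree.append((i, j, float(w)))
    return tree                         # the n - 1 strongest links forming a tree

C = np.array([[1.0, 0.9, 0.2, 0.1],
              [0.9, 1.0, 0.3, 0.2],
              [0.2, 0.3, 1.0, 0.8],
              [0.1, 0.2, 0.8, 1.0]])
tree = correlation_mst(C)
print(tree)  # [(0, 1, 0.9), (2, 3, 0.8), (1, 2, 0.3)]
```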

  18. Selection vector filter framework

    NASA Astrophysics Data System (ADS)

    Lukac, Rastislav; Plataniotis, Konstantinos N.; Smolka, Bogdan; Venetsanopoulos, Anastasios N.

    2003-10-01

    We provide a unified framework of nonlinear vector techniques outputting the lowest ranked vector. The proposed framework constitutes a generalized filter class for multichannel signal processing. A new class of nonlinear selection filters is introduced, based on robust order-statistic theory and the minimization of the weighted distance function to other input samples. The proposed method can be designed to perform a variety of filtering operations, including previously developed techniques such as the vector median filter, the basic vector directional filter, the directional distance filter, weighted vector median filters, and weighted directional filters. A wide range of filtering operations is guaranteed by the filter structure with two independent weight vectors for the angular and distance domains of the vector space. In order to adapt the filter parameters to varying signal and noise statistics, we also provide generalized optimization algorithms that take advantage of weighted median filters and of the relationship between the standard median filter and the vector median filter. Thus, we can deal with both statistical and deterministic aspects of the filter design process. It will be shown that the proposed method has the required properties, such as the capability of modelling the underlying system in the application at hand, robustness with respect to errors in the model of the underlying system, the availability of a training procedure, and the simplicity of filter representation, analysis, design, and implementation. Simulation studies also indicate that the new filters are computationally attractive and have excellent performance in environments corrupted by bit errors and impulsive noise.
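
A minimal sketch of the classical vector median filter, one of the special cases named above (the window contents are illustrative): the output is the input sample whose summed distance to all other samples in the window is smallest, which is why impulsive outliers are never selected.

```python
import numpy as np

def vector_median(window):
    """Return the lowest-ranked vector: the sample minimizing the sum
    of Euclidean distances to all other samples in the window."""
    W = np.asarray(window, dtype=float)
    d = np.linalg.norm(W[:, None, :] - W[None, :, :], axis=2)  # pairwise distances
    return W[np.argmin(d.sum(axis=1))]

# a 3x3 RGB neighbourhood containing one impulsive outlier
window = [(10, 12, 11), (11, 11, 12), (12, 10, 10),
          (11, 12, 11), (255, 0, 255), (10, 11, 12),
          (12, 12, 12), (11, 10, 11), (10, 10, 10)]
vm = vector_median(window)
print(vm)  # one of the clustered samples; the outlier is rejected
```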

  19. Destriping of Landsat MSS images by filtering techniques

    USGS Publications Warehouse

    Pan, Jeng-Jong; Chang, Chein-I

    1992-01-01

    The removal of striping noise encountered in the Landsat Multispectral Scanner (MSS) images can be generally done by using frequency filtering techniques. Frequency domain filtering has, however, several problems, such as storage limitation of data required for fast Fourier transforms, ringing artifacts appearing at high-intensity discontinuities, and edge effects between adjacent filtered data sets. One way of circumventing the above difficulties is to design a spatial filter to convolve with the images. Because it is known that the striping always appears at frequencies of 1/6, 1/3, and 1/2 cycles per line, it is possible to design a simple one-dimensional spatial filter to take advantage of this a priori knowledge to cope with the above problems. The desired filter is of the finite impulse response type, which can be designed by linear programming and Remez's exchange algorithm coupled with an adaptive technique. In addition, a four-step spatial filtering technique with an appropriate adaptive approach is also presented, which may be particularly useful for geometrically rectified MSS images.
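
As a minimal illustration of exploiting the known striping frequencies (this is not the Remez-designed FIR filter of the paper): a 6-tap moving average has spectral nulls at exactly 1/6, 1/3, and 1/2 cycles per sample, so convolving along the stripe direction suppresses all three striping harmonics at once. The synthetic image below is illustrative.

```python
import numpy as np

# the frequency response of a 6-tap boxcar vanishes at f = k/6, k = 1..5
h = np.ones(6) / 6.0
for f in (1 / 6, 1 / 3, 1 / 2):  # the MSS striping frequencies
    H = abs(np.sum(h * np.exp(-2j * np.pi * f * np.arange(6))))
    print(f, H)                  # each |H(f)| is numerically zero

# synthetic image with period-6 striping along the rows
rows = np.arange(120)
stripes = 5.0 * np.sin(2 * np.pi * rows / 6)
image = 100.0 + stripes[:, None] * np.ones((1, 80))
clean = np.apply_along_axis(lambda c: np.convolve(c, h, mode='same'), 0, image)
print(np.ptp(clean[10:-10, 0]))  # interior residual striping amplitude is ~0
```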

  20. Applications of charge-coupled device transversal filters to communication

    NASA Technical Reports Server (NTRS)

    Buss, D. D.; Bailey, W. H.; Brodersen, R. W.; Hewes, C. R.; Tasch, A. F., Jr.

    1975-01-01

    The paper discusses the computational power of state-of-the-art charge-coupled device (CCD) transversal filters in communications applications. Some of the performance limitations of CCD transversal filters are discussed, with attention given to time delay and bandwidth, imperfect charge transfer efficiency, weighting coefficient error, noise, and linearity. The application of CCD transversal filters to matched filtering, spectral filtering, and Fourier analysis is examined. Techniques for making programmable transversal filters are briefly outlined.

  1. Adaptive Filtering to Enhance Noise Immunity of Impedance and Admittance Spectroscopy: Comparison with Fourier Transformation

    NASA Astrophysics Data System (ADS)

    Stupin, Daniil D.; Koniakhin, Sergei V.; Verlov, Nikolay A.; Dubina, Michael V.

    2017-05-01

    The time-domain technique for impedance spectroscopy consists of computing the excitation voltage and current response Fourier images by fast or discrete Fourier transformation and calculating their relation. Here we propose an alternative method for processing the excitation voltage and current response to derive a system impedance spectrum, based on a fast and flexible adaptive filtering method. We show the equivalence between the problem of adaptive filter learning and deriving the system impedance spectrum. Specifically, we express the impedance via the adaptive filter weight coefficients. The noise-canceling property of adaptive filtering is also justified. Using the RLC circuit as a model system, we experimentally show that adaptive filtering yields correct admittance spectra and element ratings under high-noise conditions where the Fourier-transform technique fails. By providing additional sensitivity in impedance spectroscopy, adaptive filtering can be applied to otherwise impossible-to-interpret time-domain impedance data. The advantages of adaptive filtering are demonstrated with practical living-cell impedance measurements.
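
A minimal sketch of the underlying idea, using an LMS adaptive filter for system identification under heavy noise (the 3-tap "system", step size, and noise level are illustrative; the paper's mapping from learned weights to an impedance spectrum is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(2)
true_w = np.array([0.5, -0.3, 0.2])        # unknown linear system
x = rng.standard_normal(20000)             # excitation signal
y = np.convolve(x, true_w)[:x.size]        # system response ...
y += 0.5 * rng.standard_normal(x.size)     # ... buried in heavy noise

w = np.zeros(3)                            # adaptive filter weights
mu = 0.005                                 # LMS step size
for n in range(2, x.size):
    u = x[n - 2:n + 1][::-1]               # [x[n], x[n-1], x[n-2]]
    e = y[n] - w @ u                       # prediction error
    w += mu * e * u                        # LMS weight update
print(w)                                   # converges near true_w
```

Despite a noise level comparable to the signal itself, the converged weights characterize the system's response, which is the noise-canceling property the abstract refers to.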

  2. A comparative study on preprocessing techniques in diabetic retinopathy retinal images: illumination correction and contrast enhancement.

    PubMed

    Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza

    2015-01-01

    To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented several illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating coefficients of variation. The dividing method using the median filter to estimate background illumination showed the lowest coefficient of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, presented good results based on their low coefficients of variation. Contrast limited adaptive histogram equalization (CLAHE) increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same level of accuracy, and has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, namely the dividing method using the median filter to estimate background, the quotient-based method, and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique such as CLAHE to fundus images showed good potential for enhancing vasculature segmentation.

  3. Full-color large-scaled computer-generated holograms using RGB color filters.

    PubMed

    Tsuchiyama, Yasuhiro; Matsushima, Kyoji

    2017-02-06

    A technique using RGB color filters is proposed for creating high-quality full-color computer-generated holograms (CGHs). The fringe of these CGHs is composed of more than a billion pixels. The CGHs reconstruct full-parallax three-dimensional color images with a deep sensation of depth caused by natural motion parallax. The simulation technique as well as the principle and challenges of high-quality full-color reconstruction are presented to address the design of filter properties suitable for large-scaled CGHs. Optical reconstructions of actual fabricated full-color CGHs are demonstrated in order to verify the proposed techniques.

  4. Post-acquisition data mining techniques for LC-MS/MS-acquired data in drug metabolite identification.

    PubMed

    Dhurjad, Pooja Sukhdev; Marothu, Vamsi Krishna; Rathod, Rajeshwari

    2017-08-01

    Metabolite identification is a crucial part of the drug discovery process. LC-MS/MS-based metabolite identification has gained widespread use, but the data acquired by the LC-MS/MS instrument are complex, and thus the interpretation of the data becomes troublesome. Fortunately, advancements in data mining techniques have simplified the process of data interpretation with improved mass accuracy, and provide a potentially selective, sensitive, accurate and comprehensive way for metabolite identification. In this review, we discuss the targeted (extracted ion chromatogram, mass defect filter, product ion filter, neutral loss filter and isotope pattern filter) and untargeted (control sample comparison, background subtraction and metabolomic approaches) post-acquisition data mining techniques, which facilitate drug metabolite identification. We also discuss the importance of an integrated data mining strategy.
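
A minimal sketch of one of the targeted techniques listed above, the mass defect filter (all m/z values and the tolerance window are illustrative): ions whose mass defect (the decimal part of m/z) falls outside a window around the parent drug's defect are discarded, since common biotransformations shift the defect only slightly.

```python
def mass_defect_filter(mz_values, parent_mz, window_mda=50.0):
    """Keep ions whose mass defect lies within window_mda (milli-Da)
    of the parent compound's mass defect."""
    parent_defect = parent_mz - int(parent_mz)
    return [mz for mz in mz_values
            if abs((mz - int(mz)) - parent_defect) * 1000.0 <= window_mda]

# hypothetical survey-scan ions; 310.1573 plays the role of the parent drug
ions = [310.1573, 326.1522, 342.1471, 450.9012, 299.7755]
kept = mass_defect_filter(ions, parent_mz=310.1573)
print(kept)  # [310.1573, 326.1522, 342.1471] -- likely metabolites retained
```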

  5. Sensor failure detection system. [for the F100 turbofan engine

    NASA Technical Reports Server (NTRS)

    Beattie, E. C.; Laprad, R. F.; Mcglone, M. E.; Rock, S. M.; Akhter, M. M.

    1981-01-01

    Advanced concepts for detecting, isolating, and accommodating sensor failures were studied to determine their applicability to the gas turbine control problem. Five concepts were formulated based upon such techniques as Kalman filters, and a screening process led to the selection of one advanced concept for further evaluation. The selected advanced concept uses a Kalman filter to generate residuals, a weighted sum-squared residuals technique to detect soft failures, likelihood ratio testing of a bank of Kalman filters for isolation, and reconfiguration of the normal-mode Kalman filter, eliminating the failed input, to accommodate the failure. The advanced concept was compared to a baseline parameter synthesis technique and was shown to be viable for detecting, isolating, and accommodating sensor failures in gas turbine applications.

  6. Automating "Word of Mouth" to Recommend Classes to Students: An Application of Social Information Filtering Algorithms

    ERIC Educational Resources Information Center

    Booker, Queen Esther

    2009-01-01

    An approach used to tackle the problem of helping online students find the classes they want and need is a filtering technique called "social information filtering," a general approach to personalized information filtering. Social information filtering essentially automates the process of "word-of-mouth" recommendations: items are recommended to a…

  7. Measuring the Interestingness of News Articles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pon, R K; Cardenas, A F; Buttler, D J

    An explosive growth of online news has taken place. Users are inundated with thousands of news articles, only some of which are interesting. A system to filter out uninteresting articles would aid users who need to read and analyze many articles daily, such as financial analysts and government officials. The most obvious approach for reducing the amount of information overload is to learn keywords of interest for a user (Carreira et al., 2004). Although filtering articles based on keywords removes many irrelevant articles, there are still many uninteresting articles that are highly relevant to keyword searches. A relevant article may not be interesting for various reasons, such as the article's age or if it discusses an event that the user has already read about in other articles. Although it has been shown that collaborative filtering can aid in personalized recommendation systems (Wang et al., 2006), a large number of users is needed. In a limited user environment, such as a small group of analysts monitoring news events, collaborative filtering would be ineffective. The definition of what makes an article interesting--or its 'interestingness'--varies from user to user and is continually evolving, calling for adaptable user personalization. Furthermore, due to the nature of news, most articles are uninteresting since many are similar or report events outside the scope of an individual's concerns. There has been much work in news recommendation systems, but none has yet addressed the question of what makes an article interesting.

  8. Toward visual user interfaces supporting collaborative multimedia content management

    NASA Astrophysics Data System (ADS)

    Husein, Fathi; Leissler, Martin; Hemmje, Matthias

    2000-12-01

    Supporting collaborative multimedia content management activities, such as image and video acquisition, exploration, and access dialogues between naive users and multimedia information systems, is a non-trivial task. Although a wide variety of experimental and prototypical multimedia storage technologies as well as corresponding indexing and retrieval engines are available, most of them lack appropriate support for collaborative, end-user-oriented user interface front ends. The development of advanced, user-adaptable interfaces is necessary for building collaborative multimedia information-space presentations based upon advanced tools for information browsing, searching, filtering, and brokering, to be applied to potentially very large and highly dynamic multimedia collections with large numbers of users and user groups. Therefore, the development of advanced and at the same time adaptable and collaborative computer-graphical information presentation schemes that make it easy to apply adequate visual metaphors for defined target user stereotypes has to become a key focus of ongoing research activities aiming to support collaborative information work with multimedia collections.

  9. Telemetry Modernization with Open Architecture Software-Defined Radio Technology

    DTIC Science & Technology

    2016-01-01

    ...analog-to-digital (A/D) convertors and separated into narrowband channels through digital down-conversion (DDC) techniques implemented in field-programmable gate arrays (FPGAs).

  10. Performance Evaluation of Axial Flow AG-1 FC and Prototype FM (High Strength) HEPA Filters - 13123

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giffin, Paxton K.; Parsons, Michael S.; Wilson, John A.

    High efficiency particulate air (HEPA) filters are routinely used in DOE nuclear containment activities. The Nuclear Air Cleaning Handbook (NACH) stipulates that air cleaning devices and equipment used in DOE nuclear applications must meet the American Society of Mechanical Engineers (ASME) Code on Nuclear Air and Gas Treatment (AG-1) standard. This testing activity evaluates two different axial flow HEPA filters, those from AG-1 Sections FC and FM. Section FM is under development and has not yet been added to AG-1 due to a lack of qualification data available for these filters. Section FC filters are axial flow units that utilize a fibrous glass filtering medium. The section FM filters utilize a similar fibrous glass medium, but also have scrim backing. The scrim-backed filters have demonstrated the ability to endure pressure impulses capable of completely destroying FC filters. The testing activities presented herein will examine the total lifetime loading for both FC and FM filters under ambient conditions and at elevated conditions of temperature and relative humidity. Results will include loading curves, penetration curves, and testing condition parameters. These testing activities have been developed through collaborations with representatives from the National Nuclear Security Administration (NNSA), DOE Office of Environmental Management (DOE-EM), New Mexico State University, and Mississippi State University. (authors)

  11. A CANDLE for a deeper in vivo insight

    PubMed Central

    Coupé, Pierrick; Munz, Martin; Manjón, Jose V; Ruthazer, Edward S; Louis Collins, D.

    2012-01-01

    A new Collaborative Approach for eNhanced Denoising under Low-light Excitation (CANDLE) is introduced for the processing of 3D laser scanning multiphoton microscopy images. CANDLE is designed to be robust for low signal-to-noise ratio (SNR) conditions typically encountered when imaging deep in scattering biological specimens. Based on an optimized non-local means filter involving the comparison of filtered patches, CANDLE locally adapts the amount of smoothing in order to deal with the noise inhomogeneity inherent to laser scanning fluorescence microscopy images. An extensive validation on synthetic data, images acquired on microspheres and in vivo images is presented. These experiments show that the CANDLE filter obtained competitive results compared to a state-of-the-art method and a locally adaptive optimized nonlocal means filter, especially under low SNR conditions (PSNR<8dB). Finally, the deeper imaging capabilities enabled by the proposed filter are demonstrated on deep tissue in vivo images of neurons and fine axonal processes in the Xenopus tadpole brain. PMID:22341767

  12. Entrapment of Guide Wire in an Inferior Vena Cava Filter: A Technique for Removal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdel-Aal, Ahmed Kamel, E-mail: akamel@uabmc.edu; Saddekni, Souheil; Hamed, Maysoon Farouk

    Entrapment of a central venous catheter (CVC) guide wire in an inferior vena cava (IVC) filter is a rare, but reported complication during CVC placement. With the increasing use of vena cava filters (VCFs), this number will most likely continue to grow. The consequences of this complication can be serious, as continued traction upon the guide wire may result in filter dislodgement and migration, filter fracture, or injury to the IVC. We describe a case in which a J-tipped guide wire introduced through a left subclavian access without fluoroscopic guidance during CVC placement was entrapped at the apex of an IVC filter. We describe a technique that we used successfully in removing the entrapped wire through the left subclavian access site. We also present simple useful recommendations to prevent this complication.

  13. Rule-based fuzzy vector median filters for 3D phase contrast MRI segmentation

    NASA Astrophysics Data System (ADS)

    Sundareswaran, Kartik S.; Frakes, David H.; Yoganathan, Ajit P.

    2008-02-01

    Recent technological advances have contributed to the advent of phase contrast magnetic resonance imaging (PCMRI) as standard practice in clinical environments. In particular, decreased scan times have made using the modality more feasible. PCMRI is now a common tool for flow quantification, and for more complex vector field analyses that target the early detection of problematic flow conditions. Segmentation is one component of this type of application that can impact the accuracy of the final product dramatically. Vascular segmentation, in general, is a long-standing problem that has received significant attention. Segmentation in the context of PCMRI data, however, has been explored less and can benefit from object-based image processing techniques that incorporate fluids specific information. Here we present a fuzzy rule-based adaptive vector median filtering (FAVMF) algorithm that in combination with active contour modeling facilitates high-quality PCMRI segmentation while mitigating the effects of noise. The FAVMF technique was tested on 111 synthetically generated PC MRI slices and on 15 patients with congenital heart disease. The results were compared to other multi-dimensional filters namely the adaptive vector median filter, the adaptive vector directional filter, and the scalar low pass filter commonly used in PC MRI applications. FAVMF significantly outperformed the standard filtering methods (p < 0.0001). Two conclusions can be drawn from these results: a) Filtering should be performed after vessel segmentation of PC MRI; b) Vector based filtering methods should be used instead of scalar techniques.

  14. Grid artifact reduction for direct digital radiography detectors based on rotated stationary grids with homomorphic filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Dong Sik; Lee, Sanggyun

    2013-06-15

    Purpose: Grid artifacts are caused when using the antiscatter grid in obtaining digital x-ray images. In this paper, research on grid artifact reduction techniques is conducted especially for the direct detectors, which are based on amorphous selenium. Methods: In order to analyze and reduce the grid artifacts, the authors consider a multiplicative grid image model and propose a homomorphic filtering technique. For minimal damage due to filters, which are used to suppress the grid artifacts, rotated grids with respect to the sampling direction are employed, and min-max optimization problems for searching optimal grid frequencies and angles for given sampling frequencies are established. The authors then propose algorithms for the grid artifact reduction based on the band-stop filters as well as low-pass filters. Results: The proposed algorithms are experimentally tested for digital x-ray images, which are obtained from direct detectors with the rotated grids, and are compared with other algorithms. It is shown that the proposed algorithms can successfully reduce the grid artifacts for direct detectors. Conclusions: By employing the homomorphic filtering technique, the authors can considerably suppress the strong grid artifacts with relatively narrow-bandwidth filters compared to the normal filtering case. Using rotated grids also significantly reduces the ringing artifact. Furthermore, for specific grid frequencies and angles, the authors can use simple homomorphic low-pass filters in the spatial domain, and thus alleviate the grid artifacts with very low implementation complexity.
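
The core of homomorphic filtering, log-transforming a multiplicative grid model so the periodic grid term becomes additive and removable by a linear filter, can be sketched as follows. The grid period, modulation depth, and boxcar low-pass filter are illustrative; the paper's rotated grids and optimized band-stop filters are omitted.

```python
import numpy as np

# multiplicative model I(x, y) = S(x, y) * G(y): logs turn the product into
# a sum, a filter along the grid direction removes the periodic term, and
# exponentiation restores the image
rng = np.random.default_rng(1)
n = 128
scene = 200.0 + 20.0 * rng.random((n, n))                   # underlying image
grid = 1.0 + 0.3 * np.sin(2 * np.pi * np.arange(n) / 4.0)   # period-4 grid lines
image = scene * grid[None, :]

log_im = np.log(image)
h = np.ones(4) / 4.0        # a period-4 boxcar nulls the grid term and harmonics
filtered = np.apply_along_axis(lambda r: np.convolve(r, h, mode='same'), 1, log_im)
restored = np.exp(filtered)

err_before = np.abs(image - scene).mean()
err_after = np.abs(restored[:, 4:-4] - scene[:, 4:-4]).mean()
print(err_before, err_after)  # the grid term dominates before, not after
```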

  15. Ballistic Imaging and Scattering Measurements for Diesel Spray Combustion: Optical Development and Phenomenological Studies

    DTIC Science & Technology

    2016-04-01

    The technique is demonstrated in a cell filled with polystyrene spheres in a water suspension. The impact of spatial filtering, temporal filtering, and scattering path length on image resolution is reported.

  16. Triangular covariance factorizations for Kalman filtering. Ph.D. Thesis - Calif. Univ.

    NASA Technical Reports Server (NTRS)

    Thornton, C. L.

    1976-01-01

    An improved computational form of the discrete Kalman filter is derived using an upper triangular factorization of the error covariance matrix. The covariance P is factored such that P = UDU^T, where U is unit upper triangular and D is diagonal. Recursions are developed for propagating the U-D covariance factors together with the corresponding state estimate. The resulting algorithm, referred to as the U-D filter, combines the superior numerical precision of square root filtering techniques with an efficiency comparable to that of Kalman's original formula. Moreover, this method is easily implemented and involves no more computer storage than the Kalman algorithm. These characteristics make the U-D method an attractive real-time filtering technique. A new covariance error analysis technique is obtained from an extension of the U-D filter equations. This evaluation method is flexible and efficient and may provide significantly improved numerical results. Cost comparisons show that for a large class of problems the U-D evaluation algorithm is noticeably less expensive than conventional error analysis methods.
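
The factorization itself can be computed with a short backward recursion; a minimal numpy sketch is below (the test matrix is illustrative, and the full U-D filter time and measurement updates are not shown).

```python
import numpy as np

def udu_factor(P):
    """Factor symmetric positive-definite P as P = U @ D @ U.T,
    with U unit upper triangular and D diagonal."""
    P = P.astype(float).copy()
    n = P.shape[0]
    U = np.eye(n)
    d = np.zeros(n)
    for j in range(n - 1, -1, -1):      # peel off columns from the right
        d[j] = P[j, j]
        for i in range(j):
            U[i, j] = P[i, j] / d[j]
            for k in range(i + 1):      # subtract the rank-1 term d_j u_j u_j^T
                P[k, i] -= U[k, j] * d[j] * U[i, j]
    return U, np.diag(d)

A = np.array([[4.0, 2.0, 0.6],
              [2.0, 2.0, 0.5],
              [0.6, 0.5, 1.0]])
U, D = udu_factor(A)
print(np.allclose(U @ D @ U.T, A))  # True
```

Propagating U and D instead of P keeps the covariance symmetric and non-negative by construction, which is the source of the numerical robustness the abstract describes.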

  17. Applications of Kalman filtering to real-time trace gas concentration measurements

    NASA Technical Reports Server (NTRS)

    Leleux, D. P.; Claps, R.; Chen, W.; Tittel, F. K.; Harman, T. L.

    2002-01-01

    A Kalman filtering technique is applied to the simultaneous detection of NH3 and CO2 with a diode-laser-based sensor operating at 1.53 micrometers. This technique is developed for improving the sensitivity and precision of trace gas concentration levels based on direct overtone laser absorption spectroscopy in the presence of various sensor noise sources. Filter performance is demonstrated to be adaptive to real-time noise and data statistics. Additionally, filter operation is successfully performed with dynamic ranges differing by three orders of magnitude. Details of Kalman filter theory applied to the acquired spectroscopic data are discussed. The effectiveness of this technique is evaluated by performing NH3 and CO2 concentration measurements and utilizing it to monitor varying ammonia and carbon dioxide levels in a bioreactor for water reprocessing, located at the NASA-Johnson Space Center. Results indicate a sensitivity enhancement of six times, in terms of improved minimum detectable absorption by the gas sensor.

  18. Homogenous polynomially parameter-dependent H∞ filter designs of discrete-time fuzzy systems.

    PubMed

    Zhang, Huaguang; Xie, Xiangpeng; Tong, Shaocheng

    2011-10-01

    This paper proposes a novel H∞ filtering technique for a class of discrete-time fuzzy systems. First, a novel kind of fuzzy H∞ filter, which is homogenous polynomially parameter-dependent on the membership functions with an arbitrary degree, is developed to guarantee the asymptotic stability and a prescribed H∞ performance of the filtering error system. Second, relaxed conditions for H∞ performance analysis are proposed by using a new fuzzy Lyapunov function and the Finsler lemma with homogenous polynomial matrix Lagrange multipliers. Then, based on a new kind of slack variable technique, relaxed linear matrix inequality-based H∞ filtering conditions are proposed. Finally, two numerical examples are provided to illustrate the effectiveness of the proposed approach.

  19. Desensitized Optimal Filtering and Sensor Fusion Toolkit

    NASA Technical Reports Server (NTRS)

    Karlgaard, Christopher D.

    2015-01-01

    Analytical Mechanics Associates, Inc., has developed a software toolkit that filters and processes navigational data from multiple sensor sources. A key component of the toolkit is a trajectory optimization technique that reduces the sensitivity of Kalman filters with respect to model parameter uncertainties. The sensor fusion toolkit also integrates recent advances in adaptive Kalman and sigma-point filters for problems with non-Gaussian error statistics. This Phase II effort provides new filtering and sensor fusion techniques in a convenient package that can be used as a stand-alone application for ground support and/or onboard use. Its modular architecture enables ready integration with existing tools. A suite of sensor models and noise distributions as well as Monte Carlo analysis capability are included to enable statistical performance evaluations.

  20. A comparison of the static and flow methods for the detection of ice nuclei

    NASA Astrophysics Data System (ADS)

    Hussain, K.; Kayani, S. A.

    The use of the membrane-filter processing chamber to study ice nuclei concentrations has become widespread since its introduction by Bigg et al. in 1961. The technique is convenient because of the simplicity of its operation and because it can be run remote from the place of field study. It has, however, been found to suffer from a number of drawbacks, namely the volume effect, the chamber height effect, the vapour depletion effect, etc. Comparison of the results obtained by running a traditional filter processor and a continuous flow chamber under identical temperature and humidity conditions for polluted Manchester air has shown that the latter technique detects more ice nuclei than the former by a factor of about 14±4. These results confirm that the filter technique suffers from the vapour depletion effect. The present results are in agreement with Bigg et al., Mossop and Thorndike, and King. In light of our findings, the filter technique does not appear to be a standard method. Therefore, the ice nuclei data obtained with the filter method should not be extended to clouds in order to study their microphysical properties.

  1. Comparison of active-set method deconvolution and matched-filtering for derivation of an ultrasound transit time spectrum.

    PubMed

    Wille, M-L; Zapf, M; Ruiter, N V; Gemmeke, H; Langton, C M

    2015-06-21

    The quality of ultrasound computed tomography imaging is primarily determined by the accuracy of ultrasound transit time measurement. A major problem in analysis is the overlap of signals, which makes it difficult to detect the correct transit time. The current standard is to apply a matched-filtering approach to the input and output signals. This study compares the matched-filtering technique with active-set deconvolution for deriving a transit time spectrum from a coded-excitation chirp signal and the measured output signal. The ultrasound wave travels along a direct and a reflected path to the receiver, resulting in an overlap in the recorded output signal. The matched-filtering and deconvolution techniques were applied to determine the transit times associated with the two signal paths. Both techniques were able to detect the two different transit times; while matched-filtering achieves better accuracy (0.13 μs versus 0.18 μs standard deviation), deconvolution has a 3.5 times better side-lobe to main-lobe ratio. Higher side-lobe suppression is important for further improving image fidelity. These results suggest that a future combination of both techniques would provide improved signal detection and hence improved image fidelity.
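    The matched-filtering step described above can be sketched numerically: cross-correlating the received signal with the transmitted chirp template concentrates each path's energy at its transit-time lag. This is an illustration only, not the authors' code; the sample rate, chirp parameters, delays, and amplitudes below are invented.

```python
import numpy as np

# Hypothetical setup: 1 MHz sampling, 100-sample linear-FM chirp template.
fs = 1e6
t = np.arange(100) / fs
f0, bandwidth, duration = 5e4, 4e5, 100 / fs
chirp = np.sin(2 * np.pi * (f0 * t + (bandwidth / (2 * duration)) * t**2))

# Received signal: direct path arriving at sample 100 and a weaker
# reflected path at sample 160; the two 100-sample echoes overlap in time.
rx = np.zeros(400)
rx[100:200] += chirp
rx[160:260] += 0.6 * chirp

# Matched filter = cross-correlation of the output with the input template.
mf = np.correlate(rx, chirp, mode="valid")

# The largest peak gives the direct-path transit time; masking its main
# lobe and searching again recovers the overlapping reflected path.
primary = int(np.argmax(mf))
masked = mf.copy()
masked[max(0, primary - 20):primary + 20] = 0.0
secondary = int(np.argmax(masked))
print(primary, secondary)
```

    The deconvolution alternative in the abstract instead solves for a sparse transit-time spectrum under a non-negativity constraint, which trades a little timing accuracy for the lower side lobes reported above.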

  2. Vena Cava Filter Retrieval with Aorto-Iliac Arterial Strut Penetration.

    PubMed

    Holly, Brian P; Gaba, Ron C; Lessne, Mark L; Lewandowski, Robert J; Ryu, Robert K; Desai, Kush R; Sing, Ronald F

    2018-05-03

    To evaluate the safety and technical success of inferior vena cava (IVC) filter retrieval in the setting of aorto-iliac arterial strut penetration. IVC filter registries from six large United States IVC filter retrieval practices were retrospectively reviewed to identify patients who underwent IVC filter retrieval in the setting of filter strut penetration into the adjacent aorta or iliac artery. Patient demographics, implant duration, indication for placement, IVC filter type, retrieval technique and technical success, adverse events, and post-procedural clinical outcomes were identified. Arterial penetration was determined based on pre-procedure CT imaging in all cases, and the IVC filter retrieval technique used was at the discretion of the operating physician. Seventeen patients from six US centers who underwent retrieval of an IVC filter with at least one strut penetrating either the aorta or iliac artery were identified. The retrieval technical success rate was 100% (17/17), without any major adverse events. Post-retrieval follow-up, ranging from 10 days to 2 years (mean 4.6 months), was available in 12/17 (71%) patients; no delayed adverse events were encountered. Findings from this series suggest that chronically indwelling IVC filters with aorto-iliac arterial strut penetration may be safely retrieved.

  3. A robust spatial filtering technique for multisource localization and geoacoustic inversion.

    PubMed

    Stotts, S A

    2005-07-01

    Geoacoustic inversion and source localization using beamformed data from a ship of opportunity has been demonstrated with a bottom-mounted array. An alternative approach, which lies within a class referred to as spatial filtering, transforms element level data into beam data, applies a bearing filter, and transforms back to element level data prior to performing inversions. Automation of this filtering approach is facilitated for broadband applications by restricting the inverse transform to the degrees of freedom of the array, i.e., the effective number of elements, for frequencies near or below the design frequency. A procedure is described for nonuniformly spaced elements that guarantees filter stability well above the design frequency. Monitoring energy conservation with respect to filter output confirms filter stability. Filter performance with both uniformly spaced and nonuniformly spaced array elements is discussed. Vertical (range and depth) and horizontal (range and bearing) ambiguity surfaces are constructed to examine filter performance. Examples that demonstrate this filtering technique with both synthetic data and real data are presented along with comparisons to inversion results using beamformed data. Examinations of cost functions calculated within a simulated annealing algorithm reveal the efficacy of the approach.
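    The transform-filter-inverse-transform idea in this abstract can be illustrated with a toy uniform line array in which the beamset is an orthogonal DFT, so the element-to-beam round trip is exact. Array size, bearings, and amplitudes below are invented; the paper's filter additionally restricts the inverse transform to the array's effective degrees of freedom for broadband, nonuniformly spaced arrays, which this sketch omits.

```python
import numpy as np

# Toy beamspace filter: 16-element line array at half-wavelength spacing.
N = 16
n = np.arange(N)

def wave(u, amp):
    # Plane wave across the array; u = sin(bearing).
    return amp * np.exp(1j * np.pi * n * u)

# Element-level snapshot: wanted arrival at u = 0.25 plus interference at
# u = -0.5.  Both bearings sit exactly on DFT beams (u_k = 2k/N), which
# keeps the demonstration exact.
x = wave(0.25, 2.0) + wave(-0.5, 1.0)

beams = np.fft.fft(x)                # element space -> beam space
mask = np.zeros(N)
mask[2] = 1.0                        # bearing filter: keep the u = 0.25 beam
x_filt = np.fft.ifft(beams * mask)   # beam space -> element space
```

    After the round trip, `x_filt` contains only the wanted plane wave at element level, ready for an inversion that expects element data.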

  4. Teaching learning based optimization-functional link artificial neural network filter for mixed noise reduction from magnetic resonance image.

    PubMed

    Kumar, M; Mishra, S K

    2017-01-01

    The clinical magnetic resonance imaging (MRI) images may get corrupted due to the presence of the mixture of different types of noises such as Rician, Gaussian, impulse, etc. Most of the available filtering algorithms are noise specific, linear, and non-adaptive. There is a need to develop a nonlinear adaptive filter that adapts itself according to the requirement and effectively applied for suppression of mixed noise from different MRI images. In view of this, a novel nonlinear neural network based adaptive filter i.e. functional link artificial neural network (FLANN) whose weights are trained by a recently developed derivative free meta-heuristic technique i.e. teaching learning based optimization (TLBO) is proposed and implemented. The performance of the proposed filter is compared with five other adaptive filters and analyzed by considering quantitative metrics and evaluating the nonparametric statistical test. The convergence curve and computational time are also included for investigating the efficiency of the proposed as well as competitive filters. The simulation outcomes of proposed filter outperform the other adaptive filters. The proposed filter can be hybridized with other evolutionary technique and utilized for removing different noise and artifacts from others medical images more competently.

  5. Spatial filtering with photonic crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maigyte, Lina; Staliunas, Kestutis; Institució Catalana de Recerca i Estudis Avançats

    2015-03-15

    Photonic crystals are well known for their celebrated photonic band-gaps—the forbidden frequency ranges, for which the light waves cannot propagate through the structure. The frequency (or chromatic) band-gaps of photonic crystals can be utilized for frequency filtering. In analogy to the chromatic band-gaps and the frequency filtering, the angular band-gaps and the angular (spatial) filtering are also possible in photonic crystals. In this article, we review the recent advances of the spatial filtering using the photonic crystals in different propagation regimes and for different geometries. We review the most evident configuration of filtering in Bragg regime (with the back-reflection—i.e., in the configuration with band-gaps) as well as in Laue regime (with forward deflection—i.e., in the configuration without band-gaps). We explore the spatial filtering in crystals with different symmetries, including axisymmetric crystals; we discuss the role of chirping, i.e., the dependence of the longitudinal period along the structure. We also review the experimental techniques to fabricate the photonic crystals and numerical techniques to explore the spatial filtering. Finally, we discuss several implementations of such filters for intracavity spatial filtering.

  6. An extended Kalman filter approach to non-stationary Bayesian estimation of reduced-order vocal fold model parameters.

    PubMed

    Hadwin, Paul J; Peterson, Sean D

    2017-04-01

    The Bayesian framework for parameter inference provides a basis from which subject-specific reduced-order vocal fold models can be generated. Previously, it has been shown that a particle filter technique is capable of producing estimates and associated credibility intervals of time-varying reduced-order vocal fold model parameters. However, the particle filter approach is difficult to implement and has a high computational cost, which can be barriers to clinical adoption. This work presents an alternative estimation strategy based upon Kalman filtering, aimed at reducing the computational cost of subject-specific model development. The robustness of this approach to Gaussian and non-Gaussian noise is discussed. The extended Kalman filter (EKF) approach is found to perform very well in comparison with the particle filter technique at dramatically lower computational cost. Based upon the test cases explored, the EKF is comparable in accuracy to the particle filter technique when more than 6000 particles are employed; if fewer particles are employed, the EKF actually performs better. For comparable levels of accuracy, the solution time is reduced by two orders of magnitude when employing the EKF. By virtue of the approximations used in the EKF, however, the credibility intervals tend to be slightly underpredicted.
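    The EKF machinery the abstract compares against the particle filter can be shown on a scalar toy problem (this is our illustration, not the vocal fold model): a random-walk state observed through a squared measurement, with the Jacobian of the measurement function supplying the linearisation. All parameter values are invented.

```python
# Toy extended Kalman filter: state x_k follows a random walk and is
# observed through the nonlinear measurement y_k = x_k**2 + noise.
def ekf(ys, x0=1.0, P0=1.0, Q=1e-4, R=1e-2):
    x, P = x0, P0
    for y in ys:
        P = P + Q                    # predict step (identity dynamics)
        H = 2.0 * x                  # Jacobian of h(x) = x**2 at the estimate
        S = H * P * H + R            # innovation covariance
        K = P * H / S                # Kalman gain
        x = x + K * (y - x * x)      # linearised measurement update
        P = (1.0 - K * H) * P
    return x, P

# Noise-free measurements of a true state of 2.0; starting from a positive
# initial guess the estimate converges to +2 (the sign is unobservable
# through x**2, which is exactly the kind of issue the Jacobian hides).
x_hat, P_hat = ekf([4.0] * 50)
print(x_hat)
```

    The per-step cost is a handful of scalar operations, versus thousands of model evaluations per step for a particle filter, which is the two-orders-of-magnitude saving the abstract reports.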

  7. A Recommendation Algorithm for Automating Corollary Order Generation

    PubMed Central

    Klann, Jeffrey; Schadow, Gunther; McCoy, JM

    2009-01-01

    Manual development and maintenance of decision support content is time-consuming and expensive. We explore recommendation algorithms, e-commerce data-mining tools that use collective order history to suggest purchases, to assist with this. In particular, previous work shows corollary order suggestions are amenable to automated data-mining techniques. Here, an item-based collaborative filtering algorithm augmented with association rule interestingness measures mined suggestions from 866,445 orders made in an inpatient hospital in 2007, generating 584 potential corollary orders. Our expert physician panel evaluated the top 92 and agreed 75.3% were clinically meaningful. Also, at least one felt 47.9% would be directly relevant in guideline development. This automated generation of a rough-cut of corollary orders confirms prior indications about automated tools in building decision support content. It is an important step toward computerized augmentation to decision support development, which could increase development efficiency and content quality while automatically capturing local standards. PMID:20351875
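    The item-based collaborative filtering core of such a system can be sketched as cosine similarity between item columns of an order-item matrix. The order data and item names below are invented toy examples, and the association-rule interestingness measures the paper adds on top are omitted.

```python
import numpy as np

# Toy order history: each row is the set of items in one order.
orders = [
    {"cbc", "bmp", "ekg"},
    {"cbc", "bmp"},
    {"cbc", "bmp"},
    {"cbc", "ekg"},
    {"xray"},
]
items = sorted(set().union(*orders))
M = np.array([[1.0 if it in o else 0.0 for it in items] for o in orders])

# Item-item cosine similarity over the binary order-item matrix.
norms = np.linalg.norm(M, axis=0)
sim = (M.T @ M) / np.outer(norms, norms)
np.fill_diagonal(sim, 0.0)           # an item is not its own corollary

def suggest(item):
    """Rank corollary-order candidates for `item` by similarity."""
    i = items.index(item)
    ranked = np.argsort(sim[i])[::-1]
    return [items[j] for j in ranked if sim[i, j] > 0]

print(suggest("cbc"))
```

    In the paper, candidates ranked this way were further screened with interestingness measures before being shown to the physician panel.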

  8. A recommendation algorithm for automating corollary order generation.

    PubMed

    Klann, Jeffrey; Schadow, Gunther; McCoy, J M

    2009-11-14

    Manual development and maintenance of decision support content is time-consuming and expensive. We explore recommendation algorithms, e-commerce data-mining tools that use collective order history to suggest purchases, to assist with this. In particular, previous work shows corollary order suggestions are amenable to automated data-mining techniques. Here, an item-based collaborative filtering algorithm augmented with association rule interestingness measures mined suggestions from 866,445 orders made in an inpatient hospital in 2007, generating 584 potential corollary orders. Our expert physician panel evaluated the top 92 and agreed 75.3% were clinically meaningful. Also, at least one felt 47.9% would be directly relevant in guideline development. This automated generation of a rough-cut of corollary orders confirms prior indications about automated tools in building decision support content. It is an important step toward computerized augmentation to decision support development, which could increase development efficiency and content quality while automatically capturing local standards.

  9. Toward detecting deception in intelligent systems

    NASA Astrophysics Data System (ADS)

    Santos, Eugene, Jr.; Johnson, Gregory, Jr.

    2004-08-01

    Contemporary decision makers often must choose a course of action using knowledge from several sources. Knowledge may be provided from many diverse sources, including electronic sources such as knowledge-based diagnostic or decision support systems, or through data mining techniques. As the decision maker becomes more dependent on these electronic information sources, detecting deceptive information from them becomes vital to making a correct, or at least more informed, decision. This applies to unintentional misinformation as well as intentional disinformation. Our ongoing research focuses on applying models of deception and deception detection from psychology and cognitive science to these systems, as well as implementing deception detection algorithms for probabilistic intelligent systems. The deception detection algorithms are used to detect, classify and correct attempts at deception. Algorithms for detecting unexpected information rely upon a prediction algorithm from the collaborative filtering domain to predict agent responses in a multi-agent system.

  10. Effective wind speed estimation: Comparison between Kalman Filter and Takagi-Sugeno observer techniques.

    PubMed

    Gauterin, Eckhard; Kammerer, Philipp; Kühn, Martin; Schulte, Horst

    2016-05-01

    Advanced model-based control of wind turbines requires knowledge of the states and the wind speed. This paper benchmarks a nonlinear Takagi-Sugeno observer for wind speed estimation against enhanced Kalman Filter techniques: the performance and robustness towards model-structure uncertainties of the Takagi-Sugeno observer and of Linear, Extended and Unscented Kalman Filters are assessed. The Takagi-Sugeno observer and the enhanced Kalman Filter techniques are compared based on reduced-order models of a reference wind turbine with different modelling details. The objective is a systematic comparison under different design assumptions and requirements, and a numerical evaluation of the reconstruction quality of the wind speed. Exemplified by a feedforward loop employing the reconstructed wind speed, the benefit of wind speed estimation within wind turbine control is illustrated.

  11. Estimation and filtering techniques for high-accuracy GPS applications

    NASA Technical Reports Server (NTRS)

    Lichten, S. M.

    1989-01-01

    Techniques for determination of very precise orbits for satellites of the Global Positioning System (GPS) are currently being studied and demonstrated. These techniques can be used to make cm-accurate measurements of station locations relative to the geocenter, monitor earth orientation over timescales of hours, and provide tropospheric and clock delay calibrations during observations made with deep space radio antennas at sites where the GPS receivers have been collocated. For high-earth orbiters, meter-level knowledge of position will be available from GPS, while at low altitudes, sub-decimeter accuracy will be possible. Estimation of satellite orbits and other parameters such as ground station positions is carried out with a multi-satellite batch sequential pseudo-epoch state process noise filter. Both square-root information filtering (SRIF) and UD-factorized covariance filtering formulations are implemented in the software.

  12. Dual linear structured support vector machine tracking method via scale correlation filter

    NASA Astrophysics Data System (ADS)

    Li, Weisheng; Chen, Yanquan; Xiao, Bin; Feng, Chen

    2018-01-01

    Adaptive tracking-by-detection methods based on structured support vector machine (SVM) performed well on recent visual tracking benchmarks. However, these methods did not adopt an effective strategy of object scale estimation, which limits the overall tracking performance. We present a tracking method based on a dual linear structured support vector machine (DLSSVM) with a discriminative scale correlation filter. The collaborative tracker comprised of a DLSSVM model and a scale correlation filter obtains good results in tracking target position and scale estimation. The fast Fourier transform is applied for detection. Extensive experiments show that our tracking approach outperforms many popular top-ranking trackers. On a benchmark including 100 challenging video sequences, the average precision of the proposed method is 82.8%.

  13. Multi-Dimensional High Order Essentially Non-Oscillatory Finite Difference Methods in Generalized Coordinates

    NASA Technical Reports Server (NTRS)

    Shu, Chi-Wang

    1998-01-01

    This project concerns the development of high order, non-oscillatory schemes for computational fluid dynamics. Algorithm analysis, implementation, and applications are performed. Collaborations with NASA scientists have been carried out to ensure that the research is relevant to NASA objectives. The combination of the ENO finite difference method with a spectral method in two space dimensions is considered, jointly with Cai [3]. The resulting scheme behaves nicely for two-dimensional test problems with or without shocks. Jointly with Cai and Gottlieb, we have also considered one-sided filters for spectral approximations to discontinuous functions [2]. We proved theoretically the existence of filters that recover spectral accuracy up to the discontinuity, and constructed such filters for practical calculations.

  14. Performances of JEM-EUSO: angular reconstruction. The JEM-EUSO Collaboration

    NASA Astrophysics Data System (ADS)

    Adams, J. H.; Ahmad, S.; Albert, J.-N.; Allard, D.; Anchordoqui, L.; Andreev, V.; Anzalone, A.; Arai, Y.; Asano, K.; Ave Pernas, M.; Baragatti, P.; Barrillon, P.; Batsch, T.; Bayer, J.; Bechini, R.; Belenguer, T.; Bellotti, R.; Belov, K.; Berlind, A. A.; Bertaina, M.; Biermann, P. L.; Biktemerova, S.; Blaksley, C.; Blanc, N.; Błȩcki, J.; Blin-Bondil, S.; Blümer, J.; Bobik, P.; Bogomilov, M.; Bonamente, M.; Briggs, M. S.; Briz, S.; Bruno, A.; Cafagna, F.; Campana, D.; Capdevielle, J.-N.; Caruso, R.; Casolino, M.; Cassardo, C.; Castellinic, G.; Catalano, C.; Catalano, G.; Cellino, A.; Chikawa, M.; Christl, M. J.; Cline, D.; Connaughton, V.; Conti, L.; Cordero, G.; Crawford, H. J.; Cremonini, R.; Csorna, S.; Dagoret-Campagne, S.; de Castro, A. J.; De Donato, C.; de la Taille, C.; De Santis, C.; del Peral, L.; Dell'Oro, A.; De Simone, N.; Di Martino, M.; Distratis, G.; Dulucq, F.; Dupieux, M.; Ebersoldt, A.; Ebisuzaki, T.; Engel, R.; Falk, S.; Fang, K.; Fenu, F.; Fernández-Gómez, I.; Ferrarese, S.; Finco, D.; Flamini, M.; Fornaro, C.; Franceschi, A.; Fujimoto, J.; Fukushima, M.; Galeotti, P.; Garipov, G.; Geary, J.; Gelmini, G.; Giraudo, G.; Gonchar, M.; González Alvarado, C.; Gorodetzky, P.; Guarino, F.; Guzmán, A.; Hachisu, Y.; Harlov, B.; Haungs, A.; Hernández Carretero, J.; Higashide, K.; Ikeda, D.; Ikeda, H.; Inoue, N.; Inoue, S.; Insolia, A.; Isgrò, F.; Itow, Y.; Joven, E.; Judd, E. G.; Jung, A.; Kajino, F.; Kajino, T.; Kaneko, I.; Karadzhov, Y.; Karczmarczyk, J.; Karus, M.; Katahira, K.; Kawai, K.; Kawasaki, Y.; Keilhauer, B.; Khrenov, B. A.; Kim, J.-S.; Kim, S.-W.; Kim, S.-W.; Kleifges, M.; Klimov, P. A.; Kolev, D.; Kreykenbohm, I.; Kudela, K.; Kurihara, Y.; Kusenko, A.; Kuznetsov, E.; Lacombe, M.; Lachaud, C.; Lee, J.; Licandro, J.; Lim, H.; López, F.; Maccarone, M. 
C.; Mannheim, K.; Maravilla, D.; Marcelli, L.; Marini, A.; Martinez, O.; Masciantonio, G.; Mase, K.; Matev, R.; Medina-Tanco, G.; Mernik, T.; Miyamoto, H.; Miyazaki, Y.; Mizumoto, Y.; Modestino, G.; Monaco, A.; Monnier-Ragaigne, D.; Morales de los Ríos, J. A.; Moretto, C.; Morozenko, V. S.; Mot, B.; Murakami, T.; Murakami, M. Nagano; Nagata, M.; Nagataki, S.; Nakamura, T.; Napolitano, T.; Naumov, D.; Nava, R.; Neronov, A.; Nomoto, K.; Nonaka, T.; Ogawa, T.; Ogio, S.; Ohmori, H.; Olinto, A. V.; Orleański, P.; Osteria, G.; Panasyuk, M. I.; Parizot, E.; Park, I. H.; Park, H. W.; Pastircak, B.; Patzak, T.; Paul, T.; Pennypacker, C.; Perez Cano, S.; Peter, T.; Picozza, P.; Pierog, T.; Piotrowski, L. W.; Piraino, S.; Plebaniak, Z.; Pollini, A.; Prat, P.; Prévôt, G.; Prieto, H.; Putis, M.; Reardon, P.; Reyes, M.; Ricci, M.; Rodríguez, I.; Rodríguez Frías, M. D.; Ronga, F.; Roth, M.; Rothkaehl, H.; Roudil, G.; Rusinov, I.; Rybczyński, M.; Sabau, M. D.; Sáez-Cano, G.; Sagawa, H.; Saito, A.; Sakaki, N.; Sakata, M.; Salazar, H.; Sánchez, S.; Santangelo, A.; Santiago Crúz, L.; Sanz Palomino, M.; Saprykin, O.; Sarazin, F.; Sato, H.; Sato, M.; Schanz, T.; Schieler, H.; Scotti, V.; Segreto, A.; Selmane, S.; Semikoz, D.; Serra, M.; Sharakin, S.; Shibata, T.; Shimizu, H. M.; Shinozaki, K.; Shirahama, T.; Siemieniec-Oziȩbło, G.; Silva López, H. H.; Sledd, J.; Słomińska, K.; Sobey, A.; Sugiyama, T.; Supanitsky, D.; Suzuki, M.; Szabelska, B.; Szabelski, J.; Tajima, F.; Tajima, N.; Tajima, T.; Takahashi, Y.; Takami, H.; Takeda, M.; Takizawa, Y.; Tenzer, C.; Tibolla, O.; Tkachev, L.; Tokuno, H.; Tomida, T.; Tone, N.; Toscano, S.; Trillaud, F.; Tsenov, R.; Tsunesada, Y.; Tsuno, K.; Tymieniecka, T.; Uchihori, Y.; Unger, M.; Vaduvescu, O.; Valdés-Galicia, J. F.; Vallania, P.; Valore, L.; Vankova, G.; Vigorito, C.; Villaseñor, L.; von Ballmoos, P.; Wada, S.; Watanabe, J.; Watanabe, S.; Watts, J.; Weber, M.; Weiler, T. 
J.; Wibig, T.; Wiencke, L.; Wille, M.; Wilms, J.; Włodarczyk, Z.; Yamamoto, T.; Yamamoto, Y.; Yang, J.; Yano, H.; Yashin, I. V.; Yonetoku, D.; Yoshida, K.; Yoshida, S.; Young, R.; Zotov, M. Yu.; Zuccaro Marchi, A.

    2015-11-01

    Mounted on the International Space Station (ISS), the Extreme Universe Space Observatory on board the Japanese Experimental Module (JEM-EUSO) relies on the well-established fluorescence technique to observe Extensive Air Showers (EAS) developing in the Earth's atmosphere. Focusing on the detection of Ultra High Energy Cosmic Rays (UHECR) in the decade around 10^20 eV, JEM-EUSO will face new challenges by applying this technique from space. The EUSO Simulation and Analysis Framework (ESAF) has been developed in this context to provide a full end-to-end simulation frame and to assess the overall performance of the detector. Within ESAF, angular reconstruction can be separated into two conceptually different steps. The first step is pattern recognition, or filtering, of the signal to separate it from the background. The second step is to perform different types of fitting in order to search for the relevant geometrical parameters that best describe the previously selected signal. In this paper, we discuss some of the techniques we have implemented in ESAF to perform the geometrical reconstruction of EAS seen by JEM-EUSO. We also conduct thorough tests to assess the performance of these techniques in conditions relevant to the scope of the JEM-EUSO mission. We conclude by showing the expected angular resolution in the energy range that JEM-EUSO is expected to observe.

  15. A Multi-Agent System for Intelligent Online Education.

    ERIC Educational Resources Information Center

    O'Riordan, Colm; Griffith, Josephine

    1999-01-01

    Describes the system architecture of an intelligent Web-based education system that includes user modeling agents, information filtering agents for automatic information gathering, and the multi-agent interaction. Discusses information management; user interaction; support for collaborative peer-peer learning; implementation; testing; and future…

  16. Imagining the Digital Library in a Commercialized Internet.

    ERIC Educational Resources Information Center

    Heckart, Ronald J.

    1999-01-01

    Discusses digital library planning in light of Internet commerce and technological innovation in marketing and customer relations that are transforming user expectations about Web sites that offer products and services. Topics include user self-sufficiency; personalized service; artificial intelligence; collaborative filtering; and electronic…

  17. Comparing Information Access Approaches.

    ERIC Educational Resources Information Center

    Chalmers, Matthew

    1999-01-01

    Presents a broad view of information access, drawing from philosophy and semiology in constructing a framework for comparative discussion that is used to examine the information representations that underlie four approaches to information access--information retrieval, workflow, collaborative filtering, and the path model. Contains 32 references.…

  18. A Comparative Study on Preprocessing Techniques in Diabetic Retinopathy Retinal Images: Illumination Correction and Contrast Enhancement

    PubMed Central

    Rasta, Seyed Hossein; Partovi, Mahsa Eisazadeh; Seyedarabi, Hadi; Javadzadeh, Alireza

    2015-01-01

    To investigate the effect of preprocessing techniques, including contrast enhancement and illumination correction, on retinal image quality, a comparative study was carried out. We studied and implemented several illumination correction and contrast enhancement techniques on color retinal images to find the best technique for optimum image enhancement. To compare and choose the best illumination correction technique, we analyzed the corrected red and green components of color retinal images statistically and visually. The two contrast enhancement techniques were analyzed using a vessel segmentation algorithm by calculating sensitivity and specificity. The statistical evaluation of the illumination correction techniques was carried out by calculating coefficients of variation. The dividing method, using the median filter to estimate background illumination, showed the lowest coefficient of variation in the red component. The quotient and homomorphic filtering methods, after the dividing method, also presented good results based on their low coefficients of variation. Contrast limited adaptive histogram equalization (CLAHE) increased the sensitivity of the vessel segmentation algorithm by up to 5% at the same accuracy, and has a higher sensitivity than the polynomial transformation operator as a contrast enhancement technique for vessel segmentation. Three techniques, the dividing method using the median filter to estimate background, the quotient-based method, and homomorphic filtering, were found to be effective illumination correction techniques based on the statistical evaluation. Applying a local contrast enhancement technique such as CLAHE to fundus images showed good potential for enhancing vasculature segmentation. PMID:25709940
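    As a simplified stand-in for the contrast-enhancement step, here is plain global histogram equalization in NumPy; CLAHE differs in that it builds the mapping per image tile and clips the histogram (the "contrast limited" part) before forming the cumulative distribution. The toy image below is invented.

```python
import numpy as np

# Global histogram equalization: map each grey level through the scaled
# cumulative distribution so the output occupies the full intensity range.
def hist_equalize(img, levels=256):
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum() / img.size               # cumulative distribution
    lut = np.round((levels - 1) * cdf).astype(img.dtype)
    return lut[img]                              # per-pixel lookup

# Invented low-contrast test image: intensities squeezed into 60..69.
img = np.tile(np.arange(60, 70, dtype=np.uint8), (10, 1))
out = hist_equalize(img)
```

    The equalized image spans the full 8-bit range, which is the stretch that improves vessel/background separability; CLAHE's tiling and clipping restrain exactly this stretch where it would amplify noise.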

  19. Strategies for prevention of iatrogenic inferior vena cava filter entrapment and dislodgement during central venous catheter placement.

    PubMed

    Wu, Alex; Helo, Naseem; Moon, Eunice; Tam, Matthew; Kapoor, Baljendra; Wang, Weiping

    2014-01-01

    Iatrogenic migration of inferior vena cava (IVC) filters is a potentially life-threatening complication that can arise during blind insertion of central venous catheters when the guide wire becomes entangled with the filter. In this study, we reviewed the occurrence of iatrogenic migration of IVC filters in the literature and assessed methods for preventing this complication. A literature search was conducted to identify reports of filter/wire entrapment and subsequent IVC filter migration, and clinical outcomes and complications were identified. A total of 38 cases of filter/wire entrapment were identified, all involving J-tip guide wires. Filters included 23 Greenfield filters, 14 VenaTech filters, and one TrapEase filter. In 18 cases of filter/wire entrapment, the filter migrated to the heart or other central venous structures. Retrieval of the migrated filter was successful in only four of the 18 cases, and all of these cases were complicated by strut fracture and distant embolization of fragments; one patient required resuscitation during retrieval. Successful disengagement was possible in 20 cases without filter migration. Iatrogenic migration of an IVC filter is an uncommon complication related to wire/filter entrapment. It can be prevented with knowledge of the patient's history, use of proper technique when placing a central venous catheter, identification of wire entrapment at an early stage, and use of an appropriate technique to disengage an entrapped wire.

  20. Fast digital noise filter capable of locating spectral peaks and shoulders

    NASA Technical Reports Server (NTRS)

    Edwards, T. R.; Knight, R. D.

    1972-01-01

    Experimental data frequently have a poor signal-to-noise ratio which one would like to enhance before analysis. With the data in digital form, this may be accomplished by means of a digital filter. A fast digital filter based upon the principle of least squares and using the techniques of convoluting integers is described. In addition to smoothing, this filter also is capable of accurately and simultaneously locating spectral peaks and shoulders. This technique has been adapted into a computer subroutine, and results of several test cases are shown, including mass spectral data and data from a proportional counter for the High Energy Astronomy Observatory.
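    The least-squares "convoluting integers" idea the abstract describes is the classic Savitzky-Golay construction; a 5-point quadratic smoothing window uses the integer weights (-3, 12, 17, 12, -3)/35. The sketch below is our illustration, not the paper's subroutine; it shows the defining property that low-order polynomials pass through unchanged, and companion integer sets for smoothed derivatives are what permit simultaneous peak and shoulder location.

```python
import numpy as np

# 5-point quadratic/cubic Savitzky-Golay smoothing weights (convoluting
# integers divided by their normalizer 35).
coeffs = np.array([-3, 12, 17, 12, -3]) / 35.0

def smooth(y):
    # Convolve and keep only the fully overlapped (central) samples.
    return np.convolve(y, coeffs, mode="valid")

# A quadratic is reproduced exactly by the quadratic least-squares filter,
# so smoothing removes noise without flattening genuine peaks.
x = np.arange(11, dtype=float)
y = 2.0 * x**2 - 3.0 * x + 1.0
print(np.allclose(smooth(y), y[2:-2]))   # True: polynomial preserved
```

    On noisy data the same convolution suppresses the noise while the underlying polynomial shape, and hence peak and shoulder positions recovered from the derivative weights, survives.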

  1. Conversion and matched filter approximations for serial minimum-shift keyed modulation

    NASA Technical Reports Server (NTRS)

    Ziemer, R. E.; Ryan, C. R.; Stilwell, J. H.

    1982-01-01

    Serial minimum-shift keyed (MSK) modulation, a technique for generating and detecting MSK using serial filtering, is ideally suited for high data rate applications provided the required conversion and matched filters can be closely approximated. Low-pass implementations of these filters as parallel inphase- and quadrature-mixer structures are characterized in this paper in terms of signal-to-noise ratio (SNR) degradation from ideal and envelope deviation. Several hardware implementation techniques utilizing microwave devices or lumped elements are presented. Optimization of parameter values results in realizations whose SNR degradation is less than 0.5 dB at an error probability of 10^-6.

  2. Design optimisation of powers-of-two FIR filter using self-organising random immigrants GA

    NASA Astrophysics Data System (ADS)

    Chandra, Abhijit; Chattopadhyay, Sudipta

    2015-01-01

    In this communication, we propose a novel design strategy for a multiplier-less low-pass finite impulse response (FIR) filter using a recent evolutionary optimisation technique known as the self-organising random immigrants genetic algorithm. Individual impulse response coefficients of the proposed filter are encoded as sums of signed powers-of-two. In formulating the cost function for the optimisation algorithm, both the frequency response characteristic and the hardware cost of the discrete-coefficient FIR filter have been considered. The role of the crossover probability of the optimisation technique has been evaluated with respect to the overall performance of the proposed strategy, and the convergence characteristic of the optimisation technique has been included in the simulation results. In our analysis, two design examples with different specifications have been taken into account. To substantiate the efficiency of the proposed structure, a number of state-of-the-art design strategies for multiplier-less FIR filters have also been included for comparison. Critical analysis of the results unambiguously establishes the usefulness of the proposed approach for the hardware-efficient design of digital filters.
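    A coefficient "encoded as a sum of signed powers-of-two" can be formed greedily, as sketched below; shifts and adds then replace multipliers in hardware. The genetic algorithm in the paper instead searches this discrete space against a combined frequency-response and hardware cost, which is beyond a few lines, so only the quantisation step is shown. The function name and parameters are ours.

```python
import numpy as np

# Greedy signed powers-of-two (SPT) encoding of one filter coefficient:
# repeatedly subtract the power of two nearest (in the log domain) to the
# remaining residual, up to a fixed number of nonzero terms.
def spt_quantize(c, terms=2, min_exp=-8):
    out, residual = 0.0, float(c)
    for _ in range(terms):
        if residual == 0.0:
            break
        e = int(np.clip(np.round(np.log2(abs(residual))), min_exp, 0))
        out += float(np.sign(residual)) * 2.0 ** e
        residual = float(c) - out
    return out

print(spt_quantize(0.625))   # 0.5 + 0.125: exact with two terms
```

    Each extra term costs one more shift-and-add in the datapath, which is precisely the hardware-cost component of the paper's fitness function.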

  3. Constructed Pools-and-Riffles: Application and Assessment in Illinois.

    NASA Astrophysics Data System (ADS)

    Day, D. M.; Dodd, H. R.; Carney, D. A.; Holtrop, A. M.; Whiles, M. R.; White, B.; Roseboom, D.; Kinney, W.; Keefer, L. L.; Beardsley, J.

    2005-05-01

    The diversity of Illinois' streams provides a broad range of conditions, and thus a variety of restoration techniques may be required to adequately compensate for watershed alterations. Resource management agencies and research institutions in the state have collaborated on a variety of applied research initiatives to assess the efficacy of various stream protection and restoration techniques. Constructed pool-and-riffle structures have received significant attention because they tend to address watershed processes (i.e., the channel evolution model) and may benefit biotic communities and processes along with physical habitat. Constructed pools-and-riffles have been applied primarily to address geomorphic instability, yet understanding biological responses can provide further rationale for their use and design specifications. In three stream systems around the state, fish were collected pre- and post-installation of structures, primarily using electrofishing techniques (e.g., electric seine and backpack). In general, within the first five years after installation, changes in fish communities have included a shift from high-abundance, small cyprinid-dominated assemblages to low-density Centrarchidae and Catostomidae assemblages. Changes in macroinvertebrates at selected sites included increases in filter feeders and sensitive taxa such as the Ephemeroptera, Plecoptera, and Trichoptera (EPT). Ongoing assessments will be critical for understanding long-term influences on stream ecosystem structure and function.

  4. A Collaborative Location Based Travel Recommendation System through Enhanced Rating Prediction for the Group of Users

    PubMed Central

    Ravi, Logesh; Vairavasundaram, Subramaniyaswamy

    2016-01-01

    Rapid growth of the web and its applications has created a colossal importance for recommender systems. Applied in various domains, recommender systems are designed to generate suggestions, such as items or services, based on user interests. Recommender systems face many issues that diminish their effectiveness; integrating powerful data management techniques into recommender systems can address such issues and significantly increase recommendation quality. Recent research on recommender systems reveals the idea of utilizing social network data to enhance traditional recommender systems with better prediction and improved accuracy. This paper expresses views on social network data based recommender systems by considering the usage of various recommendation algorithms, system functionalities, different types of interfaces, filtering techniques, and artificial intelligence techniques. After examining the objectives, methodologies, and data sources of the existing models, the paper helps anyone interested in the development of travel recommendation systems and facilitates future research directions. We have also proposed a location recommendation system based on the social pertinent trust walker (SPTW) and compared the results with the existing baseline random walk models. Later, we have enhanced the SPTW model for recommendations to groups of users. The results obtained from the experiments are presented. PMID:27069468

  5. Adaptive torque estimation of robot joint with harmonic drive transmission

    NASA Astrophysics Data System (ADS)

    Shi, Zhiguo; Li, Yuankai; Liu, Guangjun

    2017-11-01

    Robot joint torque estimation using input and output position measurements is a promising technique, but the result may be affected by load variation of the joint. In this paper, a torque estimation method with adaptive robustness and optimality adjustment according to load variation is proposed for robot joints with harmonic drive transmission. Based on a harmonic drive model and a redundant adaptive robust Kalman filter (RARKF), the proposed approach can adapt torque estimation filtering optimality and robustness to the load variation by self-tuning the filtering gain and self-switching the filtering mode between optimal and robust. The redundant factor of RARKF is designed as a function of the motor current to tolerate the modeling error and to drive the load-dependent filtering mode switching. The proposed joint torque estimation method has been experimentally studied in comparison with a commercial torque sensor and two representative filtering methods. The results have demonstrated the effectiveness of the proposed torque estimation technique.

  6. Application of an Optimal Tuner Selection Approach for On-Board Self-Tuning Engine Models

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Armstrong, Jeffrey B.; Garg, Sanjay

    2012-01-01

    An enhanced design methodology for minimizing the error in on-line Kalman filter-based aircraft engine performance estimation applications is presented in this paper. It specifically addresses the under-determined estimation problem, in which there are more unknown parameters than available sensor measurements. This work builds upon an existing technique for systematically selecting a model tuning parameter vector of appropriate dimension to enable estimation by a Kalman filter, while minimizing the estimation error in the parameters of interest. While the existing technique was optimized for open-loop engine operation at a fixed design point, in this paper an alternative formulation is presented that enables the technique to be optimized for an engine operating under closed-loop control throughout the flight envelope. The theoretical Kalman filter mean squared estimation error at a steady-state closed-loop operating point is derived, and the tuner selection approach applied to minimize this error is discussed. A technique for constructing a globally optimal tuning parameter vector, which enables full-envelope application of the technology, is also presented, along with design steps for adjusting the dynamic response of the Kalman filter state estimates. Results from the application of the technique to linear and nonlinear aircraft engine simulations are presented and compared to the conventional approach of tuner selection. The new methodology is shown to yield a significant improvement in on-line Kalman filter estimation accuracy.
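    For readers unfamiliar with the underlying estimator, a minimal predict/update Kalman cycle is sketched below. This is the textbook filter the tuner-selection methodology builds on, not the paper's engine model; the matrices and the constant-state example are invented for illustration.

```python
import numpy as np

def kalman_step(x, P, z, A, H, Q, R):
    """One predict/update cycle for x' = A x + w, z = H x + v."""
    x_pred = A @ x
    P_pred = A @ P @ A.T + Q
    S = H @ P_pred @ H.T + R                 # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)      # Kalman gain
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Track a constant scalar state from noisy measurements.
rng = np.random.default_rng(0)
A = np.eye(1); H = np.eye(1)
Q = np.array([[1e-6]]); R = np.array([[0.25]])
x = np.zeros(1); P = np.eye(1)
truth = 3.0
for _ in range(200):
    z = np.array([truth]) + rng.normal(0, 0.5, 1)
    x, P = kalman_step(x, P, z, A, H, Q, R)
```

    The under-determined case discussed in the abstract arises when `H` has fewer rows than `x` has entries, which is what motivates selecting a reduced tuning parameter vector.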

  7. A Personalized QoS Prediction Approach for CPS Service Recommendation Based on Reputation and Location-Aware Collaborative Filtering.

    PubMed

    Kuang, Li; Yu, Long; Huang, Lan; Wang, Yin; Ma, Pengju; Li, Chuanbin; Zhu, Yujia

    2018-05-14

    With the rapid development of cyber-physical systems (CPS), building cyber-physical systems with high quality of service (QoS) has become an urgent requirement in both academia and industry. During the building of cyber-physical systems, it has been found that a large number of functionally equivalent services exist, so recommending suitable services from the many services available in CPS becomes an urgent task. However, since it is time-consuming, and even impractical, for a single user to invoke all of the services in CPS to experience their QoS, a robust QoS prediction method is needed to predict unknown QoS values. A commonly used method in QoS prediction is collaborative filtering; however, it struggles with data sparsity and the cold-start problem, and most existing methods ignore the issue of data credibility. Hence, to solve both of these challenging problems, in this paper we design a framework of QoS prediction for CPS services and propose a personalized QoS prediction approach based on reputation and location-aware collaborative filtering. Our approach first calculates the reputation of users by using the Dirichlet probability distribution, so as to identify untrusted users and process their unreliable data, and then mines the geographic neighborhood at three levels to improve the similarity calculation of users and services. Finally, the data from the geographical neighbors of users and services are fused to predict the unknown QoS values. Experiments using real datasets show that our proposed approach outperforms existing methods in terms of accuracy, efficiency, and robustness.
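    A Dirichlet-based reputation score can be sketched as follows. With two outcome categories ("reliable" vs. "unreliable" QoS reports) the Dirichlet reduces to a Beta posterior, and the reputation is its expected value; the paper's exact formulation and thresholds are not reproduced here, and the counts are invented.

```python
def reputation(n_reliable, n_unreliable, prior=1.0):
    """Posterior mean of a Beta(prior, prior) updated with observed counts."""
    return (n_reliable + prior) / (n_reliable + n_unreliable + 2 * prior)

trusted = reputation(48, 2)      # user whose reports are mostly consistent
untrusted = reputation(3, 17)    # user whose reports are mostly outliers
```

    Users whose score falls below a chosen cutoff would have their contributed QoS data down-weighted or discarded before the collaborative filtering step.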

  8. A Personalized QoS Prediction Approach for CPS Service Recommendation Based on Reputation and Location-Aware Collaborative Filtering

    PubMed Central

    Huang, Lan; Wang, Yin; Ma, Pengju; Li, Chuanbin; Zhu, Yujia

    2018-01-01

    With the rapid development of cyber-physical systems (CPS), building cyber-physical systems with high quality of service (QoS) has become an urgent requirement in both academia and industry. During the building of cyber-physical systems, it has been found that a large number of functionally equivalent services exist, so recommending suitable services from the many services available in CPS becomes an urgent task. However, since it is time-consuming, and even impractical, for a single user to invoke all of the services in CPS to experience their QoS, a robust QoS prediction method is needed to predict unknown QoS values. A commonly used method in QoS prediction is collaborative filtering; however, it struggles with data sparsity and the cold-start problem, and most existing methods ignore the issue of data credibility. Hence, to solve both of these challenging problems, in this paper we design a framework of QoS prediction for CPS services and propose a personalized QoS prediction approach based on reputation and location-aware collaborative filtering. Our approach first calculates the reputation of users by using the Dirichlet probability distribution, so as to identify untrusted users and process their unreliable data, and then mines the geographic neighborhood at three levels to improve the similarity calculation of users and services. Finally, the data from the geographical neighbors of users and services are fused to predict the unknown QoS values. Experiments using real datasets show that our proposed approach outperforms existing methods in terms of accuracy, efficiency, and robustness. PMID:29757995

  9. Complications of inferior vena cava filters.

    PubMed

    Sella, David M; Oldenburg, W Andrew

    2013-03-01

    With the introduction of retrievable inferior vena cava filters, the number being placed for protection from pulmonary embolism is steadily increasing. Despite this increased usage, the true incidence of complications associated with inferior vena cava filters is unknown. This article reviews the known complications associated with these filters and suggests recommendations and techniques for inferior vena cava filter removal. Copyright © 2013. Published by Elsevier Inc.

  10. On selecting satellite conjunction filter parameters

    NASA Astrophysics Data System (ADS)

    Alfano, Salvatore; Finkleman, David

    2014-06-01

    This paper extends concepts of signal detection theory to predict the performance of conjunction screening techniques and to guide the selection of keepout and screening thresholds. The most efficient way to identify satellites likely to collide is to employ filters that identify orbiting pairs that should not come close enough over a prescribed time period to be considered hazardous; such pairings can then be eliminated from further computation to accelerate overall processing time. Approximations inherent in filtering techniques include screening using only unperturbed Newtonian two-body astrodynamics and uncertainties in orbit elements. Therefore, every filtering process is vulnerable to including objects that are not threats and excluding some that are threats: Type I and Type II errors, respectively. The approach in this paper guides selection of the best operating point for the filters, suited to a user's tolerance for false alarms and unwarned threats. We demonstrate the approach using three archetypal filters with an initial three-day span, select filter parameters based on performance, and then test those parameters using eight historical snapshots of the space catalog. This work provides a mechanism for selecting filter parameters, but the choices depend on the circumstances.
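    The Type I/Type II trade-off described above can be sketched by sweeping a keepout threshold over simulated close-approach distances. The distributions below are invented for illustration, not taken from the paper's catalog data.

```python
import numpy as np

rng = np.random.default_rng(1)
threat_miss = np.abs(rng.normal(2.0, 1.5, 5000))      # km, true conjunctions
benign_miss = np.abs(rng.normal(40.0, 15.0, 5000))    # km, safe pairs

def error_rates(threshold_km):
    """Type I (safe pairs flagged) and Type II (threats screened out) rates."""
    type1 = np.mean(benign_miss <= threshold_km)
    type2 = np.mean(threat_miss > threshold_km)
    return type1, type2

loose = error_rates(5.0)    # small keepout: few false alarms, more misses
tight = error_rates(15.0)   # large keepout: fewer misses, more false alarms
```

    Choosing the operating point then amounts to picking the threshold whose (Type I, Type II) pair best matches the user's tolerance for false alarms versus unwarned threats.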

  11. The Performance of A Sampled Data Delay Lock Loop Implemented with a Kalman Loop Filter.

    DTIC Science & Technology

    1980-01-01

    ...que for analysis is computer simulation. Other techniques include state variable techniques and z-transform methods. Since the Kalman filter is linear... Figure 2. Block diagram of the sampled data delay lock loop (SDDLL). Figure 3. Sampled error voltage (Es) as a function of... from a sum of two components. The first component is the previous filtered estimate advanced one step forward by the state transition matrix.

  12. Iterative design of one- and two-dimensional FIR digital filters. [Finite duration Impulse Response

    NASA Technical Reports Server (NTRS)

    Suk, M.; Choi, K.; Algazi, V. R.

    1976-01-01

    The paper describes a new iterative technique for designing FIR (finite duration impulse response) digital filters using a frequency weighted least squares approximation. The technique is as easy to implement (via FFT) and as effective in two dimensions as in one dimension, and there are virtually no limitations on the class of filter frequency spectra approximated. An adaptive adjustment of the frequency weight to achieve other types of design approximation such as Chebyshev type design is discussed.
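    The frequency-weighted least-squares formulation behind such designs can be sketched directly: fit the taps to a desired response on a dense frequency grid, with per-frequency weights. This is the generic closed-form version, not the paper's iterative FFT scheme or its adaptive Chebyshev-style weight adjustment; the filter specification below is invented.

```python
import numpy as np

def weighted_ls_fir(n_taps, desired_mag, weights, grid):
    """Weighted LS fit of real FIR taps to a linear-phase magnitude target."""
    delay = (n_taps - 1) / 2
    D = desired_mag * np.exp(-1j * grid * delay)          # linear-phase target
    E = np.exp(-1j * np.outer(grid, np.arange(n_taps)))   # DTFT matrix
    w = np.sqrt(weights)
    # stack real/imag parts so the unknown taps stay real
    A = np.vstack([(w[:, None] * E).real, (w[:, None] * E).imag])
    b = np.concatenate([(w * D).real, (w * D).imag])
    h, *_ = np.linalg.lstsq(A, b, rcond=None)
    return h

grid = np.linspace(0, np.pi, 256)
desired = (grid < 0.4 * np.pi).astype(float)        # ideal low-pass
weights = np.where(grid > 0.5 * np.pi, 10.0, 1.0)   # press down the stopband
h = weighted_ls_fir(31, desired, weights, grid)
H = np.abs(np.exp(-1j * np.outer(grid, np.arange(31))) @ h)
```

    The same construction extends to two dimensions by replacing the 1-D frequency grid with a 2-D one, which is the setting the paper emphasizes.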

  13. Finite word length effects on digital filter implementation.

    NASA Technical Reports Server (NTRS)

    Bowman, J. D.; Clark, F. H.

    1972-01-01

    This paper is a discussion of two known techniques to analyze finite word length effects on digital filters. These techniques are extended to several additional programming forms and the results verified experimentally. A correlation of the analytical weighting functions for the two methods is made through the Mason Gain Formula.
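    One finite word length effect is easy to demonstrate: quantizing FIR coefficients to a fixed number of fractional bits perturbs the frequency response, and coarser word lengths perturb it more. The filter taps and word lengths below are illustrative only, not drawn from the paper.

```python
import numpy as np

def quantize(h, bits):
    """Round coefficients to the nearest multiple of 2^-bits."""
    step = 2.0 ** -bits
    return np.round(np.asarray(h) / step) * step

h = np.array([0.0376, 0.2409, 0.4437, 0.2409, 0.0376])  # toy low-pass taps
w = np.linspace(0, np.pi, 512)

def response(taps):
    """Magnitude response on the grid w."""
    return np.abs(np.exp(-1j * np.outer(w, np.arange(len(taps)))) @ taps)

# worst-case response deviation at 8 and 4 fractional bits
err8 = np.max(np.abs(response(quantize(h, 8)) - response(h)))
err4 = np.max(np.abs(response(quantize(h, 4)) - response(h)))
```

    The deviation is bounded by the sum of the per-tap quantization errors, which is why halving the word length visibly degrades the response while roundoff at 8 bits is nearly invisible.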

  14. Beyond Information Retrieval: Ways To Provide Content in Context.

    ERIC Educational Resources Information Center

    Wiley, Deborah Lynne

    1998-01-01

    Provides an overview of information retrieval from mainframe systems to Web search engines; discusses collaborative filtering, data extraction, data visualization, agent technology, pattern recognition, classification and clustering, and virtual communities. Argues that rather than huge data-storage centers and proprietary software, we need…

  15. Effects of high-order correlations on personalized recommendations for bipartite networks

    NASA Astrophysics Data System (ADS)

    Liu, Jian-Guo; Zhou, Tao; Che, Hong-An; Wang, Bing-Hong; Zhang, Yi-Cheng

    2010-02-01

    In this paper, we introduce a modified collaborative filtering (MCF) algorithm with remarkably higher accuracy than standard collaborative filtering. In the MCF, instead of the cosine similarity index, the user-user correlations are obtained by a diffusion process. Furthermore, by considering the second-order correlations, we design an effective algorithm that suppresses the influence of mainstream preferences. Simulation results show that the algorithmic accuracy, measured by the average ranking score, is further improved by 20.45% and 33.25% in the optimal cases of the MovieLens and Netflix data. More importantly, the optimal value λ depends approximately monotonically on the sparsity of the training set; given a real system, we can estimate the optimal parameter from the data sparsity, which makes this algorithm easy to apply. In addition, two significant criteria of algorithmic performance, diversity and popularity, are also taken into account. Numerical results show that as the sparsity increases, the algorithm considering the second-order correlations can outperform the MCF in all three criteria simultaneously.
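    The diffusion idea can be sketched on a toy binary user-item matrix: user-user weights come from a two-step mass-diffusion walk (user → items → users) rather than cosine similarity. The exact normalizations and the second-order correction of the paper are not reproduced; the matrix below is invented.

```python
import numpy as np

# rows = users, columns = items; 1 means the user collected the item
A = np.array([[1, 1, 0, 0, 1],
              [1, 1, 1, 0, 0],
              [0, 1, 1, 1, 0],
              [1, 0, 0, 1, 1]], dtype=float)

k_user = A.sum(axis=1)               # user degrees
k_item = A.sum(axis=0)               # item degrees

# two-step diffusion: spread each user's unit resource over their items,
# then back to the users who share those items
S = (A / k_item) @ A.T / k_user[:, None]

# score unseen items by diffusion-weighted neighbours' choices
scores = S @ A
scores[A > 0] = -np.inf              # mask items already collected
top_for_user0 = int(np.argmax(scores[0]))
```

    Because the diffusion normalizes by item degree, popular (mainstream) items contribute less to the user-user weights than they would under cosine similarity.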

  16. Towards collaborative filtering recommender systems for tailored health communications.

    PubMed

    Marlin, Benjamin M; Adams, Roy J; Sadasivam, Rajani; Houston, Thomas K

    2013-01-01

    The goal of computer tailored health communications (CTHC) is to promote healthy behaviors by sending messages tailored to individual patients. Current CTHC systems collect baseline patient "profiles" and then use expert-written, rule-based systems to target messages to subsets of patients. Our main interest in this work is the study of collaborative filtering-based CTHC systems that can learn to tailor future message selections to individual patients based on explicit feedback about past message selections. This paper reports the results of a study designed to collect explicit feedback (ratings) regarding four aspects of messages from 100 subjects in the smoking cessation support domain. Our results show that most users have positive opinions of most messages and that the ratings for all four aspects of the messages are highly correlated with each other. Finally, we conduct a range of rating prediction experiments comparing several different model variations. Our results show that predicting future ratings based on each user's past ratings contributes the most to predictive accuracy.

  17. Towards Collaborative Filtering Recommender Systems for Tailored Health Communications

    PubMed Central

    Marlin, Benjamin M.; Adams, Roy J.; Sadasivam, Rajani; Houston, Thomas K.

    2013-01-01

    The goal of computer tailored health communications (CTHC) is to promote healthy behaviors by sending messages tailored to individual patients. Current CTHC systems collect baseline patient “profiles” and then use expert-written, rule-based systems to target messages to subsets of patients. Our main interest in this work is the study of collaborative filtering-based CTHC systems that can learn to tailor future message selections to individual patients based on explicit feedback about past message selections. This paper reports the results of a study designed to collect explicit feedback (ratings) regarding four aspects of messages from 100 subjects in the smoking cessation support domain. Our results show that most users have positive opinions of most messages and that the ratings for all four aspects of the messages are highly correlated with each other. Finally, we conduct a range of rating prediction experiments comparing several different model variations. Our results show that predicting future ratings based on each user’s past ratings contributes the most to predictive accuracy. PMID:24551430

  18. Probability-based collaborative filtering model for predicting gene-disease associations.

    PubMed

    Zeng, Xiangxiang; Ding, Ningxiang; Rodríguez-Patón, Alfonso; Zou, Quan

    2017-12-28

    Accurately predicting pathogenic human genes has been challenging in recent research. Considering extensive gene-disease data verified by biological experiments, we can apply computational methods to perform accurate predictions with reduced time and expense. We propose a probability-based collaborative filtering model (PCFM) to predict pathogenic human genes. Several kinds of data sets, containing data of humans and data of other nonhuman species, are integrated in our model. Firstly, on the basis of a typical latent factorization model, we propose model I with an average heterogeneous regularization. Secondly, we develop a modified model II with personal heterogeneous regularization to enhance the accuracy of the aforementioned model. In this model, vector space similarity or Pearson correlation coefficient metrics and data on related species are also used. We compared the results of PCFM with those of four state-of-the-art approaches. The results show that PCFM performs better than these advanced approaches. The PCFM model can be leveraged for prediction of disease genes, especially for new human genes or diseases with no known relationships.
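    The latent factorization baseline that PCFM builds on can be sketched on a toy association matrix: factor the observed entries as R ≈ U Vᵀ with L2 regularization and predict the held-out pairs. The heterogeneous regularization terms of the paper are not reproduced; the matrix, mask, and hyperparameters below are invented.

```python
import numpy as np

rng = np.random.default_rng(0)
R = np.array([[1, 0, 1],
              [1, 1, 0],
              [0, 1, 1.]])          # toy gene-disease associations
mask = np.array([[1, 1, 1],
                 [1, 1, 0],
                 [1, 1, 1.]])       # 0 marks the unobserved pair to predict

k, lam, lr = 2, 0.01, 0.1
U = 0.1 * rng.standard_normal((3, k))
V = 0.1 * rng.standard_normal((3, k))
for _ in range(1000):
    E = mask * (R - U @ V.T)        # error on observed entries only
    U += lr * (E @ V - lam * U)     # gradient step with L2 shrinkage
    V += lr * (E.T @ U - lam * V)

pred = U @ V.T                      # pred[1, 2] scores the unobserved pair
```

    The masked entry receives a score from the learned low-rank structure alone, which is how such models rank candidate gene-disease links that have never been observed.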

  19. Making sense of sparse rating data in collaborative filtering via topographic organization of user preference patterns.

    PubMed

    Polcicová, Gabriela; Tino, Peter

    2004-01-01

    We introduce topographic versions of two latent class models (LCM) for collaborative filtering. Latent classes are topologically organized on a square grid. Topographic organization of latent classes makes orientation in rating/preference patterns captured by the latent classes easier and more systematic. The variation in film rating patterns is modelled by multinomial and binomial distributions with varying independence assumptions. In the first stage of topographic LCM construction, self-organizing maps with a neural field organized according to the LCM topology are employed. We apply our system to a large collection of user ratings for films. The system can provide useful visualization plots unveiling user preference patterns buried in the data, without losing its potential to be a good recommender model. It appears that the multinomial distribution is most adequate if the model is regularized by tight grid topologies. Since we deal with probabilistic models of the data, we can readily use tools from probability and information theories to interpret and visualize information extracted by our system.

  20. Full complex spatial filtering with a phase mostly DMD. [Deformable Mirror Device

    NASA Technical Reports Server (NTRS)

    Florence, James M.; Juday, Richard D.

    1991-01-01

    A new technique for implementing fully complex spatial filters with a phase mostly deformable mirror device (DMD) light modulator is described. The technique combines two or more phase-modulating flexure-beam mirror elements into a single macro-pixel. By manipulating the relative phases of the individual sub-pixels within the macro-pixel, the amplitude and the phase can be independently set for this filtering element. The combination of DMD sub-pixels into a macro-pixel is accomplished by adjusting the optical system resolution, thereby trading off system space bandwidth product for increased filtering flexibility. Volume in the larger dimensioned space, space bandwidth-complex axes count, is conserved. Experimental results are presented mapping out the coupled amplitude and phase characteristics of the individual flexure-beam DMD elements and demonstrating the independent control of amplitude and phase in a combined macro-pixel. This technique is generally applicable for implementation with any type of phase modulating light modulator.
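    The macro-pixel principle above has a compact algebraic form: averaging two unit-amplitude phase-only elements e^{iφ₁} and e^{iφ₂} yields any complex value of amplitude ≤ 1, since (e^{i(φ+δ)} + e^{i(φ−δ)})/2 = cos(δ)·e^{iφ}. The sketch below illustrates only this two-subpixel case; the target values are arbitrary.

```python
import numpy as np

def macropixel_phases(amplitude, phase):
    """Phases of two unit subpixels whose mean equals amplitude * e^{i*phase}."""
    if not 0.0 <= amplitude <= 1.0:
        raise ValueError("macro-pixel amplitude is limited to [0, 1]")
    delta = np.arccos(amplitude)          # half the phase split
    return phase + delta, phase - delta

a, phi = 0.5, 0.3
p1, p2 = macropixel_phases(a, phi)
combined = 0.5 * (np.exp(1j * p1) + np.exp(1j * p2))   # = a * e^{i*phi}
```

    Setting δ = 0 gives full amplitude, and δ = π/2 extinguishes the macro-pixel, so amplitude and phase are controlled independently, at the cost of optical resolution as the text describes.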

  1. Measurement of Two-Plasmon-Decay Dependence on Plasma Density Scale Length

    NASA Astrophysics Data System (ADS)

    Haberberger, D.

    2013-10-01

    An accurate understanding of the plasma scale-length (Lq) conditions near quarter-critical density is important in quantifying the hot electrons generated by the two-plasmon-decay (TPD) instability in long-scale-length plasmas. A novel target platform was developed to vary the density scale length and an innovative diagnostic was implemented to measure the density profiles above 1021 cm-3 where TPD is expected to have the largest growth. A series of experiments was performed using the four UV (351-nm) beams on OMEGA EP that varied the Lq by changing the radius of curvature of the target while maintaining a constant Iq/Tq. The fraction of laser energy converted to hot electrons (fhot) was observed to increase rapidly from 0.005% to 1% by increasing the plasma scale length from 130 μm to 300 μm, corresponding to target diameters of 0.4 mm to 8 mm. A new diagnostic was developed based on refractometry using angular spectral filters to overcome the large phase accumulation in standard interferometric techniques. The angular filter refractometer measures the refraction angles of a 10-ps, 263-nm probe laser after propagating through the plasma. An angular spectral filter is used in the Fourier plane of the probe beam, where the refractive angles of the rays are mapped to space. The edges of the filter are present in the image plane and represent contours of constant refraction angle. These contours are used to infer the phase of the probe beam, which are used to calculate the plasma density profile. In long-scale-length plasmas, the diagnostic currently measures plasma densities from ~1019 cm-3 to ~2 × 1021 cm-3. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944. In collaboration with D. H. Edgell, S. X. Hu, S. Ivancic, R. Boni, C. Dorrer, and D. H. Froula (Laboratory for Laser Energetics, U. of Rochester).

  2. Adaptive filtering of GOCE-derived gravity gradients of the disturbing potential in the context of the space-wise approach

    NASA Astrophysics Data System (ADS)

    Piretzidis, Dimitrios; Sideris, Michael G.

    2017-09-01

    Filtering and signal processing techniques have been widely used in the processing of satellite gravity observations to reduce measurement noise and correlation errors. The parameters and types of filters used depend on the statistical and spectral properties of the signal under investigation. Filtering is usually applied in a non-real-time environment. The present work focuses on the implementation of an adaptive filtering technique to process satellite gravity gradiometry data for gravity field modeling. Adaptive filtering algorithms are commonly used in communication systems, noise and echo cancellation, and biomedical applications. Two independent studies have been performed to introduce adaptive signal processing techniques and test the performance of the least mean squares (LMS) adaptive algorithm for filtering satellite measurements obtained by the gravity field and steady-state ocean circulation explorer (GOCE) mission. In the first study, a Monte Carlo simulation is performed in order to gain insights about the implementation of the LMS algorithm on data with spectral behavior close to that of real GOCE data. In the second study, the LMS algorithm is implemented on real GOCE data. Experiments are also performed to determine suitable filtering parameters. Only the four accurate components of the full GOCE gravity gradient tensor of the disturbing potential are used. The characteristics of the filtered gravity gradients are examined in the time and spectral domain. The obtained filtered GOCE gravity gradients show an agreement of 63-84 mEötvös (depending on the gravity gradient component), in terms of RMS error, when compared to the gravity gradients derived from the EGM2008 geopotential model. Spectral-domain analysis of the filtered gradients shows that the adaptive filters slightly suppress frequencies in the bandwidth of approximately 10-30 mHz. The limitations of the adaptive LMS algorithm are also discussed. The tested filtering algorithm can be connected to and employed in the first computational steps of the space-wise approach, where a time-wise Wiener filter is applied at the first stage of GOCE gravity gradient filtering. The results of this work can be extended to other adaptive filtering algorithms, such as the recursive least-squares and recursive least-squares lattice filters.
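    A minimal LMS canceller is sketched below: the weights adapt so that a reference channel correlated with the noise predicts the noise in the measurement, and subtracting the prediction leaves the signal. The GOCE processing chain itself is far more involved; the signal, noise model, step size, and tap count here are invented for illustration.

```python
import numpy as np

def lms_filter(d, x, n_taps=8, mu=0.05):
    """Adapt w so that w . [x[n], ..., x[n-n_taps+1]] tracks d[n]."""
    w = np.zeros(n_taps)
    y = np.zeros(len(d))
    for n in range(n_taps - 1, len(d)):
        u = x[n - n_taps + 1:n + 1][::-1]   # newest reference sample first
        y[n] = w @ u
        e = d[n] - y[n]                     # error drives the weight update
        w += 2 * mu * e * u
    return y, w

rng = np.random.default_rng(2)
t = np.arange(4000)
signal = np.sin(2 * np.pi * t / 50)
noise = rng.normal(0, 1, t.size)
d = signal + noise                          # noisy measurement
x = np.convolve(noise, [0.8, 0.3], mode="same")  # noise-correlated reference
y, w = lms_filter(d, x)                     # y estimates the noise component
cleaned = d - y
```

    The step size `mu` sets the trade-off the abstract alludes to: larger values adapt faster but leave more excess error, which is one of the "suitable filtering parameters" the experiments search for.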

  3. Adaptive Filter Techniques for Optical Beam Jitter Control and Target Tracking

    DTIC Science & Technology

    2008-12-01

    Adaptive Filter Techniques for Optical Beam Jitter Control and Target Tracking, by Michael J. Beerer, Civilian, United States Air Force; B.S., University of California Irvine, 2006. Master's Thesis, December 2008. Thesis Advisor: Brij N. Agrawal.

  4. Theatre Ballistic Missile Defense-Multisensor Fusion, Targeting and Tracking Techniques

    DTIC Science & Technology

    1998-03-01

    Washington, D.C., 1994. 8. Brown, R., and Hwang, P., Introduction to Random Signals and Applied Kalman Filtering, Third Edition, John Wiley and Sons... C. ADDING MEASUREMENT NOISE; III. EXTENDED KALMAN FILTER; A. DISCRETE TIME KALMAN FILTER; B. EXTENDED KALMAN FILTER; C. EKF IN TARGET... tracking algorithms. ... EXTENDED KALMAN FILTER: This chapter provides background information on the development of a tracking algorithm

  5. Survey of digital filtering

    NASA Technical Reports Server (NTRS)

    Nagle, H. T., Jr.

    1972-01-01

    A three part survey is made of the state-of-the-art in digital filtering. Part one presents background material including sampled data transformations and the discrete Fourier transform. Part two, digital filter theory, gives an in-depth coverage of filter categories, transfer function synthesis, quantization and other nonlinear errors, filter structures and computer aided design. Part three presents hardware mechanization techniques. Implementations by general purpose, mini-, and special-purpose computers are presented.

  6. Comparison of weighting techniques for acoustic full waveform inversion

    NASA Astrophysics Data System (ADS)

    Jeong, Gangwon; Hwang, Jongha; Min, Dong-Joo

    2017-12-01

    To reconstruct long-wavelength structures in full waveform inversion (FWI), wavefield-damping and weighting techniques have been used to synthesize and emphasize low-frequency data components in frequency-domain FWI. However, these methods have some weak points: applying the wavefield-damping method to filtered data fails to synthesize reliable low-frequency data, and the optimization formula obtained by introducing the weighting technique is not theoretically complete, because it is not directly derived from the objective function. In this study, we address these weak points and show how to overcome them. We demonstrate that source estimation in FWI using damped wavefields fails when the data used in the FWI process do not satisfy the causality condition, a phenomenon that occurs when a non-causal filter is applied to the data. We overcome this limitation by designing a causal filter. We also modify the conventional weighting technique so that its optimization formula is directly derived from the objective function, retaining its original characteristic of emphasizing the low-frequency data components. Numerical results show that the newly designed causal filter makes it possible to recover long-wavelength structures using low-frequency data components synthesized by damping wavefields in frequency-domain FWI, and that the proposed weighting technique enhances the inversion results.

  7. Improving liquid chromatography-tandem mass spectrometry determinations by modifying noise frequency spectrum between two consecutive wavelet-based low-pass filtering procedures.

    PubMed

    Chen, Hsiao-Ping; Liao, Hui-Ju; Huang, Chih-Min; Wang, Shau-Chun; Yu, Sung-Nien

    2010-04-23

    This paper employs a chemometric technique that modifies the noise spectrum of a liquid chromatography-tandem mass spectrometry (LC-MS/MS) chromatogram between two consecutive wavelet-based low-pass filtering procedures to improve peak signal-to-noise (S/N) ratio enhancement. Although similar techniques using other sets of low-pass procedures, such as matched filters, have been published, the procedures developed in this work avoid the peak-broadening disadvantages inherent in matched filters. In addition, unlike Fourier transform-based low-pass filters, wavelet-based filters efficiently reject noise in the chromatograms directly in the time domain without distorting the original signals. The low-pass filtering procedures sequentially convolve the original chromatograms against each set of low-pass filters to obtain the approximation coefficients, representing the low-frequency wavelets, of the first five resolution levels; the tedious trials of setting threshold values to properly shrink each wavelet are therefore no longer required. The noise-modification step multiplies one wavelet-based low-pass-filtered LC-MS/MS chromatogram by an artificial chromatogram with added thermal noise before the second wavelet-based low-pass filter is applied. Because a low-pass filter cannot eliminate frequency components below its cut-off frequency, consecutive low-pass filtering alone cannot accomplish more efficient peak S/N ratio improvement in LC-MS/MS chromatograms. In contrast, when the low-pass-filtered chromatogram is conditioned with the multiplication before the second low-pass filter, much better improvement is achieved: the multiplication alters the noise spectrum of the filtered chromatogram, which originally contains frequency components below the filter cut-off frequency, to span a broader range. When this modified noise spectrum shifts toward higher frequencies, the second low-pass filter provides better filtering efficiency and higher peak S/N ratios. For real LC-MS/MS chromatograms, two consecutive wavelet-based low-pass filters typically achieve less than a 6-fold peak S/N ratio improvement, no better than a single wavelet-based low-pass filtering step; when the noise frequency spectrum is modified between the two filters, improvements of typically 25- to 40-fold are accomplished. The linear standard curves using the filtered LC-MS/MS signals are validated, and the filtered signals are reproducible. Determinations of very low concentration samples (S/N ratio about 7-9) are more accurate using the filtered signals than using the original signals. Copyright 2010 Elsevier B.V. All rights reserved.
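    The basic wavelet low-pass step described above can be sketched with a single-level Haar decomposition: keep the approximation (low-frequency) coefficients, zero the details, and reconstruct. Real chromatograms would use deeper decompositions and other wavelet families; the peak shape and noise level here are synthetic.

```python
import numpy as np

def haar_lowpass(x):
    """One Haar analysis/synthesis level with the detail band zeroed."""
    x = np.asarray(x, dtype=float)
    approx = (x[0::2] + x[1::2]) / np.sqrt(2)   # low-pass (approximation) branch
    # synthesis with detail coefficients set to zero
    return np.repeat(approx / np.sqrt(2), 2)

t = np.arange(512)
peak = np.exp(-0.5 * ((t - 256) / 12.0) ** 2)   # synthetic chromatographic peak
rng = np.random.default_rng(3)
noisy = peak + rng.normal(0, 0.2, t.size)
smoothed = haar_lowpass(noisy)

baseline_noise_before = noisy[:128].std()       # crude baseline noise estimate
baseline_noise_after = smoothed[:128].std()
```

    Each level discards one octave of high-frequency noise while leaving the broad peak nearly intact, which is why no per-coefficient thresholding is needed in the scheme described above.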

  8. Percutaneous Retrieval of Permanent Inferior Vena Cava Filters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tamrazi, Anobel, E-mail: atamraz1@jhmi.edu; Wadhwa, Vibhor, E-mail: vwadhwa1@jhmi.edu; Holly, Brian, E-mail: bholly3@jhmi.edu

    Purpose: To evaluate the feasibility, risks, and techniques of percutaneous removal of permanent TrapEase and Simon Nitinol IVC filters. Materials and Methods: Between August 2011 and August 2015, 12 patients (5 women, 7 men; age range, 26–75 years) underwent an attempt at percutaneous removal of permanent TrapEase (10) and Simon Nitinol (2) IVC filters due to a history of IVC filter complications or need for lifelong anticoagulation due to the filter. Medical records were reviewed for filter dwell time, presence of iliocaval deep venous thrombosis, procedural technique, and complications. Results: Filter dwell times ranged from 7 days to 15 years (mean 5.1 years). Successful removal of permanent IVC filters was possible in 11 of 12 patients (91.6%). In 1 patient, a chronically thrombosed IVC filter could not be removed despite laser sheath assistance, but was successfully recanalized with the PowerWire RF guidewire. In the failed retrieval attempt, a stent was placed through the chronically thrombosed IVC filter with restoration of in-line flow. One major complication of large venous groin hematoma was encountered. Conclusions: In carefully selected patients, percutaneous removal of permanent IVC filters can be performed safely despite prolonged filter dwell times. Extraction of chronically embedded permanent IVC filters may be facilitated by jugular and femoral approaches, often with laser sheath assistance. Chronic filter thrombosis and caval scarring may increase the risk of retrieval failure.

  9. The Application of Collaborative Business Intelligence Technology in the Hospital SPD Logistics Management Model.

    PubMed

    Liu, Tongzhu; Shen, Aizong; Hu, Xiaojian; Tong, Guixian; Gu, Wei

    2017-06-01

    We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD- and BI-related theories and the recent research status. We applied collaborative BI technology to the hospital SPD logistics management model by leveraging data mining techniques to discover knowledge from complex data and collaborative techniques to improve the theories of business process. For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm-intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; (iv) researched collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. Proper combination of the SPD model and BI system will improve the management of logistics in hospitals. Successful implementation of the study requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situations of hospitals; (ii) the collaborative participation of internal hospital departments, including information, logistics, nursing, medical and financial departments; (iii) timely response of external suppliers.

  10. An Adaptive Kalman Filter Using a Simple Residual Tuning Method

    NASA Technical Reports Server (NTRS)

    Harman, Richard R.

    1999-01-01

    One difficulty in using Kalman filters in real-world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate them. Most of these methods, such as maximum likelihood, subspace, and observer/Kalman identification, require extensive offline processing and are not suitable for real-time use. One technique that is suitable for real-time processing is the residual tuning method. Any mismodeling of the filter tuning parameters results in a non-white sequence of filter measurement residuals. The residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation results in a set of sequential equations that run in parallel with the Kalman filter. A. H. Jazwinski developed a specialized version of this technique for estimation of process noise. Equations for the estimation of the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
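
    The residual-tuning idea can be sketched in a few lines: if the innovations are larger (or smaller) than the predicted innovation covariance explains, the noise parameters are corrected online. The scalar example below, which adapts only the measurement noise R from a sliding window of residuals, is a hypothetical illustration, not the WIRE flight algorithm.

```python
import numpy as np

def adaptive_kalman(zs, q=1e-4, r_init=1.0, window=30):
    """Scalar Kalman filter tracking a constant, with residual-based
    tuning of the measurement noise R.

    Since E[nu^2] = P_pred + R for the innovation nu, a sliding-window
    variance of the residuals gives a running correction for R.
    """
    x, p, r = 0.0, 1.0, r_init
    residuals, estimates = [], []
    for z in zs:
        p_pred = p + q                    # time update (constant state)
        nu = z - x                        # measurement residual (innovation)
        residuals.append(nu)
        if len(residuals) >= window:      # residual tuning step
            r = max(np.var(residuals[-window:]) - p_pred, 1e-12)
        k = p_pred / (p_pred + r)         # Kalman gain with tuned R
        x += k * nu
        p = (1 - k) * p_pred
        estimates.append(x)
    return np.array(estimates), r
```

    Run on synthetic measurements, the recovered R approaches the true measurement noise variance without any offline processing, which is the property the residual method exploits.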

  11. Absorption Filter Based Optical Diagnostics in High Speed Flows

    NASA Technical Reports Server (NTRS)

    Samimy, Mo; Elliott, Gregory; Arnette, Stephen

    1996-01-01

    Two major regimes in which laser light scattered by molecules or particles in a flow carries significant information about the flow are Mie scattering and Rayleigh scattering. Mie scattering is used to obtain only velocity information, while Rayleigh scattering can be used to measure both the velocity and the thermodynamic properties of the flow. Recently introduced (1990, 1991) absorption-filter-based diagnostic techniques have started a new era in flow visualization, simultaneous velocity and thermodynamic measurements, and planar velocity measurements. Using a filtered planar velocimetry (FPV) technique, we have modified the optically thick iodine filter profile of Miles et al. and used it in the pressure-broadened regime, which accommodates measurements in a wide range of velocity applications. Measuring velocity and thermodynamic properties simultaneously using absorption-filter-based Rayleigh scattering involves measuring not only the Doppler shift but also the spectral profile of the Rayleigh scattering signal. Using multiple observation angles, one velocity component and the thermodynamic properties of a supersonic jet were measured simultaneously. Presently, the technique is being extended to simultaneous measurements of all three velocity components and the thermodynamic properties.

  12. Doppler lidar wind measurement with the edge technique

    NASA Technical Reports Server (NTRS)

    Korb, C. Laurence; Gentry, Bruce M.

    1992-01-01

    The edge technique is a new and powerful method for measuring small frequency shifts. Range resolved lidar measurements of winds can be made with high accuracy and high vertical resolution using the edge technique to measure the Doppler shift of an atmospheric backscattered signal from a pulsed laser. The edge technique can be used at near-infrared or visible wavelengths using well developed solid state lasers and detectors with various edge filters. In the edge technique, the laser frequency is located on the steep slope of the spectral response function of a high resolution optical filter. Due to the steep slope of the edge, very small frequency shifts cause large changes in measured signal. The frequency of the outgoing laser pulse is determined by measuring its location on the edge of the filter. This is accomplished by sending a small portion of the beam to the edge detection setup where the incoming light is split into two channels - an edge filter and an energy monitor channel. The energy monitor signal is used to normalize the edge filter signal for magnitude. The laser return backscattered from the atmosphere is collected by a telescope and directed through the edge detection setup to determine its frequency (location on the edge) in a similar manner for each range element. The Doppler shift, and thus the wind, is determined from a differential measurement of the frequency of the outgoing laser pulse and the frequency of the laser return backscattered from the atmosphere. We have conducted simulations of the performance of an edge lidar system using an injection seeded pulsed Nd:YAG laser at 1.06 microns. The central fringe of a Fabry-Perot etalon is used as a high resolution edge filter to measure the shift of the aerosol return.
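
    The edge measurement reduces to simple arithmetic: a frequency sits at a unique point on the filter's steep slope, so a normalized intensity maps directly to frequency, and the Doppler shift is the difference of two such measurements. The sketch below assumes an idealized linear edge in arbitrary units; the linear response is illustrative, not the Fabry-Perot fringe profile used in the actual system.

```python
import numpy as np

# Hypothetical linear edge filter: transmission rises 0 -> 1 across the band
F_LO, F_HI = 0.0, 1.0          # edge span, arbitrary frequency units

def edge_transmission(f):
    """Idealized steep-slope filter response."""
    return np.clip((f - F_LO) / (F_HI - F_LO), 0.0, 1.0)

def measure_frequency(energy, edge_signal):
    """Locate a signal on the edge: the energy-monitor channel normalizes
    the edge-channel signal for magnitude, then the response is inverted."""
    t = edge_signal / energy
    return F_LO + t * (F_HI - F_LO)

def doppler_shift(f_outgoing, f_return, energy=1.0):
    """Differential measurement: outgoing pulse vs. atmospheric return."""
    s_out = energy * edge_transmission(f_outgoing)
    s_ret = energy * edge_transmission(f_return)
    return measure_frequency(energy, s_ret) - measure_frequency(energy, s_out)
```

    Because the shift is differential, slow drifts of the laser frequency on the edge cancel to first order, which is why the outgoing pulse is measured on every shot.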

  13. Imaging Multi-Order Fabry-Perot Spectrometer (IMOFPS) for spaceborne measurements of CO

    NASA Astrophysics Data System (ADS)

    Johnson, Brian R.; Kampe, Thomas U.; Cook, William B.; Miecznik, Grzegorz; Novelli, Paul C.; Snell, Hilary E.; Turner-Valle, Jennifer A.

    2003-11-01

    An instrument concept for an Imaging Multi-Order Fabry-Perot Spectrometer (IMOFPS) has been developed for measuring tropospheric carbon monoxide (CO) from space. The concept is based upon a correlation technique similar in nature to multi-order Fabry-Perot (FP) interferometer or gas filter radiometer techniques, which simultaneously measure atmospheric emission from several infrared vibration-rotation lines of CO. Correlation techniques provide a multiplex advantage for increased throughput, high spectral resolution and the selectivity necessary for profiling tropospheric CO. Use of unconventional multilayer interference filter designs improves CO spectral line correlation compared with the traditional FP multi-order technique, approaching the theoretical performance of gas filter correlation radiometry. In this implementation, however, the gas cell is replaced with a simple, robust solid interference filter. In addition to measuring CO, the correlation filter technique can be applied to measurements of other important gases such as carbon dioxide, nitrous oxide and methane. Imaging the scene onto a 2-D detector array enables a limited range of spectral sampling owing to the field-angle dependence of the filter transmission function. An innovative anamorphic optical system provides a relatively large instrument field of view for imaging along the orthogonal direction across the detector array. An important advantage of the IMOFPS concept is that it is a small, low-mass, high-spectral-resolution spectrometer with no moving parts. A small correlation spectrometer like IMOFPS would be well suited for global observations of CO2, CO, and CH4 from low Earth orbit or regional observations from geostationary orbit. A prototype instrument is in development for flight demonstration on an airborne platform, with potential applications to atmospheric chemistry, wildfire and biomass burning, and chemical dispersion monitoring.

  14. An ultra-low-power filtering technique for biomedical applications.

    PubMed

    Zhang, Tan-Tan; Mak, Pui-In; Vai, Mang-I; Mak, Peng-Un; Wan, Feng; Martins, R P

    2011-01-01

    This paper describes an ultra-low-power filtering technique for biomedical applications such as T-wave sensing in heart-activity detection systems. The topology is based on a source-follower-based Biquad operating in the sub-threshold region. With the intrinsic simplicity and high linearity of the source follower, ultra-low-cutoff filtering can be achieved simultaneously with ultra-low power consumption and good linearity. An 8th-order 2.4-Hz lowpass filter design example, optimized in a 0.35-μm CMOS process, achieves over 85-dB dynamic range and 74-dB stopband attenuation while consuming only 0.36 nW from a 3-V supply.
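
    Although the paper's filter is an analog sub-threshold circuit, the overall structure, an 8th-order low-pass at 2.4 Hz realized as cascaded second-order (biquad) sections, can be sketched digitally. The sampling rate and the Butterworth response below are assumptions for illustration only, not the paper's transfer function.

```python
import numpy as np
from scipy.signal import butter, sosfilt

FS = 250.0   # assumed sampling rate (Hz) for a digitized ECG channel

# 8th-order low-pass at 2.4 Hz, factored into four second-order
# (biquad) sections, mirroring the cascaded-Biquad analog topology
SOS = butter(8, 2.4, btype="low", fs=FS, output="sos")

def lowpass_2p4hz(ecg):
    """Apply the biquad cascade to isolate very-low-frequency content
    such as the T-wave band."""
    return sosfilt(SOS, ecg)
```

    The second-order-sections form is the numerically robust way to realize a high-order filter, for the same reason the analog design cascades biquads rather than building one 8th-order stage.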

  15. Rotational response of suspended particles to turbulent flow: laboratory and numerical synthesis

    NASA Astrophysics Data System (ADS)

    Variano, Evan; Zhao, Lihao; Byron, Margaret; Bellani, Gabriele; Tao, Yiheng; Andersson, Helge

    2014-11-01

    Using laboratory and DNS measurements, we consider how aspherical and inertial particles suspended in a turbulent flow act to "filter" the fluid-phase vorticity. We use three approaches to predict the magnitude and structure of this filter. The first approach is based on Buckingham's Pi theorem, which shows a clear result for the relationship between filter strength and particle aspect ratio. Results are less clear for the dependence of filter strength on Stokes number; we briefly discuss some issues in the proper definition of Stokes number for use in this context. The second approach to predicting filter strength is based on a consideration of vorticity and enstrophy spectra in the fluid phase. This method has a useful feature: it can be used to predict the filter a priori, without need for measurements as input. We compare the results of this approach to measurements as a method of validation. The third and final approach to predicting filter strength is from the consideration of torques experienced by particles, and how the "angular slip" or "spin slip" evolves in an unsteady flow. We show results from our DNS that indicate different flow conditions in which the spin slip is more or less important in setting the particle rotation dynamics. Collaboration made possible by the Peder Sather Center.

  16. COMPARISON OF MEMBRANE FILTER, MULTIPLE-FERMENTATION-TUBE, AND PRESENCE-ABSENCE TECHNIQUES FOR DETECTING TOTAL COLIFORMS IN SMALL COMMUNITY WATER SYSTEMS

    EPA Science Inventory

    Methods for detecting total coliform bacteria in drinking water were compared using 1483 different drinking water samples from 15 small community water systems in Vermont and New Hampshire. The methods included the membrane filter (MF) technique, a ten tube fermentation tube tech...

  17. Curve fitting air sample filter decay curves to estimate transuranic content.

    PubMed

    Hayes, Robert B; Chiou, Hung Cheng

    2004-01-01

    By testing industry-standard techniques for radon progeny evaluation on air sample filters, a new technique is developed to evaluate transuranic activity on air filters by curve fitting the decay curves. The industry method modified here is simply the use of filter activity measurements at different times to estimate the air concentrations of radon progeny. The primary modification was to look not for specific radon progeny values but rather for transuranic activity. By using a method that provides reasonably conservative estimates of the transuranic activity present on a filter, some credit for the decay curve shape can be taken. By carrying out rigorous statistical analysis of the curve fits to over 65 samples having no transuranic activity, taken over a 10-month period, the fitting function and quality tests for this purpose were optimized.
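
    A minimal sketch of the curve-fitting idea: model the filter count rate as a decaying radon-progeny term plus a constant long-lived component, and read the transuranic estimate from the fitted constant. The single-exponential model and parameter values below are illustrative assumptions, not the paper's optimized fitting function.

```python
import numpy as np
from scipy.optimize import curve_fit

def decay_model(t, a, lam, c):
    """Count rate: short-lived radon-progeny decay plus a constant
    term c for effectively non-decaying transuranic activity."""
    return a * np.exp(-lam * t) + c

def fit_transuranic(times, counts):
    """Fit the filter decay curve; the fitted constant c serves as the
    estimate of transuranic activity remaining on the filter."""
    p0 = (counts[0], 0.02, counts[-1])     # rough starting guesses
    popt, _ = curve_fit(decay_model, times, counts, p0=p0)
    return popt                            # (a, lam, c)
```

    Waiting for the short-lived component to decay is the traditional alternative; fitting the whole curve takes credit for its shape and gives an answer sooner.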

  18. Sidelobe suppression in all-fiber acousto-optic tunable filter using torsional acoustic wave.

    PubMed

    Lee, Kwang Jo; Hwang, In-Kag; Park, Hyun Chul; Kim, Byoung Yoon

    2010-06-07

    We propose two techniques to suppress intrinsic sidelobe spectra in an all-fiber acousto-optic tunable filter using a torsional acoustic wave. The techniques are based on either a double-pass filter configuration or axial tailoring of the mode coupling strength along the acousto-optic interaction region in a highly birefringent optical fiber. The sidelobe peak in the filter spectrum is experimentally suppressed from -8.3 dB to -16.4 dB by employing the double-pass configuration. Axial modulation of the acousto-optic coupling strength is proposed using axial variation of the fiber diameter, and the simulation results show that the maximum side peak of -9.3 dB can be reduced to -22.2 dB. We also discuss the possibility of further spectral shaping of the filter based on axial tailoring of the acousto-optic coupling strength.

  19. Improving signal-to-noise ratios of liquid chromatography-tandem mass spectrometry peaks using noise frequency spectrum modification between two consecutive matched-filtering procedures.

    PubMed

    Wang, Shau-Chun; Huang, Chih-Min; Chiang, Shu-Min

    2007-08-17

    This paper reports a simple chemometric technique to alter the noise spectrum of a liquid chromatography-tandem mass spectrometry (LC-MS-MS) chromatogram between two consecutive matched filter procedures to improve the peak signal-to-noise (S/N) ratio. The technique multiplies one match-filtered LC-MS-MS chromatogram by another artificial chromatogram carrying added thermal noise before the second matched filter. Because a matched filter cannot eliminate the low-frequency components inherent in the flicker noise of spike-like sharp peaks randomly riding on LC-MS-MS chromatograms, efficient peak S/N ratio improvement cannot be accomplished using one-step or consecutive matched filter procedures alone. In contrast, when the match-filtered LC-MS-MS chromatogram is conditioned with this multiplication before the second matched filter, much more efficient ratio improvement is achieved. The noise frequency spectrum of the match-filtered chromatogram, which originally contains only low-frequency components, is altered by the multiplication operation to span a broader range. When the frequency range of this modified noise spectrum shifts toward the higher-frequency regime, the second matched filter, working as a low-pass filter, provides better filtering efficiency and higher peak S/N ratios. For real LC-MS-MS chromatograms containing random spike-like peaks, where peak S/N ratio improvement is typically less than 4-fold with two consecutive matched filters, the technique accomplishes much better enhancement, approximately 16-fold, when the noise frequency spectrum is modified between the two matched filters.
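
    The pipeline described above, matched filter, multiplication by a noise-carrying artificial chromatogram, then a second matched filter, can be sketched on synthetic data. The Gaussian peak shape, the spike statistics, and the baseline-based S/N definition below are illustrative assumptions, not the paper's data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 1024
t = np.arange(n, dtype=float)
template = np.exp(-0.5 * ((np.arange(64) - 32) / 8.0) ** 2)
template /= np.sqrt(np.sum(template ** 2))        # unit-energy peak template

# Synthetic chromatogram: one Gaussian peak plus spike-like noise
chrom = np.exp(-0.5 * ((t - 512) / 8.0) ** 2)
chrom += np.where(rng.random(n) < 0.02, rng.random(n), 0.0)

def matched_filter(x):
    """Correlate the chromatogram against the expected peak shape."""
    return np.convolve(x, template, mode="same")

def snr(x):
    """Peak height over baseline scatter, using peak-free edge regions."""
    baseline = np.concatenate([x[:300], x[-300:]])
    return (x.max() - baseline.mean()) / baseline.std()

once = matched_filter(chrom)
twice = matched_filter(once)          # plain consecutive matched filters

# The reported trick: multiply by an artificial chromatogram carrying
# white "thermal" noise before the second matched filter, pushing the
# smoothed low-frequency noise back toward higher frequencies
artificial = 1.0 + 0.05 * rng.standard_normal(n)
modified = matched_filter(once * artificial)
```

    The single matched-filter pass already lifts the S/N ratio well above the raw chromatogram; the multiplication step is what gives the second pass new high-frequency noise to remove.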

  20. Radar data smoothing filter study

    NASA Technical Reports Server (NTRS)

    White, J. V.

    1984-01-01

    The accuracy of the current Wallops Flight Facility (WFF) data smoothing techniques for a variety of radars and payloads is examined. Alternative data reduction techniques are given and recommendations are made for improving radar data processing at WFF. A data-adaptive algorithm, based on Kalman filtering and smoothing techniques, is also developed for estimating payload trajectories above the atmosphere from noisy, time-varying radar data. This algorithm is tested and verified using radar tracking data from WFF.

  1. Simultaneous Determination of Octinoxate, Oxybenzone, and Octocrylene in a Sunscreen Formulation Using Validated Spectrophotometric and Chemometric Methods.

    PubMed

    Abdel-Ghany, Maha F; Abdel-Aziz, Omar; Ayad, Miriam F; Mikawy, Neven N

    2015-01-01

    Accurate, reliable, and sensitive spectrophotometric and chemometric methods were developed for simultaneous determination of octinoxate (OMC), oxybenzone (OXY), and octocrylene (OCR) in a sunscreen formulation without prior separation steps, including derivative ratio spectra zero crossing (DRSZ), double divisor ratio spectra derivative (DDRD), mean centering ratio spectra (MCR), and partial least squares (PLS-2). With the DRSZ technique, the UV filters could be determined in the ranges of 0.5-13.0, 0.3-9.0, and 0.5-9.0 μg/mL at 265.2, 246.6, and 261.8 nm, respectively. By utilizing the DDRD technique, UV filters could be determined in the above ranges at 237.8, 241.0, and 254.2 nm, respectively. With the MCR technique, the UV filters could be determined in the above ranges at 381.7, 383.2, and 355.6 nm, respectively. The PLS-2 technique successfully quantified the examined UV filters in the ranges of 0.5-9.3, 0.3-7.1, and 0.5-6.9 μg/mL, respectively. All the methods were validated according to the International Conference on Harmonization guidelines and successfully applied to determine the UV filters in pure form, laboratory-prepared mixtures, and a sunscreen formulation. The obtained results were statistically compared with reference and reported methods of analysis for OXY, OMC, and OCR, and there were no significant differences with respect to accuracy and precision of the adopted techniques.

  2. Detecting Weak Spectral Lines in Interferometric Data through Matched Filtering

    NASA Astrophysics Data System (ADS)

    Loomis, Ryan A.; Öberg, Karin I.; Andrews, Sean M.; Walsh, Catherine; Czekala, Ian; Huang, Jane; Rosenfeld, Katherine A.

    2018-04-01

    Modern radio interferometers enable observations of spectral lines with unprecedented spatial resolution and sensitivity. In spite of these technical advances, many lines of interest are still at best weakly detected and therefore necessitate detection and analysis techniques specialized for the low signal-to-noise ratio (S/N) regime. Matched filters can leverage knowledge of the source structure and kinematics to increase sensitivity of spectral line observations. Application of the filter in the native Fourier domain improves S/N while simultaneously avoiding the computational cost and ambiguities associated with imaging, making matched filtering a fast and robust method for weak spectral line detection. We demonstrate how an approximate matched filter can be constructed from a previously observed line or from a model of the source, and we show how this filter can be used to robustly infer a detection significance for weak spectral lines. When applied to ALMA Cycle 2 observations of CH3OH in the protoplanetary disk around TW Hya, the technique yields a ≈53% S/N boost over aperture-based spectral extraction methods, and we show that an even higher boost will be achieved for observations at higher spatial resolution. A Python-based open-source implementation of this technique is available under the MIT license at http://github.com/AstroChem/VISIBLE.
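
    A 1-D analogue of the matched-filtering idea: cross-correlate the data with a normalized template of the expected line profile and quote the peak response in units of the off-peak scatter. This sketch works on a synthetic spectrum rather than interferometric visibilities; the authors' VISIBLE package implements the Fourier-domain version.

```python
import numpy as np

def matched_filter_snr(data, template):
    """Cross-correlate data with a mean-subtracted, unit-energy template
    and report the peak response in units of the off-peak scatter."""
    h = template - template.mean()
    h = h / np.sqrt(np.sum(h ** 2))
    resp = np.correlate(data, h, mode="same")
    peak_idx = int(resp.argmax())
    off = np.abs(np.arange(data.size) - peak_idx) > 3 * template.size
    return resp[peak_idx] / resp[off].std()

# Synthetic "spectrum": a weak Gaussian line buried in white noise
rng = np.random.default_rng(7)
x = np.arange(4096, dtype=float)
line = 0.3 * np.exp(-0.5 * ((x - 2000.0) / 15.0) ** 2)   # per-channel S/N ~ 3
noisy = line + 0.1 * rng.standard_normal(x.size)
template = np.exp(-0.5 * ((np.arange(91) - 45.0) / 15.0) ** 2)
```

    The filter concentrates the line's energy into one response sample while leaving the noise floor flat, which is where the sensitivity boost over per-channel extraction comes from.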

  3. Tailoring noise frequency spectrum between two consecutive second derivative filtering procedures to improve liquid chromatography-mass spectrometry determinations.

    PubMed

    Wang, Shau-Chun; Lin, Chiao-Juan; Chiang, Shu-Min; Yu, Sung-Nien

    2008-03-15

    This paper reports a simple chemometric technique to alter the noise spectrum of a liquid chromatography-mass spectrometry (LC-MS) chromatogram between two consecutive second-derivative filter procedures to improve the peak signal-to-noise (S/N) ratio. The technique multiplies one second-derivative filtered LC-MS chromatogram by another artificial chromatogram carrying added thermal noise before the second second-derivative filter. Because the second-derivative filter cannot eliminate frequency components within its own filter bandwidth, more efficient peak S/N ratio improvement cannot be accomplished using consecutive second-derivative filter procedures alone. In contrast, when the second-derivative filtered LC-MS chromatogram is conditioned with this multiplication before the second filter, much better ratio improvement is achieved. The noise frequency spectrum of the second-derivative filtered chromatogram, which originally contains frequency components within the filter bandwidth, is altered by the multiplication operation to span a broader range. When the frequency range of this modified noise spectrum shifts toward other regimes, the second second-derivative filter, working as a band-pass filter, provides better filtering efficiency and higher peak S/N ratios. For real LC-MS chromatograms, two consecutive second-derivative filters achieve only the same 5-fold peak S/N ratio improvement as a one-step second-derivative filter; when the noise frequency spectrum is modified between the two second-derivative filters, much better enhancement, approximately 25-fold or higher, is accomplished. The linear standard curve using the filtered LC-MS signals is validated, and the filtered signals are more reproducible. More accurate determinations of very low-concentration samples (S/N ratio about 5-7) are obtained via standard addition procedures using the filtered signals than using the original signals.

  4. Discrete square root filtering - A survey of current techniques.

    NASA Technical Reports Server (NTRS)

    Kaminskii, P. G.; Bryson, A. E., Jr.; Schmidt, S. F.

    1971-01-01

    Current techniques in square root filtering are surveyed and related by applying a duality association. Four efficient square root implementations are suggested, and compared with three common conventional implementations in terms of computational complexity and precision. It is shown that the square root computational burden should not exceed the conventional by more than 50% in most practical problems. An examination of numerical conditioning predicts that the square root approach can yield twice the effective precision of the conventional filter in ill-conditioned problems. This prediction is verified in two examples.
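
    One classic square-root implementation is Potter's measurement update, which propagates a factor S with P = S S^T, so the covariance can never lose symmetry or positive-definiteness. The sketch below handles a single scalar measurement; it is an illustrative textbook form, not necessarily one of the four implementations compared in the survey.

```python
import numpy as np

def potter_update(S, H, R, x, z):
    """Potter square-root measurement update for a scalar measurement z.

    The covariance is carried as a factor S with P = S @ S.T, which
    preserves symmetry and positive-definiteness and roughly doubles
    the effective numerical precision of the covariance arithmetic.
    """
    f = S.T @ H                          # H given as a 1-D measurement row
    a = 1.0 / (f @ f + R)
    gamma = 1.0 / (1.0 + np.sqrt(R * a))
    K = a * (S @ f)                      # Kalman gain
    S_new = S - gamma * np.outer(K, f)   # factor update: P+ = S+ @ S+.T
    x_new = x + K * (z - H @ x)
    return x_new, S_new
```

    Reconstructing P = S Sᵀ after the update reproduces the conventional covariance result exactly, but the conditioning of S is the square root of that of P, which is the source of the precision advantage in ill-conditioned problems.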

  5. An efficient incremental learning mechanism for tracking concept drift in spam filtering

    PubMed Central

    Sheu, Jyh-Jian; Chu, Ko-Tsung; Li, Nien-Feng; Lee, Cheng-Chi

    2017-01-01

    This research conducts an in-depth analysis of knowledge about spam and proposes an efficient spam filtering method with the ability to adapt to a dynamic environment. We focus on the analysis of email headers and apply a decision-tree data mining technique to look for association rules about spam. We then propose an efficient systematic filtering method based on these association rules. Our systematic method has the following major advantages: (1) it checks only the header sections of emails, unlike current spam filtering methods that must fully analyze the email's content, while the filtering accuracy is expected to be enhanced; (2) to address the problem of concept drift, we propose a window-based technique to estimate the degree of concept drift for each unknown email, which helps our filtering method recognize the occurrence of spam; (3) we propose an incremental learning mechanism for our filtering method to strengthen its ability to adapt to a dynamic environment. PMID:28182691
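
    The window-based drift idea can be sketched as follows: keep a sliding window of recent prediction errors and flag concept drift when the windowed error rate exceeds a threshold, while learning incrementally from each labeled email. The toy header-token classifier below is a hypothetical stand-in for the paper's decision-tree association rules.

```python
from collections import deque

class DriftAwareFilter:
    """Minimal sketch of window-based drift tracking with incremental
    learning (the token-set classifier is illustrative only)."""

    def __init__(self, window=50, drift_threshold=0.3):
        self.errors = deque(maxlen=window)   # recent prediction errors
        self.drift_threshold = drift_threshold
        self.spam_tokens = set()

    def predict(self, header_tokens):
        """Flag mail whose header shares any token with known spam."""
        return bool(self.spam_tokens & set(header_tokens))

    def update(self, header_tokens, is_spam):
        """Incremental learning step; returns True when the recent
        error rate suggests the spam distribution has drifted."""
        self.errors.append(self.predict(header_tokens) != is_spam)
        if is_spam:
            self.spam_tokens |= set(header_tokens)   # learn new spam cues
        return len(self.errors) == self.errors.maxlen and \
            sum(self.errors) / len(self.errors) > self.drift_threshold
```

    When `update` returns True, a real system would rebuild its rule set from the current window rather than continuing with stale rules.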

  6. Input filter compensation for switching regulators

    NASA Technical Reports Server (NTRS)

    Lee, F. C.

    1984-01-01

    Problems caused by input filter interaction and conventional input filter design techniques are discussed. The concept of feedforward control is modeled with an input filter and a buck regulator. Experimental measurement and comparison to the analytical predictions is carried out. Transient response and the use of a feedforward loop to stabilize the regulator system is described. Other possible applications for feedforward control are included.

  7. Optimized digital filtering techniques for radiation detection with HPGe detectors

    NASA Astrophysics Data System (ADS)

    Salathe, Marco; Kihm, Thomas

    2016-02-01

    This paper describes state-of-the-art digital filtering techniques that are part of GEANA, an automatic data analysis software used for the GERDA experiment. The discussed filters include a novel, nonlinear correction method for ballistic deficits, which is combined with one of three shaping filters: a pseudo-Gaussian, a modified trapezoidal, or a modified cusp filter. The performance of the filters is demonstrated with a 762 g Broad Energy Germanium (BEGe) detector, produced by Canberra, that measures γ-ray lines from radioactive sources in an energy range between 59.5 and 2614.5 keV. At 1332.5 keV, together with the ballistic deficit correction method, all filters produce a comparable energy resolution of 1.61 keV FWHM. This value is superior to those measured by the manufacturer and those found in publications with detectors of a similar design and mass. At 59.5 keV, the modified cusp filter without a ballistic deficit correction produced the best result, with an energy resolution of 0.46 keV. It is observed that the loss in resolution by using a constant shaping time over the entire energy range is small when using the ballistic deficit correction method.
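
    The trapezoidal shaping mentioned above can be sketched as a difference of two box filters: a charge step of height A then produces a trapezoid whose flat-top amplitude is A. This is a generic textbook form with illustrative parameters, not GEANA's modified trapezoidal filter or its ballistic deficit correction.

```python
import numpy as np

def trapezoidal_filter(x, rise=10, flat=5):
    """Trapezoidal shaper: the impulse response is a positive box minus
    a delayed negative box, so a unit charge step becomes a trapezoid
    (ramp up over `rise`, flat top over `flat`, ramp down)."""
    h = np.zeros(2 * rise + flat)
    h[:rise] = 1.0 / rise
    h[rise + flat:] = -1.0 / rise
    return np.convolve(x, h)[: x.size]

# Noiseless unit charge step at sample 50 (an idealized detector pulse)
x = np.zeros(200)
x[50:] = 1.0
y = trapezoidal_filter(x, rise=10, flat=5)
```

    With a real detector pulse the charge arrives over a finite collection time; a flat top longer than that collection time is what keeps the sampled amplitude from under-reading, which is the ballistic deficit the paper's correction method addresses.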

  8. Effect of different thickness of material filter on Tc-99m spectra and performance parameters of gamma camera

    NASA Astrophysics Data System (ADS)

    Nazifah, A.; Norhanna, S.; Shah, S. I.; Zakaria, A.

    2014-11-01

    This study aimed to investigate the effects of the material filter technique on Tc-99m spectra and on the performance parameters of a Philips ADAC Forte dual-head gamma camera. The thickness of the material filter was selected on the basis of the percentage attenuation of various gamma-ray energies by different thicknesses of zinc. A cylindrical source tank of the NEMA single photon emission computed tomography (SPECT) Triple Line Source Phantom, filled with water and injected with Tc-99m radionuclide, was used for spectra, uniformity and sensitivity measurements. A vinyl plastic tube was used as a line source for spatial resolution. Images for uniformity were reconstructed by the filtered back projection method. A Butterworth filter of order 5 and cutoff frequency 0.35 cycles/cm was selected. Chang's attenuation correction method was applied with a linear attenuation coefficient of 0.13/cm. The material filter decreased the count rate in the Compton region of the Tc-99m energy spectrum, and also in the photopeak region. Spatial resolution was improved. However, the uniformity of the tomographic image was equivocal, and the system volume sensitivity was reduced by the material filter. Because the material filter improved the system's spatial resolution, the technique may be used for phantom studies to improve image quality.

  9. SU-E-I-37: Low-Dose Real-Time Region-Of-Interest X-Ray Fluoroscopic Imaging with a GPU-Accelerated Spatially Different Bilateral Filtering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chung, H; Lee, J; Pua, R

    2014-06-01

    Purpose: The purpose of our study is to reduce imaging radiation dose while maintaining image quality of region of interest (ROI) in X-ray fluoroscopy. A low-dose real-time ROI fluoroscopic imaging technique which includes graphics-processing-unit- (GPU-) accelerated image processing for brightness compensation and noise filtering was developed in this study. Methods: In our ROI fluoroscopic imaging, a copper filter is placed in front of the X-ray tube. The filter contains a round aperture to reduce radiation dose to outside of the aperture. To equalize the brightness difference between inner and outer ROI regions, brightness compensation was performed by use of a simple weighting method that applies selectively to the inner ROI, the outer ROI, and the boundary zone. A bilateral filtering was applied to the images to reduce relatively high noise in the outer ROI images. To speed up the calculation of our technique for real-time application, the GPU-acceleration was applied to the image processing algorithm. We performed a dosimetric measurement using an ion-chamber dosimeter to evaluate the amount of radiation dose reduction. The reduction of calculation time compared to a CPU-only computation was also measured, and the assessment of image quality in terms of image noise and spatial resolution was conducted. Results: More than 80% of dose was reduced by use of the ROI filter. The reduction rate depended on the thickness of the filter and the size of ROI aperture. The image noise outside the ROI was remarkably reduced by the bilateral filtering technique. The computation time for processing each frame image was reduced from 3.43 seconds with single CPU to 9.85 milliseconds with GPU-acceleration. Conclusion: The proposed technique for X-ray fluoroscopy can substantially reduce imaging radiation dose to the patient while maintaining image quality particularly in the ROI region in real-time.
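
    The bilateral filtering step can be sketched as a brute-force weighted average in which each neighbor is weighted by both spatial distance and intensity difference, so noise is smoothed without blurring edges. The parameter values and the step-edge test image below are illustrative; the study's GPU kernel parallelizes the same per-pixel computation.

```python
import numpy as np

def bilateral_filter(img, radius=2, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter: each pixel becomes a weighted mean
    of its neighborhood, weighted by spatial distance (sigma_s) and by
    intensity difference (sigma_r), so edges are preserved."""
    pad = np.pad(img, radius, mode="edge")
    out = np.zeros_like(img, dtype=float)
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    spatial = np.exp(-(ys ** 2 + xs ** 2) / (2 * sigma_s ** 2))
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            patch = pad[i:i + 2 * radius + 1, j:j + 2 * radius + 1]
            rng_w = np.exp(-(patch - img[i, j]) ** 2 / (2 * sigma_r ** 2))
            wgt = spatial * rng_w
            out[i, j] = (wgt * patch).sum() / wgt.sum()
    return out

# Noisy step edge, standing in for a boundary in the low-dose outer region
rng = np.random.default_rng(3)
img = np.zeros((12, 12))
img[:, 6:] = 1.0
noisy = img + 0.05 * rng.standard_normal(img.shape)
out = bilateral_filter(noisy)
```

    Pixels across the edge differ by far more than sigma_r, so their range weight is essentially zero and the edge survives, while same-side pixels average together and the noise drops.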

  10. An annotation system for 3D fluid flow visualization

    NASA Technical Reports Server (NTRS)

    Loughlin, Maria M.; Hughes, John F.

    1995-01-01

    Annotation is a key activity of data analysis. However, current systems for data analysis focus almost exclusively on visualization. We propose a system which integrates annotations into a visualization system. Annotations are embedded in 3D data space, using the Post-it metaphor. This embedding allows contextual-based information storage and retrieval, and facilitates information sharing in collaborative environments. We provide a traditional database filter and a Magic Lens filter to create specialized views of the data. The system has been customized for fluid flow applications, with features which allow users to store parameters of visualization tools and sketch 3D volumes.

  11. Pneumocystis jiroveci in HIV/AIDS patients: detection by FTA filter paper together with PCR in noninvasive induced sputum specimens.

    PubMed

    Jaijakul, Siraya; Saksirisampant, Wilai; Prownebon, Juraratt; Yenthakam, Sutin; Mungthin, Mathirut; Leelayoova, Saovanee; Nuchprayoon, Surang

    2005-09-01

    To detect P. jiroveci (previously named P. carinii) by PCR, using FTA filter paper to extract the DNA, from noninvasive induced sputum samples of HIV/AIDS patients. Fifty-two HIV/AIDS patients suspected of Pneumocystis jiroveci pneumonia (PJP) in King Chulalongkorn Memorial Hospital were recruited. Both the cytological method and PCR with the FTA filter paper technique were performed to detect P. jiroveci in each specimen. The detection rate of P. jiroveci infection was 21%. The PCR with FTA filter paper method was four-fold more sensitive than the Giemsa staining technique. P. jiroveci was detected in 18% of the HIV/AIDS patients despite their receiving standard PJP prophylaxis. Detection of P. jiroveci using FTA filter paper together with PCR in induced sputum samples detected more cases of P. jiroveci infection than the cytological method. DNA extraction using the FTA filter paper was more rapid and convenient than other extraction methods. The causes of failure of PJP prophylaxis should be evaluated.

  12. Filtering and left ventricle segmentation of the fetal heart in ultrasound images

    NASA Astrophysics Data System (ADS)

    Vargas-Quintero, Lorena; Escalante-Ramírez, Boris

    2013-11-01

    In this paper, we propose filtering methods and a segmentation algorithm for the analysis of the fetal heart in ultrasound images. Since speckle noise makes the analysis of ultrasound images difficult, filtering becomes a useful task in these types of applications. The filtering techniques considered in this work assume that the speckle noise is a random variable with a Rayleigh distribution. We use two multiresolution methods: one based on wavelet decomposition and the other based on the Hermite transform. The filtering process is used as a way to strengthen the performance of the segmentation tasks. For the wavelet-based approach, a Bayesian estimator at the subband level is employed for pixel classification. The Hermite method computes a mask to find those pixels that are corrupted by speckle. Finally, we selected a method based on a deformable model, or "snake," to evaluate the influence of the filtering techniques on the segmentation of the left ventricle in fetal echocardiographic images.

  13. Tailoring noise frequency spectrum to improve NIR determinations.

    PubMed

    Xie, Shaofei; Xiang, Bingren; Yu, Liyan; Deng, Haishan

    2009-12-15

    Near infrared (NIR) spectroscopy contains excessive background noise and weak analytical signals caused by near infrared overtones and combinations. That makes quantitative determination of low-concentration samples by NIR difficult. A simple chemometric approach has been established to modify the noise frequency spectrum and improve NIR determinations. The proposed method multiplies a Savitzky-Golay-filtered NIR spectrum by a reference spectrum with added thermal noise before applying a second Savitzky-Golay filter. Since the Savitzky-Golay filter is a low-pass filter and cannot eliminate low-frequency components of the NIR spectrum, one or two consecutive Savitzky-Golay filtering steps alone do not greatly improve NIR determination. Significant improvement is achieved, however, when the Savitzky-Golay-filtered NIR spectrum is processed with the multiplication step before the second Savitzky-Golay filter. The frequency range of the modified noise spectrum shifts toward the higher-frequency regime via the multiplication operation, so the second Savitzky-Golay filter provides better filtering efficiency and satisfactory results. The improvement of NIR determination with this noise-frequency-spectrum tailoring technique was demonstrated on both a simulated dataset and two measured NIR spectral datasets. It is expected that the technique will be adopted in applications where quantitative determination of low-concentration samples is crucial.
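
    Savitzky-Golay filtering, the building block of the approach above, fits a low-degree polynomial to a sliding window of points, smoothing noise while preserving peak shape. A minimal SciPy sketch (the synthetic spectrum, window length, and polynomial order are illustrative assumptions, not the paper's settings):

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic "spectrum": a Gaussian peak on a sloping baseline.
x = np.linspace(0, 10, 500)
clean = np.exp(-(x - 5)**2 / 0.5) + 0.05 * x
gen = np.random.default_rng(1)
noisy = clean + gen.normal(0, 0.02, x.size)

# Fit a 3rd-order polynomial within each 21-point window.
smoothed = savgol_filter(noisy, window_length=21, polyorder=3)

rmse_noisy = np.sqrt(np.mean((noisy - clean)**2))
rmse_smooth = np.sqrt(np.mean((smoothed - clean)**2))
```

    Because the polynomial fit tracks slow variation, the low-frequency components (baseline, broad noise) pass through largely untouched, which is exactly the limitation the paper's multiplication step addresses.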

  14. Directional bilateral filters for smoothing fluorescence microscopy images

    NASA Astrophysics Data System (ADS)

    Venkatesh, Manasij; Mohan, Kavya; Seelamantula, Chandra Sekhar

    2015-08-01

    Images obtained through fluorescence microscopy at low numerical aperture (NA) are noisy and have poor resolution. Images of specimens such as F-actin filaments obtained using confocal or widefield fluorescence microscopes contain directional information, and it is important that an image smoothing or filtering technique preserve that directionality. F-actin filaments are widely studied in pathology because abnormalities in actin dynamics play a key role in the diagnosis of cancer, cardiac diseases, vascular diseases, myofibrillar myopathies, neurological disorders, etc. We develop the directional bilateral filter as a means of filtering out the noise in the image without significantly altering the directionality of the F-actin filaments. The bilateral filter is anisotropic to start with, but we add an additional degree of anisotropy by employing an oriented domain kernel for smoothing. The orientation is locally adapted using a structure tensor, and the parameters of the bilateral filter are optimized within the framework of statistical risk minimization. We show that the directional bilateral filter has better denoising performance than the traditional Gaussian bilateral filter and other denoising techniques such as SURE-LET, non-local means, and guided image filtering at various noise levels in terms of peak signal-to-noise ratio (PSNR). We also show quantitative improvements in low-NA images of F-actin filaments.

  15. The identification of criteria to evaluate prehospital trauma care using the Delphi technique.

    PubMed

    Rosengart, Matthew R; Nathens, Avery B; Schiff, Melissa A

    2007-03-01

    Current trauma system performance improvement emphasizes hospital- and patient-based outcome measures such as mortality and morbidity, with little focus upon the processes of prehospital trauma care. Little data exist to suggest which prehospital criteria should serve as potential filters. This study identifies the most important filters for auditing prehospital trauma care using a Delphi technique to achieve consensus of expert opinion. Experts in trauma care from the United States (n = 81) were asked to generate filters of potential utility in monitoring the prehospital aspect of the trauma system, and were then required to rank these questions in order of importance to identify those of greatest importance. Twenty-eight filters ranking in the highest tertile are proposed. The majority (54%) pertains to aspects of emergency medical services, which comprise 7 of the top 10 (70%) filters. Triage filters follow in priority ranking, comprising 29% of the final list. Filters concerning interfacility transfers and transportation ranked lowest. This study identifies audit filters representing the most important aspects of prehospital trauma care that merit continued evaluation and monitoring. A subsequent trial addressing the utility of these filters could potentially enhance the sensitivity of identifying deviations in prehospital care, standardize the performance improvement process, and translate into an improvement in patient care and outcome.

  16. Designing the Undesignable: Social Software and Control

    ERIC Educational Resources Information Center

    Dron, Jon

    2007-01-01

    Social software, such as blogs, wikis, tagging systems and collaborative filters, treats the group as a first-class object within the system. Drawing from theories of transactional distance and control, this paper proposes a model of e-learning that extends traditional concepts of learner-teacher-content interactions to include these emergent…

  17. Memory in the Information Age: New Tools for Second Language Acquisition.

    ERIC Educational Resources Information Center

    Chapin, Alex

    2003-01-01

    Describes a Middlebury College second language vocabulary learning database that goes well beyond flashcards, because it keeps track of what students learn. Discusses further expansion of the system through collaborative filtering software to establish learner profiles. A learner profile could then be used to create instructional materials just…

  18. Networked Information: Finding What's Out There.

    ERIC Educational Resources Information Center

    Lynch, Clifford A.

    1997-01-01

    Clifford A. Lynch, developer of MELVYL and former director of library automation at the University of California, is now executive director for the Coalition for Networked Information (CNI). This interview discusses Lynch's background, MELVYL, the Web and the role of libraries and librarians, community and collaborative filtering, the library of…

  19. Blogging and Internet Filters in Schools

    ERIC Educational Resources Information Center

    Shearer, Kimberly M.

    2010-01-01

    Success in today's global market requires students to attain numerous 21st-Century skills, including collaborative and communication skills, and knowledge of how to use technology to both locate and create information. The use of instructional blogging in the classroom is one way to help students develop such skills. The Children's Internet…

  20. On the Recommender System for University Library

    ERIC Educational Resources Information Center

    Fu, Shunkai; Zhang, Yao; Seinminn

    2013-01-01

    Libraries are important to universities, and they have two primary features: readers as well as collections are highly professional. In this study, based on an experimental study with five million users' borrowing records, our discussion covers: (1) the necessity of a recommender system for university libraries; (2) collaborative filtering (CF)…

  1. Semantic Web-Driven LMS Architecture towards a Holistic Learning Process Model Focused on Personalization

    ERIC Educational Resources Information Center

    Kerkiri, Tania

    2010-01-01

    A comprehensive presentation is here made on the modular architecture of an e-learning platform with a distinctive emphasis on content personalization, combining advantages from semantic web technology, collaborative filtering and recommendation systems. Modules of this architecture handle information about both the domain-specific didactic…

  2. Personalized Recommendation of Learning Material Using Sequential Pattern Mining and Attribute Based Collaborative Filtering

    ERIC Educational Resources Information Center

    Salehi, Mojtaba; Nakhai Kamalabadi, Isa; Ghaznavi Ghoushchi, Mohammad Bagher

    2014-01-01

    Material recommender system is a significant part of e-learning systems for personalization and recommendation of appropriate materials to learners. However, in the existing recommendation algorithms, dynamic interests and multi-preference of learners and multidimensional-attribute of materials are not fully considered simultaneously. Moreover,…

  3. Contexts in a Paper Recommendation System with Collaborative Filtering

    ERIC Educational Resources Information Center

    Winoto, Pinata; Tang, Tiffany Ya; McCalla, Gordon

    2012-01-01

    Making personalized paper recommendations to users in an educational domain is not a trivial task of simply matching users' interests with a paper topic. Therefore, we proposed a context-aware multidimensional paper recommendation system that considers additional user and paper features. Earlier experiments on experienced graduate students…

  4. The Application of Collaborative Business Intelligence Technology in the Hospital SPD Logistics Management Model

    PubMed Central

    LIU, Tongzhu; SHEN, Aizong; HU, Xiaojian; TONG, Guixian; GU, Wei

    2017-01-01

    Background: We aimed to apply a collaborative business intelligence (BI) system to the hospital supply, processing and distribution (SPD) logistics management model. Methods: We searched the Engineering Village database, China National Knowledge Infrastructure (CNKI) and Google for articles (published from 2011 to 2016), books, Web pages, etc., to understand SPD- and BI-related theories and recent research status. We applied collaborative BI technology to the hospital SPD logistics management model by leveraging data mining techniques to discover knowledge from complex data and collaborative techniques to improve the theories of business process. Results: For the application of the BI system, we: (i) proposed a layered structure of a collaborative BI system for intelligent management in hospital logistics; (ii) built a data warehouse for the collaborative BI system; (iii) improved data mining techniques such as support vector machines (SVM) and the swarm-intelligence firefly algorithm to solve key problems in the hospital logistics collaborative BI system; (iv) researched collaborative techniques oriented to data and business process optimization to improve the business processes of hospital logistics management. Conclusion: A proper combination of the SPD model and a BI system will improve the management of logistics in hospitals. Successful implementation of the study requires: (i) innovating and improving the traditional SPD model and making appropriate implementation plans and schedules for the application of the BI system according to the actual situation of each hospital; (ii) the collaborative participation of internal hospital departments, including the information, logistics, nursing, medical and financial departments; (iii) timely response of external suppliers. PMID:28828316

  5. Multi-filter spectrophotometry of quasar environments

    NASA Technical Reports Server (NTRS)

    Craven, Sally E.; Hickson, Paul; Yee, Howard K. C.

    1993-01-01

    A many-filter photometric technique for determining redshifts and morphological types, by fitting spectral templates to spectral energy distributions, has good potential for application in surveys. Despite success in studies performed on simulated data, the results have not been fully reliable when applied to real, low signal-to-noise data. We are investigating techniques to improve the fitting process.

  6. The Double Edge Technique for Doppler lidar wind measurement

    NASA Technical Reports Server (NTRS)

    Korb, C. Laurence; Gentry, Bruce M.; Li, S. Xingfu; Flesia, Cristina; Chen, Huailin; Mathur, S.

    1998-01-01

    The edge technique utilizes the edge of a high spectral resolution filter for high accuracy wind measurement using direct detection lidar. The signal is split between an edge filter channel and a broadband energy monitor channel. The energy monitor channel is used for signal normalization. The edge measurement is made as a differential frequency measurement between the outgoing laser signal and the atmospheric backscattered return for each pulse. As a result, the measurement is insensitive to laser and edge filter frequency jitter and drift at a level less than a few parts in 10^10. We will discuss the methodology of the technique in detail, present a broad range of simulation results, and provide preprints of a journal article currently in press.
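
    The differential edge measurement can be illustrated numerically: with an idealized filter edge, the transmission ratios of the outgoing laser pulse and the atmospheric return give the Doppler shift directly, and any frequency drift common to laser and filter cancels. A toy sketch in arbitrary units (the linear edge model and its slope are illustrative assumptions, not the instrument's response):

```python
# Idealized edge filter: transmission rises linearly with frequency
# near the edge (arbitrary units; the slope is an assumed constant).
SLOPE = 0.8

def transmission(f, drift=0.0):
    """Linear edge model; 'drift' shifts the whole filter edge."""
    return 0.5 + SLOPE * (f - drift)

def measure_shift(f_laser, doppler, drift=0.0):
    """Differential edge measurement: the same drift affects the
    outgoing and return ratios, so it cancels in the difference."""
    r_out = transmission(f_laser, drift)          # outgoing pulse
    r_ret = transmission(f_laser + doppler, drift)  # backscatter
    return (r_ret - r_out) / SLOPE

shift = measure_shift(f_laser=0.0, doppler=0.05, drift=0.013)
```

    The nonzero `drift` in the example never appears in the recovered shift, which is the drift-insensitivity the abstract describes.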

  7. High Resolution BPM Upgrade for the ATF Damping Ring at KEK

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eddy, N.; Briegel, C.; Fellenz, B.

    2011-08-17

    A beam position monitor (BPM) upgrade at the KEK Accelerator Test Facility (ATF) damping ring has been accomplished, carried out by a KEK/FNAL/SLAC collaboration under the umbrella of the global ILC R&D effort. The upgrade consists of a high-resolution, high-reproducibility read-out system, based on analog and digital down-conversion techniques and digital signal processing, and also implements a new automatic gain error correction scheme. The technical concept and realization, as well as results of beam studies, are presented. The next generation of linear colliders requires ultra-low vertical emittance of <2 pm-rad. The damping ring at the ATF is designed to demonstrate this mission-critical goal. A high-resolution BPM system for the damping ring is one of the key tools for realizing this goal. The BPM system needs to provide two distinct measurements. First, a very high resolution (~100-200 nm) closed-orbit measurement, averaged over many turns and realized with narrowband filter techniques ('narrowband mode'). This is needed to monitor and steer the beam along an optimum orbit and to facilitate beam-based alignment to minimize non-linear field effects. Second is the ability to make turn-by-turn (TBT) measurements to support the optics studies and corrections necessary to achieve the design performance. As the TBT measurement necessitates a wider bandwidth, it is often referred to as 'wideband mode'. The BPM upgrade was initiated as a KEK/SLAC/FNAL collaboration in the frame of the Global Design Initiative of the International Linear Collider. The project was realized and completed using Japan-US funds with Fermilab as the core partner.

  8. A Personalized Electronic Movie Recommendation System Based on Support Vector Machine and Improved Particle Swarm Optimization

    PubMed Central

    Wang, Xibin; Luo, Fengji; Qian, Ying; Ranzi, Gianluca

    2016-01-01

    With the rapid development of ICT and Web technologies, a large amount of information is becoming available, and this is producing, in some instances, a condition of information overload. Under these conditions, it is difficult for a person to locate and access useful information for making decisions. To address this problem, there are information filtering systems, such as the personalized recommendation system (PRS) considered in this paper, that assist a person in identifying possible products or services of interest based on his/her preferences. Among available approaches, collaborative filtering (CF) is one of the most widely used recommendation techniques. However, CF has some limitations, e.g., the relatively simple similarity calculation, the cold start problem, etc. In this context, this paper presents a new regression model based on support vector machine (SVM) classification and an improved PSO (IPSO) for the development of an electronic movie PRS. In its implementation, an SVM classification model is first established to obtain a preliminary movie recommendation list, based on which an SVM regression model is applied to predict movies’ ratings. The proposed PRS not only considers the movie’s content information but also integrates the users’ demographic and behavioral information to better capture the users’ interests and preferences. The efficiency of the proposed method is verified by a series of experiments based on the MovieLens benchmark data set. PMID:27898691

  9. A Personalized Electronic Movie Recommendation System Based on Support Vector Machine and Improved Particle Swarm Optimization.

    PubMed

    Wang, Xibin; Luo, Fengji; Qian, Ying; Ranzi, Gianluca

    2016-01-01

    With the rapid development of ICT and Web technologies, a large amount of information is becoming available, and this is producing, in some instances, a condition of information overload. Under these conditions, it is difficult for a person to locate and access useful information for making decisions. To address this problem, there are information filtering systems, such as the personalized recommendation system (PRS) considered in this paper, that assist a person in identifying possible products or services of interest based on his/her preferences. Among available approaches, collaborative filtering (CF) is one of the most widely used recommendation techniques. However, CF has some limitations, e.g., the relatively simple similarity calculation, the cold start problem, etc. In this context, this paper presents a new regression model based on support vector machine (SVM) classification and an improved PSO (IPSO) for the development of an electronic movie PRS. In its implementation, an SVM classification model is first established to obtain a preliminary movie recommendation list, based on which an SVM regression model is applied to predict movies' ratings. The proposed PRS not only considers the movie's content information but also integrates the users' demographic and behavioral information to better capture the users' interests and preferences. The efficiency of the proposed method is verified by a series of experiments based on the MovieLens benchmark data set.

  10. Techniques for noise removal and registration of TIMS data

    USGS Publications Warehouse

    Hummer-Miller, S.

    1990-01-01

    Extracting subtle differences from highly correlated thermal infrared aircraft data is possible with appropriate noise filters, constructed and applied in the spatial frequency domain. This paper discusses a heuristic approach to designing noise filters for removing high- and low-spatial frequency striping and banding. Techniques for registering thermal infrared aircraft data to a topographic base using Thematic Mapper data are presented. The noise removal and registration techniques are applied to TIMS thermal infrared aircraft data. -Author
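
    Striping and banding at fixed spatial frequencies appear as concentrated energy along specific lines in the Fourier domain, which is why the spatial-frequency-domain filters described above work. A minimal NumPy sketch of notch-style destriping (the synthetic stripe pattern and the notch placement are illustrative assumptions, not the paper's heuristic filter designs):

```python
import numpy as np

def destripe(img):
    """Remove horizontal striping: stripes that are constant along
    each row concentrate on the zero-horizontal-frequency column
    of the shifted 2D FFT; notch it out, keeping only the DC term."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    cy, cx = h // 2, w // 2
    dc = F[cy, cx]
    F[:, cx] = 0.0          # notch out the stripe frequencies
    F[cy, cx] = dc          # restore the image mean
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

# Smooth scene (gradient along x) plus alternating row stripes.
base = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
stripes = np.where(np.arange(32) % 2 == 0, 0.3, -0.3)[:, None]
cleaned = destripe(base + stripes)
```

    Real aircraft data needs narrower, hand-tuned notches so that legitimate scene frequencies near the stripe band are not removed; that tuning is the heuristic part the abstract refers to.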

  11. Measurements of dimethyl sulfide and SO2 during GTE/CITE 3

    NASA Technical Reports Server (NTRS)

    Ferek, Ronald J.; Hegg, Dean A.

    1993-01-01

    As part of NASA's Global Tropospheric Experiment Chemical Instrumentation Test and Evaluation (GTE/CITE 3) sulfur gas intercomparison, we conducted measurements of dimethyl sulfide (DMS) and SO2 using two techniques well suited to sampling from an aircraft due to their simplicity of design. DMS was collected by preconcentration on gold wire preceded by a KOH-impregnated filter oxidant scrubber, and analyzed by gas chromatography with flame photometric detection. SO2 was collected on K2CO3/glycerol-impregnated filters and analyzed by ion chromatography. In blind tests, both techniques produced excellent agreement with National Institute of Standards and Technology (NIST) standards. For field measurements, the DMS technique correlated excellently with the mean of the six different techniques intercompared. For SO2, the five techniques intercompared were rather poorly correlated, but correlations among the three techniques that passed the NIST standards tests were somewhat better. Our SO2 filter measurements exhibited rather large uncertainties due to higher-than-normal variability of the filter blanks, which we believe was caused by extended storage in the field. In measurements conducted off the coast of Natal, Brazil, a diurnal afternoon minimum in DMS concentrations accompanied by a corresponding maximum in SO2 concentrations was observed. However, due to rather large uncertainties in the SO2 measurements, any conclusions about the SO2 trend must be considered tentative.

  12. Improving Image Matching by Reducing Surface Reflections Using Polarising Filter Techniques

    NASA Astrophysics Data System (ADS)

    Conen, N.; Hastedt, H.; Kahmen, O.; Luhmann, T.

    2018-05-01

    In dense stereo matching applications, surface reflections may lead to incorrect measurements and blunders in the resulting point cloud. To overcome the problem of disturbing reflections, polarising filters can be mounted on the camera lens and light source. Reflections in the images can be suppressed by crossing the polarising direction of the filters, leading to homogeneously illuminated images and better matching results. However, the filter may influence the camera's orientation parameters as well as the measuring accuracy. To quantify these effects, a calibration and an accuracy analysis are conducted within a spatial test arrangement according to the German guideline VDI/VDE 2634.1 (2002) using a DSLR with and without a polarising filter. In a second test, the interior orientation is analysed in more detail. The results do not show significant changes in the measuring accuracy in object space and only very small changes of the interior orientation (Δc ≤ 4 μm) with the polarising filter in use. Since in medical applications many tiny reflections are present and impede robust surface measurements, a prototypic trinocular endoscope is equipped with the polarising technique. The interior and relative orientation is determined and analysed. The advantage of the polarising technique for medical image matching is shown in an experiment with a moistened pig kidney. The accuracy and completeness of the resulting point cloud can be clearly improved when using polarising filters. Furthermore, an accuracy analysis using a laser triangulation system is performed and the special reflection properties of metallic surfaces are presented.

  13. Ultra-accurate collaborative information filtering via directed user similarity

    NASA Astrophysics Data System (ADS)

    Guo, Q.; Song, W.-J.; Liu, J.-G.

    2014-07-01

    A key challenge of collaborative filtering (CF) is how to obtain reliable and accurate results with the help of peers' recommendations. Since the similarities from small-degree users to large-degree users tend to be larger than those in the opposite direction, the large-degree users' selections are recommended extensively by traditional second-order CF algorithms. By considering the users' similarity direction and the second-order correlations to depress the influence of mainstream preferences, we present the directed second-order CF (HDCF) algorithm to specifically address the challenge of accuracy and diversity in CF. The numerical results for two benchmark data sets, MovieLens and Netflix, show that the accuracy of the new algorithm outperforms state-of-the-art CF algorithms. Compared with the CF algorithm based on random walks proposed by Liu et al. (Int. J. Mod. Phys. C, 20 (2009) 285), the average ranking score reaches 0.0767 and 0.0402, an enhancement of 27.3% and 19.1% for MovieLens and Netflix, respectively. In addition, the diversity, precision and recall are also greatly enhanced. Without relying on any context-specific information, tuning the similarity direction of CF algorithms yields accurate and diverse recommendations. This work suggests that the user similarity direction is an important factor in improving personalized recommendation performance.
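
    The asymmetry the authors exploit starts from plain user-based CF: score an unseen item for a user from the ratings of similar users. A minimal NumPy sketch using symmetric cosine similarity on a toy rating matrix (the matrix and the weighting are illustrative assumptions; HDCF additionally makes the similarity directed to depress the pull of large-degree users):

```python
import numpy as np

# Rows = users, columns = items; 0 means "not rated".
R = np.array([[5, 4, 0, 1],
              [4, 5, 1, 0],
              [1, 0, 5, 4],
              [0, 1, 4, 5]], dtype=float)

def cosine_sim(R):
    """Pairwise cosine similarity between user rating vectors."""
    norms = np.linalg.norm(R, axis=1, keepdims=True)
    return (R @ R.T) / (norms * norms.T)

def predict(R, user, item, k=2):
    """Predict a rating as the similarity-weighted average of the
    k most similar users who have rated the item."""
    sims = cosine_sim(R)[user]
    raters = np.where(R[:, item] > 0)[0]
    raters = raters[raters != user]
    top = raters[np.argsort(sims[raters])[::-1][:k]]
    return float(np.dot(sims[top], R[top, item]) / sims[top].sum())

p = predict(R, user=0, item=2)  # users 0 and 1 agree; expect a low score
```

    In this toy matrix, user 0's strongest neighbor (user 1) rated item 2 poorly, so the prediction comes out low; a directed similarity would further reweight neighbors by their degree before this averaging step.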

  14. Social and content aware One-Class recommendation of papers in scientific social networks.

    PubMed

    Wang, Gang; He, XiRan; Ishuga, Carolyne Isigi

    2017-01-01

    With the rapid development of information technology, scientific social networks (SSNs) have become the fastest and most convenient way for researchers to communicate with each other. Many published papers are shared via SSNs every day, resulting in the problem of information overload. How to appropriately recommend personalized and highly valuable papers to researchers is becoming more urgent. However, when recommending papers in SSNs, only a small number of positive instances are available, leaving a vast amount of unlabelled data in which negative instances and potential unseen positive instances are mixed together; this is naturally a One-Class Collaborative Filtering (OCCF) problem. Therefore, considering the extreme data imbalance and data sparsity of this OCCF problem, a hybrid approach of Social and Content aware One-class Recommendation of Papers in SSNs, termed SCORP, is proposed in this study. Unlike previous approaches to the OCCF problem, social information, which has been shown to play a significant role in recommendations in many domains, is applied in both the profiling of content-based filtering and the collaborative filtering to achieve superior recommendations. To verify the effectiveness of the proposed SCORP approach, a real-life dataset from CiteULike was employed. The experimental results demonstrate that the proposed approach is superior to all of the compared approaches, thus providing a more effective method for recommending papers in SSNs.

  15. Social and content aware One-Class recommendation of papers in scientific social networks

    PubMed Central

    Wang, Gang; He, XiRan

    2017-01-01

    With the rapid development of information technology, scientific social networks (SSNs) have become the fastest and most convenient way for researchers to communicate with each other. Many published papers are shared via SSNs every day, resulting in the problem of information overload. How to appropriately recommend personalized and highly valuable papers to researchers is becoming more urgent. However, when recommending papers in SSNs, only a small number of positive instances are available, leaving a vast amount of unlabelled data in which negative instances and potential unseen positive instances are mixed together; this is naturally a One-Class Collaborative Filtering (OCCF) problem. Therefore, considering the extreme data imbalance and data sparsity of this OCCF problem, a hybrid approach of Social and Content aware One-class Recommendation of Papers in SSNs, termed SCORP, is proposed in this study. Unlike previous approaches to the OCCF problem, social information, which has been shown to play a significant role in recommendations in many domains, is applied in both the profiling of content-based filtering and the collaborative filtering to achieve superior recommendations. To verify the effectiveness of the proposed SCORP approach, a real-life dataset from CiteULike was employed. The experimental results demonstrate that the proposed approach is superior to all of the compared approaches, thus providing a more effective method for recommending papers in SSNs. PMID:28771495

  16. Sub-5-ps optical pulse generation from a 1.55-µm distributed-feedback laser diode with nanosecond electric pulse excitation and spectral filtering.

    PubMed

    Chen, Shaoqiang; Sato, Aya; Ito, Takashi; Yoshita, Masahiro; Akiyama, Hidefumi; Yokoyama, Hiroyuki

    2012-10-22

    This paper reports the generation of sub-5-ps Fourier-transform-limited optical pulses from a 1.55-µm gain-switched single-mode distributed-feedback laser diode via nanosecond electric excitation and a simple spectral-filtering technique. Typical damped oscillations of the whole lasing spectrum were observed in the time-resolved waveform. Through spectral filtering, the initial relaxation-oscillation pulse and the following components in the output pulse can be well separated, and the initial short pulse can be selectively extracted by filtering out the short-wavelength components in the spectrum. Short pulses generated by this simple method are expected to have wide potential applications comparable to those of mode-locked lasers.

  17. Study of different filtering techniques applied to spectra from airborne gamma spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wilhelm, Emilien; Gutierrez, Sebastien; Reboli, Anne

    2015-07-01

    One of the features of spectra obtained by airborne gamma spectrometry is low counting statistics due to the short acquisition time (1 s) and the large source-detector distance (40 m). It leads to considerable uncertainty in radionuclide identification and determination of their respective activities from the windows method recommended by the IAEA, especially for low-level radioactivity. The present work compares the results obtained with filters in terms of errors of the filtered spectra with the window method and over the whole gamma energy range. The results are used to determine which filtering technique is the most suitable in combination with some method for total stripping of the spectrum. (authors)
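    One of the standard smoothing filters such comparisons typically include is the Savitzky-Golay filter. The sketch below shows a 5-point quadratic variant with the classic (-3, 12, 17, 12, -3)/35 window, as a generic illustration rather than the filter this study selects:

```python
# 5-point quadratic Savitzky-Golay smoother applied to a counts
# spectrum. Edge channels are left unsmoothed. This is a generic
# sketch, not the filter chosen in the study above.

SG5 = (-3.0, 12.0, 17.0, 12.0, -3.0)

def sg_smooth(spectrum):
    """Smooth the interior channels of a spectrum with the SG5 window."""
    n = len(spectrum)
    out = list(spectrum)
    for i in range(2, n - 2):
        out[i] = sum(c * spectrum[i + k - 2] for k, c in enumerate(SG5)) / 35.0
    return out
```

    A useful property for peak-shaped spectra is that this filter reproduces any quadratic exactly on interior channels, so it smooths statistical noise without flattening locally parabolic peak tops as much as a plain moving average would.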

  18. Optimum constrained image restoration filters

    NASA Technical Reports Server (NTRS)

    Riemer, T. E.; Mcgillem, C. D.

    1974-01-01

    The filter was developed in Hilbert space by minimizing the radius of gyration of the overall or composite system point-spread function subject to constraints on the radius of gyration of the restoration filter point-spread function, the total noise power in the restored image, and the shape of the composite system frequency spectrum. An iterative technique is introduced which alters the shape of the optimum composite system point-spread function, producing a suboptimal restoration filter which suppresses undesirable secondary oscillations. Finally this technique is applied to multispectral scanner data obtained from the Earth Resources Technology Satellite to provide resolution enhancement. An experimental approach to the problems involving estimation of the effective scanner aperture and matching the ERTS data to available restoration functions is presented.

  19. Morphology studies of hydrophobic silica on filter surface prepared via spray technique

    NASA Astrophysics Data System (ADS)

    Shahfiq Zulkifli, Nazrul; Zaini Yunos, Muhamad; Ahmad, Azlinnorazia; Harun, Zawati; Akhair, Siti Hajar Mohd; Adibah Raja Ahmad, Raja; Hafeez Azhar, Faiz; Rashid, Abdul Qaiyyum Abd; Ismail, Al Emran

    2017-08-01

    This study investigated the effect of hydrophobic surface treatment on air filter performance, using silica aerogel powder as an additive applied via spray coating techniques. Membrane characterization tests were carried out on filters prepared with different additive concentrations. The cross-section and the distribution of particles on the membrane were examined using a scanning electron microscope (SEM), and the surface composition was investigated by energy-dispersive x-ray spectroscopy (EDS). The SEM and EDS results show that the filter microstructure, especially in the upper layer and sub-layer, has been changed. The results also show an increase in hydrophobicity with increasing quantity of silica aerogel powder.

  20. An Efficient Recommendation Filter Model on Smart Home Big Data Analytics for Enhanced Living Environments.

    PubMed

    Chen, Hao; Xie, Xiaoyun; Shu, Wanneng; Xiong, Naixue

    2016-10-15

    With the rapid growth of wireless sensor applications, the user interfaces and configurations of smart homes have become so complicated and inflexible that users usually have to spend a great amount of time studying them and adapting them to their expected operation. In order to improve the user experience, a weighted hybrid recommender system based on a Kalman Filter model is proposed to predict what users might want to do next, especially when users are located in a smart home with an enhanced living environment. Specifically, a weight hybridization method is introduced that combines contextual collaborative filtering and contextual content-based recommendations. This method inherits the advantages of optimum regression and the stability features of the proposed adaptive Kalman Filter model, and it can predict and revise the weight of each system component dynamically. Experimental results show that the hybrid recommender system can optimize the distribution of weights of each component and achieve more reasonable recall and precision rates.
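    The weight-revision idea can be sketched with a scalar Kalman filter tracking a single component weight. The random-walk state model and the noise variances below are illustrative assumptions, not the paper's design:

```python
# Sketch of a scalar Kalman filter of the kind used to revise a hybrid
# component's weight over time. The random-walk state model and noise
# variances are illustrative assumptions, not the paper's values.

class ScalarKalman:
    def __init__(self, x0=0.5, p0=1.0, q=1e-3, r=0.05):
        # state estimate, its variance, process noise, measurement noise
        self.x, self.p, self.q, self.r = x0, p0, q, r

    def step(self, z):
        # Predict (random walk): state unchanged, uncertainty grows.
        p_pred = self.p + self.q
        # Update with the observed weight feedback z.
        k = p_pred / (p_pred + self.r)          # Kalman gain
        self.x = self.x + k * (z - self.x)
        self.p = (1.0 - k) * p_pred
        return self.x

kf = ScalarKalman()
for z in [0.9, 0.85, 0.88, 0.9, 0.87]:
    w = kf.step(z)        # revised weight after each feedback sample
```

    The gain shrinks as the variance settles, so the weight adapts quickly at first and then stabilizes, which is the behavior the abstract attributes to the adaptive model.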

  1. An Efficient Recommendation Filter Model on Smart Home Big Data Analytics for Enhanced Living Environments

    PubMed Central

    Chen, Hao; Xie, Xiaoyun; Shu, Wanneng; Xiong, Naixue

    2016-01-01

    With the rapid growth of wireless sensor applications, the user interfaces and configurations of smart homes have become so complicated and inflexible that users usually have to spend a great amount of time studying them and adapting them to their expected operation. In order to improve the user experience, a weighted hybrid recommender system based on a Kalman Filter model is proposed to predict what users might want to do next, especially when users are located in a smart home with an enhanced living environment. Specifically, a weight hybridization method is introduced that combines contextual collaborative filtering and contextual content-based recommendations. This method inherits the advantages of optimum regression and the stability features of the proposed adaptive Kalman Filter model, and it can predict and revise the weight of each system component dynamically. Experimental results show that the hybrid recommender system can optimize the distribution of weights of each component and achieve more reasonable recall and precision rates. PMID:27754456

  2. Social Network Analysis of Biomedical Research Collaboration Networks in a CTSA Institution

    PubMed Central

    Bian, Jiang; Xie, Mengjun; Topaloglu, Umit; Hudson, Teresa; Eswaran, Hari; Hogan, William

    2014-01-01

    BACKGROUND The popularity of social networks has triggered a number of research efforts on network analyses of research collaborations in the Clinical and Translational Science Award (CTSA) community. Those studies mainly focus on the general understanding of collaboration networks by measuring common network metrics. More fundamental questions about collaborations still remain unanswered, such as recognizing “influential” nodes and identifying potential new collaborations that are most rewarding. METHODS We analyzed biomedical research collaboration networks (RCNs) constructed from a dataset of research grants collected at a CTSA institution (the University of Arkansas for Medical Sciences (UAMS)) in a comprehensive and systematic manner. First, our analysis covers the full spectrum of a RCN study: from network modeling to network characteristics measurement, from key nodes recognition to potential links (collaborations) suggestion. Second, our analysis employs non-conventional models and techniques, including a weighted network model for representing collaboration strength, rank aggregation for detecting important nodes, and Random Walk with Restart (RWR) for suggesting new research collaborations. RESULTS By applying our models and techniques to RCNs at UAMS prior to and after the CTSA, we have gained valuable insights that not only reveal the temporal evolution of the network dynamics but also assess the effectiveness of the CTSA and its impact on a research institution. We find that collaboration networks at UAMS are not scale-free but small-world. Quantitative measures show that the RCNs at UAMS are moving towards favoring multidisciplinary research. Moreover, our link prediction model creates the basis of collaboration recommendations with an impressive accuracy (AUC: 0.990, MAP@3: 1.48 and MAP@5: 1.522). Last but not least, an open-source visual analytics tool for RCNs is being developed and released through GitHub. 
CONCLUSIONS Through this study, we have developed a set of techniques and tools for analyzing research collaboration networks and conducted a comprehensive case study focusing on a CTSA institution. Our findings demonstrate the promising future of these techniques and tools in understanding the generative mechanisms of research collaborations and helping identify beneficial collaborations to members in the research community. PMID:24560679
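    The Random Walk with Restart step used above for suggesting collaborations can be sketched in a few lines. The toy graph and restart probability below are illustrative; the paper's weighted RCNs would use collaboration strengths as edge weights:

```python
# Random Walk with Restart (RWR) on a small collaboration graph,
# sketched in plain Python by power iteration. The toy adjacency and
# restart probability are illustrative choices.

def rwr(adj, seed, restart=0.15, iters=200):
    """adj: dict node -> {neighbor: weight}. Returns proximity scores."""
    nodes = list(adj)
    p = {v: 1.0 if v == seed else 0.0 for v in nodes}
    for _ in range(iters):
        nxt = {v: (restart if v == seed else 0.0) for v in nodes}
        for u in nodes:
            total = sum(adj[u].values())
            for v, w in adj[u].items():
                nxt[v] += (1.0 - restart) * p[u] * w / total
        p = nxt
    return p

# Triangle a-b-c plus a pendant node d: nodes closer to the seed score higher.
graph = {
    "a": {"b": 1.0, "c": 1.0},
    "b": {"a": 1.0, "c": 1.0},
    "c": {"a": 1.0, "b": 1.0, "d": 1.0},
    "d": {"c": 1.0},
}
scores = rwr(graph, seed="a")
```

    Ranking the non-neighbors of a researcher by their RWR score from that researcher's node is one common way to turn these proximities into collaboration suggestions.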

  3. A motion-compensated image filter for low-dose fluoroscopy in a real-time tumor-tracking radiotherapy system

    PubMed Central

    Miyamoto, Naoki; Ishikawa, Masayori; Sutherland, Kenneth; Suzuki, Ryusuke; Matsuura, Taeko; Toramatsu, Chie; Takao, Seishin; Nihongi, Hideaki; Shimizu, Shinichi; Umegaki, Kikuo; Shirato, Hiroki

    2015-01-01

    In the real-time tumor-tracking radiotherapy system, a surrogate fiducial marker inserted in or near the tumor is detected by fluoroscopy to realize respiratory-gated radiotherapy. The imaging dose caused by fluoroscopy should be minimized. In this work, an image processing technique is proposed for tracing a moving marker in low-dose imaging. The proposed tracking technique is a combination of a motion-compensated recursive filter and template pattern matching. The proposed image filter can reduce motion artifacts resulting from the recursive process by determining the region of interest for the next frame according to the current marker position in the fluoroscopic images. The effectiveness of the proposed technique and the expected clinical benefit were examined by phantom experimental studies with actual tumor trajectories generated from clinical patient data. It was demonstrated that the marker motion could be traced in low-dose imaging by applying the proposed algorithm, with acceptable registration error and a high pattern recognition score in all trajectories, although some trajectories could not be tracked with conventional spatial filters or without image filters. The positional accuracy is expected to be kept within ±2 mm. The total computation time required to determine the marker position is a few milliseconds. The proposed image processing technique is applicable for imaging dose reduction. PMID:25129556
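    The two ingredients of the scheme, recursive temporal filtering and template matching restricted to a region of interest around the last marker position, can be sketched on 1-D toy signals (all sizes and values below are illustrative, not the clinical implementation):

```python
# Sketch of the two tracking ingredients: an exponential recursive
# filter for noise reduction, and SSD template matching restricted to
# a region of interest (ROI) around the previous marker position.
# 1-D toy signals are used for illustration.

def recursive_filter(frames, alpha=0.5):
    """Exponentially averaged frames: out = alpha*new + (1-alpha)*old."""
    acc = list(frames[0])
    for f in frames[1:]:
        acc = [alpha * x + (1.0 - alpha) * a for x, a in zip(f, acc)]
    return acc

def match_in_roi(signal, template, center, half_width):
    """Best template position (minimum SSD) inside an ROI around `center`."""
    lo = max(0, center - half_width)
    hi = min(len(signal) - len(template), center + half_width)
    best, best_ssd = lo, float("inf")
    for pos in range(lo, hi + 1):
        ssd = sum((signal[pos + i] - t) ** 2 for i, t in enumerate(template))
        if ssd < best_ssd:
            best, best_ssd = pos, ssd
    return best

template = [0.0, 1.0, 0.0]
signal = [0.0] * 12
signal[8] = 1.0                     # marker peak in the current frame
pos = match_in_roi(signal, template, center=6, half_width=3)
```

    Shifting the ROI (and, in the motion-compensated variant, the accumulated frame) to follow the predicted marker position is what keeps the recursive averaging from smearing a moving marker.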

  4. Filtered Rayleigh Scattering Measurements in a Buoyant Flow Field

    DTIC Science & Technology

    2008-03-01

    ENY/08-M22. Filtered Rayleigh Scattering (FRS) is a non-intrusive, laser-based flow characterization technique that consists of a narrow-linewidth laser, a molecular absorption filter, and a high-resolution camera behind the filter to record images. Gases of different species have different molecular scattering cross-sections that become apparent as they pass through the interrogating laser light source, and this difference is…

  5. Novel Digital Signal Processing and Detection Techniques.

    DTIC Science & Technology

    1980-09-01

    …decimation and interpolation [11, 12]. Submitted by: Bede Liu, Department of Electrical Engineering and Computer Science, Princeton University. …on the use of recursive filters for decimation and interpolation. …filter structure for realizing low-pass filters is developed [6, 7]. By employing decimation and interpolation, the filter uses only coefficients 0, +1, and…

  6. Kalman filter approach for uncertainty quantification in time-resolved laser-induced incandescence.

    PubMed

    Hadwin, Paul J; Sipkens, Timothy A; Thomson, Kevin A; Liu, Fengshan; Daun, Kyle J

    2018-03-01

    Time-resolved laser-induced incandescence (TiRe-LII) data can be used to infer spatially and temporally resolved volume fractions and primary particle size distributions of soot-laden aerosols, but these estimates are corrupted by measurement noise as well as uncertainties in the spectroscopic and heat transfer submodels used to interpret the data. Estimates of the temperature, concentration, and size distribution of soot primary particles within a sample aerosol are typically made by nonlinear regression of modeled spectral incandescence decay, or effective temperature decay, to experimental data. In this work, we employ nonstationary Bayesian estimation techniques to infer aerosol properties from simulated and experimental LII signals, specifically the extended Kalman filter and Schmidt-Kalman filter. These techniques exploit the time-varying nature of both the measurements and the models, and they reveal how uncertainty in the estimates computed from TiRe-LII data evolves over time. Both techniques perform better when compared with standard deterministic estimates; however, we demonstrate that the Schmidt-Kalman filter produces more realistic uncertainty estimates.
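    A minimal extended Kalman filter for a cooling temperature observed through a nonlinear signal can be sketched as follows. The quartic (Stefan-Boltzmann-like) measurement law, cooling constant, and noise levels are illustrative stand-ins for the TiRe-LII spectroscopic and heat-transfer submodels, not the paper's models:

```python
import math

# Toy extended Kalman filter: scalar temperature state with exponential
# cooling, observed through a nonlinear quartic signal model. All model
# constants are illustrative assumptions.

def ekf_track(measurements, dt=0.01, tau=0.5, t0=3000.0, p0=1e5,
              q=10.0, r=1e-3):
    a = math.exp(-dt / tau)          # linear cooling dynamics T_k = a*T_{k-1}
    T, P = t0, p0
    estimates = []
    for z in measurements:
        # Predict step
        T, P = a * T, a * a * P + q
        # Nonlinear measurement h(T) = (T/1000)^4, Jacobian H = dh/dT
        h = (T / 1000.0) ** 4
        H = 4.0 * T ** 3 / 1e12
        S = H * P * H + r                # innovation variance
        K = P * H / S                    # Kalman gain
        T = T + K * (z - h)
        P = (1.0 - K * H) * P
        estimates.append(T)
    return estimates
```

    Propagating P alongside T is what gives the time-evolving uncertainty estimate the abstract refers to; a Schmidt-Kalman variant would additionally carry the covariance of unestimated nuisance parameters.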

  7. Nonlinear filtering techniques for noisy geophysical data: Using big data to predict the future

    NASA Astrophysics Data System (ADS)

    Moore, J. M.

    2014-12-01

    Chaos is ubiquitous in physical systems. Within the Earth sciences it is readily evident in seismology, groundwater flows and drilling data. Models and workflows have been applied successfully to understand and even to predict chaotic systems in other scientific fields, including electrical engineering, neurology and oceanography. Unfortunately, the high levels of noise characteristic of our planet's chaotic processes often render these frameworks ineffective. This contribution presents techniques for the reduction of noise associated with measurements of nonlinear systems. Our ultimate aim is to develop data assimilation techniques for forward models that describe chaotic observations, such as episodic tremor and slip (ETS) events in fault zones. A series of nonlinear filters are presented and evaluated using classical chaotic systems. To investigate whether the filters can successfully mitigate the effect of noise typical of Earth science, they are applied to sunspot data. The filtered data can be used successfully to forecast sunspot evolution for up to eight years.
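    A minimal example of a nonlinear noise-reduction filter is the sliding-window median, which removes impulsive outliers that would defeat a linear smoother. The synthetic signal and spike pattern below are illustrative, not the contribution's filters:

```python
import math

# Sliding-window median filter, a simple nonlinear filter, applied to a
# smooth signal corrupted by impulsive outliers. The signal and spike
# amplitudes are toy values for illustration.

def median_filter(series, half=2):
    out = []
    for i in range(len(series)):
        lo, hi = max(0, i - half), min(len(series), i + half + 1)
        window = sorted(series[lo:hi])
        out.append(window[len(window) // 2])
    return out

clean = [math.sin(0.05 * i) for i in range(200)]
noisy = list(clean)
for i in range(5, 200, 17):
    noisy[i] += 2.0                  # isolated impulsive outliers
filtered = median_filter(noisy)
```

    Because the median ignores a single extreme value in each window, isolated spikes vanish almost entirely, whereas a moving average of the same width would only spread them out.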

  8. Kalman filter techniques for accelerated Cartesian dynamic cardiac imaging.

    PubMed

    Feng, Xue; Salerno, Michael; Kramer, Christopher M; Meyer, Craig H

    2013-05-01

    In dynamic MRI, spatial and temporal parallel imaging can be exploited to reduce scan time. Real-time reconstruction enables immediate visualization during the scan. Commonly used view-sharing techniques suffer from limited temporal resolution, and many of the more advanced reconstruction methods are either retrospective, time-consuming, or both. A Kalman filter model capable of real-time reconstruction can be used to increase the spatial and temporal resolution in dynamic MRI reconstruction. The original study describing the use of the Kalman filter in dynamic MRI was limited to non-Cartesian trajectories because of a limitation intrinsic to the dynamic model used in that study. Here the limitation is overcome, and the model is applied to the more commonly used Cartesian trajectory with fast reconstruction. Furthermore, a combination of the Kalman filter model with Cartesian parallel imaging is presented to further increase the spatial and temporal resolution and signal-to-noise ratio. Simulations and experiments were conducted to demonstrate that the Kalman filter model can increase the temporal resolution of the image series compared with view-sharing techniques and decrease the spatial aliasing compared with TGRAPPA. The method requires relatively little computation, and thus is suitable for real-time reconstruction. Copyright © 2012 Wiley Periodicals, Inc.

  9. Kalman Filter Techniques for Accelerated Cartesian Dynamic Cardiac Imaging

    PubMed Central

    Feng, Xue; Salerno, Michael; Kramer, Christopher M.; Meyer, Craig H.

    2012-01-01

    In dynamic MRI, spatial and temporal parallel imaging can be exploited to reduce scan time. Real-time reconstruction enables immediate visualization during the scan. Commonly used view-sharing techniques suffer from limited temporal resolution, and many of the more advanced reconstruction methods are either retrospective, time-consuming, or both. A Kalman filter model capable of real-time reconstruction can be used to increase the spatial and temporal resolution in dynamic MRI reconstruction. The original study describing the use of the Kalman filter in dynamic MRI was limited to non-Cartesian trajectories, because of a limitation intrinsic to the dynamic model used in that study. Here the limitation is overcome and the model is applied to the more commonly used Cartesian trajectory with fast reconstruction. Furthermore, a combination of the Kalman filter model with Cartesian parallel imaging is presented to further increase the spatial and temporal resolution and SNR. Simulations and experiments were conducted to demonstrate that the Kalman filter model can increase the temporal resolution of the image series compared with view sharing techniques and decrease the spatial aliasing compared with TGRAPPA. The method requires relatively little computation, and thus is suitable for real-time reconstruction. PMID:22926804

  10. Shift-phase code multiplexing technique for holographic memories and optical interconnection

    NASA Astrophysics Data System (ADS)

    Honma, Satoshi; Muto, Shinzo; Okamoto, Atsushi

    2008-03-01

    Holographic technologies for optical memories and interconnection devices have been studied actively because of their high storage capacity, many wiring patterns and high transmission rate. Among multiplexing techniques such as angular, phase-code and wavelength multiplexing, the speckle multiplexing technique has attracted attention due to its simple optical setup, which has an adjustable random phase filter in only one direction. To keep the construction simple and to suppress crosstalk among adjacent page data or wiring patterns for efficient holographic memories and interconnection, we have to consider the optimum randomness of the phase filter. High randomness expands the illumination area of the reference beam on the holographic medium. On the other hand, low randomness causes crosstalk between adjacent hologram data. We have proposed a method of holographic multiplexing, shift-phase code multiplexing, with a two-dimensional orthogonal matrix phase filter. A large number of orthogonal phase codes can be produced by shifting the phase filter in one direction, so individual holograms can be recorded and read with low crosstalk. We give basic experimental results on holographic data multiplexing and consider the phase pattern of the filter needed to sufficiently suppress the crosstalk between adjacent holograms.

  11. Signal Identification and Isolation Utilizing Radio Frequency Photonics

    DTIC Science & Technology

    2017-09-01

    …analyzers can measure the frequency of signals, and filters can be used to separate the signals from one another. This report will review different techniques for spectrum analysis and isolation. Subject terms: radio frequency, photonics, spectrum analyzer, filters.

  12. Stabilized Alkali-Metal Ultraviolet-Band-Pass Filters

    NASA Technical Reports Server (NTRS)

    Mardesich, Nick; Fraschetti, George A.; Mccann, Timothy; Mayall, Sherwood D.; Dunn, Donald E.; Trauger, John T.

    1995-01-01

    Layers of bismuth 5 to 10 angstroms thick are incorporated into alkali-metal ultraviolet-band-pass optical filters by use of advanced fabrication techniques. In the new filters, the bismuth layer helps to reduce surface migration of sodium. The sodium layer is thereby made more stable, with a decreased tendency to form pinholes by migration.

  13. Effective Multi-Query Expansions: Collaborative Deep Networks for Robust Landmark Retrieval.

    PubMed

    Wang, Yang; Lin, Xuemin; Wu, Lin; Zhang, Wenjie

    2017-03-01

    Given a query photo issued by a user (q-user), landmark retrieval returns a set of photos whose landmarks are similar to those of the query; existing studies on landmark retrieval focus on exploiting the geometries of landmarks for similarity matches between candidate photos and the query photo. We observe that the same landmark provided by different users over a social media community may convey different geometry information depending on the viewpoints and/or angles, and may consequently yield very different retrieval results. In fact, dealing with landmarks whose shapes are of low quality because of the photography of q-users is often nontrivial and has seldom been studied. In this paper, we propose a novel framework, namely multi-query expansions, to retrieve semantically robust landmarks in two steps. First, we identify the top-k photos regarding the latent topics of a query landmark to construct a multi-query set, so as to remedy its possibly low-quality shape. For this purpose, we significantly extend the techniques of Latent Dirichlet Allocation. Then, motivated by typical collaborative filtering methods, we propose to learn collaborative deep-network-based semantic, nonlinear, high-level features over the latent factors for landmark photos, using as the training set the result of matrix factorization over the collaborative user-photo matrix for the multi-query set. The learned deep network is further applied to generate the features for all the other photos, at the same time yielding a compact multi-query set within this feature space. The final ranking scores are then calculated in the high-level feature space between the multi-query set and all other photos, which are ranked to serve as the final ranking list of landmark retrieval. 
Extensive experiments are conducted on real-world social media data with both landmark photos together with their user information to show the superior performance over the existing methods, especially our recently proposed multi-query based mid-level pattern representation method [1].

  14. Weighted Optimization-Based Distributed Kalman Filter for Nonlinear Target Tracking in Collaborative Sensor Networks.

    PubMed

    Chen, Jie; Li, Jiahong; Yang, Shuanghua; Deng, Fang

    2017-11-01

    The identification of nonlinearity and coupling is crucial in the nonlinear target tracking problem in collaborative sensor networks. In the adaptive Kalman filtering (KF) method, the nonlinearity and coupling can be regarded as model noise covariance and estimated by minimizing the innovation or residual errors of the states. However, the method requires a large time window of data to achieve a reliable covariance measurement, making it impractical for rapidly changing nonlinear systems. To deal with this problem, a weighted optimization-based distributed KF algorithm (WODKF) is proposed in this paper. The algorithm enlarges the data size of each sensor with the measurements and state estimates received from its connected sensors instead of the time window. A new cost function is set as the weighted sum of the bias and oscillation of the state to estimate the "best" estimate of the model noise covariance. The bias and oscillation of the state of each sensor are estimated by polynomial fitting of a time window of state estimates and measurements of the sensor and its neighbors, weighted by the measurement noise covariance. The best estimate of the model noise covariance is computed by minimizing the weighted cost function using the exhaustive method. A sensor selection method is incorporated into the algorithm to decrease the computation load of the filter and increase the scalability of the sensor network. The existence, suboptimality and stability analysis of the algorithm are given. The local probability data association method is used in the proposed algorithm for the multitarget tracking case. The algorithm is demonstrated in simulations on tracking examples for a random signal, one nonlinear target, and four nonlinear targets. Results show the feasibility and superiority of WODKF against other filtering algorithms for a large class of systems.

  15. A Monte Carlo technique for signal level detection in implanted intracranial pressure monitoring.

    PubMed

    Avent, R K; Charlton, J D; Nagle, H T; Johnson, R N

    1987-01-01

    Statistical monitoring techniques like CUSUM, Trigg's tracking signal and EMP filtering have a major advantage over more recent techniques, such as Kalman filtering, because of their inherent simplicity. In many biomedical applications, such as electronic implantable devices, these simpler techniques have greater utility because of the reduced requirements on power, logic complexity and sampling speed. The determination of signal means using some of the earlier techniques is reviewed in this paper, and a new Monte Carlo based method that can sparsely sample a waveform and still obtain an accurate mean value is presented. This technique may find widespread use as a trend detection method when reduced power consumption is a requirement.
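    The core of such a Monte Carlo mean estimator, sparsely sampling a waveform at random and averaging, can be sketched as follows. The synthetic pressure-like waveform and the sample count are illustrative, not the paper's signals:

```python
import math
import random

# Monte Carlo estimate of a waveform's mean from a small number of
# random samples, the kind of low-power trend detector discussed
# above. Waveform and sample count are illustrative.

def monte_carlo_mean(waveform, n_samples, rng):
    """Estimate the waveform mean from n_samples random draws."""
    return sum(rng.choice(waveform) for _ in range(n_samples)) / n_samples

# Synthetic periodic waveform with true mean 10.0 (whole periods).
wave = [10.0 + 3.0 * math.sin(2.0 * math.pi * i / 100.0) for i in range(1000)]
rng = random.Random(0)
estimate = monte_carlo_mean(wave, 64, rng)   # close to the true mean
```

    The appeal for an implanted monitor is that only 64 samples (rather than all 1000) need to be acquired and summed, and the standard error still shrinks as 1/sqrt(n).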

  16. The Lockheed alternate partial polarizer universal filter

    NASA Technical Reports Server (NTRS)

    Title, A. M.

    1976-01-01

    A tunable birefringent filter using an alternate partial polarizer design has been built. The filter has a transmission of 38% in polarized light. Its full width at half maximum is 0.09 A at 5500 A. It is tunable from 4500 to 8500 A by means of stepping-motor-actuated rotating half-wave plates and polarizers. Wavelength commands and thermal compensation commands are generated by a PDP-11/10 minicomputer. The alternate partial polarizer universal filter is compared with the universal birefringent filter, and the design techniques, construction methods, and filter performance are discussed in some detail. Based on the experience with this filter, some conclusions regarding the future of birefringent filters are elaborated.

  17. Enhanced orbit determination filter: Inclusion of ground system errors as filter parameters

    NASA Technical Reports Server (NTRS)

    Masters, W. C.; Scheeres, D. J.; Thurman, S. W.

    1994-01-01

    The theoretical aspects of an orbit determination filter that incorporates ground-system error sources as model parameters for use in interplanetary navigation are presented in this article. This filter, which is derived from sequential filtering theory, allows a systematic treatment of errors in calibrations of transmission media, station locations, and earth orientation models associated with ground-based radio metric data, in addition to the modeling of the spacecraft dynamics. The discussion includes a mathematical description of the filter and an analytical comparison of its characteristics with more traditional filtering techniques used in this application. The analysis in this article shows that this filter has the potential to generate navigation products of substantially greater accuracy than more traditional filtering procedures.

  18. An adaptive technique for estimating the atmospheric density profile during the AE mission

    NASA Technical Reports Server (NTRS)

    Argentiero, P.

    1973-01-01

    A technique is presented for processing accelerometer data obtained during the AE missions in order to estimate the atmospheric density profile. A minimum-variance adaptive filter is utilized. The trajectory of the probe and the probe parameters are treated in a consider mode, in which their estimates are not improved but their associated uncertainties are permitted to influence filter behavior. Simulations indicate that the technique is effective in estimating a density profile to within a few percentage points.

  19. Binary optical filters for scale invariant pattern recognition

    NASA Technical Reports Server (NTRS)

    Reid, Max B.; Downie, John D.; Hine, Butler P.

    1992-01-01

    Binary synthetic discriminant function (BSDF) optical filters which are invariant to scale changes in the target object of more than 50 percent are demonstrated in simulation and experiment. Efficient databases of scale invariant BSDF filters can be designed which discriminate between two very similar objects at any view scaled over a factor of 2 or more. The BSDF technique has considerable advantages over other methods for achieving scale invariant object recognition, as it also allows determination of the object's scale. In addition to scale, the technique can be used to design recognition systems invariant to other geometric distortions.

  20. The constrained discrete-time state-dependent Riccati equation technique for uncertain nonlinear systems

    NASA Astrophysics Data System (ADS)

    Chang, Insu

    The objective of the thesis is to introduce a relatively general nonlinear controller/estimator synthesis framework using a special type of the state-dependent Riccati equation technique. The continuous time state-dependent Riccati equation (SDRE) technique is extended to discrete-time under input and state constraints, yielding constrained (C) discrete-time (D) SDRE, referred to as CD-SDRE. For the latter, stability analysis and calculation of a region of attraction are carried out. The derivation of the D-SDRE under state-dependent weights is provided. Stability of the D-SDRE feedback system is established using Lyapunov stability approach. Receding horizon strategy is used to take into account the constraints on D-SDRE controller. Stability condition of the CD-SDRE controller is analyzed by using a switched system. The use of CD-SDRE scheme in the presence of constraints is then systematically demonstrated by applying this scheme to problems of spacecraft formation orbit reconfiguration under limited performance on thrusters. Simulation results demonstrate the efficacy and reliability of the proposed CD-SDRE. The CD-SDRE technique is further investigated in a case where there are uncertainties in nonlinear systems to be controlled. First, the system stability under each of the controllers in the robust CD-SDRE technique is separately established. The stability of the closed-loop system under the robust CD-SDRE controller is then proven based on the stability of each control system comprising switching configuration. A high fidelity dynamical model of spacecraft attitude motion in 3-dimensional space is derived with a partially filled fuel tank, assumed to have the first fuel slosh mode. The proposed robust CD-SDRE controller is then applied to the spacecraft attitude control system to stabilize its motion in the presence of uncertainties characterized by the first fuel slosh mode. The performance of the robust CD-SDRE technique is discussed. 
Subsequently, filtering techniques are investigated by using the D-SDRE technique. Detailed derivation of the D-SDRE-based filter (D-SDREF) is provided under the assumption of Gaussian noises and the stability condition of the error signal between the measured signal and the estimated signals is proven to be input-to-state stable. For the non-Gaussian distributed noises, we propose a filter by combining the D-SDREF and the particle filter (PF), named the combined D-SDRE/PF. Two algorithms for the filtering techniques are provided. Several filtering techniques are compared with challenging numerical examples to show the reliability and efficacy of the proposed D-SDREF and the combined D-SDRE/PF.

  1. Modeling Adsorption Based Filters (Bio-remediation of Heavy Metal Contaminated Water)

    NASA Astrophysics Data System (ADS)

    McCarthy, Chris

    I will discuss kinetic models of adsorption, as well as models of filters based on those mechanisms. These mathematical models have been developed in support of our interdisciplinary lab group, which is centered at BMCC/CUNY (City University of New York). Our group conducts research into bio-remediation of heavy metal contaminated water via filtration. The filters are constructed out of biomass, such as spent tea leaves. The spent tea leaves are available in large quantities as a result of the industrial production of tea beverages. The heavy metals bond with the surfaces of the tea leaves (adsorption). The models involve differential equations, stochastic methods, and recursive functions. I will compare the models' predictions to data obtained from computer simulations and experimentally by our lab group. Funding: CUNY Collaborative Incentive Research Grant (Round 12); CUNY Research Scholars Program.
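    One kinetic model commonly used for biosorption of this kind is the pseudo-second-order law dq/dt = k(qe - q)^2, where q is the adsorbed amount and qe the equilibrium capacity. The sketch below integrates it numerically with illustrative parameter values, not fitted values from the lab group's data:

```python
# Euler integration of the pseudo-second-order adsorption kinetics
# model dq/dt = k*(qe - q)^2, a common choice for heavy-metal uptake
# on biosorbents such as spent tea leaves. The rate constant and
# capacity below are illustrative, not fitted values.

def adsorbed_amount(qe, k, t_end, dt=1e-3):
    """Adsorbed amount q(t_end) per unit sorbent mass, with q(0) = 0."""
    q, t = 0.0, 0.0
    while t < t_end:
        q += dt * k * (qe - q) ** 2
        t += dt
    return q

q5 = adsorbed_amount(qe=1.8, k=0.5, t_end=5.0)
```

    This model has the closed form q(t) = qe^2*k*t / (1 + qe*k*t), so the numerical solution can be checked against it directly, which is also how such kinetic fits are usually validated against batch experiment data.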

  2. Design Techniques for Uniform-DFT, Linear Phase Filter Banks

    NASA Technical Reports Server (NTRS)

    Sun, Honglin; DeLeon, Phillip

    1999-01-01

    Uniform-DFT filter banks are an important class of filter banks and their theory is well known. One notable characteristic is their very efficient implementation using polyphase filters and the FFT. Separately, linear phase filter banks, i.e., filter banks in which the analysis filters have linear phase, are also an important class of filter banks and are desired in many applications. Unfortunately, it has been proved that one cannot design critically-sampled, uniform-DFT, linear phase filter banks that achieve perfect reconstruction. In this paper, we present a least-squares solution to this problem and, in addition, prove that oversampled, uniform-DFT, linear phase filter banks (which are also useful in many applications) can be constructed for perfect reconstruction. Design examples are included to illustrate the methods.
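
    The efficient implementation mentioned above rests on the identity that an M-channel uniform-DFT analysis bank equals M polyphase filters followed by an M-point DFT. A small pure-Python sketch, checked against direct filtering with the modulated filters (a direct DFT stands in for the FFT; the prototype filter and signal are arbitrary test data):

```python
import math
import cmath

def analysis_uniform_dft(x, h, M):
    """Polyphase implementation of an M-channel uniform-DFT analysis bank.

    Channel k uses the modulated filter h_k[m] = h[m] * exp(2j*pi*k*m/M);
    outputs are decimated by M.  Returns y[k][n]."""
    poly = [h[r::M] for r in range(M)]      # polyphase parts h_r[q] = h[q*M+r]
    n_out = len(x) // M
    y = [[0j] * n_out for _ in range(M)]
    for n in range(n_out):
        # Branch signals v_r[n] = sum_q h_r[q] * x[(n-q)*M - r]
        v = []
        for r in range(M):
            acc = 0j
            for q, hq in enumerate(poly[r]):
                idx = (n - q) * M - r
                if 0 <= idx < len(x):
                    acc += hq * x[idx]
            v.append(acc)
        # Combine branches with an M-point DFT-type sum:
        # y_k[n] = sum_r v_r[n] * exp(2j*pi*k*r/M)
        for k in range(M):
            y[k][n] = sum(v[r] * cmath.exp(2j * cmath.pi * k * r / M)
                          for r in range(M))
    return y

def analysis_direct(x, h, M):
    """Reference: filter with each modulated filter, then decimate by M."""
    y = []
    for k in range(M):
        hk = [h[m] * cmath.exp(2j * cmath.pi * k * m / M) for m in range(len(h))]
        yk = [sum(hk[m] * x[n * M - m]
                  for m in range(len(hk)) if 0 <= n * M - m < len(x))
              for n in range(len(x) // M)]
        y.append(yk)
    return y

x = [math.sin(0.3 * i) for i in range(32)]
h = [0.25] * 8                     # toy length-8 prototype, M = 4 channels
ya = analysis_uniform_dft(x, h, 4)
yb = analysis_direct(x, h, 4)
```
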

  3. Quantitative filter technique measurements of spectral light absorption by aquatic particles using a portable integrating cavity absorption meter (QFT-ICAM).

    PubMed

    Röttgers, Rüdiger; Doxaran, David; Dupouy, Cecile

    2016-01-25

    The accurate determination of light absorption coefficients of particles in water, especially in very oligotrophic oceanic areas, is still a challenging task. Concentrating aquatic particles on a glass fiber filter and using the Quantitative Filter Technique (QFT) is common practice. Its routine application is limited by the necessary use of high-performance spectrophotometers, distinct problems induced by the strong scattering of the filters, and artifacts induced by freezing and storing samples. Measuring the sample inside a large integrating sphere reduces scattering effects, and direct field measurements avoid artifacts due to sample preservation. A small, portable integrating cavity absorption meter setup (QFT-ICAM) is presented that allows rapid measurements of a sample filter. The measurement technique takes into account artifacts due to chlorophyll-a fluorescence. The QFT-ICAM is shown to be highly comparable to similar measurements in laboratory spectrophotometers in terms of accuracy, precision, and path-length amplification effects. No spectral artifacts were observed when compared to measurements of samples in suspension, whereas freezing and storing of sample filters induced small losses of water-soluble pigments (probably phycoerythrins). Remaining problems in determining the particulate absorption coefficient with the QFT-ICAM are strong sample-to-sample variations of the path-length amplification, as well as fluorescence by pigments that is emitted in a different spectral region than that of chlorophyll-a.

  4. Comparison of Nonlinear Filtering Techniques for Lunar Surface Roving Navigation

    NASA Technical Reports Server (NTRS)

    Kimber, Lemon; Welch, Bryan W.

    2008-01-01

    Leading up to the Apollo missions, the Extended Kalman Filter, a modified version of the Kalman Filter, was developed to estimate the state of a nonlinear system. Throughout the Apollo missions, Potter's Square Root Filter was used for lunar navigation. Now that NASA is returning to the Moon, the filters used during the Apollo missions must be compared to the filters that have been developed since that time: the Bierman-Thornton (UD) Filter and the Unscented Kalman Filter (UKF). The UD Filter factors the covariance matrix into UDU^T and has accuracy similar to the Square Root Filter while requiring less computation time. Conversely, the UKF, which uses sigma points, is much more computationally intensive than any of the other filters, but it produces the most accurate results. The Extended Kalman Filter, Potter's Square Root Filter, the Bierman-Thornton UD Filter, and the Unscented Kalman Filter each prove to be the most accurate filter depending on the specific conditions of the navigation system.
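
    The UD factorization at the heart of the Bierman-Thornton filter can be shown compactly: a covariance matrix P is factored as P = U D U^T with U unit upper-triangular and D diagonal, and the filter then propagates U and D instead of P. A minimal sketch (the test matrix is chosen arbitrarily):

```python
def udu_factorize(P):
    """Factor a symmetric positive-definite matrix as P = U * D * U^T with
    U unit upper-triangular and D diagonal (the form used by UD filters)."""
    n = len(P)
    U = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    D = [0.0] * n
    for j in range(n - 1, -1, -1):          # work from the last column back
        D[j] = P[j][j] - sum(U[j][k] ** 2 * D[k] for k in range(j + 1, n))
        for i in range(j):
            U[i][j] = (P[i][j] - sum(U[i][k] * U[j][k] * D[k]
                                     for k in range(j + 1, n))) / D[j]
    return U, D

def udu_multiply(U, D):
    """Rebuild P = U * D * U^T (used here to check the factorization)."""
    n = len(D)
    return [[sum(U[i][k] * D[k] * U[j][k] for k in range(n))
             for j in range(n)] for i in range(n)]
```
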

  5. Improving the quality of reconstructed X-ray CT images of polymer gel dosimeters: zero-scan coupled with adaptive mean filtering.

    PubMed

    Kakakhel, M B; Jirasek, A; Johnston, H; Kairn, T; Trapp, J V

    2017-03-01

    This study evaluated the feasibility of combining the 'zero-scan' (ZS) X-ray computed tomography (CT) based polymer gel dosimeter (PGD) readout with adaptive mean (AM) filtering for improving the signal to noise ratio (SNR), and compared these results with available average scan (AS) X-ray CT readout techniques. NIPAM PGDs were manufactured, irradiated with 6 MV photons, CT imaged and processed in Matlab. An AM filter, run for two iterations with 3 × 3 and 5 × 5 pixel kernels, was used in two scenarios: (a) the CT images were subjected to AM filtering (pre-processing) and these were further employed to generate AS and ZS gel images, and (b) the AS and ZS images were first reconstructed from the CT images and then AM filtering was carried out (post-processing). SNR was computed in a 30 × 30 pixel ROI for the different pre- and post-processing cases. Results showed that the ZS technique combined with AM filtering resulted in improved SNR. Using the previously recommended 25 images for reconstruction, the ZS pre-processed protocol can give an increase of 44% and 80% in SNR for the 3 × 3 and 5 × 5 kernel sizes respectively. However, post-processing with both techniques and filter sizes introduced blur and a reduction in spatial resolution. Based on this work, it is possible to recommend that the ZS method be combined with pre-processed AM filtering using an appropriate kernel size to produce a large increase in the SNR of the reconstructed PGD images.
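
    The paper's exact AM filter is not reproduced here; a standard adaptive local noise-reduction filter, which shrinks each pixel toward its local mean in proportion to the ratio of an assumed noise variance to the local variance, is a reasonable stand-in sketch:

```python
def adaptive_mean_filter(img, noise_var, k=3):
    """Adaptive local noise-reduction filter on a 2-D list `img`:
    out = g - min(1, noise_var / local_var) * (g - local_mean),
    computed over an odd k x k window (k=3 gives a 3x3 kernel)."""
    h, w, r = len(img), len(img[0]), k // 2
    out = [[0.0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            window = [img[y][x]
                      for y in range(max(0, i - r), min(h, i + r + 1))
                      for x in range(max(0, j - r), min(w, j + r + 1))]
            m = sum(window) / len(window)
            var = sum((v - m) ** 2 for v in window) / len(window)
            ratio = min(1.0, noise_var / var) if var > 0 else 1.0
            out[i][j] = img[i][j] - ratio * (img[i][j] - m)
    return out
```

    In flat regions (local variance at or below the noise level) the filter reduces to a plain k x k mean; near strong edges it leaves the pixel nearly untouched, which is why repeated application blurs less than a plain mean filter.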

  6. Terminal homing position estimation for autonomous underwater vehicle docking

    DTIC Science & Technology

    2017-06-01

    used by the AUV to improve its position estimate. Due to the nonlinearity of the D-USBL measurements, the Extended Kalman Filter (EKF), Unscented Kalman Filter (UKF), and forward and backward smoothing (FBS) filter were utilized to estimate the position of the AUV. After the performance of these filters was deemed unsatisfactory, a new smoothing technique called Moving Horizon Estimation (MHE) with epi-splines was introduced. The MHE

  7. Filtering in Hybrid Dynamic Bayesian Networks

    NASA Technical Reports Server (NTRS)

    Andersen, Morten Nonboe; Andersen, Rasmus Orum; Wheeler, Kevin

    2000-01-01

    We implement a 2-time slice dynamic Bayesian network (2T-DBN) framework and make a 1-D state estimation simulation, an extension of the experiment in (v.d. Merwe et al., 2000), and compare different filtering techniques. Furthermore, we demonstrate experimentally that inference in a complex hybrid DBN is possible by simulating fault detection in a watertank system, an extension of the experiment in (Koller & Lerner, 2000) using a hybrid 2T-DBN. In both experiments, we perform approximate inference using standard filtering techniques, Monte Carlo methods, and combinations of these. In the watertank simulation, we also demonstrate the use of 'non-strict' Rao-Blackwellisation. We show that the unscented Kalman filter (UKF) and the UKF in a particle filtering framework outperform the generic particle filter, the extended Kalman filter (EKF), and the EKF in a particle filtering framework with respect to accuracy in terms of estimation RMSE and sensitivity with respect to choice of network structure. In particular, we demonstrate the superiority of the UKF in a PF framework when our beliefs about how the data were generated are wrong. Furthermore, we investigate the influence of data noise in the watertank simulation using the UKF and PFUKD and show that the algorithms are more sensitive to changes in the measurement noise level than in the process noise level. Theory and implementation are based on (v.d. Merwe et al., 2000).
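
    The unscented transform underlying the UKF can be illustrated in one dimension: for a quadratic nonlinearity with a Gaussian input it recovers the exact mean and variance, while EKF-style linearization misses the mean shift entirely. A sketch with the scaling parameters set to common textbook values (not necessarily the paper's settings):

```python
import math

def unscented_transform_1d(mu, var, g, alpha=1.0, beta=0.0, kappa=2.0):
    """Propagate a scalar Gaussian N(mu, var) through a nonlinearity g
    using the unscented transform (state dimension n = 1)."""
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    spread = math.sqrt((n + lam) * var)
    sigma_points = [mu, mu + spread, mu - spread]
    wm = [lam / (n + lam), 1 / (2 * (n + lam)), 1 / (2 * (n + lam))]
    wc = [wm[0] + (1 - alpha ** 2 + beta)] + wm[1:]
    ys = [g(x) for x in sigma_points]
    y_mean = sum(w * y for w, y in zip(wm, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(wc, ys))
    return y_mean, y_var

def linearized_1d(mu, var, g, dg):
    """EKF-style first-order propagation of the same Gaussian."""
    return g(mu), dg(mu) ** 2 * var
```

    For x ~ N(2, 0.25) and g(x) = x^2 the true mean is mu^2 + var = 4.25 and the true variance is 4*mu^2*var + 2*var^2 = 4.125; the unscented transform reproduces both, whereas linearization reports a mean of 4.0.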

  8. Aligning Collaborative and Culturally Responsive Evaluation Approaches

    ERIC Educational Resources Information Center

    Askew, Karyl; Beverly, Monifa Green; Jay, Michelle L.

    2012-01-01

    The authors, three African-American women trained as collaborative evaluators, offer a comparative analysis of collaborative evaluation (O'Sullivan, 2004) and culturally responsive evaluation approaches (Frierson, Hood, & Hughes, 2002; Kirkhart & Hopson, 2010). Collaborative evaluation techniques immerse evaluators in the cultural milieu…

  9. Thermographic image analysis for classification of ACL rupture disease, bone cancer, and feline hyperthyroid, with Gabor filters

    NASA Astrophysics Data System (ADS)

    Alvandipour, Mehrdad; Umbaugh, Scott E.; Mishra, Deependra K.; Dahal, Rohini; Lama, Norsang; Marino, Dominic J.; Sackman, Joseph

    2017-05-01

    Thermography and pattern classification techniques are used to classify three different pathologies in veterinary images. Thermographic images of both normal and diseased animals were provided by the Long Island Veterinary Specialists (LIVS). The three pathologies are ACL rupture disease, bone cancer, and feline hyperthyroid. The diagnosis of these diseases usually involves radiology and laboratory tests, while the method that we propose uses thermographic images and image analysis techniques and is intended for use as a prescreening tool. Images in each category of pathology are first filtered by Gabor filters, and then various features are extracted and used for classification into normal and abnormal classes. Gabor filters are linear filters that can be characterized by the two parameters wavelength λ and orientation θ. With two different wavelengths and five different orientations, a total of ten different filters were studied. Different combinations of camera views, filters, feature vectors, normalization methods, and classification methods produce different tests; each test was examined and its sensitivity, specificity and success rate were produced. Using the Gabor features alone, sensitivity, specificity, and overall success rates of 85% were achieved for each of the pathologies.
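
    A Gabor kernel parameterized by wavelength λ and orientation θ, as described above, can be generated directly; two wavelengths times five orientations yields the ten-filter bank the abstract mentions. The envelope parameters sigma and gamma and the concrete wavelengths below are illustrative assumptions:

```python
import math

def gabor_kernel(wavelength, theta, sigma=2.0, gamma=0.5, size=9):
    """Real part of a 2-D Gabor filter: a Gaussian envelope times a cosine
    carrier with the given wavelength (pixels) and orientation (radians)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            # Rotate coordinates into the filter's orientation.
            xr = x * math.cos(theta) + y * math.sin(theta)
            yr = -x * math.sin(theta) + y * math.cos(theta)
            envelope = math.exp(-(xr ** 2 + gamma ** 2 * yr ** 2)
                                / (2 * sigma ** 2))
            carrier = math.cos(2 * math.pi * xr / wavelength)
            row.append(envelope * carrier)
        kernel.append(row)
    return kernel

# Two wavelengths x five orientations -> a ten-filter bank.
bank = [gabor_kernel(lam, i * math.pi / 5)
        for lam in (4.0, 8.0) for i in range(5)]
```
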

  10. Pixelated filters for spatial imaging

    NASA Astrophysics Data System (ADS)

    Mathieu, Karine; Lequime, Michel; Lumeau, Julien; Abel-Tiberini, Laetitia; Savin De Larclause, Isabelle; Berthon, Jacques

    2015-10-01

    Small satellites are often used by space agencies to meet scientific mission requirements. Their payloads are composed of various instruments collecting an increasing amount of data while respecting growing constraints on volume and mass, so small integrated cameras have taken a favored place among these instruments. To ensure scene-specific color information sensing, pixelated filters seem more attractive than filter wheels. The work presented here, in collaboration with Institut Fresnel, deals with the manufacturing of this kind of component, based on thin-film technologies and photolithography processes. CCD detectors with a pixel pitch of about 30 μm were considered. In the configuration where the matrix filters are positioned closest to the detector, the matrix filters are composed of 2 × 2 macro pixels (i.e., 4 filters). These 4 filters have a bandwidth of about 40 nm and are respectively centered at 550, 700, 770 and 840 nm, with a specific rejection rate defined over the visible spectral range [500 - 900 nm]. After an intensive design step, 4 thin-film structures were elaborated with a maximum thickness of 5 μm. A run of tests allowed us to choose the optimal micro-structuring parameters. The 100 × 100 matrix filter prototypes have been successfully manufactured with lift-off and ion-assisted deposition processes. High spatial and spectral characterization, with a dedicated metrology bench, showed that the initial specifications and simulations were globally met. These excellent performances knock down the technological barriers for high-end integrated multispectral imaging.

  11. Proposing a Wiki-Based Technique for Collaborative Essay Writing (Propuesta de un modelo pedagógico para la escritura colaborativa de ensayos en un entorno virtual wiki)

    ERIC Educational Resources Information Center

    Ortiz Navarrete, Mabel; Ferreira Cabrera, Anita

    2014-01-01

    This paper aims at proposing a technique for students learning English as a foreign language when they collaboratively write an argumentative essay in a wiki environment. A wiki environment and collaborative work play an important role within the academic writing task. Nevertheless, an appropriate and systematic work assignment is required in…

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirokawa, Takako; /U. Colorado, Boulder /SLAC

    In this paper, we examine data acquisition in a high harmonic generation (HHG) lab and preliminary data analysis with the Cyclohexadiene Collaboration at the Linac Coherent Light Source (LCLS) at SLAC National Accelerator Laboratory. HHG experiments have a large number of parameters that need to be monitored constantly. In particular, the pressure of the target is critical to HHG yield. However, this pressure can fluctuate wildly, and without a tool to monitor it, it is difficult to analyze the correlation between HHG yield and the pressure. I used the Arduino microcontroller board and created a complementary MATLAB graphical user interface (GUI), thereby enhancing the ease with which users can acquire time-stamped parameter data. Using the Arduino, it is much easier to match the pressure to the corresponding HHG yield. Collecting data by using the Arduino and the GUI is flexible, user-friendly, and cost-effective. In the future, we hope to be able to control and monitor parts of the lab with the Arduino alone. While more parameter information is needed in the HHG lab, we needed to reduce the amount of data during the cyclohexadiene collaboration. This was achieved by sorting the data into bins and filtering out unnecessary details. This method was highly effective in that it minimized the amount of data without losing any valuable information. This effective preliminary data analysis technique will continue to be used to decrease the size of the collected data.
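
    The bin-and-filter reduction step described above can be sketched minimally: time-stamped samples are grouped into fixed-width bins and each bin is summarized by its average. The bin width and data below are hypothetical:

```python
def bin_reduce(samples, bin_width):
    """Reduce a stream of (timestamp, value) samples by averaging the
    values that fall in each fixed-width timestamp bin; returns a mapping
    from bin start time to the bin's mean value."""
    bins = {}
    for t, v in samples:
        key = int(t // bin_width)
        bins.setdefault(key, []).append(v)
    return {key * bin_width: sum(vs) / len(vs)
            for key, vs in sorted(bins.items())}
```
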

  13. Multidimensional deconvolution of optical microscope and ultrasound imaging using adaptive least-mean-square (LMS) inverse filtering

    NASA Astrophysics Data System (ADS)

    Sapia, Mark Angelo

    2000-11-01

    Three-dimensional microscope images typically suffer from reduced resolution due to the effects of convolution, optical aberrations, and out-of-focus blurring. Two-dimensional ultrasound images are also degraded by convolutional blurring and various sources of noise; speckle noise is a major problem in ultrasound images. In microscopy and ultrasound, various methods of digital filtering have been used to improve image quality. Several methods of deconvolution filtering have been used to improve resolution by reversing the convolutional effects, many of which are based on regularization techniques and non-linear constraints. The technique discussed here is a unique linear filter for deconvolving 3D fluorescence microscopy or 2D ultrasound images. The process solves for the filter entirely in the spatial domain, using an adaptive algorithm to converge to an optimum solution for de-blurring and resolution improvement. There are two key advantages of using an adaptive solution: (1) it efficiently solves for the filter coefficients by taking into account all sources of noise and degraded resolution at the same time, and (2) it achieves near-perfect convergence to the ideal linear deconvolution filter. This linear adaptive technique has other advantages, such as avoiding artifacts of frequency-domain transformations and concurrent adaptation to suppress noise. Ultimately, this approach results in better signal-to-noise characteristics with virtually no edge-ringing. Many researchers have not adopted linear techniques because of poor convergence, noise instability, and negative-valued data in the results. The methods presented here overcome many of these well-documented disadvantages and provide results that clearly out-perform other linear methods and may also out-perform regularization and constrained algorithms. In particular, the adaptive solution is most responsible for overcoming the poor performance associated with linear techniques.
This linear adaptive approach to deconvolution is demonstrated with results of restoring blurred phantoms for both microscopy and ultrasound and restoring 3D microscope images of biological cells and 2D ultrasound images of human subjects (courtesy of General Electric and Diasonics, Inc.).
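
    The adaptive spatial-domain deconvolution idea can be sketched in 1-D: a normalized-LMS FIR filter adapts so that filtering the blurred signal reproduces the original. The blur kernel, step size, and tap count below are illustrative, not the dissertation's settings:

```python
import random

def nlms_inverse_filter(blurred, desired, n_taps=8, mu=0.5, eps=1e-8):
    """Adapt FIR weights w by normalized LMS so that (w * blurred)[n]
    tracks desired[n]; a 1-D sketch of adaptive deconvolution."""
    w = [0.0] * n_taps
    for n in range(n_taps - 1, len(blurred)):
        x = blurred[n - n_taps + 1:n + 1][::-1]      # newest sample first
        y = sum(wi * xi for wi, xi in zip(w, x))     # filter output
        e = desired[n] - y                           # error to the target
        g = mu * e / (sum(xi * xi for xi in x) + eps)
        w = [wi + g * xi for wi, xi in zip(w, x)]    # NLMS weight update
    return w

def fir_apply(w, x, n):
    return sum(wi * x[n - i] for i, wi in enumerate(w) if n - i >= 0)

# Demo: blur a random sequence with h = [1, 0.5], then learn its inverse.
rng = random.Random(1)
s = [rng.choice((-1.0, 1.0)) for _ in range(2000)]
blurred = [s[n] + 0.5 * s[n - 1] if n else s[0] for n in range(len(s))]
w = nlms_inverse_filter(blurred, s)
mse = sum((fir_apply(w, blurred, n) - s[n]) ** 2
          for n in range(1800, 2000)) / 200
```

    The learned weights approximate the truncated series inverse of the blur, 1 - 0.5 z^-1 + 0.25 z^-2 - ..., which is the kind of optimum linear solution the adaptive process converges to.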

  14. Standardization of search methods for guideline development: an international survey of evidence-based guideline development groups.

    PubMed

    Deurenberg, Rikie; Vlayen, Joan; Guillo, Sylvie; Oliver, Thomas K; Fervers, Beatrice; Burgers, Jako

    2008-03-01

    Effective literature searching is particularly important for clinical practice guideline development. Sophisticated searching and filtering mechanisms are needed to help ensure that all relevant research is reviewed. To assess the methods used for the selection of evidence for guideline development by evidence-based guideline development organizations. A semistructured questionnaire assessing the databases, search filters and evaluation methods used for literature retrieval was distributed to eight major organizations involved in evidence-based guideline development. All of the organizations used search filters as part of guideline development. The MEDLINE database was the primary source accessed for literature retrieval. The OVID or SilverPlatter interfaces were used in preference to the freely accessed PubMed interface. The Cochrane Library, EMBASE, CINAHL and PsycINFO databases were also frequently used by the organizations. All organizations reported the intention to improve and validate their filters for finding literature specifically relevant for guidelines. In the first international survey of its kind, eight major guideline development organizations indicated a strong interest in identifying, improving and standardizing search filters to improve guideline development. It is to be hoped that this will result in the standardization of, and open access to, search filters, an improvement in literature searching outcomes and greater collaboration among guideline development organizations.

  15. Collaborative emitter tracking using Rao-Blackwellized random exchange diffusion particle filtering

    NASA Astrophysics Data System (ADS)

    Bruno, Marcelo G. S.; Dias, Stiven S.

    2014-12-01

    We introduce in this paper the fully distributed, random exchange diffusion particle filter (ReDif-PF) to track a moving emitter using multiple received signal strength (RSS) sensors. We consider scenarios with both known and unknown sensor model parameters. In the unknown parameter case, a Rao-Blackwellized (RB) version of the random exchange diffusion particle filter, referred to as the RB ReDif-PF, is introduced. In a simulated scenario with a partially connected network, the proposed ReDif-PF outperformed a PF tracker that assimilates local neighboring measurements only and also outperformed a linearized random exchange distributed extended Kalman filter (ReDif-EKF). Furthermore, the novel ReDif-PF matched the tracking error performance of alternative suboptimal distributed PFs based respectively on iterative Markov chain move steps and selective average gossiping with an inter-node communication cost that is roughly two orders of magnitude lower than the corresponding cost for the Markov chain and selective gossip filters. Compared to a broadcast-based filter which exactly mimics the optimal centralized tracker or its equivalent (exact) consensus-based implementations, ReDif-PF showed a degradation in steady-state error performance. However, compared to the optimal consensus-based trackers, ReDif-PF is better suited for real-time applications since it does not require iterative inter-node communication between measurement arrivals.

  16. Optimum coding techniques for MST radars

    NASA Technical Reports Server (NTRS)

    Sulzer, M. P.; Woodman, R. F.

    1986-01-01

    The optimum coding technique for MST (mesosphere stratosphere troposphere) radars is that which gives the lowest possible sidelobes in practice and can be implemented without too much computing power. Coding techniques are described in Farley (1985). A technique mentioned briefly there but not fully developed and not in general use is discussed here. This is decoding by means of a filter which is not matched to the transmitted waveform, in order to reduce sidelobes below the level obtained with a matched filter. This is the first part of the technique discussed here; the second part consists of measuring the transmitted waveform and using it as the basis for the decoding filter, thus reducing errors due to imperfections in the transmitter. There are two limitations to this technique. The first is a small loss in signal to noise ratio (SNR), which usually is not significant. The second problem is related to incomplete information received at the lowest ranges. An appendix shows a technique for handling this problem. Finally, it is shown that the use of complementary codes on transmission and nonmatched decoding gives the lowest possible sidelobe level and the minimum loss in SNR due to mismatch.
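
    The closing claim about complementary codes can be demonstrated directly: the aperiodic autocorrelations of a Golay complementary pair sum to zero at every nonzero lag, so summing the two matched-filter outputs cancels range sidelobes exactly. A sketch using the standard recursive construction:

```python
def autocorr(code):
    """Aperiodic autocorrelation of a code (list of +/-1 chips)."""
    n = len(code)
    return [sum(code[i] * code[i + k] for i in range(n - k))
            for k in range(n)]

def golay_pair(m):
    """Build a complementary (Golay) pair of length 2**m by the standard
    recursion: (a, b) -> (a ++ b, a ++ (-b))."""
    a, b = [1], [1]
    for _ in range(m):
        a, b = a + b, a + [-x for x in b]
    return a, b

a, b = golay_pair(3)              # a length-8 complementary pair
ra, rb = autocorr(a), autocorr(b)
```

    Summing ra and rb leaves all the energy at zero lag and nothing elsewhere, which is the zero-sidelobe property exploited when complementary codes are transmitted and matched-filtered on reception.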

  17. SkyMapper Filter Set: Design and Fabrication of Large-Scale Optical Filters

    NASA Astrophysics Data System (ADS)

    Bessell, Michael; Bloxham, Gabe; Schmidt, Brian; Keller, Stefan; Tisserand, Patrick; Francis, Paul

    2011-07-01

    The SkyMapper Southern Sky Survey will be conducted from Siding Spring Observatory with u, v, g, r, i, and z filters that comprise glued glass combination filters with dimensions of 309 × 309 × 15 mm. In this article we discuss the rationale for our bandpasses and physical characteristics of the filter set. The u, v, g, and z filters are entirely glass filters, which provide highly uniform bandpasses across the complete filter aperture. The i filter uses glass with a short-wave pass coating, and the r filter is a complete dielectric filter. We describe the process by which the filters were constructed, including the processes used to obtain uniform dielectric coatings and optimized narrowband antireflection coatings, as well as the technique of gluing the large glass pieces together after coating using UV transparent epoxy cement. The measured passbands, including extinction and CCD QE, are presented.

  18. Endobronchial Forceps-Assisted and Excimer Laser-Assisted Inferior Vena Cava Filter Removal: The Data, Where We Are, and How It Is Done.

    PubMed

    Chen, James X; Montgomery, Jennifer; McLennan, Gordon; Stavropoulos, S William

    2018-06-01

    The recognition of inferior vena cava filter-related complications has motivated increased attentiveness in the clinical follow-up of patients with inferior vena cava filters and has led to the development of multiple approaches for retrieving filters that are challenging or impossible to remove using conventional techniques. Endobronchial forceps and excimer lasers are tools designed to aid in complex inferior vena cava filter removals. This article discusses endobronchial forceps-assisted and excimer laser-assisted inferior vena cava filter retrievals. Copyright © 2018 Elsevier Inc. All rights reserved.

  19. Effect of high latitude filtering on NWP skill

    NASA Technical Reports Server (NTRS)

    Kalnay, E.; Takacs, L. L.; Hoffman, R. N.

    1984-01-01

    The high latitude filtering techniques commonly employed in global grid point models to eliminate the high frequency waves associated with the convergence of meridians can introduce serious distortions which ultimately affect the solution at all latitudes. Experiments completed so far with the 4 deg x 5 deg, 9-level GLAS Fourth Order Model indicate that the high latitude filter currently in operation affects its forecasting skill only minimally. In one case, however, the use of a pressure gradient filter significantly improved the forecast. Three-day forecasts with the pressure gradient and operational filters are compared, as are 5-day forecasts with no filter.

  20. Frequency-selective quantitation of short-echo time 1H magnetic resonance spectra

    NASA Astrophysics Data System (ADS)

    Poullet, Jean-Baptiste; Sima, Diana M.; Van Huffel, Sabine; Van Hecke, Paul

    2007-06-01

    Accurate and efficient filtering techniques are required to suppress large nuisance components present in short-echo time magnetic resonance (MR) spectra. This paper discusses two powerful filtering techniques used in long-echo time MR spectral quantitation, the maximum-phase FIR filter (MP-FIR) and the Hankel-Lanczos Singular Value Decomposition with Partial ReOrthogonalization (HLSVD-PRO), and shows that they can be applied to their more complex short-echo time spectral counterparts. Both filters are validated and compared through extensive simulations. Their properties are discussed. In particular, the capability of MP-FIR for dealing with macromolecular components is emphasized. Although this property does not make a large difference for long-echo time MR spectra, it can be important when quantifying short-echo time spectra.

  1. Micromechanical Signal Processors

    NASA Astrophysics Data System (ADS)

    Nguyen, Clark Tu-Cuong

    Completely monolithic high-Q micromechanical signal processors constructed of polycrystalline silicon and integrated with CMOS electronics are described. The signal processors implemented include an oscillator, a bandpass filter, and a mixer + filter--all of which are components commonly required for up- and down-conversion in communication transmitters and receivers, and all of which take full advantage of the high Q of micromechanical resonators. Each signal processor is designed, fabricated, then studied with particular attention to the performance consequences associated with miniaturization of the high-Q element. The fabrication technology which realizes these components merges planar integrated circuit CMOS technologies with those of polysilicon surface micromachining. The technologies are merged in a modular fashion, where the CMOS is processed in the first module, the microstructures in a following separate module, and at no point in the process sequence are steps from each module intermixed. Although the advantages of such modularity include flexibility in accommodating new module technologies, the developed process constrained the CMOS metallization to a high temperature refractory metal (tungsten metallization with TiSi2 contact barriers) and constrained the micromachining process to long-term temperatures below 835 °C. Rapid-thermal annealing (RTA) was used to relieve residual stress in the mechanical structures. To reduce the complexity involved with developing this merged process, capacitively transduced resonators are utilized. High-Q single resonator and spring-coupled micromechanical resonator filters are also investigated, with particular attention to noise performance, bandwidth control, and termination design. The noise in micromechanical filters is found to be fairly high due to poor electromechanical coupling on the micro-scale with present-day technologies.
Solutions to this high series resistance problem are suggested, including smaller electrode-to-resonator gaps to increase the coupling capacitance. Active Q-control techniques are demonstrated which control the bandwidth of micromechanical filters and simulate filter terminations with little passband distortion. Noise analysis shows that these active techniques are relatively quiet when compared with other resistive techniques. Modulation techniques are investigated whereby a single resonator or a filter constructed from several such resonators can provide both a mixing and a filtering function, or a filtering and amplitude modulation function. These techniques center around the placement of a carrier signal on the micromechanical resonator. Finally, micro oven stabilization is investigated in an attempt to null the temperature coefficient of a polysilicon micromechanical resonator. Here, surface micromachining procedures are utilized to fabricate a polysilicon resonator on a microplatform--two levels of suspension--equipped with heater and temperature sensing resistors, which are then imbedded in a feedback loop to control the platform (and resonator) temperature. (Abstract shortened by UMI.).

  2. Restructuring the Future Classroom--A Global Perspective

    ERIC Educational Resources Information Center

    Shivakumar, G. S.; Manichander, T.

    2013-01-01

    The students are the consumers as well as co-creators of knowledge. Information does not flow top-down any more. Networks, peers, and students' inquisitiveness teach students. Teachers act as filters. Collaboration is the key. In today's world for the netgen, knowingly or unknowingly, technology and the free flow of information via the internet has made…

  3. Archive 2.0: What Composition Students and Academic Libraries Can Gain from Digital-Collaborative Pedagogies

    ERIC Educational Resources Information Center

    Vetter, Matthew A.

    2014-01-01

    Research across disciplines in recent years has demonstrated a number of gains involved in community engagement and service-learning pedagogies. More recently, these pedagogies are being filtered into digital contexts as instructors begin to realize the opportunities made available by online writing venues. This presentation describes a specific…

  4. Improved Personalized Recommendation Based on Causal Association Rule and Collaborative Filtering

    ERIC Educational Resources Information Center

    Lei, Wu; Qing, Fang; Zhou, Jin

    2016-01-01

    Users usually provide only a limited number of ratings on a recommender system, which causes an extremely sparse user rating matrix and greatly reduces the accuracy of personalized recommendation, especially for new users or new items. This paper presents a recommendation method based on rating prediction using causal association rules.…

  5. USE OF BONE CHAR FOR THE REMOVAL OF ARSENIC AND URANIUM FROM GROUNDWATER AT THE PINE RIDGE RESERVATION

    EPA Science Inventory

    The student project team will work with faculty advisors at UIUC, advisors at Oglala Lakota College, and with residents of the Pine Ridge Reservation. Through this collaborative effort, we expect to identify filter materials including bone char that will effectively remove ars...

  6. Symposium on Applications and the Internet (SAINT 2003) Proceedings (Orlando, Florida, January 27-31, 2003).

    ERIC Educational Resources Information Center

    Helal, Sumi, Ed.; Oie, Yuji, Ed.; Chang, Carl, Ed.; Murai, Jun, Ed.

    This proceedings from the 2003 Symposium on Applications and the Internet (SAINT) contains papers from sessions on: (1) mobile Internet, including a target-driven cache replacement policy, context-awareness for service discovery, and XML transformation; (2) collaboration technology I, including human-network-based filtering, virtual collaboration…

  7. Walking on a user similarity network towards personalized recommendations.

    PubMed

    Gan, Mingxin

    2014-01-01

    Personalized recommender systems have been receiving more and more attention in addressing the serious problem of information overload accompanying the rapid evolution of the World Wide Web. Although traditional collaborative filtering approaches based on similarities between users have achieved remarkable success, it has been shown that the existence of popular objects may adversely influence the correct scoring of candidate objects, leading to unreasonable recommendation results. Meanwhile, recent advances have demonstrated that approaches based on diffusion and random walk processes exhibit superior performance over collaborative filtering methods in both recommendation accuracy and diversity. Building on these results, we adopt three strategies (power-law adjustment, nearest neighbor, and threshold filtration) to adjust a user similarity network derived from user similarity scores calculated on historical data, and then propose a random walk with restart model on the constructed network to achieve personalized recommendations. We perform cross-validation experiments on two real data sets (MovieLens and Netflix) and compare the performance of our method against existing state-of-the-art methods. Results show that our method outperforms existing methods in not only recommendation accuracy and diversity, but also retrieval performance.
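
    A random walk with restart on a user similarity network can be sketched on a toy graph; the adjacency weights and restart probability below are illustrative, and the paper's three network-construction strategies are not reproduced:

```python
def random_walk_with_restart(adj, seed, restart=0.15, iters=200):
    """Iterate p <- (1 - restart) * p T + restart * e_seed on a weighted
    similarity graph given as an adjacency matrix (list of lists; every
    node is assumed to have at least one neighbor). T row-normalizes adj.
    Returns the walker's stationary visiting probabilities."""
    n = len(adj)
    row_sums = [float(sum(row)) for row in adj]
    p = [1.0 if i == seed else 0.0 for i in range(n)]
    for _ in range(iters):
        nxt = [0.0] * n
        for i in range(n):
            if p[i] == 0.0:
                continue
            for j in range(n):
                if adj[i][j]:
                    nxt[j] += (1.0 - restart) * p[i] * adj[i][j] / row_sums[i]
        nxt[seed] += restart
        p = nxt
    return p

# Toy similarity graph: users 0-2 form a triangle, user 3 hangs off user 2.
adj = [[0, 1, 1, 0],
       [1, 0, 1, 0],
       [1, 1, 0, 1],
       [0, 0, 1, 0]]
scores = random_walk_with_restart(adj, seed=0)
```

    The resulting scores rank the seed user's neighborhood: well-connected nearby users score high and the peripheral user scores lowest, which is the ordering such a recommender uses to pick candidate items.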

  8. Lazy collaborative filtering for data sets with missing values.

    PubMed

    Ren, Yongli; Li, Gang; Zhang, Jun; Zhou, Wanlei

    2013-12-01

    As one of the biggest challenges in research on recommender systems, the data sparsity issue is mainly caused by the fact that users tend to rate a small proportion of items from the huge number of available items. This issue becomes even more problematic for the neighborhood-based collaborative filtering (CF) methods, as there are even lower numbers of ratings available in the neighborhood of the query item. In this paper, we aim to address the data sparsity issue in the context of neighborhood-based CF. For a given query (user, item), a set of key ratings is first identified by taking the historical information of both the user and the item into account. Then, an auto-adaptive imputation (AutAI) method is proposed to impute the missing values in the set of key ratings. We present a theoretical analysis to show that the proposed imputation method effectively improves the performance of the conventional neighborhood-based CF methods. The experimental results show that our new method of CF with AutAI outperforms six existing recommendation methods in terms of accuracy.
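    The abstract does not spell out the AutAI recursion, so the sketch below substitutes simple item-mean imputation before an item-based neighbourhood prediction, only to illustrate where imputation enters the pipeline. All names and parameters are illustrative assumptions.

```python
import numpy as np

def predict_item_based(R, user, item, k=2):
    """Item-based neighbourhood prediction for R[user, item], where R uses
    np.nan for missing ratings.  Missing ratings are first imputed with the
    item mean -- a simple stand-in for the paper's AutAI step."""
    n_items = R.shape[1]
    item_means = np.array([np.nanmean(R[:, j]) for j in range(n_items)])
    R_imp = np.where(np.isnan(R), item_means, R)    # impute missing key ratings
    target = R_imp[:, item]
    sims = []
    for j in range(n_items):
        if j == item:
            continue
        v = R_imp[:, j]
        sim = np.dot(target, v) / (np.linalg.norm(target) * np.linalg.norm(v))
        sims.append((sim, j))
    sims.sort(reverse=True)                         # most similar items first
    top = sims[:k]
    num = sum(s * R_imp[user, j] for s, j in top)
    den = sum(abs(s) for s, _ in top)
    return num / den if den else item_means[item]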

  9. An Innovations-Based Noise Cancelling Technique on Inverse Kepstrum Whitening Filter and Adaptive FIR Filter in Beamforming Structure

    PubMed Central

    Jeong, Jinsoo

    2011-01-01

    This paper presents an acoustic noise cancelling technique using an inverse kepstrum system as an innovations-based whitening application for an adaptive finite impulse response (FIR) filter in a beamforming structure. The inverse kepstrum method uses an innovations-whitened form from one acoustic path transfer function between a reference microphone sensor and a noise source, so that the rear-end reference signal becomes a whitened sequence for a cascaded adaptive FIR filter in the beamforming structure. By using an inverse kepstrum filter as a whitening filter together with a delay filter, the cascaded adaptive FIR filter estimates only the numerator of the polynomial part from the ratio of overall combined transfer functions. The test results show that the adaptive FIR filter is more effective in a beamforming structure than in an adaptive noise cancelling (ANC) structure in terms of signal distortion in the desired signal and noise reduction for noise with nonminimum phase components. In addition, the inverse kepstrum method shows almost the same convergence level in its estimate of noise statistics while using fewer adaptive FIR filter weights than the kepstrum method, and hence offers greater computational simplicity in processing. Furthermore, the rear-end inverse kepstrum method in the beamforming structure shows less signal distortion in the desired signal than the front-end kepstrum method and the front-end inverse kepstrum method. PMID:22163987

  10. Günther Tulip and Celect IVC filters in multiple-trauma patients.

    PubMed

    Rosenthal, David; Kochupura, Paul V; Wellons, Eric D; Burkett, Allison B; Methodius-Rayford, Walaya C

    2009-08-01

    To evaluate results with the retrievable Günther Tulip (GT) and Celect inferior vena cava filters (IVCFs) placed at the intensive care unit (ICU) bedside under "real-time" intravascular ultrasound (IVUS) guidance in multiple-trauma patients. Between December 2004 and December 2008, 187 multiple-trauma patients (109 men; mean age 44+/-2 years, range 17-71) with contraindications to low-dose anticoagulation therapy or sequential compression devices had Günther Tulip (n = 97) or Celect (n = 90) retrievable IVCFs placed under real-time IVUS guidance. Günther Tulip filters were inserted using a "double-puncture" technique. The Celect IVCFs were placed with a simplified single-puncture technique in which the filter introducer sheath was advanced until the radiopaque tip "covered" the IVUS image of the renal vein, indicating that the filter sheath was in position for filter deployment. The 2 filter groups were compared on the endpoints of technical implantation success, retrievability, prevention of PE, and procedure-related deep vein thrombosis (DVT). As verified by abdominal radiography, 93.1% (174/187) of IVCFs were placed without complications; 6 IVCFs (all GT; p = 0.03 versus Celect) were misplaced in the iliac vein but uneventfully retrieved and replaced in the IVC within 24 hours. Two insertion site femoral vein DVTs (both in the dual puncture group; p>0.2) and 5 groin hematomas occurred during follow-up. GT filters were in place a mean of 107 days and Celect 97 days. In this time, 2 pulmonary embolisms occurred (1 in each group; p>0.2). Of the 115 filters scheduled for retrieval (50 Günther Tulip, 65 Celect), 33 (23 Günther Tulip, 10 Celect) could not be retrieved (p = 0.0004). Vena cavography identified filter tilting (>20 degrees ) in 21 cases (15 GT, 6 Celect), while 12 filters (8 GT, 4 Celect) had extended indwell times (mean 187 days) and excessive tissue ingrowth covering the retrieval hook. 
    Subjectively, the Celect filters were clinically "easier" to retrieve; they also had fewer cases of significant tilt (>20 degrees) than the GT filters, but the difference was not statistically significant. Placement of GT and Celect IVCFs at the ICU bedside under IVUS guidance in multiple-trauma patients was simple and safe and avoided transporting critically ill patients out of the ICU. Further investigation of the single-sheath IVUS technique and the role of retrievable IVCFs in multiple-trauma patients is warranted.

  11. De-Dopplerization of Acoustic Measurements

    DTIC Science & Technology

    2017-08-10

    …band energy obtained from fractional octave band digital filters generates a de-Dopplerized spectrum without complex resampling algorithms. An equation… fractional octave representation and smearing that occurs within the spectrum, digital filtering techniques were not considered by these earlier…

  12. Linear-Quadratic Control of a MEMS Micromirror using Kalman Filtering

    DTIC Science & Technology

    2011-12-01

    Linear-Quadratic Control of a MEMS Micromirror using Kalman Filtering. Thesis, Jamie P.…, presented to the Faculty, Department of Electrical Engineering, Graduate School of… actuated micromirrors fabricated by PolyMUMPs. Successful application of these techniques enables demonstration of smooth, stable deflections of 50% and…

  13. 47 CFR 15.611 - General technical requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... spectrum by licensed services. These techniques may include adaptive or “notch” filtering, or complete... frequencies below 30 MHz, when a notch filter is used to avoid interference to a specific frequency band, the... below the applicable part 15 limits. (ii) For frequencies above 30 MHz, when a notch filter is used to...

  14. Two techniques enable sampling of filtered and unfiltered molten metals

    NASA Technical Reports Server (NTRS)

    Burris, L., Jr.; Pierce, R. D.; Tobias, K. R.; Winsch, I. O.

    1967-01-01

    Filtered samples of molten metals are obtained by filtering through a plug of porous material fitted in the end of a sample tube, and unfiltered samples are obtained by using a capillary-tube extension rod with a perforated bucket. With these methods there are no sampling errors or loss of liquid.

  15. Assessing FRET using Spectral Techniques

    PubMed Central

    Leavesley, Silas J.; Britain, Andrea L.; Cichon, Lauren K.; Nikolaev, Viacheslav O.; Rich, Thomas C.

    2015-01-01

    Förster resonance energy transfer (FRET) techniques have proven invaluable for probing the complex nature of protein–protein interactions, protein folding, and intracellular signaling events. These techniques have traditionally been implemented with the use of one or more fluorescence band-pass filters, either as fluorescence microscopy filter cubes, or as dichroic mirrors and band-pass filters in flow cytometry. In addition, new approaches for measuring FRET, such as fluorescence lifetime and acceptor photobleaching, have been developed. Hyperspectral techniques for imaging and flow cytometry have also been shown to be promising for performing FRET measurements. In this study, we have compared traditional (filter-based) FRET approaches to three spectral-based approaches: the ratio of acceptor-to-donor peak emission, linear spectral unmixing, and linear spectral unmixing with a correction for direct acceptor excitation. All methods are estimates of FRET efficiency, except for one-filter set and three-filter set FRET indices, which are included for consistency with prior literature. In the first part of this study, spectrofluorimetric data were collected from a CFP–Epac–YFP FRET probe that has been used for intracellular cAMP measurements. All comparisons were performed using the same spectrofluorimetric datasets as input data, to provide a relevant comparison. Linear spectral unmixing resulted in measurements with the lowest coefficient of variation (0.10) as well as accurate fits using the Hill equation. FRET efficiency methods produced coefficients of variation of less than 0.20, while FRET indices produced coefficients of variation greater than 8.00. These results demonstrate that spectral FRET measurements provide improved response over standard, filter-based measurements. Using spectral approaches, single-cell measurements were conducted through hyperspectral confocal microscopy, linear unmixing, and cell segmentation with quantitative image analysis. 
Results from these studies confirmed that spectral imaging is effective for measuring subcellular, time-dependent FRET dynamics and that additional fluorescent signals can be readily separated from FRET signals, enabling multilabel studies of molecular interactions. PMID:23929684
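    The linear spectral unmixing step compared above amounts to a least-squares fit of reference emission spectra to a measured spectrum. The sketch below illustrates that step only; the Gaussian spectra in the test are synthetic stand-ins for real donor/acceptor references, not data from the study.

```python
import numpy as np

def unmix(measured, endmembers):
    """Linear spectral unmixing by least squares: find abundances a such that
    measured ~= endmembers @ a, where each column of `endmembers` holds a
    reference emission spectrum (e.g. pure donor and pure acceptor).  FRET
    metrics are then computed from the recovered abundances."""
    a, *_ = np.linalg.lstsq(endmembers, measured, rcond=None)
    return a
```

On noiseless synthetic data the recovered abundances match the mixing weights exactly; with real data the fit is least-squares and a direct-excitation correction (as in the paper's third approach) would be applied to the acceptor abundance.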

  16. Assessing FRET using spectral techniques.

    PubMed

    Leavesley, Silas J; Britain, Andrea L; Cichon, Lauren K; Nikolaev, Viacheslav O; Rich, Thomas C

    2013-10-01

    Förster resonance energy transfer (FRET) techniques have proven invaluable for probing the complex nature of protein-protein interactions, protein folding, and intracellular signaling events. These techniques have traditionally been implemented with the use of one or more fluorescence band-pass filters, either as fluorescence microscopy filter cubes, or as dichroic mirrors and band-pass filters in flow cytometry. In addition, new approaches for measuring FRET, such as fluorescence lifetime and acceptor photobleaching, have been developed. Hyperspectral techniques for imaging and flow cytometry have also been shown to be promising for performing FRET measurements. In this study, we have compared traditional (filter-based) FRET approaches to three spectral-based approaches: the ratio of acceptor-to-donor peak emission, linear spectral unmixing, and linear spectral unmixing with a correction for direct acceptor excitation. All methods are estimates of FRET efficiency, except for one-filter set and three-filter set FRET indices, which are included for consistency with prior literature. In the first part of this study, spectrofluorimetric data were collected from a CFP-Epac-YFP FRET probe that has been used for intracellular cAMP measurements. All comparisons were performed using the same spectrofluorimetric datasets as input data, to provide a relevant comparison. Linear spectral unmixing resulted in measurements with the lowest coefficient of variation (0.10) as well as accurate fits using the Hill equation. FRET efficiency methods produced coefficients of variation of less than 0.20, while FRET indices produced coefficients of variation greater than 8.00. These results demonstrate that spectral FRET measurements provide improved response over standard, filter-based measurements. Using spectral approaches, single-cell measurements were conducted through hyperspectral confocal microscopy, linear unmixing, and cell segmentation with quantitative image analysis. 
Results from these studies confirmed that spectral imaging is effective for measuring subcellular, time-dependent FRET dynamics and that additional fluorescent signals can be readily separated from FRET signals, enabling multilabel studies of molecular interactions. © 2013 International Society for Advancement of Cytometry.

  17. Comparison of Cross Flow Filtration Performance for Manganese Oxide/Sludge Mixtures and Monosodium Titanate/Sludge Mixtures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Poirier, M.R.

    2002-06-07

    Personnel performed engineering-scale tests at the Filtration Research Engineering Demonstration (FRED) to determine crossflow filter performance with a 5.6 M sodium solution containing varying concentrations of sludge and sodium permanganate. The work represents another in a series of collaborative efforts between the University of South Carolina and the Savannah River Technology Center in support of the process development efforts for the Savannah River Site. The current tests investigated filter performance with slurry containing simulated Tank 40H Sludge and sodium permanganate at concentrations between 0.070 weight percent and 3.04 weight percent insoluble solids.

  18. Developing a denoising filter for electron microscopy and tomography data in the cloud.

    PubMed

    Starosolski, Zbigniew; Szczepanski, Marek; Wahle, Manuel; Rusu, Mirabela; Wriggers, Willy

    2012-09-01

    The low radiation conditions and the predominantly phase-object image formation of cryo-electron microscopy (cryo-EM) result in extremely high noise levels and low contrast in the recorded micrographs. The process of single particle or tomographic 3D reconstruction does not completely eliminate this noise and is even capable of introducing new sources of noise during alignment or when correcting for instrument parameters. The recently developed Digital Paths Supervised Variance (DPSV) denoising filter uses local variance information to control regional noise in a robust and adaptive manner. The performance of the DPSV filter was evaluated in this review qualitatively and quantitatively using simulated and experimental data from cryo-EM and tomography in two and three dimensions. We also assessed the benefit of filtering experimental reconstructions for visualization purposes and for enhancing the accuracy of feature detection. The DPSV filter eliminates high-frequency noise artifacts (density gaps), which would normally preclude the accurate segmentation of tomography reconstructions or the detection of alpha-helices in single-particle reconstructions. This collaborative software development project was carried out entirely by virtual interactions among the authors using publicly available development and file sharing tools.

  19. Noise reduction with complex bilateral filter.

    PubMed

    Matsumoto, Mitsuharu

    2017-12-01

    This study introduces a noise reduction technique that uses a complex bilateral filter. A bilateral filter is a nonlinear filter originally developed for images that can reduce noise while preserving edge information. It is an attractive filter and has been used in many applications in image processing. When it is applied to an acoustical signal, small-amplitude noise is reduced while the speech signal is preserved. However, a bilateral filter cannot handle noise with relatively large amplitudes owing to its innate characteristics. In this study, the noisy signal is transformed into the time-frequency domain and the filter is improved to handle complex spectra. The high-amplitude noise is reduced in the time-frequency domain via the proposed filter. The features and the potential of the proposed filter are also confirmed through experiments.
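    The idea of extending a bilateral filter to complex spectra can be sketched as below for one STFT frame: weights combine frequency-bin proximity with magnitude similarity, so strong spectral components are preserved while low-level noise is averaged. This is an illustration of the concept, not the paper's exact formulation; `sigma_s` and `sigma_r` are illustrative parameters.

```python
import numpy as np

def complex_bilateral_filter(spec, radius=3, sigma_s=2.0, sigma_r=0.5):
    """Bilateral filter applied to a 1-D complex spectrum.  The range kernel
    acts on magnitudes (edge preservation), while the averaging is done on
    the complex values so phase information is retained."""
    mag = np.abs(spec)
    out = np.empty_like(spec)
    n = len(spec)
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        d = np.arange(lo, hi) - i
        w = (np.exp(-d ** 2 / (2 * sigma_s ** 2))
             * np.exp(-(mag[lo:hi] - mag[i]) ** 2 / (2 * sigma_r ** 2)))
        out[i] = np.sum(w * spec[lo:hi]) / np.sum(w)
    return out
```

A flat spectrum passes through unchanged, which is the expected behaviour of any normalised averaging filter.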

  20. An Inter-Personal Information Sharing Model Based on Personalized Recommendations

    NASA Astrophysics Data System (ADS)

    Kamei, Koji; Funakoshi, Kaname; Akahani, Jun-Ichi; Satoh, Tetsuji

    In this paper, we propose an inter-personal information sharing model among individuals based on personalized recommendations. In the proposed model, we define an information resource as shared between people when both of them consider it important --- not merely when they both possess it. In other words, the model defines the importance of information resources based on personalized recommendations from identifiable acquaintances. The proposed method is based on a collaborative filtering system that focuses on evaluations from identifiable acquaintances. It utilizes both user evaluations for documents and their contents. In other words, each user profile is represented as a matrix of credibility to the other users' evaluations on each domain of interest. We extended the content-based collaborative filtering method to distinguish other users to whom the documents should be recommended. We also applied a concept-based vector space model to represent the domains of interest instead of the previous method which represented them by a term-based vector space model. We introduce a personalized concept-base compiled from each user's information repository to improve the information retrieval in the user's environment. Furthermore, the concept-spaces change from user to user since they reflect the personalities of the users. Because of different concept-spaces, the similarity between a document and a user's interest varies for each user. As a result, a user receives recommendations from other users who have different viewpoints, achieving inter-personal information sharing based on personalized recommendations. This paper also describes an experimental simulation of our information sharing model. In our laboratory, five participants accumulated a personal repository of e-mails and web pages from which they built their own concept-base. Then we estimated the user profiles according to personalized concept-bases and sets of documents which others evaluated. 
We simulated inter-personal recommendation based on the user profiles and evaluated the performance of the recommendation method by comparing the recommended documents to the result of the content-based collaborative filtering.

  1. A high powered radar interference mitigation technique for communications signal recovery with fpga implementation

    DTIC Science & Technology

    2017-03-01

    Subject terms: parameter estimation; matched-filter detection; QPSK; radar; interference; LSE; cyber; electronic warfare. …The signal is routed through a maximum-likelihood detector (MLD), which is a bank of four filters matched to the four symbols of the QPSK constellation… filters matched for each of the QPSK symbols is used to demodulate the signal after cancellation. The matched filters are defined as the complex…
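    The matched-filter bank mentioned in the excerpt can be sketched as follows, under simplified assumptions (perfect symbol timing, unit-energy rectangular pulses, no residual interference); all names and parameters are illustrative, not the report's implementation.

```python
import numpy as np

def qpsk_matched_filter_detect(rx, sps=8):
    """Maximum-likelihood detection of QPSK symbols with a bank of four
    matched filters, one per constellation point.  For each symbol interval,
    the received segment is correlated against each candidate waveform and
    the symbol with the largest real correlator output wins."""
    constellation = np.exp(1j * (np.pi / 4 + np.pi / 2 * np.arange(4)))
    pulse = np.ones(sps) / np.sqrt(sps)             # unit-energy pulse
    n_sym = len(rx) // sps
    decisions = np.empty(n_sym, dtype=int)
    for k in range(n_sym):
        seg = rx[k * sps:(k + 1) * sps]
        # np.vdot conjugates the first argument: correlation with s_m = c_m * pulse
        metrics = [np.real(np.vdot(c * pulse, seg)) for c in constellation]
        decisions[k] = int(np.argmax(metrics))
    return decisions
```

On a noiseless waveform the detector recovers the transmitted symbol indices exactly; with noise or interference, performance depends on the preceding cancellation stage the report describes.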

  2. Multi-filter spectrophotometry simulations

    NASA Technical Reports Server (NTRS)

    Callaghan, Kim A. S.; Gibson, Brad K.; Hickson, Paul

    1993-01-01

    To complement both the multi-filter observations of quasar environments described in these proceedings, as well as the proposed UBC 2.7 m Liquid Mirror Telescope (LMT) redshift survey, we have initiated a program of simulated multi-filter spectrophotometry. The goal of this work, still very much in progress, is a better quantitative assessment of the multiband technique as a viable mechanism for obtaining useful redshift and morphological class information from large scale multi-filter surveys.

  3. Mapping accuracy via spectrally and structurally based filtering techniques: comparisons through visual observations

    NASA Astrophysics Data System (ADS)

    Chockalingam, Letchumanan

    2005-01-01

    LANDSAT data of the Gunung Ledang region of Malaysia are used to map certain hydrogeological features. To map these features, image-processing tools such as contrast enhancement and edge-detection techniques are employed. The advantages of these techniques over other methods are evaluated in terms of their validity in properly isolating features of hydrogeological interest. Because these techniques exploit the spectral aspects of the images, they have several limitations with respect to the objectives. To address these limitations, a morphological transformation, which considers the structural rather than the spectral aspects of the image, is applied, providing a comparison between results derived from spectrally based and structurally based filtering techniques.
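    The structure-based filtering contrasted with spectral techniques above can be illustrated by a morphological gradient (grey-level dilation minus erosion), a basic morphological edge-emphasis transform. The function and window size below are illustrative, not the paper's transformation.

```python
import numpy as np
from scipy import ndimage

def morph_gradient(img, size=3):
    """Morphological gradient: dilation minus erosion over a size x size flat
    structuring element.  Responds to local structure (edges, lineaments)
    rather than to absolute spectral values."""
    img = img.astype(float)
    return (ndimage.grey_dilation(img, size=(size, size))
            - ndimage.grey_erosion(img, size=(size, size)))
```

On a step image the gradient is nonzero only along the boundary, which is the structural response that distinguishes this transform from purely spectral contrast enhancement.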

  4. ProDaMa: an open source Python library to generate protein structure datasets.

    PubMed

    Armano, Giuliano; Manconi, Andrea

    2009-10-02

    The huge difference between the number of known sequences and known tertiary structures has justified the use of automated methods for protein analysis. Although a general methodology to solve these problems has not yet been devised, researchers are engaged in developing more accurate techniques and algorithms whose training plays a relevant role in determining their performance. From this perspective, particular importance is given to the training data used in experiments, and researchers are often engaged in the generation of specialized datasets that meet their requirements. To facilitate the task of generating specialized datasets we devised and implemented ProDaMa, an open source Python library that provides classes for retrieving, organizing, updating, analyzing, and filtering protein data. ProDaMa has been used to generate specialized datasets useful for secondary structure prediction and to develop a collaborative web application aimed at generating and sharing protein structure datasets. The library, the related database, and the documentation are freely available at the URL http://iasc.diee.unica.it/prodama.

  5. Fast Katz and Commuters: Efficient Estimation of Social Relatedness in Large Networks

    NASA Astrophysics Data System (ADS)

    Esfandiar, Pooya; Bonchi, Francesco; Gleich, David F.; Greif, Chen; Lakshmanan, Laks V. S.; On, Byung-Won

    Motivated by social network data mining problems such as link prediction and collaborative filtering, significant research effort has been devoted to computing topological measures including the Katz score and the commute time. Existing approaches typically approximate all pairwise relationships simultaneously. In this paper, we are interested in computing: the score for a single pair of nodes, and the top-k nodes with the best scores from a given source node. For the pairwise problem, we apply an iterative algorithm that computes upper and lower bounds for the measures we seek. This algorithm exploits a relationship between the Lanczos process and a quadrature rule. For the top-k problem, we propose an algorithm that only accesses a small portion of the graph and is related to techniques used in personalized PageRank computing. To test the scalability and accuracy of our algorithms we experiment with three real-world networks and find that these algorithms run in milliseconds to seconds without any preprocessing.
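    For reference, the pairwise Katz score that the paper bounds can be evaluated directly by truncating its power series. This naive sketch is not the paper's Lanczos/quadrature method; it converges when `beta` is below the reciprocal of the spectral radius of A, and the truncation length is an illustrative choice.

```python
import numpy as np

def katz_score(A, i, j, beta=0.05, max_len=50):
    """Pairwise Katz score sum_{l>=1} beta**l * (A**l)[i, j], i.e. a sum over
    walks of every length from i to j, each damped by beta**length.  Computed
    with matrix-vector products only, never forming A**l explicitly."""
    A = np.asarray(A, dtype=float)
    v = A[:, j].copy()          # v = A^1 e_j
    score = beta * v[i]
    bl = beta
    for _ in range(2, max_len + 1):
        v = A @ v               # v = A^l e_j
        bl *= beta
        score += bl * v[i]
    return score
```

For two nodes joined by a single edge the series is beta + beta^3 + beta^5 + … = beta/(1 - beta^2), which gives a closed form to check against.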

  6. Fast katz and commuters : efficient estimation of social relatedness in large networks.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    On, Byung-Won; Lakshmanan, Laks V. S.; Greif, Chen

    Motivated by social network data mining problems such as link prediction and collaborative filtering, significant research effort has been devoted to computing topological measures including the Katz score and the commute time. Existing approaches typically approximate all pairwise relationships simultaneously. In this paper, we are interested in computing: the score for a single pair of nodes, and the top-k nodes with the best scores from a given source node. For the pairwise problem, we apply an iterative algorithm that computes upper and lower bounds for the measures we seek. This algorithm exploits a relationship between the Lanczos process and a quadrature rule. For the top-k problem, we propose an algorithm that only accesses a small portion of the graph and is related to techniques used in personalized PageRank computing. To test the scalability and accuracy of our algorithms we experiment with three real-world networks and find that these algorithms run in milliseconds to seconds without any preprocessing.

  7. Quantitative comparison between full-spectrum and filter-based imaging in hyperspectral fluorescence microscopy

    PubMed Central

    GAO, L.; HAGEN, N.; TKACZYK, T.S.

    2012-01-01

    We implement a filterless illumination scheme on a hyperspectral fluorescence microscope to achieve full-range spectral imaging. The microscope employs polarisation filtering, spatial filtering and spectral unmixing filtering to replace the role of traditional filters. Quantitative comparisons between full-spectrum and filter-based microscopy are provided in the context of signal dynamic range and the accuracy of measured fluorophores’ emission spectra. To show potential applications, a five-colour cell immunofluorescence imaging experiment is theoretically simulated. Simulation results indicate that the proposed full-spectrum imaging technique may yield a threefold improvement in signal dynamic range over what can be achieved with filter-based imaging. PMID:22356127

  8. Speckle noise reduction of 1-look SAR imagery

    NASA Technical Reports Server (NTRS)

    Nathan, Krishna S.; Curlander, John C.

    1987-01-01

    Speckle noise is inherent to synthetic aperture radar (SAR) imagery. Since the degradation of the image due to this noise results in uncertainties in the interpretation of the scene and in a loss of apparent resolution, it is desirable to filter the image to reduce this noise. In this paper, an adaptive algorithm based on the calculation of the local statistics around a pixel is applied to 1-look SAR imagery. The filter adapts to the nonstationarity of the image statistics since the size of the blocks is very small compared to that of the image. The performance of the filter is measured in terms of the equivalent number of looks (ENL) of the filtered image and the resulting resolution degradation. The results are compared to those obtained from different techniques applied to similar data. The local adaptive filter (LAF) significantly increases the ENL of the final image. The associated loss of resolution is also lower than that for other commonly used speckle reduction techniques.
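    A local-statistics adaptive filter of the kind described above can be sketched as follows. This is closer to the classic Lee filter than to the paper's exact LAF; the window size and noise variance are illustrative assumptions.

```python
import numpy as np

def lee_filter(img, win=5, noise_var=0.05):
    """Local-statistics adaptive filter: each pixel is pulled toward its local
    mean with a gain k that vanishes in homogeneous regions (local variance
    is roughly the noise variance) and approaches 1 on edges, so speckle is
    smoothed while scene structure is preserved."""
    pad = win // 2
    p = np.pad(img.astype(float), pad, mode='reflect')
    out = np.empty(img.shape, dtype=float)
    for r in range(img.shape[0]):
        for c in range(img.shape[1]):
            block = p[r:r + win, c:c + win]      # window centred on (r, c)
            m, v = block.mean(), block.var()
            k = max(v - noise_var, 0.0) / v if v > 0 else 0.0
            out[r, c] = m + k * (img[r, c] - m)
    return out
```

A constant region passes through unchanged, while the variance of a noisy homogeneous region drops sharply, which is the behaviour the ENL metric in the abstract quantifies.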

  9. An Adaptive Kalman Filter using a Simple Residual Tuning Method

    NASA Technical Reports Server (NTRS)

    Harman, Richard R.

    1999-01-01

    One difficulty in using Kalman filters in real world situations is the selection of the correct process noise, measurement noise, and initial state estimate and covariance. These parameters are commonly referred to as tuning parameters. Multiple methods have been developed to estimate these parameters. Most of those methods such as maximum likelihood, subspace, and observer Kalman Identification require extensive offline processing and are not suitable for real time processing. One technique, which is suitable for real time processing, is the residual tuning method. Any mismodeling of the filter tuning parameters will result in a non-white sequence for the filter measurement residuals. The residual tuning technique uses this information to estimate corrections to those tuning parameters. The actual implementation results in a set of sequential equations that run in parallel with the Kalman filter. Equations for the estimation of the measurement noise have also been developed. These algorithms are used to estimate the process noise and measurement noise for the Wide Field Infrared Explorer star tracker and gyro.
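    Residual tuning can be sketched on a scalar Kalman filter: the innovation (residual) sequence of a well-tuned filter is white with variance S = P + R, so the measurement-noise variance R can be re-estimated online from recent squared residuals. The recursion below is an illustrative sketch under a random-walk state model, not the flight-software implementation.

```python
import numpy as np

def adaptive_kalman(zs, q=1e-4, r0=1.0, window=20):
    """Scalar Kalman filter whose measurement-noise variance R is re-estimated
    online from the innovations: R ~= mean(residual^2) - P, since the
    innovation variance is S = P + R for a consistent filter."""
    x, P, R = zs[0], 1.0, r0
    resid, xs = [], []
    for z in zs[1:]:
        P = P + q                      # predict (state modelled as random walk)
        nu = z - x                     # innovation / measurement residual
        resid.append(nu * nu)
        if len(resid) >= window:       # residual-based retuning of R
            R = max(np.mean(resid[-window:]) - P, 1e-8)
        S = P + R
        K = P / S
        x = x + K * nu                 # update
        P = (1 - K) * P
        xs.append(x)
    return np.array(xs), R
```

Run on measurements of a constant with known noise, the state estimate settles near the true value and R settles near the true measurement-noise variance, all without offline processing.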

  10. Comparison of edge detection techniques for M7 subtype Leukemic cell in terms of noise filters and threshold value

    NASA Astrophysics Data System (ADS)

    Salam, Afifah Salmi Abdul; Isa, Mohd. Nazrin Md.; Ahmad, Muhammad Imran; Che Ismail, Rizalafande

    2017-11-01

    This paper focuses on identifying suitable threshold values for two commonly used edge-detection techniques, Sobel and Canny. The idea is to determine which values give accurate results in identifying a particular leukemic cell. In addition, evaluating the suitability of edge detectors is essential, since feature extraction of the cell depends greatly on image segmentation (edge detection). First, an image of the M7 subtype of acute myelocytic leukemia (AML) is chosen because its diagnosis has been found lacking. Next, noise filters are applied to enhance image quality, and useful information is acquired by comparing images processed with no filter, a median filter, and an average filter. Each threshold value is fixed at 0, 0.25, and 0.5. The investigation found that, without any filter, Canny with a threshold value of 0.5 yields the best result.
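    The comparison described (no filter vs. median vs. average smoothing, at thresholds 0, 0.25 and 0.5) can be sketched with a normalised Sobel gradient; Canny is omitted for brevity, and the function structure is illustrative rather than the paper's pipeline.

```python
import numpy as np
from scipy import ndimage

def sobel_edges(img, threshold=0.5, filt=None):
    """Sobel gradient-magnitude edge map with an optional pre-filter.  The
    gradient is normalised to [0, 1] so thresholds like 0, 0.25 and 0.5 are
    directly comparable across images and pre-filters."""
    img = img.astype(float)
    if filt == 'median':
        img = ndimage.median_filter(img, size=3)
    elif filt == 'average':
        img = ndimage.uniform_filter(img, size=3)
    gx = ndimage.sobel(img, axis=1)
    gy = ndimage.sobel(img, axis=0)
    mag = np.hypot(gx, gy)
    if mag.max() > 0:
        mag = mag / mag.max()
    return mag > threshold
```

Sweeping `threshold` and `filt` over the paper's grid then reduces to calling this function nine times per cell image and comparing the resulting binary maps.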

  11. Improving indoor air quality and thermal comfort in office building by using combination filters

    NASA Astrophysics Data System (ADS)

    Kabrein, H.; Yusof, M. Z. M.; Hariri, A.; Leman, A. M.; Afandi, A.

    2017-09-01

    Poor indoor air quality and thermal comfort condition in the workspace affected the occupants’ health and work productivity, especially when adapting the recirculation of air in heating ventilation and air-conditioning (HVAC) system. The recirculation of air was implemented in this study by mixing the circulated returned indoor air with the outdoor fresh air. The aims of this study are to assess the indoor thermal comfort and indoor air quality (IAQ) in the office buildings, equipped with combination filters. The air filtration technique consisting minimum efficiency reporting value (MERV) filter and activated carbon fiber (ACF) filter, located before the fan coil units. The findings of the study show that the technique of mixing recirculation air with the fresh air through the combination filters met the recommended thermal comfort condition in the workspace. Furthermore, the result of the post-occupancy evaluation (POE) and the environmental measurements comply with the ASHRAE 55 standard. In addition, the level of CO2 concentration continued to decrease during the period of the measurement.

  12. Linear Phase Sharp Transition BPF to Detect Noninvasive Maternal and Fetal Heart Rate.

    PubMed

    Marchon, Niyan; Naik, Gourish; Pai, K R

    2018-01-01

    Fetal heart rate (FHR) detection can be monitored using either direct fetal scalp electrode recording (invasive) or an indirect noninvasive technique. Weeks before delivery, the invasive method poses a risk to the fetus, while the latter provides accurate fetal ECG (FECG) information that can help assess the fetus's well-being. Our technique employs a variable-order linear phase sharp transition (LPST) FIR band-pass filter, which shows improved stopband attenuation at higher filter orders. The fetal frequency fiduciary edges form the band edges of the filter, characterized by varying amounts of overlap with the maternal ECG (MECG) spectrum. The filter with the minimum maternal spectrum overlap was found to be optimal, with no power-line interference and the maximum number of fetal heart beats detected. The improved filtering is reflected in the enhanced performance of the fetal QRS (FQRS) detector. The fetal heart rate obtained using our algorithm is likewise in close agreement with the true reference (i.e., invasive fetal scalp ECG). The performance parameters of the FQRS detector, such as sensitivity (Se), positive predictive value (PPV), and accuracy (F1), were found to improve even at lower filter orders. The same technique was extended to the maternal QRS detector (MQRS) and found to yield satisfactory maternal heart rate (MHR) results.
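    The key property can be sketched with a windowed linear-phase FIR band-pass design (scipy's `firwin`, standing in for the paper's LPST design): symmetric taps give exactly linear phase, and raising the order sharpens the transitions and deepens stopband attenuation. The band edges below are placeholders, not the paper's fetal QRS band.

```python
import numpy as np
from scipy import signal

def lpst_like_bandpass(order, f_lo, f_hi, fs):
    """Linear-phase FIR band-pass of the given order via a windowed design.
    Returns order + 1 taps; symmetry of the taps guarantees linear phase."""
    return signal.firwin(order + 1, [f_lo, f_hi], pass_zero=False, fs=fs)
```

Checking the frequency response confirms near-unity gain inside the band and strong attenuation well below the lower edge, the behaviour the FQRS detector relies on.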

  13. Optimal-adaptive filters for modelling spectral shape, site amplification, and source scaling

    USGS Publications Warehouse

    Safak, Erdal

    1989-01-01

    This paper introduces some applications of optimal filtering techniques to earthquake engineering by using the so-called ARMAX models. Three applications are presented: (a) spectral modelling of ground accelerations, (b) site amplification (i.e., the relationship between two records obtained at different sites during an earthquake), and (c) source scaling (i.e., the relationship between two records obtained at a site during two different earthquakes). A numerical example for each application is presented by using recorded ground motions. The results show that the optimal filtering techniques provide elegant solutions to the above problems, and can be a useful tool in earthquake engineering.

  14. Apodized RFI filtering of synthetic aperture radar images

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doerry, Armin Walter

    2014-02-01

    Fine resolution Synthetic Aperture Radar (SAR) systems necessarily require wide bandwidths that often overlap spectrum utilized by other wireless services. These other emitters pose a source of Radio Frequency Interference (RFI) to the SAR echo signals that degrades SAR image quality. Filtering, or excising, the offending spectral contaminants will mitigate the interference, but at a cost of often degrading the SAR image in other ways, notably by raising offensive sidelobe levels. This report proposes borrowing an idea from nonlinear sidelobe apodization techniques to suppress interference without the attendant increase in sidelobe levels. The simple post-processing technique is termed Apodized RFI Filtering (ARF).
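    Conventional spectral excision, the baseline the report aims to improve on, can be sketched as zeroing the contaminated bins of the echo spectrum; tapering the notch edges (a simple stand-in for apodization, not the ARF algorithm itself) softens the discontinuity that hard excision introduces. The band list and taper width below are illustrative assumptions.

```python
import numpy as np

def excise_rfi(signal, fs, bands, taper=4):
    """Notch narrow-band RFI out of a sampled echo signal.

    Contaminated frequency bins are zeroed, with a short raised-cosine
    taper at the notch edges. (Illustrative conventional excision; the
    report's ARF post-processing is not reproduced here.)
    """
    n = len(signal)
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    for f_lo, f_hi in bands:
        idx = np.where((freqs >= f_lo) & (freqs <= f_hi))[0]
        if idx.size == 0:
            continue
        spec[idx] = 0.0
        # raised-cosine roll-off over `taper` bins on each side of the notch
        for k in range(1, taper + 1):
            w = 0.5 * (1 - np.cos(np.pi * k / (taper + 1)))
            if idx[0] - k >= 0:
                spec[idx[0] - k] *= w
            if idx[-1] + k < len(spec):
                spec[idx[-1] + k] *= w
    return np.fft.irfft(spec, n)

# Example: remove a 60 Hz interferer from a 10 Hz tone sampled at 1 kHz
t = np.arange(1000) / 1000.0
clean = np.sin(2 * np.pi * 10 * t)
rfi = 0.5 * np.sin(2 * np.pi * 60 * t)
recovered = excise_rfi(clean + rfi, fs=1000.0, bands=[(55.0, 65.0)])
```

    The cost the report highlights is visible in general signals: the notch and its taper modify nearby bins, which in SAR imagery shows up as raised sidelobes.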

  15. [Extemporaneous withdrawal with a mini-spike filter: A low infection risk technique for drawing up bevacizumab for intravitreal injection].

    PubMed

    Le Rouic, J F; Breger, D; Peronnet, P; Hermouet-Leclair, E; Alphandari, A; Pousset-Decré, C; Badat, I; Becquet, F

    2016-05-01

    To describe a technique for extemporaneously drawing up bevacizumab for intravitreal injection (IVT) and to report the rate of post-injection endophthalmitis. Retrospective monocentric analysis (January 2010-December 2014) of all IVTs of bevacizumab drawn up with the following technique: in the operating room (ISO class 7), through a mini-spike with an integrated bacteria-retentive air filter. The surgeon wore sterile gloves and a mask; the assisting nurse wore a mask. The bevacizumab vial was discarded at the end of each session. Six thousand two hundred and thirty-six bevacizumab injections were performed. One case of endophthalmitis was noted (0.016%). During the same period, 4 cases of endophthalmitis were found after IVT of other drugs (4/32,992; 0.012%; P=0.8). Intravitreal injection of bevacizumab after extemporaneous withdrawal through a mini-spike filter is a simple and safe technique. The risk of postoperative endophthalmitis is very low. This simple technique facilitates access to compounded bevacizumab. Copyright © 2016 Elsevier Masson SAS. All rights reserved.

  16. On the design of recursive digital filters

    NASA Technical Reports Server (NTRS)

    Shenoi, K.; Narasimha, M. J.; Peterson, A. M.

    1976-01-01

    A change of variables is described which transforms the problem of designing a recursive digital filter to that of approximation by a ratio of polynomials on a finite interval. Some analytic techniques for the design of low-pass filters are presented, illustrating the use of the transformation. Also considered are methods for the design of phase equalizers.

  17. Microbial survey of a full-scale, biologically active filter for treatment of drinking water.

    PubMed

    White, Colin P; Debry, Ronald W; Lytle, Darren A

    2012-09-01

    The microbial community of a full-scale, biologically active drinking water filter was surveyed using molecular techniques. Nitrosomonas, Nitrospira, Sphingomonadales, and Rhizobiales dominated the clone libraries. The results elucidate the microbial ecology of biological filters and demonstrate that biological treatment of drinking water should be considered a viable alternative to physicochemical methods.

  18. X-band preamplifier filter

    NASA Technical Reports Server (NTRS)

    Manshadi, F.

    1986-01-01

    A low-loss bandstop filter designed and developed for the Deep Space Network's 34-meter high-efficiency antennas is described. The filter is used for protection of the X-band traveling wave masers from the 20-kW transmitter signal. A combination of empirical and theoretical techniques was employed as well as computer simulation to verify the design before fabrication.

  19. Superconducting Magnetometry for Cardiovascular Studies and AN Application of Adaptive Filtering.

    NASA Astrophysics Data System (ADS)

    Leifer, Mark Curtis

    Sensitive magnetic detectors utilizing Superconducting Quantum Interference Devices (SQUID's) have been developed and used for studying the cardiovascular system. The theory of magnetic detection of cardiac currents is discussed, and new experimental data supporting the validity of the theory is presented. Measurements on both humans and dogs, in both healthy and diseased states, are presented using the new technique, which is termed vector magnetocardiography. In the next section, a new type of superconducting magnetometer with a room temperature pickup is analyzed, and techniques for optimizing its sensitivity to low-frequency sub-microamp currents are presented. Performance of the actual device displays significantly improved sensitivity in this frequency range, and the ability to measure currents in intact, in vivo biological fibers. The final section reviews the theoretical operation of a digital self-optimizing filter, and presents a four-channel software implementation of the system. The application of the adaptive filter to enhancement of geomagnetic signals for earthquake forecasting is discussed, and the adaptive filter is shown to outperform existing techniques in suppressing noise from geomagnetic records.
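    The self-optimizing digital filter described above is, in modern terms, an adaptive noise canceller. Below is a minimal least-mean-squares (LMS) sketch, not the thesis's four-channel geomagnetic implementation; the filter order and step size are assumed.

```python
import numpy as np

def lms_cancel(primary, reference, order=8, mu=0.01):
    """Least-mean-squares adaptive noise canceller.

    `primary` carries signal + noise; `reference` is a correlated
    noise-only channel. The weights adapt to predict the noise from the
    reference, and the prediction error is the cleaned output.
    (Minimal sketch; order and mu are illustrative assumptions.)
    """
    w = np.zeros(order)
    out = np.zeros(len(primary))
    for i in range(order - 1, len(primary)):
        x = reference[i - order + 1:i + 1][::-1]  # newest sample first
        noise_est = w @ x
        e = primary[i] - noise_est                # error = cleaned sample
        w += 2 * mu * e * x                       # LMS weight update
        out[i] = e
    return out

# Example: recover a slow sinusoid buried in noise seen by both channels
rng = np.random.default_rng(0)
t = np.arange(4000)
noise = rng.standard_normal(4000)
primary = np.sin(2 * np.pi * t / 400) + noise
cleaned = lms_cancel(primary, reference=noise)
```

    After the weights converge, the output tracks the slow signal while the reference-correlated noise is largely subtracted out.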

  20. Fabrication of dense wavelength division multiplexing filters with large useful area

    NASA Astrophysics Data System (ADS)

    Lee, Cheng-Chung; Chen, Sheng-Hui; Hsu, Jin-Cherng; Kuo, Chien-Cheng

    2006-08-01

    Dense Wavelength Division Multiplexers (DWDM), a kind of narrow band-pass filter, are extremely sensitive to optical thickness errors in each composite layer. Achieving a large useful coating area is therefore extremely difficult because of the uniformity problem. To enlarge the useful coating area, it is necessary to improve both the design and the fabrication. In this study, we discuss how the tooling factors at different positions and for different materials are related to the optical performance of the design. 100 GHz DWDM filters were fabricated by E-gun evaporation with ion-assisted deposition (IAD). To improve the coating uniformity, an analysis technique called the shaping tooling factor (STF) was used to analyze the deviation of the optical thickness in different materials so as to enlarge the useful coating area. A technique of etching the deposited layers with oxygen ions was also introduced. When the above techniques were applied to the fabrication of 100 GHz DWDM filters, the uniformity was better than +/-0.002% over an area 72 mm in diameter and better than +/-0.0006% over 20 mm in diameter.

  1. Noise reduction techniques for Bayer-matrix images

    NASA Astrophysics Data System (ADS)

    Kalevo, Ossi; Rantanen, Henry

    2002-04-01

    In this paper, some arrangements to apply Noise Reduction (NR) techniques for images captured by a single-sensor digital camera are studied. Usually, the NR filter processes full three-color-component image data. This requires that the raw Bayer-matrix image data, available from the image sensor, first be interpolated by using a Color Filter Array Interpolation (CFAI) method. The other choice is to process the raw Bayer-matrix image data directly. The advantages and disadvantages of both processing orders, before (pre-) CFAI and after (post-) CFAI, are studied with linear, multistage median, multistage median hybrid, and median-rational filters. The comparison is based on the quality of the output image, the processing power requirements, and the amount of memory needed. A solution that improves the preservation of details in NR filtering before the CFAI is also proposed.
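    The pre-CFAI ordering discussed above can be sketched by filtering each Bayer sub-lattice separately, so that samples of different colors never mix. This is a plain 3x3 median stand-in for the paper's multistage median and hybrid filters, shown under an assumed RGGB pattern.

```python
import numpy as np

def median_nr_bayer(raw):
    """3x3 median noise reduction applied before CFA interpolation.

    Each of the four Bayer sub-lattices (e.g. R, G1, G2, B for RGGB) is
    filtered independently, so only same-color samples are combined.
    (Illustrative pre-CFAI sketch, not the paper's filter variants.)
    """
    out = raw.astype(float).copy()
    for dy in (0, 1):
        for dx in (0, 1):
            plane = raw[dy::2, dx::2].astype(float)
            padded = np.pad(plane, 1, mode="edge")
            h, w = plane.shape
            # gather the 3x3 neighborhood of every site into one stack
            stack = np.stack([padded[r:r + h, c:c + w]
                              for r in range(3) for c in range(3)])
            out[dy::2, dx::2] = np.median(stack, axis=0)
    return out

# Example: an impulsive hot pixel on the red lattice is removed
raw = np.full((8, 8), 100.0)
raw[2, 2] = 4000.0            # defect at an R site under an RGGB pattern
den = median_nr_bayer(raw)
```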

  2. Automated selection of the most epithelium-rich areas in gynecologic tumor sections.

    PubMed

    Schipper, N W; Baak, J P; Smeulders, A W

    1991-12-01

    The paper describes an image analysis technique for automated selection of the epithelium-rich areas in standard paraffin tissue sections of ovarian and endometrial premalignancies and malignancies. Two staining procedures were evaluated, Feulgen (pararosanilin) and CAM 5.2, demonstrating the presence of cytokeratin 8 and 18; both were counterstained with naphthol yellow. The technique is based on the corresponding image processing method of automated estimation of the percentage of epithelium in interactively selected microscope fields. With the technique, one image is recorded with a filter to demonstrate where epithelium and stroma lie. This filter is chosen according to the type of staining: it is yellow (lambda = 552 nm) for Feulgen and blue (lambda = 470 nm) for anticytokeratin CAM 5.2. When stroma cannot be distinguished from lumina with the green filter or from epithelium with the blue filter, a second image is recorded from the same microscope field, with a blue filter (lambda = 420 nm) for Feulgen and a yellow filter (lambda = 576 nm) for anticytokeratin CAM 5.2. Discrimination between epithelium and stroma is based on the image contrast range and the packing of nuclei in the yellow image and on the automated classification of the gray value histogram peaks in the blue image. For the Feulgen stain, the method was evaluated on 30 ovarian tumors of the common epithelial types (8 borderline tumors and 22 carcinomas with various degrees of differentiation) and 30 endometrial carcinomas of different grades. (ABSTRACT TRUNCATED AT 250 WORDS)

  3. Gaussian Process Kalman Filter for Focal Plane Wavefront Correction and Exoplanet Signal Extraction

    NASA Astrophysics Data System (ADS)

    Sun, He; Kasdin, N. Jeremy

    2018-01-01

    Currently, the ultimate limitation of space-based coronagraphy is the ability to subtract the residual PSF after wavefront correction to reveal the planet. Called reference difference imaging (RDI), the technique consists of conducting wavefront control to collect the reference point spread function (PSF) by observing a bright star, and then extracting target planet signals by subtracting a weighted sum of reference PSFs. Unfortunately, this technique is inherently inefficient because it spends a significant fraction of the observing time on the reference star rather than the target star with the planet. Recent progress in model based wavefront estimation suggests an alternative approach. A Kalman filter can be used to estimate the stellar PSF for correction by the wavefront control system while simultaneously estimating the planet signal. Without observing the reference star, the (extended) Kalman filter directly utilizes the wavefront correction data and combines the time series observations and model predictions to estimate the stellar PSF and planet signals. Because wavefront correction is used during the entire observation with no slewing, the system has inherently better stability. In this poster we show our results aimed at further improving our Kalman filter estimation accuracy by including not only temporal correlations but also spatial correlations among neighboring pixels in the images. This technique is known as a Gaussian process Kalman filter (GPKF). We also demonstrate the advantages of using a Kalman filter rather than RDI by simulating a real space exoplanet detection mission.
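    The predict/update cycle underlying the estimator described above can be illustrated in its simplest linear, scalar form. The actual focal-plane estimator (electric-field states, Gaussian-process spatial priors) is far richer than this sketch, and all numbers below are assumptions.

```python
import numpy as np

def kalman_1d(z, x0, p0, q, r):
    """Minimal scalar Kalman filter: random-walk state observed in noise.

    Illustrates how model predictions and time-series observations are
    combined through the innovation; not the poster's GPKF estimator.
    """
    x, p = x0, p0
    xs = []
    for zk in z:
        p = p + q                    # predict: random-walk process noise
        k = p / (p + r)              # Kalman gain
        x = x + k * (zk - x)         # update with the innovation
        p = (1 - k) * p
        xs.append(x)
    return np.array(xs)

# Example: estimate a constant level from noisy measurements
rng = np.random.default_rng(1)
truth = 3.0
z = truth + 0.5 * rng.standard_normal(500)
est = kalman_1d(z, x0=0.0, p0=1.0, q=1e-6, r=0.25)
```

    With a tiny process noise q, the gain shrinks over time and the estimate behaves like a running mean; larger q lets the filter track a drifting state instead.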

  4. A Flexible Electronic Commerce Recommendation System

    NASA Astrophysics Data System (ADS)

    Gong, Songjie

    Recommendation systems have become very popular on e-commerce websites. Many of the largest commerce websites already use recommender technologies to help their customers find products to purchase. An electronic commerce recommendation system learns from a customer and recommends the products that the customer will find most valuable among those available. But most recommendation methods are hard-wired into the system and support only fixed recommendations. This paper presents a framework for a flexible electronic commerce recommendation system. The framework is composed of a user model interface, a recommendation engine, a recommendation strategy model, a recommendation technology group, a user interest model, and a database interface. In the recommendation strategy model, the method can be collaborative filtering, content-based filtering, association rule mining, knowledge-based filtering, or a mixture of these. The system maps implementation to demand through the strategy model, and the whole system is designed as standard parts to adapt to changes in the recommendation strategy.
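    One strategy such a recommendation-strategy model could plug in is item-based collaborative filtering. Below is a minimal sketch using cosine item-item similarities; the ratings matrix and user index are hypothetical.

```python
import numpy as np

def item_based_scores(ratings, user):
    """Item-based collaborative filtering scores for one user.

    `ratings` is a user x item matrix with 0 meaning "unrated".
    Item-item cosine similarities weight the target user's known
    ratings to score the items that user has not rated yet.
    (Illustrative sketch, not the paper's framework code.)
    """
    norms = np.linalg.norm(ratings, axis=0)
    norms[norms == 0] = 1.0
    sim = (ratings.T @ ratings) / np.outer(norms, norms)  # item x item cosine
    np.fill_diagonal(sim, 0.0)
    r = ratings[user]
    rated = r > 0
    weights = np.abs(sim[:, rated]).sum(axis=1)
    weights[weights == 0] = 1.0
    scores = (sim[:, rated] @ r[rated]) / weights
    scores[rated] = -np.inf       # never re-recommend an already-rated item
    return scores

# Example: user 0 has not rated item 2; a similar user rated it highly
R = np.array([[5.0, 4.0, 0.0],
              [4.0, 5.0, 5.0],
              [1.0, 2.0, 1.0]])
scores = item_based_scores(R, user=0)
best = int(np.argmax(scores))
```

    Swapping this function for a content-based or rule-mining scorer is exactly the kind of exchange the strategy model is meant to support.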

  5. Adaptable Iterative and Recursive Kalman Filter Schemes

    NASA Technical Reports Server (NTRS)

    Zanetti, Renato

    2014-01-01

    Nonlinear filters are often very computationally expensive and usually not suitable for real-time applications. Real-time navigation algorithms are typically based on linear estimators, such as the extended Kalman filter (EKF) and, to a much lesser extent, the unscented Kalman filter. The Iterated Kalman filter (IKF) and the Recursive Update Filter (RUF) are two algorithms that reduce the consequences of the EKF's linearization assumption by performing N updates for each new measurement, where N, the number of recursions, is a tuning parameter. This paper introduces an adaptable RUF algorithm that calculates N on the fly; a similar technique can be used for the IKF as well.

  6. Investigation on filter method for smoothing spiral phase plate

    NASA Astrophysics Data System (ADS)

    Zhang, Yuanhang; Wen, Shenglin; Luo, Zijian; Tang, Caixue; Yan, Hao; Yang, Chunlin; Liu, Mincai; Zhang, Qinghua; Wang, Jian

    2018-03-01

    The spiral phase plate (SPP) for generating vortex hollow beams has high efficiency in various applications. However, it is difficult to fabricate an ideal spiral phase plate because of its continuously varying helical phase and discontinuous phase step. This paper describes the demonstration of a continuous spiral phase plate using filter methods. Numerical simulations indicate that each filter method, including spatial-domain and frequency-domain filters, has a distinct impact on the surface topography of the SPP and on the optical vortex characteristics. The experimental results reveal that the spatial Gaussian filter method for smoothing the SPP is well suited to the Computer Controlled Optical Surfacing (CCOS) technique and yields good optical properties.

  7. Electrostatic atomization--Experiment, theory and industrial applications

    NASA Astrophysics Data System (ADS)

    Okuda, H.; Kelly, Arnold J.

    1996-05-01

    Experimental and theoretical research has been initiated at the Princeton Plasma Physics Laboratory on the electrostatic atomization process in collaboration with Charged Injection Corporation. The goal of this collaboration is to set up a comprehensive research and development program on the electrostatic atomization at the Princeton Plasma Physics Laboratory so that both institutions can benefit from the collaboration. Experimental, theoretical and numerical simulation approaches are used for this purpose. An experiment consisting of a capillary sprayer combined with a quadrupole mass filter and a charge detector was installed at the Electrostatic Atomization Laboratory to study fundamental properties of the charged droplets such as the distribution of charges with respect to the droplet radius. In addition, a numerical simulation model is used to study interaction of beam electrons with atmospheric pressure water vapor, supporting an effort to develop an electrostatic water mist fire-fighting nozzle.

  8. Image sharpening for mixed spatial and spectral resolution satellite systems

    NASA Technical Reports Server (NTRS)

    Hallada, W. A.; Cox, S.

    1983-01-01

    Two methods of image sharpening (reconstruction) are compared. The first, a spatial filtering technique, extrapolates edge information from a high spatial resolution panchromatic band at 10 meters and adds it to the low spatial resolution narrow spectral bands. The second method, a color normalizing technique, is based on the ability to separate image hue and brightness components in spectral data. Using both techniques, multispectral images are sharpened from 30, 50, 70, and 90 meter resolutions. Error rates are calculated for the two methods and all sharpened resolutions. The results indicate that the color normalizing method is superior to the spatial filtering technique.

  9. Molecular filter based planar Doppler velocimetry

    NASA Astrophysics Data System (ADS)

    Elliott, Gregory S.; Beutner, Thomas J.

    1999-11-01

    Molecular filter based diagnostics are continuing to gain popularity as a research tool for investigations in areas of aerodynamics, fluid mechanics, and combustion. This class of diagnostics has gone by many terms including Filtered Rayleigh Scattering, Doppler Global Velocimetry, and Planar Doppler Velocimetry. The majority of this article reviews recent advances in Planar Doppler Velocimetry in measuring up to three velocity components over a planar region in a flowfield. The history of the development of these techniques is given with a description of typical systems, components, and levels of uncertainty in the measurement. Current trends indicate that uncertainties on the order of 1 m/s are possible with these techniques. A comprehensive review is also given on the application of Planar Doppler Velocimetry to laboratory flows, supersonic flows, and large scale subsonic wind tunnels. The article concludes with a description of future trends, which may simplify the technique, followed by a description of techniques which allow multi-property measurements (i.e. velocity, density, temperature, and pressure) simultaneously.

  10. Filter-based chemical sensors for hazardous materials

    NASA Astrophysics Data System (ADS)

    Major, Kevin J.; Ewing, Kenneth J.; Poutous, Menelaos K.; Sanghera, Jasbinder S.; Aggarwal, Ishwar D.

    2014-05-01

    The development of new techniques for the detection of homemade explosive devices is an area of intense research for the defense community. Such sensors must exhibit high selectivity to detect explosives and/or explosives-related materials in a complex environment. Spectroscopic techniques such as FTIR are capable of discriminating between the volatile components of explosives; however, there is a need for less expensive systems for wide-range use in the field. To tackle this challenge we are investigating the use of multiple, overlapping, broad-band infrared (IR) filters to enable discrimination of volatile chemicals associated with an explosive device from potential background interferants with similar chemical signatures. We present an optical approach for the detection of fuel oil (the volatile component in ammonium nitrate-fuel oil explosives) that relies on IR absorption spectroscopy in a laboratory environment. Our proposed system utilizes a three-filter set to separate the IR signals from fuel oil and various background interferants in the sample headspace. Filter responses for the chemical spectra are calculated using a Gaussian filter set. We demonstrate that using a specifically chosen filter set enables discrimination of pure fuel oil, hexanes, and acetone, as well as various mixtures of these components. We examine the effects of varying carrier gases and humidity on the collected spectra and corresponding filter response. We study the filter response on these mixtures over time, as well as present a variety of methods for observing the filter response functions to determine the response of this approach to detecting fuel oil in various environments.
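    The filter-response calculation described above (chemical spectra projected onto a Gaussian filter set) can be sketched as follows. The filter centers, widths, and synthetic spectra are illustrative assumptions, not the paper's three-filter set.

```python
import numpy as np

def filter_responses(wavenumbers, absorbance, centers, width):
    """Responses of overlapping Gaussian band filters to an IR spectrum.

    Each response is the absorbance spectrum integrated under a Gaussian
    passband; the vector of responses is the feature used to discriminate
    chemicals. (Hypothetical centers/width, for illustration only.)
    """
    dw = wavenumbers[1] - wavenumbers[0]
    resp = []
    for c in centers:
        g = np.exp(-0.5 * ((wavenumbers - c) / width) ** 2)
        resp.append(float(np.sum(absorbance * g)) * dw)
    return np.array(resp)

# Example: two synthetic spectra with distinct band positions (in cm^-1)
wn = np.linspace(2700.0, 3100.0, 400)
spec_a = np.exp(-0.5 * ((wn - 2850.0) / 15.0) ** 2)   # band near 2850
spec_b = np.exp(-0.5 * ((wn - 2960.0) / 15.0) ** 2)   # band near 2960
centers = [2850.0, 2905.0, 2960.0]
ra = filter_responses(wn, spec_a, centers, width=40.0)
rb = filter_responses(wn, spec_b, centers, width=40.0)
```

    Even though the three passbands overlap heavily, the two response vectors differ enough to separate the species, which is the essence of the low-cost discrimination scheme.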

  11. Behavior of Filters and Smoothers for Strongly Nonlinear Dynamics

    NASA Technical Reports Server (NTRS)

    Zhu, Yanqui; Cohn, Stephen E.; Todling, Ricardo

    1999-01-01

    The Kalman filter is the optimal filter in the presence of known Gaussian error statistics and linear dynamics. Extending the filter to nonlinear dynamics is nontrivial in the sense of appropriately representing the high-order moments of the statistics. Monte Carlo, ensemble-based methods have been advocated as the methodology for representing high-order moments without any questionable closure assumptions. Investigations along these lines have been conducted for highly idealized dynamics, such as the strongly nonlinear Lorenz model, as well as for more realistic models of the ocean and atmosphere. Relevant issues in this context include the number of ensemble members necessary to properly represent the error statistics and the modifications to the usual filter update needed to allow for correct updates of the ensemble members. The ensemble technique has also been applied to the problem of smoothing, for which similar questions apply. Ensemble smoother examples, however, seem to be quite puzzling in that the resulting state estimates are worse than those of their filter analogues. In this study, we use concepts from probability theory to revisit the ensemble methodology for filtering and smoothing in data assimilation. We use the Lorenz model to test and compare the behavior of a variety of implementations of ensemble filters. We also implement ensemble smoothers that are able to perform better than their filter counterparts. A discussion of the feasibility of these techniques for large data assimilation problems will be given at the time of the conference.
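    The modified ensemble update mentioned above can be illustrated with a stochastic ensemble Kalman filter step for a scalar state: each member is nudged toward its own perturbed copy of the observation, with a gain built from the ensemble variance. This is a minimal sketch, not the Lorenz-model experiments of the record.

```python
import numpy as np

def enkf_update(ensemble, obs, obs_noise_std, rng):
    """Stochastic ensemble Kalman filter update, scalar state.

    Perturbing the observation once per member keeps the posterior
    ensemble spread statistically consistent with Kalman theory.
    (Illustrative sketch; all numbers below are assumptions.)
    """
    var = np.var(ensemble, ddof=1)
    gain = var / (var + obs_noise_std ** 2)          # ensemble Kalman gain
    perturbed = obs + obs_noise_std * rng.standard_normal(len(ensemble))
    return ensemble + gain * (perturbed - ensemble)

# Example: a prior ensemble N(1, 4) assimilates an observation of 5
rng = np.random.default_rng(2)
prior = 1.0 + 2.0 * rng.standard_normal(200)
post = enkf_update(prior, obs=5.0, obs_noise_std=1.0, rng=rng)
```

    The posterior mean moves most of the way toward the observation (the prior variance dominates the observation variance here) and the ensemble spread contracts, as the Kalman update prescribes.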

  12. Signal Processing Equipment and Techniques for Use in Measuring Ocean Acoustic Multipath Structures

    DTIC Science & Technology

    1983-12-01

    Front-matter excerpt (section and figure titles): the digital demodulator; number of bits in the input A/D converter and quantization effects; the demodulator output filter; power error caused by ignoring the cross-spectral term for first- and second-order Butterworth filters; multiplying D/A converter input and output spectra; demodulator output spectrum prior to filtering.

  13. Processing Functional Near Infrared Spectroscopy Signal with a Kalman Filter to Assess Working Memory during Simulated Flight.

    PubMed

    Durantin, Gautier; Scannella, Sébastien; Gateau, Thibault; Delorme, Arnaud; Dehais, Frédéric

    2015-01-01

    Working memory (WM) is a key executive function for operating aircraft, especially when pilots have to recall series of air traffic control instructions. There is a need to implement tools to monitor WM as its limitation may jeopardize flight safety. An innovative way to address this issue is to adopt a Neuroergonomics approach that merges knowledge and methods from Human Factors, System Engineering, and Neuroscience. A challenge of great importance for Neuroergonomics is to implement efficient brain imaging techniques to measure the brain at work and to design Brain Computer Interfaces (BCI). We used functional near infrared spectroscopy as it has been already successfully tested to measure WM capacity in complex environment with air traffic controllers (ATC), pilots, or unmanned vehicle operators. However, the extraction of relevant features from the raw signal in ecological environment is still a critical issue due to the complexity of implementing real-time signal processing techniques without a priori knowledge. We proposed to implement the Kalman filtering approach, a signal processing technique that is efficient when the dynamics of the signal can be modeled. We based our approach on the Boynton model of hemodynamic response. We conducted a first experiment with nine participants involving a basic WM task to estimate the noise covariances of the Kalman filter. We then conducted a more ecological experiment in our flight simulator with 18 pilots who interacted with ATC instructions (two levels of difficulty). The data was processed with the same Kalman filter settings implemented in the first experiment. This filter was benchmarked with a classical pass-band IIR filter and a Moving Average Convergence Divergence (MACD) filter. Statistical analysis revealed that the Kalman filter was the most efficient to separate the two levels of load, by increasing the observed effect size in prefrontal areas involved in WM. In addition, the use of a Kalman filter increased the performance of the classification of WM levels based on brain signal. The results suggest that the Kalman filter is a suitable approach for real-time improvement of near infrared spectroscopy signal in ecological situations and the development of BCI.

  14. Processing Functional Near Infrared Spectroscopy Signal with a Kalman Filter to Assess Working Memory during Simulated Flight

    PubMed Central

    Durantin, Gautier; Scannella, Sébastien; Gateau, Thibault; Delorme, Arnaud; Dehais, Frédéric

    2016-01-01

    Working memory (WM) is a key executive function for operating aircraft, especially when pilots have to recall series of air traffic control instructions. There is a need to implement tools to monitor WM as its limitation may jeopardize flight safety. An innovative way to address this issue is to adopt a Neuroergonomics approach that merges knowledge and methods from Human Factors, System Engineering, and Neuroscience. A challenge of great importance for Neuroergonomics is to implement efficient brain imaging techniques to measure the brain at work and to design Brain Computer Interfaces (BCI). We used functional near infrared spectroscopy as it has been already successfully tested to measure WM capacity in complex environment with air traffic controllers (ATC), pilots, or unmanned vehicle operators. However, the extraction of relevant features from the raw signal in ecological environment is still a critical issue due to the complexity of implementing real-time signal processing techniques without a priori knowledge. We proposed to implement the Kalman filtering approach, a signal processing technique that is efficient when the dynamics of the signal can be modeled. We based our approach on the Boynton model of hemodynamic response. We conducted a first experiment with nine participants involving a basic WM task to estimate the noise covariances of the Kalman filter. We then conducted a more ecological experiment in our flight simulator with 18 pilots who interacted with ATC instructions (two levels of difficulty). The data was processed with the same Kalman filter settings implemented in the first experiment. This filter was benchmarked with a classical pass-band IIR filter and a Moving Average Convergence Divergence (MACD) filter. Statistical analysis revealed that the Kalman filter was the most efficient to separate the two levels of load, by increasing the observed effect size in prefrontal areas involved in WM. In addition, the use of a Kalman filter increased the performance of the classification of WM levels based on brain signal. The results suggest that the Kalman filter is a suitable approach for real-time improvement of near infrared spectroscopy signal in ecological situations and the development of BCI. PMID:26834607

  15. Envelope filter sequence to delete blinks and overshoots.

    PubMed

    Merino, Manuel; Gómez, Isabel María; Molina, Alberto J

    2015-05-30

    Eye movements have been used in control interfaces and as indicators of somnolence, workload, and concentration. Different techniques can be used to detect them; we focus on the electrooculogram (EOG), in which two kinds of interference occur: blinks and overshoots. While both draw bell-shaped waveforms, blinks are caused by the eyelid, whereas overshoots occur due to target-localization error and sit on the saccade. Both need to be removed from the EOG to increase processing effectiveness. This paper describes offline and online processing implementations based on the lower envelope for removing bell-shaped noise; they are compared with a 300-ms median filter. The techniques were analyzed using two kinds of EOG data: signals generated by our own model, and real signals. Using a model signal made it possible to compare the filtered outputs with ideal data, so that processing precision in removing noise caused by blinks, overshoots, and general interference could be quantified. We analyzed the ability to delete blinks and overshoots, and the preservation of the waveform. Our technique showed a high capacity for reducing interference amplitudes (>97%), even exceeding the median filter (MF) results. However, the MF obtained better waveform preservation, with a smaller dependence on fixation width. The proposed technique is better at deleting blinks and overshoots than the MF in both model and real EOG signals.
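    A sliding-minimum lower envelope is one simple way to realize the envelope idea above: upward bell-shaped artifacts narrower than the window are ignored by the window minimum. This is an illustrative stand-in for the paper's off-/online sequence; the window length and synthetic signal below are assumed.

```python
import numpy as np

def lower_envelope(x, win):
    """Sliding-minimum lower envelope of a 1-D signal.

    Positive bell-shaped interference (blinks, overshoots) narrower
    than `win` samples is suppressed, while level changes wider than
    the window survive. (Minimal sketch of the envelope approach.)
    """
    half = win // 2
    padded = np.pad(x, half, mode="edge")
    return np.array([padded[i:i + win].min() for i in range(len(x))])

# Example: a blink-like positive bell riding on a saccade step
sig = np.concatenate([np.zeros(100), np.full(100, 50.0)])   # saccade step
blink = 80.0 * np.exp(-0.5 * ((np.arange(200) - 150) / 5.0) ** 2)
filtered = lower_envelope(sig + blink, win=41)
```

    The trade-off noted in the abstract is visible here: the window minimum also flattens the first half-window after the saccade edge, which is the waveform-preservation cost a median filter largely avoids.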

  16. An ultrasensitive bio-surrogate for nanoporous filter membrane performance metrology directed towards contamination control in microlithography applications

    NASA Astrophysics Data System (ADS)

    Ahmad, Farhan; Mish, Barbara; Qiu, Jian; Singh, Amarnauth; Varanasi, Rao; Bedford, Eilidh; Smith, Martin

    2016-03-01

    Contamination tolerances in semiconductor manufacturing processes have changed dramatically in the past two decades, reaching below 20 nm according to the guidelines of the International Technology Roadmap for Semiconductors. The move to narrower line widths drives the need for innovative filtration technologies that can achieve higher particle/contaminant removal performance, resulting in cleaner process fluids. Nanoporous filter membrane metrology tools that have been the workhorse over the past decade are also now reaching limits. For example, nanoparticle (NP) challenge testing is commonly applied for assessing the particle retention performance of filter membranes. Factors such as high NP size dispersity, low NP detection sensitivity, and high NP particle-filter affinity impose challenges in characterizing the next generation of nanoporous filter membranes. We report a novel bio-surrogate, a 5 nm DNA-dendrimer conjugate, for evaluating the particle retention performance of nanoporous filter membranes. A technique capable of single-molecule detection is employed to detect sparse concentrations of the conjugate in filter permeate, providing >1000-fold higher detection sensitivity than any existing 5 nm-sized particle enumeration technique. This bio-surrogate also offers narrow size distribution, high stability, and chemical tunability, and can discriminate various sub-15 nm pore-rated nanoporous filter membranes based on their particle retention performance. Due to the high bio-surrogate detection sensitivity, a lower challenge concentration of bio-surrogate (compared to other NPs of this size) can be used for filter testing, providing a better representation of customer applications. This new method should provide a better understanding of the next generation of filter membranes for removing defect-causing contaminants from lithography processes.

  17. Shaper-Based Filters for the compensation of the load cell response in dynamic mass measurement

    NASA Astrophysics Data System (ADS)

    Richiedei, Dario; Trevisani, Alberto

    2018-01-01

    This paper proposes a novel model-based signal filtering technique for dynamic mass measurement through load cells. Load cells are sensors with an underdamped oscillatory response, which usually imposes a long settling time. Real-time filtering is therefore necessary to compensate for this dynamics and to quickly retrieve the mass of the measurand (the steady-state value of the load cell response) before the measured signal actually settles. This problem has a big impact on the throughput of industrial weighing machines. In this paper a novel solution to this problem is developed: a model-based filtering technique is proposed to ensure accurate, robust, and rapid estimation of the mass of the measurand. The digital filters proposed are referred to as Shaper-Based Filters (SBFs) and are based on the convolution of the load cell output signal with a sequence of a few impulses (typically between 2 and 5). The amplitudes and the instants of application of the impulses are computed through the analytical development of the load cell step response, by imposing the admissible residual oscillation in the steady-state filtered signal and by requiring the desired sensitivity of the filter. The inclusion of robustness specifications effectively tackles the unavoidable uncertainty and variability in the load cell frequency and damping. The effectiveness of the proposed filters is proved experimentally through an industrial setup: the load-cell-instrumented weigh bucket of a multihead weighing machine for packaging. A performance comparison with other benchmark filters is provided and discussed too.
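    The impulse-sequence idea can be illustrated with the classic two-impulse zero-vibration shaper, whose amplitudes and spacing follow from an assumed load cell natural frequency and damping ratio so that the two delayed copies of the ringing cancel. The paper's robust 3-5 impulse SBF designs generalize this; the values below are assumptions.

```python
import numpy as np

def zv_shaper_filter(y, fs, fn, zeta):
    """Two-impulse shaper convolved with the load cell output.

    Amplitudes and spacing come from the natural frequency `fn` (Hz)
    and damping ratio `zeta`. (A minimal 2-impulse sketch of the
    shaper-based filtering idea, not the paper's robust designs.)
    """
    wd = 2 * np.pi * fn * np.sqrt(1 - zeta ** 2)     # damped frequency, rad/s
    K = np.exp(-zeta * np.pi / np.sqrt(1 - zeta ** 2))
    a1, a2 = 1 / (1 + K), K / (1 + K)                # unity-gain amplitudes
    d = int(round(np.pi / wd * fs))                  # half damped period, samples
    kernel = np.zeros(d + 1)
    kernel[0], kernel[d] = a1, a2
    return np.convolve(y, kernel)[:len(y)]

# Example: settle a simulated underdamped step response early
fs, fn, zeta = 1000.0, 10.0, 0.05
t = np.arange(2000) / fs
wd = 2 * np.pi * fn * np.sqrt(1 - zeta ** 2)
step = 1 - np.exp(-zeta * 2 * np.pi * fn * t) * (
    np.cos(wd * t) + zeta / np.sqrt(1 - zeta ** 2) * np.sin(wd * t))
shaped = zv_shaper_filter(step, fs, fn, zeta)
```

    After the second impulse (half a damped period), the shaped output sits near the steady-state value while the unshaped response is still ringing, which is how the filter retrieves the mass early.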

  18. Exploring Affiliation Network Models as a Collaborative Filtering Mechanism in E-Learning

    ERIC Educational Resources Information Center

    Rodriguez, Daniel; Sicilia, Miguel Angel; Sanchez-Alonso, Salvador; Lezcano, Leonardo; Garcia-Barriocanal, Elena

    2011-01-01

    The online interaction of learners and tutors in activities with concrete objectives provides a valuable source of data that can be analyzed for different purposes. One of these purposes is the use of the information extracted from that interaction to aid tutors and learners in decision making about either the configuration of further learning…

  19. Recommending Learning Activities in Social Network Using Data Mining Algorithms

    ERIC Educational Resources Information Center

    Mahnane, Lamia

    2017-01-01

    In this paper, we show how data mining algorithms (e.g. the Apriori Algorithm (AP) and Collaborative Filtering (CF)) are useful in a New Social Network (NSN-AP-CF). "NSN-AP-CF" processes the clusters based on different learning styles. Next, it analyzes the habits and the interests of the users through mining the frequent episodes by the…

  20. A Delphi Study on Collaborative Learning in Distance Education: The Faculty Perspective

    ERIC Educational Resources Information Center

    O'Neill, Susan; Scott, Murray; Conboy, Kieran

    2011-01-01

    This paper focuses on the factors that influence collaborative learning in distance education. Distance education has been around for many years and the use of collaborative learning techniques in distance education is becoming increasingly popular. Several studies have demonstrated the superiority of collaborative learning over traditional modes…

  1. Use of alpha spectroscopy for conducting rapid surveys of transuranic activity on air sample filters and smears.

    PubMed

    Hayes, Robert B; Peña, Adan M; Goff, Thomas E

    2005-08-01

    This paper demonstrates the utility of a portable alpha Continuous Air Monitor (CAM) as a benchtop scaler counter for multiple sample types, including fixed air sample filters and radiological smears. In counting radiological smears, the CAM is used very much like a gas flow proportional counter (GFPC), albeit with a lower efficiency. Due to the typically low background in this configuration, the minimum detectable activity for a 5-min count should be about 10 dpm, acceptably below the 20 dpm limit for transuranic isotopes. When counting fixed air sample filters, the CAM algorithm along with other measurable characteristics can be used to identify and quantify the presence of transuranic isotopes in the samples. When the radiological control technician wants to discount contributions from naturally occurring radioactive material (radon progeny producing higher-energy peaks, as is the case with a fixed air sample filter), more elaborate techniques are required. The techniques presented here generate a decision level of about 43 dpm for such applications. The calibration for this application should alternatively be done using the default values of channels 92-126 for region of interest 1. This can be done within 10 to 15 min, resulting in a method to rapidly evaluate air filters for transuranic activity. Compared to the 1-h count technique, the present work shows that more than two thirds of samples can be rapidly demonstrated (within 10 to 15 min) to be within regulatory compliant limits. In both cases, however, spectral quality checks are required to ensure sample self-attenuation is not a significant bias in the activity estimates. This allows the same level of confidence when using these techniques for activity quantification as is presently available for air monitoring activity quantification using CAMs.
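    The decision-level and minimum-detectable-activity figures above follow from standard Currie counting statistics. A hedged sketch using the common Currie forms; the background counts and efficiency below are illustrative, not the paper's values:

```python
import math

def decision_level_dpm(bkg_counts, count_time_min, efficiency):
    """Currie critical level Lc = 2.33*sqrt(B) (paired blank), in dpm."""
    return 2.33 * math.sqrt(bkg_counts) / (efficiency * count_time_min)

def mda_dpm(bkg_counts, count_time_min, efficiency):
    """Currie a-priori MDA = (2.71 + 4.65*sqrt(B)) / (eff * t), in dpm."""
    return (2.71 + 4.65 * math.sqrt(bkg_counts)) / (efficiency * count_time_min)

# Illustrative: 0.5 background counts in a 5-min count at 20% efficiency
mda = mda_dpm(0.5, 5.0, 0.20)   # a few dpm, as expected for a low-background alpha count
```

    The very low alpha background of the CAM in this geometry is what drives the MDA well below the 20 dpm transuranic limit.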

  2. Enhanced sludge dewatering by electrofiltration. A feasibility study.

    PubMed

    Saveyn, H; Huybregts, L; Van der Meeren, P

    2001-01-01

    Sludge treatment is a major issue in today's wastewater treatment. One of the problems encountered is the limited dewaterability of mainly biological sludges, causing high final treatment costs for incineration or landfill. Although improvements have been realised in the field of dewatering in recent years, the actual dry solids content after dewatering remains at a maximum of about 35%. In order to increase the dry solids content, the technique of electrofiltration was investigated. Electrofiltration is the combination of two known techniques: traditional pressure filtration and electroosmotic/electrophoretic dewatering. Pressure filtration uses pressure as the driving force for dewatering a sludge; its limitations lie in the clogging of the filter cloth due to the build-up of the filter cake. Electroosmotic/electrophoretic dewatering uses an electric field to separate sludge colloid particles from the surrounding liquid by placing the sludge liquor between two oppositely charged electrodes. In this case, mobile sludge particles move to one electrode due to their natural surface charge, and the liquid phase is collected at the oppositely charged electrode. Combining both techniques makes it possible to create a more homogeneous filter cake and prevent the filter from clogging, resulting in higher cake dry solids contents and shorter filtration cycles. To investigate the feasibility of this technique for the dewatering of activated sludge, a filter unit was developed for lab-scale investigations. Multiple dewatering tests were performed in which the electric parameters for electrofiltration were varied. These experiments showed that very high filter cake dry solids contents (more than 60%) and short filtration cycles were attainable using a relatively small DC electric field. The power consumption was very low compared to the power needed to dewater sludge by thermal drying techniques. For this reason, this technique seems very promising for the dewatering of biological sludges.
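    The electroosmotic contribution described above is commonly estimated with the Helmholtz-Smoluchowski relation, u = ε·ζ·E/μ. A sketch with typical water-like parameter values (assumed for illustration, not taken from the study):

```python
def electroosmotic_velocity(eps_r, zeta_v, e_field, viscosity):
    """Helmholtz-Smoluchowski electroosmotic velocity u = eps_r*eps0*zeta*E/mu (m/s)."""
    eps0 = 8.854e-12  # vacuum permittivity (F/m)
    return eps_r * eps0 * zeta_v * e_field / viscosity

# Illustrative values: water (eps_r ~ 80, mu ~ 1e-3 Pa.s), a sludge particle
# zeta potential of -30 mV, and a field of 1000 V/m across the cake
u = electroosmotic_velocity(80.0, -0.03, 1000.0, 1e-3)   # tens of micrometres per second
```

    The negative sign reflects the direction of flow set by the (negative) surface charge of the sludge particles; the modest magnitude explains why electroosmosis complements, rather than replaces, pressure-driven filtration.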

  3. An Adaptive Filter for the Removal of Drifting Sinusoidal Noise Without a Reference.

    PubMed

    Kelly, John W; Siewiorek, Daniel P; Smailagic, Asim; Wang, Wei

    2016-01-01

    This paper presents a method for filtering sinusoidal noise with a variable bandwidth filter that is capable of tracking a sinusoid's drifting frequency. The method, which is based on the adaptive noise canceling (ANC) technique, is referred to here as the adaptive sinusoid canceler (ASC). The ASC eliminates sinusoidal contamination by tracking its frequency and achieving a narrower bandwidth than typical notch filters. The detected frequency is used to digitally generate an internal reference instead of relying on an external one as ANC filters typically do. The filter's bandwidth adjusts to achieve faster and more accurate convergence. In this paper, the discussion and the data focus on physiological signals, specifically electrocorticographic (ECoG) neural data contaminated with power line noise, but the presented technique could be applicable to other recordings as well. On simulated data, the ASC was able to reliably track the noise's frequency, properly adjust its bandwidth, and outperform comparative methods including standard notch filters and an adaptive line enhancer. These results were reinforced by visual results obtained from real ECoG data. The ASC showed that it could be an effective method for increasing signal-to-noise ratio in the presence of drifting sinusoidal noise, which is of significant interest for biomedical applications.
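    The core ANC idea with an internally generated reference can be sketched as a two-weight LMS canceler at a fixed, known frequency. This is only the building block: the paper's ASC additionally tracks a drifting frequency and adapts its bandwidth, which this sketch omits.

```python
import numpy as np

def lms_sine_canceler(x, f0, fs, mu=0.005):
    """Two-weight LMS canceler with internally generated quadrature references at f0 (Hz)."""
    n = np.arange(len(x))
    r1 = np.cos(2 * np.pi * f0 * n / fs)
    r2 = np.sin(2 * np.pi * f0 * n / fs)
    w1 = w2 = 0.0
    e = np.empty(len(x))
    for k in range(len(x)):
        y = w1 * r1[k] + w2 * r2[k]        # estimated sinusoidal interference
        e[k] = x[k] - y                    # cleaned output (error signal)
        w1 += 2 * mu * e[k] * r1[k]        # LMS weight updates
        w2 += 2 * mu * e[k] * r2[k]
    return e

fs, f0 = 1000.0, 60.0
n = np.arange(4000)
clean = 0.3 * np.sin(2 * np.pi * 7.0 * n / fs)            # slow "physiological" component
x = clean + 1.0 * np.sin(2 * np.pi * f0 * n / fs + 0.5)   # strong power-line contamination
e = lms_sine_canceler(x, f0, fs)                          # 60 Hz largely removed after convergence
```

    The step size mu controls the effective notch bandwidth, which is the knob the ASC adjusts on-line to trade convergence speed against distortion of nearby signal components.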

  4. High temperature superconducting YBCO microwave filters

    NASA Astrophysics Data System (ADS)

    Aghabagheri, S.; Rasti, M.; Mohammadizadeh, M. R.; Kameli, P.; Salamati, H.; Mohammadpour-Aghdam, K.; Faraji-Dana, R.

    2018-06-01

    Epitaxial thin films of the high-temperature superconductor YBCO are widely used in telecommunication technology, such as microwave filters, antennas and couplers, due to their lower surface resistance and lower microwave loss than their normal conductor counterparts. Thin films of YBCO were fabricated by the PLD technique on LAO substrates. The transition temperature and width were 88 K and 3 K, respectively. A filter pattern was designed and implemented on the films by wet photolithography. The characterization of the filter at 77 K is compared with simulation results and with the results for a fabricated gold filter. Both the YBCO and gold filters show high microwave loss: for the YBCO filter the reason may be improper contacts on the feedlines, while for the gold filter the low thickness of the gold film increased the loss.

  5. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern

    PubMed Central

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-01-01

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to the high sensitivity, the W pixels have better SNR (Signal to Noise Ratio) characteristics than other color pixels in the filter array, especially in low light conditions. However, most of the RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then again converted into the final color image by using conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small-numbered RGB pixels are randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, especially higher than those of conventional CFAs in low light conditions. Experimental results show that much important information that is not perceived in color images reconstructed with conventional CFAs is perceived in the images reconstructed with the proposed method. PMID:28657602

  6. Colorization-Based RGB-White Color Interpolation using Color Filter Array with Randomly Sampled Pattern.

    PubMed

    Oh, Paul; Lee, Sukho; Kang, Moon Gi

    2017-06-28

    Recently, several RGB-White (RGBW) color filter arrays (CFAs) have been proposed, which have extra white (W) pixels in the filter array that are highly sensitive. Due to the high sensitivity, the W pixels have better SNR (Signal to Noise Ratio) characteristics than other color pixels in the filter array, especially in low light conditions. However, most of the RGBW CFAs are designed so that the acquired RGBW pattern image can be converted into the conventional Bayer pattern image, which is then again converted into the final color image by using conventional demosaicing methods, i.e., color interpolation techniques. In this paper, we propose a new RGBW color filter array based on a totally different color interpolation technique, the colorization algorithm. The colorization algorithm was initially proposed for colorizing a gray image into a color image using a small number of color seeds. Here, we adopt this algorithm as a color interpolation technique, so that the RGBW color filter array can be designed with a very large number of W pixels to make the most of the highly sensitive characteristics of the W channel. The resulting RGBW color filter array has a pattern with a large proportion of W pixels, while the small-numbered RGB pixels are randomly distributed over the array. The colorization algorithm makes it possible to reconstruct the colors from such a small number of RGB values. Due to the large proportion of W pixels, the reconstructed color image has a high SNR value, especially higher than those of conventional CFAs in low light conditions. Experimental results show that much important information that is not perceived in color images reconstructed with conventional CFAs is perceived in the images reconstructed with the proposed method.

  7. Optimizing dual-energy x-ray parameters for the ExacTrac clinical stereoscopic imaging system to enhance soft-tissue imaging.

    PubMed

    Bowman, Wesley A; Robar, James L; Sattarivand, Mike

    2017-03-01

    Stereoscopic x-ray image guided radiotherapy for lung tumors is often hindered by bone overlap and limited soft-tissue contrast. This study aims to evaluate the feasibility of dual-energy imaging techniques and to optimize parameters of the ExacTrac stereoscopic imaging system to enhance soft-tissue imaging for application to lung stereotactic body radiation therapy. Simulated spectra and a physical lung phantom were used to optimize filter material, thickness, tube potentials, and weighting factors to obtain bone-subtracted dual-energy images. Spektr simulations were used to identify materials in the atomic number range (3-83) based on a metric defined to separate the high- and low-energy spectra. Both energies used the same filter due to time constraints of imaging in the presence of respiratory motion. The lung phantom contained bone, soft tissue, and tumor mimicking materials, and it was imaged with a filter thickness in the range of (0-0.7) mm and a kVp range of (60-80) for low energy and (120, 140) for high energy. Optimal dual-energy weighting factors were obtained when the bone to soft-tissue contrast-to-noise ratio (CNR) was minimized. Optimal filter thickness and tube potential were achieved by maximizing tumor-to-background CNR. Using the optimized parameters, dual-energy images of an anthropomorphic Rando phantom with a spherical tumor mimicking material inserted in its lung were acquired and evaluated for bone subtraction and tumor contrast. Imaging dose was measured using the dual-energy technique with and without beam filtration and matched to that of a conventional clinical single-energy technique. Tin was the material of choice for beam filtering, providing the best energy separation while being non-toxic and non-reactive. The best soft-tissue-weighted image in the lung phantom was obtained using 0.2 mm tin and the (140, 60) kVp pair.
Dual-energy images of the Rando phantom with the tin filter had noticeable improvement in bone elimination, tumor contrast, and noise content when compared to dual-energy imaging with no filtration. The surface dose was 0.52 mGy per stereoscopic view for both the clinical single-energy technique and the dual-energy technique, both with and without the tin filter. Dual-energy soft-tissue imaging is feasible without additional imaging dose using the ExacTrac stereoscopic imaging system with optimized acquisition parameters and no beam filtration. The addition of a single tin filter for both the high and low energies noticeably improves dual-energy imaging with optimized parameters. Clinical implementation of a dual-energy technique on ExacTrac stereoscopic imaging could improve lung tumor visibility. © 2017 American Association of Physicists in Medicine.
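    The weighted log-subtraction underlying dual-energy bone cancellation can be sketched as follows. The transmission values are synthetic and purely illustrative; the weight is chosen to null bone contrast, which is the criterion the study applies via minimizing bone-to-soft-tissue CNR:

```python
import numpy as np

def dual_energy_weight(low_bone, low_bkg, high_bone, high_bkg):
    """Weight w that nulls bone contrast in ln(I_high) - w*ln(I_low)."""
    return (np.log(high_bone) - np.log(high_bkg)) / (np.log(low_bone) - np.log(low_bkg))

def soft_tissue_image(img_low, img_high, w):
    """Weighted log subtraction: bone cancels, soft-tissue contrast remains."""
    return np.log(img_high) - w * np.log(img_low)

# Synthetic transmissions (fractions of incident intensity) for two pixels:
# [bone, soft-tissue background], at the low and high tube potentials
img_low = np.array([0.2, 0.5])
img_high = np.array([0.5, 0.7])

w = dual_energy_weight(0.2, 0.5, 0.5, 0.7)
de = soft_tissue_image(img_low, img_high, w)   # bone pixel now matches the background
```

    Because bone attenuation changes more steeply with energy than soft tissue, a single scalar weight suffices to suppress it while leaving tumor-to-background contrast largely intact.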

  8. Tracking with time-delayed data in multisensor systems

    NASA Astrophysics Data System (ADS)

    Hilton, Richard D.; Martin, David A.; Blair, William D.

    1993-08-01

    When techniques for target tracking are expanded to make use of multiple sensors in a multiplatform system, the possibility of time-delayed data becomes a reality. When a discrete-time Kalman filter is applied and some of the data entering the filter are delayed, proper processing of these late data is a necessity for obtaining an optimal estimate of a target's state. If this problem is not given special care, the quality of the state estimates can be degraded relative to that provided by a single sensor. A negative-time update technique is developed using the criterion of minimum mean-square error (MMSE) under the constraint that only the results of the most recent update are saved. The performance of the MMSE technique is compared to that of the ad hoc approach employed in the Cooperative Engagement Capabilities (CEC) system for processing data from multiple platforms. The MMSE technique proved to be a stable solution to the negative-time update problem, while the CEC technique was found to be less than desirable when used with filters designed for tracking highly maneuvering targets at relatively low data rates. The MMSE negative-time update technique was found to be a superior alternative to the existing CEC negative-time update technique.
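    One exact baseline for handling late data, when measurement buffering is allowed, is simply to re-run the filter with the late measurement reinserted in time order; the paper's MMSE negative-time update avoids this by saving only the most recent update. A scalar random-walk sketch of the buffered baseline (not the paper's algorithm):

```python
import numpy as np

def kf_run(z_seq, q=0.01, r=0.25, x0=0.0, p0=1.0):
    """Scalar random-walk Kalman filter over time-ordered measurements.

    A None entry means the measurement for that step is missing."""
    x, p = x0, p0
    for z in z_seq:
        p = p + q                      # predict (random-walk process noise)
        if z is None:
            continue                   # no update this step
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)            # update
        p = (1.0 - k) * p
    return x, p

rng = np.random.default_rng(0)
truth = np.cumsum(rng.normal(0.0, 0.1, 50))
meas = list(truth + rng.normal(0.0, 0.5, 50))

# Suppose measurement 30 arrives late: dropping it leaves a gap and a larger
# final covariance, while reinserting it in order recovers the optimal estimate.
dropped = meas[:30] + [None] + meas[31:]
x_drop, p_drop = kf_run(dropped)
x_full, p_full = kf_run(meas)
```

    The cost of this baseline is storing and reprocessing the measurement history, which is exactly what the single-saved-update constraint in the paper rules out.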

  9. Calculations to Support On-line Neutron Spectrum Adjustment by Measurements with Miniature Fission Chambers in the JSI TRIGA Reactor

    NASA Astrophysics Data System (ADS)

    Kaiba, Tanja; Radulović, Vladimir; Žerovnik, Gašper; Snoj, Luka; Fourmentel, Damien; Barbot, Loïc; Destouches, Christophe

    2018-01-01

    Preliminary calculations were performed with the aim of establishing optimal experimental conditions for the measurement campaign within the collaboration between the Jožef Stefan Institute (JSI) and the Commissariat à l'Énergie Atomique et aux Énergies Alternatives (CEA Cadarache). The goal of the project is to further characterize the neutron spectrum inside the JSI TRIGA reactor core, with a focus on measuring the epi-thermal and fast parts of the spectrum. Measurements will be performed with fission chambers containing different fissile materials (235U, 237Np and 242Pu) covered with thermal neutron filters (Cd and Gd). The changes in the detected signal and neutron flux spectrum with and without the transmission filter were studied. Additional effort was put into evaluating the effect of the filter geometry (e.g. an opening at the top end of the filter) on the detector signal. After analysis of the scoping calculations, it was concluded to position the experiment in the outer core ring, inside one of the empty fuel element positions.
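    The thermal-screening effect of a cadmium cover can be estimated from simple exponential attenuation, T = exp(-Σa·t). A sketch using commonly quoted round-number values for natural Cd (assumed for illustration; the study's actual filter evaluation uses full transport calculations):

```python
import math

def cd_thermal_transmission(thickness_cm):
    """Thermal-neutron transmission through natural Cd via T = exp(-Sigma_a * t).

    Sigma_a = N * sigma_a with N ~ 4.6e22 atoms/cm^3 and a thermal absorption
    cross-section of ~2520 barns (assumed representative values)."""
    sigma_a_cm2 = 2520e-24          # barns -> cm^2
    n_per_cm3 = 4.6e22
    return math.exp(-n_per_cm3 * sigma_a_cm2 * thickness_cm)

t1 = cd_thermal_transmission(0.1)   # 1 mm of Cd blocks essentially all thermal neutrons
```

    This is why a Cd- or Gd-covered fission chamber responds almost exclusively to the epi-thermal and fast parts of the spectrum, and why openings in the filter geometry matter for the detected signal.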

  10. Comment on "A re-assessment of the safety of silver in household water treatment: rapid systematic review of mammalian in vivo genotoxicity studies".

    PubMed

    Lantagne, Daniele; Rayner, Justine; Mittelman, Anjuliee; Pennell, Kurt

    2017-11-13

    We wish to thank Fewtrell, Majuru, and Hunter for their article highlighting genotoxic risks associated with the use of particulate silver for primary drinking water treatment. The recent promotion of colloidal silver products for household water treatment in developing countries is problematic due to previously identified concerns regarding manufacturing quality and questionable advertising practices, as well as the low efficiency of silver nanoparticles to treat bacteria, viruses, and protozoa in source waters. However, in the conclusion statement of the manuscript, Fewtrell et al. state, "Before colloidal Ag or AgNP are used in filter matrices for drinking water treatment, consideration needs to be given to how much silver is likely to be released from the matrix during the life of the filter." Unfortunately, it appears Fewtrell et al. were unaware that studies of silver nanoparticle and silver ion elution from ceramic filters manufactured and used in developing countries have already been completed. These existing studies have found that: 1) silver ions, not silver nanoparticles, are eluted from ceramic filters treated with silver nanoparticles or silver nitrate; and, 2) silver ions have not been shown to be genotoxic. Thus, the existing recommendation of applying silver nanoparticles to ceramic filters to prevent biofilm formation within the filter and improve microbiological efficacy should still be adhered to, as there is no identified risk to people who drink water from ceramic filters treated with silver nanoparticles or silver nitrate. We note that efforts should continue to minimize exposure to silver nanoparticles (and silica) to employees in ceramic filter factories in collaboration with the organizations that provide technical assistance to ceramic filter factories.

  11. Machine Learning in the Presence of an Adversary: Attacking and Defending the SpamBayes Spam Filter

    DTIC Science & Technology

    2008-05-20

    Machine learning techniques are often used for decision making in security-critical applications such as intrusion detection and spam filtering… The defenses shown in this thesis are able to work against the attacks developed against SpamBayes and are sufficiently generic to be easily extended to other statistical machine learning algorithms.

  12. Microbial Survey of a Full-Scale, Biologically Active Filter for Treatment of Drinking Water

    PubMed Central

    DeBry, Ronald W.; Lytle, Darren A.

    2012-01-01

    The microbial community of a full-scale, biologically active drinking water filter was surveyed using molecular techniques. Nitrosomonas, Nitrospira, Sphingomonadales, and Rhizobiales dominated the clone libraries. The results elucidate the microbial ecology of biological filters and demonstrate that biological treatment of drinking water should be considered a viable alternative to physicochemical methods. PMID:22752177

  13. Using a Collaborative Critiquing Technique to Develop Chemistry Students' Technical Writing Skills

    ERIC Educational Resources Information Center

    Carr, Jeremy M.

    2013-01-01

    The technique, termed "collaborative critiquing", was developed to teach fundamental technical writing skills to analytical chemistry students for the preparation of laboratory reports. This exercise, which can be completed prior to peer-review activities, is novel, highly interactive, and allows students to take responsibility for their…

  14. Optimized Beam Sculpting with Generalized Fringe-rate Filters

    NASA Astrophysics Data System (ADS)

    Parsons, Aaron R.; Liu, Adrian; Ali, Zaki S.; Cheng, Carina

    2016-03-01

    We generalize the technique of fringe-rate filtering, whereby visibilities measured by a radio interferometer are re-weighted according to their temporal variation. As the Earth rotates, radio sources traverse through an interferometer’s fringe pattern at rates that depend on their position on the sky. Capitalizing on this geometric interpretation of fringe rates, we employ time-domain convolution kernels to enact fringe-rate filters that sculpt the effective primary beam of antennas in an interferometer. As we show, beam sculpting through fringe-rate filtering can be used to optimize measurements for a variety of applications, including mapmaking, minimizing polarization leakage, suppressing instrumental systematics, and enhancing the sensitivity of power-spectrum measurements. We show that fringe-rate filtering arises naturally in minimum variance treatments of many of these problems, enabling optimal visibility-based approaches to analyses of interferometric data that avoid systematics potentially introduced by traditional approaches such as imaging. Our techniques have recently been demonstrated in Ali et al., where new upper limits were placed on the 21 cm power spectrum from reionization, showcasing the ability of fringe-rate filtering to successfully boost sensitivity and reduce the impact of systematics in deep observations.
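    A minimal sketch of fringe-rate filtering as a temporal Fourier-domain mask on a single baseline's visibility time series. Real pipelines use tapered time-domain convolution kernels rather than the hard spectral mask used here; the tones and band edges are invented for illustration:

```python
import numpy as np

def fringe_rate_filter(vis, dt, fr_lo, fr_hi):
    """Zero all temporal Fourier modes (fringe rates) outside [fr_lo, fr_hi] Hz."""
    f = np.fft.fftfreq(len(vis), dt)
    V = np.fft.fft(vis)
    V[(f < fr_lo) | (f > fr_hi)] = 0.0
    return np.fft.ifft(V)

# Two point sources appear as complex tones at different fringe rates
n = np.arange(1024)
dt = 1.0                                    # 1 s integrations
f1, f2 = 50 / 1024, 200 / 1024              # exact FFT-bin fringe rates (Hz)
tone1 = np.exp(2j * np.pi * f1 * n)
tone2 = 0.7 * np.exp(2j * np.pi * f2 * n)
vis = tone1 + tone2

out = fringe_rate_filter(vis, dt, 0.02, 0.10)   # keep only the slower-fringing source
```

    Because a source's fringe rate maps to its sky position, selecting a fringe-rate band is equivalent to reshaping the effective primary beam, which is the beam-sculpting idea of the paper.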

  15. Sensitivity and specificity of monochromatic photography of the ocular fundus in differentiating optic nerve head drusen and optic disc oedema: optic disc drusen and oedema.

    PubMed

    Gili, Pablo; Flores-Rodríguez, Patricia; Yangüela, Julio; Orduña-Azcona, Javier; Martín-Ríos, María Dolores

    2013-03-01

    Evaluation of the efficacy of monochromatic photography of the ocular fundus in differentiating optic nerve head drusen (ONHD) and optic disc oedema (ODE). Sixty-six patients with ONHD, 31 patients with ODE and 70 healthy subjects were studied. Colour and monochromatic fundus photography with different filters (green, red and autofluorescence) were performed. The results were analysed blindly by two observers. The sensitivity, specificity and interobserver agreement (k) of each test were assessed. Colour photography offers 65.5% sensitivity and 100% specificity for the diagnosis of ONHD. Monochromatic photography improves sensitivity and specificity and provides similar results: green filter (71.20% sensitivity, 96.70% specificity), red filter (80.30% sensitivity, 96.80% specificity), and autofluorescence technique (87.8% sensitivity, 100% specificity). The interobserver agreement was good with all techniques used: autofluorescence (k = 0.957), green filter (k = 0.897), red filter (k = 0.818) and colour (k = 0.809). Monochromatic fundus photography permits ONHD and ODE to be differentiated, with good sensitivity and very high specificity. The best results were obtained with autofluorescence and red filter study.
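    The reported figures combine per-technique sensitivity/specificity with inter-observer Cohen's kappa; both are simple confusion-matrix computations. The counts below are made up for illustration and are not the study's data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    return tp / (tp + fn), tn / (tn + fp)

def cohens_kappa(both_yes, a_yes_b_no, a_no_b_yes, both_no):
    """Cohen's kappa for two raters from a 2x2 agreement table."""
    n = both_yes + a_yes_b_no + a_no_b_yes + both_no
    p_obs = (both_yes + both_no) / n                      # observed agreement
    p_a_yes = (both_yes + a_yes_b_no) / n
    p_b_yes = (both_yes + a_no_b_yes) / n
    p_exp = p_a_yes * p_b_yes + (1 - p_a_yes) * (1 - p_b_yes)  # chance agreement
    return (p_obs - p_exp) / (1 - p_exp)

# Hypothetical: 80 true positives, 20 false negatives, 96 true negatives, 4 false positives
sens, spec = sensitivity_specificity(80, 20, 96, 4)
kappa = cohens_kappa(20, 5, 10, 15)
```

    Kappa values above roughly 0.8, as reported for all four techniques, are conventionally read as excellent inter-observer agreement.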

  16. SiC: filter for extreme ultraviolet

    NASA Astrophysics Data System (ADS)

    Mitrofanov, Alexander V.; Pudonin, Fedor A.; Zhitnik, Igor A.

    1994-09-01

    It is proposed to use thin films of silicon carbide as extreme ultraviolet bandpass filters, transparent within the 135-304 Å band and with excellent cutoff blocking of the strong Lyman-alpha 1216 Å line radiation. SiC filters of 200-800 Å thickness, supported on mesh or particle-track porous membranes, have been made by RF sputtering techniques. We describe the design and performance of these filters. A SiC filter of this type was used in front of the microchannel plate detector of the TEREK X-Ray Telescope mounted on the Solar Observatory CORONAS-I, which was successfully launched on March 2, 1994.

  17. HYBRID SILICON-ON-SAPPHIRE/SCALED CMOS INTERFERENCE MITIGATION FRONT END BASED ON SIMULTANEOUS NOISE CANCELLATION, ACTIVE-INTERFERENCE CANCELLATION AND N-PATH-MIXER FILTERING

    DTIC Science & Technology

    2017-04-01

    Final report by Harish Krishnaswamy, Negar Reiskarimian, and Linxiao Zhang, Columbia University, April 2017 (Contract FA8650-14-1-7414; Program Element 61101E/62716E). The report describes techniques for developing interference mitigation technology (IMT) enabling frequency-agile, reconfigurable filter-less receivers. Wideband noise…

  18. Wideband bandpass filters employing broadside-coupled microstrip lines for MIC and MMIC applications

    NASA Technical Reports Server (NTRS)

    Tran, M.; Nguyen, C.

    1994-01-01

    Wideband bandpass filters employing half-wavelength broadside-coupled microstrip lines, suitable for microwave and millimeter-wave integrated circuits (MIC and MMIC), are presented. Several filters have been developed at X-band (8 to 12 GHz) with 1 dB insertion loss. Fair agreement between the measured and calculated results has been observed. The analysis of the broadside-coupled microstrip lines used in the filters, based on the quasi-static spectral domain technique, is also described.

  19. Collaborative learning in radiologic science education.

    PubMed

    Yates, Jennifer L

    2006-01-01

    Radiologic science is a complex health profession, requiring the competent use of technology as well as the ability to function as part of a team, think critically, exercise independent judgment, solve problems creatively and communicate effectively. This article presents a review of literature in support of the relevance of collaborative learning to radiologic science education. In addition, strategies for effective design, facilitation and authentic assessment of activities are provided for educators wishing to incorporate collaborative techniques into their program curriculum. The connection between the benefits of collaborative learning and necessary workplace skills, particularly in the areas of critical thinking, creative problem solving and communication skills, suggests that collaborative learning techniques may be particularly useful in the education of future radiologic technologists. This article summarizes research identifying the benefits of collaborative learning for adult education and the link between these benefits and the necessary characteristics of medical imaging technologists.

  20. Introducer curving technique for the prevention of tilting of transfemoral Günther Tulip inferior vena cava filter.

    PubMed

    Xiao, Liang; Huang, De-sheng; Shen, Jing; Tong, Jia-jie

    2012-01-01

    To determine whether the introducer curving technique is useful in decreasing the degree of tilting of transfemoral Tulip filters. The study group consisted of 108 patients with deep vein thrombosis who were enrolled, scheduled to undergo thrombolysis, and accepted the transfemoral Tulip filter insertion procedure. The patients were randomly divided into Group C and Group T. The introducer curving technique was adopted in Group T. The post-implantation filter tilting angle (ACF) was measured in an anteroposterior projection. Adherence of the retrieval hook to the vascular wall was assessed via tangential cavogram during retrieval. The overall average ACF was 5.8 ± 4.14 degrees; in Group C it was 7.1 ± 4.52 degrees and in Group T 4.4 ± 3.20 degrees, a statistically significant difference (t = 3.573, p = 0.001). Additionally, the difference in ACF between the left and right approaches was statistically significant (7.1 ± 4.59 vs. 5.1 ± 3.82, t = 2.301, p = 0.023). The proportion of severe tilt (ACF ≥ 10°) in Group T was significantly lower than that in Group C (9.3% vs. 24.1%, χ(2) = 4.267, p = 0.039). Between the groups, the difference in the rate of the retrieval hook adhering to the vascular wall was also statistically significant (2.9% vs. 24.2%, χ(2) = 5.030, p = 0.025). The introducer curving technique appears to minimize the incidence and extent of transfemoral Tulip filter tilting.

  1. Optical filter selection for high confidence discrimination of strongly overlapping infrared chemical spectra.

    PubMed

    Major, Kevin J; Poutous, Menelaos K; Ewing, Kenneth J; Dunnill, Kevin F; Sanghera, Jasbinder S; Aggarwal, Ishwar D

    2015-09-01

    Optical filter-based chemical sensing techniques provide a new avenue to develop low-cost infrared sensors. These methods utilize multiple infrared optical filters to selectively measure different response functions for various chemicals, dependent on each chemical's infrared absorption. Rather than identifying distinct spectral features, which can then be used to determine the identity of a target chemical, optical filter-based approaches rely on measuring differences in the ensemble response between a given filter set and specific chemicals of interest. Therefore, the results of such methods are highly dependent on the original optical filter choice, which will dictate the selectivity, sensitivity, and stability of any filter-based sensing method. Recently, a method has been developed that utilizes unique detection vector operations defined by optical multifilter responses, to discriminate between volatile chemical vapors. This method, comparative-discrimination spectral detection (CDSD), is a technique which employs broadband optical filters to selectively discriminate between chemicals with highly overlapping infrared absorption spectra. CDSD has been shown to correctly distinguish between similar chemicals in the carbon-hydrogen stretch region of the infrared absorption spectra from 2800-3100 cm(-1). A key challenge to this approach is how to determine which optical filter sets should be utilized to achieve the greatest discrimination between target chemicals. Previous studies used empirical approaches to select the optical filter set; however this is insufficient to determine the optimum selectivity between strongly overlapping chemical spectra. Here we present a numerical approach to systematically study the effects of filter positioning and bandwidth on a number of three-chemical systems. We describe how both the filter properties, as well as the chemicals in each set, affect the CDSD results and subsequent discrimination. 
These results demonstrate the importance of choosing the proper filter set and chemicals for comparative discrimination, in order to identify the target chemical of interest in the presence of closely matched chemical interferents. These findings are an integral step in the development of experimental prototype sensors, which will utilize CDSD.
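    The ensemble-response idea above can be sketched numerically: each chemical yields a response vector whose components are the integrals of its absorption spectrum against each filter's transmission curve, and discrimination reduces to comparing vector directions. The spectra, filter shapes, and numbers below are hypothetical illustrations, not data from the CDSD work.

```python
import numpy as np

# Wavenumber grid over the C-H stretch region (cm^-1)
nu = np.linspace(2800, 3100, 301)

def gaussian(nu, center, width):
    return np.exp(-0.5 * ((nu - center) / width) ** 2)

# Hypothetical, strongly overlapping absorption spectra of two chemicals
spec_a = gaussian(nu, 2930, 40) + 0.4 * gaussian(nu, 3050, 25)
spec_b = gaussian(nu, 2960, 40) + 0.4 * gaussian(nu, 3050, 25)

# Two hypothetical broadband optical filters (transmission curves)
filt_1 = gaussian(nu, 2900, 60)
filt_2 = gaussian(nu, 3000, 60)

def response_vector(spectrum, filters):
    # Ensemble response: integral of filter transmission times absorption
    return np.array([np.trapz(f * spectrum, nu) for f in filters])

r_a = response_vector(spec_a, [filt_1, filt_2])
r_b = response_vector(spec_b, [filt_1, filt_2])

# Discriminate by the angle between the normalized response vectors:
# the farther the cosine similarity falls below 1, the easier the
# two chemicals are to tell apart with this filter pair
cos_sim = np.dot(r_a, r_b) / (np.linalg.norm(r_a) * np.linalg.norm(r_b))
print(f"cosine similarity between response vectors: {cos_sim:.4f}")
```

    Scanning filter centers and bandwidths to minimize this similarity is the kind of systematic filter-set study the abstract describes.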

  2. Recent advances in analysis and prediction of Rock Falls, Rock Slides, and Rock Avalanches using 3D point clouds

    NASA Astrophysics Data System (ADS)

    Abellan, A.; Carrea, D.; Jaboyedoff, M.; Riquelme, A.; Tomas, R.; Royan, M. J.; Vilaplana, J. M.; Gauvin, N.

    2014-12-01

The acquisition of dense terrain information using well-established 3D techniques (e.g. LiDAR, photogrammetry) and new mobile platforms (e.g. Unmanned Aerial Vehicles), together with increasingly efficient post-processing workflows for image treatment (e.g. Structure from Motion), is opening up new possibilities for analysing, modeling, and predicting rock slope failures. Applications span a range of scales, from the monitoring of small changes at an unprecedented level of detail (e.g. sub-millimeter-scale deformation under laboratory conditions) to the detection of slope deformation at regional scale. In this communication we show the main accomplishments of the Swiss National Science Foundation project "Characterizing and analysing 3D temporal slope evolution", carried out by the Risk Analysis group (Univ. of Lausanne) in close collaboration with the RISKNAT and INTERES groups (Univ. of Barcelona and Univ. of Alicante, respectively). We have recently developed a series of innovative approaches for rock slope analysis using 3D point clouds, including semi-automatic methodologies for the identification and extraction of rock-slope features such as discontinuities, material type, rockfall occurrence, and deformation. Moreover, we have improved our understanding of progressive rupture characterization through several algorithms: the computation of 3D deformation, the use of filtering techniques on permanently installed terrestrial laser scanners (TLS), the use of rock slope failure analogies at different scales (laboratory simulations, monitoring at glacier fronts, etc.), and the modelling of the influence of external forces, such as precipitation, on the acceleration of the deformation rate. We have also investigated rock slope deformation prior to the occurrence of fragmental rockfalls and the interaction of this deformation with the spatial location of future events.
In spite of these recent advances, a great challenge still remains: the development of new algorithms for more accurate 3D point cloud treatment (e.g. filtering, segmentation) aimed at improving rock slope characterization and monitoring. A series of exciting research findings is expected in the forthcoming years.
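    A core building block of the point-cloud monitoring described above is cloud-to-cloud comparison between successive scans. A minimal sketch with a synthetic planar slope and an artificially displaced block; all geometry, point counts, and thresholds are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "before" scan: points sampled on a planar slope patch (z = 0.5 x)
n = 800
xy = rng.uniform(0, 10, size=(n, 2))
before = np.column_stack([xy, 0.5 * xy[:, 0]])

# "After" scan: same slope, with a localized block displaced 0.3 m vertically
after = before.copy()
block = (before[:, 0] > 4) & (before[:, 0] < 6) & (before[:, 1] > 4) & (before[:, 1] < 6)
after[block, 2] += 0.3

# Cloud-to-cloud comparison: distance from each "after" point to its
# nearest "before" point flags the deforming area (brute force here;
# a k-d tree would be used for real scans)
d2 = ((after[:, None, :] - before[None, :, :]) ** 2).sum(axis=-1)
nn_dist = np.sqrt(d2.min(axis=1))
moved = nn_dist > 0.1  # detection threshold in meters

print(f"flagged {moved.sum()} points; truly displaced {block.sum()}")
```

    Real workflows refine this with local surface modeling (e.g. M3C2-style distances) to separate true deformation from roughness and registration error.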

  3. The Development of a Microbial Challenge Test with Acholeplasma laidlawii To Rate Mycoplasma-Retentive Filters by Filter Manufacturers.

    PubMed

    Folmsbee, Martha; Lentine, Kerry Roche; Wright, Christine; Haake, Gerhard; Mcburnie, Leesa; Ashtekar, Dilip; Beck, Brian; Hutchison, Nick; Okhio-Seaman, Laura; Potts, Barbara; Pawar, Vinayak; Windsor, Helena

    2014-01-01

Mycoplasma are bacteria that can penetrate 0.2 and 0.22 μm rated sterilizing-grade filters and even some 0.1 μm rated filters. Primary applications for mycoplasma filtration include large-scale mammalian and bacterial cell culture media and serum filtration. The Parenteral Drug Association recognized the absence of standard industry test parameters for testing and classifying 0.1 μm rated filters for mycoplasma clearance and formed a task force to formulate consensus test parameters. The task force established some test parameters by common agreement, based upon general industry practices, without the need for additional testing. However, the culture medium and incubation conditions for generating test mycoplasma cells varied from filter company to filter company; the task force recognized this as a serious gap. Standardization of the culture medium and incubation conditions required collaborative testing in both commercial filter company laboratories and in an independent laboratory (Table I). The use of consensus test parameters will facilitate the ultimate cross-industry goal of standardization of 0.1 μm filter claims for mycoplasma clearance. However, it is still important to recognize that filter performance will depend on the actual conditions of use. Therefore end users should consider, using a risk-based approach, whether process-specific evaluation of filter performance may be warranted for their application. Mycoplasma are small bacteria that have the ability to penetrate sterilizing-grade filters. Filtration of large-scale mammalian and bacterial cell culture media is an example of an industry process where effective filtration of mycoplasma is required. The Parenteral Drug Association recognized the absence of industry standard test parameters for evaluating mycoplasma clearance filters by filter manufacturers and formed a task force to formulate such a consensus among manufacturers.
The use of standardized test parameters by filter manufacturers, including the preparation of the culture broth, will facilitate the end user's evaluation of the mycoplasma clearance claims provided by filter vendors. However, it is still important to recognize that filter performance will depend on the actual conditions of use; therefore end users should consider, using a risk-based approach, whether process-specific evaluation of filter performance may be warranted for their application. © PDA, Inc. 2014.

  4. The design and implementation of radar clutter modelling and adaptive target detection techniques

    NASA Astrophysics Data System (ADS)

    Ali, Mohammed Hussain

The analysis and reduction of radar clutter is investigated. Clutter is the term applied to unwanted radar reflections from land, sea, precipitation, and/or man-made objects. A great deal of useful information regarding the characteristics of clutter can be obtained by the application of frequency domain analytical methods. Thus, considerable time was spent assessing the various techniques available and their possible application to radar clutter. In order to better understand clutter, use of a clutter model was considered desirable. There are many techniques which will enable a target to be detected in the presence of clutter. One of the most flexible of these is adaptive filtering. This technique was thoroughly investigated and a method for improving its efficacy was devised. The modified adaptive filter employed differential adaptation times to enhance detectability. Adaptation time as a factor relating to target detectability is a new concept and was investigated in some detail. It was considered desirable to implement the theoretical work in dedicated hardware to confirm that the modified clutter model and the adaptive filter technique actually performed as predicted. The equipment produced is capable of operation in real time and provides an insight into real-time DSP applications. This equipment is sufficiently rapid to produce a real-time display on the actual PPI system. Finally, a software package was also produced which would simulate the operation of a PPI display and thus ease the interpretation of the filter outputs.
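    The adaptive-filtering idea, predicting the correlated clutter and detecting the target in the prediction error, can be illustrated with a standard LMS predictor. The signal model, filter order, and step size below are arbitrary demonstration choices, not the thesis's actual design:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated radar return: slowly varying (correlated) clutter plus a brief target echo
n = 4000
t = np.arange(n)
clutter = np.sin(2 * np.pi * 0.01 * t) + 0.3 * rng.standard_normal(n)
target = np.zeros(n)
target[2000:2020] = 2.0  # short target echo
signal = clutter + target

# LMS adaptive filter: predicts the correlated clutter from past samples;
# the prediction error ("whitened" output) enhances the uncorrelated target
order, mu = 8, 0.01
w = np.zeros(order)
error = np.zeros(n)
for i in range(order, n):
    x = signal[i - order:i][::-1]     # most recent samples first
    y = w @ x                         # clutter prediction
    e = signal[i] - y                 # prediction error
    w += 2 * mu * e * x               # LMS weight update
    error[i] = e

# The target region stands out in the error signal
print("peak |error| in target window:", np.abs(error[2000:2020]).max().round(2))
```

    Varying the effective adaptation time (here set by `mu`) trades clutter suppression against how quickly the filter "absorbs" the target itself, which is the trade-off the thesis exploits.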

  5. Passive Ranging Using a Dispersive Spectrometer and Optical Filters

    DTIC Science & Technology

    2012-12-20

transform spectrometers. These instruments are very sensitive to vibration, however, making them difficult to use on an air or space-borne platform. This... techniques will scale to longer ranges. An instrument using filters is predicted to be more accurate at long ranges, but only if the grating...done by Leonpacher at AFIT. This research focused on the CO2 absorption feature at 4.3 µm. His technique compared the relative intensity between two

  6. Lunar surface chemistry: A new imaging technique

    USGS Publications Warehouse

    Andre, C.G.; Bielefeld, M.J.; Eliason, E.; Soderblom, L.A.; Adler, I.; Philpotts, J.A.

    1977-01-01

    Detailed chemical maps of the lunar surface have been constructed by applying a new weighted-filter imaging technique to Apollo 15 and Apollo 16 x-ray fluorescence data. The data quality improvement is amply demonstrated by (i) modes in the frequency distribution, representing highland and mare soil suites, which are not evident before data filtering and (ii) numerous examples of chemical variations which are correlated with small-scale (about 15 kilometer) lunar topographic features.

  7. Lunar surface chemistry - A new imaging technique

    NASA Technical Reports Server (NTRS)

    Andre, C. G.; Adler, I.; Bielefeld, M. J.; Eliason, E.; Soderblom, L. A.; Philpotts, J. A.

    1977-01-01

    Detailed chemical maps of the lunar surface have been constructed by applying a new weighted-filter imaging technique to Apollo 15 and Apollo 16 X-ray fluorescence data. The data quality improvement is amply demonstrated by (1) modes in the frequency distribution, representing highland and mare soil suites, which are not evident before data filtering, and (2) numerous examples of chemical variations which are correlated with small-scale (about 15 kilometer) lunar topographic features.

  8. High-Speed Imaging Optical Pyrometry for Study of Boron Nitride Nanotube Generation

    NASA Technical Reports Server (NTRS)

    Inman, Jennifer A.; Danehy, Paul M.; Jones, Stephen B.; Lee, Joseph W.

    2014-01-01

    A high-speed imaging optical pyrometry system is designed for making in-situ measurements of boron temperature during the boron nitride nanotube synthesis process. Spectrometer measurements show molten boron emission to be essentially graybody in nature, lacking spectral emission fine structure over the visible range of the electromagnetic spectrum. Camera calibration experiments are performed and compared with theoretical calculations to quantitatively establish the relationship between observed signal intensity and temperature. The one-color pyrometry technique described herein involves measuring temperature based upon the absolute signal intensity observed through a narrowband spectral filter, while the two-color technique uses the ratio of the signals through two spectrally separated filters. The present study calibrated both the one- and two-color techniques at temperatures between 1,173 K and 1,591 K using a pco.dimax HD CMOS-based camera along with three such filters having transmission peaks near 550 nm, 632.8 nm, and 800 nm.
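    The two-color technique rests on the fact that, for a graybody, the ratio of signals through two narrowband filters depends only on temperature, so emissivity and geometric constants cancel. A sketch under the Wien approximation, using the 632.8 nm and 800 nm filter wavelengths mentioned above; equal system response at both wavelengths is assumed for simplicity:

```python
import numpy as np

c2 = 1.4388e-2  # second radiation constant, m*K

def wien_radiance(lam, T):
    # Wien approximation to graybody spectral radiance (arbitrary scale)
    return lam ** -5 * np.exp(-c2 / (lam * T))

lam1, lam2 = 632.8e-9, 800e-9  # filter center wavelengths (m)
T_true = 1400.0                # assumed surface temperature (K)

# Simulated narrowband signals; emissivity and optics constants would
# multiply both and cancel in the ratio for a graybody
s1 = wien_radiance(lam1, T_true)
s2 = wien_radiance(lam2, T_true)

# Invert the signal ratio for temperature (two-color technique):
# ln(s1/s2) = 5 ln(lam2/lam1) + (c2/T)(1/lam2 - 1/lam1)
ratio = s1 / s2
T_est = c2 * (1 / lam2 - 1 / lam1) / (np.log(ratio) - 5 * np.log(lam2 / lam1))
print(f"recovered temperature: {T_est:.1f} K")
```

    The one-color technique instead requires an absolute intensity calibration, which is what the camera calibration experiments in the paper provide.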

  9. Describing litho-constrained layout by a high-resolution model filter

    NASA Astrophysics Data System (ADS)

    Tsai, Min-Chun

    2008-05-01

A novel high-resolution model (HRM) filtering technique is proposed to describe litho-constrained layouts. Litho-constrained layouts are layouts that are difficult to pattern or are highly sensitive to process fluctuations under current lithography technologies. HRM applies a short-wavelength (or high-NA) model simulation directly to the pre-OPC original design layout to filter out low spatial-frequency regions and retain the high spatial-frequency components that are litho-constrained. Since neither OPC nor mask-synthesis steps are involved, this new technique is highly efficient in run time and can be used at the design stage to detect and fix litho-constrained patterns. The method successfully captured all hot-spots, with less than 15% overshoot, on a realistic 80 mm2 full-chip M1 layout at the 65 nm technology node. A step-by-step derivation of the HRM technique is presented in this paper.
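    The idea of filtering out low spatial-frequency regions and retaining the high-frequency (hard-to-print) components can be mimicked with a simple Fourier-domain sketch. The Gaussian "imaging model" and the toy layout below are stand-ins for a real lithography model, not the paper's HRM:

```python
import numpy as np

# Hypothetical binary "layout": one wide line and one sub-resolution line
layout = np.zeros((128, 128))
layout[20:40, :] = 1.0    # wide feature (easy to print)
layout[80:82, :] = 1.0    # narrow feature (litho-constrained)

# Band-limiting of the imaging system modeled as a Gaussian low-pass
# in the spatial-frequency domain (stand-in for a projection model)
fy = np.fft.fftfreq(128)[:, None]
fx = np.fft.fftfreq(128)[None, :]
lowpass = np.exp(-(fx ** 2 + fy ** 2) / (2 * 0.05 ** 2))

blurred = np.fft.ifft2(np.fft.fft2(layout) * lowpass).real

# High-frequency residual: large where the layout cannot be reproduced
residual = np.abs(layout - blurred)
print("residual at wide feature:  ", residual[30, 64].round(3))
print("residual at narrow feature:", residual[81, 64].round(3))
```

    Thresholding such a residual map is one way to flag hot-spot candidates directly on the design, without running OPC first.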

  10. Acousto-optical tunable filter for combined wideband, spectral, and optical coherence microscopy.

    PubMed

    Machikhin, Alexander S; Pozhar, Vitold E; Viskovatykh, Alexander V; Burmak, Ludmila I

    2015-09-01

A multimodal technique for inspection of microscopic objects by means of wideband optical microscopy, spectral microscopy, and optical coherence microscopy is described, implemented, and tested. The key feature is the spectral selection of light in the output arm of an interferometer using a specialized imaging acousto-optical tunable filter. In this filter, two interfering optical beams are diffracted by the same ultrasound wave without destroying the interference image structure. The basic requirements for the acousto-optical tunable filter are defined, and mathematical formulas for calculating its parameters are derived. A theoretical estimate of the achievable accuracy of the 3D image reconstruction is presented and experimental proofs are given. It is demonstrated that spectral imaging can also be accompanied by measurement of quantitative reflectance spectra. Examples of inspection of optically transparent and nontransparent samples demonstrate the applicability of the technique.

  11. Filtering techniques for efficient inversion of two-dimensional Nuclear Magnetic Resonance data

    NASA Astrophysics Data System (ADS)

    Bortolotti, V.; Brizi, L.; Fantazzini, P.; Landi, G.; Zama, F.

    2017-10-01

The inversion of two-dimensional Nuclear Magnetic Resonance (NMR) data requires the solution of a first-kind Fredholm integral equation with a two-dimensional tensor product kernel and lower bound constraints. For the solution of this ill-posed inverse problem, the recently presented 2DUPEN algorithm [V. Bortolotti et al., Inverse Problems, 33(1), 2016] uses multiparameter Tikhonov regularization with automatic choice of the regularization parameters. In this work, I2DUPEN, an improved version of 2DUPEN that implements Mean Windowing and Singular Value Decomposition filters, is tested in depth. The reconstruction problem with filtered data is formulated as a compressed weighted least squares problem with multiparameter Tikhonov regularization. Results on synthetic and real 2D NMR data are presented with the main purpose of analyzing in more depth the separate and combined effects of these filtering techniques on the reconstructed 2D distribution.
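    The role of SVD filtering combined with Tikhonov regularization can be shown on a one-dimensional toy analogue of the NMR inversion, using an exponential-decay kernel (severely ill-posed). The kernel, noise level, truncation threshold, and fixed regularization parameter below are illustrative only; UPEN-type methods choose the regularization automatically:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1D analogue of the NMR inversion: exponential-decay kernel (ill-posed)
t = np.linspace(0.01, 3, 100)[:, None]    # measurement times
T2 = np.linspace(0.05, 2, 80)[None, :]    # relaxation-time grid
K = np.exp(-t / T2)

# True distribution (a single broad peak at T2 = 0.8) and noisy data
f_true = np.exp(-0.5 * ((T2.ravel() - 0.8) / 0.15) ** 2)
data = K @ f_true + 1e-3 * rng.standard_normal(100)

# SVD filter: drop singular components buried below a noise-level threshold
U, s, Vt = np.linalg.svd(K, full_matrices=False)
keep = s > 1e-4 * s[0]
Uk, sk, Vtk = U[:, keep], s[keep], Vt[keep]

# Tikhonov regularization on the filtered problem: filter factors
# s/(s^2 + lam^2) damp the noise-dominated components smoothly
lam = 0.1
f_est = Vtk.T @ (sk / (sk ** 2 + lam ** 2) * (Uk.T @ data))

print("singular components kept:", int(keep.sum()), "of", len(s))
print("peak T2, true vs estimated:",
      T2.ravel()[f_true.argmax()].round(3), T2.ravel()[f_est.argmax()].round(3))
```

    The 2D problem has the same structure with a tensor-product kernel, which is why SVD filtering compresses it so effectively before the regularized solve.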

  12. Collaborative care: Using six thinking hats for decision making.

    PubMed

    Cioffi, Jane Marie

    2017-12-01

To apply the six thinking hats technique to decision making in collaborative care. In collaborative partnerships, effective communication needs to occur in meetings of patients, families, and health care professionals. The effectiveness of these meetings depends on the engagement of participants and the quality of the meeting process. The use of the six thinking hats technique to engage all participants in effective dialogue is proposed. Discussion paper. Electronic databases CINAHL, PubMed, and Science Direct were searched for the years 1990 to 2017. Using the six thinking hats technique in patient and family meetings, nurses can guide a process of dialogue that focuses decision making to build equal care partnerships inclusive of all participants. Nurses will need to develop the skills for using the six thinking hats technique and provide support to all participants during the meeting process. Collaborative decision making can be augmented by the six thinking hats technique to provide patients, families, and health professionals with opportunities to make informed decisions about care that consider key issues for all involved. Nurses, who are most often advocates for patients and their families, are in a unique position to lead this initiative in meetings as they network with all health professionals. © 2017 John Wiley & Sons Australia, Ltd.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marrinan, Thomas; Leigh, Jason; Renambot, Luc

Mixed presence collaboration involves remote collaboration between multiple collocated groups. This paper presents the design and results of a user study that focused on mixed presence collaboration using large-scale tiled display walls. The research was conducted in order to compare data synchronization schemes for multi-user visualization applications. Our study compared three techniques for sharing data between display spaces with varying constraints and affordances. The results provide empirical evidence that using data sharing techniques with continuous synchronization between the sites leads to improved collaboration for a search and analysis task between remotely located groups. We have also identified aspects of synchronized sessions that result in increased remote collaborator awareness and parallel task coordination. It is believed that this research will lead to better utilization of large-scale tiled display walls for distributed group work.

  14. Real time estimation of ship motions using Kalman filtering techniques

    NASA Technical Reports Server (NTRS)

    Triantafyllou, M. S.; Bodson, M.; Athans, M.

    1983-01-01

The estimation of the heave, pitch, roll, sway, and yaw motions of a DD-963 destroyer is studied, using Kalman filtering techniques, for application to VTOL aircraft landing. The governing equations are obtained from hydrodynamic considerations in the form of linear differential equations with frequency-dependent coefficients. In addition, nonminimum phase characteristics are obtained due to the spatial integration of the water wave forces. The resulting transfer matrix function is irrational and nonminimum phase. The conditions for a finite-dimensional approximation are considered and the impact of the various parameters is assessed. A detailed numerical application for a DD-963 destroyer is presented and simulations of the estimates obtained from Kalman filters are discussed.
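    As a minimal illustration of the filtering side of this problem, the sketch below runs a linear Kalman filter on a single motion channel modeled as a noisy damped oscillator. The model and noise levels are invented for the demo and are far simpler than the frequency-dependent ship dynamics treated in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Heave modeled as a lightly damped harmonic oscillator driven by wave noise
dt, w0, zeta = 0.1, 0.6, 0.05
A = np.array([[1, dt], [-w0**2 * dt, 1 - 2 * zeta * w0 * dt]])  # Euler discretization
H = np.array([[1.0, 0.0]])          # only position is measured
Q = np.diag([1e-5, 1e-3])           # process noise covariance
R = np.array([[0.04]])              # measurement noise (0.2 m std)

# Simulate true motion and noisy measurements
n = 500
x_true = np.zeros((n, 2)); x_true[0] = [1.0, 0.0]
for k in range(1, n):
    x_true[k] = A @ x_true[k-1] + rng.multivariate_normal([0, 0], Q)
z = x_true[:, 0] + 0.2 * rng.standard_normal(n)

# Kalman filter: predict, then update with each measurement
x = np.zeros(2); P = np.eye(2)
est = np.zeros(n)
for k in range(n):
    x = A @ x; P = A @ P @ A.T + Q               # predict
    S = H @ P @ H.T + R                          # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)               # Kalman gain
    x = x + (K @ (z[k] - H @ x)).ravel()         # update state
    P = (np.eye(2) - K @ H) @ P                  # update covariance
    est[k] = x[0]

rmse_raw = np.sqrt(np.mean((z - x_true[:, 0]) ** 2))
rmse_kf = np.sqrt(np.mean((est - x_true[:, 0]) ** 2))
print(f"RMSE raw {rmse_raw:.3f} m, filtered {rmse_kf:.3f} m")
```

    The paper's harder problem is obtaining a finite-dimensional state-space model like `A` in the first place, since the true ship transfer functions are irrational.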

  15. A Thick-Restart Lanczos Algorithm with Polynomial Filtering for Hermitian Eigenvalue Problems

    DOE PAGES

    Li, Ruipeng; Xi, Yuanzhe; Vecharynski, Eugene; ...

    2016-08-16

Polynomial filtering can provide a highly effective means of computing all eigenvalues of a real symmetric (or complex Hermitian) matrix that are located in a given interval, anywhere in the spectrum. This paper describes a technique for tackling this problem by combining a thick-restart version of the Lanczos algorithm with deflation ("locking") and a new type of polynomial filter obtained from a least-squares technique. Furthermore, the resulting algorithm can be utilized in a "spectrum-slicing" approach whereby a very large number of eigenvalues and associated eigenvectors of the matrix are computed by extracting eigenpairs located in different subintervals independently from one another.
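    The effect of a polynomial filter for an interior interval can be seen by expanding the indicator function of [a, b] in Chebyshev polynomials, with damping to tame the Gibbs oscillations. The paper's filters come from a least-squares construction instead, so this is only a stand-in illustrating the principle:

```python
import numpy as np

# Chebyshev expansion of the indicator function of [a, b] within [-1, 1]
# (the spectrum is assumed mapped into [-1, 1]); Lanczos sigma factors
# damp the Gibbs oscillations of the truncated series
a, b = 0.4, 0.6
m = 100                      # polynomial degree
ta, tb = np.arccos(a), np.arccos(b)
k = np.arange(1, m + 1)
coef = np.concatenate([[(ta - tb) / np.pi],
                       2 / (np.pi * k) * (np.sin(k * ta) - np.sin(k * tb))])
coef *= np.sinc(np.arange(m + 1) / (m + 1))   # damping factors

def filter_poly(t, coef):
    # Evaluate p(t) = sum_k coef[k] T_k(t) via the Chebyshev recurrence;
    # applied to a matrix, the same recurrence acts on A @ V instead of t * T
    T_prev, T_curr = np.ones_like(t), t
    y = coef[0] * T_prev + coef[1] * T_curr
    for c in coef[2:]:
        T_prev, T_curr = T_curr, 2 * t * T_curr - T_prev
        y += c * T_curr
    return y

vals = filter_poly(np.array([0.5, 0.0, 0.9]), coef)
print("p inside [a,b]:", vals[0].round(3),
      "  p outside:", vals[1].round(3), vals[2].round(3))
```

    Because p is near 1 on [a, b] and near 0 elsewhere, applying p(A) to a block of vectors amplifies exactly the eigencomponents in the slice, which is what makes Lanczos on the filtered operator converge to the wanted eigenpairs.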

  16. Hyper-filter-fluorescer spectrometer for x-rays above 120 keV

    DOEpatents

    Wang, Ching L.

    1983-01-01

An apparatus utilizing filter-fluorescer combinations is provided to measure short bursts of high-fluence x-rays above 120 keV energy, where there are no practical absorption edges available for conventional filter-fluorescer techniques. The absorption edge of the prefilter is chosen to be less than that of the fluorescer, i.e., E_PRF < E_F. In this way, the response function is virtually zero between E_PRF and E_F and well defined and enhanced in an energy band of less than 1000 keV above the 120 keV energy.

  17. Picosecond and sub-picosecond flat-top pulse generation using uniform long-period fiber gratings

    NASA Astrophysics Data System (ADS)

    Park, Y.; Kulishov, M.; Slavík, R.; Azaña, J.

    2006-12-01

    We propose a novel linear filtering scheme based on ultrafast all-optical differentiation for re-shaping of ultrashort pulses generated from a mode-locked laser into flat-top pulses. The technique is demonstrated using simple all-fiber optical filters, more specifically uniform long period fiber gratings (LPGs) operated in transmission. The large bandwidth typical for these fiber filters allows scaling the technique to the sub-picosecond regime. In the experiments reported here, 600-fs and 1.8-ps Gaussian-like optical pulses (@ 1535 nm) have been re-shaped into 1-ps and 3.2-ps flat-top pulses, respectively, using a single 9-cm long uniform LPG.

  18. Three Is a Crowd? No Way--Three Is a Team! Collaborative Consultation Techniques for Educators.

    ERIC Educational Resources Information Center

    Wilber, Mary M. Jensen

    This paper presents specific strategies to assist collaborative consultation efforts by educators of students with disabilities. First, a definition of collaborative consultation is offered and advantages of this approach identified. Next, essential skills and strategies to gain acceptance and establish credibility in collaborative situations are…

  19. Adaptive texture filtering for defect inspection in ultrasound images

    NASA Astrophysics Data System (ADS)

    Zmola, Carl; Segal, Andrew C.; Lovewell, Brian; Nash, Charles

    1993-05-01

The use of ultrasonic imaging to analyze defects and characterize materials is critical in the development of non-destructive testing and non-destructive evaluation (NDT/NDE) tools for manufacturing. To develop better quality control and reliability in the manufacturing environment, advanced image processing techniques are useful. For example, through the use of texture filtering on ultrasound images, we have been able to filter characteristic textures from highly textured C-scan images of materials. The materials have highly regular characteristic textures that are of the same resolution and dynamic range as other important features within the image. By applying texture filters and adaptively modifying their filter response, we have examined a family of filters for removing these textures.

  20. LANDSAT-4 and LANDSAT-5 Multispectral Scanner Coherent Noise Characterization and Removal

    NASA Technical Reports Server (NTRS)

    Tilton, James C.; Alford, William L.

    1988-01-01

A technique is described for characterizing the coherent noise found in LANDSAT-4 and LANDSAT-5 MSS data, along with a companion technique for filtering out the coherent noise. The techniques are demonstrated on LANDSAT-4 and LANDSAT-5 MSS data sets, and explanations of the noise pattern are suggested in Appendix C. A cookbook procedure for characterizing and filtering the coherent noise using special NASA/Goddard IDIMS functions is included. Also presented are analysis results from the retrofitted LANDSAT-5 MSS sensor, which show that the coherent noise has been substantially reduced.
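    The characterize-then-remove workflow for coherent (single-frequency) noise can be sketched in one dimension: locate the noise spike in the spectrum, notch it out, and invert. The scene, noise frequency, and thresholds below are synthetic, not MSS values:

```python
import numpy as np

# Synthetic scan-line data: smooth scene plus single-frequency coherent noise
n = 512
x = np.arange(n)
scene = np.exp(-0.5 * ((x - 256) / 80.0) ** 2)     # smooth feature
noise_freq = 60 / n                                 # cycles per sample
noisy = scene + 0.2 * np.sin(2 * np.pi * noise_freq * x)

# Characterize: locate the coherent-noise spike in the amplitude spectrum
spec = np.fft.rfft(noisy)
spike = int(np.argmax(np.abs(spec[10:])) + 10)  # skip low-frequency scene content

# Remove: notch out the spike (and a small neighborhood) and invert
spec_filtered = spec.copy()
spec_filtered[spike - 1:spike + 2] = 0
cleaned = np.fft.irfft(spec_filtered, n)

err_before = np.abs(noisy - scene).max()
err_after = np.abs(cleaned - scene).max()
print(f"max deviation from scene: before {err_before:.3f}, after {err_after:.3f}")
```

    For imagery the same idea applies in 2D, with the notch placed at the coherent-noise frequencies identified during characterization.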

  1. An image filtering technique for SPIDER visible tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fonnesu, N., E-mail: nicola.fonnesu@igi.cnr.it; Agostini, M.; Brombin, M.

    2014-02-15

    The tomographic diagnostic developed for the beam generated in the SPIDER facility (100 keV, 50 A prototype negative ion source of ITER neutral beam injector) will characterize the two-dimensional particle density distribution of the beam. The simulations described in the paper show that instrumental noise has a large influence on the maximum achievable resolution of the diagnostic. To reduce its impact on beam pattern reconstruction, a filtering technique has been adapted and implemented in the tomography code. This technique is applied to the simulated tomographic reconstruction of the SPIDER beam, and the main results are reported.

  2. Generalization of the Lyot filter and its application to snapshot spectral imaging.

    PubMed

    Gorman, Alistair; Fletcher-Holmes, David William; Harvey, Andrew Robert

    2010-03-15

    A snapshot multi-spectral imaging technique is described which employs multiple cascaded birefringent interferometers to simultaneously spectrally filter and demultiplex multiple spectral images onto a single detector array. Spectral images are recorded directly without the need for inversion and without rejection of light and so the technique offers the potential for high signal-to-noise ratio. An example of an eight-band multi-spectral movie sequence is presented; we believe this is the first such demonstration of a technique able to record multi-spectral movie sequences without the need for computer reconstruction.

  3. Rapid Method for Sodium Hydroxide/Sodium Peroxide Fusion ...

    EPA Pesticide Factsheets

    Technical Fact Sheet Analysis Purpose: Qualitative analysis Technique: Alpha spectrometry Method Developed for: Plutonium-238 and plutonium-239 in water and air filters Method Selected for: SAM lists this method as a pre-treatment technique supporting analysis of refractory radioisotopic forms of plutonium in drinking water and air filters using the following qualitative techniques: • Rapid methods for acid or fusion digestion • Rapid Radiochemical Method for Plutonium-238 and Plutonium 239/240 in Building Materials for Environmental Remediation Following Radiological Incidents. Summary of subject analytical method which will be posted to the SAM website to allow access to the method.

  4. Partial information decomposition as a spatiotemporal filter.

    PubMed

    Flecker, Benjamin; Alford, Wesley; Beggs, John M; Williams, Paul L; Beer, Randall D

    2011-09-01

    Understanding the mechanisms of distributed computation in cellular automata requires techniques for characterizing the emergent structures that underlie information processing in such systems. Recently, techniques from information theory have been brought to bear on this problem. Building on this work, we utilize the new technique of partial information decomposition to show that previous information-theoretic measures can confound distinct sources of information. We then propose a new set of filters and demonstrate that they more cleanly separate out the background domains, particles, and collisions that are typically associated with information storage, transfer, and modification in cellular automata.

  5. Laboratory and airborne techniques for measuring fluorescence of natural surfaces

    NASA Technical Reports Server (NTRS)

    Stoertz, G. E.; Hemphill, W. R.

    1972-01-01

Techniques are described for obtaining fluorescence spectra from samples of natural surfaces that can be used to predict spectral regions in which these surfaces would emit solar-stimulated or laser-stimulated fluorescence detectable by a remote sensor. Scattered or reflected stray light caused large errors in spectrofluorometer analysis of natural sample surfaces. Most spurious light components can be eliminated by recording successive fluorescence spectra for each sample, using identical instrument settings, first with an appropriate glass or gelatin filter on the excitation side of the sample, and subsequently with the same filter on the emission side of the sample. This technique appears more accurate than any alternative technique for testing the fluorescence of natural surfaces.

  6. Comparison study on disturbance estimation techniques in precise slow motion control

    NASA Astrophysics Data System (ADS)

    Fan, S.; Nagamune, R.; Altintas, Y.; Fan, D.; Zhang, Z.

    2010-08-01

Precise low-speed motion control is important for industrial applications of both micro-milling machine tool feed drives and electro-optical tracking servo systems. It calls for precise measurement of position and instantaneous velocity, and for estimation of disturbances, which include direct-drive motor force ripple, guideway friction, cutting force, etc. This paper presents a comparison study of the dynamic response and noise rejection performance of three existing disturbance estimation techniques: time-delayed estimators, state-augmented Kalman filters, and conventional disturbance observers. The design essentials of these three disturbance estimators are introduced. For designing time-delayed estimators, it is proposed to substitute a Kalman filter for the Luenberger state observer to improve noise suppression performance. The results show that the noise rejection performance of the state-augmented Kalman filters and the time-delayed estimators is much better than that of the conventional disturbance observers. These two estimators give not only an estimate of the disturbance but also low-noise estimates of position and instantaneous velocity. The bandwidth of the state-augmented Kalman filters is wider than that of the time-delayed estimators. In addition, the state-augmented Kalman filters can give unbiased estimates of the slowly varying disturbance and the instantaneous velocity, while the time-delayed estimators cannot. Simulation and experimental results from the X axis of a 2.5-axis prototype micro-milling machine are provided.
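    The state-augmented Kalman filter discussed above treats the unknown disturbance as an extra state with random-walk dynamics. A sketch on a hypothetical one-axis drive; all parameters are invented for illustration, and real feed-drive models are more detailed:

```python
import numpy as np

rng = np.random.default_rng(6)

# 1-DOF drive: position/velocity dynamics, with an unknown slowly varying
# disturbance force d appended as a third (random-walk) state
dt, mass = 0.001, 2.0
A = np.array([[1, dt, 0],
              [0, 1, dt / mass],   # disturbance enters as a force on velocity
              [0, 0, 1]])          # random-walk model for the disturbance
H = np.array([[1.0, 0, 0]])        # only position is measured
Q = np.diag([0, 1e-8, 1e-4])       # process noise (disturbance walks slowly)
R = np.array([[1e-8]])             # position noise (0.1 mm std)

# Simulate: disturbance force steps from 0 to 0.5 N partway through
n = 5000
x_true = np.zeros(3)
z = np.zeros(n)
for k in range(n):
    x_true[2] = 0.5 if k >= 2000 else 0.0
    x_true = A @ x_true
    z[k] = x_true[0] + 1e-4 * rng.standard_normal()

# State-augmented Kalman filter: the third state estimate tracks d
x = np.zeros(3); P = np.eye(3)
d_est = np.zeros(n)
for k in range(n):
    x = A @ x; P = A @ P @ A.T + Q
    S = H @ P @ H.T + R
    K = (P @ H.T) / S
    x = x + (K * (z[k] - H @ x)).ravel()
    P = (np.eye(3) - K @ H) @ P
    d_est[k] = x[2]

print(f"disturbance estimate near end: {d_est[-500:].mean():.3f} N (true 0.5 N)")
```

    The random-walk disturbance model is what gives the unbiased steady-state estimate the abstract attributes to the state-augmented Kalman filter.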

  7. The Behavior of Filters and Smoothers for Strongly Nonlinear Dynamics

    NASA Technical Reports Server (NTRS)

    Zhu, Yanqiu; Cohn, Stephen E.; Todling, Ricardo

    1999-01-01

The Kalman filter is the optimal filter in the presence of known Gaussian error statistics and linear dynamics. Extending the filter to nonlinear dynamics is nontrivial in the sense of appropriately representing high-order moments of the statistics. Monte Carlo (ensemble-based) methods have been advocated as the methodology for representing high-order moments without any questionable closure assumptions (e.g., Miller 1994). Investigation along these lines has been conducted for highly idealized dynamics such as the strongly nonlinear Lorenz (1963) model as well as more realistic models of the oceans (Evensen and van Leeuwen 1996) and atmosphere (Houtekamer and Mitchell 1998). A few relevant issues in this context concern the number of ensemble members necessary to properly represent the error statistics and the modifications of the usual filter equations necessary to allow for correct update of the ensemble members (Burgers 1998). The ensemble technique has also been applied to the problem of smoothing, for which similar questions apply. Ensemble smoother examples, however, are quite puzzling in that the state estimates are worse than those of their filter analogues (Evensen 1997). In this study, we use concepts from probability theory to revisit the ensemble methodology for filtering and smoothing in data assimilation. We use the Lorenz (1963) model to test and compare the behavior of a variety of implementations of ensemble filters. We also implement ensemble smoothers that are able to perform better than their filter counterparts. A discussion of the feasibility of applying these techniques to large data assimilation problems will be given at the time of the conference.
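    The ensemble filter with perturbed observations (Burgers 1998) mentioned above can be sketched on the Lorenz (1963) model. Ensemble size, observation interval, and noise levels below are arbitrary demo choices, and a simple forward-Euler integrator stands in for a proper scheme:

```python
import numpy as np

rng = np.random.default_rng(7)

def lorenz_step(x, dt=0.01, s=10.0, r=28.0, b=8/3):
    # One forward-Euler step of the Lorenz (1963) model (adequate for a demo)
    dx = np.empty_like(x)
    dx[..., 0] = s * (x[..., 1] - x[..., 0])
    dx[..., 1] = x[..., 0] * (r - x[..., 2]) - x[..., 1]
    dx[..., 2] = x[..., 0] * x[..., 1] - b * x[..., 2]
    return x + dt * dx

# Truth run, spun up onto the attractor; only the first component is observed
x_truth = np.array([1.0, 1.0, 1.0])
for _ in range(1000):
    x_truth = lorenz_step(x_truth)

n_ens, obs_err = 50, 1.0
ens = x_truth + rng.standard_normal((n_ens, 3))   # initial ensemble
H = np.array([[1.0, 0, 0]])

errs = []
for cycle in range(100):
    for _ in range(10):                            # forecast between observations
        x_truth = lorenz_step(x_truth)
        ens = lorenz_step(ens)
    y = x_truth[0] + obs_err * rng.standard_normal()
    # Stochastic EnKF analysis with perturbed observations (Burgers et al.)
    X = ens - ens.mean(axis=0)
    P = X.T @ X / (n_ens - 1)                      # sample covariance
    K = P @ H.T / (H @ P @ H.T + obs_err**2)      # Kalman gain (scalar obs)
    y_pert = y + obs_err * rng.standard_normal(n_ens)
    ens = ens + (y_pert[:, None] - ens[:, :1]) * K.T
    errs.append(np.abs(ens.mean(axis=0) - x_truth).mean())

print(f"mean analysis error over cycles: {np.mean(errs):.2f}")
```

    The perturbed observations are the "necessary modification" the abstract alludes to: without them the analysis ensemble spread is systematically too small.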

  8. Ross filter development for absolute measurement of Al line radiation on MST

    NASA Astrophysics Data System (ADS)

    Lauersdorf, N.; Reusch, L. M.; den Hartog, D. J.; Goetz, J. A.; Franz, P.; Vanmeter, P.

    2017-10-01

The MST has a two-color soft x-ray tomography (SXT) diagnostic that, using the double-filter technique, measures electron temperature (Te) from the slope of the soft x-ray (SXR) continuum. Because MST has an aluminum plasma-facing surface, bright Al line radiation occurs in the SXR spectrum. In past applications of the double-filter technique, these lines have been filtered out using thick Be filters (400 μm and 800 μm), restricting the measurement range to Te ≥ 1 keV because signal strength correlates positively with Te. Another way to deal with the line radiation is to include it explicitly in the SXR spectrum analysis from which Te is derived. A Ross filter set has been designed to measure this line radiation, enabling the absolute intensities of the aluminum lines to be quantified and incorporated into the analysis. The Ross filter will be used to measure Al+11 and Al+12 lines occurring between 1.59 and 2.04 keV. By using multiple detectors with filters of varying elemental composition, we create spectral bins in which the dominant transmission is the line radiation. Absolute measurement of Al line intensities will enable the use of thinner filters in the SXT diagnostic and accurate measurement of Te < 1 keV. This material is based upon work supported by the U.S. Department of Energy Office of Science, Office of Fusion Energy Sciences program under Award Numbers DE-FC02-05ER54814 and DE-SC0015474.

  9. District-level hospital trauma care audit filters: Delphi technique for defining context-appropriate indicators for quality improvement initiative evaluation in developing countries

    PubMed Central

    Stewart, Barclay T; Gyedu, Adam; Quansah, Robert; Addo, Wilfred Larbi; Afoko, Akis; Agbenorku, Pius; Amponsah-Manu, Forster; Ankomah, James; Appiah-Denkyira, Ebenezer; Baffoe, Peter; Debrah, Sam; Donkor, Peter; Dorvlo, Theodor; Japiong, Kennedy; Kushner, Adam L; Morna, Martin; Ofosu, Anthony; Oppong-Nketia, Victor; Tabiri, Stephen; Mock, Charles

    2015-01-01

    Introduction Prospective clinical audit of trauma care improves outcomes for the injured in high-income countries (HICs). However, equivalent, context-appropriate audit filters for use in low- and middle-income country (LMIC) district-level hospitals have not been well established. We aimed to develop context-appropriate trauma care audit filters for district-level hospitals in Ghana, as well as other LMICs more broadly. Methods Consensus on trauma care audit filters was built among twenty panelists using a Delphi technique with four anonymous, iterative surveys designed to elicit: i) trauma care processes to be measured; ii) important features of audit filters for the district-level hospital setting; and iii) potentially useful filters. Filters were ranked on a scale from 0–10 (10 being very useful). Consensus was measured with average percent majority opinion (APMO) cut-off rate. Target consensus was defined a priori as: a median rank of ≥9 for each filter and an APMO cut-off rate of ≥0.8. Results Panelists agreed on trauma care processes to target (e.g. triage, phases of trauma assessment, early referral if needed) and specific features of filters for district-level hospital use (e.g. simplicity, unassuming of resource capacity). APMO cut-off rate increased successively: Round 1 - 0.58; Round 2 - 0.66; Round 3 - 0.76; and Round 4 - 0.82. After Round 4, target consensus on 22 trauma care and referral-specific filters was reached. Example filters include: triage - vital signs are recorded within 15 minutes of arrival (must include breathing assessment, heart rate, blood pressure, oxygen saturation if available); circulation - a large bore IV was placed within 15 minutes of patient arrival; referral - if referral is activated, the referring clinician and receiving facility communicate by phone or radio prior to transfer. Conclusion This study proposes trauma care audit filters appropriate for LMIC district-level hospitals. 
Given the successes of similar filters in HICs and obstetric care filters in LMICs, the collection and reporting of prospective trauma care audit filters may be an important step toward improving care for the injured at district-level hospitals in LMICs. PMID:26492882

  10. SU-E-J-261: The Importance of Appropriate Image Preprocessing to Augment the Information of Radiomics Image Features

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, L; Fried, D; Fave, X

    Purpose: To investigate how different image preprocessing techniques, their parameters, and different boundary handling techniques can augment the information of radiomics image features and improve their differentiating capability. Methods: Twenty-seven NSCLC patients with a solid tumor volume and no visually obvious necrotic regions in the simulation CT images were identified. Fourteen of these patients had a necrotic region visible in their pre-treatment PET images (necrosis group), and thirteen had no visible necrotic region in the pre-treatment PET images (non-necrosis group). We investigated how image preprocessing can impact the ability of radiomics image features extracted from the CT to differentiate between the two groups. It is expected that the histogram in the necrosis group is more negatively skewed and that the uniformity in the necrosis group is lower. Therefore, we analyzed two first-order features, skewness and uniformity, on the image inside the GTV in the intensity range [−20 HU, 180 HU] under combinations of several image preprocessing techniques: (1) applying an isotropic Gaussian or anisotropic diffusion smoothing filter over a range of parameters (Gaussian smoothing: size=11, sigma=0:0.1:2.3; anisotropic smoothing: iteration=4, kappa=0:10:110); (2) applying a boundary-adapted Laplacian filter; and (3) applying an adaptive upper threshold for the intensity range. A 2-tailed t-test was used to evaluate the capability of the CT features to differentiate pre-treatment PET necrosis. Results: Without any preprocessing, no differences in either skewness or uniformity were observed between the two groups. After applying appropriate Gaussian filters (sigma>=1.3) or anisotropic filters (kappa>=60) with the adaptive upper threshold, skewness was significantly more negative in the necrosis group (p<0.05). By applying boundary-adapted Laplacian filtering after appropriate Gaussian filters (0.5<=sigma<=1.1) or anisotropic filters (20<=kappa<=50), uniformity was significantly lower in the necrosis group (p<0.05). Conclusion: Appropriate selection of image preprocessing techniques allows radiomics features to extract more useful information and thereby improve prediction models based on these features.
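
    The preprocessing chain described above (smoothing, intensity windowing, first-order skewness inside an ROI) can be illustrated with a toy sketch. The spherical phantom, its HU values, and the helper `roi_skewness` are all invented for illustration; only the window [−20 HU, 180 HU] and the sigma values mirror the abstract.

```python
import numpy as np
from scipy import ndimage, stats

rng = np.random.default_rng(1)

def roi_skewness(image, mask, sigma, lo=-20.0, hi=180.0):
    """Optionally Gaussian-smooth the volume, then return the first-order
    skewness of ROI voxels falling inside the [lo, hi] HU window."""
    smoothed = ndimage.gaussian_filter(image, sigma) if sigma > 0 else image
    vals = smoothed[mask]
    vals = vals[(vals >= lo) & (vals <= hi)]
    return float(stats.skew(vals))

# synthetic tumor: ~100 HU tissue with a low-density (necrotic-like) core
image = rng.normal(100.0, 10.0, (32, 32, 32))
zz, yy, xx = np.ogrid[:32, :32, :32]
r2 = (zz - 16) ** 2 + (yy - 16) ** 2 + (xx - 16) ** 2
image[r2 < 5 ** 2] -= 60.0          # core pulls the histogram tail left
mask = r2 < 12 ** 2                 # GTV-like spherical ROI

s_raw = roi_skewness(image, mask, sigma=0.0)
s_smooth = roi_skewness(image, mask, sigma=1.3)
```

With a low-density core inside the ROI, the intensity histogram is negatively skewed both before and after smoothing, which is the direction of the effect the abstract reports for the necrosis group.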

  11. Characterization of Mid-Infrared Single Mode Fibers as Modal Filters

    NASA Technical Reports Server (NTRS)

    Ksendzov, A.; Lay, O.; Martin, S.; Sanghera, J. S.; Busse, L. E.; Kim, W. H.; Pureza, P. C.; Nguyen, V. Q.; Aggarwal, I. D.

    2007-01-01

    We present a technique for measuring the modal filtering ability of single mode fibers. The ideal modal filter rejects all input field components that have no overlap with the fundamental mode of the filter and does not attenuate the fundamental mode. We define the quality of a nonideal modal filter Q(sub f) as the ratio of transmittance for the fundamental mode to the transmittance for an input field that has no overlap with the fundamental mode. We demonstrate the technique on a 20 cm long mid-infrared fiber that was produced by the U.S. Naval Research Laboratory. The filter quality Q(sub f) for this fiber at 10.5 micron wavelength is 1000 +/- 300. The absorption and scattering losses in the fundamental mode are approximately 8 dB/m. The total transmittance for the fundamental mode, including Fresnel reflections, is 0.428 +/- 0.002. The application of interest is the search for extrasolar Earthlike planets using nulling interferometry. It requires high rejection ratios to suppress the light of a bright star, so that the faint planet becomes visible. The use of modal filters increases the rejection ratio (or, equivalently, relaxes requirements on the wavefront quality) by reducing the sensitivity to small wavefront errors. We show theoretically that, exclusive of coupling losses, the use of a modal filter leads to the improvement of the rejection ratio in a two-beam interferometer by a factor of Q(sub f).

  12. Retrieval characteristics of the Bard Denali and Argon Option inferior vena cava filters.

    PubMed

    Dowell, Joshua D; Semaan, Dominic; Makary, Mina S; Ryu, John; Khayat, Mamdouh; Pan, Xueliang

    2017-11-01

    The purpose of this study was to compare the retrieval characteristics of the Option Elite (Argon Medical, Plano, Tex) and Denali (Bard, Tempe, Ariz) retrievable inferior vena cava filters (IVCFs), two filters that share a similar conical design. A single-center, retrospective study reviewed all Option and Denali IVCF removals during a 36-month period. Attempted retrievals were classified as advanced if the routine "snare and sheath" technique was initially unsuccessful despite multiple attempts or an alternative endovascular maneuver or access site was used. Patient and filter characteristics were documented. In our study, 63 Option and 45 Denali IVCFs were retrieved, with an average dwell time of 128.73 and 99.3 days, respectively. Significantly higher median fluoroscopy times were experienced in retrieving the Option filter compared with the Denali filter (12.18 vs 6.85 minutes; P = .046). Use of adjunctive techniques was also higher in comparing the Option filter with the Denali filter (19.0% vs 8.7%; P = .079). No significant difference was noted between these groups in regard to gender, age, or history of malignant disease. Option IVCF retrieval procedures required significantly longer retrieval fluoroscopy time compared with Denali IVCFs. Although procedure time was not analyzed in this study, as a surrogate, the increased fluoroscopy time may also have an impact on procedural direct costs and throughput. Copyright © 2017 Society for Vascular Surgery. Published by Elsevier Inc. All rights reserved.

  13. Filtration device for active effluents

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guerin, M.; Meunier, G.

    1994-12-31

    Among the various techniques for solid/liquid separation, filtration is currently utilized for treating radioactive effluents. After testing different equipment on various simulated effluents, the Valduc Center has decided to substitute a monoplate filter for a rotary precoated diatomite filter.

  14. Antarctic Atmospheric Infrasound.

    DTIC Science & Technology

    1981-11-30

    auroral infrasonic waves and the atmospheric test of a nuclear weapon in China were all recorded and analyzed in real-time by the new system as... Detection Enhancement by a Pure State Filter, 16 February 1981 The great success of the polarization filter technique with infrasonic data led to our... Project chronology) 2. Summary of data collected 3. Antarctic infrasonic signals 4. Noise suppression using data-adaptive polarization filters: appli

  15. Poisson filtering of laser ranging data

    NASA Technical Reports Server (NTRS)

    Ricklefs, Randall L.; Shelus, Peter J.

    1993-01-01

    The filtering of data in a high noise, low signal strength environment is a situation encountered routinely in lunar laser ranging (LLR) and, to a lesser extent, in artificial satellite laser ranging (SLR). The use of Poisson statistics as one of the tools for filtering LLR data is described first in a historical context. The more recent application of this statistical technique to noisy SLR data is also described.
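
    A minimal sketch of the idea, assuming a noise-only Poisson model for photon counts per range-residual bin: a bin is flagged as signal when its count is implausibly high under the noise model. The bin counts, noise rate, and threshold below are invented; the actual LLR/SLR procedures are more elaborate.

```python
import math

def poisson_signal_bins(counts, noise_rate, alpha=1e-3):
    """Return indices of histogram bins whose counts are implausibly high
    under a noise-only Poisson model with mean `noise_rate` per bin."""
    def upper_tail(mu, k):
        # P(N >= k) for N ~ Poisson(mu), via the complementary lower sum
        term, cdf = math.exp(-mu), 0.0
        for i in range(k):
            cdf += term
            term *= mu / (i + 1)
        return 1.0 - cdf
    return [i for i, c in enumerate(counts) if upper_tail(noise_rate, c) < alpha]

# flat noise background (~2 counts/bin) with a return-signal spike in bin 40
counts = [2] * 80
counts[40] = 15
signal_bins = poisson_signal_bins(counts, noise_rate=2.0)   # → [40]
```

Bins at the background level (upper-tail probability ≈ 0.59 for 2 counts at rate 2) survive the cut, while the 15-count spike has a tail probability far below alpha and is kept as candidate signal.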

  16. Social Interaction Strategies and Techniques for Today's Classrooms

    ERIC Educational Resources Information Center

    Saurino, Dan R.; Saurino, Penelope; Clemente, Robert

    2009-01-01

    An emerging research tool used in recent years to better understand and improve teacher thinking has been the use of collaboration and collaborative action research. In our study, we were interested in whether teachers could enhance the learning of their subjects through the use of teaching techniques and strategies involving social interaction.…

  17. Co-Story-ing: Collaborative Story Writing with Children Who Fear

    ERIC Educational Resources Information Center

    Pehrsson, Dale-Elizabeth

    2007-01-01

    This article offers a guide for using collaborative story writing (co-story-ing), an assessment technique as well as a therapeutic intervention for children who demonstrate fears, extreme shyness and difficulty in establishing relationships. Co-story-ing draws from Gardner's Mutual Story Telling Technique. Co-story-ing guides clients as they…

  18. Adding to Your Teaching Repertoire: Integrating Action Research into the Lesson Plans

    ERIC Educational Resources Information Center

    Basham, Matthew J.; Yankowy, Barbara

    2015-01-01

    As today's students become more technologically savvy, social, and collaborative using social media, there are new and innovative techniques educators can use in the classroom. For example, action research is a newer technique using collaborative group processes, drawing upon the experiences of the individuals to promote positive results. This…

  19. The polarised internal target for the PAX experiment

    NASA Astrophysics Data System (ADS)

    Ciullo, G.; Barion, L.; Barschel, C.; Grigoriev, K.; Lenisa, P.; Nass, A.; Sarkadi, J.; Statera, M.; Steffens, E.; Tagliente, G.

    2011-05-01

    The PAX (Polarized Antiproton eXperiment) collaboration aims to polarise antiproton beams stored in a ring by means of spin-filtering. The experimental setup is based on a polarised internal gas target, surrounded by a detection system for the measurement of spin observables. In this report, we present results from the commissioning of the PAX target (atomic beam source, openable cell, and polarimeter).

  20. An e-Learning Collaborative Filtering Approach to Suggest Problems to Solve in Programming Online Judges

    ERIC Educational Resources Information Center

    Toledo, Raciel Yera; Mota, Yailé Caballero

    2014-01-01

    The paper proposes a recommender system approach to cover the online judge domain. Online judges are e-learning tools that support the automatic evaluation of programming tasks done by individual users, and for this reason they are usually used for training students for programming contests and for supporting basic programming teaching. The…

  1. Study of slow sand filtration with backwash and the influence of the filter media on the filter recovery and cleaning.

    PubMed

    de Souza, Fernando Hymnô; Pizzolatti, Bruno Segalla; Schöntag, Juliana Marques; Sens, Maurício Luiz

    2016-01-01

    Slow sand filters are considered a great alternative for supplying drinking water in rural and/or isolated areas where raw water that is treatable with this technique is available. Some studies used backwashing as an alternative for cleaning the slow sand filter with the goal of applying the technology in small communities, since filters that supply water to a small number of people do not require much space. In this study, the influence of the effective diameter of the filter medium on filtered water quality and on the cleaning system was evaluated. A pilot system with six filters was built: three filters were conventionally cleaned by scraping and the other three were cleaned by backwashing, each with a different effective diameter of filter medium. Most filters had an average turbidity of less than 1.0 NTU, the turbidity required at the output of the filters by the Brazilian Ministry of Health Ordinance. In the study, filters cleaned by scraping with smaller-diameter filter beds produced better-quality filtered water but had lower effective production; the opposite occurred for the backwashed filters.

  2. Predicting online ratings based on the opinion spreading process

    NASA Astrophysics Data System (ADS)

    He, Xing-Sheng; Zhou, Ming-Yang; Zhuo, Zhao; Fu, Zhong-Qian; Liu, Jian-Guo

    2015-10-01

    Predicting users' online ratings is a challenging issue and has drawn much attention. In this paper, we present a rating prediction method by combining the user opinion spreading process with the collaborative filtering algorithm, where user similarity is defined by measuring the amount of opinion a user transfers to another based on the primitive user-item rating matrix. The proposed method could produce a more precise rating prediction for each unrated user-item pair. In addition, we introduce a tunable parameter λ to regulate the preferential diffusion relevant to the degree of both opinion sender and receiver. The numerical results for the MovieLens and Netflix data sets show that this algorithm has better accuracy than the standard user-based collaborative filtering algorithm using cosine and Pearson correlation, without increasing computational complexity. By tuning λ, our method could further boost the prediction accuracy when using Mean Absolute Error (MAE) and Root Mean Squared Error (RMSE) as measurements. In the optimal cases, the corresponding accuracies (MAE and RMSE) improve by 11.26% and 8.84% on MovieLens and by 13.49% and 10.52% on Netflix, compared to the item average method.
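
    The baseline this abstract compares against, user-based collaborative filtering with a similarity-weighted average, can be sketched as follows. The similarity matrix is the plug-in point: the paper's contribution is replacing cosine/Pearson similarity with one derived from an opinion-spreading process, which is not reproduced here. The helper names and the toy rating matrix are illustrative.

```python
import numpy as np

def cosine_sim(R):
    """User-user cosine similarity over the rating matrix (0 = unrated)."""
    norms = np.linalg.norm(R, axis=1, keepdims=True)
    norms[norms == 0] = 1.0
    U = R / norms
    return U @ U.T

def predict_rating(R, u, i, sim):
    """Predict user u's rating of item i as the similarity-weighted average
    of the ratings given to i by other users."""
    raters = np.where(R[:, i] > 0)[0]
    raters = raters[raters != u]
    w = sim[u, raters]
    if w.sum() == 0:
        return 0.0
    return float(w @ R[raters, i] / w.sum())

# toy 4-user x 4-item rating matrix (rows: users, cols: items, 0 = unrated)
R = np.array([[5, 3, 0, 1],
              [4, 0, 0, 1],
              [1, 1, 5, 4],
              [1, 0, 4, 4]], dtype=float)
S = cosine_sim(R)
pred = predict_rating(R, u=1, i=2, sim=S)   # falls between the raters' 4 and 5
```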

  3. Walking on a User Similarity Network towards Personalized Recommendations

    PubMed Central

    Gan, Mingxin

    2014-01-01

    Personalized recommender systems have been receiving more and more attention in addressing the serious problem of information overload accompanying the rapid evolution of the World Wide Web. Although traditional collaborative filtering approaches based on similarities between users have achieved remarkable success, it has been shown that the existence of popular objects may adversely influence the correct scoring of candidate objects, which leads to unreasonable recommendation results. Meanwhile, recent advances have demonstrated that approaches based on diffusion and random walk processes exhibit superior performance over collaborative filtering methods in both recommendation accuracy and diversity. Building on these results, we adopt three strategies (power-law adjustment, nearest neighbor, and threshold filtration) to adjust a user similarity network built from user similarity scores calculated on historical data, and then propose a random walk with restart model on the constructed network to achieve personalized recommendations. We perform cross-validation experiments on two real data sets (MovieLens and Netflix) and compare the performance of our method against existing state-of-the-art methods. Results show that our method outperforms existing methods in not only recommendation accuracy and diversity, but also retrieval performance. PMID:25489942
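
    The core of the proposal, a random walk with restart (RWR) on a user similarity network, can be sketched in a few lines. This is a generic RWR iteration on a toy graph; the paper's network-construction strategies (power-law adjustment, nearest neighbor, threshold filtration) and the restart probability are not reproduced, so all values here are illustrative.

```python
import numpy as np

def rwr_scores(W, seed, restart=0.15, tol=1e-10):
    """Random walk with restart on a weighted similarity network W:
    iterate p <- (1-c) T p + c e until convergence, where T is the
    column-stochastic transition matrix and e is the seed user's one-hot."""
    W = np.asarray(W, dtype=float)
    col = W.sum(axis=0)
    col[col == 0] = 1.0
    T = W / col                          # column-stochastic transitions
    e = np.eye(len(W))[seed]
    p = e.copy()
    while True:
        p_new = (1 - restart) * T @ p + restart * e
        if np.abs(p_new - p).sum() < tol:
            return p_new
        p = p_new

# tiny 4-user similarity network (symmetric, unweighted for simplicity)
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
p = rwr_scores(W, seed=0)                # stationary visiting probabilities
```

The resulting vector p ranks the other users by proximity to the seed user; recommendations are then drawn from the items favored by highly ranked users.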

  4. ITrace: An implicit trust inference method for trust-aware collaborative filtering

    NASA Astrophysics Data System (ADS)

    He, Xu; Liu, Bin; Chen, Kejia

    2018-04-01

    The growth of Internet commerce has stimulated the use of collaborative filtering (CF) algorithms as recommender systems. A CF algorithm recommends items of interest to the target user by leveraging the votes given by other similar users. In a standard CF framework, it is assumed that the credibility of every voting user is exactly the same with respect to the target user. This assumption is often not satisfied in practice and thus may lead to misleading recommendations in many applications. A natural countermeasure is to design a trust-aware CF (TaCF) algorithm, which can take account of the differences in the credibilities of the voting users when performing CF. To this end, this paper presents a trust inference approach, which can predict the implicit trust of the target user in every voting user from a sparse explicit trust matrix. Then an improved CF algorithm termed iTrace is proposed, which takes advantage of both the explicit and the predicted implicit trust to provide recommendations within the CF framework. An empirical evaluation on a public dataset demonstrates that the proposed algorithm provides a significant improvement in recommendation quality in terms of mean absolute error.

  5. Comparison of ion coupling strategies for a microengineered quadrupole mass filter.

    PubMed

    Wright, Steven; Syms, Richard R A; O'Prey, Shane; Hong, Guodong; Holmes, Andrew S

    2009-01-01

    The limitations of conventional machining and assembly techniques mean that designs for quadrupole mass analyzers with rod diameters of less than a millimeter cannot be mere scaled-down versions of larger instruments. We show how silicon planar processing techniques and microelectromechanical systems (MEMS) design concepts can be used to incorporate complex features into the construction of a miniature quadrupole mass filter chip that could not easily be achieved using other microengineering approaches. Three designs for the entrance and exit to the filter consistent with the chosen materials and techniques have been evaluated. The differences between these seemingly similar structures have a significant effect on the performance. Although one of the designs results in severe attenuation of transmission with increasing mass, the other two can be scanned to m/z = 400 without any corruption of the mass spectrum. At m/z = 219, the variation in the transmission of the three designs was found to be approximately four orders of magnitude. A maximum resolution of M/DeltaM = 87 at 10% peak height has been achieved at m/z = 219 with a filter operated at 6 MHz and constructed using rods measuring (508 +/- 5) microm in diameter.

  6. Sequential updating of multimodal hydrogeologic parameter fields using localization and clustering techniques

    NASA Astrophysics Data System (ADS)

    Sun, Alexander Y.; Morris, Alan P.; Mohanty, Sitakanta

    2009-07-01

    Estimated parameter distributions in groundwater models may contain significant uncertainties because of data insufficiency. Therefore, adaptive uncertainty reduction strategies are needed to continuously improve model accuracy by fusing new observations. In recent years, various ensemble Kalman filters have been introduced as viable tools for updating high-dimensional model parameters. However, their usefulness is largely limited by the inherent assumption of Gaussian error statistics. Hydraulic conductivity distributions in alluvial aquifers, for example, are usually non-Gaussian as a result of complex depositional and diagenetic processes. In this study, we combine an ensemble Kalman filter with grid-based localization and Gaussian mixture model (GMM) clustering techniques for updating high-dimensional, multimodal parameter distributions via dynamic data assimilation. We introduce innovative strategies (e.g., block updating and dimension reduction) to effectively reduce the computational costs associated with these modified ensemble Kalman filter schemes. The developed data assimilation schemes are demonstrated numerically for identifying the multimodal heterogeneous hydraulic conductivity distributions in a binary facies alluvial aquifer. Our results show that localization and GMM clustering are very promising techniques for assimilating high-dimensional, multimodal parameter distributions, and they outperform the corresponding global ensemble Kalman filter analysis scheme in all scenarios considered.

  7. Handling of uncertainty due to interference fringe in FT-NIR transmittance spectroscopy - Performance comparison of interference elimination techniques using glucose-water system

    NASA Astrophysics Data System (ADS)

    Beganović, Anel; Beć, Krzysztof B.; Henn, Raphael; Huck, Christian W.

    2018-05-01

    The applicability of two elimination techniques for interferences occurring in measurements with cells of short pathlength using Fourier transform near-infrared (FT-NIR) spectroscopy was evaluated. Due to the growing interest in the field of vibrational spectroscopy in aqueous biological fluids (e.g. glucose in blood), aqueous solutions of D-(+)-glucose were prepared and split into a calibration set and an independent validation set. All samples were measured with two FT-NIR spectrometers at various spectral resolutions. Moving average smoothing (MAS) and fast Fourier transform filter (FFT filter) were applied to the interference-affected FT-NIR spectra in order to eliminate the interference pattern. After data pre-treatment, partial least squares regression (PLSR) models using different NIR regions were constructed using untreated (interference-affected) spectra and spectra treated with MAS and FFT filter. The prediction of the independent validation set revealed information about the performance of the utilized interference elimination techniques, as well as the different NIR regions. The results showed that the combination band of water at approx. 5200 cm-1 is of great importance since its performance was superior to that of the so-called first overtone of water at approx. 6800 cm-1. Furthermore, this work demonstrated that MAS and FFT filter are fast and easy-to-use techniques for the elimination of interference fringes in FT-NIR transmittance spectroscopy.
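
    An FFT filter of the kind mentioned above can be sketched as a low-pass operation in the Fourier domain: the short-period fringe maps to a high-frequency component that is zeroed, leaving the broad band intact. The synthetic band, fringe frequency, and cutoff below are invented for illustration; in practice the cutoff would be chosen from the observed fringe period.

```python
import numpy as np

def fft_fringe_filter(y, cutoff_bin=50):
    """Suppress a short-period interference fringe by zeroing Fourier
    components above `cutoff_bin` and transforming back."""
    Y = np.fft.rfft(y)
    Y[cutoff_bin:] = 0.0
    return np.fft.irfft(Y, n=y.size)

x = np.arange(1024) / 1024.0
band = np.exp(-((x - 0.5) / 0.08) ** 2)        # broad absorption-like band
fringe = 0.2 * np.sin(2 * np.pi * 120 * x)     # fast etalon-like fringe
cleaned = fft_fringe_filter(band + fringe)     # fringe removed, band kept
```

Because the Gaussian band's spectrum decays far below the cutoff while the fringe sits well above it, the filtered trace recovers the band almost exactly.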

  8. Determination of polybrominated diphenyl ethers (PBDEs) in dust samples collected in air conditioning filters of different usage - method development.

    PubMed

    Śmiełowska, M; Zabiegała, B

    2018-06-19

    This study presents the results of studies aimed at the development of an analytical procedure for separation, identification, and determination of PBDE compounds in dust samples collected from automotive cabin air filters and samples collected from filters installed as part of the air purification system in academic facilities. Ultrasound-assisted dispersive solid phase extraction (UA-dSPE) was found to perform better in terms of extract purification than the conventional SPE technique. GC-EIMS was used for final determination of analytes. The concentrations of PBDEs in car filters ranged from < LOD to 688 ng/g while from < LOD to 247 ng/g in dust from air conditioning filters. BDE-47 and BDE-100 were the dominant congeners. The estimated exposure to PBDEs via ingestion of dust from car filters varied from 0.00022 to 0.012 ng/day in toddlers and from 0.000036 to 0.0029 ng/day in adults; dust from air conditioning filters: from 0.017 to 0.25 ng/day in toddlers and from 0.0029 to 0.042 ng/day in adults. In addition, an attempt was made at extracting PBDEs from dust samples using the matrix solid-phase dispersion (MSPD) technique as a promising alternative to conventional SPE separations. Copyright © 2018 Elsevier B.V. All rights reserved.

  9. New Predictive Filters for Compensating the Transport Delay on a Flight Simulator

    NASA Technical Reports Server (NTRS)

    Guo, Liwen; Cardullo, Frank M.; Houck, Jacob A.; Kelly, Lon C.; Wolters, Thomas E.

    2004-01-01

    The problems of transport delay in a flight simulator, such as its sources and effects, are reviewed. Their effects on a pilot-in-the-loop control system are then investigated with simulations. Three prominent current delay compensators, the lead/lag filter, the McFarland filter, and the Sobiski/Cardullo filter, were analyzed and compared. This paper introduces two novel delay compensation techniques: an adaptive predictor using the Kalman estimator, and a state space predictive filter using a reference aerodynamic model. Applications of these two new compensators to recorded data from the NASA Langley Research Center Visual Motion Simulator show that they achieve better compensation than the current ones.
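
    The basic idea behind all of these compensators, predicting the signal one transport delay ahead, can be shown with a first-order lead predictor, x(t + d) ≈ x(t) + d·ẋ(t). This is a generic sketch of the principle only, not the McFarland or Sobiski/Cardullo filters; the signal, sample rate, and delay are invented.

```python
import numpy as np

def lead_predict(signal, dt, delay):
    """First-order predictive compensation of a transport delay: estimate
    x(t + delay) from x(t) and a finite-difference rate estimate."""
    rate = np.gradient(signal, dt)
    return signal + delay * rate

dt, delay = 0.01, 0.05
t = np.arange(0.0, 2.0, dt)
x = np.sin(2 * np.pi * 0.5 * t)              # slow pilot-like input
predicted = lead_predict(x, dt, delay)
target = np.sin(2 * np.pi * 0.5 * (t + delay))   # what arrives after the delay
err_comp = np.max(np.abs(predicted - target)[5:-5])  # trim gradient edges
err_none = np.max(np.abs(x - target)[5:-5])          # no compensation
```

For a slow input the prediction error is second order in the delay, so the compensated error is roughly an order of magnitude below the uncompensated one; higher-order predictors trade this residual against noise amplification.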

  10. Page Oriented Holographic Memories And Optical Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Caulfield, H. J.

    1987-08-01

    In the twenty-two years since VanderLugt's introduction of holographic matched filtering, the intensive research carried out throughout the world has led to no applications in complex environments. This leads one to suspect that the VanderLugt filter technique is insufficiently complex to handle truly complex problems. Therefore, it is of great interest to increase the complexity of the VanderLugt filtering operation. We introduce here an approach to real-time filter assembly: the use of page oriented holographic memories and optically addressed SLMs to achieve intelligent and fast reprogramming of the filters using a 10^4 to 10^6 stored pattern base.

  11. Filtered epithermal quasi-monoenergetic neutron beams at research reactor facilities.

    PubMed

    Mansy, M S; Bashter, I I; El-Mesiry, M S; Habib, N; Adib, M

    2015-03-01

    Filtered neutron techniques were applied to produce quasi-monoenergetic neutron beams in the energy range of 1.5-133 keV at research reactors. A simulation study was performed to characterize the filter components and transmitted beam lines. The filtered beams were characterized in terms of the optimal thickness of the main and additive components. The filtered neutron beams had high purity and intensity, with low contamination from the accompanying thermal emission, fast neutrons and γ-rays. A computer code named "QMNB" was developed in the "MATLAB" programming language to perform the required calculations. Copyright © 2014 Elsevier Ltd. All rights reserved.
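
    The principle of a neutron filter, strong attenuation everywhere except at a deep cross-section minimum, follows from the exponential attenuation law T = exp(−n·σ·t). The sketch below is purely illustrative: the cross-section values, number density, and thickness are invented round numbers, not evaluated nuclear data or the thicknesses optimized in the paper.

```python
import numpy as np

def transmission(sigma_barn, n_per_cm3, thickness_cm):
    """Neutron transmission through a uniform slab, T = exp(-n * sigma * t)."""
    sigma_cm2 = sigma_barn * 1e-24      # 1 barn = 1e-24 cm^2
    return np.exp(-n_per_cm3 * sigma_cm2 * thickness_cm)

n_atoms = 5.0e22      # atoms/cm^3, a typical solid (illustrative)
t = 30.0              # filter thickness in cm (illustrative)
T_window = transmission(0.5, n_atoms, t)   # deep cross-section dip at pass energy
T_offwin = transmission(3.0, n_atoms, t)   # typical off-window cross-section
```

A modest cross-section ratio between the pass energy and the rest of the spectrum, amplified by the exponential over a thick filter, is what makes the transmitted beam quasi-monoenergetic.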

  12. Miscellaneous methods for measuring matric or water potential

    USGS Publications Warehouse

    Scanlon, Bridget R.; Andraski, Brian J.; Bilskie, Jim; Dane, Jacob H.; Topp, G. Clarke

    2002-01-01

    A variety of techniques to measure matric potential or water potential in the laboratory and in the field are described in this section. The techniques described herein require equilibration of some medium whose matric or water potential can be determined from previous calibration or can be measured directly. Under equilibrium conditions the matric or water potential of the medium is equal to that of the soil. The techniques can be divided into: (i) those that measure matric potential and (ii) those that measure water potential (sum of matric and osmotic potentials). Matric potential is determined when the sensor matrix is in direct contact with the soil, so salts are free to diffuse in or out of the sensor matrix, and the equilibrium measurement therefore reflects matric forces acting on the water. Water potential is determined when the sensor is separated from the soil by a vapor gap, so salts are not free to move in or out of the sensor, and the equilibrium measurement reflects the sum of the matric and osmotic forces acting on the water. Seven different techniques are described in this section. Those that measure matric potential include (i) heat dissipation sensors, (ii) electrical resistance sensors, (iii) frequency domain and time domain sensors, and (iv) electro-optical switches. A method that can be used to measure matric potential or water potential is the (v) filter paper method. Techniques that measure water potential include (vi) the Dew Point Potentiameter (Decagon Devices, Inc., Pullman, WA) (water activity meter) and (vii) vapor equilibration. The first four techniques are electronically based methods for measuring matric potential. Heat dissipation sensors and electrical resistance sensors infer matric potential from previously determined calibration relations between sensor heat dissipation or electrical resistance and matric potential. 
Frequency-domain and time-domain matric potential sensors measure water content, which is related to matric potential of the sensor through calibration. Electro-optical switches measure changes in light transmission through thin, nylon filters as they absorb or desorb water in response to changes in matric potential. Heat dissipation sensors and electrical resistance sensors are used primarily in the field to provide information on matric potential. Frequency domain matric potential sensors are new and have not been widely used. Time domain matric potential sensors and electro-optical switches are new and have not been commercialized. For the fifth technique, filter paper is used as the standard matrix. The filter paper technique measures matric potential when the filter paper is in direct contact with soil or water potential when separated from soil by a vapor gap. The Dew Point Potentiameter calculates water potential from the measured dew point and sample temperature. The vapor equilibration technique involves equilibration of soil samples with salt solutions of known osmotic potential. The filter paper, Dew Point Potentiameter, and vapor equilibration techniques are generally used in the laboratory to measure water potential of disturbed field samples or to measure water potential for water retention functions.

  13. Multi-agent cooperation rescue algorithm based on influence degree and state prediction

    NASA Astrophysics Data System (ADS)

    Zheng, Yanbin; Ma, Guangfu; Wang, Linlin; Xi, Pengxue

    2018-04-01

    Aiming at multi-agent cooperative rescue in disasters, a multi-agent cooperative rescue algorithm based on influence degree and state prediction is proposed. First, based on the influence of the information in the scene on the collaborative task, an influence degree function is used to filter the information. Second, the selected information is used to predict the state of the system and agent behavior. Finally, according to the prediction results, the cooperative behavior of the agents is guided, improving the efficiency of individual collaboration. The simulation results show that this algorithm can effectively solve the cooperative rescue problem of multiple agents and ensure efficient completion of the task.

  14. Coliform species recovered from untreated surface water and drinking water by the membrane filter, standard, and modified most-probable-number techniques.

    PubMed Central

    Evans, T M; LeChevallier, M W; Waarvick, C E; Seidler, R J

    1981-01-01

    The species of total coliform bacteria isolated from drinking water and untreated surface water by the membrane filter (MF), the standard most-probable-number (S-MPN), and modified most-probable-number (M-MPN) techniques were compared. Each coliform detection technique selected for a different profile of coliform species from both types of water samples. The MF technique indicated that Citrobacter freundii was the most common coliform species in water samples. However, the fermentation tube techniques displayed selectivity towards the isolation of Escherichia coli and Klebsiella. The M-MPN technique selected for more C. freundii and Enterobacter spp. from untreated surface water samples and for more Enterobacter and Klebsiella spp. from drinking water samples than did the S-MPN technique. The lack of agreement between the number of coliforms detected in a water sample by the S-MPN, M-MPN, and MF techniques was a result of the selection for different coliform species by the various techniques. PMID:7013706

  15. Input filter compensation for switching regulators

    NASA Technical Reports Server (NTRS)

    Lee, F. C.; Kelkar, S. S.

    1982-01-01

The problems caused by the interaction between the input filter, the output filter, and the control loop are discussed. Input filter design is complicated by the need to avoid performance degradation while staying within weight and loss limitations. Conventional input filter design techniques are then discussed. The concept of pole-zero cancellation is reviewed; this concept is the basis for an approach to control the peaking of the output impedance of the input filter and thus mitigate some of the problems it causes. The proposed approach for controlling the peaking of the input filter's output impedance is to use a feedforward loop working in conjunction with the feedback loops, forming a total state control scheme. The design of the feedforward loop for a buck regulator is described, and a possible implementation of the feedforward loop design is suggested.
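The output impedance peaking at the heart of this problem can be illustrated with a single-stage LC input filter. The component values and the series damping resistor below are illustrative (the paper's remedy is feedforward control, not passive damping); the sketch only shows that the impedance seen by the regulator peaks sharply near the filter's resonance:

```python
import math, cmath

def input_filter_zout(f, L=100e-6, C=200e-6, r_damp=0.5):
    """Magnitude of the output impedance of a single-stage LC input
    filter: the parallel combination of the inductor branch (with a
    small series resistance) and the capacitor. Values are illustrative."""
    w = 2 * math.pi * f
    z_l = complex(r_damp, w * L)          # inductor branch
    z_c = 1 / complex(0, w * C)           # capacitor branch
    return abs(z_l * z_c / (z_l + z_c))   # parallel combination

f0 = 1 / (2 * math.pi * math.sqrt(100e-6 * 200e-6))  # resonance, ~1.1 kHz
peak = input_filter_zout(f0)
print(peak > input_filter_zout(10 * f0))  # impedance peaks near resonance
```

When this peak exceeds the regulator's negative incremental input resistance, the loop can oscillate, which is what motivates actively shaping the interaction rather than over-sizing the passive damping.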

  16. Silicon oxide nanoparticles doped PQ-PMMA for volume holographic imaging filters.

    PubMed

    Luo, Yuan; Russo, Juan M; Kostuk, Raymond K; Barbastathis, George

    2010-04-15

Holographic imaging filters are required to have high Bragg selectivity, namely, narrow angular and spectral bandwidth, to obtain spatial-spectral information within a three-dimensional object. In this Letter, we present the design of holographic imaging filters formed using silicon oxide nanoparticles (nano-SiO(2)) in phenanthrenequinone-poly(methyl methacrylate) (PQ-PMMA) polymer recording material. This combination offers greater Bragg selectivity and increases the diffraction efficiency of holographic filters. Holographic filters with an optimized ratio of nano-SiO(2) in PQ-PMMA can improve Bragg selectivity and diffraction efficiency by 53% and 16%, respectively. We present experimental results and data analysis demonstrating this technique in use for holographic spatial-spectral imaging filters.

  17. Reconfigurable Gabor Filter For Fingerprint Recognition Using FPGA Verilog

    NASA Astrophysics Data System (ADS)

    Rosshidi, H. T.; Hadi, A. R.

    2009-06-01

This paper presents an implementation of a Gabor filter for fingerprint recognition using Verilog HDL. The work demonstrates the application of the Gabor filter technique to enhance fingerprint images. The incoming signal, in the form of image pixels, is convolved with the Gabor filter to define the ridge and valley regions of the fingerprint. This is done with a real-time convolver implemented on a Field Programmable Gate Array (FPGA). The main characteristics of the proposed approach are the use of memory to store the incoming image pixels and the Gabor filter coefficients before the convolution takes place. The result is the signal convolved with the Gabor coefficients.
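The filtering operation the FPGA performs — precompute a Gabor kernel, then slide it over the image — can be sketched in software. The kernel parameters below are illustrative, not taken from the paper; in the hardware design both the pixel window and these coefficients would sit in memory before the multiply-accumulate step:

```python
import math

def gabor_kernel(size=7, theta=0.0, freq=0.15, sigma=2.5):
    """Real part of a Gabor kernel: a Gaussian window times a cosine
    oriented along theta (all parameters are illustrative)."""
    half = size // 2
    kernel = []
    for y in range(-half, half + 1):
        row = []
        for x in range(-half, half + 1):
            xr = x * math.cos(theta) + y * math.sin(theta)
            g = math.exp(-(x * x + y * y) / (2 * sigma * sigma))
            row.append(g * math.cos(2 * math.pi * freq * xr))
        kernel.append(row)
    return kernel

def convolve(img, k):
    """Valid-mode 2-D convolution of an image (list of rows) with k."""
    kh, kw = len(k), len(k[0])
    return [
        [sum(img[i + u][j + v] * k[u][v]
             for u in range(kh) for v in range(kw))
         for j in range(len(img[0]) - kw + 1)]
        for i in range(len(img) - kh + 1)
    ]
```

Ridges aligned with the kernel's orientation and spatial frequency produce strong responses, which is what separates ridge and valley regions in the enhanced image.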

  18. Distortion analysis of subband adaptive filtering methods for FMRI active noise control systems.

    PubMed

    Milani, Ali A; Panahi, Issa M; Briggs, Richard

    2007-01-01

A delayless subband filtering structure, a high-performance frequency-domain filtering technique, is used for canceling broadband fMRI noise (8 kHz bandwidth). In this method, adaptive filtering is done in subbands, and the coefficients of the main canceling filter are computed by stacking the subband weights together. There are two stacking methods, called FFT and FFT-2. In this paper, we analyze the distortion introduced by these two stacking methods. The effect of the stacking distortion on the performance of different adaptive filters in the FXLMS algorithm with a non-minimum-phase secondary path is explored. The investigation covers different adaptive algorithms (nLMS, APA, and RLS), different weight stacking methods, and different numbers of subbands.
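Of the adaptive algorithms compared, nLMS is the simplest to illustrate. The sketch below is a fullband baseline with illustrative step size and plant; the subband decomposition and the FFT weight stacking under analysis in the paper are omitted:

```python
import random

def nlms_step(w, x, d, mu=0.5, eps=1e-8):
    """One normalized-LMS update: w are the filter taps, x the input
    samples (most recent first), d the desired sample."""
    y = sum(wi * xi for wi, xi in zip(w, x))
    e = d - y
    norm = sum(xi * xi for xi in x) + eps
    w = [wi + mu * e * xi / norm for wi, xi in zip(w, x)]
    return w, e

# Identify a toy 2-tap plant h = [0.5, -0.3] from random input.
random.seed(0)
h = [0.5, -0.3]
w = [0.0, 0.0]
buf = [0.0, 0.0]
for _ in range(2000):
    buf = [random.uniform(-1, 1)] + buf[:-1]
    d = sum(hi * xi for hi, xi in zip(h, buf))
    w, e = nlms_step(w, buf, d)
print([round(wi, 2) for wi in w])  # taps converge toward [0.5, -0.3]
```

In the delayless subband structure, updates like this run independently in each subband, and the stacking step that merges those subband weights into one fullband filter is where the distortion analyzed in the paper enters.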

  19. Detection of circuit-board components with an adaptive multiclass correlation filter

    NASA Astrophysics Data System (ADS)

    Diaz-Ramirez, Victor H.; Kober, Vitaly

    2008-08-01

A new method for reliable detection of circuit-board components is proposed. The method is based on an adaptive multiclass composite correlation filter. The filter is designed with the help of an iterative algorithm using complex synthetic discriminant functions. The impulse response of the filter contains the information needed to localize and classify geometrically distorted circuit-board components belonging to different classes. Computer simulation results obtained with the proposed method are provided and compared with those of known multiclass correlation-based techniques in terms of performance criteria for recognition and classification of objects.
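The synthetic discriminant function (SDF) construction underlying such composite filters can be shown on toy vectors. This is the basic non-iterative, real-valued SDF, h = X(XᵀX)⁻¹u, with made-up 4-pixel "images" — a simplification of the paper's iterative, complex, multiclass design:

```python
# SDF composite filter sketch: choose h so each flattened training
# image x_i correlates with h to a prescribed peak value u_i.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def sdf_filter(X, u):
    """h = X (X^T X)^-1 u for exactly two training vectors X = [x1, x2]
    (2x2 Gram matrix inverted in closed form)."""
    a = dot(X[0], X[0]); b = dot(X[0], X[1]); d = dot(X[1], X[1])
    det = a * d - b * b
    c = [( d * u[0] - b * u[1]) / det,
         (-b * u[0] + a * u[1]) / det]
    return [c[0] * x1 + c[1] * x2 for x1, x2 in zip(X[0], X[1])]

x1 = [1.0, 0.0, 2.0, 1.0]   # toy "class 1" pattern
x2 = [0.0, 1.0, 1.0, 2.0]   # toy "class 2" pattern
h = sdf_filter([x1, x2], u=[1.0, -1.0])  # +1 peak for class 1, -1 for class 2
print(round(dot(h, x1), 6), round(dot(h, x2), 6))
```

By construction the correlation peak encodes the class label, which is how a single impulse response can both localize and classify; the paper's iterative algorithm extends this to distorted views and multiple classes.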

  20. Optimum filter-based discrimination of neutrons and gamma rays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amiri, Moslem; Prenosil, Vaclav; Cvachovec, Frantisek

    2015-07-01

An optimum filter-based method for discrimination of neutrons and gamma rays in a mixed radiation field is presented. Existing filter-based implementations of discriminators require sample pulse responses in advance of the experiment to build the filter coefficients, which makes them less practical. Our novel technique creates the coefficients during the experiment and improves their quality gradually. Applied to several sets of mixed neutron and photon signals obtained through different digitizers using a stilbene scintillator, the approach is analyzed and its discrimination quality is measured.
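The core of filter-based pulse-shape discrimination is correlating each digitized pulse against neutron-like and gamma-like templates and taking the larger response. The sketch below uses made-up double-exponential pulse shapes and fixed templates; in the paper's scheme the template coefficients would instead be built up and refined from pulses acquired during the run:

```python
import math

def pulse(tail, n=64, amp=1.0):
    """Toy scintillator pulse: fast rise, exponential decay. Neutron
    pulses carry a larger slow component than gamma pulses, modeled
    here by a longer tail constant (shapes are illustrative)."""
    return [amp * (math.exp(-t / tail) - math.exp(-t / 1.5))
            for t in range(n)]

def classify(sig, t_neutron, t_gamma):
    """Correlate the pulse against both normalized templates and pick
    the class whose template responds more strongly."""
    def score(tpl):
        norm = math.sqrt(sum(x * x for x in tpl))
        return sum(s * x for s, x in zip(sig, tpl)) / norm
    return "neutron" if score(t_neutron) > score(t_gamma) else "gamma"

t_n = pulse(tail=20.0)  # slow tail: neutron-like template
t_g = pulse(tail=6.0)   # fast tail: gamma-like template
print(classify(pulse(20.0), t_n, t_g), classify(pulse(6.0), t_n, t_g))
```

Building the templates during the experiment, as the paper proposes, removes the need for pre-recorded sample responses while letting discrimination quality improve as more pulses are accumulated.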
