Sample records for pattern search technique

  1. Optimization technique for problems with an inequality constraint

    NASA Technical Reports Server (NTRS)

    Russell, K. J.

    1972-01-01

    The general technique uses a modified version of an existing method termed the pattern search technique. A new procedure, called the parallel move strategy, permits the pattern search technique to be used on problems involving a constraint.
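
    As an illustration of the family of methods this record refers to, here is a minimal, assumption-laden sketch of a coordinate pattern search that simply rejects exploratory moves violating an inequality constraint g(x) <= 0; it is not Russell's parallel move strategy, and the test problem, step sizes, and function names are invented for the example.

```python
# Minimal coordinate pattern search with an inequality constraint g(x) <= 0.
# Illustrative sketch only; the report's "parallel move strategy" is more elaborate.
def pattern_search(f, g, x0, step=0.5, shrink=0.5, tol=1e-6, max_iter=1000):
    x, fx = list(x0), f(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                if g(trial) <= 0 and f(trial) < fx:   # feasible and better
                    x, fx, improved = trial, f(trial), True
        if not improved:
            step *= shrink                            # contract the pattern
            if step < tol:
                break
    return x, fx

# Example: minimize (x-3)^2 + (y-2)^2 subject to x + y - 4 <= 0
xopt, fopt = pattern_search(lambda v: (v[0] - 3)**2 + (v[1] - 2)**2,
                            lambda v: v[0] + v[1] - 4,
                            x0=[0.0, 0.0])
print(xopt, fopt)
```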

  2. Pattern Discovery in Biomolecular Data – Tools, Techniques, and Applications | Center for Cancer Research

    Cancer.gov

    Finding patterns in biomolecular data, particularly in DNA and RNA, is at the center of modern biological research. These data are complex and growing rapidly, so the search for patterns requires increasingly sophisticated computer methods. This book provides a summary of principal techniques. Each chapter describes techniques that are drawn from many fields, including graph

  3. Hybrid General Pattern Search and Simulated Annealing for Industrial Production Planning Problems

    NASA Astrophysics Data System (ADS)

    Vasant, P.; Barsoum, N.

    2010-06-01

    In this paper, the hybridization of the GPS (General Pattern Search) method and SA (Simulated Annealing) is incorporated in the optimization process in order to search for the global optimal solution of the fitness function and decision variables, as well as minimum computational CPU time. The real strength of the SA approach is tested on this case-study problem of industrial production planning, owing to the great advantage of SA in easily escaping from local minima by accepting up-hill moves through a probabilistic procedure in the final stages of the optimization process. Vasant [1], in his Ph.D. thesis, provided 16 different heuristic and meta-heuristic techniques for solving industrial production problems with non-linear cubic objective functions, eight decision variables and 29 constraints. In this paper, fuzzy technological problems have been solved using hybrid techniques of general pattern search and simulated annealing. The simulated and computational results are compared with various other evolutionary techniques.
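
    A rough sketch of how a pattern-search move set can be combined with a simulated-annealing (Metropolis) acceptance rule so that up-hill moves are occasionally accepted; the objective function, cooling schedule, and parameter names below are placeholders and do not reproduce the authors' GPS/SA hybrid or their fuzzy production-planning model.

```python
import math, random

# Pattern-search neighborhood with a simulated-annealing (Metropolis) acceptance rule.
# Illustrative hybrid only; the paper's GPS/SA hybrid and its fuzzy model differ.
def hybrid_gps_sa(f, x0, step=1.0, temp=10.0, cooling=0.95, shrink=0.7, iters=500):
    x, fx = list(x0), f(x0)
    best, fbest = list(x), fx
    for _ in range(iters):
        moved = False
        for i in range(len(x)):
            for delta in (step, -step):
                trial = list(x)
                trial[i] += delta
                ft = f(trial)
                # Accept improvements, or uphill moves with Metropolis probability.
                if ft < fx or random.random() < math.exp(-(ft - fx) / max(temp, 1e-12)):
                    x, fx, moved = trial, ft, True
                    if fx < fbest:
                        best, fbest = list(x), fx
        temp *= cooling                 # cool down: fewer uphill moves over time
        if not moved:
            step *= shrink              # contract pattern when no move was taken
    return best, fbest

best, fbest = hybrid_gps_sa(lambda v: (v[0] - 1)**2 + 2 * (v[1] + 2)**2, [5.0, 5.0])
print(best, fbest)
```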

  4. Phase demodulation from a single fringe pattern based on a correlation technique.

    PubMed

    Robin, Eric; Valle, Valéry

    2004-08-01

    We present a method for determining the demodulated phase from a single fringe pattern. This method, based on a correlation technique, searches in a zone of interest for the degree of similarity between a real fringe pattern and a mathematical model. This method, named modulated phase correlation, is tested with different examples.

  5. Identifying spatially similar gene expression patterns in early stage fruit fly embryo images: binary feature versus invariant moment digital representations

    PubMed Central

    Gurunathan, Rajalakshmi; Van Emden, Bernard; Panchanathan, Sethuraman; Kumar, Sudhir

    2004-01-01

    Background: Modern developmental biology relies heavily on the analysis of embryonic gene expression patterns. Investigators manually inspect hundreds or thousands of expression patterns to identify those that are spatially similar and to ultimately infer potential gene interactions. However, the rapid accumulation of gene expression pattern data over the last two decades, facilitated by high-throughput techniques, has produced a need for the development of efficient approaches for direct comparison of images, rather than their textual descriptions, to identify spatially similar expression patterns. Results: The effectiveness of the Binary Feature Vector (BFV) and Invariant Moment Vector (IMV) based digital representations of the gene expression patterns in finding biologically meaningful patterns was compared for a small (226 images) and a large (1819 images) dataset. For each dataset, an ordered list of images, with respect to a query image, was generated to identify overlapping and similar gene expression patterns, in a manner comparable to what a developmental biologist might do. The results showed that the BFV representation consistently outperforms the IMV representation in finding biologically meaningful matches when spatial overlap of the gene expression pattern and the genes involved are considered. Furthermore, we explored the value of conducting image-content based searches in a dataset where individual expression components (or domains) of multi-domain expression patterns were also included separately. We found that this technique improves performance of both IMV and BFV based searches. Conclusions: We conclude that the BFV representation consistently produces a more extensive and better list of biologically useful patterns than the IMV representation. The high quality of results obtained scales well as the search database becomes larger, which encourages efforts to build automated image query and retrieval systems for spatial gene expression patterns. PMID:15603586

  6. Optimization of a Boiling Water Reactor Loading Pattern Using an Improved Genetic Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2003-08-15

    A search method based on genetic algorithms (GA) using deterministic operators has been developed to generate optimized boiling water reactor (BWR) loading patterns (LPs). The search method uses an Improved GA operator, that is, crossover, mutation, and selection. The handling of the encoding technique and constraint conditions is designed so that the GA reflects the peculiar characteristics of the BWR. In addition, some strategies such as elitism and self-reproduction are effectively used to improve the search speed. LP evaluations were performed with a three-dimensional diffusion code that coupled neutronic and thermal-hydraulic models. Strong axial heterogeneities and three-dimensional-dependent constraints have always necessitated the use of three-dimensional core simulators for BWRs, so that an optimization method is required for computational efficiency. The proposed algorithm is demonstrated by successfully generating LPs for an actual BWR plant applying the Haling technique. In test calculations, candidates that shuffled fresh and burned fuel assemblies within a reasonable computation time were obtained.
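
    The abstract describes crossover, mutation, selection, and elitism; the toy sketch below shows only that generic GA loop on a permutation-encoded "loading pattern" with a placeholder fitness function. A real evaluation would call a three-dimensional coupled neutronic/thermal-hydraulic core simulator, which is not modeled here, and all names are illustrative.

```python
import random

# Toy permutation GA with elitism; stands in for the LP search loop only.
# A real BWR evaluation would call a 3-D coupled neutronic/thermal-hydraulic code.
def toy_ga(fitness, n_genes=10, pop_size=30, gens=100, elite=2, mut_rate=0.2):
    pop = [random.sample(range(n_genes), n_genes) for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)          # selection pressure: best first
        nxt = [list(p) for p in pop[:elite]]         # elitism: carry over best LPs
        while len(nxt) < pop_size:
            a, b = random.sample(pop[:pop_size // 2], 2)
            cut = random.randrange(1, n_genes)
            child = a[:cut] + [g for g in b if g not in a[:cut]]  # order crossover
            if random.random() < mut_rate:           # mutation: swap two positions
                i, j = random.sample(range(n_genes), 2)
                child[i], child[j] = child[j], child[i]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Placeholder fitness: prefer "fresher" assemblies toward position 0 of the pattern.
best = toy_ga(lambda lp: -sum(i * g for i, g in enumerate(lp)))
print(best)
```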

  7. LoyalTracker: Visualizing Loyalty Dynamics in Search Engines.

    PubMed

    Shi, Conglei; Wu, Yingcai; Liu, Shixia; Zhou, Hong; Qu, Huamin

    2014-12-01

    The huge amount of user log data collected by search engine providers creates new opportunities to understand user loyalty and defection behavior at an unprecedented scale. However, this also poses a great challenge for analyzing the behavior and gleaning insights from the complex, large-scale data. In this paper, we introduce LoyalTracker, a visual analytics system to track user loyalty and switching behavior towards multiple search engines from the vast amount of user log data. We propose a new interactive visualization technique (flow view) based on a flow metaphor, which conveys a proper visual summary of the dynamics of user loyalty of thousands of users over time. Two other visualization techniques, a density map and a word cloud, are integrated to enable analysts to gain further insights into the patterns identified by the flow view. Case studies and interviews with domain experts were conducted to demonstrate the usefulness of our technique in understanding user loyalty and switching behavior in search engines.

  8. Evaluation of Simulated Clinical Breast Exam Motion Patterns Using Marker-Less Video Tracking

    PubMed Central

    Azari, David P.; Pugh, Carla M.; Laufer, Shlomi; Kwan, Calvin; Chen, Chia-Hsiung; Yen, Thomas Y.; Hu, Yu Hen; Radwin, Robert G.

    2016-01-01

    Objective: This study investigates using marker-less video tracking to evaluate hands-on clinical skills during simulated clinical breast examinations (CBEs). Background: There are currently no standardized and widely accepted CBE screening techniques. Methods: Experienced physicians attending a national conference conducted simulated CBEs presenting different pathologies with distinct tumorous lesions. Single hand exam motion was recorded and analyzed using marker-less video tracking. Four kinematic measures were developed to describe temporal (time pressing and time searching) and spatial (area covered and distance explored) patterns. Results: Mean differences between time pressing, area covered, and distance explored varied across the simulated lesions. Exams were objectively categorized as either sporadic, localized, thorough, or efficient for both temporal and spatial categories based on spatiotemporal characteristics. The majority of trials were temporally or spatially thorough (78% and 91%), exhibiting proportionally greater time pressing and time searching (temporally thorough) and greater area probed with greater distance explored (spatially thorough). More efficient exams exhibited proportionally more time pressing with less time searching (temporally efficient) and greater area probed with less distance explored (spatially efficient). Just two (5.9%) of the trials exhibited both high temporal and spatial efficiency. Conclusions: Marker-less video tracking was used to discriminate different examination techniques and measure when an exam changes from general searching to specific probing. The majority of participants exhibited more thorough than efficient patterns. Application: Marker-less video kinematic tracking may be useful for quantifying clinical skills for training and assessment. PMID:26546381

  9. Settlement patterns, GIS, remote sensing, and the late prehistory of the Black Prairie in east central Mississippi

    NASA Technical Reports Server (NTRS)

    Johnson, Jay K.

    1991-01-01

    Data recovered as the result of a recent field project designed to test a model of the distribution of protohistoric settlement in an unusual physiographic zone in eastern Mississippi are examined using GIS based techniques to manipulate soil and stream distance information. Significant patterning is derived. The generally thin soils and uniform substratum of the Black Prairie in combination with a distinctive settlement pattern offer a promising opportunity for the search for site specific characteristics within airborne imagery. Landsat TM data provide information on modern ground cover which is used as a mask to select areas in which a multivariate search for archaeological site signatures within a TIMS image is most likely to prove fruitful.

  10. In Search of Lakshmi's Footprints: A Brief Study of the Use of Surface Design in India. Fulbright-Hays Summer Seminars Abroad, 1997 (India).

    ERIC Educational Resources Information Center

    Rasmussen, Marie

    This paper provides a description of the use of surface design in India and how those patterns have migrated throughout India. This study is confined in interest to the use of design and pattern to convey religious symbolism and other auspicious meanings. The migration of pattern to various parts of India will change the name or the technique, but…

  11. Fast and simple character classes and bounded gaps pattern matching, with applications to protein searching.

    PubMed

    Navarro, Gonzalo; Raffinot, Mathieu

    2003-01-01

    The problem of fast exact and approximate searching for a pattern that contains classes of characters and bounded size gaps (CBG) in a text has a wide range of applications, among which a very important one is protein pattern matching (for instance, one PROSITE protein site is associated with the CBG [RK] - x(2,3) - [DE] - x(2,3) - Y, where the brackets match any of the letters inside, and x(2,3) a gap of length between 2 and 3). Currently, the only way to search for a CBG in a text is to convert it into a full regular expression (RE). However, a RE is more sophisticated than a CBG, and searching for it with a RE pattern matching algorithm complicates the search and makes it slow. This is the reason why we design in this article two new practical CBG matching algorithms that are much simpler and faster than all the RE search techniques. The first one looks exactly once at each text character. The second one does not need to consider all the text characters, and hence it is usually faster than the first one, but in bad cases may have to read the same text character more than once. We then propose a criterion based on the form of the CBG to choose a priori the fastest between both. We also show how to search permitting a few mistakes in the occurrences. We performed many practical experiments using the PROSITE database, and all of them show that our algorithms are the fastest in virtually all cases.
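
    The abstract notes that the existing baseline is to convert a CBG into a full regular expression; the sketch below shows only that naive conversion (not the paper's faster algorithms), using the PROSITE-style example quoted above. The helper name and test string are illustrative.

```python
import re

# Convert a PROSITE-style CBG such as "[RK]-x(2,3)-[DE]-x(2,3)-Y" into a regex.
# This is the naive baseline the paper improves upon, not its new algorithms.
def cbg_to_regex(cbg):
    parts = []
    for token in cbg.split('-'):
        token = token.strip()
        if token.startswith('x(') and token.endswith(')'):
            lo, hi = token[2:-1].split(',')
            parts.append('.{%s,%s}' % (lo.strip(), hi.strip()))   # bounded gap
        elif token == 'x':
            parts.append('.')                                      # single wildcard
        else:
            parts.append(token)            # character class like [RK] or literal Y
    return re.compile(''.join(parts))

pattern = cbg_to_regex('[RK]-x(2,3)-[DE]-x(2,3)-Y')
print([m.start() for m in pattern.finditer('MKAADLLYGG')])   # -> [1]
```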

  12. A System for Automated Extraction of Metadata from Scanned Documents using Layout Recognition and String Pattern Search Models.

    PubMed

    Misra, Dharitri; Chen, Siyuan; Thoma, George R

    2009-01-01

    One of the most expensive aspects of archiving digital documents is the manual acquisition of context-sensitive metadata useful for the subsequent discovery of, and access to, the archived items. For certain types of textual documents, such as journal articles, pamphlets, official government records, etc., where the metadata is contained within the body of the documents, a cost effective method is to identify and extract the metadata in an automated way, applying machine learning and string pattern search techniques. At the U. S. National Library of Medicine (NLM) we have developed an automated metadata extraction (AME) system that employs layout classification and recognition models with a metadata pattern search model for a text corpus with structured or semi-structured information. A combination of Support Vector Machine and Hidden Markov Model is used to create the layout recognition models from a training set of the corpus, following which a rule-based metadata search model is used to extract the embedded metadata by analyzing the string patterns within and surrounding each field in the recognized layouts. In this paper, we describe the design of our AME system, with focus on the metadata search model. We present the extraction results for a historic collection from the Food and Drug Administration, and outline how the system may be adapted for similar collections. Finally, we discuss some ongoing enhancements to our AME system.
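
    As a hedged illustration of the rule-based string-pattern step described above (regular expressions applied to the text of a recognized layout zone), the sketch below extracts a few fields from OCR text; the field names, patterns, and sample text are invented for the example, and the SVM/HMM layout-recognition models are not reproduced.

```python
import re

# Rule-based string-pattern extraction over OCR text of a recognized layout zone.
# Field names and patterns are illustrative; the SVM/HMM layout models are omitted.
FIELD_PATTERNS = {
    "date":      re.compile(r'\b(?:January|February|March|April|May|June|July|'
                            r'August|September|October|November|December)\s+\d{1,2},\s+\d{4}'),
    "docket_no": re.compile(r'\bDocket\s+No\.\s*([A-Z0-9-]+)', re.IGNORECASE),
    "title":     re.compile(r'^(.+)$', re.MULTILINE),   # first non-empty line as title
}

def extract_metadata(ocr_text):
    meta = {}
    for field, pattern in FIELD_PATTERNS.items():
        m = pattern.search(ocr_text)
        if m:
            meta[field] = (m.group(1) if m.groups() else m.group(0)).strip()
    return meta

sample = "Notice of Hearing\nDocket No. FDA-1977-N-0013\nJune 3, 1977\n..."
print(extract_metadata(sample))
```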

  13. Hierarchical random walks in trace fossils and the origin of optimal search behavior

    PubMed Central

    Sims, David W.; Reynolds, Andrew M.; Humphries, Nicolas E.; Southall, Emily J.; Wearmouth, Victoria J.; Metcalfe, Brett; Twitchett, Richard J.

    2014-01-01

    Efficient searching is crucial for timely location of food and other resources. Recent studies show that diverse living animals use a theoretically optimal scale-free random search for sparse resources known as a Lévy walk, but little is known of the origins and evolution of foraging behavior and the search strategies of extinct organisms. Here, using simulations of self-avoiding trace fossil trails, we show that randomly introduced strophotaxis (U-turns)—initiated by obstructions such as self-trail avoidance or innate cueing—leads to random looping patterns with clustering across increasing scales that is consistent with the presence of Lévy walks. This predicts that optimal Lévy searches may emerge from simple behaviors observed in fossil trails. We then analyzed fossilized trails of benthic marine organisms by using a novel path analysis technique and find the first evidence, to our knowledge, of Lévy-like search strategies in extinct animals. Our results show that simple search behaviors of extinct animals in heterogeneous environments give rise to hierarchically nested Brownian walk clusters that converge to optimal Lévy patterns. Primary productivity collapse and large-scale food scarcity characterizing mass extinctions evident in the fossil record may have triggered adaptation of optimal Lévy-like searches. The findings suggest that Lévy-like behavior has been used by foragers since at least the Eocene but may have a more ancient origin, which might explain recent widespread observations of such patterns among modern taxa. PMID:25024221
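
    For readers unfamiliar with the terminology, the sketch below simulates a truncated Lévy walk (power-law step lengths) alongside a Brownian-like walk for comparison; it is not the paper's fossil-trail path-analysis technique, and the exponent, length scales, and step counts are arbitrary choices.

```python
import math, random

# Simulate 2-D walks with power-law (Levy, mu ~ 2) versus exponential (Brownian-like)
# step lengths; illustrative only, not the paper's fossil-trail path analysis.
def walk(n_steps, levy=True, mu=2.0, lmin=1.0, lmax=1000.0):
    x = y = 0.0
    path = [(x, y)]
    for _ in range(n_steps):
        if levy:
            # Inverse-transform sampling of a truncated power law P(l) ~ l^(-mu).
            u = random.random()
            a, b = lmin ** (1 - mu), lmax ** (1 - mu)
            step = (a + u * (b - a)) ** (1.0 / (1 - mu))
        else:
            step = random.expovariate(1.0 / lmin)
        theta = random.uniform(0, 2 * math.pi)       # uniform turning angle
        x, y = x + step * math.cos(theta), y + step * math.sin(theta)
        path.append((x, y))
    return path

levy_path = walk(1000, levy=True)
brownian_path = walk(1000, levy=False)
print(len(levy_path), len(brownian_path))
```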

  14. Supercomputer applications in molecular modeling.

    PubMed

    Gund, T M

    1988-01-01

    An overview of the functions performed by molecular modeling is given. Molecular modeling techniques benefiting from supercomputing are described, namely, conformational search, deriving bioactive conformations, pharmacophoric pattern searching, receptor mapping, and electrostatic properties. The use of supercomputers for problems that are computationally intensive, such as protein structure prediction, protein dynamics and reactivity, protein conformations, and energetics of binding is also examined. The current status of supercomputing and supercomputer resources is also discussed.

  15. Structural Pattern Recognition Techniques for Data Retrieval in Massive Fusion Databases

    NASA Astrophysics Data System (ADS)

    Vega, J.; Murari, A.; Rattá, G. A.; Castro, P.; Pereira, A.; Portas, A.

    2008-03-01

    Diagnostics of present day reactor class fusion experiments, like the Joint European Torus (JET), generate thousands of signals (time series and video images) in each discharge. There is a direct correspondence between the physical phenomena taking place in the plasma and the set of structural shapes (patterns) that they form in the signals: bumps, unexpected amplitude changes, abrupt peaks, periodic components, high intensity zones or specific edge contours. A major difficulty related to data analysis is the identification, in a rapid and automated way, of a set of discharges with comparable behavior, i.e. discharges with "similar" patterns. Pattern recognition techniques are efficient tools to search for similar structural forms within the database in a fast and intelligent way. To this end, classification systems must be developed to be used as indexation methods to directly fetch the most similar patterns.

  16. Fingerprint pattern restoration by digital image processing techniques.

    PubMed

    Wen, Che-Yen; Yu, Chiu-Chung

    2003-09-01

    Fingerprint evidence plays an important role in solving criminal problems. However, defective (lacking information needed for completeness) or contaminated (undesirable information included) fingerprint patterns make identifying and recognizing processes difficult. Unfortunately, this is the usual case. In the recognizing process (enhancement of patterns, or elimination of "false alarms" so that a fingerprint pattern can be searched in the Automated Fingerprint Identification System (AFIS)), chemical and physical techniques have been proposed to improve pattern legibility. In the identifying process, a fingerprint examiner can enhance contaminated (but not defective) fingerprint patterns under guidelines provided by the Scientific Working Group on Friction Ridge Analysis, Study and Technology (SWGFAST), the Scientific Working Group on Imaging Technology (SWGIT), and an AFIS working group within the National Institute of Justice. Recently, image processing techniques have been successfully applied in forensic science. For example, we have applied image enhancement methods to improve the legibility of digital images such as fingerprints and vehicle plate numbers. In this paper, we propose a novel digital image restoration technique based on the AM (amplitude modulation)-FM (frequency modulation) reaction-diffusion method to restore defective or contaminated fingerprint patterns. This method shows its potential application to fingerprint pattern enhancement in the recognizing process (but not for the identifying process). Synthetic and real images are used to show the capability of the proposed method. The results of enhancing fingerprint patterns by the manual process and our method are evaluated and compared.

  17. A System for Automated Extraction of Metadata from Scanned Documents using Layout Recognition and String Pattern Search Models

    PubMed Central

    Misra, Dharitri; Chen, Siyuan; Thoma, George R.

    2010-01-01

    One of the most expensive aspects of archiving digital documents is the manual acquisition of context-sensitive metadata useful for the subsequent discovery of, and access to, the archived items. For certain types of textual documents, such as journal articles, pamphlets, official government records, etc., where the metadata is contained within the body of the documents, a cost effective method is to identify and extract the metadata in an automated way, applying machine learning and string pattern search techniques. At the U. S. National Library of Medicine (NLM) we have developed an automated metadata extraction (AME) system that employs layout classification and recognition models with a metadata pattern search model for a text corpus with structured or semi-structured information. A combination of Support Vector Machine and Hidden Markov Model is used to create the layout recognition models from a training set of the corpus, following which a rule-based metadata search model is used to extract the embedded metadata by analyzing the string patterns within and surrounding each field in the recognized layouts. In this paper, we describe the design of our AME system, with focus on the metadata search model. We present the extraction results for a historic collection from the Food and Drug Administration, and outline how the system may be adapted for similar collections. Finally, we discuss some ongoing enhancements to our AME system. PMID:21179386

  18. Data Flow Analysis and Visualization for Spatiotemporal Statistical Data without Trajectory Information.

    PubMed

    Kim, Seokyeon; Jeong, Seongmin; Woo, Insoo; Jang, Yun; Maciejewski, Ross; Ebert, David S

    2018-03-01

    Geographic visualization research has focused on a variety of techniques to represent and explore spatiotemporal data. The goal of those techniques is to enable users to explore events and interactions over space and time in order to facilitate the discovery of patterns, anomalies and relationships within the data. However, it is difficult to extract and visualize data flow patterns over time for non-directional statistical data without trajectory information. In this work, we develop a novel flow analysis technique to extract, represent, and analyze flow maps of non-directional spatiotemporal data unaccompanied by trajectory information. We estimate a continuous distribution of these events over space and time, and extract flow fields for spatial and temporal changes utilizing a gravity model. Then, we visualize the spatiotemporal patterns in the data by employing flow visualization techniques. The user is presented with temporal trends of geo-referenced discrete events on a map. As such, overall spatiotemporal data flow patterns help users analyze geo-referenced temporal events, such as disease outbreaks, crime patterns, etc. To validate our model, we discard the trajectory information in an origin-destination dataset and apply our technique to the data and compare the derived trajectories and the original. Finally, we present spatiotemporal trend analysis for statistical datasets including twitter data, maritime search and rescue events, and syndromic surveillance.

  19. Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud

    PubMed Central

    Munisamy, Shyamala Devi; Chokkalingam, Arun

    2015-01-01

    Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet and facilitates third party infrastructure and applications. While customers have no visibility on how their data is stored on the service provider's premises, it offers greater benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data. The opportunity to use cloud services on a pay-per-use basis provides comfort for private data owners in managing costs and data. With the pervasive usage of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security concerns. In the pursuit of increasing data utilization on public cloud storage, the key is to make effective data access through several fuzzy searching techniques. In this paper, we have discussed the existing fuzzy searching techniques and focused on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for the multiple-keyword request, which greatly reduces the searching time by learning the search path pattern for all the keywords in the fuzzy keyword set. The objective of using a BTree fuzzy searchable index is to resolve typos and representation inconsistencies and also to facilitate effective data utilization. PMID:26380364
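
    One common building block behind fuzzy searchable indexes of the kind discussed above is a wildcard-based fuzzy keyword set covering edit distance 1; the sketch below illustrates that idea only, and the index layout, function names, and sample keywords are assumptions rather than the paper's exact construction.

```python
# Build an edit-distance-1 fuzzy keyword set using wildcard variants, a common
# building block of fuzzy searchable indexes; illustrative, not the paper's scheme.
def fuzzy_variants(word):
    variants = {word}
    for i in range(len(word)):
        variants.add(word[:i] + '*' + word[i + 1:])  # substitution at position i
        variants.add(word[:i] + '*' + word[i:])      # insertion before position i
    variants.add(word + '*')                         # insertion at the end
    return variants

# Index side: map each variant of every indexed keyword back to the keyword.
index = {}
for keyword in ["network", "security", "pattern"]:
    for v in fuzzy_variants(keyword):
        index.setdefault(v, set()).add(keyword)

# Query side: a misspelled query matches if any of its variants collides.
def fuzzy_lookup(query):
    hits = set()
    for v in fuzzy_variants(query):
        hits |= index.get(v, set())
    return hits

print(fuzzy_lookup("netwark"))   # -> {'network'}
```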

  20. Universal Keyword Classifier on Public Key Based Encrypted Multikeyword Fuzzy Search in Public Cloud.

    PubMed

    Munisamy, Shyamala Devi; Chokkalingam, Arun

    2015-01-01

    Cloud computing has pioneered the emerging world by manifesting itself as a service through the internet and facilitates third party infrastructure and applications. While customers have no visibility on how their data is stored on the service provider's premises, it offers greater benefits in lowering infrastructure costs and delivering more flexibility and simplicity in managing private data. The opportunity to use cloud services on a pay-per-use basis provides comfort for private data owners in managing costs and data. With the pervasive usage of the internet, the focus has now shifted towards effective data utilization on the cloud without compromising security concerns. In the pursuit of increasing data utilization on public cloud storage, the key is to make effective data access through several fuzzy searching techniques. In this paper, we have discussed the existing fuzzy searching techniques and focused on reducing the searching time on the cloud storage server for effective data utilization. Our proposed Asymmetric Classifier Multikeyword Fuzzy Search method provides a classifier search server that creates a universal keyword classifier for the multiple-keyword request, which greatly reduces the searching time by learning the search path pattern for all the keywords in the fuzzy keyword set. The objective of using a BTree fuzzy searchable index is to resolve typos and representation inconsistencies and also to facilitate effective data utilization.

  1. Intelligent web image retrieval system

    NASA Astrophysics Data System (ADS)

    Hong, Sungyong; Lee, Chungwoo; Nah, Yunmook

    2001-07-01

    Recently, web sites such as e-business sites and shopping mall sites deal with lots of image information. To find a specific image from these image sources, we usually use web search engines or image database engines, which rely on keyword-only retrievals or color-based retrievals with limited search capabilities. This paper presents an intelligent web image retrieval system. We propose the system architecture, the texture- and color-based image classification and indexing techniques, and representation schemes of user usage patterns. The query can be given by providing keywords, by selecting one or more sample texture patterns, by assigning color values within positional color blocks, or by combining some or all of these factors. The system keeps track of users' preferences by generating user query logs and automatically adds more search information to subsequent user queries. To show the usefulness of the proposed system, some experimental results showing recall and precision are also explained.

  2. Library Search Prefilters for Vehicle Manufacturers to Assist in the Forensic Examination of Automotive Paints.

    PubMed

    Lavine, Barry K; White, Collin G; Ding, Tao

    2018-03-01

    Pattern recognition techniques have been applied to the infrared (IR) spectral libraries of the Paint Data Query (PDQ) database to differentiate between nonidentical but similar IR spectra of automotive paints. To tackle the problem of library searching, search prefilters were developed to identify the vehicle make from IR spectra of the clear coat, surfacer-primer, and e-coat layers. To develop these search prefilters with the appropriate degree of accuracy, IR spectra from the PDQ database were preprocessed using the discrete wavelet transform to enhance subtle but significant features in the IR spectral data. Wavelet coefficients characteristic of vehicle make were identified using a genetic algorithm for pattern recognition and feature selection. Search prefilters to identify automotive manufacturer through IR spectra obtained from a paint chip recovered at a crime scene were developed using 1596 original manufacturer's paint systems spanning six makes (General Motors, Chrysler, Ford, Honda, Nissan, and Toyota) within a limited production year range (2000-2006). Search prefilters for vehicle manufacturer that were developed as part of this study were successfully validated using IR spectra obtained directly from the PDQ database. Information obtained from these search prefilters can serve to quantify the discrimination power of original automotive paint encountered in casework and further efforts to succinctly communicate trace evidential significance to the courts.

  3. SU-E-T-295: Simultaneous Beam Sampling and Aperture Shape Optimization for Station Parameter Optimized Radiation Therapy (SPORT)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zarepisheh, M; Li, R; Xing, L

    Purpose: Station Parameter Optimized Radiation Therapy (SPORT) was recently proposed to fully utilize the technical capability of emerging digital LINACs, in which the station parameters of a delivery system (such as aperture shape and weight, couch position/angle, gantry/collimator angle) are optimized altogether. SPORT promises to deliver unprecedented radiation dose distributions efficiently, yet there does not exist any optimization algorithm to implement it. The purpose of this work is to propose an optimization algorithm to simultaneously optimize the beam sampling and aperture shapes. Methods: We build a mathematical model whose variables are beam angles (including non-coplanar and/or even nonisocentric beams) and aperture shapes. To solve the resulting large-scale optimization problem, we devise an exact, convergent and fast optimization algorithm by integrating three advanced optimization techniques named column generation, gradient method, and pattern search. Column generation is used to find a good set of aperture shapes as an initial solution by adding apertures sequentially. Then we apply the gradient method to iteratively improve the current solution by reshaping the aperture shapes and updating the beam angles toward the gradient. The algorithm continues with a pattern search method to explore the part of the search space that cannot be reached by the gradient method. Results: The proposed technique is applied to a series of patient cases and significantly improves the plan quality. In a head-and-neck case, for example, the left parotid gland mean-dose, brainstem max-dose, spinal cord max-dose, and mandible mean-dose are reduced by 10%, 7%, 24% and 12% respectively, compared to the conventional VMAT plan while maintaining the same PTV coverage. Conclusion: Combined use of column generation, gradient search and pattern search algorithms provides an effective way to simultaneously optimize the large collection of station parameters and significantly improves the quality of the resultant treatment plans as compared with conventional VMAT or IMRT treatments.

  4. Moving Object Detection Using a Parallax Shift Vector Algorithm

    NASA Astrophysics Data System (ADS)

    Gural, Peter S.; Otto, Paul R.; Tedesco, Edward F.

    2018-07-01

    There are various algorithms currently in use to detect asteroids from ground-based observatories, but they are generally restricted to linear or mildly curved movement of the target object across the field of view. Space-based sensors in high inclination, low Earth orbits can induce significant parallax in a collected sequence of images, especially for objects at the typical distances of asteroids in the inner solar system. This results in a highly nonlinear motion pattern of the asteroid across the sensor, which requires a more sophisticated search pattern for detection processing. Both the classical pattern matching used in ground-based asteroid search and the more sensitive matched filtering and synthetic tracking techniques, can be adapted to account for highly complex parallax motion. A new shift vector generation methodology is discussed along with its impacts on commonly used detection algorithms, processing load, and responsiveness to asteroid track reporting. The matched filter, template generator, and pattern matcher source code for the software described herein are available via GitHub.

  5. Research reactor loading pattern optimization using estimation of distribution algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, S.; Ziver, K.; AMCG Group, RM Consultants, Abingdon

    2006-07-01

    A new evolutionary search-based approach for solving nuclear reactor loading pattern optimization problems is presented, based on Estimation of Distribution Algorithms (EDAs). The optimization technique developed is then applied to the maximization of the effective multiplication factor (K{sub eff}) of the Imperial College CONSORT research reactor (the last remaining civilian research reactor in the United Kingdom). A new elitism-guided searching strategy has been developed and applied to improve the local convergence, together with some problem-dependent information based on the 'stand-alone' K{sub eff} with fuel coupling calculations. A comparison study between the EDAs and a Genetic Algorithm with a Heuristic Tie Breaking Crossover operator has shown that the new algorithm is efficient and robust. (authors)
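
    As a generic illustration of the algorithm class used here, the sketch below implements a univariate estimation of distribution algorithm (UMDA) on a binary string: sample a population from marginal probabilities, select the best, and re-estimate the marginals. The fitness function is a placeholder; a real loading-pattern search would evaluate K-eff with a core physics code, and none of the authors' reactor-specific operators are reproduced.

```python
import random

# Univariate EDA (UMDA) on a binary string: sample, select, re-estimate marginals.
# Placeholder fitness only; a real LP search would evaluate K-eff with a core code.
def umda(fitness, n_bits=20, pop_size=60, select=20, gens=50):
    probs = [0.5] * n_bits                               # initial marginal model
    best, fbest = None, float('-inf')
    for _ in range(gens):
        pop = [[1 if random.random() < p else 0 for p in probs] for _ in range(pop_size)]
        pop.sort(key=fitness, reverse=True)
        if fitness(pop[0]) > fbest:
            best, fbest = pop[0], fitness(pop[0])
        elite = pop[:select]                             # truncation selection
        probs = [sum(ind[i] for ind in elite) / select for i in range(n_bits)]
        probs = [min(max(p, 0.05), 0.95) for p in probs] # keep the model from collapsing
    return best, fbest

print(umda(sum))   # maximize the number of ones as a stand-in objective
```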

  6. Age differences in decision making: a process methodology for examining strategic information processing.

    PubMed

    Johnson, M M

    1990-03-01

    This study explored the use of process tracing techniques in examining the decision-making processes of older and younger adults. Thirty-six college-age and thirty-six retirement-age participants decided which one of six cars they would purchase on the basis of computer-accessed data. They provided information search protocols. Results indicate that total time to reach a decision did not differ according to age. However, retirement-age participants used less information, spent more time viewing, and re-viewed fewer bits of information than college-age participants. Information search patterns differed markedly between age groups. Patterns of retirement-age adults indicated their use of noncompensatory decision rules which, according to decision-making literature (Payne, 1976), reduce cognitive processing demands. The patterns of the college-age adults indicated their use of compensatory decision rules, which have higher processing demands.

  7. Rescuing the Clinical Breast Examination: Advances in Classifying Technique and Assessing Physician Competency.

    PubMed

    Laufer, Shlomi; D'Angelo, Anne-Lise D; Kwan, Calvin; Ray, Rebbeca D; Yudkowsky, Rachel; Boulet, John R; McGaghie, William C; Pugh, Carla M

    2017-12-01

    Develop new performance evaluation standards for the clinical breast examination (CBE). There are several technical aspects of a proper CBE. Our recent work discovered a significant, linear relationship between palpation force and CBE accuracy. This article investigates the relationship between other technical aspects of the CBE and accuracy. This performance assessment study involved data collection from physicians (n = 553) attending 3 different clinical meetings between 2013 and 2014: American Society of Breast Surgeons, American Academy of Family Physicians, and American College of Obstetricians and Gynecologists. Four previously validated, sensor-enabled breast models were used for clinical skills assessment. Models A and B had solitary, superficial, 2 cm and 1 cm soft masses, respectively. Models C and D had solitary, deep, 2 cm hard and moderately firm masses, respectively. Finger movements (search technique) from 1137 CBE video recordings were independently classified by 2 observers. Final classifications were compared with CBE accuracy. Accuracy rates were model A = 99.6%, model B = 89.7%, model C = 75%, and model D = 60%. Final classification categories for search technique included rubbing movement, vertical movement, piano fingers, and other. Interrater reliability was k = 0.79. Rubbing movement was 4 times more likely to yield an accurate assessment (odds ratio 3.81, P < 0.001) compared with vertical movement and piano fingers. Piano fingers had the highest failure rate (36.5%). Regression analysis of search pattern, search technique, palpation force, examination time, and 6 demographic variables revealed that search technique independently and significantly affected CBE accuracy (P < 0.001). Our results support measurement and classification of CBE techniques and provide the foundation for a new paradigm in teaching and assessing hands-on clinical skills. The newly described piano fingers palpation technique was noted to have unusually high failure rates. Medical educators should be aware of the potential differences in effectiveness for various CBE techniques.

  8. High density circuit technology

    NASA Technical Reports Server (NTRS)

    Wade, T. E.

    1979-01-01

    Polyimide dielectric materials were acquired for comparative and evaluative studies in double layer metal processes. Preliminary experiments were performed. Also, the literature indicates that sputtered aluminum films may be successfully patterned using the lift-off technique provided the substrate temperature remains low and the argon pressure in the chamber is relatively high at the time of sputtering. Vendors associated with dry processing equipment are identified. A literature search relative to future trends in VLSI fabrication techniques is described.

  9. GrammarViz 3.0: Interactive Discovery of Variable-Length Time Series Patterns

    DOE PAGES

    Senin, Pavel; Lin, Jessica; Wang, Xing; ...

    2018-02-23

    The problems of recurrent and anomalous pattern discovery in time series, e.g., motifs and discords, respectively, have received a lot of attention from researchers in the past decade. However, since the pattern search space is usually intractable, most existing detection algorithms require that the patterns have discriminative characteristics and have their length known in advance and provided as input, which is an unreasonable requirement for many real-world problems. In addition, patterns of similar structure, but of different lengths, may co-exist in a time series. In order to address these issues, we have developed algorithms for variable-length time series pattern discovery that are based on symbolic discretization and grammar inference—two techniques whose combination enables the structured reduction of the search space and discovery of the candidate patterns in linear time. In this work, we present GrammarViz 3.0—a software package that provides implementations of the proposed algorithms and a graphical user interface for interactive variable-length time series pattern discovery. The current version of the software provides an alternative grammar inference algorithm that improves the time series motif discovery workflow, and introduces an experimental procedure for automated discretization parameter selection that builds upon the minimum cardinality maximum cover principle and aids the time series recurrent and anomalous pattern discovery.
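
    A small sketch of the symbolic discretization step that this line of work builds on (SAX-style: z-normalize, piecewise aggregate approximation, map segment means to an alphabet); the breakpoints shown are the standard ones for a four-symbol alphabet, the input series is a placeholder, and the grammar-inference stage of GrammarViz is not reproduced.

```python
import math

# SAX-style discretization: z-normalize, piecewise aggregate approximation (PAA),
# then map segment means to symbols. Grammar inference over the symbols is omitted.
def sax(series, n_segments=8, breakpoints=(-0.6745, 0.0, 0.6745), alphabet="abcd"):
    mean = sum(series) / len(series)
    std = math.sqrt(sum((v - mean) ** 2 for v in series) / len(series)) or 1.0
    z = [(v - mean) / std for v in series]                # z-normalization
    seg = len(z) // n_segments
    word = []
    for i in range(n_segments):
        m = sum(z[i * seg:(i + 1) * seg]) / seg           # PAA segment mean
        word.append(alphabet[sum(m > b for b in breakpoints)])
    return ''.join(word)

series = [math.sin(i / 4.0) for i in range(64)]
print(sax(series))   # prints an 8-symbol word over the alphabet 'abcd'
```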

  10. GrammarViz 3.0: Interactive Discovery of Variable-Length Time Series Patterns

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Senin, Pavel; Lin, Jessica; Wang, Xing

    The problems of recurrent and anomalous pattern discovery in time series, e.g., motifs and discords, respectively, have received a lot of attention from researchers in the past decade. However, since the pattern search space is usually intractable, most existing detection algorithms require that the patterns have discriminative characteristics and have their length known in advance and provided as input, which is an unreasonable requirement for many real-world problems. In addition, patterns of similar structure, but of different lengths, may co-exist in a time series. In order to address these issues, we have developed algorithms for variable-length time series pattern discovery that are based on symbolic discretization and grammar inference—two techniques whose combination enables the structured reduction of the search space and discovery of the candidate patterns in linear time. In this work, we present GrammarViz 3.0—a software package that provides implementations of the proposed algorithms and a graphical user interface for interactive variable-length time series pattern discovery. The current version of the software provides an alternative grammar inference algorithm that improves the time series motif discovery workflow, and introduces an experimental procedure for automated discretization parameter selection that builds upon the minimum cardinality maximum cover principle and aids the time series recurrent and anomalous pattern discovery.

  11. Searching Creativity: (N)On Place Design Workshop

    ERIC Educational Resources Information Center

    Önal, Gökçe Ketizmen

    2017-01-01

    This study is mainly about developing an approach for fostering creativity in design education through analyzing the interactions among creative dimensions resembling spatial and organizational pattern of folding as a technique and also by the help of cognitive action of designers: workshop participants. In order to make an assessment, a case…

  12. Fast online and index-based algorithms for approximate search of RNA sequence-structure patterns

    PubMed Central

    2013-01-01

    Background: It is well known that the search for homologous RNAs is more effective if both sequence and structure information is incorporated into the search. However, current tools for searching with RNA sequence-structure patterns cannot fully handle mutations occurring on both these levels or are simply not fast enough for searching large sequence databases because of the high computational costs of the underlying sequence-structure alignment problem. Results: We present new fast index-based and online algorithms for approximate matching of RNA sequence-structure patterns supporting a full set of edit operations on single bases and base pairs. Our methods efficiently compute semi-global alignments of structural RNA patterns and substrings of the target sequence whose costs satisfy a user-defined sequence-structure edit distance threshold. For this purpose, we introduce a new computing scheme to optimally reuse the entries of the required dynamic programming matrices for all substrings and combine it with a technique for avoiding the alignment computation of non-matching substrings. Our new index-based methods exploit suffix arrays preprocessed from the target database and achieve running times that are sublinear in the size of the searched sequences. To support the description of RNA molecules that fold into complex secondary structures with multiple ordered sequence-structure patterns, we use fast algorithms for the local or global chaining of approximate sequence-structure pattern matches. The chaining step removes spurious matches from the set of intermediate results, in particular of patterns with little specificity. In benchmark experiments on the Rfam database, our improved online algorithm is faster than the best previous method by up to a factor of 45. Our best new index-based algorithm achieves a speedup by a factor of 560. Conclusions: The presented methods achieve considerable speedups compared to the best previous method. This, together with the expected sublinear running time of the presented index-based algorithms, allows for the first time approximate matching of RNA sequence-structure patterns in large sequence databases. Beyond the algorithmic contributions, we provide with RaligNAtor a robust and well documented open-source software package implementing the algorithms presented in this manuscript. The RaligNAtor software is available at http://www.zbh.uni-hamburg.de/ralignator. PMID:23865810

  13. Pattern Recognition Using Artificial Neural Network: A Review

    NASA Astrophysics Data System (ADS)

    Kim, Tai-Hoon

    Among the various frameworks in which pattern recognition has been traditionally formulated, the statistical approach has been most intensively studied and used in practice. More recently, artificial neural network techniques have been receiving increasing attention. The design of a recognition system requires careful attention to the following issues: definition of pattern classes, sensing environment, pattern representation, feature extraction and selection, cluster analysis, classifier design and learning, selection of training and test samples, and performance evaluation. In spite of almost 50 years of research and development in this field, the general problem of recognizing complex patterns with arbitrary orientation, location, and scale remains unsolved. New and emerging applications, such as data mining, web searching, retrieval of multimedia data, face recognition, and cursive handwriting recognition, require robust and efficient pattern recognition techniques. The objective of this review paper is to summarize and compare some of the well-known methods used in various stages of a pattern recognition system using ANN and identify research topics and applications which are at the forefront of this exciting and challenging field.

  14. Simultaneous beam sampling and aperture shape optimization for SPORT.

    PubMed

    Zarepisheh, Masoud; Li, Ruijiang; Ye, Yinyu; Xing, Lei

    2015-02-01

    Station parameter optimized radiation therapy (SPORT) was recently proposed to fully utilize the technical capability of emerging digital linear accelerators, in which the station parameters of a delivery system, such as aperture shape and weight, couch position/angle, gantry/collimator angle, can be optimized simultaneously. SPORT promises to deliver remarkable radiation dose distributions in an efficient manner, yet there exists no optimization algorithm for its implementation. The purpose of this work is to develop an algorithm to simultaneously optimize the beam sampling and aperture shapes. The authors build a mathematical model with the fundamental station point parameters as the decision variables. To solve the resulting large-scale optimization problem, the authors devise an effective algorithm by integrating three advanced optimization techniques: column generation, subgradient method, and pattern search. Column generation adds the most beneficial stations sequentially until the plan quality improvement saturates and provides a good starting point for the subsequent optimization. It also adds new stations during the algorithm if beneficial. For each update resulting from column generation, the subgradient method improves the selected stations locally by reshaping the apertures and updating the beam angles toward a descent subgradient direction. The algorithm continues to improve the selected stations locally and globally by a pattern search algorithm to explore the part of the search space not reachable by the subgradient method. By combining these three techniques together, all plausible combinations of station parameters are searched efficiently to yield the optimal solution. A SPORT optimization framework with seamless integration of three complementary algorithms, column generation, subgradient method, and pattern search, was established. The proposed technique was applied to two previously treated clinical cases: a head and neck and a prostate case. It significantly improved the target conformality and at the same time critical structure sparing compared with conventional intensity modulated radiation therapy (IMRT). In the head and neck case, for example, the average PTV coverage D99% for two PTVs, cord and brainstem max doses, and right parotid gland mean dose were improved, respectively, by about 7%, 37%, 12%, and 16%. The proposed method automatically determines the number of the stations required to generate a satisfactory plan and optimizes simultaneously the involved station parameters, leading to improved quality of the resultant treatment plans as compared with the conventional IMRT plans.

  15. Simultaneous beam sampling and aperture shape optimization for SPORT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zarepisheh, Masoud; Li, Ruijiang; Xing, Lei, E-mail: Lei@stanford.edu

    Purpose: Station parameter optimized radiation therapy (SPORT) was recently proposed to fully utilize the technical capability of emerging digital linear accelerators, in which the station parameters of a delivery system, such as aperture shape and weight, couch position/angle, gantry/collimator angle, can be optimized simultaneously. SPORT promises to deliver remarkable radiation dose distributions in an efficient manner, yet there exists no optimization algorithm for its implementation. The purpose of this work is to develop an algorithm to simultaneously optimize the beam sampling and aperture shapes. Methods: The authors build a mathematical model with the fundamental station point parameters as the decision variables. To solve the resulting large-scale optimization problem, the authors devise an effective algorithm by integrating three advanced optimization techniques: column generation, subgradient method, and pattern search. Column generation adds the most beneficial stations sequentially until the plan quality improvement saturates and provides a good starting point for the subsequent optimization. It also adds new stations during the algorithm if beneficial. For each update resulting from column generation, the subgradient method improves the selected stations locally by reshaping the apertures and updating the beam angles toward a descent subgradient direction. The algorithm continues to improve the selected stations locally and globally by a pattern search algorithm to explore the part of the search space not reachable by the subgradient method. By combining these three techniques together, all plausible combinations of station parameters are searched efficiently to yield the optimal solution. Results: A SPORT optimization framework with seamless integration of three complementary algorithms, column generation, subgradient method, and pattern search, was established. The proposed technique was applied to two previously treated clinical cases: a head and neck and a prostate case. It significantly improved the target conformality and at the same time critical structure sparing compared with conventional intensity modulated radiation therapy (IMRT). In the head and neck case, for example, the average PTV coverage D99% for two PTVs, cord and brainstem max doses, and right parotid gland mean dose were improved, respectively, by about 7%, 37%, 12%, and 16%. Conclusions: The proposed method automatically determines the number of the stations required to generate a satisfactory plan and optimizes simultaneously the involved station parameters, leading to improved quality of the resultant treatment plans as compared with the conventional IMRT plans.

  16. Lifestyle and Outcomes of Assisted Reproductive Techniques: A Narrative Review

    PubMed Central

    Zeinab, Hamzehgardeshi; Zohreh, Shahhosseini; Gelehkolaee, Keshvar Samadaee

    2015-01-01

    Background: Studies reveal that lifestyle factors such as physical activity patterns, obesity, nutrition, and smoking affect laboratory test results and pregnancy outcomes induced by assisted fertility techniques in infertile couples. The present study is a narrative review of studies in this area. Methods: Researchers conducted their computer search first in the general search engine Google Scholar and then in more specific databases: Science Direct, ProQuest, SID, Magiran, Irandoc, PubMed, Scopus, the Cochrane Library, PsycINFO, and the Cumulative Index to Nursing and Allied Health Literature (CINAHL), using Medical Subject Headings (MeSH) keywords: infertility (sterility, infertility), lifestyle (life behavior, lifestyle), Assisted Reproductive Techniques (ART), antioxidant and infertility, social health, spiritual health, mental health, alcohol and drug abuse, preventive factors, and instruments, and selected articles relevant to the study subject from 2004 to 2013. Firstly, a list of 150 papers was generated from the initial search; reviewers then studied titles and abstracts. Secondly, 111 papers were included. Finally, quality assessment of the full-text studies was performed by two independent reviewers. Researchers reviewed the summaries of all articles retrieved and ultimately used data from 62 full articles to compile this review paper. Results: Review of the literature led to the arrangement of 9 general categories relating ART results to weight control and diet, exercise and physical activity, psychological health, avoiding medications, alcohol and drugs, preventing diseases, environmental health, spiritual health, social health, and physical health. Conclusion: Since lifestyle is among the important, changeable, and influential factors in fertility, the success of these methods can be greatly helped through assessment of the lifestyle patterns of infertile couples and the design and implementation of healthy lifestyle counseling programs before and during the use of assisted fertility techniques. PMID:26156898

  17. Magnostics: Image-Based Search of Interesting Matrix Views for Guided Network Exploration.

    PubMed

    Behrisch, Michael; Bach, Benjamin; Hund, Michael; Delz, Michael; Von Ruden, Laura; Fekete, Jean-Daniel; Schreck, Tobias

    2017-01-01

    In this work we address the problem of retrieving potentially interesting matrix views to support the exploration of networks. We introduce Matrix Diagnostics (or Magnostics), following in spirit related approaches for rating and ranking other visualization techniques, such as Scagnostics for scatter plots. Our approach ranks matrix views according to the appearance of specific visual patterns, such as blocks and lines, indicating the existence of topological motifs in the data, such as clusters, bi-graphs, or central nodes. Magnostics can be used to analyze, query, or search for visually similar matrices in large collections, or to assess the quality of matrix reordering algorithms. While many feature descriptors for image analysis exist, there is no evidence of how they perform for detecting patterns in matrices. In order to make an informed choice of feature descriptors for matrix diagnostics, we evaluate 30 feature descriptors (27 existing ones and three new descriptors that we designed specifically for Magnostics) with respect to four criteria: pattern response, pattern variability, pattern sensibility, and pattern discrimination. We conclude with an informed set of six descriptors as most appropriate for Magnostics and demonstrate their application in two scenarios: exploring a large collection of matrices and analyzing temporal networks.

  18. Use of handheld sonar to locate a missing diver.

    PubMed

    McGrane, Owen; Cronin, Aaron; Hile, David

    2013-03-01

    The purpose of this study was to investigate whether a handheld sonar device significantly reduces the mean time needed to locate a missing diver. This institutional review board approved, prospective, crossover study used a voluntary convenience sample of 10 scuba divers. Participants conducted both a standard and modified search to locate a simulated missing diver. The standard search utilized a conventional search pattern starting at the point where the missing diver (simulated) was last seen. The modified search used a sonar beacon to augment the search. For each search method, successful completion of the search was defined as locating the missing diver within 40 minutes. Twenty total dives were completed. Using a standard search pattern, the missing diver was found by only 1 diver (10%), taking 18 minutes and 45 seconds. In the sonar-assisted search group, the missing diver was found by all 10 participants (100%), taking an average of 2 minutes and 47 seconds (SD 1 minute, 20 seconds). Using the nonparametric related samples Wilcoxon signed rank test, actual times between the sonar group and the standard group were significant (P < .01). Using paired samples t tests, the sonar group's self-assessed confidence increased significantly after using the sonar (P < .001), whereas the standard group decreased in confidence (not statistically significant, P = .111). Handheld sonar significantly reduces the mean duration to locate a missing diver as well as increasing users' confidence in their ability to find a missing diver when compared with standard search techniques.

  19. A Model Independent S/W Framework for Search-Based Software Testing

    PubMed Central

    Baik, Jongmoon

    2014-01-01

    In the Model-Based Testing (MBT) area, Search-Based Software Testing (SBST) has been employed to generate test cases from the model of a system under test. However, many types of models have been used in MBT. If the type of model changes from one to another, all functions of a search technique must be reimplemented, because the model types differ even if the same search technique is applied. It requires too much time and effort to implement the same algorithm over and over again. We propose a model-independent software framework for SBST, which can reduce this redundant work. The framework provides a reusable common software platform to reduce time and effort. The software framework not only presents design patterns to find test cases for a target model but also reduces development time by using common functions provided in the framework. We show the effectiveness and efficiency of the proposed framework with two case studies. The framework improves productivity by about 50% when the type of model is changed. PMID:25302314
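
    A minimal sketch of the underlying idea only, not the authors' framework: the search technique is written once against an abstract model interface, so switching model types only requires supplying a new adapter. All class and method names below are hypothetical.

        import random

        class ModelAdapter:
            """Hypothetical interface a concrete model (state machine, activity
            diagram, ...) would implement for the search engine."""
            def random_candidate(self):
                raise NotImplementedError
            def neighbors(self, candidate):
                raise NotImplementedError
            def fitness(self, candidate):          # higher means closer to a test goal
                raise NotImplementedError

        def hill_climbing_search(model, iterations=1000):
            """Model-independent search loop: it only talks to the adapter."""
            best = model.random_candidate()
            best_fitness = model.fitness(best)
            for _ in range(iterations):
                candidate = random.choice(model.neighbors(best))
                score = model.fitness(candidate)
                if score > best_fitness:
                    best, best_fitness = candidate, score
            return best, best_fitness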

  20. A flexible motif search technique based on generalized profiles.

    PubMed

    Bucher, P; Karplus, K; Moeri, N; Hofmann, K

    1996-03-01

    A flexible motif search technique is presented which has two major components: (1) a generalized profile syntax serving as a motif definition language; and (2) a motif search method specifically adapted to the problem of finding multiple instances of a motif in the same sequence. The new profile structure, which is the core of the generalized profile syntax, combines the functions of a variety of motif descriptors implemented in other methods, including regular expression-like patterns, weight matrices, previously used profiles, and certain types of hidden Markov models (HMMs). The relationship between generalized profiles and other biomolecular motif descriptors is analyzed in detail, with special attention to HMMs. Generalized profiles are shown to be equivalent to a particular class of HMMs, and conversion procedures in both directions are given. The conversion procedures provide an interpretation for local alignment in the framework of stochastic models, allowing for clear, simple significance tests. A mathematical statement of the motif search problem defines the new method exactly without linking it to a specific algorithmic solution. Part of the definition includes a new definition of disjointness of alignments.
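
    As a simplified illustration of the weight-matrix component only (generalized profiles additionally handle gaps, local alignment, and HMM-style states), the sketch below scores every window of a sequence against a position weight matrix and reports all occurrences above a threshold, so multiple instances of a motif in the same sequence are found; names and scores are invented.

        def pwm_matches(sequence, pwm, threshold):
            """pwm: list of dicts, one per motif position, mapping residue -> log-odds score.
            Returns (start, score) for every window scoring at or above the threshold."""
            length = len(pwm)
            matches = []
            for start in range(len(sequence) - length + 1):
                score = sum(pwm[pos].get(sequence[start + pos], float("-inf"))
                            for pos in range(length))
                if score >= threshold:
                    matches.append((start, score))
            return matches

        # e.g. a 2-position motif favouring "TA":
        # pwm = [{"T": 1.0, "A": -1.0, "C": -1.0, "G": -1.0},
        #        {"A": 1.0, "T": -1.0, "C": -1.0, "G": -1.0}]
        # pwm_matches("GTATTA", pwm, 1.5) -> [(1, 2.0), (4, 2.0)]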

  1. SOI layout decomposition for double patterning lithography on high-performance computer platforms

    NASA Astrophysics Data System (ADS)

    Verstov, Vladimir; Zinchenko, Lyudmila; Makarchuk, Vladimir

    2014-12-01

    In this paper, silicon-on-insulator layout decomposition algorithms for double patterning lithography on high-performance computing platforms are discussed. Our approach is based on the use of a contradiction graph and a modified concurrent breadth-first search algorithm. We evaluate our technique on the 45 nm Nangate Open Cell Library, including non-Manhattan geometry. Experimental results show that our soft-computing algorithms decompose the layout successfully and that the minimal distance between polygons in the layout is increased.
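
    A hedged sketch of the general idea rather than the authors' algorithm: polygons that violate the same-mask spacing rule become edges of a conflict graph, and a breadth-first traversal assigns the two masks; an odd conflict cycle means the layout cannot be split without further modification. The data structures below are illustrative.

        from collections import deque

        def two_color_masks(conflict_graph):
            """conflict_graph: dict mapping polygon id -> iterable of conflicting polygon ids.
            Returns polygon id -> mask (0 or 1), or None if an odd conflict cycle exists."""
            mask = {}
            for start in conflict_graph:
                if start in mask:
                    continue
                mask[start] = 0
                queue = deque([start])
                while queue:
                    u = queue.popleft()
                    for v in conflict_graph[u]:
                        if v not in mask:
                            mask[v] = 1 - mask[u]
                            queue.append(v)
                        elif mask[v] == mask[u]:
                            return None   # odd cycle: not decomposable without redesign/stitching
            return mask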

  2. A new measure for gene expression biclustering based on non-parametric correlation.

    PubMed

    Flores, Jose L; Inza, Iñaki; Larrañaga, Pedro; Calvo, Borja

    2013-12-01

    Biclustering, one of the emerging techniques for analyzing DNA microarray data, is the search for subsets of genes and conditions that are coherently expressed. These subgroups provide clues about the main biological processes. Until now, different approaches to this problem have been proposed. Most of them use the mean squared residue as a quality measure, but relevant and interesting patterns, such as shifting or scaling patterns, cannot be detected. Furthermore, recent papers show that there exist new coherence patterns involved in different kinds of cancer and tumors, such as inverse relationships between genes, which cannot be captured. The proposed measure is called Spearman's biclustering measure (SBM), which estimates the quality of a bicluster based on the non-linear correlation among genes and conditions simultaneously. The search for biclusters is performed using an evolutionary technique called estimation of distribution algorithms, which uses the SBM measure as its fitness function. This approach has been examined from different points of view using artificial and real microarrays. The assessment process involved the use of quality indexes, a set of reference bicluster patterns including new patterns, and a set of statistical tests. Performance was also examined on real microarrays and compared with different algorithmic approaches such as Bimax, CC, OPSM, Plaid and xMotifs. SBM shows several advantages, such as the ability to recognize more complex coherence patterns such as shifting, scaling and inversion, and the capability to selectively marginalize genes and conditions depending on statistical significance. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
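
    The sketch below is only a rough illustration of this kind of non-parametric quality measure, not the published SBM definition: it scores a candidate bicluster by the average absolute Spearman correlation between its gene rows over the selected conditions, so shifting, scaling, and inverted patterns all score highly. Function and argument names are assumptions.

        import numpy as np
        from scipy.stats import spearmanr

        def spearman_bicluster_score(expression, gene_rows, condition_cols):
            """expression: genes x conditions matrix; the index lists select the candidate
            bicluster. Assumes at least three gene rows so a full correlation matrix is returned."""
            sub = expression[np.ix_(gene_rows, condition_cols)]
            rho, _ = spearmanr(sub, axis=1)              # rows are treated as variables
            upper = np.triu_indices_from(rho, k=1)
            return float(np.nanmean(np.abs(rho[upper])))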

  3. Configurable pattern-based evolutionary biclustering of gene expression data

    PubMed Central

    2013-01-01

    Background Biclustering algorithms for microarray data aim at discovering functionally related gene sets under different subsets of experimental conditions. Due to the problem complexity and the characteristics of microarray datasets, heuristic searches are usually used instead of exhaustive algorithms. Also, the comparison among different techniques is still a challenge. The obtained results vary in relevant features such as the number of genes or conditions, which makes it difficult to carry out a fair comparison. Moreover, existing approaches do not allow the user to specify any preferences on these properties. Results Here, we present the first biclustering algorithm in which it is possible to particularize several bicluster features in terms of different objectives. This can be done by tuning the specified features in the algorithm or by incorporating new objectives into the search. Furthermore, our approach bases the bicluster evaluation on the use of expression patterns and is able to recognize both shifting and scaling patterns, either simultaneously or separately. Evolutionary computation has been chosen as the search strategy, thus naming our proposal Evo-Bexpa (Evolutionary Biclustering based in Expression Patterns). Conclusions We have conducted experiments on both synthetic and real datasets demonstrating Evo-Bexpa's ability to obtain meaningful biclusters. Synthetic experiments have been designed in order to compare Evo-Bexpa performance with other approaches when looking for perfect patterns. Experiments with four different real datasets also confirm the proper performance of our algorithm, whose results have been biologically validated through Gene Ontology. PMID:23433178

  4. Analysis of ehealth search perspectives among female college students in the health professions using Q methodology.

    PubMed

    Stellefson, Michael; Hanik, Bruce; Chaney, J Don; Tennant, Bethany

    2012-04-27

    The current "Millennial Generation" of college students majoring in the health professions has unprecedented access to the Internet. Although some research has been initiated among medical professionals to investigate the cognitive basis for health information searches on the Internet, little is known about Internet search practices among health and medical professional students. To systematically identify health professional college student perspectives of personal eHealth search practices. Q methodology was used to examine subjective perspectives regarding personal eHealth search practices among allied health students majoring in a health education degree program. Thirteen (n = 13) undergraduate students were interviewed about their attitudes and experiences conducting eHealth searches. From the interviews, 36 statements were used in a structured ranking task to identify clusters and determine which specific perceptions of eHealth search practices discriminated students into different groups. Scores on an objective measure of eHealth literacy were used to help categorize participant perspectives. Q-technique factor analysis of the rankings identified 3 clusters of respondents with differing views on eHealth searches that generally coincided with participants' objective eHealth literacy scores. The proficient resourceful students (pattern/structure coefficient range 0.56-0.80) described themselves as using multiple resources to obtain eHealth information, as opposed to simply relying on Internet search engines. The intermediate reluctant students (pattern/structure coefficient range 0.75-0.90) reported engaging only Internet search engines to locate eHealth information, citing undeveloped evaluation skills when considering sources of information located on the Internet. Both groups of advanced students reported not knowing how to use Boolean operators to conduct Internet health searches. The basic hubristic students (pattern/structure coefficient range 0.54-0.76) described themselves as independent procrastinators when searching for eHealth information. Interestingly, basic hubristic students represented the only cluster of participants to describe themselves as (1) having received instruction on using the Internet to conduct eHealth searches, and (2) possessing relative confidence when completing a search task. Subjective perspectives of eHealth search practices differed among students possessing different levels of eHealth literacy. These multiple perspectives present both challenges and opportunities for empowering college students in the health professions to use the Internet to obtain and appraise evidence-based health information using the Internet.

  5. Analysis of eHealth Search Perspectives Among Female College Students in the Health Professions Using Q Methodology

    PubMed Central

    Hanik, Bruce; Chaney, J. Don; Tennant, Bethany

    2012-01-01

    Background The current “Millennial Generation” of college students majoring in the health professions has unprecedented access to the Internet. Although some research has been initiated among medical professionals to investigate the cognitive basis for health information searches on the Internet, little is known about Internet search practices among health and medical professional students. Objective To systematically identify health professional college student perspectives of personal eHealth search practices. Methods Q methodology was used to examine subjective perspectives regarding personal eHealth search practices among allied health students majoring in a health education degree program. Thirteen (n = 13) undergraduate students were interviewed about their attitudes and experiences conducting eHealth searches. From the interviews, 36 statements were used in a structured ranking task to identify clusters and determine which specific perceptions of eHealth search practices discriminated students into different groups. Scores on an objective measure of eHealth literacy were used to help categorize participant perspectives. Results Q-technique factor analysis of the rankings identified 3 clusters of respondents with differing views on eHealth searches that generally coincided with participants’ objective eHealth literacy scores. The proficient resourceful students (pattern/structure coefficient range 0.56-0.80) described themselves as using multiple resources to obtain eHealth information, as opposed to simply relying on Internet search engines. The intermediate reluctant students (pattern/structure coefficient range 0.75-0.90) reported engaging only Internet search engines to locate eHealth information, citing undeveloped evaluation skills when considering sources of information located on the Internet. Both groups of advanced students reported not knowing how to use Boolean operators to conduct Internet health searches. The basic hubristic students (pattern/structure coefficient range 0.54-0.76) described themselves as independent procrastinators when searching for eHealth information. Interestingly, basic hubristic students represented the only cluster of participants to describe themselves as (1) having received instruction on using the Internet to conduct eHealth searches, and (2) possessing relative confidence when completing a search task. Conclusions Subjective perspectives of eHealth search practices differed among students possessing different levels of eHealth literacy. These multiple perspectives present both challenges and opportunities for empowering college students in the health professions to use the Internet to obtain and appraise evidence-based health information using the Internet. PMID:22543437

  6. Astrometric Search Method for Individually Resolvable Gravitational Wave Sources with Gaia

    NASA Astrophysics Data System (ADS)

    Moore, Christopher J.; Mihaylov, Deyan P.; Lasenby, Anthony; Gilmore, Gerard

    2017-12-01

    Gravitational waves (GWs) cause the apparent position of distant stars to oscillate with a characteristic pattern on the sky. Astrometric measurements (e.g., those made by Gaia) provide a new way to search for GWs. The main difficulty facing such a search is the large size of the data set; Gaia observes more than one billion stars. In this Letter the problem of searching for GWs from individually resolvable supermassive black hole binaries using astrometry is addressed for the first time; it is demonstrated how the data set can be compressed by a factor of more than 10^6, with a loss of sensitivity of less than 1%. This technique was successfully used to recover artificially injected GW signals from mock Gaia data and to assess the GW sensitivity of Gaia. Throughout the Letter the complementarity of Gaia and pulsar timing searches for GWs is highlighted.

  7. Visualizing a possible atmospheric teleconnection associated with UK floods in autumn 2000

    NASA Astrophysics Data System (ADS)

    Pall, P.; Bensema, K.; Stone, D.; Wehner, M. F.; Bethel, W.; Joy, K.

    2012-12-01

    Severe floods occurred across England and Wales during the record-wet autumn of the year 2000. Recently Pall et al. (2011) demonstrated that the risk of such floods occurring at that time substantially increased as a result of anthropogenic greenhouse gas emissions, and that the synoptic weather system associated with the floods was a common but anomalously strong 'Scandinavia' atmospheric circulation pattern (a Rossby-wave-like train of tropospheric anomalies in geopotential height, extending from the subtropical Atlantic across Eurasia, with a cyclone over the UK and a strong anticyclone over Scandinavia). Blackburn and Hoskins (2001) suggest that this pattern was itself catalyzed by an anomalous upper-tropospheric flow of air: originating with an ascent of air due to convection over warm sea surface temperatures in the western Tropical Pacific, and ending in a descent of air over the Amazon in the proposed source region of the Scandinavia pattern. However, evidence for this so-called 'teleconnection' is not entirely clear in the idealised climate models they used. Here we use visualization techniques to search for this teleconnection in the simulations generated with the more comprehensive seasonal-forecast-resolution climate model of Pall et al. (2011) -- by identifying anomalous streamflow patterns and using the UV-CDAT software developed at Berkeley Lab to do so. Furthermore, since several thousand simulations were generated (in order to capture the rare flood event), totaling hundreds of GB in size, we use parallelisation techniques to perform this search efficiently.

  8. Public Awareness of Uterine Power Morcellation Through US Food and Drug Administration Communications: Analysis of Google Trends Search Term Patterns

    PubMed Central

    Jamnagerwalla, Juzar; Markowitz, Melissa A; Thum, D Joseph; McCarty, Philip; Medendorp, Andrew R; Raz, Shlomo; Kim, Ja-Hong

    2018-01-01

    Background Uterine power morcellation, where the uterus is shred into smaller pieces, is a widely used technique for removal of uterine specimens in patients undergoing minimally invasive abdominal hysterectomy or myomectomy. Complications related to power morcellation of uterine specimens led to US Food and Drug Administration (FDA) communications in 2014 ultimately recommending against the use of power morcellation for women undergoing minimally invasive hysterectomy. Subsequently, practitioners drastically decreased the use of morcellation. Objective We aimed to determine the effect of increased patient awareness on the decrease in use of the morcellator. Google Trends is a public tool that provides data on temporal patterns of search terms, and we correlated this data with the timing of the FDA communication. Methods Weekly relative search volume (RSV) was obtained from Google Trends using the term “morcellation.” Higher RSV corresponds to increases in weekly search volume. Search volumes were divided into 3 groups: the 2 years prior to the FDA communication, a 1-year period following, and thereafter, with the distribution of the weekly RSV over the 3 periods tested using 1-way analysis of variance. Additionally, we analyzed the total number of websites containing the term “morcellation” over this time. Results The mean RSV prior to the FDA communication was 12.0 (SD 15.8), with the RSV being 60.3 (SD 24.7) in the 1-year after and 19.3 (SD 5.2) thereafter (P<.001). The mean number of webpages containing the term “morcellation” in 2011 was 10,800, rising to 18,800 during 2014 and 36,200 in 2017. Conclusions Google search activity about morcellation of uterine specimens increased significantly after the FDA communications. This trend indicates an increased public awareness regarding morcellation and its complications. More extensive preoperative counseling and alteration of surgical technique and clinician practice may be necessary. PMID:29699965

  9. Error Estimation Techniques to Refine Overlapping Aerial Image Mosaic Processes via Detected Parameters

    ERIC Educational Resources Information Center

    Bond, William Glenn

    2012-01-01

    In this paper, I propose to demonstrate a means of error estimation preprocessing in the assembly of overlapping aerial image mosaics. The mosaic program automatically assembles several hundred aerial images from a data set by aligning them, via image registration using a pattern search method, onto a GIS grid. The method presented first locates…

  10. Graphical Representations of Electronic Search Patterns.

    ERIC Educational Resources Information Center

    Lin, Xia; And Others

    1991-01-01

    Discussion of search behavior in electronic environments focuses on the development of GRIP (Graphic Representor of Interaction Patterns), a graphing tool based on HyperCard that produces graphic representations of search patterns. Search state spaces are explained, and forms of data available from electronic searches are described. (34…

  11. Mars Rover imaging systems and directional filtering

    NASA Technical Reports Server (NTRS)

    Wang, Paul P.

    1989-01-01

    Computer literature searches were carried out at Duke University and NASA Langley Research Center. The purpose is to enhance personal knowledge based on the technical problems of pattern recognition and image understanding which must be solved for the Mars Rover and Sample Return Mission. Intensive study effort of a large collection of relevant literature resulted in a compilation of all important documents in one place. Furthermore, the documents are being classified into: Mars Rover; computer vision (theory); imaging systems; pattern recognition methodologies; and other smart techniques (AI, neural networks, fuzzy logic, etc).

  12. A Parallel Genetic Algorithm to Discover Patterns in Genetic Markers that Indicate Predisposition to Multifactorial Disease

    PubMed Central

    Rausch, Tobias; Thomas, Alun; Camp, Nicola J.; Cannon-Albright, Lisa A.; Facelli, Julio C.

    2008-01-01

    This paper describes a novel algorithm to analyze genetic linkage data using pattern recognition techniques and genetic algorithms (GA). The method allows a search for regions of the chromosome that may contain genetic variations that jointly predispose individuals for a particular disease. The method uses correlation analysis, filtering theory and genetic algorithms (GA) to achieve this goal. Because current genome scans use from hundreds to hundreds of thousands of markers, two versions of the method have been implemented. The first is an exhaustive analysis version that can be used to visualize, explore, and analyze small genetic data sets for two marker correlations; the second is a GA version, which uses a parallel implementation allowing searches of higher-order correlations in large data sets. Results on simulated data sets indicate that the method can be informative in the identification of major disease loci and gene-gene interactions in genome-wide linkage data and that further exploration of these techniques is justified. The results presented for both variants of the method show that it can help genetic epidemiologists to identify promising combinations of genetic factors that might predispose to complex disorders. In particular, the correlation analysis of IBD expression patterns might hint to possible gene-gene interactions and the filtering might be a fruitful approach to distinguish true correlation signals from noise. PMID:18547558
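
    Purely as an illustration of the approach (the published method is richer and runs in parallel), the sketch below uses a small genetic algorithm to search for a pair of markers whose joint pattern correlates most strongly with disease status; `genotypes` and `status` are assumed inputs and all parameter values are made up.

        import numpy as np

        def pair_fitness(genotypes, status, pair):
            i, j = pair
            joint = genotypes[:, i] * genotypes[:, j]        # crude interaction feature
            if joint.std() == 0:
                return 0.0
            return abs(np.corrcoef(joint, status)[0, 1])

        def ga_pair_search(genotypes, status, pop_size=50, generations=100, seed=0):
            rng = np.random.default_rng(seed)
            n_markers = genotypes.shape[1]
            random_pair = lambda: tuple(rng.choice(n_markers, size=2, replace=False))
            population = [random_pair() for _ in range(pop_size)]
            for _ in range(generations):
                population.sort(key=lambda p: pair_fitness(genotypes, status, p), reverse=True)
                parents = population[:pop_size // 2]                  # truncation selection
                children = []
                for a, b in zip(parents, reversed(parents)):          # one-point crossover
                    child = (a[0], b[1]) if a[0] != b[1] else random_pair()
                    children.append(random_pair() if rng.random() < 0.1 else child)  # mutation
                population = parents + children
            return max(population, key=lambda p: pair_fitness(genotypes, status, p))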

  13. Generation of optimal artificial neural networks using a pattern search algorithm: application to approximation of chemical systems.

    PubMed

    Ihme, Matthias; Marsden, Alison L; Pitsch, Heinz

    2008-02-01

    A pattern search optimization method is applied to the generation of optimal artificial neural networks (ANNs). Optimization is performed using a mixed variable extension to the generalized pattern search method. This method offers the advantage that categorical variables, such as neural transfer functions and nodal connectivities, can be used as parameters in optimization. When used together with a surrogate, the resulting algorithm is highly efficient for expensive objective functions. Results demonstrate the effectiveness of this method in optimizing an ANN for the number of neurons, the type of transfer function, and the connectivity among neurons. The optimization method is applied to a chemistry approximation of practical relevance. In this application, temperature and a chemical source term are approximated as functions of two independent parameters using optimal ANNs. Comparison of the performance of optimal ANNs with conventional tabulation methods demonstrates equivalent accuracy with considerable savings in memory storage. The architecture of the optimal ANN for the approximation of the chemical source term consists of a fully connected feedforward network having four nonlinear hidden layers and 117 synaptic weights. An equivalent representation of the chemical source term using tabulation techniques would require a 500 x 500 grid point discretization of the parameter space.
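
    For readers unfamiliar with pattern search, the sketch below shows a plain compass-search variant on continuous variables only; the mixed-variable generalized pattern search used in the paper additionally polls over categorical choices such as transfer functions and connectivities, which is not reproduced here. Names and defaults are illustrative.

        import numpy as np

        def compass_search(objective, x0, step=1.0, tol=1e-6, max_iter=10000):
            """Minimize objective(x) by polling +/- step along each coordinate and
            contracting the step when no poll point improves on the incumbent."""
            x = np.asarray(x0, dtype=float)
            fx = objective(x)
            for _ in range(max_iter):
                improved = False
                for i in range(x.size):
                    for sign in (1.0, -1.0):
                        trial = x.copy()
                        trial[i] += sign * step
                        f_trial = objective(trial)
                        if f_trial < fx:
                            x, fx, improved = trial, f_trial, True
                            break
                    if improved:
                        break
                if not improved:
                    step *= 0.5               # unsuccessful poll: refine the pattern
                    if step < tol:
                        break
            return x, fx

        # e.g. compass_search(lambda v: (v[0] - 1.0) ** 2 + (v[1] + 2.0) ** 2, [0.0, 0.0])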

  14. Synthesis of concentric circular antenna arrays using dragonfly algorithm

    NASA Astrophysics Data System (ADS)

    Babayigit, B.

    2018-05-01

    Due to the strong non-linear relationship between the array factor and the array elements, the concentric circular antenna array (CCAA) synthesis problem is challenging. Nature-inspired optimisation techniques have been playing an important role in solving array synthesis problems. The dragonfly algorithm (DA) is a novel nature-inspired optimisation technique based on the static and dynamic swarming behaviours of dragonflies in nature. This paper presents the design of CCAAs with low sidelobes using DA. The effectiveness of the proposed DA is investigated in two different cases (with and without a centre element) of two three-ring (4-, 6-, 8-element or 8-, 10-, 12-element) CCAA designs. The radiation pattern for each design case is obtained by finding optimal excitation weights of the array elements using DA. Simulation results show that the proposed algorithm outperforms the other state-of-the-art techniques (symbiotic organisms search, biogeography-based optimisation, sequential quadratic programming, opposition-based gravitational search algorithm, cat swarm optimisation, firefly algorithm, evolutionary programming) for all design cases. DA can be a promising technique for electromagnetic problems.

  15. Plans, Patterns, and Move Categories Guiding a Highly Selective Search

    NASA Astrophysics Data System (ADS)

    Trippen, Gerhard

    In this paper we present our ideas for an Arimaa-playing program (also called a bot) that uses plans and pattern matching to guide a highly selective search. We restrict move generation to moves in certain move categories to reduce the number of moves considered by the bot significantly. Arimaa is a modern board game that can be played with a standard Chess set. However, the rules of the game are not at all like those of Chess. Furthermore, Arimaa was designed to be as simple and intuitive as possible for humans, yet challenging for computers. While all established Arimaa bots use alpha-beta search with a variety of pruning techniques and other heuristics ending in an extensive positional leaf node evaluation, our new bot, Rat, starts with a positional evaluation of the current position. Based on features found in the current position - supported by pattern matching using a directed position graph - our bot Rat decides which of a given set of plans to follow. The plan then dictates what types of moves can be chosen. This is another major difference from bots that generate "all" possible moves for a particular position. Rat is only allowed to generate moves that belong to certain categories. Leaf nodes are evaluated only by a straightforward material evaluation to help avoid moves that lose material. This highly selective search looks, on average, at only 5 moves out of 5,000 to over 40,000 possible moves in a middle game position.

  16. On the Local Convergence of Pattern Search

    NASA Technical Reports Server (NTRS)

    Dolan, Elizabeth D.; Lewis, Robert Michael; Torczon, Virginia; Bushnell, Dennis M. (Technical Monitor)

    2000-01-01

    We examine the local convergence properties of pattern search methods, complementing the previously established global convergence properties for this class of algorithms. We show that the step-length control parameter which appears in the definition of pattern search algorithms provides a reliable asymptotic measure of first-order stationarity. This gives an analytical justification for a traditional stopping criterion for pattern search methods. Using this measure of first-order stationarity, we analyze the behavior of pattern search in the neighborhood of an isolated local minimizer. We show that a recognizable subsequence converges r-linearly to the minimizer.

  17. Word aligned bitmap compression method, data structure, and apparatus

    DOEpatents

    Wu, Kesheng; Shoshani, Arie; Otoo, Ekow

    2004-12-14

    The Word-Aligned Hybrid (WAH) bitmap compression method and data structure is a relatively efficient method for searching and performing logical, counting, and pattern location operations upon large datasets. The technique is comprised of a data structure and methods that are optimized for computational efficiency by using the WAH compression method, which typically takes advantage of the target computing system's native word length. WAH is particularly apropos to infrequently varying databases, including those found in the on-line analytical processing (OLAP) industry, due to the increased computational efficiency of the WAH compressed bitmap index. Some commercial database products already include some version of a bitmap index, which could possibly be replaced by the WAH bitmap compression techniques for potentially increased operation speed, as well as increased efficiencies in constructing compressed bitmaps. Combined together, this technique may be particularly useful for real-time business intelligence. Additional WAH applications may include scientific modeling, such as climate and combustion simulations, to minimize search time for analysis and subsequent data visualization.
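
    A much-simplified sketch of the word-aligned idea, not the patented WAH format: the bitmap is cut into 31-bit groups (matching a 32-bit word with one flag bit), consecutive all-zero or all-one groups collapse into one fill record, and mixed groups are kept as literals. The tuple-based encoding below is purely illustrative.

        def wah_like_encode(bits, group_size=31):
            """bits: sequence of 0/1 values. Returns a list of ('fill', bit, run_count)
            and ('literal', tuple_of_bits) records."""
            words = []
            for start in range(0, len(bits), group_size):
                group = tuple(bits[start:start + group_size])
                uniform = len(group) == group_size and len(set(group)) == 1
                if uniform:
                    if words and words[-1][0] == "fill" and words[-1][1] == group[0]:
                        words[-1] = ("fill", group[0], words[-1][2] + 1)   # extend the run
                    else:
                        words.append(("fill", group[0], 1))
                else:
                    words.append(("literal", group))
            return words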

  18. Making Temporal Search More Central in Spatial Data Infrastructures

    NASA Astrophysics Data System (ADS)

    Corti, P.; Lewis, B.

    2017-10-01

    A temporally enabled Spatial Data Infrastructure (SDI) is a framework of geospatial data, metadata, users, and tools intended to provide an efficient and flexible way to use spatial information which includes the historical dimension. One of the key software components of an SDI is the catalogue service which is needed to discover, query, and manage the metadata. A search engine is a software system capable of supporting fast and reliable search, which may use any means necessary to get users to the resources they need quickly and efficiently. These techniques may include features such as full text search, natural language processing, weighted results, temporal search based on enrichment, visualization of patterns in distributions of results in time and space using temporal and spatial faceting, and many others. In this paper we will focus on the temporal aspects of search which include temporal enrichment using a time miner - a software engine able to search for date components within a larger block of text, the storage of time ranges in the search engine, handling historical dates, and the use of temporal histograms in the user interface to display the temporal distribution of search results.
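
    A toy illustration of the "time miner" notion only (the production component described above is far more capable): extract four-digit years from free-text metadata and report the implied time range. The regular expression and function name are assumptions.

        import re

        YEAR_PATTERN = re.compile(r"\b(1[0-9]{3}|20[0-9]{2})\b")

        def mined_time_range(text):
            """Return (earliest, latest) year mentioned in the text, or None if none found."""
            years = [int(match) for match in YEAR_PATTERN.findall(text)]
            return (min(years), max(years)) if years else None

        # mined_time_range("Boundary survey of 1898, revised 1902-1907") -> (1898, 1907)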

  19. Does the Australian desert ant Melophorus bagoti approximate a Lévy search by an intrinsic bi-modal walk?

    PubMed

    Reynolds, Andy M; Schultheiss, Patrick; Cheng, Ken

    2014-01-07

    We suggest that the Australian desert ant Melophorus bagoti approximates a Lévy search pattern by using an intrinsic bi-exponential walk and does so when a Lévy search pattern is advantageous. When attempting to locate its nest, M. bagoti adopt a stereotypical search pattern. These searches begin at the location where the ant expects to find the nest, and comprise loops that start and end at this location, and are directed in different azimuthal directions. Loop lengths are exponentially distributed when searches are in visually familiar surroundings and are well described by a mixture of two exponentials when searches are in unfamiliar landscapes. The latter approximates a power-law distribution, the hallmark of a Lévy search. With the aid of a simple analytically tractable theory, we show that an exponential loop-length distribution is advantageous when the distance to the nest can be estimated with some certainty and that a bi-exponential distribution is advantageous when there is considerable uncertainty regarding the nest location. The best bi-exponential search patterns are shown to be those that come closest to approximating advantageous Lévy looping searches. The bi-exponential search patterns of M. bagoti are found to approximate advantageous Lévy search patterns. Copyright © 2013. Published by Elsevier Ltd.
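
    A small simulation sketch of the loop-length distributions discussed above (the parameter values are invented, not taken from the study): loop lengths drawn from a single exponential versus a two-component exponential mixture, whose heavier tail is what approximates a power-law, Lévy-like search.

        import numpy as np

        rng = np.random.default_rng(42)

        def bi_exponential_loop_lengths(n, mean_short=1.0, mean_long=10.0, p_long=0.2):
            """Mixture of two exponentials: mostly short loops, occasionally long ones."""
            use_long = rng.random(n) < p_long
            return np.where(use_long,
                            rng.exponential(mean_long, n),
                            rng.exponential(mean_short, n))

        familiar = rng.exponential(scale=1.0, size=100000)    # single exponential: familiar surroundings
        unfamiliar = bi_exponential_loop_lengths(100000)      # heavier tail: unfamiliar surroundings
        print(np.percentile(familiar, 99), np.percentile(unfamiliar, 99))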

  20. Optimal random Lévy-loop searching: New insights into the searching behaviours of central-place foragers

    NASA Astrophysics Data System (ADS)

    Reynolds, A. M.

    2008-04-01

    A random Lévy-looping model of searching is devised and optimal random Lévy-looping searching strategies are identified for the location of a single target whose position is uncertain. An inverse-square power law distribution of loop lengths is shown to be optimal when the distance between the centre of the search and the target is much shorter than the size of the longest possible loop in the searching pattern. Optimal random Lévy-looping searching patterns have recently been observed in the flight patterns of honeybees (Apis mellifera) when attempting to locate their hive and when searching after a known food source becomes depleted. It is suggested that the searching patterns of desert ants (Cataglyphis) are consistent with the adoption of an optimal Lévy-looping searching strategy.

  1. Vander Lugt correlation of DNA sequence data

    NASA Astrophysics Data System (ADS)

    Christens-Barry, William A.; Hawk, James F.; Martin, James C.

    1990-12-01

    DNA, the molecule containing the genetic code of an organism, is a linear chain of subunits. It is the sequence of subunits, of which there are four kinds, that constitutes the unique blueprint of an individual. This sequence is the focus of a large number of analyses performed by an army of geneticists, biologists, and computer scientists. Most of these analyses entail searches for specific subsequences within the larger set of sequence data. Thus, most analyses are essentially pattern recognition or correlation tasks. Yet, there are special features to such analysis that influence the strategy and methods of an optical pattern recognition approach. While the serial processing employed in digital electronic computers remains the main engine of sequence analyses, there is no fundamental reason that more efficient parallel methods cannot be used. We describe an approach using optical pattern recognition (OPR) techniques based on matched spatial filtering. This allows parallel comparison of large blocks of sequence data. In this study we have simulated a Vander Lugt architecture implementing our approach. Searches for specific target sequence strings within a block of DNA sequence from the ColE1 plasmid are performed.
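
    A digital analogue of the matched-filter correlation described above, offered only to illustrate the principle (it is not the optical implementation): bases are one-hot encoded and a target string is located in a longer sequence by FFT-based cross-correlation, where score peaks equal to the target length mark exact matches.

        import numpy as np

        BASES = "ACGT"

        def one_hot(seq):
            return np.array([[1.0 if base == b else 0.0 for base in seq] for b in BASES])

        def correlation_search(sequence, target):
            S, T = one_hot(sequence), one_hot(target)
            n = len(sequence) + len(target) - 1
            score = np.zeros(n)
            for channel in range(len(BASES)):
                score += np.real(np.fft.ifft(np.fft.fft(S[channel], n)
                                             * np.conj(np.fft.fft(T[channel], n))))
            valid = score[:len(sequence) - len(target) + 1]
            return np.flatnonzero(np.isclose(valid, len(target)))   # exact-match start positions

        # correlation_search("ACGTACGTT", "GTA") -> array([2])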

  2. Generation of Escher Arts with Dual Perception.

    PubMed

    Lin, Shih-Syun; Morace, Charles C; Lin, Chao-Hung; Hsu, Li-Fong; Lee, Tong-Yee

    2018-02-01

    Escher transmutation is a graphic art that smoothly transforms one tile pattern into another tile pattern with dual perception. A classic example is the artwork called Sky and Water, in which a compelling figure-ground arrangement is applied to portray the transmutation of a bird in sky and a fish in water. The shape of a bird is progressively deformed and dissolves into the background while the background gradually reveals the shape of a fish. This paper introduces a system to create a variety of Escher-like transmutations, which includes the algorithms for initializing a tile pattern with dual figure-ground arrangement, for searching for the best matched shape of a user-specified motif from a database, and for transforming the content and shapes of tile patterns using a content-aware warping technique. The proposed system, integrating the graphic techniques of tile initialization, shape matching, and shape warping, allows users to create various Escher-like transmutations with minimal user interaction. Experimental results and conducted user studies demonstrate the feasibility and flexibility of the proposed system in Escher art generation.

  3. POETRY--PART ONE, "A WAY OF SAYING," PART TWO, "SEARCH FOR ORDER." LITERATURE CURRICULUM V, TEACHER AND STUDENT VERSIONS.

    ERIC Educational Resources Information Center

    KITZHABER, ALBERT R.

    THIS POETRY UNIT FOR 11TH-GRADERS ILLUSTRATES HOW VERSE STRUCTURE AND POETIC TECHNIQUES CONTRIBUTE TO A POEM'S MEANING. IN PART 1, IMAGERY, METAPHOR, SYMBOLISM, IRONY, PARADOX, AND MUSICAL AND RHYTHMICAL SOUND PATTERNS ARE DISCUSSED AS WAYS OF SAYING THE "UNSAYABLE" AND OF REINFORCING THE MEANING AND MOOD OF THE POEM. THE POEMS OF SUCH…

  4. Decision-theoretic control of EUVE telescope scheduling

    NASA Technical Reports Server (NTRS)

    Hansson, Othar; Mayer, Andrew

    1993-01-01

    This paper describes a decision theoretic scheduler (DTS) designed to employ state-of-the-art probabilistic inference technology to speed the search for efficient solutions to constraint-satisfaction problems. Our approach involves assessing the performance of heuristic control strategies that are normally hard-coded into scheduling systems and using probabilistic inference to aggregate this information in light of the features of a given problem. The Bayesian Problem-Solver (BPS) introduced a similar approach to solving single agent and adversarial graph search patterns yielding orders-of-magnitude improvement over traditional techniques. Initial efforts suggest that similar improvements will be realizable when applied to typical constraint-satisfaction scheduling problems.

  5. Automated detection of jet contrails using the AVHRR split window

    NASA Technical Reports Server (NTRS)

    Engelstad, M.; Sengupta, S. K.; Lee, T.; Welch, R. M.

    1992-01-01

    This paper investigates the automated detection of jet contrails using data from the Advanced Very High Resolution Radiometer. A preliminary algorithm subtracts the 11.8-micron image from the 10.8-micron image, creating a difference image on which contrails are enhanced. Then a three-stage algorithm searches the difference image for the nearly-straight line segments which characterize contrails. First, the algorithm searches for elevated, linear patterns called 'ridges'. Second, it applies a Hough transform to the detected ridges to locate nearly-straight lines. Third, the algorithm determines which of the nearly-straight lines are likely to be contrails. The paper applies this technique to several test scenes.
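
    A hedged sketch of the general pipeline only (not the published three-stage algorithm, whose ridge detector differs): given two co-registered brightness-temperature channels as NumPy arrays, form the split-window difference image and extract near-straight line candidates with an edge detector and a Hough transform from scikit-image. Parameter values and function names are illustrative.

        import numpy as np
        from skimage.feature import canny
        from skimage.transform import hough_line, hough_line_peaks

        def contrail_line_candidates(t_108, t_118, edge_sigma=2.0):
            """t_108, t_118: 10.8 and 11.8 micron brightness-temperature images."""
            diff = t_108 - t_118                                   # contrails appear enhanced
            diff = (diff - diff.min()) / (np.ptp(diff) + 1e-9)     # normalize to [0, 1]
            edges = canny(diff, sigma=edge_sigma)                  # stand-in for ridge detection
            accumulator, angles, distances = hough_line(edges)
            _, peak_angles, peak_dists = hough_line_peaks(accumulator, angles, distances)
            return list(zip(peak_angles, peak_dists))              # near-straight line parameters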

  6. Study on online community user motif using web usage mining

    NASA Astrophysics Data System (ADS)

    Alphy, Meera; Sharma, Ajay

    2016-04-01

    Web usage mining is an application of data mining used to extract useful information from the online community. The World Wide Web contained at least 4.73 billion pages according to the Indexed Web, and at least 228.52 million pages according to the Dutch Indexed Web, on Thursday, 6 August 2015. It is difficult to retrieve the needed data from these billions of web pages, which is where the importance of web usage mining lies. Personalizing the search engine helps web users identify the most used data easily; it reduces time consumption and supports automatic site search and automatic restoration of useful sites. This study surveys the techniques used in pattern discovery and analysis in web usage mining, from early approaches to the latest, covering 1996 to 2015. Analyzing user motifs helps improve business, e-commerce, personalisation, and websites.

  7. The association rules search of Indonesian university graduate’s data using FP-growth algorithm

    NASA Astrophysics Data System (ADS)

    Faza, S.; Rahmat, R. F.; Nababan, E. B.; Arisandi, D.; Effendi, S.

    2018-02-01

    The variety of attributes in university graduate data makes it difficult for an institution to find combinations of attributes that emerge frequently and are highly associated with one another. Association rule mining is a data mining technique for determining how one set of data relates to or affects another, making it possible to find such associations on a large scale. The Frequent Pattern Growth (FP-Growth) algorithm is an association rule mining technique that determines frequent itemsets from an FP-Tree data structure. From this research on association rules in university graduate data, it can be concluded that the most common, highly associated attributes are the combination of a state-owned high school outside Medan, the regular university entrance exam, a GPA of 3.00 to 3.49, and a study duration of more than 4 years.
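
    A sketch using a widely available open-source FP-Growth implementation (mlxtend) rather than the authors' code; the example transactions below are invented placeholders that only mirror the kinds of attributes named in the abstract, and the support threshold is arbitrary.

        import pandas as pd
        from mlxtend.preprocessing import TransactionEncoder
        from mlxtend.frequent_patterns import fpgrowth

        graduate_records = [
            ["state_school_outside_Medan", "regular_entrance_exam", "GPA_3.00-3.49", "study_over_4_years"],
            ["state_school_outside_Medan", "regular_entrance_exam", "GPA_3.00-3.49", "study_over_4_years"],
            ["private_school_Medan", "invitation_path", "GPA_2.50-2.99", "study_over_4_years"],
            # ... one transaction per graduate record
        ]

        encoder = TransactionEncoder()
        onehot = pd.DataFrame(encoder.fit_transform(graduate_records), columns=encoder.columns_)
        frequent_itemsets = fpgrowth(onehot, min_support=0.5, use_colnames=True)
        print(frequent_itemsets.sort_values("support", ascending=False))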

  8. Multi-INT Complex Event Processing using Approximate, Incremental Graph Pattern Search

    DTIC Science & Technology

    2012-06-01

    [Figure: initial performance comparisons (2011) of the approximate graph pattern search algorithm versus SPARQL queries; total execution time for 10 executions each of 5 random pattern searches on synthetic data sets of 1,000 to 100,000 RDF triples.]

  9. A Globally Convergent Augmented Lagrangian Pattern Search Algorithm for Optimization with General Constraints and Simple Bounds

    NASA Technical Reports Server (NTRS)

    Lewis, Robert Michael; Torczon, Virginia

    1998-01-01

    We give a pattern search adaptation of an augmented Lagrangian method due to Conn, Gould, and Toint. The algorithm proceeds by successive bound constrained minimization of an augmented Lagrangian. In the pattern search adaptation we solve this subproblem approximately using a bound constrained pattern search method. The stopping criterion proposed by Conn, Gould, and Toint for the solution of this subproblem requires explicit knowledge of derivatives. Such information is presumed absent in pattern search methods; however, we show how we can replace this with a stopping criterion based on the pattern size in a way that preserves the convergence properties of the original algorithm. In this way we proceed by successive, inexact, bound constrained minimization without knowing exactly how inexact the minimization is. So far as we know, this is the first provably convergent direct search method for general nonlinear programming.

  10. Searching for patterns in remote sensing image databases using neural networks

    NASA Technical Reports Server (NTRS)

    Paola, Justin D.; Schowengerdt, Robert A.

    1995-01-01

    We have investigated a method, based on a successful neural network multispectral image classification system, of searching for single patterns in remote sensing databases. While defining the pattern to search for and the feature to be used for that search (spectral, spatial, temporal, etc.) is challenging, a more difficult task is selecting competing patterns to train against the desired pattern. Schemes for competing pattern selection, including random selection and human interpreted selection, are discussed in the context of an example detection of dense urban areas in Landsat Thematic Mapper imagery. When applying the search to multiple images, a simple normalization method can alleviate the problem of inconsistent image calibration. Another potential problem, that of highly compressed data, was found to have a minimal effect on the ability to detect the desired pattern. The neural network algorithm has been implemented using the PVM (Parallel Virtual Machine) library and nearly-optimal speedups have been obtained that help alleviate the long process of searching through imagery.

  11. Public Awareness of Uterine Power Morcellation Through US Food and Drug Administration Communications: Analysis of Google Trends Search Term Patterns.

    PubMed

    Wood, Lauren N; Jamnagerwalla, Juzar; Markowitz, Melissa A; Thum, D Joseph; McCarty, Philip; Medendorp, Andrew R; Raz, Shlomo; Kim, Ja-Hong

    2018-04-26

    Uterine power morcellation, where the uterus is shred into smaller pieces, is a widely used technique for removal of uterine specimens in patients undergoing minimally invasive abdominal hysterectomy or myomectomy. Complications related to power morcellation of uterine specimens led to US Food and Drug Administration (FDA) communications in 2014 ultimately recommending against the use of power morcellation for women undergoing minimally invasive hysterectomy. Subsequently, practitioners drastically decreased the use of morcellation. We aimed to determine the effect of increased patient awareness on the decrease in use of the morcellator. Google Trends is a public tool that provides data on temporal patterns of search terms, and we correlated this data with the timing of the FDA communication. Weekly relative search volume (RSV) was obtained from Google Trends using the term “morcellation.” Higher RSV corresponds to increases in weekly search volume. Search volumes were divided into 3 groups: the 2 years prior to the FDA communication, a 1-year period following, and thereafter, with the distribution of the weekly RSV over the 3 periods tested using 1-way analysis of variance. Additionally, we analyzed the total number of websites containing the term “morcellation” over this time. The mean RSV prior to the FDA communication was 12.0 (SD 15.8), with the RSV being 60.3 (SD 24.7) in the 1-year after and 19.3 (SD 5.2) thereafter (P<.001). The mean number of webpages containing the term “morcellation” in 2011 was 10,800, rising to 18,800 during 2014 and 36,200 in 2017. Google search activity about morcellation of uterine specimens increased significantly after the FDA communications. This trend indicates an increased public awareness regarding morcellation and its complications. More extensive preoperative counseling and alteration of surgical technique and clinician practice may be necessary. ©Lauren N Wood, Juzar Jamnagerwalla, Melissa A Markowitz, D Joseph Thum, Philip McCarty, Andrew R Medendorp, Shlomo Raz, Ja-Hong Kim. Originally published in JMIR Public Health and Surveillance (http://publichealth.jmir.org), 26.04.2018.

  12. How visual search relates to visual diagnostic performance: a narrative systematic review of eye-tracking research in radiology.

    PubMed

    van der Gijp, A; Ravesloot, C J; Jarodzka, H; van der Schaaf, M F; van der Schaaf, I C; van Schaik, J P J; Ten Cate, Th J

    2017-08-01

    Eye tracking research has been conducted for decades to gain understanding of visual diagnosis such as in radiology. For educational purposes, it is important to identify visual search patterns that are related to high perceptual performance and to identify effective teaching strategies. This review of eye-tracking literature in the radiology domain aims to identify visual search patterns associated with high perceptual performance. Databases PubMed, EMBASE, ERIC, PsycINFO, Scopus and Web of Science were searched using 'visual perception' OR 'eye tracking' AND 'radiology' and synonyms. Two authors independently screened search results and included eye tracking studies concerning visual skills in radiology published between January 1, 1994 and July 31, 2015. Two authors independently assessed study quality with the Medical Education Research Study Quality Instrument, and extracted study data with respect to design, participant and task characteristics, and variables. A thematic analysis was conducted to extract and arrange study results, and a textual narrative synthesis was applied for data integration and interpretation. The search resulted in 22 relevant full-text articles. Thematic analysis resulted in six themes that informed the relation between visual search and level of expertise: (1) time on task, (2) eye movement characteristics of experts, (3) differences in visual attention, (4) visual search patterns, (5) search patterns in cross sectional stack imaging, and (6) teaching visual search strategies. Expert search was found to be characterized by a global-focal search pattern, which represents an initial global impression, followed by a detailed, focal search-to-find mode. Specific task-related search patterns, like drilling through CT scans and systematic search in chest X-rays, were found to be related to high expert levels. One study investigated teaching of visual search strategies, and did not find a significant effect on perceptual performance. Eye tracking literature in radiology indicates several search patterns are related to high levels of expertise, but teaching novices to search as an expert may not be effective. Experimental research is needed to find out which search strategies can improve image perception in learners.

  13. Multimodality approach to classifying hand utilization for the clinical breast examination.

    PubMed

    Laufer, Shlomi; Cohen, Elaine R; Maag, Anne-Lise D; Kwan, Calvin; Vanveen, Barry; Pugh, Carla M

    2014-01-01

    The clinical breast examination (CBE) is performed to detect breast pathology. However, little is known regarding clinical technique and how it relates to diagnostic accuracy. We sought to quantify breast examination search patterns and hand utilization with a new data collection and analysis system. Participants performed the CBE while the sensor mapping and video camera system collected performance data. From this data, algorithms were developed that measured the number of hands used during the exam and active examination time. This system is a feasible and reliable method to collect new information on CBE techniques.

  14. Is There a Weekly Pattern for Health Searches on Wikipedia and Is the Pattern Unique to Health Topics?

    PubMed

    Gabarron, Elia; Lau, Annie Y S; Wynn, Rolf

    2015-12-22

    Online health information-seeking behaviors have been reported to be more common at the beginning of the workweek. This behavior pattern has been interpreted as a kind of "healthy new start" or "fresh start" due to regrets or attempts to compensate for unhealthy behavior or poor choices made during the weekend. However, the observations regarding the most common health information-seeking day were based only on the analyses of users' behaviors with websites on health or on online health-related searches. We wanted to confirm if this pattern could be found in searches of Wikipedia on health-related topics and also if this search pattern was unique to health-related topics or if it could represent a more general pattern of online information searching--which could be of relevance even beyond the health sector. The aim was to examine the degree to which the search pattern described previously was specific to health-related information seeking or whether similar patterns could be found in other types of information-seeking behavior. We extracted the number of searches performed on Wikipedia in the Norwegian language for 911 days for the most common sexually transmitted diseases (chlamydia, gonorrhea, herpes, human immunodeficiency virus [HIV], and acquired immune deficiency syndrome [AIDS]), other health-related topics (influenza, diabetes, and menopause), and 2 nonhealth-related topics (footballer Lionel Messi and pop singer Justin Bieber). The search dates were classified according to the day of the week and ANOVA tests were used to compare the average number of hits per day of the week. The ANOVA tests showed that the sexually transmitted disease queries had their highest peaks on Tuesdays (P<.001) and the fewest searches on Saturdays. The other health topics also showed a weekly pattern, with the highest peaks early in the week and lower numbers on Saturdays (P<.001). Footballer Lionel Messi had the highest mean number of hits on Tuesdays and Wednesdays, whereas pop singer Justin Bieber had the most hits on Tuesdays. Both these tracked search queries also showed significantly lower numbers on Saturdays (P<.001). Our study supports prior studies finding an increase in health information searching at the beginning of the workweek. However, we also found a similar pattern for 2 randomly chosen nonhealth-related terms, which may suggest that the search pattern is not unique to health-related searches. The results are potentially relevant beyond the field of health and our preliminary findings need to be further explored in future studies involving a broader range of nonhealth-related searches.
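
    A minimal sketch of the weekday comparison described above, assuming `daily_hits` is a pandas Series of daily page-view counts indexed by date; the one-way ANOVA groups the counts by day of week. The function name is illustrative.

        import pandas as pd
        from scipy.stats import f_oneway

        def weekday_pattern(daily_hits):
            """daily_hits: pandas Series with a DatetimeIndex of daily view counts."""
            groups = [daily_hits[daily_hits.index.dayofweek == day] for day in range(7)]
            statistic, p_value = f_oneway(*groups)
            means = daily_hits.groupby(daily_hits.index.dayofweek).mean()   # 0 = Monday
            return means, statistic, p_value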

  15. Rapid Prototyping Technologies and their Applications in Prosthodontics, a Review of Literature

    PubMed Central

    Torabi, Kianoosh; Farjood, Ehsan; Hamedani, Shahram

    2015-01-01

    The early computer-aided design/computer-aided manufacturing (CAD/CAM) systems relied exclusively on subtractive methods. In recent years, additive methods employing rapid prototyping (RP) have progressed rapidly in various fields of dentistry, as they have the potential to overcome known drawbacks of subtractive techniques such as fit problems. RP techniques have been exploited to build complex 3D models in medicine since the 1990s. RP has recently found successful applications in various dental fields, such as fabrication of implant surgical guides, frameworks for fixed and removable partial dentures, wax patterns for dental prostheses, zirconia prostheses and molds for metal castings, and maxillofacial prostheses and, finally, complete dentures. This paper aimed to offer a comprehensive literature review of various RP methods, particularly in dentistry, that are expected to bring many improvements to the field. A search was made through the MEDLINE database and the Google Scholar search engine. The keywords ‘rapid prototyping’ and ‘dentistry’ were searched in the title/abstract of publications, limited to 2003 to 2013, covering the past decade. The inclusion criterion was technical research that predominantly included laboratory procedures. The exclusion criterion was meticulous clinical and excessively technical procedures. A total of 106 articles were retrieved and reviewed by the authors, and only 50 met the specified inclusion criteria for this review. The selected articles had applied rapid prototyping in various fields of dentistry through different techniques. This review describes the different laboratory procedures employed in these methods and confirms that RP techniques are substantially feasible in dentistry. With advancement in various RP systems, it is possible to benefit from this technique in different dental practices, particularly in fabricating dental prostheses for different applications. PMID:25759851

  16. Rapid Prototyping Technologies and their Applications in Prosthodontics, a Review of Literature.

    PubMed

    Torabi, Kianoosh; Farjood, Ehsan; Hamedani, Shahram

    2015-03-01

    The early computer-aided design/computer-aided manufacturing (CAD/CAM) systems relied exclusively on subtractive methods. In recent years, additive methods employing rapid prototyping (RP) have progressed rapidly in various fields of dentistry, as they have the potential to overcome known drawbacks of subtractive techniques such as fit problems. RP techniques have been exploited to build complex 3D models in medicine since the 1990s. RP has recently found successful applications in various dental fields, such as fabrication of implant surgical guides, frameworks for fixed and removable partial dentures, wax patterns for dental prostheses, zirconia prostheses and molds for metal castings, and maxillofacial prostheses and, finally, complete dentures. This paper aimed to offer a comprehensive literature review of various RP methods, particularly in dentistry, that are expected to bring many improvements to the field. A search was made through the MEDLINE database and the Google Scholar search engine. The keywords 'rapid prototyping' and 'dentistry' were searched in the title/abstract of publications, limited to 2003 to 2013, covering the past decade. The inclusion criterion was technical research that predominantly included laboratory procedures. The exclusion criterion was meticulous clinical and excessively technical procedures. A total of 106 articles were retrieved and reviewed by the authors, and only 50 met the specified inclusion criteria for this review. The selected articles had applied rapid prototyping in various fields of dentistry through different techniques. This review describes the different laboratory procedures employed in these methods and confirms that RP techniques are substantially feasible in dentistry. With advancement in various RP systems, it is possible to benefit from this technique in different dental practices, particularly in fabricating dental prostheses for different applications.

  17. Emotional Devaluation of Distracting Patterns and Faces: A Consequence of Attentional Inhibition during Visual Search?

    ERIC Educational Resources Information Center

    Raymond, Jane E.; Fenske, Mark J.; Westoby, Nikki

    2005-01-01

    Visual search has been studied extensively, yet little is known about how its constituent processes affect subsequent emotional evaluation of searched-for and searched-through items. In 3 experiments, the authors asked observers to locate a colored pattern or tinted face in an array of other patterns or faces. Shortly thereafter, either the target…

  18. [Eye movement study in multiple object search process].

    PubMed

    Xu, Zhaofang; Liu, Zhongqi; Wang, Xingwei; Zhang, Xin

    2017-04-01

    The aim of this study was to investigate the regulation of search time for targets and the characteristics of eye movement behavior in multi-object visual search. The experimental task was implemented in software and presented characters on a 24-inch computer display. The subjects were asked to search for three targets among the characters. The three target characters in the same group were highly similar to one another, while target characters and distraction characters in different groups had different degrees of similarity. We recorded search time and eye movement data throughout the experiment. The eye movement data showed that the number of fixation points was large when the target characters and distraction characters were similar. The subjects exhibited three kinds of visual search patterns: parallel search, serial search, and parallel-serial search. The last pattern gave the best search performance of the three; that is, subjects who used the parallel-serial search pattern spent less time finding the targets. The order in which the targets were presented significantly affected search performance, and the degree of similarity between target characters and distraction characters also affected search performance.

  19. A hybrid artificial bee colony algorithm and pattern search method for inversion of particle size distribution from spectral extinction data

    NASA Astrophysics Data System (ADS)

    Wang, Li; Li, Feng; Xing, Jian

    2017-10-01

    In this paper, a hybrid artificial bee colony (ABC) algorithm and pattern search (PS) method is proposed and applied to the recovery of particle size distribution (PSD) from spectral extinction data. To make the approach more useful and practical, the size distribution function is modelled as the general Johnson's ? function, which overcomes the difficulty, encountered in many real circumstances, of not knowing the exact distribution type beforehand. The proposed hybrid algorithm is evaluated through simulated examples involving unimodal, bimodal and trimodal PSDs with different widths and mean particle diameters. For comparison, all examples are additionally solved with the ABC algorithm alone. In addition, the performance of the proposed algorithm is further tested on actual extinction measurements of standard polystyrene samples immersed in water. Simulation and experimental results illustrate that the hybrid algorithm is an effective technique for retrieving PSDs with high reliability and accuracy. Compared with the ABC algorithm alone, the proposed algorithm produces more accurate and robust inversion results while requiring almost the same CPU time. The better balance of estimation accuracy and computational effort reached by the ABC and PS hybridization strategy increases its potential as an inversion technique for reliable and efficient measurement of PSD.
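
    As an illustration of the hybridization strategy described above, the following is a minimal sketch (not the authors' code) of a simplified artificial bee colony global search whose best food source is then polished by a compass-style pattern search; the objective function, bounds and parameter names are illustrative stand-ins for the actual extinction-data misfit.

        import numpy as np

        def pattern_search(f, x0, lo, hi, step=0.1, tol=1e-6):
            """Compass pattern search: poll +/- step along each axis, move to any
            improving point, otherwise shrink the step."""
            x, fx = x0.copy(), f(x0)
            while step > tol:
                improved = False
                for i in range(len(x)):
                    for d in (step, -step):
                        trial = x.copy()
                        trial[i] = np.clip(trial[i] + d, lo[i], hi[i])
                        ft = f(trial)
                        if ft < fx:
                            x, fx, improved = trial, ft, True
                if not improved:
                    step *= 0.5
            return x, fx

        def abc_ps(f, lo, hi, n_bees=20, iters=200, limit=20, seed=0):
            """Simplified artificial bee colony followed by a pattern-search refinement."""
            rng = np.random.default_rng(seed)
            dim = len(lo)
            foods = rng.uniform(lo, hi, size=(n_bees, dim))
            fit = np.array([f(x) for x in foods])
            trials = np.zeros(n_bees, dtype=int)
            for _ in range(iters):
                for i in range(n_bees):                  # employed + onlooker phases, merged for brevity
                    k = rng.integers(n_bees - 1)
                    k += k >= i                          # pick a partner index different from i
                    j = rng.integers(dim)
                    cand = foods[i].copy()
                    cand[j] += rng.uniform(-1, 1) * (foods[i, j] - foods[k, j])
                    cand[j] = np.clip(cand[j], lo[j], hi[j])
                    fc = f(cand)
                    if fc < fit[i]:
                        foods[i], fit[i], trials[i] = cand, fc, 0
                    else:
                        trials[i] += 1
                scouts = trials > limit                  # scout phase: abandon exhausted food sources
                foods[scouts] = rng.uniform(lo, hi, size=(scouts.sum(), dim))
                fit[scouts] = [f(x) for x in foods[scouts]]
                trials[scouts] = 0
            best = np.argmin(fit)
            return pattern_search(f, foods[best], lo, hi)    # local polish of the ABC incumbent

        # toy usage: recover parameters of a quadratic "misfit" standing in for the PSD inversion residual
        target = np.array([0.3, 1.7, 0.9])
        f = lambda x: np.sum((x - target) ** 2)
        x_best, f_best = abc_ps(f, lo=np.zeros(3), hi=np.full(3, 3.0))
        print(x_best, f_best)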

  20. Hierarchical content-based image retrieval by dynamic indexing and guided search

    NASA Astrophysics Data System (ADS)

    You, Jane; Cheung, King H.; Liu, James; Guo, Linong

    2003-12-01

    This paper presents a new approach to content-based image retrieval that uses dynamic indexing and guided search in a hierarchical structure and extends data mining and data warehousing techniques. The proposed algorithms include: a wavelet-based scheme for multiple image feature extraction, the extension of a conventional data warehouse and an image database to an image data warehouse for dynamic image indexing, an image data schema for hierarchical image representation and dynamic image indexing, a statistically based feature selection scheme to achieve flexible similarity measures, and a feature component code to facilitate query processing and guide the search for the best match. A series of case studies are reported, including a wavelet-based image color hierarchy, classification of satellite images, tropical cyclone pattern recognition, and personal identification using multi-level palmprint and face features.

  1. A systematic review of persuasive marketing techniques to promote food to children on television.

    PubMed

    Jenkin, G; Madhvani, N; Signal, L; Bowers, S

    2014-04-01

    The ubiquitous marketing of energy-dense, nutrient-poor food and beverages is a key modifiable influence on childhood dietary patterns and obesity. Much of the research on television food advertising has focused on identifying and quantifying unhealthy food marketing, with comparatively few studies examining the persuasive marketing techniques used to promote unhealthy food to children. This review identifies the most frequently documented persuasive marketing techniques used to promote food to children via television. A systematic search of eight online databases using key search terms identified 267 unique articles. Thirty-eight articles met the inclusion criteria. A narrative synthesis of the reviewed studies revealed the most commonly reported persuasive techniques used on television to promote food to children. These were the use of premium offers, promotional characters, nutrition and health-related claims, the theme of taste, and the emotional appeal of fun. Identifying and documenting these commonly reported persuasive marketing techniques is critical for the monitoring and evaluation of advertising codes and industry pledges and for the development of further regulation in this area, and has strong potential to help curb the international obesity epidemic besieging children throughout the world. © 2014 The Authors. Obesity Reviews © 2014 International Association for the Study of Obesity.

  2. Patscanui: an intuitive web interface for searching patterns in DNA and protein data.

    PubMed

    Blin, Kai; Wohlleben, Wolfgang; Weber, Tilmann

    2018-05-02

    Patterns in biological sequences frequently signify interesting features in the underlying molecule. Many tools exist to search for well-known patterns. Less support is available for exploratory analysis, where no well-defined patterns are known yet. PatScanUI (https://patscan.secondarymetabolites.org/) provides a highly interactive web interface to the powerful generic pattern search tool PatScan. The complex PatScan patterns are created in a drag-and-drop aware interface, allowing researchers to rapidly prototype the often complicated patterns useful for identifying features of interest.

  3. Color vision but not visual attention is altered in migraine.

    PubMed

    Shepherd, Alex J

    2006-04-01

    To examine visual search performance in migraine and headache-free control groups and to determine whether reports of selective color vision deficits in migraine occur preattentively. Visual search is a classic technique for measuring certain components of visual attention. The technique can be manipulated to measure both preattentive (automatic) and attentive processes. Here, visual search for colored targets was employed to extend earlier reports that the detection or discrimination of colors selective for the short-wavelength sensitive cone photoreceptors in the retina (S or "blue" cones) is impaired in migraine. Visual search performance for small and large color differences was measured in 34 migraine and 34 control participants. Small and large color differences were included to assess attentive and preattentive processing, respectively. In separate conditions, colored stimuli were chosen that would be detected selectively by either the S-, or by the long- (L or "red") and middle (M or "green")-wavelength sensitive cone photoreceptors. The results showed no preattentive differences between the migraine and control groups. For active, or attentive, search, differences between the migraine and control groups occurred only for colors detected by the S-cones; there were no differences for colors detected by the L- and M-cones. The migraine group responded significantly more slowly than the control group for the S-cone colors. The pattern of results indicates that there are no overall differences in search performance between migraine and control groups. The differences found for the S-cone colors are attributed to impaired discrimination of these colors in migraine and not to differences in attention.

  4. Double hashing technique in closed hashing search process

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Zulkarnain, Iskandar; Jaya, Hendra

    2017-09-01

    The search process is used in many activities performed both online and offline, and many algorithms can be used to perform it; one of them is the hash search algorithm. The search process in this study uses a hash search algorithm with a double hashing technique, in which the data are placed into tables of equal length and then searched. The results of this study indicate that searching with the double hashing technique is faster than the usual search techniques. This approach finds a solution by dividing the values between a main table and an overflow table, so the search process is expected to be faster than when the data are stacked in a single table, and data collisions can be avoided.
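
    A minimal sketch of the double hashing idea described above, assuming a closed (open-addressing) table and Python's built-in hash; the class and method names are illustrative, not taken from the study.

        class DoubleHashTable:
            """Closed (open-addressing) hash table using double hashing for probing."""

            def __init__(self, size=11):            # a prime table size keeps probe sequences full-period
                self.size = size
                self.slots = [None] * size

            def _h1(self, key):
                return hash(key) % self.size

            def _h2(self, key):
                # the second hash must never be zero, so the probe always advances
                return 1 + (hash(key) % (self.size - 1))

            def insert(self, key):
                h1, h2 = self._h1(key), self._h2(key)
                for i in range(self.size):
                    idx = (h1 + i * h2) % self.size
                    if self.slots[idx] is None or self.slots[idx] == key:
                        self.slots[idx] = key
                        return idx
                raise OverflowError("table full")

            def search(self, key):
                h1, h2 = self._h1(key), self._h2(key)
                for i in range(self.size):
                    idx = (h1 + i * h2) % self.size
                    if self.slots[idx] is None:     # empty slot: key cannot occur later in the probe
                        return -1
                    if self.slots[idx] == key:
                        return idx
                return -1

        table = DoubleHashTable()
        for k in (23, 45, 12, 34, 56):
            table.insert(k)
        print(table.search(34), table.search(99))   # slot index of 34, then -1 for a missing key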

  5. Simultaneous Classification of Oranges and Apples Using Grover's and Ventura's Algorithms in a Two-Qubit System

    NASA Astrophysics Data System (ADS)

    Singh, Manu Pratap; Radhey, Kishori; Kumar, Sandeep

    2017-08-01

    In the present paper, simultaneous classification of Oranges and Apples has been carried out using both Grover's iterative algorithm (Grover 1996) and Ventura's model (Ventura and Martinez, Inf. Sci. 124, 273-296, 2000), taking as search states different superpositions: a two-pattern start state containing both Orange and Apple, a one-pattern start state containing Apple, and another one-pattern start state containing Orange. It has been shown that the exclusion superposition is the most suitable two-pattern search state for simultaneous classification of the patterns associated with Apples and Oranges, and that the phase-invariant superposition is the best choice as the respective search state based on one-pattern start states, in both Grover's and Ventura's methods of pattern classification.
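
    For readers unfamiliar with the underlying search mechanics, the following is a minimal numpy sketch of a single Grover iteration over a two-qubit (four basis state) register; it illustrates amplitude amplification only and makes no attempt to reproduce the authors' classification start states.

        import numpy as np

        def grover_iteration(state, marked):
            """One Grover iteration: phase-flip the marked basis state, then invert about the mean."""
            state = state.copy()
            state[marked] *= -1                            # oracle
            mean = state.mean()
            return 2 * mean - state                        # diffusion (inversion about the average)

        n_states = 4                                       # two qubits span four basis patterns
        state = np.full(n_states, 1 / np.sqrt(n_states))   # uniform superposition start state
        marked = 2                                         # index of the stored pattern to retrieve

        state = grover_iteration(state, marked)            # for N = 4, a single iteration suffices
        print(np.round(state ** 2, 3))                     # measurement probabilities: [0, 0, 1, 0]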

  6. Cancer Information Seeking and Scanning: Sources and Patterns

    ERIC Educational Resources Information Center

    Barnes, Laura L. B.; Khojasteh, Jam J.; Wheeler, Denna

    2017-01-01

    Objective: This study aimed to identify predominant search patterns in a recent search for health information and a potential search for strongly needed cancer information, to identify the commonly scanned sources of information that may represent stable elements of the information fields characteristic of these patterns, and to evaluate whether…

  7. Boosting the FM-Index on the GPU: Effective Techniques to Mitigate Random Memory Access.

    PubMed

    Chacón, Alejandro; Marco-Sola, Santiago; Espinosa, Antonio; Ribeca, Paolo; Moure, Juan Carlos

    2015-01-01

    The recent advent of high-throughput sequencing machines producing large amounts of short reads has boosted the interest in efficient string searching techniques. As of today, many mainstream sequence alignment software tools rely on a special data structure, called the FM-index, which allows for fast exact searches in large genomic references. However, such searches translate into a pseudo-random memory access pattern, thus making memory access the limiting factor of all computation-efficient implementations, both on CPUs and GPUs. Here, we show that several strategies can be put in place to remove the memory bottleneck on the GPU: more compact indexes can be implemented by having more threads work cooperatively on larger memory blocks, and a k-step FM-index can be used to further reduce the number of memory accesses. The combination of these and other optimisations yields an implementation that is able to process about two Gbases of queries per second on our test platform, about 8× faster than a comparable multi-core CPU version, and about 3× to 5× faster than the FM-index implementation on the GPU provided by the recently announced Nvidia NVBIO bioinformatics library.
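
    The following is a minimal sketch of exact backward search on an FM-index, the data structure named above; it builds the Burrows-Wheeler transform naively and stores full occurrence tables, so it only illustrates the counting logic, not the compact GPU-oriented layouts discussed in the paper.

        from collections import Counter

        def bwt(text):
            """Burrows-Wheeler transform via full rotation sort (fine for a sketch, not for genomes)."""
            text += "$"
            rotations = sorted(text[i:] + text[:i] for i in range(len(text)))
            return "".join(rot[-1] for rot in rotations)

        def fm_index(text):
            last = bwt(text)
            counts = Counter(last)
            c_table, total = {}, 0
            for ch in sorted(counts):            # C[c]: number of characters strictly smaller than c
                c_table[ch] = total
                total += counts[ch]
            occ = {ch: [0] for ch in counts}     # Occ[c][i]: occurrences of c in last[:i]
            for ch_in_last in last:
                for ch in occ:
                    occ[ch].append(occ[ch][-1] + (ch == ch_in_last))
            return c_table, occ

        def backward_search(pattern, c_table, occ, n):
            """Count occurrences of pattern using the LF-mapping over the suffix-array interval [lo, hi)."""
            lo, hi = 0, n
            for ch in reversed(pattern):
                if ch not in c_table:
                    return 0
                lo = c_table[ch] + occ[ch][lo]
                hi = c_table[ch] + occ[ch][hi]
                if lo >= hi:
                    return 0
            return hi - lo

        text = "ACGTACGTTACG"
        c_table, occ = fm_index(text)
        print(backward_search("ACG", c_table, occ, len(text) + 1))   # -> 3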

  8. RadSearch: a RIS/PACS integrated query tool

    NASA Astrophysics Data System (ADS)

    Tsao, Sinchai; Documet, Jorge; Moin, Paymann; Wang, Kevin; Liu, Brent J.

    2008-03-01

    Radiology Information Systems (RIS) contain a wealth of information that can be used for research, education, and practice management. However, the sheer amount of information available makes querying specific data difficult and time consuming. Previous work has shown that a clinical RIS database and its RIS text reports can be extracted, duplicated and indexed for searches while complying with HIPAA and IRB requirements. This project's intent is to provide a software tool, the RadSearch Toolkit, to allow intelligent indexing and parsing of RIS reports for easy yet powerful searches. In addition, the project aims to seamlessly query and retrieve associated images from the Picture Archiving and Communication System (PACS) in situations where an integrated RIS/PACS is in place - even subselecting individual series, such as in an MRI study. RadSearch's application of simple text parsing techniques to index text-based radiology reports will allow the search engine to quickly return relevant results. This powerful combination will be useful in both private practice and academic settings; administrators can easily obtain complex practice management information such as referral patterns; researchers can conduct retrospective studies with specific, multiple criteria; teaching institutions can quickly and effectively create thorough teaching files.

  9. Forecasting of hourly load by pattern recognition in a small area power system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dehdashti-Shahrokh, A.

    1982-01-01

    An intuitive, logical, simple and efficient method of forecasting hourly load in a small area power system is presented. A pattern recognition approach is used in developing the forecasting model. Pattern recognition techniques are powerful tools in the field of artificial intelligence (cybernetics) and simulate the way the human brain operates to make decisions. Pattern recognition is generally used in the analysis of processes where the total physical nature behind the process variation is unknown but specific kinds of measurements explain their behavior. In this research, basic multivariate analyses, in conjunction with pattern recognition techniques, are used to develop a linear deterministic model to forecast hourly load. This method assumes that load patterns in the same geographical area are direct results of climatological changes (weather-sensitive load) and have occurred in the past as a result of similar climatic conditions. The algorithm described here searches a seasonal library of load and weather data for the best possible pattern when forecasting hourly load. To accommodate the unpredictability of weather and the resulting load, the basic twenty-four-hour load pattern was divided into eight three-hour intervals. This division was made to make the model adaptive to sudden climatic changes. The proposed method offers flexible lead times of one to twenty-four hours. Testing on actual data indicated that the proposed method is computationally efficient and highly adaptive, with acceptable data storage requirements and accuracy comparable to many other existing methods.
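
    A minimal sketch of the core pattern-recognition step described above: given today's weather features, find the most similar day in a seasonal library and reuse its load for one three-hour block. The feature vectors, distance measure and data structure are illustrative assumptions, not the author's model.

        import numpy as np

        def forecast_block(weather_today, history, block=slice(0, 3)):
            """Pick the historical day whose weather vector is closest to today's
            (Euclidean distance) and reuse its load curve for the requested block."""
            weather_hist = np.array([day["weather"] for day in history])
            dists = np.linalg.norm(weather_hist - weather_today, axis=1)
            best = int(np.argmin(dists))
            return history[best]["load"][block]

        # toy seasonal library: each record holds a weather feature vector and a 24-hour load curve
        history = [
            {"weather": np.array([30.0, 0.55]), "load": np.linspace(900, 1200, 24)},
            {"weather": np.array([18.0, 0.80]), "load": np.linspace(700, 950, 24)},
            {"weather": np.array([25.0, 0.60]), "load": np.linspace(820, 1100, 24)},
        ]
        print(forecast_block(np.array([24.0, 0.62]), history))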

  10. Is There a Weekly Pattern for Health Searches on Wikipedia and Is the Pattern Unique to Health Topics?

    PubMed Central

    Lau, Annie YS; Wynn, Rolf

    2015-01-01

    Background Online health information–seeking behaviors have been reported to be more common at the beginning of the workweek. This behavior pattern has been interpreted as a kind of “healthy new start” or “fresh start” due to regrets or attempts to compensate for unhealthy behavior or poor choices made during the weekend. However, the observations regarding the most common health information–seeking day were based only on the analyses of users’ behaviors with websites on health or on online health-related searches. We wanted to confirm if this pattern could be found in searches of Wikipedia on health-related topics and also if this search pattern was unique to health-related topics or if it could represent a more general pattern of online information searching—which could be of relevance even beyond the health sector. Objective The aim was to examine the degree to which the search pattern described previously was specific to health-related information seeking or whether similar patterns could be found in other types of information-seeking behavior. Methods We extracted the number of searches performed on Wikipedia in the Norwegian language for 911 days for the most common sexually transmitted diseases (chlamydia, gonorrhea, herpes, human immunodeficiency virus [HIV], and acquired immune deficiency syndrome [AIDS]), other health-related topics (influenza, diabetes, and menopause), and 2 nonhealth-related topics (footballer Lionel Messi and pop singer Justin Bieber). The search dates were classified according to the day of the week and ANOVA tests were used to compare the average number of hits per day of the week. Results The ANOVA tests showed that the sexually transmitted disease queries had their highest peaks on Tuesdays (P<.001) and the fewest searches on Saturdays. The other health topics also showed a weekly pattern, with the highest peaks early in the week and lower numbers on Saturdays (P<.001). Footballer Lionel Messi had the highest mean number of hits on Tuesdays and Wednesdays, whereas pop singer Justin Bieber had the most hits on Tuesdays. Both these tracked search queries also showed significantly lower numbers on Saturdays (P<.001). Conclusions Our study supports prior studies finding an increase in health information searching at the beginning of the workweek. However, we also found a similar pattern for 2 randomly chosen nonhealth-related terms, which may suggest that the search pattern is not unique to health-related searches. The results are potentially relevant beyond the field of health and our preliminary findings need to be further explored in future studies involving a broader range of nonhealth-related searches. PMID:26693859
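
    A brief sketch of the kind of day-of-week comparison described above, using a one-way ANOVA over simulated daily hit counts; the counts and group means are hypothetical placeholders, not the study's Wikipedia data.

        import numpy as np
        from scipy import stats

        # hypothetical daily hit counts for one search term, grouped by weekday (Mon..Sun)
        rng = np.random.default_rng(1)
        hits_by_weekday = [rng.poisson(lam, size=130) for lam in (210, 230, 205, 190, 170, 120, 140)]

        f_stat, p_value = stats.f_oneway(*hits_by_weekday)
        print(f"F = {f_stat:.2f}, p = {p_value:.3g}")

        means = [g.mean() for g in hits_by_weekday]
        print("highest mean on day index", int(np.argmax(means)), "(0 = Monday)")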

  11. Dynamic pattern matcher using incomplete data

    NASA Technical Reports Server (NTRS)

    Johnson, Gordon G. (Inventor); Wang, Lui (Inventor)

    1993-01-01

    This invention relates generally to pattern matching systems, and more particularly to a method for dynamically adapting the system to enhance the effectiveness of a pattern match. Apparatus and methods for calculating the similarity between patterns are known. There is considerable interest, however, in the storage and retrieval of data, particularly when the search is called or initiated with incomplete information. For many search algorithms, a query initiating a data search requires exact information, and the data file is searched for an exact match. Inability to find an exact match thus results in a failure of the system or method.

  12. Adaptive rood pattern search for fast block-matching motion estimation.

    PubMed

    Nie, Yao; Ma, Kai-Kuang

    2002-01-01

    In this paper, we propose a novel and simple fast block-matching algorithm (BMA), called adaptive rood pattern search (ARPS), which consists of two sequential search stages: 1) initial search and 2) refined local search. For each macroblock (MB), the initial search is performed only once at the beginning in order to find a good starting point for the follow-up refined local search. By doing so, unnecessary intermediate searches and the risk of being trapped in local minima of the matching error can be greatly reduced in the long-search case. For the initial search stage, an adaptive rood pattern (ARP) is proposed, and the ARP's size is dynamically determined for each MB, based on the available motion vectors (MVs) of the neighboring MBs. In the refined local search stage, a unit-size rood pattern (URP) is exploited repeatedly, and unrestrictedly, until the final MV is found. To further speed up the search, zero-motion prejudgment (ZMP) is incorporated in our method, which is particularly beneficial for video sequences containing small motion content. Extensive experiments conducted on the MPEG-4 Verification Model (VM) encoding platform show that the search speed of our proposed ARPS-ZMP is about two to three times that of the diamond search (DS), and our method even achieves a higher peak signal-to-noise ratio (PSNR), particularly for video sequences containing large and/or complex motion content.
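
    A compact sketch of the two-stage ARPS idea described above (initial rood sized by the predicted motion vector, then repeated unit-rood refinement), written for a single block; the zero-motion prejudgment step and MPEG-4 integration are omitted, and the test image and block position are illustrative.

        import numpy as np

        def sad(cur, ref, x, y, dx, dy, bs):
            """Sum of absolute differences between a current block and a displaced reference block."""
            h, w = ref.shape
            if not (0 <= y + dy <= h - bs and 0 <= x + dx <= w - bs):
                return np.inf
            a = cur[y:y + bs, x:x + bs].astype(int)
            b = ref[y + dy:y + dy + bs, x + dx:x + dx + bs].astype(int)
            return int(np.abs(a - b).sum())

        def arps(cur, ref, x, y, bs=16, predicted_mv=(0, 0)):
            """Initial rood sized by the predicted motion vector, then unit-rood refinement."""
            arm = max(abs(predicted_mv[0]), abs(predicted_mv[1])) or 2
            candidates = [(0, 0), (arm, 0), (-arm, 0), (0, arm), (0, -arm), predicted_mv]
            best = min(candidates, key=lambda mv: sad(cur, ref, x, y, mv[0], mv[1], bs))
            while True:                                    # repeated unit-size rood around the running best
                around = [best] + [(best[0] + dx, best[1] + dy)
                                   for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
                new_best = min(around, key=lambda mv: sad(cur, ref, x, y, mv[0], mv[1], bs))
                if new_best == best:
                    return best                            # (dx, dy) motion vector for this block
                best = new_best

        yy, xx = np.mgrid[0:64, 0:64]
        ref = (128 + 100 * np.sin(xx / 7.0) * np.cos(yy / 9.0)).astype(np.uint8)
        cur = np.roll(ref, shift=(3, -2), axis=(0, 1))     # shift the frame so the true MV is (dx, dy) = (2, -3)
        print(arps(cur, ref, x=16, y=16))                  # should land near (2, -3) for this smooth test image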

  13. Molecular characterization of Streptococcus agalactiae strains isolated from fishes in Malaysia.

    PubMed

    Amal, M N A; Zamri-Saad, M; Siti-Zahrah, A; Zulkafli, A R; Nur-Nazifah, M

    2013-07-01

    The aim of this study was to characterize Streptococcus agalactiae strains that were isolated from fishes in Malaysia using random amplified polymorphic DNA (RAPD) and repetitive extragenic palindromic PCR (REP-PCR) techniques. A total of 181 strains of Strep. agalactiae isolated from red hybrid tilapia (Oreochromis sp.) and golden pompano (Trachinotus blochii) were characterized using the RAPD and REP-PCR techniques. Both fingerprinting techniques generated reproducible band patterns, differing in the number and molecular mass of amplicons. The RAPD technique displayed greater discriminatory power through its production of more complex banding patterns, dividing all the strains into 13 groups, compared with 9 for the REP-PCR technique. Both techniques were able to differentiate the genetic profiles of the strains according to their geographical location of origin. Three strains of Strep. agalactiae recovered from golden pompano showed genetic dissimilarity from the strains isolated from red hybrid tilapia, while strain ATCC 27956, recovered from a bovine source, displayed a unique profile with both methods. Both techniques possess excellent discriminative capabilities and can be used as a rapid means of comparing Strep. agalactiae strains for future epidemiological investigation, providing a framework to guide traceability of this disease and the search for potential local vaccine candidates for streptococcosis in this country. Journal of Applied Microbiology © 2013 The Society for Applied Microbiology.

  14. Solving Fuzzy Optimization Problems Using a Hybrid LS-SA Method

    NASA Astrophysics Data System (ADS)

    Vasant, Pandian

    2011-06-01

    Fuzzy optimization has been one of the most prominent topics in the broad area of computational intelligence. It is especially relevant in the field of fuzzy non-linear programming, and its applications and practical realizations can be seen in many real-world problems. In this paper a large-scale non-linear fuzzy programming problem has been solved by the hybrid optimization techniques of Line Search (LS), Simulated Annealing (SA) and Pattern Search (PS). An industrial production planning problem with a cubic objective function, 8 decision variables and 29 constraints has been solved successfully using the LS-SA-PS hybrid optimization technique. The computational results for the objective function with respect to the vagueness factor and level of satisfaction are provided in the form of 2D and 3D plots. The outcome is very promising and strongly suggests that the hybrid LS-SA-PS algorithm is efficient and productive in solving large-scale non-linear fuzzy programming problems.

  15. Principal component analysis of the cytokine and chemokine response to human traumatic brain injury.

    PubMed

    Helmy, Adel; Antoniades, Chrystalina A; Guilfoyle, Mathew R; Carpenter, Keri L H; Hutchinson, Peter J

    2012-01-01

    There is a growing realisation that neuro-inflammation plays a fundamental role in the pathology of Traumatic Brain Injury (TBI). This has led to the search for biomarkers that reflect these underlying inflammatory processes using techniques such as cerebral microdialysis. The interpretation of such biomarker data has been limited by the statistical methods used. When analysing data of this sort the multiple putative interactions between mediators need to be considered as well as the timing of production and high degree of statistical co-variance in levels of these mediators. Here we present a cytokine and chemokine dataset from human brain following human traumatic brain injury and use principal component analysis and partial least squares discriminant analysis to demonstrate the pattern of production following TBI, distinct phases of the humoral inflammatory response and the differing patterns of response in brain and in peripheral blood. This technique has the added advantage of making no assumptions about the Relative Recovery (RR) of microdialysis derived parameters. Taken together these techniques can be used in complex microdialysis datasets to summarise the data succinctly and generate hypotheses for future study.
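
    A minimal sketch of the dimensionality-reduction step described above, applying PCA to a standardised panel of mediator concentrations; the simulated data, log transform and component count are illustrative assumptions, not the study's microdialysis dataset.

        import numpy as np
        from sklearn.preprocessing import StandardScaler
        from sklearn.decomposition import PCA

        # hypothetical panel: rows = microdialysis samples, columns = cytokine/chemokine concentrations
        rng = np.random.default_rng(42)
        concentrations = rng.lognormal(mean=1.0, sigma=0.8, size=(60, 12))

        scaled = StandardScaler().fit_transform(np.log(concentrations))  # log-transform, then z-score each mediator
        pca = PCA(n_components=3).fit(scaled)
        scores = pca.transform(scaled)                                   # per-sample scores, e.g. for plotting phases

        print("variance explained:", np.round(pca.explained_variance_ratio_, 2))
        print("PC1 loadings:", np.round(pca.components_[0], 2))          # which mediators co-vary most strongly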

  16. Internet Search Patterns of Human Immunodeficiency Virus and the Digital Divide in the Russian Federation: Infoveillance Study

    PubMed Central

    Quinn, Casey; Hercz, Daniel; Gillespie, James A

    2013-01-01

    Background Human immunodeficiency virus (HIV) is a serious health problem in the Russian Federation. However, the true scale of HIV in Russia has long been the subject of considerable debate. Using digital surveillance to monitor diseases has become increasingly popular in high income countries. But Internet users may not be representative of overall populations, and the characteristics of the Internet-using population cannot be directly ascertained from search pattern data. This exploratory infoveillance study examined if Internet search patterns can be used for disease surveillance in a large middle-income country with a dispersed population. Objective This study had two main objectives: (1) to validate Internet search patterns against national HIV prevalence data, and (2) to investigate the relationship between search patterns and the determinants of Internet access. Methods We first assessed whether online surveillance is a valid and reliable method for monitoring HIV in the Russian Federation. Yandex and Google both provided tools to study search patterns in the Russian Federation. We evaluated the relationship between both Yandex and Google aggregated search patterns and HIV prevalence in 2011 at national and regional tiers. Second, we analyzed the determinants of Internet access to determine the extent to which they explained regional variations in searches for the Russian terms for “HIV” and “AIDS”. We sought to extend understanding of the characteristics of Internet searching populations by data matching the determinants of Internet access (age, education, income, broadband access price, and urbanization ratios) and searches for the term “HIV” using principal component analysis (PCA). Results We found generally strong correlations between HIV prevalence and searches for the terms “HIV” and “AIDS”. National correlations for Yandex searches for “HIV” were very strongly correlated with HIV prevalence (Spearman rank-order coefficient [rs]=.881, P≤.001) and strongly correlated for “AIDS” (rs=.714, P≤.001). The strength of correlations varied across Russian regions. National correlations in Google for the term “HIV” (rs=.672, P=.004) and “AIDS” (rs=.584, P≤.001) were weaker than for Yandex. Second, we examined the relationship between the determinants of Internet access and search patterns for the term “HIV” across Russia using PCA. At the national level, we found Principal Component 1 loadings, including age (-0.56), HIV search (-0.533), and education (-0.479) contributed 32% of the variance. Principal Component 2 contributed 22% of national variance (income, -0.652 and broadband price, -0.460). Conclusions This study contributes to the methodological literature on search patterns in public health. Based on our preliminary research, we suggest that PCA may be used to evaluate the relationship between the determinants of Internet access and searches for health problems beyond high-income countries. We believe it is in middle-income countries that search methods can make the greatest contribution to public health. PMID:24220250
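
    A brief sketch of the validation step described above: a Spearman rank-order correlation between regional prevalence and relative search volume. The numbers are hypothetical placeholders, not the study's Yandex or Google data.

        import numpy as np
        from scipy import stats

        # hypothetical regional data: HIV prevalence (per 100,000) and relative search volume for the Russian term for "HIV"
        prevalence    = np.array([310, 120, 540, 95, 260, 430, 180, 75, 600, 220])
        search_volume = np.array([ 71,  34,  88, 25,  60,  80,  41, 30,  95,  52])

        rho, p = stats.spearmanr(prevalence, search_volume)
        print(f"Spearman rho = {rho:.3f}, p = {p:.3g}")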

  17. Using Dermoscopic Criteria and Patient-Related Factors for the Management of Pigmented Melanocytic Nevi

    PubMed Central

    Zalaudek, Iris; Docimo, Giovanni; Argenziano, Giuseppe

    2010-01-01

    Objective: To review recent dermoscopy studies that provide new insights into the evolution of nevi and their patterns of pigmentation as they contribute to the diagnosis of nevi and the management of pigmented melanocytic nevi. Data Sources: Data for this article were identified by searching the English and German literature by Medline and Journals@Ovid search for the period 1950 to January 2009. Study Selection: The following relevant terms were used: dermoscopy, dermatoscopy, epiluminescence microscopy (ELM), surface microscopy, digital dermoscopy, digital dermatoscopy, digital epiluminescence microscopy, digital surface microscopy, melanocytic skin lesion, nevi, and pigmented skin lesions. There were no exclusion criteria. Data Synthesis: The dermoscopic diagnosis of nevi relies on the following 4 criteria (each of which is characterized by 4 variables): (1) color (black, brown, gray, and blue); (2) pattern (globular, reticular, starburst, and homogeneous blue pattern); (3) pigment distribution (multifocal, central, eccentric, and uniform); and (4) special sites (face, acral areas, nail, and mucosa). In addition, the following 6 factors related to the patient might influence the pattern of pigmentation of the individual nevi: age, skin type, history of melanoma, UV exposure, pregnancy, and growth dynamics. Conclusions: The 4×4×6 “rule” may help clinicians remember the basic dermoscopic criteria of nevi and the patient-related factors influencing their patterns. Dermoscopy is a useful technique for diagnosing melanocytic nevi, but the clinician should take additional factors into consideration to optimize the management of cases of pigmented lesions. PMID:19620566

  18. Planning representation for automated exploratory data analysis

    NASA Astrophysics Data System (ADS)

    St. Amant, Robert; Cohen, Paul R.

    1994-03-01

    Igor is a knowledge-based system for exploratory statistical analysis of complex systems and environments. Igor has two related goals: to help automate the search for interesting patterns in data sets, and to help develop models that capture significant relationships in the data. We outline a language for Igor, based on techniques of opportunistic planning, which balances control and opportunism. We describe the application of Igor to the analysis of the behavior of Phoenix, an artificial intelligence planning system.

  19. Coupling artificial intelligence and numerical computation for engineering design (Invited paper)

    NASA Astrophysics Data System (ADS)

    Tong, S. S.

    1986-01-01

    The possibility of combining artificial intelligence (AI) systems and numerical computation methods for engineering designs is considered. Attention is given to three possible areas of application involving fan design, controlled vortex design of turbine stage blade angles, and preliminary design of turbine cascade profiles. Among the AI techniques discussed are: knowledge-based systems; intelligent search; and pattern recognition systems. The potential cost and performance advantages of an AI-based design-generation system are discussed in detail.

  20. OBCHS: An Effective Harmony Search Algorithm with Opposition-Based Chaos-Enhanced Initialization for Solving Uncapacitated Facility Location Problems

    NASA Astrophysics Data System (ADS)

    Heidari, A. A.; Kazemizade, O.; Abbaspour, R. A.

    2015-12-01

    In this paper, a continuous harmony search (HS) approach is investigated for tackling the Uncapacitated Facility Location (UFL) task. The article proposes an efficient modified HS-based optimizer to improve the performance of HS on complex spatial tasks such as UFL problems. For this aim, opposition-based learning (OBL) and chaotic patterns are utilized. The proposed technique is examined against several UFL benchmark challenges from the specialized literature. The modified HS is then described in detail and compared to the basic HS and some other methods. The results showed that the new opposition-based chaotic HS (OBCHS) algorithm not only exploits better solutions competently but also outperforms HS in solving UFL problems.
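
    A minimal sketch of the opposition-based chaos-enhanced initialization named in the title: candidate harmonies are generated with a logistic chaotic map, paired with their opposites, and the best half is kept as the initial harmony memory. The map parameters, pool sizes and stand-in objective are illustrative, not the authors' UFL formulation.

        import numpy as np

        def obc_initialize(n_harmonies, lo, hi, seed=0):
            """Opposition-based chaotic initialization (sketch): generate candidates with a
            logistic chaotic map and pair each with its opposite through the range midpoint."""
            rng = np.random.default_rng(seed)
            dim = len(lo)
            x = rng.uniform(0.1, 0.9, size=(n_harmonies, dim))
            for _ in range(50):                      # iterate the logistic map to de-correlate values
                x = 4.0 * x * (1.0 - x)
            chaotic = lo + x * (hi - lo)
            opposite = lo + hi - chaotic             # opposition-based learning
            return np.vstack([chaotic, opposite])

        def objective(p):                            # stand-in cost; a UFL instance would supply its own
            return np.sum((p - 0.3 * (np.arange(len(p)) + 1)) ** 2)

        lo, hi = np.zeros(4), np.ones(4) * 3
        pool = obc_initialize(10, lo, hi)
        costs = np.array([objective(p) for p in pool])
        memory = pool[np.argsort(costs)[:10]]        # harmony memory = best half of the chaotic + opposite pool
        print(memory[0], costs.min())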

  1. Improving parallel I/O autotuning with performance modeling

    DOE PAGES

    Behzad, Babak; Byna, Surendra; Wild, Stefan M.; ...

    2014-01-01

    Various layers of the parallel I/O subsystem offer tunable parameters for improving I/O performance on large-scale computers. However, searching through a large parameter space is challenging. We are working towards an autotuning framework for determining the parallel I/O parameters that can achieve good I/O performance for different data write patterns. In this paper, we characterize parallel I/O and discuss the development of predictive models for use in effectively reducing the parameter space. Furthermore, applying our technique to tuning an I/O kernel derived from a large-scale simulation code shows that the search time can be reduced from 12 hours to 2 hours, while achieving a 54X I/O performance speedup.

  2. High-speed autoverifying technology for printed wiring boards

    NASA Astrophysics Data System (ADS)

    Ando, Moritoshi; Oka, Hiroshi; Okada, Hideo; Sakashita, Yorihiro; Shibutani, Nobumi

    1996-10-01

    We have developed an automated pattern verification technique. The output of an automated optical inspection system contains many false alarms, and verification is needed to distinguish between minor irregularities and serious defects. In the past, this verification was usually done manually, which led to unsatisfactory product quality. The goal of our new automated verification system is to detect pattern features on surface mount technology boards. In our system, we employ a new illumination method that uses multiple colors and multiple illumination directions. Images are captured with a CCD camera. We have developed a new algorithm that uses CAD data for both pattern matching and pattern structure determination. This helps to search for patterns around a defect and to examine defect definition rules. These are processed with a high-speed workstation and hard-wired circuits. The system can verify a defect within 1.5 seconds. The verification system was tested in a factory. It verified 1,500 defective samples and detected all significant defects with only a 0.1 percent error rate (false alarms).

  3. Data Collision Prevention with Overflow Hashing Technique in Closed Hash Searching Process

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Nurjamiyah; Rafika Dewi, Arie

    2017-12-01

    Hash search is a method that can be used in various search processes, such as search engines, sorting, machine learning, neural networks and so on. During the search process, data collisions can occur, and collisions can be prevented in several ways; one of them is the overflow technique. In this study the technique is applied to data of varying lengths and is shown to prevent the occurrence of data collisions.

  4. Effects of Individual Health Topic Familiarity on Activity Patterns During Health Information Searches

    PubMed Central

    Moriyama, Koichi; Fukui, Ken–ichi; Numao, Masayuki

    2015-01-01

    Background Non-medical professionals (consumers) are increasingly using the Internet to support their health information needs. However, the cognitive effort required to perform health information searches is affected by the consumer’s familiarity with health topics. Consumers may have different levels of familiarity with individual health topics. This variation in familiarity may cause misunderstandings because the information presented by search engines may not be understood correctly by the consumers. Objective As a first step toward the improvement of the health information search process, we aimed to examine the effects of health topic familiarity on health information search behaviors by identifying the common search activity patterns exhibited by groups of consumers with different levels of familiarity. Methods Each participant completed a health terminology familiarity questionnaire and health information search tasks. The responses to the familiarity questionnaire were used to grade the familiarity of participants with predefined health topics. The search task data were transcribed into a sequence of search activities using a coding scheme. A computational model was constructed from the sequence data using a Markov chain model to identify the common search patterns in each familiarity group. Results Forty participants were classified into L1 (not familiar), L2 (somewhat familiar), and L3 (familiar) groups based on their questionnaire responses. They had different levels of familiarity with four health topics. The video data obtained from all of the participants were transcribed into 4595 search activities (mean 28.7, SD 23.27 per session). The most frequent search activities and transitions in all the familiarity groups were related to evaluations of the relevancy of selected web pages in the retrieval results. However, the next most frequent transitions differed in each group and a chi-squared test confirmed this finding (P<.001). Next, according to the results of a perplexity evaluation, the health information search patterns were best represented as a 5-gram sequence pattern. The most common patterns in group L1 were frequent query modifications, with relatively low search efficiency, and accessing and evaluating selected results from a health website. Group L2 performed frequent query modifications, but with better search efficiency, and accessed and evaluated selected results from a health website. Finally, the members of group L3 successfully discovered relevant results from the first query submission, performed verification by accessing several health websites after they discovered relevant results, and directly accessed consumer health information websites. Conclusions Familiarity with health topics affects health information search behaviors. Our analysis of state transitions in search activities detected unique behaviors and common search activity patterns in each familiarity group during health information searches. PMID:25783222
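
    A minimal sketch of the modelling step described above: first-order transition probabilities estimated from coded activity sequences, which is the core of a Markov chain model of search behaviour. The activity codes and sessions are invented for illustration and do not correspond to the study's coding scheme.

        import numpy as np

        # hypothetical coded search activities: Q = submit query, E = evaluate result list,
        # A = access page, V = evaluate page content
        sessions = [
            list("QEAVEAVQEAV"),
            list("QEAVAVQEQEAV"),
            list("QQEAVEAV"),
        ]

        states = sorted({s for seq in sessions for s in seq})
        idx = {s: i for i, s in enumerate(states)}
        counts = np.zeros((len(states), len(states)))

        for seq in sessions:
            for a, b in zip(seq, seq[1:]):           # count first-order transitions
                counts[idx[a], idx[b]] += 1

        transition = counts / counts.sum(axis=1, keepdims=True)
        print(states)
        print(np.round(transition, 2))               # row i: P(next activity | current activity = states[i])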

  5. PCI bus content-addressable-memory (CAM) implementation on FPGA for pattern recognition/image retrieval in a distributed environment

    NASA Astrophysics Data System (ADS)

    Megherbi, Dalila B.; Yan, Yin; Tanmay, Parikh; Khoury, Jed; Woods, C. L.

    2004-11-01

    Surveillance and Automatic Target Recognition (ATR) applications are increasing as the cost of the computing power needed to process the massive amount of information continues to fall. This computing power has been made possible partly by the latest advances in FPGAs and SOPCs. In particular, to design and implement state-of-the-art electro-optical imaging systems that provide advanced surveillance capabilities, there is a need to integrate several technologies (e.g. telescopes, precise optics, cameras, and image/computer vision algorithms, which can be geographically distributed or share distributed resources) into programmable and DSP systems. Additionally, pattern recognition techniques and fast information retrieval are often important components of intelligent systems. The aim of this work is to use an embedded FPGA as a fast, configurable and synthesizable search engine for fast image pattern recognition/retrieval in a distributed hardware/software co-design environment. In particular, we propose and demonstrate a low-cost Content Addressable Memory (CAM)-based distributed embedded FPGA hardware architecture with real-time recognition capabilities for pattern look-up, pattern recognition, and image retrieval. We show how the distributed CAM-based architecture offers a performance advantage of an order of magnitude over RAM (Random Access Memory)-based search for implementing high-speed pattern recognition for image retrieval. The methods of designing, implementing, and analyzing the proposed CAM-based embedded architecture are described here, and other SOPC solutions and design issues are covered. Finally, experimental results, hardware verification, and performance evaluations using both the Xilinx Virtex-II and the Altera Apex20k are provided to show the potential and power of the proposed method for low-cost reconfigurable fast image pattern recognition/retrieval at the hardware/software co-design level.

  6. Additive Manufacturing Techniques in Prosthodontics: Where Do We Currently Stand? A Critical Review.

    PubMed

    Alharbi, Nawal; Wismeijer, Daniel; Osman, Reham B

    The aim of this article was to critically review the current application of additive manufacturing (AM)/3D-printing techniques in prosthodontics and to highlight the influence of various technical factors involved in different AM technologies. A standard approach of searching MEDLINE, EMBASE, and Google Scholar databases was followed. The following search terms were used: (Prosth* OR Restoration) AND (Prototype OR Additive Manufacture* OR Compute* OR 3D-print* OR CAD/CAM) AND (Dentistry OR Dental). Hand searching the reference lists of the included articles and personal connections revealed additional relevant articles. Selection criteria were any article written in English and reporting on the application of AM in prosthodontics from 1990 to February 2016. From a total of 4,290 articles identified, 33 were seen as relevant. Of these, 3 were narrative reviews, 18 were in vitro studies, and 12 were clinical in vivo studies. Different AM technologies are applied in prosthodontics, directly and indirectly for the fabrication of fixed metal copings, metal frameworks for removable partial dentures, and plastic mock-ups and resin patterns for further conventional metal castings. Technical factors involved in different AM techniques influence the overall quality, the mechanical properties of the printed parts, and the total cost and manufacturing time. AM is promising and offers new possibilities in the field of prosthodontics, though its application is still limited. An understanding of these limitations and of developments in material science is crucial prior to considering AM as an acceptable method for the fabrication of dental prostheses.

  7. A smart way to identify and extract repeated patterns of a layout

    NASA Astrophysics Data System (ADS)

    Wei, Fang; Gu, Tingting; Chu, Zhihao; Zhang, Chenming; Chen, Han; Zhu, Jun; Hu, Xinyi; Du, Chunshan; Wan, Qijian; Liu, Zhengfang

    2018-03-01

    As integrated circuit (IC) technology moves forward, the manufacturing process is facing more and more challenges, and optical proximity correction (OPC) has been playing an important role in the whole manufacturing process. In deep sub-micron technology, OPC engineers not only need to guarantee that layout designs are manufacturable but must also take more precise control of critical patterns to ensure a high-performance circuit. One of the tasks to be performed is consistency checking, since identical patterns under identical context should in theory have identical OPC results, as in SRAM regions. Consistency checking is essentially a technique of repeated-pattern identification, extraction and derived-pattern (i.e. OPC result) comparison. The layout passed to the OPC team may not carry enough design hierarchy information, either because the original design has undergone several layout processing steps or for other unknown reasons. This paper presents a generic way to identify and extract repeated layout structures in SRAM regions purely based on layout pattern analysis through Calibre Pattern Matching and Calibre equation-based DRC (eqDRC). Without Pattern Matching and eqDRC, it takes great effort to do this manually by trial and error, and it is almost impossible to automate the pattern analysis process; combining Pattern Matching and eqDRC opens a new way to implement this flow. The repeated patterns have fundamental features whose pitches can be measured separately in the horizontal and vertical directions by Calibre eqDRC, and these features also help generate anchor points that serve as starting points for Pattern Matching to capture patterns. The informative statistical report from the pattern search gives the match counts individually for each captured pattern. Experiments show that this is an effective way of identifying and extracting repeated structures. The OPC results are derived layers on these repeated structures; by running the pattern search with design layers as pattern layers and OPC results as marker layers, it is easy to compare their consistency.

  8. Real view radiology-impact on search patterns and confidence in radiology education.

    PubMed

    Bailey, Jared H; Roth, Trenton D; Kohli, Mark D; Heitkamp, Darel E

    2014-07-01

    Search patterns are important for radiologists because they enable systematic case review. Because radiology residents are exposed to so many imaging modalities and anatomic regions, and they rotate on-and-off service so frequently, they may have difficulty establishing effective search patterns. We developed Real View Radiology (RVR), an educational system founded on guided magnetic resonance imaging (MRI) case review and evaluated its impact on search patterns and interpretative confidence of junior radiology residents. RVR guides learners through unknown examinations by sequentially prompting learners to certain aspects of a case via a comprehensive question set and then providing immediate feedback. Junior residents first completed a brief evaluation regarding their level of confidence when interpreting certain joint MRI cases and frequency of search pattern use. They spent four half-days interpreting cases using RVR. Once finished, they repeated the evaluations. The junior resident results were compared to third-year residents who had not used RVR. The data were analyzed for change in confidence, use of search patterns, and number of cases completed. Twelve first-year and thirteen second-year residents (trained cohort) were enrolled in the study. During their 4-week musculoskeletal rotations, they completed on average 29.3 MRI knee (standard deviation [SD], 1.6) and 17.4 shoulder (SD, 1.2) cases using RVR. Overall search pattern scores of the trained cohort increased significantly both from pretraining to posttraining (knee P < .01, shoulder P < .01) and compared to the untrained third-year residents (knee (P < .01, and shoulder P < .01). The trained cohort confidence scores also increased significantly from pre to post for all joints (knee P < .01, shoulder P < .01, pelvis P < .01, and ankle P < .01). Radiology residents can increase their MRI case interpretation confidence and improve the consistency of search pattern use by training with a question-based sequential reveal educational program. RVR could be used to supplement training and assist with search pattern creation in areas in which residents often do not acquire adequate clinical exposure. Copyright © 2014 AUR. Published by Elsevier Inc. All rights reserved.

  9. A novel directional asymmetric sampling search algorithm for fast block-matching motion estimation

    NASA Astrophysics Data System (ADS)

    Li, Yue-e.; Wang, Qiang

    2011-11-01

    This paper proposes a novel directional asymmetric sampling search (DASS) algorithm for video compression. Making full use of the error information (block distortions) of the search patterns, eight different directional search patterns are designed for various situations. A local sampling search strategy is employed for the search of large motion vectors. In order to further speed up the search, an early termination strategy is adopted in the DASS procedure. Compared to conventional fast algorithms, the proposed method achieves the most satisfactory PSNR values for all test sequences.

  10. PROSPECT improves cis-acting regulatory element prediction by integrating expression profile data with consensus pattern searches

    PubMed Central

    Fujibuchi, Wataru; Anderson, John S. J.; Landsman, David

    2001-01-01

    Consensus pattern and matrix-based searches designed to predict cis-acting transcriptional regulatory sequences have historically been subject to large numbers of false positives. We sought to decrease false positives by incorporating expression profile data into a consensus pattern-based search method. We have systematically analyzed the expression phenotypes of over 6000 yeast genes, across 121 expression profile experiments, and correlated them with the distribution of 14 known regulatory elements over sequences upstream of the genes. Our method is based on a metric we term probabilistic element assessment (PEA), which is a ranking of potential sites based on sequence similarity in the upstream regions of genes with similar expression phenotypes. For eight of the 14 known elements that we examined, our method had a much higher selectivity than a naïve consensus pattern search. Based on our analysis, we have developed a web-based tool called PROSPECT, which allows consensus pattern-based searching of gene clusters obtained from microarray data. PMID:11574681

  11. The use of geoscience methods for terrestrial forensic searches

    NASA Astrophysics Data System (ADS)

    Pringle, J. K.; Ruffell, A.; Jervis, J. R.; Donnelly, L.; McKinley, J.; Hansen, J.; Morgan, R.; Pirrie, D.; Harrison, M.

    2012-08-01

    Geoscience methods are increasingly being utilised in criminal, environmental and humanitarian forensic investigations, and the use of such methods is supported by a growing body of experimental and theoretical research. Geoscience search techniques can complement traditional methodologies in the search for buried objects, including clandestine graves, weapons, explosives, drugs, illegal weapons, hazardous waste and vehicles. This paper details recent advances in search and detection methods, with case studies and reviews. Relevant examples are given, together with a generalised search workflow and a table of suggested detection techniques. Forensic geoscience techniques continue to evolve rapidly to assist search investigators in detecting forensic targets that have hitherto been difficult to locate.

  12. Efficient sequential and parallel algorithms for finding edit distance based motifs.

    PubMed

    Pal, Soumitra; Xiao, Peng; Rajasekaran, Sanguthevar

    2016-08-18

    Motif search is an important step in extracting meaningful patterns from biological data. The general problem of motif search is intractable, and there is a pressing need to develop efficient, exact and approximation algorithms to solve this problem. In this paper, we present several novel, exact, sequential and parallel algorithms for solving the (l,d) Edit-distance-based Motif Search (EMS) problem: given two integers l,d and n biological strings, find all strings of length l that appear in each input string with at most d errors of the types substitution, insertion and deletion. One popular technique to solve the problem is to explore, for each input string, the set of all possible l-mers that belong to the d-neighborhood of any substring of the input string and output those which are common to all input strings. We introduce a novel and provably efficient neighborhood exploration technique. We show that it is enough to consider the candidates in the neighborhood which are at a distance of exactly d. We compactly represent these candidate motifs using wildcard characters and efficiently explore them with very few repetitions. Our sequential algorithm uses a trie-based data structure to efficiently store and sort the candidate motifs. Our parallel algorithm, in a multi-core shared memory setting, uses arrays for storing and a novel modification of radix sort for sorting the candidate motifs. Algorithms for EMS are customarily evaluated on several challenging instances such as (8,1), (12,2), (16,3), (20,4), and so on. The best previously known algorithm, EMS1, is sequential and solves instances up to (16,3) in an estimated 3 days. Our sequential algorithms are more than 20 times faster on (16,3). On other hard instances such as (9,2), (11,3), (13,4), our algorithms are much faster. Our parallel algorithm achieves more than 600% scaling performance while using 16 threads. Our algorithms have pushed up the state of the art of EMS solvers, and we believe that the techniques introduced in this paper are also applicable to other motif search problems such as Planted Motif Search (PMS) and Simple Motif Search (SMS).
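
    A minimal brute-force sketch of the (l, d) EMS problem statement given above, checking every candidate l-mer against every input string under edit distance; it conveys the problem definition only and none of the neighborhood-exploration, trie or radix-sort techniques that make the paper's algorithms practical.

        from itertools import product

        def edit_distance_le(a, b, d):
            """Classic dynamic-programming edit distance; True if distance(a, b) <= d."""
            m, n = len(a), len(b)
            prev = list(range(n + 1))
            for i in range(1, m + 1):
                cur = [i] + [0] * n
                for j in range(1, n + 1):
                    cur[j] = min(prev[j] + 1,                           # deletion
                                 cur[j - 1] + 1,                        # insertion
                                 prev[j - 1] + (a[i - 1] != b[j - 1]))  # substitution / match
                prev = cur
            return prev[n] <= d

        def occurs_with_errors(motif, s, d):
            """True if motif matches some substring of s with edit distance <= d."""
            l = len(motif)
            for start in range(len(s)):
                for length in range(max(0, l - d), min(len(s) - start, l + d) + 1):
                    if edit_distance_le(motif, s[start:start + length], d):
                        return True
            return False

        def ems_bruteforce(strings, l, d, alphabet="ACGT"):
            """Exhaustive (l, d) edit-distance motif search; exponential in l, for tiny instances only."""
            return ["".join(m) for m in product(alphabet, repeat=l)
                    if all(occurs_with_errors("".join(m), s, d) for s in strings)]

        strings = ["ACGTTGCA", "AAGTTGCC", "TTACGTTG"]
        print(ems_bruteforce(strings, l=5, d=1))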

  13. LSHSIM: A Locality Sensitive Hashing based method for multiple-point geostatistics

    NASA Astrophysics Data System (ADS)

    Moura, Pedro; Laber, Eduardo; Lopes, Hélio; Mesejo, Daniel; Pavanelli, Lucas; Jardim, João; Thiesen, Francisco; Pujol, Gabriel

    2017-10-01

    Reservoir modeling is a very important task that permits the representation of a geological region of interest, so as to generate a considerable number of possible scenarios. Since its inception, many methodologies have been proposed and, in the last two decades, multiple-point geostatistics (MPS) has been the dominant one. This methodology is strongly based on the concept of a training image (TI) and the use of its characteristics, which are called patterns. In this paper, we propose a new MPS method that combines the application of a technique called Locality Sensitive Hashing (LSH), which accelerates the search for patterns similar to a target one, with a Run-Length Encoding (RLE) compression technique that speeds up the calculation of the Hamming similarity. Experiments with both categorical and continuous images show that LSHSIM is computationally efficient and produces good quality realizations. In particular, for categorical data, the results suggest that LSHSIM is faster than MS-CCSIM, one of the state-of-the-art methods.
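
    A minimal sketch of the Locality Sensitive Hashing step described above, using bit-sampling LSH for Hamming space over binarized patterns and an exact Hamming re-ranking of the colliding candidates; the pattern sizes, table counts and data are illustrative, and the RLE compression of the similarity computation is omitted.

        import numpy as np

        def build_lsh_index(patterns, n_tables=8, bits_per_key=8, seed=0):
            """Bit-sampling LSH for Hamming space: each table keys a pattern by a random
            subset of its cells, so similar patterns tend to collide in some bucket."""
            rng = np.random.default_rng(seed)
            dim = patterns.shape[1]
            projections = [rng.choice(dim, size=bits_per_key, replace=False) for _ in range(n_tables)]
            tables = [{} for _ in range(n_tables)]
            for i, p in enumerate(patterns):
                for proj, table in zip(projections, tables):
                    table.setdefault(tuple(p[proj]), []).append(i)
            return projections, tables

        def nearest_pattern(target, patterns, projections, tables):
            """Gather LSH candidates, then rank the candidates by exact Hamming distance."""
            candidates = set()
            for proj, table in zip(projections, tables):
                candidates.update(table.get(tuple(target[proj]), []))
            if not candidates:
                return None
            return min(candidates, key=lambda i: int(np.count_nonzero(patterns[i] != target)))

        rng = np.random.default_rng(1)
        patterns = rng.integers(0, 2, size=(500, 64))      # e.g. binarized 8x8 training-image patterns
        target = patterns[123].copy()
        target[:4] ^= 1                                    # perturb four cells of pattern 123
        projections, tables = build_lsh_index(patterns)
        print(nearest_pattern(target, patterns, projections, tables))   # should recover index 123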

  14. A Moire Fringing Spectrometer for Extra-Solar Planet Searches

    NASA Astrophysics Data System (ADS)

    van Eyken, J. C.; Ge, J.; Mahadevan, S.; De Witt, C.; Ramsey, L. W.; Berger, D.; Shaklan, S.; Pan, X.

    2001-12-01

    We have developed a prototype moire fringing spectrometer for high-precision radial velocity measurements for the detection of extra-solar planets. This combination of Michelson interferometer and spectrograph overlays an interferometer comb on a medium-resolution stellar spectrum, producing Moire patterns. Small changes in the doppler shift of the spectrum lead to corresponding large shifts in the Moire pattern (Moire magnification). The sinusoidal shape of the Moire fringes enables much simpler measurement of these shifts than in standard echelle spectrograph techniques, facilitating high-precision measurements with a low-cost instrument. The data analysis software we have developed has produced short-term repeatability (over a few hours) of 5-10 m/s, and future planned improvements based on previous experiments should reduce this significantly. We plan eventually to carry out large-scale surveys for low-mass companions around other stars. This poster will present new results obtained in the lab and at the HET and Palomar 5m telescopes, the theory of the instrument, and data analysis techniques.

  15. [Advanced online search techniques and dedicated search engines for physicians].

    PubMed

    Nahum, Yoav

    2008-02-01

    In recent years search engines have become an essential tool in the work of physicians. This article will review advanced search techniques from the world of information specialists, as well as some advanced search engine operators that may help physicians improve their online search capabilities, and maximize the yield of their searches. This article also reviews popular dedicated scientific and biomedical literature search engines.

  16. Target intersection probabilities for parallel-line and continuous-grid types of search

    USGS Publications Warehouse

    McCammon, R.B.

    1977-01-01

    The expressions for calculating the probability of intersection of hidden targets of different sizes and shapes for parallel-line and continuous-grid types of search can be formulated by using the concept of conditional probability. When the prior probability of the orientation of a hidden target is represented by a uniform distribution, the calculated posterior probabilities are identical with the results obtained by the classic methods of probability. For hidden targets of different sizes and shapes, the following generalizations about the probability of intersection can be made: (1) to a first approximation, the probability of intersection of a hidden target is proportional to the ratio of the greatest dimension of the target (viewed in plane projection) to the minimum line spacing of the search pattern; (2) the shape of the hidden target does not greatly affect the probability of intersection when the largest dimension of the target is small relative to the minimum spacing of the search pattern; (3) the probability of intersecting a target twice for a particular type of search can be used as a lower bound if there is an element of uncertainty of detection for a particular type of tool; (4) the geometry of the search pattern becomes more critical when the largest dimension of the target equals or exceeds the minimum spacing of the search pattern; (5) for elongate targets, the probability of intersection is greater for parallel-line search than for an equivalent continuous square-grid search when the largest dimension of the target is less than the minimum spacing of the search pattern, whereas the opposite is true when the largest dimension exceeds the minimum spacing; (6) the probability of intersection for nonorthogonal continuous-grid search patterns is not greatly different from the probability of intersection for the equivalent orthogonal continuous-grid pattern when the orientation of the target is unknown. The probability of intersection for an elliptically shaped target can be approximated by treating the ellipse as intermediate between a circle and a line. A search conducted along a continuous rectangular grid can be represented as intermediate between a search along parallel lines and along a continuous square grid. On this basis, an upper and lower bound for the probability of intersection of an elliptically shaped target for a continuous rectangular grid can be calculated. Charts have been constructed that permit the values for these probabilities to be obtained graphically. The use of conditional probability allows the explorationist greater flexibility in considering alternate search strategies for locating hidden targets. © 1977 Plenum Publishing Corp.
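
    Generalization (1) above corresponds to a standard integral-geometry (Cauchy/Buffon) result, restated here as a supplementary illustration rather than the author's own derivation: for a convex target of perimeter C, hence mean width w-bar = C/pi, dropped with uniformly random position and orientation onto parallel search lines spaced D apart (with w-bar < D),

        \[
          P(\text{intersection}) \;=\; \frac{\bar{w}}{D} \;=\; \frac{C}{\pi D}
          \;\approx\; \frac{2L}{\pi D}
          \quad\text{for a thin elongate target of greatest dimension } L,
        \]

    which makes explicit that, to first order, the probability scales with the ratio of the target's greatest dimension to the line spacing and depends on target shape only through the perimeter, consistent with generalizations (1) and (2).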

  17. Application of multivariable search techniques to the optimization of airfoils in a low speed nonlinear inviscid flow field

    NASA Technical Reports Server (NTRS)

    Hague, D. S.; Merz, A. W.

    1975-01-01

    Multivariable search techniques are applied to a particular class of airfoil optimization problems: the maximization of lift and the minimization of disturbance pressure magnitude in an inviscid nonlinear flow field. A variety of multivariable search techniques contained in an existing nonlinear optimization code, AESOP, are applied to this design problem. These techniques include elementary single-parameter perturbation methods, organized searches such as steepest-descent, quadratic, and Davidon methods, randomized procedures, and a generalized search acceleration technique. Airfoil design variables are seven in number and define perturbations to the profile of an existing NACA airfoil. The relative efficiency of the techniques is compared. It is shown that elementary one-parameter-at-a-time and random techniques compare favorably with organized searches for the class of problems considered. It is also shown that significant reductions in disturbance pressure magnitude can be made while retaining reasonable lift coefficient values at low free-stream Mach numbers.
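
    The elementary single-parameter perturbation strategy referred to above can be sketched in a few lines. The following minimal Python loop is illustrative rather than a reproduction of AESOP; the quadratic toy objective stands in for a flow-solver evaluation of lift or disturbance pressure.

    ```python
    import numpy as np

    def one_at_a_time_search(f, x0, step=0.1, shrink=0.5, tol=1e-6, max_iter=200):
        """Elementary one-parameter-at-a-time perturbation search (minimization).
        Perturb each design variable in turn, keep any change that improves f,
        and shrink the step when no single-variable move helps."""
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        for _ in range(max_iter):
            improved = False
            for i in range(x.size):
                for delta in (step, -step):
                    trial = x.copy()
                    trial[i] += delta
                    ft = f(trial)
                    if ft < fx:
                        x, fx, improved = trial, ft, True
                        break
            if not improved:
                step *= shrink
                if step < tol:
                    break
        return x, fx

    # Toy stand-in for the airfoil objective (the real one requires a flow solver).
    quad = lambda x: np.sum((x - 0.3) ** 2)
    print(one_at_a_time_search(quad, np.zeros(7)))
    ```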

  18. Age differences in visual search for compound patterns: long- versus short-range grouping.

    PubMed

    Burack, J A; Enns, J T; Iarocci, G; Randolph, B

    2000-11-01

    Visual search for compound patterns was examined in observers aged 6, 8, 10, and 22 years. The main question was whether age-related improvement in search rate (response time slope over number of items) was different for patterns defined by short- versus long-range spatial relations. Perceptual access to each type of relation was varied by using elements of same contrast (easy to access) or mixed contrast (hard to access). The results showed large improvements with age in search rate for long-range targets; search rate for short-range targets was fairly constant across age. This pattern held regardless of whether perceptual access to a target was easy or hard, supporting the hypothesis that different processes are involved in perceptual grouping at these two levels. The results also point to important links between ontogenic and microgenic change in perception (H. Werner, 1948, 1957).

  19. Are visual cue masking and removal techniques equivalent for studying perceptual skills in sport?

    PubMed

    Mecheri, Sami; Gillet, Eric; Thouvarecq, Regis; Leroy, David

    2011-01-01

    The spatial-occlusion paradigm makes use of two techniques (masking and removing visual cues) to provide information about the anticipatory cues used by viewers. The visual scene produced by the removal technique appears incongruous, yet the two techniques are increasingly assumed to be equivalent. The present study was designed to address this issue by combining eye-movement recording with the two types of occlusion (removal versus masking) in a tennis serve-return task. Response accuracy and decision onsets were analysed. The results indicated that subjects had longer reaction times under the removal condition, with an identical proportion of correct responses. The removal technique also caused the subjects to rely on atypical search patterns. Our findings suggest that, when the removal technique was used, viewers were unable to systematically draw on stored memories to help them accomplish the interception task. The persistent failure to question some of the assumptions about the removal technique in applied visual research is highlighted, and suggestions for continued use of the masking technique are advanced.

  20. Development of a flexible circuit board for low-background experiments

    NASA Astrophysics Data System (ADS)

    Poon, Alan; Barton, Paul; Dhar, Ankur; Larsen, Joern; Loach, James

    2017-01-01

    Future underground rare-event search experiments, such as neutrinoless double-beta decay searches, have stringent requirements for the radiopurity of materials placed near the active detector medium. Parylene is a polymer that has a high chemical purity, and the vapor deposition process by which it is laid down tends to purify it further. In this talk the technique to fabricate a low-mass, flexible circuit board, with conductive traces photolithographically patterned on a parylene substrate, is discussed. The performance of a proof-of-principle temperature sensor is presented. This work was supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, under Contract No. DE-AC02-05CH11231 and by the Shanghai Key Lab for Particle Physics and Cosmology (SKLPPC), Grant No. 15DZ2272100.

  1. Macromolecular powder diffraction: structure solution via molecular replacement.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebbler, J.; Von Dreele, R.; X-Ray Science Division

    Macromolecular powder diffraction is a burgeoning technique for protein structure solution - ideally suited for cases where no suitable single crystals are available. Over the past seven years, pioneering work by Von Dreele et al. [1,2] and Margiolaki et al. [3,4] has demonstrated the viability of this approach for several protein structures. Among these initial powder studies, molecular replacement solutions of insulin and turkey lysozyme into alternate space groups were accomplished. Pressing the technique further, Margiolaki et al. [5] executed the first molecular replacement of an unknown protein structure: the SH3 domain of ponsin, using data from a multianalyzer diffractometer. To demonstrate that cross-species molecular replacement using image plate data is also possible, we present the solution of hen egg white lysozyme using the 60% identical human lysozyme (PDB code: 1LZ1) as the search model. Due to the high incidence of overlaps in powder patterns, especially in more complex structures, we have used extracted intensities from five data sets taken at different salt concentrations in a multi-pattern Pawley refinement. The use of image plates severely increases the overlap problem due to lower detector resolution, but radiation damage effects are minimized with shorter exposure times and the fact that the entire pattern is obtained in a single exposure. This image plate solution establishes the robustness of powder molecular replacement resulting from different data collection techniques.

  2. A Survey in Indexing and Searching XML Documents.

    ERIC Educational Resources Information Center

    Luk, Robert W. P.; Leong, H. V.; Dillon, Tharam S.; Chan, Alvin T. S.; Croft, W. Bruce; Allan, James

    2002-01-01

    Discussion of XML focuses on indexing techniques for XML documents, grouping them into flat-file, semistructured, and structured indexing paradigms. Highlights include searching techniques, including full text search and multistage search; search result presentations; database and information retrieval system integration; XML query languages; and…

  3. The impacts of cognitive-behavioral therapy on the treatment of phobic disorders measured by functional neuroimaging techniques: a systematic review.

    PubMed

    Galvao-de Almeida, Amanda; Araujo Filho, Gerardo Maria de; Berberian, Arthur de Almeida; Trezsniak, Clarissa; Nery-Fernandes, Fabiana; Araujo Neto, Cesar Augusto; Jackowski, Andrea Parolin; Miranda-Scippa, Angela; Oliveira, Irismar Reis de

    2013-01-01

    Functional neuroimaging techniques represent fundamental tools in the context of translational research integrating neurobiology, psychopathology, neuropsychology, and therapeutics. In addition, cognitive-behavioral therapy (CBT) has proven its efficacy in the treatment of anxiety disorders and may be useful in phobias. The literature has shown that feelings and behaviors are mediated by specific brain circuits, and changes in patterns of interaction should be associated with cerebral alterations. Based on these concepts, a systematic review was conducted aiming to evaluate the impact of CBT on phobic disorders measured by functional neuroimaging techniques. A systematic review of the literature was conducted including studies published between January 1980 and April 2012. Studies written in English, Spanish or Portuguese evaluating changes in the pattern of functional neuroimaging before and after CBT in patients with phobic disorders were included. The initial search strategy retrieved 45 studies. Six of these studies met all inclusion criteria. Significant deactivations in the amygdala, insula, thalamus and hippocampus, as well as activation of the medial orbitofrontal cortex, were observed after CBT in phobic patients when compared with controls. In spite of their technical limitations, neuroimaging techniques provide neurobiological support for the efficacy of CBT in the treatment of phobic disorders. Further studies are needed to confirm this conclusion.

  4. Unravelling associations between unassigned mass spectrometry peaks with frequent itemset mining techniques.

    PubMed

    Vu, Trung Nghia; Mrzic, Aida; Valkenborg, Dirk; Maes, Evelyne; Lemière, Filip; Goethals, Bart; Laukens, Kris

    2014-01-01

    Mass spectrometry-based proteomics experiments generate spectra that are rich in information. Often only a fraction of this information is used for peptide/protein identification, whereas a significant proportion of the peaks in a spectrum remain unexplained. In this paper we explore how a specific class of data mining techniques termed "frequent itemset mining" can be employed to discover patterns in the unassigned data, and how such patterns can help us interpret the origin of the unexpected/unexplained peaks. First a model is proposed that describes the origin of the observed peaks in a mass spectrum. For this purpose we use the classical correlative database search algorithm. Peaks that support a positive identification of the spectrum are termed explained peaks. Next, frequent itemset mining techniques are introduced to infer which unexplained peaks are associated in a spectrum. The method is validated on two types of experimental proteomic data. First, peptide mass fingerprint data is analyzed to explain the unassigned peaks in a full scan mass spectrum. Interestingly, a large number of experimental spectra reveal several highly frequent unexplained masses, and pattern mining on these frequent masses demonstrates that subsets of these peaks frequently co-occur. Further evaluation shows that several of these co-occurring peaks indeed have a known common origin, and other patterns are promising hypothesis generators for further analysis. Second, the proposed methodology is validated on tandem mass spectrometry data using a public spectral library, where associations between the mass differences of unassigned peaks and peptide modifications are explored. The investigation of the found patterns illustrates that meaningful patterns can be discovered that can be explained by features of the employed technology and found modifications. This simple approach offers opportunities to monitor accumulating unexplained mass spectrometry data for emerging new patterns, with possible applications for the development of mass exclusion lists, for the refinement of quality control strategies and for a further interpretation of unexplained spectral peaks in mass spectrometry and tandem mass spectrometry.
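
    The core idea of mining co-occurring unassigned peaks can be illustrated with a small sketch. The code below is a naive, brute-force stand-in for a proper frequent itemset miner (the bin width, support threshold, and example masses are arbitrary).

    ```python
    from itertools import combinations
    from collections import Counter

    def frequent_itemsets(spectra, min_support=0.2, max_size=3, bin_width=0.5):
        """Naive frequent-itemset mining over unassigned peak masses.
        Each spectrum is a list of m/z values; peaks are binned so that
        co-occurring masses can be counted across spectra. A real analysis
        would use an Apriori or FP-growth implementation instead of brute force."""
        binned = [frozenset(round(mz / bin_width) for mz in peaks) for peaks in spectra]
        n = len(binned)
        results = {}
        for size in range(1, max_size + 1):
            counts = Counter()
            for items in binned:
                for combo in combinations(sorted(items), size):
                    counts[combo] += 1
            for combo, c in counts.items():
                if c / n >= min_support:
                    # report itemsets in original mass units
                    results[tuple(b * bin_width for b in combo)] = c / n
        return results

    example = [[120.1, 147.0, 263.2], [120.2, 147.1, 301.0], [147.0, 263.3, 410.5]]
    print(frequent_itemsets(example, min_support=0.6, max_size=2))
    ```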

  5. Beam angle optimization for intensity-modulated radiation therapy using a guided pattern search method

    NASA Astrophysics Data System (ADS)

    Rocha, Humberto; Dias, Joana M.; Ferreira, Brígida C.; Lopes, Maria C.

    2013-05-01

    Generally, the inverse planning of radiation therapy consists mainly of the fluence optimization. The beam angle optimization (BAO) in intensity-modulated radiation therapy (IMRT) consists of selecting appropriate radiation incidence directions and may influence the quality of the IMRT plans, both by enhancing organ sparing and by improving tumor coverage. However, in clinical practice, most of the time, beam directions continue to be manually selected by the treatment planner without objective and rigorous criteria. The goal of this paper is to introduce a novel approach that uses beam’s-eye-view dose ray tracing metrics within a pattern search method framework in the optimization of the highly non-convex BAO problem. Pattern search methods are derivative-free optimization methods that require few function evaluations to progress and converge and have the ability to better avoid local entrapment. The pattern search method framework is composed of a search step and a poll step at each iteration. The poll step performs a local search in a mesh neighborhood and ensures the convergence to a local minimizer or stationary point. The search step provides the flexibility for a global search since it allows searches away from the neighborhood of the current iterate. Beam’s-eye-view dose metrics assign a score to each radiation beam direction and can be used within the pattern search framework to furnish a priori knowledge of the problem so that directions with larger dosimetric scores are tested first. A set of clinical cases of head-and-neck tumors treated at the Portuguese Institute of Oncology of Coimbra is used to discuss the potential of this approach in the optimization of the BAO problem.
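
    The poll step of a pattern search is simple to sketch. The following minimal Python routine is illustrative only: it polls the coordinate directions of a beam-angle vector on a shrinking mesh and uses a toy surrogate in place of a real beam's-eye-view dose metric.

    ```python
    import numpy as np

    def pattern_search(f, x0, step=8.0, shrink=0.5, min_step=0.5):
        """Minimal pattern search (poll step only) for a black-box objective
        such as a beam-angle fitness measure. At each iteration the coordinate
        directions are polled on the current mesh; if no move improves the
        objective, the mesh size is halved."""
        x = np.asarray(x0, dtype=float)
        fx = f(x)
        n = x.size
        while step >= min_step:
            improved = False
            for i in range(n):
                for sgn in (+1.0, -1.0):
                    trial = x.copy()
                    trial[i] = (trial[i] + sgn * step) % 360.0  # beam angles wrap around
                    ft = f(trial)
                    if ft < fx:
                        x, fx, improved = trial, ft, True
                        break
                if improved:
                    break
            if not improved:
                step *= shrink
        return x, fx

    # Toy surrogate score, not a real dose model: prefer beams far from 180 degrees.
    toy = lambda angles: float(np.sum(np.cos(np.radians(angles - 180.0))))
    print(pattern_search(toy, np.array([0.0, 72.0, 144.0, 216.0, 288.0])))
    ```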

  6. Space communications scheduler: A rule-based approach to adaptive deadline scheduling

    NASA Technical Reports Server (NTRS)

    Straguzzi, Nicholas

    1990-01-01

    Job scheduling is a deceptively complex subfield of computer science. The highly combinatorial nature of the problem, which is NP-complete in nearly all cases, requires a scheduling program to intelligently traverse an immense search tree to create the best possible schedule in a minimal amount of time. In addition, the program must continually make adjustments to the initial schedule when faced with last-minute user requests, cancellations, unexpected device failures, etc. A good scheduler must be quick, flexible, and efficient, even at the expense of generating slightly less-than-optimal schedules. The Space Communication Scheduler (SCS) is an intelligent rule-based scheduling system. SCS is an adaptive deadline scheduler which allocates modular communications resources to meet an ordered set of user-specified job requests on board the NASA Space Station. SCS uses pattern matching techniques to detect potential conflicts through algorithmic and heuristic means. As a result, the system generates and maintains high density schedules without relying heavily on backtracking or blind search techniques. SCS is suitable for many common real-world applications.

  7. SpolSimilaritySearch - A web tool to compare and search similarities between spoligotypes of Mycobacterium tuberculosis complex.

    PubMed

    Couvin, David; Zozio, Thierry; Rastogi, Nalin

    2017-07-01

    Spoligotyping is one of the most commonly used polymerase chain reaction (PCR)-based methods for identification and study of genetic diversity of the Mycobacterium tuberculosis complex (MTBC). Despite its known limitations if used alone, the methodology is particularly useful when used in combination with other methods such as mycobacterial interspersed repetitive units - variable number of tandem DNA repeats (MIRU-VNTRs). At a worldwide scale, spoligotyping has allowed identification of information on 103,856 MTBC isolates (corresponding to 98,049 clustered strains plus 5,807 unique isolates from 169 countries of patient origin) contained within the SITVIT2 proprietary database of the Institut Pasteur de la Guadeloupe. The SpolSimilaritySearch web-tool described herein (available at: http://www.pasteur-guadeloupe.fr:8081/SpolSimilaritySearch) incorporates a similarity search algorithm allowing users to get a complete overview of similar spoligotype patterns (with information on presence or absence of 43 spacers) in the aforementioned worldwide database. This tool allows one to analyze spread and evolutionary patterns of MTBC by comparing similar spoligotype patterns, to distinguish between widespread, specific and/or confined patterns, as well as to pinpoint patterns with large deleted blocks, which play an intriguing role in the genetic epidemiology of M. tuberculosis. Finally, the SpolSimilaritySearch tool also provides the country distribution patterns for each queried spoligotype. Copyright © 2017 Elsevier Ltd. All rights reserved.
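
    A spoligotype is essentially a 43-position binary pattern, so a similarity search over such patterns can be sketched very compactly. The code below is an illustrative stand-in and does not reproduce the scoring actually used by SpolSimilaritySearch.

    ```python
    def spoligotype_similarity(query, other):
        """Fraction of the 43 spacer positions at which two spoligotype
        patterns agree ('1' = spacer present, '0' = spacer absent)."""
        assert len(query) == len(other) == 43
        return sum(a == b for a, b in zip(query, other)) / 43

    def rank_similar(query, database):
        """Return database patterns sorted by decreasing similarity to the query."""
        return sorted(database, key=lambda p: spoligotype_similarity(query, p), reverse=True)

    # Toy database of three patterns; real queries would run against SITVIT2-style records.
    db = ["1" * 43, "1" * 33 + "0" * 10, "0" * 43]
    print(rank_similar("1" * 38 + "0" * 5, db))
    ```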

  8. Feature generation using genetic programming with application to fault classification.

    PubMed

    Guo, Hong; Jack, Lindsay B; Nandi, Asoke K

    2005-02-01

    One of the major challenges in pattern recognition problems is the feature extraction process, which derives new features from existing features, or directly from raw data, in order to reduce the cost of computation during the classification process while improving classifier efficiency. Most current feature extraction techniques transform the original pattern vector into a new vector with increased discrimination capability but lower dimensionality. This is conducted within a predefined feature space and thus has limited searching power. Genetic programming (GP) can generate new features from the original dataset without prior knowledge of the probabilistic distribution. In this paper, a GP-based approach is developed for feature extraction from raw vibration data recorded from a rotating machine with six different conditions. The created features are then used as the inputs to a neural classifier for the identification of six bearing conditions. Experimental results demonstrate the ability of GP to automatically discover the different bearing conditions using features expressed in the form of nonlinear functions. Furthermore, four sets of results have been obtained, using GP-extracted features with artificial neural networks (ANN) and support vector machines (SVM), as well as traditional features with ANN and SVM. This GP-based approach is used for bearing fault classification for the first time and exhibits superior searching power over other techniques. Additionally, it significantly reduces computation time compared with a genetic algorithm (GA) and therefore offers a more practical realization of the solution.

  9. Earthquake effect on volcano and the geological structure in central java using tomography travel time method and relocation hypocenter by grid search method

    NASA Astrophysics Data System (ADS)

    Suharsono; Nurdian, S. W.; Palupi, I. R.

    2016-11-01

    Relocating hypocenters is one way to improve the subsurface velocity model, and grid search is one method for doing so. The relocated hypocenters are then used as a reference for travel-time tomography of the subsurface, supporting the analysis of volcanoes and major structural patterns in regions such as Central Java. The main data of this study are earthquakes recorded from 1952 to 2012, comprising 9,162 P-wave arrivals from 2,426 events recorded by 30 stations located in the vicinity of Central Java. The grid search method can relocate hypocenters more accurately because it divides the model space into a lattice of blocks, each of which can be occupied by only one hypocenter. Tomography is then performed on the relocated travel-time data using the pseudo-bending inversion method. The grid search relocation shows that hypocenters are shallower than before and shift toward the south; the hypocenter distribution delineates the subduction zone between the Eurasian and Indo-Australian plates with an average angle of 14°. The tomography results show low velocity anomalies of -8% to -10% beneath the volcanoes, while the main fault structures of Central Java are delineated by high velocity anomalies of 8% to 10% trending northwest and northeast-southwest.
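
    The grid search relocation idea can be illustrated with a minimal sketch that assumes a uniform velocity instead of the layered or 3-D models used in practice; the station coordinates, grid extent, and velocity below are arbitrary.

    ```python
    import numpy as np

    def relocate_hypocenter(stations, observed_tt, grid, velocity=6.0):
        """Grid-search hypocenter relocation under a uniform-velocity assumption.
        Every grid node is tested, and the node giving the smallest RMS travel-time
        residual (after removing the best-fitting origin time) is returned."""
        stations = np.asarray(stations, dtype=float)        # (n_sta, 3) coordinates in km
        observed_tt = np.asarray(observed_tt, dtype=float)  # observed arrival times in s
        best = (np.inf, None)
        for node in grid:                                   # (3,) candidate source in km
            dist = np.linalg.norm(stations - node, axis=1)
            calc = dist / velocity
            origin = np.mean(observed_tt - calc)            # least-squares origin time
            rms = np.sqrt(np.mean((observed_tt - (calc + origin)) ** 2))
            if rms < best[0]:
                best = (rms, node)
        return best

    # 10 km grid over a small illustrative volume.
    xs, ys, zs = np.arange(0, 101, 10), np.arange(0, 101, 10), np.arange(0, 51, 10)
    grid = np.array([[x, y, z] for x in xs for y in ys for z in zs])

    stations = [[0, 0, 0], [80, 0, 0], [0, 80, 0], [80, 80, 0]]
    true_source = np.array([40.0, 30.0, 20.0])
    obs = np.linalg.norm(np.asarray(stations) - true_source, axis=1) / 6.0 + 5.0  # origin time 5 s
    print(relocate_hypocenter(stations, obs, grid))
    ```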

  10. Estimation, modeling, and simulation of patterned growth in extreme environments.

    PubMed

    Strader, B; Schubert, K E; Quintana, M; Gomez, E; Curnutt, J; Boston, P

    2011-01-01

    In the search for life on Mars and other extraterrestrial bodies or in our attempts to identify biological traces in the most ancient rock record of Earth, one of the biggest problems facing us is how to recognize life or the remains of ancient life in a context very different from our planet's modern biological examples. Specific chemistries or biological properties may well be inapplicable to extraterrestrial conditions or ancient Earth environments. Thus, we need to develop an arsenal of techniques that are of broader applicability. The notion of patterning created in some fashion by biological processes and properties may provide such a generalized property of biological systems no matter what the incidentals of chemistry or environmental conditions. One approach to recognizing these kinds of patterns is to look at apparently organized arrangements created and left by life in extreme environments here on Earth, especially at various spatial scales, different geologies, and biogeochemical circumstances.

  11. Wavelet analysis of frequency chaos game signal: a time-frequency signature of the C. elegans DNA.

    PubMed

    Messaoudi, Imen; Oueslati, Afef Elloumi; Lachiri, Zied

    2014-12-01

    The choice of mapping technique for genomic sequences is one of the most demanding tasks in bioinformatics: a judicious choice reveals distributions of periodic patterns that accord with the underlying structure of genomes. Even so, the search for a coding technique that can expose all the information contained in DNA has not yet received the attention it deserves. In this paper, we propose a new mapping technique based on chaos game theory that we call the frequency chaos game signal (FCGS). The particularity of FCGS coding is that it exploits the statistical properties of the genomic sequence itself, which may reflect important structural and organizational features of DNA. To demonstrate the usefulness of the FCGS approach for detecting different local periodic patterns, we use wavelet analysis, because it provides access to information that can be obscured by other time-frequency methods such as Fourier analysis. We therefore apply the continuous wavelet transform (CWT) with the complex Morlet wavelet as the mother wavelet function. Scalograms for the organism Caenorhabditis elegans (C. elegans) exhibit a multitude of periodic organizations of specific DNA sequences.
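
    The classic chaos game representation on which FCGS builds is easy to sketch. The code below implements only the basic CGR walk over the unit square; the frequency weighting that defines the FCGS itself is not reproduced here.

    ```python
    import numpy as np

    CORNERS = {"A": (0.0, 0.0), "C": (0.0, 1.0), "G": (1.0, 1.0), "T": (1.0, 0.0)}

    def chaos_game_representation(seq):
        """Classic chaos game representation (CGR) of a DNA sequence: each base
        moves the current point halfway toward its corner of the unit square,
        so k-mers map to distinct sub-squares of the plot."""
        point = np.array([0.5, 0.5])
        trail = []
        for base in seq.upper():
            corner = np.array(CORNERS.get(base, (0.5, 0.5)))  # ambiguous bases move toward the center
            point = (point + corner) / 2.0
            trail.append(point.copy())
        return np.array(trail)

    print(chaos_game_representation("ACGTTGCA")[:3])
    ```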

  12. Signatures of active and passive optimized Lévy searching in jellyfish.

    PubMed

    Reynolds, Andy M

    2014-10-06

    Some of the strongest empirical support for Lévy search theory has come from telemetry data for the dive patterns of marine predators (sharks, bony fishes, sea turtles and penguins). The dive patterns of the unusually large jellyfish Rhizostoma octopus do, however, sit outside of current Lévy search theory which predicts that a single search strategy is optimal. When searching the water column, the movement patterns of these jellyfish change over time. Movement bouts can be approximated by a variety of Lévy and Brownian (exponential) walks. The adaptive value of this variation is not known. On some occasions movement pattern data are consistent with the jellyfish prospecting away from a preferred depth, not finding an improvement in conditions elsewhere and so returning to their original depth. This 'bounce' behaviour also sits outside of current Lévy walk search theory. Here, it is shown that the jellyfish movement patterns are consistent with their using optimized 'fast simulated annealing'--a novel kind of Lévy walk search pattern--to locate the maximum prey concentration in the water column and/or to locate the strongest of many olfactory trails emanating from more distant prey. Fast simulated annealing is a powerful stochastic search algorithm for locating a global maximum that is hidden among many poorer local maxima in a large search space. This new finding shows that the notion of active optimized Lévy walk searching is not limited to the search for randomly and sparsely distributed resources, as previously thought, but can be extended to embrace other scenarios, including that of the jellyfish R. octopus. In the presence of convective currents, it could become energetically favourable to search the water column by riding the convective currents. Here, it is shown that these passive movements can be represented accurately by Lévy walks of the type occasionally seen in R. octopus. This result vividly illustrates that Lévy walks are not necessarily the result of selection pressures for advantageous searching behaviour but can instead arise freely and naturally from simple processes. It also shows that the family of Lévy walkers is vastly larger than previously thought and includes spores, pollens, seeds and minute wingless arthropods that on warm days disperse passively within the atmospheric boundary layer. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
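
    The heavy-tailed step lengths that distinguish a Lévy walk from Brownian motion can be sampled with a short inverse-CDF sketch. The code below is generic Lévy-flight machinery, not the jellyfish analysis itself; the exponent and truncation values are illustrative (an exponent near 2 is the classical optimum for sparse targets).

    ```python
    import numpy as np

    def levy_steps(n, mu=2.0, l_min=1.0, l_max=1000.0, seed=0):
        """Sample step lengths from a truncated power law P(l) ~ l**(-mu),
        the heavy-tailed distribution underlying Levy-walk search models,
        using inverse-CDF sampling."""
        rng = np.random.default_rng(seed)
        u = rng.uniform(size=n)
        a, b = l_min ** (1.0 - mu), l_max ** (1.0 - mu)
        return (a + u * (b - a)) ** (1.0 / (1.0 - mu))

    steps = levy_steps(10_000, mu=2.0)
    print(steps.mean(), steps.max())   # a few very long relocations dominate the walk
    ```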

  13. Discovering weighted patterns in intron sequences using self-adaptive harmony search and back-propagation algorithms.

    PubMed

    Huang, Yin-Fu; Wang, Chia-Ming; Liou, Sing-Wu

    2013-01-01

    A hybrid self-adaptive harmony search and back-propagation mining system was proposed to discover weighted patterns in human intron sequences. By testing the weights under a lazy nearest neighbor classifier, the numerical results revealed the significance of these weighted patterns. Comparing these weighted patterns with the popular intron consensus model, it is clear that the discovered weighted patterns make originally the ambiguous 5SS and 3SS header patterns more specific and concrete.
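
    The improvisation loop at the heart of harmony search can be sketched compactly. The code below is a basic (not self-adaptive) harmony search for minimization and is not the mining system described in the paper; the parameter values are conventional defaults.

    ```python
    import numpy as np

    def harmony_search(f, bounds, hms=20, hmcr=0.9, par=0.3, bw=0.05, iters=2000, seed=0):
        """Basic harmony search (minimization): improvise a new harmony from
        memory consideration, pitch adjustment, or random selection, and replace
        the worst member of the harmony memory whenever the new harmony is better."""
        rng = np.random.default_rng(seed)
        lo, hi = np.asarray(bounds, dtype=float).T
        dim = lo.size
        memory = rng.uniform(lo, hi, size=(hms, dim))          # harmony memory
        costs = np.array([f(h) for h in memory])
        for _ in range(iters):
            new = np.empty(dim)
            for j in range(dim):
                if rng.random() < hmcr:                        # memory consideration
                    new[j] = memory[rng.integers(hms), j]
                    if rng.random() < par:                     # pitch adjustment
                        new[j] += rng.uniform(-bw, bw) * (hi[j] - lo[j])
                else:                                          # random selection
                    new[j] = rng.uniform(lo[j], hi[j])
            new = np.clip(new, lo, hi)
            c = f(new)
            worst = np.argmax(costs)
            if c < costs[worst]:
                memory[worst], costs[worst] = new, c
        best = np.argmin(costs)
        return memory[best], costs[best]

    print(harmony_search(lambda x: np.sum(x ** 2), [(-5, 5)] * 4))
    ```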

  14. Discovering Weighted Patterns in Intron Sequences Using Self-Adaptive Harmony Search and Back-Propagation Algorithms

    PubMed Central

    Wang, Chia-Ming; Liou, Sing-Wu

    2013-01-01

    A hybrid self-adaptive harmony search and back-propagation mining system was proposed to discover weighted patterns in human intron sequences. By testing the weights under a lazy nearest neighbor classifier, the numerical results revealed the significance of these weighted patterns. Comparing these weighted patterns with the popular intron consensus model, it is clear that the discovered weighted patterns make originally the ambiguous 5SS and 3SS header patterns more specific and concrete. PMID:23737711

  15. Seasonal variation in Internet searches for vitamin D.

    PubMed

    Moon, Rebecca J; Curtis, Elizabeth M; Davies, Justin H; Cooper, Cyrus; Harvey, Nicholas C

    2017-12-01

    Internet search rates for "vitamin D" were explored using Google Trends. Search rates increased from 2004 until 2010 and thereafter displayed a seasonal pattern peaking in late winter. This knowledge could help guide the timing of public health interventions aimed at managing vitamin D deficiency. The Internet is an important source of health information. Analysis of Internet search activity rates can provide information on disease epidemiology, health related behaviors and public interest. We explored Internet search rates for vitamin D to determine whether this reflects the increasing scientific interest in this topic. Google Trends is a publicly available tool that provides data on Internet searches using Google. Search activity for the term "vitamin D" from 1st January 2004 until 31st October 2016 was obtained. Comparison was made to other bone and nutrition related terms. Worldwide, searches for "vitamin D" increased from 2004 until 2010 and thereafter a statistically significant (p < 0.001) seasonal pattern with a peak in February and nadir in August was observed. This seasonal pattern was evident for searches originating from both the USA (peak in February) and Australia (peak in August); p < 0.001 for both. Searches for the terms "osteoporosis", "rickets", "back pain" or "folic acid" did not display the increase observed for vitamin D or evidence of seasonal variation. Public interest in vitamin D, as assessed by Internet search activity, did increase from 2004 to 2010, likely reflecting the growing scientific interest, but now displays a seasonal pattern with peak interest during late winter. This information could be used to guide public health approaches to managing vitamin D deficiency.
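
    Reading a seasonal peak off a monthly interest series is straightforward once the long-term trend is removed. The sketch below uses simulated data standing in for the Google Trends export, so the numbers and the constructed peak are illustrative only.

    ```python
    import numpy as np
    import pandas as pd

    # Illustrative monthly series standing in for Google Trends "vitamin D" interest;
    # real data would come from the Google Trends export, not from this simulation.
    months = pd.date_range("2004-01", "2016-10", freq="MS")
    trend = np.linspace(20, 80, len(months))                          # rising interest
    season = 10 * np.cos(2 * np.pi * (months.month - 2) / 12)         # constructed February peak
    searches = pd.Series(trend + season, index=months)

    detrended = searches - searches.rolling(12, center=True).mean()   # remove the long-term trend
    by_month = detrended.groupby(detrended.index.month).mean()
    print(by_month.idxmax(), by_month.idxmin())                       # expect 2 (Feb) and 8 (Aug)
    ```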

  16. Nearest Neighbor Searching in Binary Search Trees: Simulation of a Multiprocessor System.

    ERIC Educational Resources Information Center

    Stewart, Mark; Willett, Peter

    1987-01-01

    Describes the simulation of a nearest neighbor searching algorithm for document retrieval using a pool of microprocessors. Three techniques are described which allow parallel searching of a binary search tree as well as a PASCAL-based system, PASSIM, which can simulate these techniques. Fifty-six references are provided. (Author/LRW)

  17. A Literature Review of Indexing and Searching Techniques Implementation in Educational Search Engines

    ERIC Educational Resources Information Center

    El Guemmat, Kamal; Ouahabi, Sara

    2018-01-01

    The objective of this article is to analyze the searching and indexing techniques of educational search engines' implementation while treating future challenges. Educational search engines could greatly help in the effectiveness of e-learning if used correctly. However, these engines have several gaps which influence the performance of e-learning…

  18. All Source Sensor Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Trease, Harold (PNNL)

    2012-10-10

    ASSA is a software application that processes binary data into summarized index tables that can be used to organize features contained within the data. ASSA's index tables can also be used to search for user specified features. ASSA is designed to organize and search for patterns in unstructured binary data streams or archives, such as video, images, audio, and network traffic. ASSA is basically a very general search engine used to search for any pattern in any binary data stream. It has uses in video analytics, image analysis, audio analysis, searching hard-drives, monitoring network traffic, etc.

  19. Patterns of Information-Seeking for Cancer on the Internet: An Analysis of Real World Data

    PubMed Central

    Ofran, Yishai; Paltiel, Ora; Pelleg, Dan; Rowe, Jacob M.; Yom-Tov, Elad

    2012-01-01

    Although traditionally the primary information sources for cancer patients have been the treating medical team, patients and their relatives increasingly turn to the Internet, though this source may be misleading and confusing. We assess Internet searching patterns to understand the information needs of cancer patients and their acquaintances, as well as to discern their underlying psychological states. We screened 232,681 anonymous users who initiated cancer-specific queries on the Yahoo Web search engine over three months, and selected for study users with high levels of interest in this topic. Searches were partitioned by expected survival for the disease being searched. We compared the search patterns of anonymous users and their contacts. Users seeking information on aggressive malignancies exhibited shorter search periods, focusing on disease- and treatment-related information. Users seeking knowledge regarding more indolent tumors searched for longer periods, alternated between different subjects, and demonstrated a high interest in topics such as support groups. Acquaintances searched for longer periods than the proband user when seeking information on aggressive (compared to indolent) cancers. Information needs can be modeled as transitioning between five discrete states, each with a unique signature representing the type of information of interest to the user. Thus, early phases of information-seeking for cancer follow a specific dynamic pattern. Areas of interest are disease dependent and vary between probands and their contacts. These patterns can be used by physicians and medical Web site authors to tailor information to the needs of patients and family members. PMID:23029317

  20. Structator: fast index-based search for RNA sequence-structure patterns

    PubMed Central

    2011-01-01

    Background The secondary structure of RNA molecules is intimately related to their function and often more conserved than the sequence. Hence, the important task of searching databases for RNAs requires matching sequence-structure patterns. Unfortunately, current tools for this task have, in the best case, a running time that is only linear in the size of sequence databases. Furthermore, established index data structures for fast sequence matching, like suffix trees or arrays, cannot benefit from the complementarity constraints introduced by the secondary structure of RNAs. Results We present a novel method and readily applicable software for time efficient matching of RNA sequence-structure patterns in sequence databases. Our approach is based on affix arrays, a recently introduced index data structure, preprocessed from the target database. Affix arrays support bidirectional pattern search, which is required for efficiently handling the structural constraints of the pattern. Structural patterns like stem-loops can be matched inside out, such that the loop region is matched first and then the pairing bases on the boundaries are matched consecutively. This makes it possible to exploit base-pairing information for search space reduction and leads to an expected running time that is sublinear in the size of the sequence database. The incorporation of a new chaining approach in the search of RNA sequence-structure patterns enables the description of molecules folding into complex secondary structures with multiple ordered patterns. The chaining approach removes spurious matches from the set of intermediate results, in particular of patterns with little specificity. In benchmark experiments on the Rfam database, our method runs up to two orders of magnitude faster than previous methods. Conclusions The presented method's sublinear expected running time makes it well suited for RNA sequence-structure pattern matching in large sequence databases. RNA molecules containing several stem-loop substructures can be described by multiple sequence-structure patterns and their matches are efficiently handled by a novel chaining method. Beyond our algorithmic contributions, we provide with Structator a complete and robust open-source software solution for index-based search of RNA sequence-structure patterns. The Structator software is available at http://www.zbh.uni-hamburg.de/Structator. PMID:21619640

  1. A Search Technique for Weak and Long-Duration Gamma-Ray Bursts from Background Model Residuals

    NASA Technical Reports Server (NTRS)

    Skelton, R. T.; Mahoney, W. A.

    1993-01-01

    We report on a planned search technique for Gamma-Ray Bursts too weak to trigger the on-board threshold. The technique is to search residuals from a physically based background model used for analysis of point sources by the Earth occultation method.

  2. A matter of life or limb? A review of traumatic injury patterns and anesthesia techniques for disaster relief after major earthquakes.

    PubMed

    Missair, Andres; Pretto, Ernesto A; Visan, Alexandru; Lobo, Laila; Paula, Frank; Castillo-Pedraza, Catalina; Cooper, Lebron; Gebhard, Ralf E

    2013-10-01

    All modalities of anesthetic care, including conscious sedation, general, and regional anesthesia, have been used to manage earthquake survivors who require urgent surgical intervention during the acute phase of medical relief. Consequently, we felt that a review of epidemiologic data from major earthquakes in the context of urgent intraoperative management was warranted to optimize anesthesia disaster preparedness for future medical relief operations. The primary outcome measure of this study was to identify the predominant preoperative injury pattern (anatomic location and pathology) of survivors presenting for surgical care immediately after major earthquakes during the acute phase of medical relief (0-15 days after disaster). The injury pattern is of significant relevance because it closely relates to the anesthetic techniques available for patient management. We discuss our findings in the context of evidence-based strategies for anesthetic management during the acute phase of medical relief after major earthquakes and the associated obstacles of devastated medical infrastructure. To identify reports on acute medical care in the aftermath of natural disasters, a query was conducted using MEDLINE/PubMed, Embase, CINAHL, as well as an online search engine (Google Scholar). The search terms were "disaster" and "earthquake" in combination with "injury," "trauma," "surgery," "anesthesia," and "wounds." Our investigation focused only on studies of acute traumatic injury that specified surgical intervention among survivors in the acute phase of medical relief. A total of 31 articles reporting on 15 major earthquakes (between 1980 and 2010) and the treatment of more than 33,410 patients met our specific inclusion criteria. The mean incidence of traumatic limb injury per major earthquake was 68.0%. The global incidence of traumatic limb injury was 54.3% (18,144/33,410 patients). The pooled estimate of the proportion of limb injuries was calculated to be 67.95%, with a 95% confidence interval of 62.32% to 73.58%. Based on this analysis, early disaster surgical intervention will focus on surviving patients with limb injury. All anesthetic techniques have been safely used for medical relief. While regional anesthesia may be an intuitive choice based on these findings, in the context of collapsed medical infrastructure, provider experience may dictate the available anesthetic techniques for earthquake survivors requiring urgent surgery.

  3. Combining the Bourne-Shell, sed and awk in the UNIX Environment for Language Analysis.

    ERIC Educational Resources Information Center

    Schmitt, Lothar M.; Christianson, Kiel T.

    This document describes how to construct tools for language analysis in research and teaching using the Bourne-shell, sed, and awk, three search tools, in the UNIX operating system. Applications include: searches for words, phrases, grammatical patterns, and phonemic patterns in text; statistical analysis of text in regard to such searches,…

  4. Job Search Patterns of College Graduates: The Role of Social Capital

    ERIC Educational Resources Information Center

    Coonfield, Emily S.

    2012-01-01

    This dissertation addresses job search patterns of college graduates and the implications of social capital by race and class. The purpose of this study is to explore (1) how the job search transpires for recent college graduates, (2) how potential social networks in a higher educational context, like KU, may make a difference for students with…

  5. Gravitational radiation from rotating gravitational collapse

    NASA Technical Reports Server (NTRS)

    Stark, Richard F.

    1989-01-01

    The efficiency of gravitational wave emission from axisymmetric rotating collapse to a black hole was found to be very low: ΔE/Mc² < 7 × 10⁻⁴. The main waveform shape is well defined and nearly independent of the details of the collapse. Such a signature will allow pattern recognition techniques to be used when searching experimental data. These results (which can be scaled in mass) were obtained using a fully general relativistic computer code that evolves rotating axisymmetric configurations and directly computes their gravitational radiation emission.

  6. New Powder Diffraction File (PDF-4) in relational database format: advantages and data-mining capabilities.

    PubMed

    Kabekkodu, Soorya N; Faber, John; Fawcett, Tim

    2002-06-01

    The International Centre for Diffraction Data (ICDD) is responding to the changing needs in powder diffraction and materials analysis by developing the Powder Diffraction File (PDF) in a very flexible relational database (RDB) format. The PDF now contains 136,895 powder diffraction patterns. In this paper, an attempt is made to give an overview of the PDF-4, search/match methods and the advantages of having the PDF-4 in RDB format. Some case studies have been carried out to search for crystallization trends, properties, frequencies of space groups and prototype structures. These studies give a good understanding of the basic structural aspects of classes of compounds present in the database. The present paper also reports data-mining techniques and demonstrates the power of a relational database over the traditional (flat-file) database structures.

  7. Lévy flight and Brownian search patterns of a free-ranging predator reflect different prey field characteristics.

    PubMed

    Sims, David W; Humphries, Nicolas E; Bradford, Russell W; Bruce, Barry D

    2012-03-01

    1. Search processes play an important role in physical, chemical and biological systems. In animal foraging, the search strategy predators should use to search optimally for prey is an enduring question. Some models demonstrate that when prey is sparsely distributed, an optimal search pattern is a specialised random walk known as a Lévy flight, whereas when prey is abundant, simple Brownian motion is sufficiently efficient. These predictions form part of what has been termed the Lévy flight foraging hypothesis (LFF) which states that as Lévy flights optimise random searches, movements approximated by optimal Lévy flights may have naturally evolved in organisms to enhance encounters with targets (e.g. prey) when knowledge of their locations is incomplete. 2. Whether free-ranging predators exhibit the movement patterns predicted in the LFF hypothesis in response to known prey types and distributions, however, has not been determined. We tested this using vertical and horizontal movement data from electronic tagging of an apex predator, the great white shark Carcharodon carcharias, across widely differing habitats reflecting different prey types. 3. Individual white sharks exhibited movement patterns that predicted well the prey types expected under the LFF hypothesis. Shark movements were best approximated by Brownian motion when hunting near abundant, predictable sources of prey (e.g. seal colonies, fish aggregations), whereas movements approximating truncated Lévy flights were present when searching for sparsely distributed or potentially difficult-to-detect prey in oceanic or shelf environments, respectively. 4. That movement patterns approximated by truncated Lévy flights and Brownian behaviour were present in the predicted prey fields indicates search strategies adopted by white sharks appear to be the most efficient ones for encountering prey in the habitats where such patterns are observed. This suggests that C. carcharias appears capable of exhibiting search patterns that are approximated as optimal in response to encountered changes in prey type and abundance, and across diverse marine habitats, from the surf zone to the deep ocean. 5. Our results provide some support for the LFF hypothesis. However, it is possible that the observed Lévy patterns of white sharks may not arise from an adaptive behaviour but could be an emergent property arising from simple, straight-line movements between complex (e.g. fractal) distributions of prey. Experimental studies are needed in vertebrates to test for the presence of Lévy behaviour patterns in the absence of complex prey distributions. © 2011 The Authors. Journal of Animal Ecology © 2011 British Ecological Society.

  8. LC/QTOF-MS fragmentation of N-nitrosodimethylamine precursors in drinking water supplies is predictable and aids their identification.

    PubMed

    Hanigan, David; Ferrer, Imma; Thurman, E Michael; Herckes, Pierre; Westerhoff, Paul

    2017-02-05

    N-Nitrosodimethylamine (NDMA) is carcinogenic in rodents and occurs in chloraminated drinking water and wastewater effluents. NDMA forms via reactions between chloramines and mostly unidentified, N-containing organic matter. We developed a mass spectrometry technique to identify NDMA precursors by analyzing 25 model compounds with LC/QTOF-MS. We searched isolates of 11 drinking water sources and 1 wastewater using a custom MATLAB® program and extracted ion chromatograms for two fragmentation patterns that were specific to the model compounds. Once a diagnostic fragment was discovered, we conducted MS/MS during a subsequent injection to confirm the precursor ion. Using non-target searches and two diagnostic fragmentation patterns, we discovered 158 potential NDMA precursors. Of these, 16 were identified using accurate mass combined with fragment and retention time matches of analytical standards when available. Five of these sixteen NDMA precursors were previously unidentified in the literature, three of which were metabolites of pharmaceuticals. Except for methadone, the newly identified precursors all had NDMA molar yields of less than 5%, indicating that NDMA formation could be additive from multiple compounds, each with low yield. We demonstrate that the method is applicable to other disinfection by-product precursors by predicting and verifying the fragmentation patterns for one nitrosodiethylamine precursor. Copyright © 2016. Published by Elsevier B.V.

  9. The potential use of unmanned aircraft systems (drones) in mountain search and rescue operations.

    PubMed

    Karaca, Yunus; Cicek, Mustafa; Tatli, Ozgur; Sahin, Aynur; Pasli, Sinan; Beser, Muhammed Fatih; Turedi, Suleyman

    2018-04-01

    This study explores the potential use of drones in searching for and locating victims and of motorized transportation of search and rescue providers in a mountain environment using a simulation model. This prospective randomized simulation study was performed in order to compare two different search and rescue techniques in searching for an unconscious victim on snow-covered ground. In the control arm, the Classical Line Search Technique (CLT) was used, in which the search is performed on foot and the victim is reached on foot. In the intervention arm, the Drone-snowmobile Technique (DST) was used, the search being performed by drone and the victim reached by snowmobile. The primary outcome of the study was the comparison of the two search and rescue techniques in terms of first human contact time. Twenty search and rescue operations were conducted in this study. Median time to arrival at the mannequin was 57.3 min for CLT, compared to 8.9 min for DST. The median value of the total searched area was 88,322.0 m² for CLT and 228,613.0 m² for DST. The median area searched per minute was 1489.6 m² for CLT and 32,979.9 m² for DST (p < 0.01 for all comparisons). In conclusion, a wider area can be searched faster by drone using DST compared to the classical technique, and the victim can be located faster and reached earlier with rescuers transported by snowmobile. Copyright © 2017 Elsevier Inc. All rights reserved.

  10. Evaluation of publicly available documents to trace chiropractic technique systems that advocate radiography for subluxation analysis: a proposed genealogy.

    PubMed

    Young, Kenneth J

    2014-12-01

    The purpose of this study was to evaluate publicly available information of chiropractic technique systems that advocate radiography for subluxation detection to identify links between chiropractic technique systems and to describe claims made of the health effects of the osseous misalignment component of the chiropractic subluxation and radiographic paradigms. The Internet and publicly available documents were searched for information representing chiropractic technique systems that advocate radiography for subluxation detection. Key phrases including chiropractic, x-ray, radiography, and technique were identified from a Google search between April 2013 and March 2014. Phrases in Web sites and public documents were examined for any information about origins and potential links between these techniques, including the type of connection to BJ Palmer, who was the first chiropractor to advocate radiography for subluxation detection. Quotes were gathered to identify claims of health effects from osseous misalignment (subluxation) and paradigms of radiography. Techniques were grouped by region of the spine and how they could be traced back to B.J Palmer. A genealogy model and summary table of information on each technique were created. Patterns in year of origination and radiographic paradigms were noted, and percentages were calculated on elements of the techniques' characteristics in comparison to the entire group. Twenty-three techniques were identified on the Internet: 6 full spine, 17 upper cervical, and 2 techniques generating other lineage. Most of the upper cervical techniques (14/16) traced their origins to a time when the Palmer School was teaching upper cervical technique, and all the full spine techniques (6/6) originated before or after this phase. All the technique systems' documents attributed broad health effects to their methods. Many (21/23) of the techniques used spinal realignment on radiographs as one of their outcome measures. Chiropractic technique systems in this study (ie, those that advocate for radiography for subluxation misalignment detection) seem to be closely related by descent, their claims of a variety of health effects associated with chiropractic subluxation, and their radiographic paradigms.

  11. Temporal stability of visual search-driven biometrics

    NASA Astrophysics Data System (ADS)

    Yoon, Hong-Jun; Carmichael, Tandy R.; Tourassi, Georgia

    2015-03-01

    Previously, we have shown the potential of using an individual's visual search pattern as a possible biometric. That study focused on viewing images displaying dot-patterns with different spatial relationships to determine which pattern can be more effective in establishing the identity of an individual. In this follow-up study we investigated the temporal stability of this biometric. We performed an experiment with 16 individuals asked to search for a predetermined feature of a random-dot pattern as we tracked their eye movements. Each participant completed four testing sessions consisting of two dot patterns repeated twice. One dot pattern displayed concentric circles shifted to the left or right side of the screen overlaid with visual noise, and participants were asked which side the circles were centered on. The second dot-pattern displayed a number of circles (between 0 and 4) scattered on the screen overlaid with visual noise, and participants were asked how many circles they could identify. Each session contained 5 untracked tutorial questions and 50 tracked test questions (200 total tracked questions per participant). To create each participant's "fingerprint", we constructed a Hidden Markov Model (HMM) from the gaze data representing the underlying visual search and cognitive process. The accuracy of the derived HMM models was evaluated using cross-validation for various time-dependent train-test conditions. Subject identification accuracy ranged from 17.6% to 41.8% for all conditions, which is significantly higher than random guessing (1/16 = 6.25%). The results suggest that visual search pattern is a promising, temporally stable personalized fingerprint of perceptual organization.
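
    Building an HMM "fingerprint" per subject and attributing a new scanpath by likelihood can be sketched as follows. The code assumes the hmmlearn package and synthetic gaze data; it is a generic sketch, not the authors' exact model or features.

    ```python
    import numpy as np
    from hmmlearn.hmm import GaussianHMM   # assumes the hmmlearn package is installed

    def fit_subject_models(training, n_states=3):
        """Fit one Gaussian HMM per subject to their gaze samples.
        `training` maps subject id -> array of shape (n_samples, 2) of (x, y)
        gaze coordinates."""
        models = {}
        for subject, gaze in training.items():
            m = GaussianHMM(n_components=n_states, covariance_type="diag", n_iter=50)
            m.fit(gaze)
            models[subject] = m
        return models

    def identify(models, gaze):
        """Attribute a new scanpath to the subject whose HMM assigns it the
        highest log-likelihood."""
        return max(models, key=lambda s: models[s].score(gaze))

    rng = np.random.default_rng(0)
    training = {"s1": rng.normal([0.3, 0.3], 0.05, (300, 2)),
                "s2": rng.normal([0.7, 0.6], 0.05, (300, 2))}
    models = fit_subject_models(training)
    print(identify(models, rng.normal([0.7, 0.6], 0.05, (80, 2))))   # expect "s2"
    ```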

  12. Evaluation of Publicly Available Documents to Trace Chiropractic Technique Systems That Advocate Radiography for Subluxation Analysis: A Proposed Genealogy

    PubMed Central

    Young, Kenneth J.

    2014-01-01

    Objective The purpose of this study was to evaluate publicly available information of chiropractic technique systems that advocate radiography for subluxation detection to identify links between chiropractic technique systems and to describe claims made of the health effects of the osseous misalignment component of the chiropractic subluxation and radiographic paradigms. Methods The Internet and publicly available documents were searched for information representing chiropractic technique systems that advocate radiography for subluxation detection. Key phrases including chiropractic, x-ray, radiography, and technique were identified from a Google search between April 2013 and March 2014. Phrases in Web sites and public documents were examined for any information about origins and potential links between these techniques, including the type of connection to BJ Palmer, who was the first chiropractor to advocate radiography for subluxation detection. Quotes were gathered to identify claims of health effects from osseous misalignment (subluxation) and paradigms of radiography. Techniques were grouped by region of the spine and how they could be traced back to B.J Palmer. A genealogy model and summary table of information on each technique were created. Patterns in year of origination and radiographic paradigms were noted, and percentages were calculated on elements of the techniques’ characteristics in comparison to the entire group. Results Twenty-three techniques were identified on the Internet: 6 full spine, 17 upper cervical, and 2 techniques generating other lineage. Most of the upper cervical techniques (14/16) traced their origins to a time when the Palmer School was teaching upper cervical technique, and all the full spine techniques (6/6) originated before or after this phase. All the technique systems’ documents attributed broad health effects to their methods. Many (21/23) of the techniques used spinal realignment on radiographs as one of their outcome measures. Conclusion Chiropractic technique systems in this study (ie, those that advocate for radiography for subluxation misalignment detection) seem to be closely related by descent, their claims of a variety of health effects associated with chiropractic subluxation, and their radiographic paradigms. PMID:25431540

  13. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  14. Shotgun metagenomic data streams: surfing without fear

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berendzen, Joel R

    2010-12-06

    Timely information about bio-threat prevalence, consequence, propagation, attribution, and mitigation is needed to support decision-making, both routinely and in a crisis. One DNA sequencer can stream 25 Gbp of information per day, but sampling strategies and analysis techniques are needed to turn raw sequencing power into actionable knowledge. Shotgun metagenomics can enable biosurveillance at the level of a single city, hospital, or airplane. Metagenomics characterizes viruses and bacteria from complex environments such as soil, air filters, or sewage. Unlike targeted-primer-based sequencing, shotgun methods are not blind to sequences that are truly novel, and they can measure absolute prevalence. Shotgun metagenomic sampling can be non-invasive, efficient, and inexpensive while being informative. We have developed analysis techniques for shotgun metagenomic sequencing that rely upon phylogenetic signature patterns. They work by indexing local sequence patterns in a manner similar to web search engines. Our methods are laptop-fast and favorable scaling properties ensure they will be sustainable as sequencing methods grow. We show examples of application to soil metagenomic samples.
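
    Indexing local sequence patterns in the manner of a web search engine amounts to building an inverted k-mer index. The sketch below shows that core idea only; the phylogenetic signature scoring described above is not reproduced, and the read data are made up.

    ```python
    from collections import defaultdict

    def build_kmer_index(reads, k=12):
        """Inverted index mapping each k-mer (a 'local sequence pattern') to the
        identifiers of the reads containing it, analogous to a web-search index."""
        index = defaultdict(set)
        for read_id, seq in reads.items():
            for i in range(len(seq) - k + 1):
                index[seq[i:i + k]].add(read_id)
        return index

    def query(index, pattern, k=12):
        """Return reads that contain every k-mer of the query pattern."""
        hits = [index.get(pattern[i:i + k], set()) for i in range(len(pattern) - k + 1)]
        return set.intersection(*hits) if hits else set()

    reads = {"r1": "ACGTACGTGGCTAGCTAGGCT", "r2": "TTTTGGCTAGCTAGGCTAAAA"}
    idx = build_kmer_index(reads, k=8)
    print(query(idx, "GGCTAGCTAGGCT", k=8))   # both reads share this pattern
    ```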

  15. The Lévy flight paradigm: random search patterns and mechanisms.

    PubMed

    Reynolds, A M; Rhodes, C J

    2009-04-01

    Over recent years there has been an accumulation of evidence from a variety of experimental, theoretical, and field studies that many organisms use a movement strategy approximated by Lévy flights when they are searching for resources. Lévy flights are random movements that can maximize the efficiency of resource searches in uncertain environments. This is a highly significant finding because it suggests that Lévy flights provide a rigorous mathematical basis for separating out evolved, innate behaviors from environmental influences. We discuss recent developments in random-search theory, as well as the many different experimental and data collection initiatives that have investigated search strategies. Methods for trajectory construction and robust data analysis procedures are presented. The key to prediction and understanding does, however, lie in the elucidation of mechanisms underlying the observed patterns. We discuss candidate neurological, olfactory, and learning mechanisms for the emergence of Lévy flight patterns in some organisms, and note that convergence of behaviors along such different evolutionary pathways is not surprising given the energetic efficiencies that Lévy flight movement patterns confer.
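    As a concrete illustration of the movement model discussed in this and several of the following records, the sketch below draws step lengths from a power-law (Pareto-tailed) distribution by inverse-transform sampling and combines them with uniformly random turning angles to trace a two-dimensional Lévy-like walk. The exponent, minimum step length, and number of steps are arbitrary illustrative choices.

```python
import numpy as np

def levy_walk(n_steps=1000, mu=2.0, l_min=1.0, seed=0):
    """Simulate a 2D walk whose step lengths follow p(l) ~ l**(-mu), l >= l_min.

    Inverse-transform sampling: if U ~ Uniform(0, 1), then
    l = l_min * (1 - U)**(-1 / (mu - 1)) has the desired power-law tail (mu > 1).
    """
    rng = np.random.default_rng(seed)
    u = rng.random(n_steps)
    lengths = l_min * (1.0 - u) ** (-1.0 / (mu - 1.0))
    angles = rng.uniform(0.0, 2.0 * np.pi, n_steps)
    steps = np.column_stack((lengths * np.cos(angles), lengths * np.sin(angles)))
    return np.vstack(([0.0, 0.0], np.cumsum(steps, axis=0)))

path = levy_walk()
print(path.shape)  # (1001, 2): many short moves punctuated by rare long relocations
```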

  16. Evidence From Web-Based Dietary Search Patterns to the Role of B12 Deficiency in Non-Specific Chronic Pain: A Large-Scale Observational Study

    PubMed Central

    Giat, Eitan

    2018-01-01

    Background Profound vitamin B12 deficiency is a known cause of disease, but the role of low or intermediate levels of B12 in the development of neuropathy and other neuropsychiatric symptoms, as well as the relationship between eating meat and B12 levels, is unclear. Objective The objective of our study was to investigate the role of low or intermediate levels of B12 in the development of neuropathy and other neuropsychiatric symptoms. Methods We used food-related Internet search patterns from a sample of 8.5 million people based in the US as a proxy for B12 intake and correlated these searches with Internet searches related to possible effects of B12 deficiency. Results Food-related search patterns were highly correlated with known consumption and food-related searches (ρ=.69). Awareness of B12 deficiency was associated with a higher consumption of B12-rich foods and with queries for B12 supplements. Searches for terms related to neurological disorders were correlated with searches for B12-poor foods, in contrast with control terms. Popular medicines, those having fewer indications, and those which are predominantly used to treat pain, were more strongly correlated with the ability to predict neuropathic pain queries using the B12 contents of food. Conclusions Our findings show that Internet search patterns are a useful way of investigating health questions in large populations, and suggest that low B12 intake may be associated with a broader spectrum of neurological disorders than previously thought. PMID:29305340

  17. Universal principles governing multiple random searchers on complex networks: The logarithmic growth pattern and the harmonic law

    NASA Astrophysics Data System (ADS)

    Weng, Tongfeng; Zhang, Jie; Small, Michael; Harandizadeh, Bahareh; Hui, Pan

    2018-03-01

    We propose a unified framework to evaluate and quantify the search time of multiple random searchers traversing independently and concurrently on complex networks. We find that the intriguing behaviors of multiple random searchers are governed by two basic principles—the logarithmic growth pattern and the harmonic law. Specifically, the logarithmic growth pattern characterizes how the search time increases with the number of targets, while the harmonic law explores how the search time of multiple random searchers varies relative to that needed by individual searchers. Numerical and theoretical results demonstrate these two universal principles established across a broad range of random search processes, including generic random walks, maximal entropy random walks, intermittent strategies, and persistent random walks. Our results reveal two fundamental principles governing the search time of multiple random searchers, which are expected to facilitate investigation of diverse dynamical processes like synchronization and spreading.
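    The behaviour of multiple independent searchers can be explored numerically with a small simulation. The sketch below runs k independent random walkers on an Erdős–Rényi graph and records the time until the first walker reaches a target node; the graph size, edge probability, and walker counts are illustrative assumptions, not the parameters or network models analysed in the paper.

```python
import random
import networkx as nx

def first_hit_time(G, k_walkers, target, max_steps=10_000, rng=random):
    """Steps until the first of k independent random walkers reaches the target node."""
    nodes = list(G.nodes())
    walkers = [rng.choice(nodes) for _ in range(k_walkers)]
    for t in range(max_steps):
        if target in walkers:
            return t
        walkers = [rng.choice(list(G.neighbors(w))) for w in walkers]
    return max_steps

random.seed(1)
G = nx.erdos_renyi_graph(200, 0.05, seed=1)
G = G.subgraph(max(nx.connected_components(G), key=len)).copy()  # keep the giant component
target = next(iter(G.nodes()))
for k in (1, 2, 4, 8):
    times = [first_hit_time(G, k, target) for _ in range(200)]
    print(k, sum(times) / len(times))  # mean search time falls as walkers are added
```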

  18. Signatures of active and passive optimized Lévy searching in jellyfish

    PubMed Central

    Reynolds, Andy M.

    2014-01-01

    Some of the strongest empirical support for Lévy search theory has come from telemetry data for the dive patterns of marine predators (sharks, bony fishes, sea turtles and penguins). The dive patterns of the unusually large jellyfish Rhizostoma octopus do, however, sit outside of current Lévy search theory which predicts that a single search strategy is optimal. When searching the water column, the movement patterns of these jellyfish change over time. Movement bouts can be approximated by a variety of Lévy and Brownian (exponential) walks. The adaptive value of this variation is not known. On some occasions movement pattern data are consistent with the jellyfish prospecting away from a preferred depth, not finding an improvement in conditions elsewhere and so returning to their original depth. This ‘bounce’ behaviour also sits outside of current Lévy walk search theory. Here, it is shown that the jellyfish movement patterns are consistent with their using optimized ‘fast simulated annealing’—a novel kind of Lévy walk search pattern—to locate the maximum prey concentration in the water column and/or to locate the strongest of many olfactory trails emanating from more distant prey. Fast simulated annealing is a powerful stochastic search algorithm for locating a global maximum that is hidden among many poorer local maxima in a large search space. This new finding shows that the notion of active optimized Lévy walk searching is not limited to the search for randomly and sparsely distributed resources, as previously thought, but can be extended to embrace other scenarios, including that of the jellyfish R. octopus. In the presence of convective currents, it could become energetically favourable to search the water column by riding the convective currents. Here, it is shown that these passive movements can be represented accurately by Lévy walks of the type occasionally seen in R. octopus. This result vividly illustrates that Lévy walks are not necessarily the result of selection pressures for advantageous searching behaviour but can instead arise freely and naturally from simple processes. It also shows that the family of Lévy walkers is vastly larger than previously thought and includes spores, pollens, seeds and minute wingless arthropods that on warm days disperse passively within the atmospheric boundary layer. PMID:25100323

  19. rasbhari: Optimizing Spaced Seeds for Database Searching, Read Mapping and Alignment-Free Sequence Comparison.

    PubMed

    Hahn, Lars; Leimeister, Chris-André; Ounit, Rachid; Lonardi, Stefano; Morgenstern, Burkhard

    2016-10-01

    Many algorithms for sequence analysis rely on word matching or word statistics. Often, these approaches can be improved if binary patterns representing match and don't-care positions are used as a filter, such that only those positions of words are considered that correspond to the match positions of the patterns. The performance of these approaches, however, depends on the underlying patterns. Herein, we show that the overlap complexity of a pattern set that was introduced by Ilie and Ilie is closely related to the variance of the number of matches between two evolutionarily related sequences with respect to this pattern set. We propose a modified hill-climbing algorithm to optimize pattern sets for database searching, read mapping and alignment-free sequence comparison of nucleic-acid sequences; our implementation of this algorithm is called rasbhari. Depending on the application at hand, rasbhari can either minimize the overlap complexity of pattern sets, maximize their sensitivity in database searching or minimize the variance of the number of pattern-based matches in alignment-free sequence comparison. We show that, for database searching, rasbhari generates pattern sets with slightly higher sensitivity than existing approaches. In our Spaced Words approach to alignment-free sequence comparison, pattern sets calculated with rasbhari led to more accurate estimates of phylogenetic distances than the randomly generated pattern sets that we previously used. Finally, we used rasbhari to generate patterns for short read classification with CLARK-S. Here too, the sensitivity of the results could be improved, compared to the default patterns of the program. We integrated rasbhari into Spaced Words; the source code of rasbhari is freely available at http://rasbhari.gobics.de/.
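    To make the match/don't-care idea concrete, the sketch below extracts "spaced words" from two sequences under a binary pattern and counts the spaced words they share. The pattern and sequences are invented for illustration and are unrelated to the optimized pattern sets that rasbhari actually computes.

```python
def spaced_words(seq, pattern):
    """Collect the spaced words of `seq` under a binary pattern.

    A '1' in the pattern is a match position (the base is kept);
    a '0' is a don't-care position (the base is ignored).
    """
    match_pos = [i for i, c in enumerate(pattern) if c == "1"]
    words = set()
    for start in range(len(seq) - len(pattern) + 1):
        words.add("".join(seq[start + i] for i in match_pos))
    return words

pattern = "1101001011"               # weight-6 spaced seed (illustrative only)
s1 = "ACGTTGACCGTAGGCTAACGTT"
s2 = "ACGTTGACTGTAGGCTAACGAT"        # s1 with two substitutions
shared = spaced_words(s1, pattern) & spaced_words(s2, pattern)
print(len(shared), "spaced words in common")
```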

  20. Aggregate age-at-marriage patterns from individual mate-search heuristics.

    PubMed

    Todd, Peter M; Billari, Francesco C; Simão, Jorge

    2005-08-01

    The distribution of age at first marriage shows well-known strong regularities across many countries and recent historical periods. We accounted for these patterns by developing agent-based models that simulate the aggregate behavior of individuals who are searching for marriage partners. Past models assumed fully rational agents with complete knowledge of the marriage market; our simulated agents used psychologically plausible simple heuristic mate search rules that adjust aspiration levels on the basis of a sequence of encounters with potential partners. Substantial individual variation must be included in the models to account for the demographically observed age-at-marriage patterns.

  1. PandaX-III: Searching for neutrinoless double beta decay with high pressure 136Xe gas time projection chambers

    NASA Astrophysics Data System (ADS)

    Chen, Xun; Fu, ChangBo; Galan, Javier; Giboni, Karl; Giuliani, Franco; Gu, LingHui; Han, Ke; Ji, XiangDong; Lin, Heng; Liu, JiangLai; Ni, KaiXiang; Kusano, Hiroki; Ren, XiangXiang; Wang, ShaoBo; Yang, Yong; Zhang, Dan; Zhang, Tao; Zhao, Li; Sun, XiangMing; Hu, ShouYang; Jian, SiYu; Li, XingLong; Li, XiaoMei; Liang, Hao; Zhang, HuanQiao; Zhao, MingRui; Zhou, Jing; Mao, YaJun; Qiao, Hao; Wang, SiGuang; Yuan, Ying; Wang, Meng; Khan, Amir N.; Raper, Neill; Tang, Jian; Wang, Wei; Dong, JiaNing; Feng, ChangQing; Li, Cheng; Liu, JianBei; Liu, ShuBin; Wang, XiaoLian; Zhu, DanYang; Castel, Juan F.; Cebrián, Susana; Dafni, Theopisti; Garza, Javier G.; Irastorza, Igor G.; Iguaz, Francisco J.; Luzón, Gloria; Mirallas, Hector; Aune, Stephan; Berthoumieux, Eric; Bedfer, Yann; Calvet, Denis; d'Hose, Nicole; Delbart, Alain; Diakaki, Maria; Ferrer-Ribas, Esther; Ferrero, Andrea; Kunne, Fabienne; Neyret, Damien; Papaevangelou, Thomas; Sabatié, Franck; Vanderbroucke, Maxence; Tan, AnDi; Haxton, Wick; Mei, Yuan; Kobdaj, Chinorat; Yan, Yu-Peng

    2017-06-01

    Searching for the neutrinoless double beta decay (NLDBD) is now regarded as the topmost promising technique to explore the nature of neutrinos after the discovery of neutrino masses in oscillation experiments. PandaX-III (particle and astrophysical xenon experiment III) will search for the NLDBD of 136Xe at the China Jin Ping Underground Laboratory (CJPL). In the first phase of the experiment, a high pressure gas Time Projection Chamber (TPC) will contain 200 kg, 90% 136Xe enriched gas operated at 10 bar. Fine pitch micro-pattern gas detector (Microbulk Micromegas) will be used at both ends of the TPC for the charge readout with a cathode in the middle. Charge signals can be used to reconstruct the electron tracks of the NLDBD events and provide good energy and spatial resolution. The detector will be immersed in a large water tank to ensure 5 m of water shielding in all directions. The second phase, a ton-scale experiment, will consist of five TPCs in the same water tank, with improved energy resolution and better control over backgrounds.

  2. Costs of Limiting Route Optimization to Published Waypoints in the Traffic Aware Planner

    NASA Technical Reports Server (NTRS)

    Karr, David A.; Vivona, Robert A.; Wing, David J.

    2013-01-01

    The Traffic Aware Planner (TAP) is an airborne advisory tool that generates optimized, traffic-avoiding routes to support the aircraft crew in making strategic reroute requests to Air Traffic Control (ATC). TAP is derived from a research-prototype self-separation tool, the Autonomous Operations Planner (AOP), in which optimized route modifications that avoid conflicts with traffic and weather, using waypoints at explicit latitudes and longitudes (a technique supported by self-separation concepts), are generated by maneuver patterns applied to the existing route. For use in current-day operations in which trajectory changes must be requested from ATC via voice communication, TAP produces optimized routes described by advisories that use only published waypoints prior to a reconnection waypoint on the existing route. We describe how the relevant algorithms of AOP have been modified to implement this requirement. The modifications include techniques for finding appropriate published waypoints in a maneuver pattern and a method for combining the genetic algorithm of AOP with an exhaustive search of certain types of advisory. We demonstrate methods to investigate the increased computation required by these techniques and to estimate other costs (measured in terms such as time to destination and fuel burned) that may be incurred when only published waypoints are used.

  3. Self-evaluation on Motion Adaptation for Service Robots

    NASA Astrophysics Data System (ADS)

    Funabora, Yuki; Yano, Yoshikazu; Doki, Shinji; Okuma, Shigeru

    We propose a self-evaluation method for motion adaptation that allows service robots to adapt to environmental changes. Motions such as walking, dancing, and demonstration gestures are described as time-series patterns. These motions are optimized for the robot's architecture and for a particular surrounding environment; in an unknown operating environment, the robot may fail to accomplish its tasks. We propose autonomous motion generation techniques based on heuristic search over histories of internal sensor values. New motion patterns are explored in the unknown operating environment on the basis of self-evaluation. The robot has prepared motions that accomplish its tasks in the designed environment, and the internal sensor values observed while executing these prepared motions in the designed environment capture the results of interacting with that environment. The self-evaluation is defined from the difference between the internal sensor values observed in the designed environment and those observed in the unknown operating environment. The proposed method modifies the motions so that the interaction results in the two environments coincide. New motion patterns are generated to maximize the self-evaluation function without external information such as run length, the robot's global position, or human observation. Experimental results show that patterned motions can be adapted autonomously to environmental changes.

  4. Computational mining for hypothetical patterns of amino acid side chains in protein data bank (PDB)

    NASA Astrophysics Data System (ADS)

    Ghani, Nur Syatila Ab; Firdaus-Raih, Mohd

    2018-04-01

    The three-dimensional structure of a protein can provide insights regarding its function. Functional relationships between proteins can be inferred from fold and sequence similarities, but in certain cases sequence or fold comparison fails to establish homology between proteins with similar mechanisms. Since structure is more conserved than sequence, a constellation of functional residues can be similarly arranged among proteins of similar mechanism. Local structural similarity searches are able to detect such constellations of amino acids among distinct proteins, which can be useful for annotating proteins of unknown function. Large-scale detection of such amino acid patterns can expand the repertoire of important 3D motifs, since the currently known 3D motifs cannot keep pace with the ever-increasing number of uncharacterized proteins awaiting annotation. Here, a computational platform for automated detection of 3D motifs is described. A fuzzy-pattern searching algorithm derived from the IMagine an Amino Acid 3D Arrangement search EnGINE (IMAAAGINE) was implemented to develop an automated method for searching hypothetical patterns of amino acid side chains in the Protein Data Bank (PDB), without the need for prior knowledge of the sequence or structure related to the pattern of interest. We present an example search: the detection of a hypothetical pattern derived from the known C2H2 structural motif of zinc fingers. The conservation of particular patterns of amino acid side chains in unrelated proteins is highlighted. This approach can act as a complementary method to available structure- and sequence-based platforms and may contribute to improving functional association between proteins.

  5. Prey field switching based on preferential behaviour can induce Lévy flights

    PubMed Central

    Lundy, Mathieu G.; Harrison, Alan; Buckley, Daniel J.; Boston, Emma S.; Scott, David D.; Teeling, Emma C.; Montgomery, W. Ian; Houghton, Jonathan D. R.

    2013-01-01

    Using the foraging movements of an insectivorous bat, Myotis mystacinus, we describe temporal switching of foraging behaviour in response to resource availability. These observations conform to predictions of optimized search under the Lévy flight paradigm. However, we suggest that this occurs as a result of a preference behaviour and knowledge of resource distribution. Preferential behaviour and knowledge of a familiar area generate distinct movement patterns as resource availability changes on short temporal scales. The behavioural response of predators to changes in prey fields can elicit different functional responses, which are considered to be central in the development of stable predator–prey communities. Recognizing how the foraging movements of an animal relate to environmental conditions also elucidates the evolution of optimized search and the prevalence of discrete strategies in natural systems. Applying techniques that use changes in the frequency distribution of movements facilitates exploration of the processes that underpin behavioural changes. PMID:23054951

  6. Path integration mediated systematic search: a Bayesian model.

    PubMed

    Vickerstaff, Robert J; Merkle, Tobias

    2012-08-21

    The systematic search behaviour is a backup system that increases the chances of desert ants finding their nest entrance after foraging when the path integrator has failed to guide them home accurately enough. Here we present a mathematical model of the systematic search that is based on extensive behavioural studies in North African desert ants Cataglyphis fortis. First, a simple search heuristic utilising Bayesian inference and a probability density function is developed. This model, which optimises the short-term nest detection probability, is then compared to three simpler search heuristics and to recorded search patterns of Cataglyphis ants. To compare the different searches a method to quantify search efficiency is established as well as an estimate of the error rate in the ants' path integrator. We demonstrate that the Bayesian search heuristic is able to automatically adapt to increasing levels of positional uncertainty to produce broader search patterns, just as desert ants do, and that it outperforms the three other search heuristics tested. The searches produced by it are also arguably the most similar in appearance to the ant's searches. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. Pattern Classifications Using Grover's and Ventura's Algorithms in a Two-qubits System

    NASA Astrophysics Data System (ADS)

    Singh, Manu Pratap; Radhey, Kishori; Rajput, B. S.

    2018-03-01

    Carrying out the classification of patterns in a two-qubit system by separately using Grover's and Ventura's algorithms on different possible superpositions, it has been shown that the exclusion superposition and the phase-invariance superposition are the most suitable search states obtained from two-pattern start-states and one-pattern start-states, respectively, for the simultaneous classification of patterns. The higher effectiveness of Grover's algorithm for large search states has been verified, but the higher effectiveness of Ventura's algorithm for smaller databases has been contradicted in two-qubit systems, and it has been demonstrated that unknown patterns (not present in the database concerned) are classified more efficiently than known ones (present in the database) in both algorithms. It has also been demonstrated that the different states of the Singh-Rajput MES obtained from the corresponding self-single-pattern start states are the most suitable search states for the classification of the patterns |00>, |01>, |10> and |11>, respectively, on the second iteration of Grover's method or the first operation of Ventura's algorithm.

  8. Google Searches for "Cheap Cigarettes" Spike at Tax Increases: Evidence from an Algorithm to Detect Spikes in Time Series Data.

    PubMed

    Caputi, Theodore L

    2018-05-03

    Online cigarette dealers have lower prices than brick-and-mortar retailers and advertise tax-free status [1-8]. Previous studies show smokers search out these online alternatives at the time of a cigarette tax increase [9,10]. However, these studies rely upon researchers' decision to consider a specific date and preclude the possibility that researchers focus on the wrong date. The purpose of this study is to introduce an unbiased methodology to the field of observing search patterns and to use this methodology to determine whether smokers search Google for "cheap cigarettes" at cigarette tax increases and, if so, whether the increased level of searches persists. Publicly available data from Google Trends are used to observe standardized search volumes for the term "cheap cigarettes". Seasonal Hybrid Extreme Studentized Deviate (SHESD) and E-Divisive with Means (EDM) tests were performed to observe spikes and mean level shifts in search volume. Of the twelve cigarette tax increases studied, ten showed spikes in searches for "cheap cigarettes" within two weeks of the tax increase. However, mean level shifts did not occur for any cigarette tax increase. Searches for "cheap cigarettes" spike around the time of a cigarette tax increase, but the mean level of searches does not shift in response to a tax increase. The SHESD and EDM tests are unbiased methodologies that can be used to identify spikes and mean level shifts in time series data without an a priori date to be studied. SHESD and EDM affirm that spikes in interest are related to tax increases. • Applies improved statistical techniques (SHESD and EDM) to Google search data related to cigarettes, reducing bias and increasing power • Contributes to the body of evidence that state and federal tax increases are associated with spikes in searches for cheap cigarettes and may be good dates for increased online health messaging related to tobacco.
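    The SHESD and EDM tests referenced above have open-source R implementations (e.g., Twitter's AnomalyDetection and BreakoutDetection packages). The fragment below is only a much-simplified stand-in that flags spikes with a robust (median/MAD) z-score on a weekly search-volume series; unlike SHESD it performs no seasonal decomposition, and the series values and threshold are invented for illustration.

```python
import numpy as np

def robust_spikes(series, threshold=3.5):
    """Flag points whose robust z-score (based on median and MAD) exceeds a threshold.

    A simplified stand-in for Seasonal Hybrid ESD: it ignores seasonality and
    simply highlights values far above the bulk of the series.
    """
    x = np.asarray(series, dtype=float)
    med = np.median(x)
    mad = np.median(np.abs(x - med)) or 1.0          # avoid division by zero
    z = 0.6745 * (x - med) / mad                     # 0.6745 makes MAD comparable to sigma
    return np.flatnonzero(z > threshold)

weekly_volume = [12, 11, 13, 12, 14, 13, 55, 14, 12, 13, 11, 12]  # spike in week 6
print(robust_spikes(weekly_volume))  # -> [6]
```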

  9. cDNA-AFLP analysis reveals differential gene expression in compatible interaction of wheat challenged with Puccinia striiformis f. sp. tritici

    PubMed Central

    Wang, Xiaojie; Tang, Chunlei; Zhang, Gang; Li, Yingchun; Wang, Chenfang; Liu, Bo; Qu, Zhipeng; Zhao, Jie; Han, Qingmei; Huang, Lili; Chen, Xianming; Kang, Zhensheng

    2009-01-01

    Background Puccinia striiformis f. sp. tritici is a fungal pathogen causing stripe rust, one of the most important wheat diseases worldwide. The fungus is strictly biotrophic and thus completely dependent on living host cells for its reproduction, which makes it difficult to study genes of the pathogen. In spite of its economic importance, little is known about the molecular basis of the compatible interaction between the pathogen and the wheat host. In this study, we identified wheat and P. striiformis genes associated with the infection process by conducting a large-scale transcriptomic analysis using cDNA-AFLP. Results Of the total 54,912 transcript-derived fragments (TDFs) obtained using cDNA-AFLP with 64 primer pairs, 2,306 (4.2%) displayed altered expression patterns after inoculation, of which 966 were up-regulated and 1,340 down-regulated. Of the 208 TDFs selected for sequencing, 186 produced reliable sequences, of which 74 (40%) had known functions based on BLAST searches of the GenBank database. The majority of the latter group had predicted gene products involved in energy (13%), signal transduction (5.4%), disease/defence (5.9%), and metabolism (5%), with percentages relative to the sequenced TDFs. BLAST searching of the wheat stem rust fungus genome database identified 18 TDFs possibly from the stripe rust pathogen, of which 9 were validated as being of pathogen origin using PCR-based assays followed by sequencing confirmation. Of the 186 reliable TDFs, 29 that were homologous to genes known to play a role in disease/defence or signal transduction, or to uncharacterized genes, were further selected for validation of the cDNA-AFLP expression patterns using qRT-PCR analyses. The results confirmed the altered expression patterns of 28 (96.5%) of these genes as revealed by the cDNA-AFLP technique. Conclusion The results show that cDNA-AFLP is a reliable technique for studying expression patterns of genes involved in the wheat-stripe rust interactions. Genes involved in compatible interactions between wheat and the stripe rust pathogen were identified and their expression patterns were determined. The present study should be helpful in elucidating the molecular basis of the infection process and in identifying genes that can be targeted for inhibiting the growth and reproduction of the pathogen. Moreover, this study can also be used to elucidate the defence responses of the genes that were of plant origin. PMID:19566949

  10. The effect of scleral search coil lens wear on the eye.

    PubMed

    Murphy, P J; Duncan, A L; Glennie, A J; Knox, P C

    2001-03-01

    Scleral search coils are used to measure eye movements. A recent abstract suggests that the coil can affect the eye by decreasing visual acuity, increasing intraocular pressure, and damaging the corneal and conjunctival surface. Such findings, if repeated in all subjects, would cast doubt on the credibility of the search coil as a reliable investigative technique. The aim of this study was to reassess the effect of the scleral search coil on visual function. Six volunteer subjects were selected to undergo coil wear and baseline measurements were taken of logMAR visual acuity, non-contact tonometry, keratometry, and slit lamp examination. Four drops of 0.4% benoxinate hydrochloride were instilled before insertion of the lens by an experienced clinician. The lens then remained on the eye for 30 minutes. Measurements of the four ocular health parameters were repeated after 15 and 30 minutes of lens wear. The lens was then removed and the health of the eye reassessed. No obvious pattern of change was found in logMAR visual acuity, keratometry, or intraocular pressure. The lens did produce changes to the conjunctival and corneal surfaces, but this was not considered clinically significant. Search coils do not appear to cause any significant effects on visual function. However, thorough prescreening of subjects and post-wear checks should be carried out on all coil wearers to ensure no adverse effects have been caused.

  11. Native Language Processing using Exegy Text Miner

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Compton, J

    2007-10-18

    Lawrence Livermore National Laboratory's New Architectures Testbed recently evaluated Exegy's Text Miner appliance to assess its applicability to high-performance, automated native language analysis. The evaluation was performed with support from the Computing Applications and Research Department in close collaboration with Global Security programs and institutional activities in native language analysis. The Exegy Text Miner is a special-purpose device for detecting and flagging user-supplied patterns of characters, whether in streaming text or in collections of documents, at very high rates. Patterns may consist of simple lists of words or complex expressions with sub-patterns linked by logical operators. These searches are accomplished through a combination of specialized hardware (i.e., one or more field-programmable gate arrays in addition to general-purpose processors) and proprietary software that exploits these individual components in an optimal manner (through parallelism and pipelining). For this application the Text Miner has performed accurately and reproducibly at high speeds approaching those documented by Exegy in its technical specifications. The Exegy Text Miner is primarily intended for the single-byte ASCII characters used in English, but at a technical level its capabilities are language-neutral and can be applied to multi-byte character sets such as those found in Arabic and Chinese. The system is used for searching databases or tracking streaming text with respect to one or more lexicons. In a real operational environment it is likely that data would need to be processed separately for each lexicon or search technique. However, the searches would be so fast that multiple passes should not be considered a limitation a priori. Indeed, it is conceivable that large databases could be searched as often as necessary if new queries were deemed worthwhile. This project is concerned with evaluating the Exegy Text Miner installed in the New Architectures Testbed running under software version 2.0. The concrete goals of the evaluation were to test the speed and accuracy of the Exegy appliance and to explore ways that it could be employed in current or future text-processing projects at Lawrence Livermore National Laboratory (LLNL); the evaluation also extended to its suitability for processing foreign language sources. The scope of this study was limited to the capabilities of the Exegy Text Miner in the file search mode and did not attempt to simulate the streaming mode. Since the capabilities of the machine are invariant to the choice of input mode, and since timing should not depend on this choice, it was felt that the added effort was not necessary for this restricted study.

  12. An automated approach to design of solid rockets utilizing a special internal ballistics model

    NASA Technical Reports Server (NTRS)

    Sforzini, R. H.

    1980-01-01

    A pattern search technique is presented, which is utilized in a computer program that minimizes the sum of the squares of the differences, at various times, between a desired thrust-time trace and that calculated with a special mathematical internal ballistics model of a solid propellant rocket motor. The program is demonstrated by matching the thrust-time trace obtained from static tests of the first Space Shuttle SRM starting with input values of 10 variables which are, in general, 10% different from the as-built SRM. It is concluded that an excellent match is obtained.
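    The general flavour of that approach can be sketched as follows: a simple compass-style pattern search (poll each coordinate in turn, keep any improving point, shrink the step when no move helps) minimizing the sum of squared differences between a toy parametric thrust-time model and a target trace. The two-parameter exponential model, the synthetic target data, and the tolerances are invented for illustration and are far simpler than the internal ballistics model and ten design variables used in the report.

```python
import numpy as np

times = np.linspace(0.0, 10.0, 50)
target = 3.0e6 * np.exp(-0.12 * times)            # hypothetical "measured" thrust trace

def model(params, t):
    """Toy two-parameter thrust model: initial thrust level and decay rate."""
    peak, decay = params
    return peak * np.exp(-decay * t)

def sse(params):
    """Sum of squared differences between the model and the target trace."""
    return float(np.sum((model(params, times) - target) ** 2))

def pattern_search(f, x0, step, tol=1e-6, max_iter=10_000):
    """Basic compass/pattern search: poll +/- step along each axis, shrink on failure."""
    x, fx = np.array(x0, dtype=float), f(x0)
    step = np.array(step, dtype=float)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):
            for sign in (+1.0, -1.0):
                trial = x.copy()
                trial[i] += sign * step[i]
                ft = f(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5
            if np.all(step < tol):
                break
    return x, fx

best, err = pattern_search(sse, x0=[2.0e6, 0.5], step=[1.0e5, 0.1])
print(best, err)   # converges toward peak ~3.0e6 and decay ~0.12
```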

  13. Choice: 36 band feature selection software with applications to multispectral pattern recognition

    NASA Technical Reports Server (NTRS)

    Jones, W. C.

    1973-01-01

    Feature selection software was developed at the Earth Resources Laboratory that is capable of inputting up to 36 channels and selecting channel subsets according to several criteria based on divergence. One of the criteria used is compatible with the table look-up classifier requirements. The software indicates which channel subset best separates (based on average divergence) each class from all other classes. The software employs an exhaustive search technique, and computer time is not prohibitive. A typical task to select the best 4 of 22 channels for 12 classes takes 9 minutes on a Univac 1108 computer.
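    The exhaustive strategy described above is easy to reproduce in outline: enumerate every channel subset of the requested size and keep the one with the best separability score. In the sketch below the scoring function is a stand-in (the real software computed average divergence from class statistics); the channel count and subset size match the figures quoted in the abstract, but the scoring and everything else are purely illustrative.

```python
from itertools import combinations
import random

N_CHANNELS, SUBSET_SIZE = 22, 4

def separability(subset):
    """Stand-in for the average divergence between classes over the chosen channels.

    In the real application this would be computed from per-class statistics
    restricted to `subset`; here it is just a deterministic dummy score.
    """
    return sum(random.Random(ch).random() for ch in subset)

best = max(combinations(range(N_CHANNELS), SUBSET_SIZE), key=separability)
n_subsets = sum(1 for _ in combinations(range(N_CHANNELS), SUBSET_SIZE))
print(best, n_subsets)   # 7315 candidate subsets -- small enough for exhaustive search
```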

  14. [Reason and emotion: integration of cognitive-behavioural and experiential interventions in the treatment of long-standing eating disorders].

    PubMed

    Vilariño Besteiro, M P; Pérez Franco, C; Gallego Morales, L; Calvo Sagardoy, R; García de Lorenzo, A

    2009-01-01

    This paper intends to show the combination of therapeutic strategies in the treatment of long-standing eating disorders. This way of working, entitled "Modelo Santa Cristina", is based on several theoretical paradigms: the Enabling Model, the Action Control Model, the Transtheoretical Model of change processes, and the Cognitive-Behavioural Model (cognitive restructuring and learning theories), complemented by Gestalt, systemic, and psychodrama-oriented techniques. The purpose of the treatment is both the normalization of eating patterns and an increase in patients' self-knowledge, self-acceptance, and self-efficacy. The main areas of intervention include the exploration of ambivalence to change, the discovery of the functions of symptoms and the search for alternative behaviours, the normalization of eating patterns, body image, cognitive restructuring, decision making, communication skills, and the elaboration of traumatic experiences.

  15. Dynamic search and working memory in social recall.

    PubMed

    Hills, Thomas T; Pachur, Thorsten

    2012-01-01

    What are the mechanisms underlying search in social memory (e.g., remembering the people one knows)? Do the search mechanisms involve dynamic local-to-global transitions similar to semantic search, and are these transitions governed by the general control of attention, associated with working memory span? To find out, we asked participants to recall individuals from their personal social networks and measured each participant's working memory capacity. Additionally, participants provided social-category and contact-frequency information about the recalled individuals as well as information about the social proximity among the recalled individuals. On the basis of these data, we tested various computational models of memory search regarding their ability to account for the patterns in which participants recalled from social memory. Although recall patterns showed clustering based on social categories, models assuming dynamic transitions between representations cued by social proximity and frequency information predicted participants' recall patterns best; no additional explanatory power was gained from social-category information. Moreover, individual differences in the time between transitions were positively correlated with differences in working memory capacity. These results highlight the role of social proximity in structuring social memory and elucidate the role of working memory for maintaining search criteria during search within that structure.

  16. Signatures of a globally optimal searching strategy in the three-dimensional foraging flights of bumblebees

    NASA Astrophysics Data System (ADS)

    Lihoreau, Mathieu; Ings, Thomas C.; Chittka, Lars; Reynolds, Andy M.

    2016-07-01

    Simulated annealing is a powerful stochastic search algorithm for locating a global maximum that is hidden among many poorer local maxima in a search space. It is frequently implemented in computers working on complex optimization problems but until now has not been directly observed in nature as a searching strategy adopted by foraging animals. We analysed high-speed video recordings of the three-dimensional searching flights of bumblebees (Bombus terrestris) made in the presence of large or small artificial flowers within a 0.5 m3 enclosed arena. Analyses of the three-dimensional flight patterns in both conditions reveal signatures of simulated annealing searches. After leaving a flower, bees tend to scan back and forth past that flower before making prospecting flights (loops), whose length increases over time. The search pattern becomes gradually more expansive and culminates when another rewarding flower is found. Bees then scan back and forth in the vicinity of the newly discovered flower and the process repeats. This looping search pattern, in which flight step lengths are typically power-law distributed, provides a relatively simple yet highly efficient strategy for pollinators such as bees to find the best-quality resources in complex environments made of multiple ephemeral feeding sites with nutritionally variable rewards.
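    For readers unfamiliar with the algorithm invoked here, the fragment below is a generic simulated annealing loop on a one-dimensional multimodal "quality" landscape. The objective function, proposal scale, and geometric cooling schedule are illustrative choices and are not a model of the bee experiment.

```python
import math
import random

def quality(x):
    """Toy landscape: many local maxima, with the global maximum near x ~ 2.5."""
    return 0.5 * math.cos(5.0 * x) - 0.1 * (x - 2.0) ** 2

def simulated_annealing(f, x0=0.0, t0=1.0, cooling=0.995, n_steps=5000, seed=0):
    rng = random.Random(seed)
    x, fx, temp = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(n_steps):
        cand = x + rng.gauss(0.0, 0.5)             # local random proposal
        fc = f(cand)
        # Always accept uphill moves; accept downhill moves with Boltzmann probability.
        if fc >= fx or rng.random() < math.exp((fc - fx) / temp):
            x, fx = cand, fc
            if fx > best_f:
                best_x, best_f = x, fx
        temp *= cooling                             # geometric cooling schedule
    return best_x, best_f

# Prints the best (x, f(x)) found; for this landscape the global maximum is near x ~ 2.5.
print(simulated_annealing(quality))
```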

  17. Visual search for facial expressions of emotions: a comparison of dynamic and static faces.

    PubMed

    Horstmann, Gernot; Ansorge, Ulrich

    2009-02-01

    A number of past studies have used the visual search paradigm to examine whether certain aspects of emotional faces are processed preattentively and can thus be used to guide attention. All these studies presented static depictions of facial prototypes. Emotional expressions conveyed by the movement patterns of the face have never been examined for their preattentive effect. The present study presented for the first time dynamic facial expressions in a visual search paradigm. Experiment 1 revealed efficient search for a dynamic angry face among dynamic friendly faces, but inefficient search in a control condition with static faces. Experiments 2 to 4 suggested that this pattern of results is due to a stronger movement signal in the angry than in the friendly face: No (strong) advantage of dynamic over static faces is revealed when the degree of movement is controlled. These results show that dynamic information can be efficiently utilized in visual search for facial expressions. However, these results do not generally support the hypothesis that emotion-specific movement patterns are always preattentively discriminated. (c) 2009 APA, all rights reserved

  18. Optimization of Online Searching by Pre-Recording the Search Statements: A Technique for the HP-2645A Terminal.

    ERIC Educational Resources Information Center

    Oberhauser, O. C.; Stebegg, K.

    1982-01-01

    Describes the terminal's capabilities, ways to store and call up lines of statements, cassette tapes needed during searches, and master tape's use for login storage. Advantages of the technique and two sources are listed. (RBF)

  19. Transterm—extended search facilities and improved integration with other databases

    PubMed Central

    Jacobs, Grant H.; Stockwell, Peter A.; Tate, Warren P.; Brown, Chris M.

    2006-01-01

    Transterm has now been publicly available for >10 years. Major changes have been made since its last description in this database issue in 2002. The current database provides data for key regions of mRNA sequences, a curated database of mRNA motifs and tools to allow users to investigate their own motifs or mRNA sequences. The key mRNA regions database is derived computationally from Genbank. It contains 3′ and 5′ flanking regions, the initiation and termination signal context and coding sequence for annotated CDS features from Genbank and RefSeq. The database is non-redundant, enabling summary files and statistics to be prepared for each species. Advances include providing extended search facilities, the database may now be searched by BLAST in addition to regular expressions (patterns) allowing users to search for motifs such as known miRNA sequences, and the inclusion of RefSeq data. The database contains >40 motifs or structural patterns important for translational control. In this release, patterns from UTRsite and Rfam are also incorporated with cross-referencing. Users may search their sequence data with Transterm or user-defined patterns. The system is accessible at . PMID:16381889

  20. Kangaroo – A pattern-matching program for biological sequences

    PubMed Central

    2002-01-01

    Background Biologists are often interested in performing a simple database search to identify proteins or genes that contain a well-defined sequence pattern. Many databases do not provide straightforward or readily available query tools to perform simple searches, such as identifying transcription binding sites, protein motifs, or repetitive DNA sequences. However, in many cases simple pattern-matching searches can reveal a wealth of information. We present in this paper a regular expression pattern-matching tool that was used to identify short repetitive DNA sequences in human coding regions for the purpose of identifying potential mutation sites in mismatch repair deficient cells. Results Kangaroo is a web-based regular expression pattern-matching program that can search for patterns in DNA, protein, or coding region sequences in ten different organisms. The program is implemented to facilitate a wide range of queries with no restriction on the length or complexity of the query expression. The program is accessible on the web at http://bioinfo.mshri.on.ca/kangaroo/ and the source code is freely distributed at http://sourceforge.net/projects/slritools/. Conclusion A low-level simple pattern-matching application can prove to be a useful tool in many research settings. For example, Kangaroo was used to identify potential genetic targets in a human colorectal cancer variant that is characterized by a high frequency of mutations in coding regions containing mononucleotide repeats. PMID:12150718
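    The kind of low-level pattern matching Kangaroo provides can be sketched with Python's re module. The pattern below looks for mononucleotide repeats of eight or more bases in a made-up coding sequence; the sequence and the repeat-length threshold are illustrative and are not those used in the study.

```python
import re

# Runs of 8 or more identical nucleotides, a common target in mismatch-repair studies.
MONO_REPEAT = re.compile(r"(A{8,}|C{8,}|G{8,}|T{8,})")

def find_repeats(seq):
    """Return (start, end, run) for every mononucleotide repeat in the sequence."""
    return [(m.start(), m.end(), m.group()) for m in MONO_REPEAT.finditer(seq)]

cds = "ATGGCG" + "A" * 9 + "GC" + "T" * 9 + "CGGATCC"   # toy coding sequence
print(find_repeats(cds))
# -> [(6, 15, 'AAAAAAAAA'), (17, 26, 'TTTTTTTTT')]
```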

  1. Evidence of Lévy walk foraging patterns in human hunter-gatherers.

    PubMed

    Raichlen, David A; Wood, Brian M; Gordon, Adam D; Mabulla, Audax Z P; Marlowe, Frank W; Pontzer, Herman

    2014-01-14

    When searching for food, many organisms adopt a superdiffusive, scale-free movement pattern called a Lévy walk, which is considered optimal when foraging for heterogeneously located resources with little prior knowledge of distribution patterns [Viswanathan GM, da Luz MGE, Raposo EP, Stanley HE (2011) The Physics of Foraging: An Introduction to Random Searches and Biological Encounters]. Although memory of food locations and higher cognition may limit the benefits of random walk strategies, no studies to date have fully explored search patterns in human foraging. Here, we show that human hunter-gatherers, the Hadza of northern Tanzania, perform Lévy walks in nearly one-half of all foraging bouts. Lévy walks occur when searching for a wide variety of foods from animal prey to underground tubers, suggesting that, even in the most cognitively complex forager on Earth, such patterns are essential to understanding elementary foraging mechanisms. This movement pattern may be fundamental to how humans experience and interact with the world across a wide range of ecological contexts, and it may be adaptive to food distribution patterns on the landscape, which previous studies suggested for organisms with more limited cognition. Additionally, Lévy walks may have become common early in our genus when hunting and gathering arose as a major foraging strategy, playing an important role in the evolution of human mobility.

  2. Social Search: A Taxonomy of, and a User-Centred Approach to, Social Web Search

    ERIC Educational Resources Information Center

    McDonnell, Michael; Shiri, Ali

    2011-01-01

    Purpose: The purpose of this paper is to introduce the notion of social search as a new concept, drawing upon the patterns of web search behaviour. It aims to: define social search; present a taxonomy of social search; and propose a user-centred social search method. Design/methodology/approach: A mixed method approach was adopted to investigate…

  3. Searching for Truth: Internet Search Patterns as a Method of Investigating Online Responses to a Russian Illicit Drug Policy Debate

    PubMed Central

    Gillespie, James A; Quinn, Casey

    2012-01-01

    Background This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. Objective This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. Methods A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman Rank Correlation of GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. Results We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (r(s) = 0.88, P < .001), "Bychkov" (r(s) = .78, P < .001) and "Khimki" (r(s) = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for "Bychkov" and 48,084 for "Egor Bychkov", compared to 53,403 for "Khimki" in Yandex. We found Google potentially provides timely search results, whereas Yandex provides more accurate geographic localization. The correlation was moderate to strong between search terms representing the Bychkov episode and terms representing salient drug issues in Yandex: "illicit drug treatment" (r(s) = .90, P < .001), "illicit drugs" (r(s) = .76, P < .001), and "drug addiction" (r(s) = .74, P < .001). Google correlations were weaker or absent: "illicit drug treatment" (r(s) = .12, P = .58), "illicit drugs" (r(s) = -0.29, P = .17), and "drug addiction" (r(s) = .68, P < .001). Conclusions This study contributes to the methodological literature on the analysis of search patterns for public health. This paper investigated the relationship between Google and Yandex, and contributed to the broader methods literature by highlighting both the potential and limitations of these two search providers. We believe that Yandex Wordstat is a potentially valuable and underused data source for researchers working on Russian-related illicit drug policy and other public health problems. The Russian Federation, with its large, geographically dispersed, and politically engaged online population, presents unique opportunities for studying the evolving influence of the Internet on politics and policy, using low cost methods resilient against potential increases in censorship. PMID:23238600
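    The core statistical step of this method, correlating two providers' search-volume series with a Spearman rank correlation, is straightforward to reproduce. The sketch below applies scipy.stats.spearmanr to two invented weekly series; the numbers are illustrative only and are not the study's data.

```python
from scipy.stats import spearmanr

# Hypothetical weekly search volumes for the same term from two providers.
google_series = [120, 135, 150, 900, 400, 220, 180, 160, 150, 140]
yandex_series = [300, 310, 360, 2100, 950, 500, 420, 380, 370, 350]

rho, p_value = spearmanr(google_series, yandex_series)
print(f"r_s = {rho:.2f}, P = {p_value:.4f}")
# A high r_s indicates the two providers rank the weeks of interest similarly.
```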

  4. Searching for truth: internet search patterns as a method of investigating online responses to a Russian illicit drug policy debate.

    PubMed

    Zheluk, Andrey; Gillespie, James A; Quinn, Casey

    2012-12-13

    This is a methodological study investigating the online responses to a national debate over an important health and social problem in Russia. Russia is the largest Internet market in Europe, exceeding Germany in the absolute number of users. However, Russia is unusual in that the main search provider is not Google, but Yandex. This study had two main objectives. First, to validate Yandex search patterns against those provided by Google, and second, to test this method's adequacy for investigating online interest in a 2010 national debate over Russian illicit drug policy. We hoped to learn what search patterns and specific search terms could reveal about the relative importance and geographic distribution of interest in this debate. A national drug debate, centering on the anti-drug campaigner Egor Bychkov, was one of the main Russian domestic news events of 2010. Public interest in this episode was accompanied by increased Internet search. First, we measured the search patterns for 13 search terms related to the Bychkov episode and concurrent domestic events by extracting data from Google Insights for Search (GIFS) and Yandex WordStat (YaW). We conducted Spearman Rank Correlation of GIFS and YaW search data series. Second, we coded all 420 primary posts from Bychkov's personal blog between March 2010 and March 2012 to identify the main themes. Third, we compared GIFS and Yandex policies concerning the public release of search volume data. Finally, we established the relationship between salient drug issues and the Bychkov episode. We found a consistent pattern of strong to moderate positive correlations between Google and Yandex for the terms "Egor Bychkov" (r(s) = 0.88, P < .001), "Bychkov" (r(s) = .78, P < .001) and "Khimki"(r(s) = 0.92, P < .001). Peak search volumes for the Bychkov episode were comparable to other prominent domestic political events during 2010. Monthly search counts were 146,689 for "Bychkov" and 48,084 for "Egor Bychkov", compared to 53,403 for "Khimki" in Yandex. We found Google potentially provides timely search results, whereas Yandex provides more accurate geographic localization. The correlation was moderate to strong between search terms representing the Bychkov episode and terms representing salient drug issues in Yandex-"illicit drug treatment" (r(s) = .90, P < .001), "illicit drugs" (r(s) = .76, P < .001), and "drug addiction" (r(s) = .74, P < .001). Google correlations were weaker or absent-"illicit drug treatment" (r(s) = .12, P = .58), "illicit drugs " (r(s) = -0.29, P = .17), and "drug addiction" (r(s) = .68, P < .001). This study contributes to the methodological literature on the analysis of search patterns for public health. This paper investigated the relationship between Google and Yandex, and contributed to the broader methods literature by highlighting both the potential and limitations of these two search providers. We believe that Yandex Wordstat is a potentially valuable, and underused data source for researchers working on Russian-related illicit drug policy and other public health problems. The Russian Federation, with its large, geographically dispersed, and politically engaged online population presents unique opportunities for studying the evolving influence of the Internet on politics and policy, using low cost methods resilient against potential increases in censorship.

  5. Why Lévy Foraging does not need to be 'unshackled' from Optimal Foraging Theory. Comment on "Liberating Lévy walk research from the shackles of optimal foraging" by A.M. Reynolds

    NASA Astrophysics Data System (ADS)

    Humphries, Nicolas E.

    2015-09-01

    The comprehensive review of Lévy patterns observed in the moves and pauses of a vast array of organisms by Reynolds [1] makes clear a need to attempt to unify phenomena to understand how organism movement may have evolved. However, I would contend that the research on Lévy 'movement patterns' we detect in time series of animal movements has to a large extent been misunderstood. The statistical techniques, such as Maximum Likelihood Estimation, used to detect these patterns look only at the statistical distribution of move step-lengths and not at the actual pattern, or structure, of the movement path. The path structure is lost altogether when move step-lengths are sorted prior to analysis. Likewise, the simulated movement paths, with step-lengths drawn from a truncated power law distribution in order to test characteristics of the path, such as foraging efficiency, in no way match the actual paths, or trajectories, of real animals. These statistical distributions are, therefore, null models of searching or foraging activity. What has proved surprising about these step-length distributions is the extent to which they improve the efficiency of random searches over simple Brownian motion. It has been shown unequivocally that a power law distribution of move step lengths is more efficient, in terms of prey items located per unit distance travelled, than any other distribution of move step-lengths so far tested (up to 3 times better than Brownian), and over a range of prey field densities spanning more than 4 orders of magnitude [2].
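    The Maximum Likelihood Estimation mentioned here is typically a Clauset-style estimator of the power-law exponent for step lengths above a chosen lower cutoff x_min. A minimal sketch is shown below; the synthetic data and the choice of x_min are illustrative, and a real analysis would also compare candidate distributions (e.g., exponential) and handle truncation.

```python
import numpy as np

def powerlaw_mle(steps, x_min):
    """Clauset-style MLE of the exponent mu for p(x) ~ x**(-mu), x >= x_min."""
    x = np.asarray(steps, dtype=float)
    x = x[x >= x_min]
    return 1.0 + len(x) / np.sum(np.log(x / x_min))

# Synthetic step lengths with a true exponent of 2.0 (inverse-transform sampling).
rng = np.random.default_rng(0)
true_mu, x_min = 2.0, 1.0
steps = x_min * (1.0 - rng.random(10_000)) ** (-1.0 / (true_mu - 1.0))
print(powerlaw_mle(steps, x_min))   # close to 2.0
```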

  6. Developing a Systematic Patent Search Training Program

    ERIC Educational Resources Information Center

    Zhang, Li

    2009-01-01

    This study aims to develop a systematic patent training program using patent analysis and citation analysis techniques applied to patents held by the University of Saskatchewan. The results indicate that the target audience will be researchers in life sciences, and aggregated patent database searching and advanced search techniques should be…

  7. Combining results of multiple search engines in proteomics.

    PubMed

    Shteynberg, David; Nesvizhskii, Alexey I; Moritz, Robert L; Deutsch, Eric W

    2013-09-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques.

  8. Combining Results of Multiple Search Engines in Proteomics*

    PubMed Central

    Shteynberg, David; Nesvizhskii, Alexey I.; Moritz, Robert L.; Deutsch, Eric W.

    2013-01-01

    A crucial component of the analysis of shotgun proteomics datasets is the search engine, an algorithm that attempts to identify the peptide sequence from the parent molecular ion that produced each fragment ion spectrum in the dataset. There are many different search engines, both commercial and open source, each employing a somewhat different technique for spectrum identification. The set of high-scoring peptide-spectrum matches for a defined set of input spectra differs markedly among the various search engine results; individual engines each provide unique correct identifications among a core set of correlative identifications. This has led to the approach of combining the results from multiple search engines to achieve improved analysis of each dataset. Here we review the techniques and available software for combining the results of multiple search engines and briefly compare the relative performance of these techniques. PMID:23720762

  9. Breeding novel solutions in the brain: a model of Darwinian neurodynamics.

    PubMed

    Szilágyi, András; Zachar, István; Fedor, Anna; de Vladar, Harold P; Szathmáry, Eörs

    2016-01-01

    Background : The fact that surplus connections and neurons are pruned during development is well established. We complement this selectionist picture by a proof-of-principle model of evolutionary search in the brain, that accounts for new variations in theory space. We present a model for Darwinian evolutionary search for candidate solutions in the brain. Methods : We combine known components of the brain - recurrent neural networks (acting as attractors), the action selection loop and implicit working memory - to provide the appropriate Darwinian architecture. We employ a population of attractor networks with palimpsest memory. The action selection loop is employed with winners-share-all dynamics to select for candidate solutions that are transiently stored in implicit working memory. Results : We document two processes: selection of stored solutions and evolutionary search for novel solutions. During the replication of candidate solutions attractor networks occasionally produce recombinant patterns, increasing variation on which selection can act. Combinatorial search acts on multiplying units (activity patterns) with hereditary variation and novel variants appear due to (i) noisy recall of patterns from the attractor networks, (ii) noise during transmission of candidate solutions as messages between networks, and, (iii) spontaneously generated, untrained patterns in spurious attractors. Conclusions : Attractor dynamics of recurrent neural networks can be used to model Darwinian search. The proposed architecture can be used for fast search among stored solutions (by selection) and for evolutionary search when novel candidate solutions are generated in successive iterations. Since all the suggested components are present in advanced nervous systems, we hypothesize that the brain could implement a truly evolutionary combinatorial search system, capable of generating novel variants.

  10. Evidence-Based Practice in Liposuction.

    PubMed

    Collins, Patrick S; Moyer, Kurtis E

    2018-06-01

    The goal of this study is to examine the existing peer-reviewed literature comparing modern adjunctive techniques in liposuction, including laser-assisted liposuction (LAL) and ultrasound-assisted liposuction (UAL), to standard suction-assisted liposuction (SAL). We intend to translate these findings into a literature-based clinical application to influence practice patterns. A literature review was conducted using a keyword search in PubMed. Keyword search items included liposuction, lipoplasty, suction assisted liposuction, ultrasound assisted liposuction, laser assisted liposuction, tumescent, liposuction comparison, liposuction review, and combinations therein. Exclusion criteria included articles with a primary focus on histologic effects of energy devices, primary animal models, primary opinion papers with no reference to available data, and industry-sponsored publications. Inclusion criteria included articles with direct comparison of liposuction modalities, randomized or blinded studies, and studies with objective outcomes. Of the 9972 articles identified, 25 met the inclusion criteria comparing SAL to UAL or LAL. The selected literature was assigned to 3 categories: evidence demonstrating an advantage of 1 modality (SAL, UAL, or LAL) over another, evidence that showed no benefit of 1 modality over another, and evidence that demonstrated risks of complications of 1 modality over another. The benefits of UAL and LAL over SAL include the following: (1) UAL over SAL in the treatment of gynecomastia, (2) LAL and UAL over SAL with decreased hemoglobin/hematocrit in high-volume lipoaspirates, and (3) LAL over SAL with skin tightening in select areas, specifically the submental area. Otherwise, the literature demonstrates equivocal results among the described techniques with no clear benefit to set one apart from the other. There appears to be no demonstrable added benefit to the addition of either UAL or LAL that would urge a change in practice patterns outside the exceptions listed.

  11. Optimization of Boiling Water Reactor Loading Pattern Using Two-Stage Genetic Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobayashi, Yoko; Aiyoshi, Eitaro

    2002-10-15

    A new two-stage optimization method based on genetic algorithms (GAs) using an if-then heuristic rule was developed to generate optimized boiling water reactor (BWR) loading patterns (LPs). In the first stage, the LP is optimized using an improved GA operator. In the second stage, an exposure-dependent control rod pattern (CRP) is sought using GA with an if-then heuristic rule. The procedure of the improved GA is based on deterministic operators that consist of crossover, mutation, and selection. The handling of the encoding technique and constraint conditions by that GA reflects the peculiar characteristics of the BWR. In addition, strategies such as elitism and self-reproduction are effectively used in order to improve the search speed. The LP evaluations were performed with a three-dimensional diffusion code that coupled neutronic and thermal-hydraulic models. Strong axial heterogeneities and constraints dependent on three dimensions have always necessitated the use of three-dimensional core simulators for BWRs, so that optimization of computational efficiency is required. The proposed algorithm is demonstrated by successfully generating LPs for an actual BWR plant in two phases. One phase is only LP optimization applying the Haling technique. The other phase is an LP optimization that considers the CRP during reactor operation. In test calculations, candidates that shuffled fresh and burned fuel assemblies within a reasonable computation time were obtained.
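    The improved GA described above follows the standard evolutionary template of crossover, mutation, and selection with elitism. The sketch below illustrates that generic loop for a permutation-encoded loading pattern; the fitness function is a toy placeholder, and the paper's BWR-specific encoding, constraint handling, and if-then control-rod heuristic are not reproduced.

    ```python
    import random

    def evolve(fitness, n_positions, pop_size=50, generations=200,
               elite_frac=0.1, mutation_rate=0.1):
        """Generic permutation GA with elitism; 'fitness' is a user-supplied
        objective (higher is better). A sketch, not the paper's BWR operator set."""
        pop = [random.sample(range(n_positions), n_positions) for _ in range(pop_size)]
        n_elite = max(1, int(elite_frac * pop_size))
        for _ in range(generations):
            ranked = sorted(pop, key=fitness, reverse=True)
            next_pop = ranked[:n_elite]                      # elitism
            while len(next_pop) < pop_size:
                p1, p2 = random.sample(ranked[:pop_size // 2], 2)
                cut = random.randrange(1, n_positions)       # simplified order crossover
                child = p1[:cut] + [g for g in p2 if g not in p1[:cut]]
                if random.random() < mutation_rate:          # swap mutation
                    i, j = random.sample(range(n_positions), 2)
                    child[i], child[j] = child[j], child[i]
                next_pop.append(child)
            pop = next_pop
        return max(pop, key=fitness)

    # Toy usage: favour permutations that place high-index ("fresh") assemblies near the centre.
    best = evolve(lambda perm: -sum(abs(i - len(perm) // 2) * g for i, g in enumerate(perm)),
                  n_positions=10)
    ```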

  12. Search for patterns by combining cosmic-ray energy and arrival directions at the Pierre Auger Observatory.

    PubMed

    Aab, A; Abreu, P; Aglietta, M; Ahn, E J; Samarai, I Al; Albuquerque, I F M; Allekotte, I; Allen, J; Allison, P; Almela, A; Castillo, J Alvarez; Alvarez-Muñiz, J; Batista, R Alves; Ambrosio, M; Aminaei, A; Anchordoqui, L; Andringa, S; Aramo, C; Aranda, V M; Arqueros, F; Asorey, H; Assis, P; Aublin, J; Ave, M; Avenier, M; Avila, G; Awal, N; Badescu, A M; Barber, K B; Bäuml, J; Baus, C; Beatty, J J; Becker, K H; Bellido, J A; Berat, C; Bertaina, M E; Bertou, X; Biermann, P L; Billoir, P; Blaess, S; Blanco, M; Bleve, C; Blümer, H; Boháčová, M; Boncioli, D; Bonifazi, C; Bonino, R; Borodai, N; Brack, J; Brancus, I; Bridgeman, A; Brogueira, P; Brown, W C; Buchholz, P; Bueno, A; Buitink, S; Buscemi, M; Caballero-Mora, K S; Caccianiga, B; Caccianiga, L; Candusso, M; Caramete, L; Caruso, R; Castellina, A; Cataldi, G; Cazon, L; Cester, R; Chavez, A G; Chiavassa, A; Chinellato, J A; Chudoba, J; Cilmo, M; Clay, R W; Cocciolo, G; Colalillo, R; Coleman, A; Collica, L; Coluccia, M R; Conceição, R; Contreras, F; Cooper, M J; Cordier, A; Coutu, S; Covault, C E; Cronin, J; Curutiu, A; Dallier, R; Daniel, B; Dasso, S; Daumiller, K; Dawson, B R; Almeida, R M de; Domenico, M De; Jong, S J de; Neto, J R T de Mello; Mitri, I De; Oliveira, J de; Souza, V de; Peral, L Del; Deligny, O; Dembinski, H; Dhital, N; Giulio, C Di; Matteo, A Di; Diaz, J C; Castro, M L Díaz; Diogo, F; Dobrigkeit, C; Docters, W; D'Olivo, J C; Dorofeev, A; Hasankiadeh, Q Dorosti; Dova, M T; Ebr, J; Engel, R; Erdmann, M; Erfani, M; Escobar, C O; Espadanal, J; Etchegoyen, A; Luis, P Facal San; Falcke, H; Fang, K; Farrar, G; Fauth, A C; Fazzini, N; Ferguson, A P; Fernandes, M; Fick, B; Figueira, J M; Filevich, A; Filipčič, A; Fox, B D; Fratu, O; Fröhlich, U; Fuchs, B; Fujii, T; Gaior, R; García, B; Roca, S T Garcia; Garcia-Gamez, D; Garcia-Pinto, D; Garilli, G; Bravo, A Gascon; Gate, F; Gemmeke, H; Ghia, P L; Giaccari, U; Giammarchi, M; Giller, M; Glaser, C; Glass, H; Berisso, M Gómez; Vitale, P F Gómez; Gonçalves, P; Gonzalez, J G; González, N; Gookin, B; Gordon, J; Gorgi, A; Gorham, P; Gouffon, P; Grebe, S; Griffith, N; Grillo, A F; Grubb, T D; Guarino, F; Guedes, G P; Hampel, M R; Hansen, P; Harari, D; Harrison, T A; Hartmann, S; Harton, J L; Haungs, A; Hebbeker, T; Heck, D; Heimann, P; Herve, A E; Hill, G C; Hojvat, C; Hollon, N; Holt, E; Homola, P; Hörandel, J R; Horvath, P; Hrabovský, M; Huber, D; Huege, T; Insolia, A; Isar, P G; Jandt, I; Jansen, S; Jarne, C; Josebachuili, M; Kääpä, A; Kambeitz, O; Kampert, K H; Kasper, P; Katkov, I; Kégl, B; Keilhauer, B; Keivani, A; Kemp, E; Kieckhafer, R M; Klages, H O; Kleifges, M; Kleinfeller, J; Krause, R; Krohm, N; Krömer, O; Kruppke-Hansen, D; Kuempel, D; Kunka, N; LaHurd, D; Latronico, L; Lauer, R; Lauscher, M; Lautridou, P; Coz, S Le; Leão, M S A B; Lebrun, D; Lebrun, P; Oliveira, M A Leigui de; Letessier-Selvon, A; Lhenry-Yvon, I; Link, K; López, R; Agüera, A Lopez; Louedec, K; Bahilo, J Lozano; Lu, L; Lucero, A; Ludwig, M; Malacari, M; Maldera, S; Mallamaci, M; Maller, J; Mandat, D; Mantsch, P; Mariazzi, A G; Marin, V; Mariş, I C; Marsella, G; Martello, D; Martin, L; Martinez, H; Bravo, O Martínez; Martraire, D; Meza, J J Masías; Mathes, H J; Mathys, S; Matthews, J; Matthews, J A J; Matthiae, G; Maurel, D; Maurizio, D; Mayotte, E; Mazur, P O; Medina, C; Medina-Tanco, G; Meissner, R; Melissas, M; Melo, D; Menshikov, A; Messina, S; Meyhandan, R; Mićanović, S; Micheletti, M I; Middendorf, L; Minaya, I A; Miramonti, L; Mitrica, B; Molina-Bueno, L; Mollerach, S; Monasor, M; Ragaigne, D Monnier; 
Montanet, F; Morello, C; Mostafá, M; Moura, C A; Muller, M A; Müller, G; Müller, S; Münchmeyer, M; Mussa, R; Navarra, G; Navas, S; Necesal, P; Nellen, L; Nelles, A; Neuser, J; Nguyen, P; Niechciol, M; Niemietz, L; Niggemann, T; Nitz, D; Nosek, D; Novotny, V; Nožka, L; Ochilo, L; Olinto, A; Oliveira, M; Pacheco, N; Selmi-Dei, D Pakk; Palatka, M; Pallotta, J; Palmieri, N; Papenbreer, P; Parente, G; Parra, A; Paul, T; Pech, M; Pȩkala, J; Pelayo, R; Pepe, I M; Perrone, L; Petermann, E; Peters, C; Petrera, S; Petrov, Y; Phuntsok, J; Piegaia, R; Pierog, T; Pieroni, P; Pimenta, M; Pirronello, V; Platino, M; Plum, M; Porcelli, A; Porowski, C; Prado, R R; Privitera, P; Prouza, M; Purrello, V; Quel, E J; Querchfeld, S; Quinn, S; Rautenberg, J; Ravel, O; Ravignani, D; Revenu, B; Ridky, J; Riggi, S; Risse, M; Ristori, P; Rizi, V; Carvalho, W Rodrigues de; Cabo, I Rodriguez; Fernandez, G Rodriguez; Rojo, J Rodriguez; Rodríguez-Frías, M D; Rogozin, D; Ros, G; Rosado, J; Rossler, T; Roth, M; Roulet, E; Rovero, A C; Saffi, S J; Saftoiu, A; Salamida, F; Salazar, H; Saleh, A; Greus, F Salesa; Salina, G; Sánchez, F; Sanchez-Lucas, P; Santo, C E; Santos, E; Santos, E M; Sarazin, F; Sarkar, B; Sarmento, R; Sato, R; Scharf, N; Scherini, V; Schieler, H; Schiffer, P; Schmidt, D; Schröder, F G; Scholten, O; Schoorlemmer, H; Schovánek, P; Schulz, A; Schulz, J; Schumacher, J; Sciutto, S J; Segreto, A; Settimo, M; Shadkam, A; Shellard, R C; Sidelnik, I; Sigl, G; Sima, O; Kowski, A Śmiał; Šmída, R; Snow, G R; Sommers, P; Sorokin, J; Squartini, R; Srivastava, Y N; Stanič, S; Stapleton, J; Stasielak, J; Stephan, M; Stutz, A; Suarez, F; Suomijärvi, T; Supanitsky, A D; Sutherland, M S; Swain, J; Szadkowski, Z; Szuba, M; Taborda, O A; Tapia, A; Tartare, M; Tepe, A; Theodoro, V M; Timmermans, C; Peixoto, C J Todero; Toma, G; Tomankova, L; Tomé, B; Tonachini, A; Elipe, G Torralba; Machado, D Torres; Travnicek, P; Trovato, E; Tueros, M; Ulrich, R; Unger, M; Urban, M; Galicia, J F Valdés; Valiño, I; Valore, L; Aar, G van; Bodegom, P van; Berg, A M van den; Velzen, S van; Vliet, A van; Varela, E; Vargas Cárdenas, B; Varner, G; Vázquez, J R; Vázquez, R A; Veberič, D; Verzi, V; Vicha, J; Videla, M; Villaseñor, L; Vlcek, B; Vorobiov, S; Wahlberg, H; Wainberg, O; Walz, D; Watson, A A; Weber, M; Weidenhaupt, K; Weindl, A; Werner, F; Widom, A; Wiencke, L; Wilczyńska, B; Wilczyński, H; Will, M; Williams, C; Winchen, T; Wittkowski, D; Wundheiler, B; Wykes, S; Yamamoto, T; Yapici, T; Yuan, G; Yushkov, A; Zamorano, B; Zas, E; Zavrtanik, D; Zavrtanik, M; Zaw, I; Zepeda, A; Zhou, J; Zhu, Y; Silva, M Zimbres; Ziolkowski, M; Zuccarello, F

    Energy-dependent patterns in the arrival directions of cosmic rays are searched for using data of the Pierre Auger Observatory. We investigate local regions around the highest-energy cosmic rays with E ≥ 6×10¹⁹ eV by analyzing cosmic rays with energies above E ≥ 5×10¹⁸ eV arriving within an angular separation of approximately 15°. We characterize the energy distributions inside these regions by two independent methods, one searching for angular dependence of energy-energy correlations and one searching for collimation of energy along the local system of principal axes of the energy distribution. No significant patterns are found with this analysis. The comparison of these measurements with astrophysical scenarios can therefore be used to obtain constraints on related model parameters such as strength of cosmic-ray deflection and density of point sources.

  13. Search for patterns by combining cosmic-ray energy and arrival directions at the Pierre Auger Observatory

    DOE PAGES

    Aab, Alexander

    2015-06-20

    Energy-dependent patterns in the arrival directions of cosmic rays are searched for using data of the Pierre Auger Observatory. We investigate local regions around the highest-energy cosmic rays with E ≥ 6×10¹⁹ eV by analyzing cosmic rays with energies above E ≥ 5×10¹⁸ eV arriving within an angular separation of approximately 15°. We characterize the energy distributions inside these regions by two independent methods, one searching for angular dependence of energy-energy correlations and one searching for collimation of energy along the local system of principal axes of the energy distribution. No significant patterns are found with this analysis. As a result, the comparison of these measurements with astrophysical scenarios can be used to obtain constraints on related model parameters such as strength of cosmic-ray deflection and density of point sources.
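    As a rough illustration of the second method above (collimation of energy along the local principal axes), the following sketch computes energy-weighted principal axes of arrival directions projected onto the tangent plane at a region centre. The thresholds, exposure treatment, and statistical evaluation of the actual Auger analysis are not reproduced, so treat this purely as an outline of the geometry.

    ```python
    import numpy as np

    def energy_principal_axes(directions, energies, centre):
        """Energy-weighted principal axes of arrival directions (unit 3-vectors)
        projected onto the tangent plane at 'centre'. Sketch only."""
        directions = np.asarray(directions, dtype=float)
        energies = np.asarray(energies, dtype=float)
        centre = centre / np.linalg.norm(centre)
        # Orthonormal basis (e1, e2) of the tangent plane at 'centre'.
        ref = np.array([0.0, 0.0, 1.0]) if abs(centre[2]) < 0.9 else np.array([1.0, 0.0, 0.0])
        e1 = np.cross(centre, ref); e1 /= np.linalg.norm(e1)
        e2 = np.cross(centre, e1)
        # Project each direction onto the tangent plane and centre it with energy weights.
        xy = np.stack([directions @ e1, directions @ e2], axis=1)
        w = energies / energies.sum()
        xy -= (w[:, None] * xy).sum(axis=0)
        cov = (w[:, None, None] * xy[:, :, None] * xy[:, None, :]).sum(axis=0)
        eigvals, eigvecs = np.linalg.eigh(cov)               # ascending eigenvalues
        collimation = eigvals[-1] / eigvals.sum()            # 1.0 = perfectly collimated
        return eigvecs[:, ::-1], collimation
    ```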

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoon, Hong-Jun; Carmichael, Tandy; Tourassi, Georgia

    Previously, we have shown the potential of using an individual's visual search pattern as a possible biometric. That study focused on viewing images displaying dot-patterns with different spatial relationships to determine which pattern can be more effective in establishing the identity of an individual. In this follow-up study we investigated the temporal stability of this biometric. We performed an experiment with 16 individuals asked to search for a predetermined feature of a random-dot pattern as we tracked their eye movements. Each participant completed four testing sessions consisting of two dot patterns repeated twice. One dot pattern displayed concentric circles shifted to the left or right side of the screen overlaid with visual noise, and participants were asked which side the circles were centered on. The second dot-pattern displayed a number of circles (between 0 and 4) scattered on the screen overlaid with visual noise, and participants were asked how many circles they could identify. Each session contained 5 untracked tutorial questions and 50 tracked test questions (200 total tracked questions per participant). To create each participant's "fingerprint", we constructed a Hidden Markov Model (HMM) from the gaze data representing the underlying visual search and cognitive process. The accuracy of the derived HMM models was evaluated using cross-validation for various time-dependent train-test conditions. Subject identification accuracy ranged from 17.6% to 41.8% for all conditions, which is significantly higher than random guessing (1/16 = 6.25%). The results suggest that visual search pattern is a promising, fairly stable personalized fingerprint of perceptual organization.
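    A minimal sketch of the identification idea described above, assuming gaze recordings are available as sequences of (x, y) fixation coordinates and using the hmmlearn package (an assumption; the abstract does not name an implementation): one Gaussian HMM is trained per participant, and a new recording is attributed to the participant whose model scores it highest.

    ```python
    import numpy as np
    from hmmlearn import hmm

    def train_fingerprints(gaze_by_subject, n_states=4, seed=0):
        """gaze_by_subject: dict subject_id -> list of (T_i, 2) arrays of gaze (x, y)."""
        models = {}
        for subject, sessions in gaze_by_subject.items():
            X = np.vstack(sessions)
            lengths = [len(s) for s in sessions]
            m = hmm.GaussianHMM(n_components=n_states, covariance_type="diag",
                                n_iter=100, random_state=seed)
            m.fit(X, lengths)
            models[subject] = m
        return models

    def identify(models, gaze_sequence):
        """Return the subject whose HMM assigns the highest log-likelihood."""
        scores = {s: m.score(np.asarray(gaze_sequence)) for s, m in models.items()}
        return max(scores, key=scores.get)
    ```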

  15. Vascularization of bioprosthetic valve material

    NASA Astrophysics Data System (ADS)

    Boughner, Derek R.; Dunmore-Buyze, Joy; Heenatigala, Dino; Lohmann, Tara; Ellis, Chris G.

    1999-04-01

    Cell membrane remnants represent a probable nucleation site for calcium deposition in bioprosthetic heart valves. Calcification is a primary failure mode of both bovine pericardial and porcine aortic heterograft bioprostheses, but the nonuniform pattern of calcium distribution within the tissue remains unexplained. Searching for a likely cellular source, we considered the possibility of a previously overlooked small blood vessel network. Using a videomicroscopy technique, we examined 5 matched pairs of porcine aortic and pulmonary valves and 14 samples from 6 bovine pericardia. Tissue was placed on a Leitz Metallux microscope and transilluminated with a 75 watt mercury lamp. Video images were obtained using a silicon intensified target camera equipped with a 431 nm interference filter to maximize contrast of red cells trapped in a capillary microvasculature. Video images were recorded for analysis on a Silicon Graphics Image Analysis work station equipped with a video frame grabber. For porcine valves, the technique demonstrated a vascular bed in the central spongiosa at cusp bases with vessel sizes from 6-80 micrometers. Bovine pericardium differed, with a more uniform distribution of 7-100 micrometer vessels residing centrally. Thus, small blood vessel endothelial cells provide a potential explanation for the patterns of bioprosthetic calcification.

  16. Detection of Upscale-Crop and Partial Manipulation in Surveillance Video Based on Sensor Pattern Noise

    PubMed Central

    Hyun, Dai-Kyung; Ryu, Seung-Jin; Lee, Hae-Yeoun; Lee, Heung-Kyu

    2013-01-01

    In many court cases, surveillance videos are used as significant court evidence. As these surveillance videos can easily be forged, it may cause serious social issues, such as convicting an innocent person. Nevertheless, there is little research being done on forgery of surveillance videos. This paper proposes a forensic technique to detect forgeries of surveillance video based on sensor pattern noise (SPN). We exploit the scaling invariance of the minimum average correlation energy Mellin radial harmonic (MACE-MRH) correlation filter to reliably unveil traces of upscaling in videos. By excluding the high-frequency components of the investigated video and adaptively choosing the size of the local search window, the proposed method effectively localizes partially manipulated regions. Empirical evidence from a large database of test videos, including RGB (Red, Green, Blue)/infrared video, dynamic-/static-scene video and compressed video, indicates the superior performance of the proposed method. PMID:24051524

  17. Improving the nowcasting of precipitation in an Alpine region with an enhanced radar echo tracking algorithm

    NASA Astrophysics Data System (ADS)

    Mecklenburg, S.; Joss, J.; Schmid, W.

    2000-12-01

    Nowcasting for hydrological applications is discussed. The tracking algorithm extrapolates radar images in space and time. It originates from the pattern recognition techniques TREC (Tracking Radar Echoes by Correlation, Rinehart and Garvey, Nature, 273 (1978) 287) and COTREC (Continuity of TREC vectors, Li et al., J. Appl. Meteor., 34 (1995) 1286). To evaluate the quality of the extrapolation, a parameter scheme is introduced, able to distinguish between errors in the position and the intensity of the predicted precipitation. The parameters for the position are the absolute error, the relative error and the error of the forecasted direction. The parameters for the intensity are the ratio of the medians and the variations of the rain rate (ratio of two quantiles) between the actual and the forecasted image. To judge the overall quality of the forecast, the correlation coefficient between the forecasted and the actual radar image has been used. To improve the forecast, three aspects have been investigated: (a) Common meteorological attributes of convective cells, derived from hail statistics, have been determined to optimize the parameters of the tracking algorithm. Using (a), the forecast procedure modifications (b) and (c) have been applied. (b) Small-scale features have been removed by using larger tracking areas and by applying a spatial and temporal smoothing, since problems with the tracking algorithm are mainly caused by small-scale/short-term variations of the echo pattern or because of limitations caused by the radar technique itself (erroneous vectors caused by clutter or shielding). (c) The searching area and the number of searched boxes have been restricted. This limits false detections, which is especially useful in stratiform precipitation and for stationary echoes. Whereas a larger scale and the removal of small-scale features improve the forecasted position for the convective precipitation, the forecast of the stratiform event is not influenced, but limiting the search area leads to a slightly better forecast. The forecast of the intensity is successful for both precipitation events. Forecasting the variation of the rain rate calls for further investigation. Applying COTREC improves the forecast of the convective precipitation, especially for extrapolation times exceeding 30 min.
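    For readers unfamiliar with TREC, the core step is a block-matching cross-correlation: the radar field is divided into boxes, and for each box the displacement within a search radius that maximizes the correlation with the next image defines a motion vector. The sketch below shows that step only; the COTREC continuity constraint, clutter handling, and the smoothing discussed above are omitted.

    ```python
    import numpy as np

    def trec_vectors(img0, img1, box=16, search=8):
        """Return (row, col, dy, dx) motion vectors between two radar
        reflectivity fields of equal shape. Sketch only."""
        vectors = []
        rows, cols = img0.shape
        for r in range(0, rows - box + 1, box):
            for c in range(0, cols - box + 1, box):
                ref = img0[r:r + box, c:c + box]
                if ref.std() == 0:                     # skip empty boxes
                    continue
                best, best_corr = (0, 0), -np.inf
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        r2, c2 = r + dy, c + dx
                        if r2 < 0 or c2 < 0 or r2 + box > rows or c2 + box > cols:
                            continue
                        cand = img1[r2:r2 + box, c2:c2 + box]
                        if cand.std() == 0:
                            continue
                        corr = np.corrcoef(ref.ravel(), cand.ravel())[0, 1]
                        if corr > best_corr:
                            best_corr, best = corr, (dy, dx)
                vectors.append((r, c, best[0], best[1]))
        return np.array(vectors)
    ```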

  18. Source process and tectonic implication of the January 20, 2007 Odaesan earthquake, South Korea

    NASA Astrophysics Data System (ADS)

    Abdel-Fattah, Ali K.; Kim, K. Y.; Fnais, M. S.; Al-Amri, A. M.

    2014-04-01

    The source process of the 20 January 2007, Mw 4.5 Odaesan earthquake in South Korea is investigated in the low- and high-frequency bands, using velocity and acceleration waveform data recorded by the Korea Meteorological Administration Seismographic Network at distances less than 70 km from the epicenter. Synthetic Green's functions are adopted for the low-frequency band of 0.1-0.3 Hz by using the wave-number integration technique and a one-dimensional velocity model beneath the epicentral area. An iterative grid search across strike, dip, rake, and the focal depth of rupture nucleation was performed to find the best-fit double-couple mechanism. To resolve the nodal plane ambiguity, the spatiotemporal slip distribution on the fault surface was recovered using a non-negative least-squares algorithm for each set of grid-searched parameters. A focal depth of 10 km was determined through the grid search over depths in the range of 6-14 km. The best-fit double-couple mechanism obtained from the finite-source model indicates a vertical strike-slip faulting mechanism. The NW faulting plane gives a comparatively smaller root-mean-square (RMS) error than its auxiliary plane. At low frequencies the slip pattern indicates a simple source process, with the event effectively acting as a point source. Three empirical Green's functions are adopted to investigate the source process in the high-frequency band. A set of slip models was recovered on both nodal planes of the focal mechanism with various rupture velocities in the range of 2.0-4.0 km/s. Although there is a small difference between the RMS errors produced by the two orthogonal nodal planes, the SW-dipping plane gives a smaller RMS error than its auxiliary plane. The slip distribution recovered in the high-frequency analysis shows an oblique pattern around the hypocenter, indicating a complex rupture scenario for such a moderate-sized earthquake, similar to those reported for large earthquakes.
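    A minimal sketch of the grid-search wrapper described above, assuming a hypothetical forward operator build_greens_matrix(strike, dip, rake, depth) that maps subfault slip to synthetic waveform samples; each candidate mechanism is inverted with non-negative least squares (scipy.optimize.nnls) and the candidate with the lowest RMS misfit is retained.

    ```python
    import itertools
    import numpy as np
    from scipy.optimize import nnls

    def grid_search_mechanism(observed, build_greens_matrix,
                              strikes, dips, rakes, depths):
        """observed: 1-D array of concatenated waveform samples.
        build_greens_matrix: hypothetical callable returning the
        (n_samples, n_subfaults) design matrix for one candidate mechanism."""
        best = None
        for strike, dip, rake, depth in itertools.product(strikes, dips, rakes, depths):
            G = build_greens_matrix(strike, dip, rake, depth)
            slip, _ = nnls(G, observed)             # non-negative slip on each subfault
            rms = np.sqrt(np.mean((observed - G @ slip) ** 2))
            if best is None or rms < best[0]:
                best = (rms, (strike, dip, rake, depth), slip)
        return best   # (rms, mechanism, slip distribution)
    ```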

  19. Detecting buried explosive hazards with handheld GPR and deep learning

    NASA Astrophysics Data System (ADS)

    Besaw, Lance E.

    2016-05-01

    Buried explosive hazards (BEHs), including traditional landmines and homemade improvised explosives, have proven difficult to detect and defeat during and after conflicts around the world. Despite their various sizes, shapes and construction materials, ground penetrating radar (GPR) is an excellent phenomenology for detecting BEHs due to its ability to sense localized differences in electromagnetic properties. Handheld GPR detectors are common equipment for detecting BEHs because of their flexibility (in part due to the human operator) and effectiveness in cluttered environments. With modern digital electronics and positioning systems, handheld GPR sensors can sense and map variation in electromagnetic properties while searching for BEHs. Additionally, large-scale computers have demonstrated an insatiable appetite for ingesting massive datasets and extracting meaningful relationships. This is nowhere more evident than in the maturation of deep learning artificial neural networks (ANNs) for image and speech recognition, now commonplace in industry and academia. This confluence of sensing, computing and pattern recognition technologies offers great potential to develop automatic target recognition techniques to assist GPR operators searching for BEHs. In this work, deep learning ANNs are used to detect BEHs and discriminate them from harmless clutter. We apply these techniques to a multi-antenna, handheld GPR with a centimeter-accurate positioning system that was used to collect data over prepared lanes containing a wide range of BEHs. This work demonstrates that deep learning ANNs can automatically extract meaningful information from complex GPR signatures, complementing existing GPR anomaly detection and classification techniques.
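    A minimal sketch of a convolutional classifier of the kind described above, assuming GPR data have been preprocessed into fixed-size two-dimensional patches labelled target versus clutter; the paper's actual network layout, multi-antenna geolocated preprocessing, and training data are not reproduced, and PyTorch is used here purely for illustration.

    ```python
    import torch
    import torch.nn as nn

    class GprPatchNet(nn.Module):
        """Small CNN for 1-channel GPR patches (64 x 64); outputs logits for
        two classes: buried target vs. clutter."""
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(),
                nn.Linear(32 * 16 * 16, 64), nn.ReLU(),
                nn.Linear(64, 2),
            )

        def forward(self, x):
            return self.classifier(self.features(x))

    # One hypothetical training step on stand-in data.
    model = GprPatchNet()
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
    patches = torch.randn(8, 1, 64, 64)
    labels = torch.randint(0, 2, (8,))
    optimizer.zero_grad()
    loss = nn.functional.cross_entropy(model(patches), labels)
    loss.backward()
    optimizer.step()
    ```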

  20. The effect of scleral search coil lens wear on the eye

    PubMed Central

    Murphy, P.; Duncan, A.; Glennie, A.; Knox, P.

    2001-01-01

    BACKGROUND/AIM—Scleral search coils are used to measure eye movements. A recent abstract suggests that the coil can affect the eye by decreasing visual acuity, increasing intraocular pressure, and damaging the corneal and conjunctival surface. Such findings, if repeated in all subjects, would cast doubt on the credibility of the search coil as a reliable investigative technique. The aim of this study was to reassess the effect of the scleral search coil on visual function.
METHODS—Six volunteer subjects were selected to undergo coil wear and baseline measurements were taken of logMAR visual acuity, non-contact tonometry, keratometry, and slit lamp examination. Four drops of 0.4% benoxinate hydrochloride were instilled before insertion of the lens by an experienced clinician. The lens then remained on the eye for 30 minutes. Measurements of the four ocular health parameters were repeated after 15 and 30 minutes of lens wear. The lens was then removed and the health of the eye reassessed.
RESULTS—No obvious pattern of change was found in logMAR visual acuity, keratometry, or intraocular pressure. The lens did produce changes to the conjunctival and corneal surfaces, but this was not considered clinically significant.
CONCLUSION—Search coils do not appear to cause any significant effects on visual function. However, thorough prescreening of subjects and post-wear checks should be carried out on all coil wearers to ensure no adverse effects have been caused.

 PMID:11222341

  1. Seasonal trends in sleep-disordered breathing: evidence from Internet search engine query data.

    PubMed

    Ingram, David G; Matthews, Camilla K; Plante, David T

    2015-03-01

    The primary aim of the current study was to test the hypothesis that there is a seasonal component to snoring and obstructive sleep apnea (OSA) through the use of Google search engine query data. Internet search engine query data were retrieved from Google Trends from January 2006 to December 2012. Monthly normalized search volume was obtained over that 7-year period in the USA and Australia for the following search terms: "snoring" and "sleep apnea". Seasonal effects were investigated by fitting cosinor regression models. In addition, the search terms "snoring children" and "sleep apnea children" were evaluated to examine seasonal effects in pediatric populations. Statistically significant seasonal effects were found using cosinor analysis in both USA and Australia for "snoring" (p < 0.00001 for both countries). Similarly, seasonal patterns were observed for "sleep apnea" in the USA (p = 0.001); however, cosinor analysis was not significant for this search term in Australia (p = 0.13). Seasonal patterns for "snoring children" and "sleep apnea children" were observed in the USA (p = 0.002 and p < 0.00001, respectively), with insufficient search volume to examine these search terms in Australia. All searches peaked in the winter or early spring in both countries, with the magnitude of seasonal effect ranging from 5 to 50 %. Our findings indicate that there are significant seasonal trends for both snoring and sleep apnea internet search engine queries, with a peak in the winter and early spring. Further research is indicated to determine the mechanisms underlying these findings, whether they have clinical impact, and if they are associated with other comorbid medical conditions that have similar patterns of seasonal exacerbation.
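    A minimal sketch of the cosinor fit used above: monthly search volume is regressed on cosine and sine terms with a 12-month period, and the amplitude and acrophase (month of the seasonal peak) follow from the fitted coefficients. The significance testing reported in the study is not reproduced.

    ```python
    import numpy as np

    def cosinor_fit(monthly_volume, period=12.0):
        """Fit y = m + a*cos(2*pi*t/period) + b*sin(2*pi*t/period) by least squares.
        Returns (mesor, amplitude, acrophase_in_months)."""
        y = np.asarray(monthly_volume, dtype=float)
        t = np.arange(len(y))
        X = np.column_stack([np.ones_like(t, dtype=float),
                             np.cos(2 * np.pi * t / period),
                             np.sin(2 * np.pi * t / period)])
        m, a, b = np.linalg.lstsq(X, y, rcond=None)[0]
        amplitude = np.hypot(a, b)
        acrophase = (np.arctan2(b, a) * period / (2 * np.pi)) % period
        return m, amplitude, acrophase
    ```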

  2. A study on PubMed search tag usage pattern: association rule mining of a full-day PubMed query log.

    PubMed

    Mosa, Abu Saleh Mohammad; Yoo, Illhoi

    2013-01-09

    The practice of evidence-based medicine requires efficient biomedical literature search such as PubMed/MEDLINE. Retrieval performance relies highly on the efficient use of search field tags. The purpose of this study was to analyze PubMed log data in order to understand the usage pattern of search tags by the end user in PubMed/MEDLINE search. A PubMed query log file was obtained from the National Library of Medicine containing anonymous user identification, timestamp, and query text. Inconsistent records were removed from the dataset and the search tags were extracted from the query texts. A total of 2,917,159 queries were selected for this study issued by a total of 613,061 users. The analysis of frequent co-occurrences and usage patterns of the search tags was conducted using an association mining algorithm. The percentage of search tag usage was low (11.38% of the total queries) and only 2.95% of queries contained two or more tags. Three out of four users used no search tag and about two-thirds of them issued fewer than four queries. Among the queries containing at least one tagged search term, the average number of search tags was almost half of the number of total search terms. Navigational search tags are more frequently used than informational search tags. While no strong association was observed between informational and navigational tags, six (out of 19) informational tags and six (out of 29) navigational tags showed strong associations in PubMed searches. The low percentage of search tag usage implies that PubMed/MEDLINE users do not utilize the features of PubMed/MEDLINE widely or they are not aware of such features or solely depend on the high recall focused query translation by PubMed's Automatic Term Mapping. The users need further education and interactive search applications for effective use of the search tags in order to fulfill their biomedical information needs from PubMed/MEDLINE.

  3. A Study on Pubmed Search Tag Usage Pattern: Association Rule Mining of a Full-day Pubmed Query Log

    PubMed Central

    2013-01-01

    Background The practice of evidence-based medicine requires efficient biomedical literature search such as PubMed/MEDLINE. Retrieval performance relies highly on the efficient use of search field tags. The purpose of this study was to analyze PubMed log data in order to understand the usage pattern of search tags by the end user in PubMed/MEDLINE search. Methods A PubMed query log file was obtained from the National Library of Medicine containing anonymous user identification, timestamp, and query text. Inconsistent records were removed from the dataset and the search tags were extracted from the query texts. A total of 2,917,159 queries were selected for this study issued by a total of 613,061 users. The analysis of frequent co-occurrences and usage patterns of the search tags was conducted using an association mining algorithm. Results The percentage of search tag usage was low (11.38% of the total queries) and only 2.95% of queries contained two or more tags. Three out of four users used no search tag and about two-thirds of them issued fewer than four queries. Among the queries containing at least one tagged search term, the average number of search tags was almost half of the number of total search terms. Navigational search tags are more frequently used than informational search tags. While no strong association was observed between informational and navigational tags, six (out of 19) informational tags and six (out of 29) navigational tags showed strong associations in PubMed searches. Conclusions The low percentage of search tag usage implies that PubMed/MEDLINE users do not utilize the features of PubMed/MEDLINE widely or they are not aware of such features or solely depend on the high recall focused query translation by PubMed's Automatic Term Mapping. The users need further education and interactive search applications for effective use of the search tags in order to fulfill their biomedical information needs from PubMed/MEDLINE. PMID:23302604
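    A minimal sketch of the association-mining step described in the two records above, assuming each query has already been reduced to the set of search tags it contains; the mlxtend package is used here as a stand-in, since the study does not name its implementation, and the tag lists are invented for illustration.

    ```python
    import pandas as pd
    from mlxtend.preprocessing import TransactionEncoder
    from mlxtend.frequent_patterns import apriori, association_rules

    # Each inner list stands for the search tags found in one tagged query.
    tagged_queries = [
        ["[au]", "[ti]"],
        ["[au]", "[ti]", "[dp]"],
        ["[mh]", "[tiab]"],
        ["[au]", "[dp]"],
    ]

    encoder = TransactionEncoder()
    onehot = pd.DataFrame(encoder.fit(tagged_queries).transform(tagged_queries),
                          columns=encoder.columns_)
    frequent = apriori(onehot, min_support=0.25, use_colnames=True)
    rules = association_rules(frequent, metric="confidence", min_threshold=0.6)
    print(rules[["antecedents", "consequents", "support", "confidence"]])
    ```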

  4. Contextual cueing impairment in patients with age-related macular degeneration.

    PubMed

    Geringswald, Franziska; Herbik, Anne; Hoffmann, Michael B; Pollmann, Stefan

    2013-09-12

    Visual attention can be guided by past experience of regularities in our visual environment. In the contextual cueing paradigm, incidental learning of repeated distractor configurations speeds up search times compared to random search arrays. Concomitantly, fewer fixations and more direct scan paths indicate more efficient visual exploration in repeated search arrays. In previous work, we found that simulating a central scotoma in healthy observers eliminated this search facilitation. Here, we investigated contextual cueing in patients with age-related macular degeneration (AMD) who suffer from impaired foveal vision. AMD patients performed visual search using only their more severely impaired eye (n = 13) as well as under binocular viewing (n = 16). Normal-sighted controls developed a significant contextual cueing effect. In comparison, patients showed only a small nonsignificant advantage for repeated displays when searching with their worse eye. When searching binocularly, they profited from contextual cues, but still less than controls. Number of fixations and scan pattern ratios showed a comparable pattern as search times. Moreover, contextual cueing was significantly correlated with acuity in monocular search. Thus, foveal vision loss may lead to impaired guidance of attention by contextual memory cues.

  5. Semi-supervised anomaly detection - towards model-independent searches of new physics

    NASA Astrophysics Data System (ADS)

    Kuusela, Mikael; Vatanen, Tommi; Malmi, Eric; Raiko, Tapani; Aaltonen, Timo; Nagai, Yoshikazu

    2012-06-01

    Most classification algorithms used in high energy physics fall under the category of supervised machine learning. Such methods require a training set containing both signal and background events and are prone to classification errors should this training data be systematically inaccurate for example due to the assumed MC model. To complement such model-dependent searches, we propose an algorithm based on semi-supervised anomaly detection techniques, which does not require a MC training sample for the signal data. We first model the background using a multivariate Gaussian mixture model. We then search for deviations from this model by fitting to the observations a mixture of the background model and a number of additional Gaussians. This allows us to perform pattern recognition of any anomalous excess over the background. We show by a comparison to neural network classifiers that such an approach is a lot more robust against misspecification of the signal MC than supervised classification. In cases where there is an unexpected signal, a neural network might fail to correctly identify it, while anomaly detection does not suffer from such a limitation. On the other hand, when there are no systematic errors in the training data, both methods perform comparably.
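    A simplified sketch of the background-modelling step described above, using scikit-learn: a Gaussian mixture is fitted to a background-only sample, and events in the search sample are flagged by their log-likelihood under that model. The full method additionally fits extra Gaussian components on top of the fixed background to localize the excess, which is not reproduced here.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(0)
    background = rng.normal(0.0, 1.0, size=(5000, 3))               # background-only sample
    search = np.vstack([rng.normal(0.0, 1.0, size=(950, 3)),
                        rng.normal(4.0, 0.3, size=(50, 3))])        # small injected "signal"

    # Model the background as a multivariate Gaussian mixture.
    gmm = GaussianMixture(n_components=5, covariance_type="full", random_state=0)
    gmm.fit(background)

    # Flag search-sample events poorly explained by the background model.
    threshold = np.percentile(gmm.score_samples(background), 1.0)   # ~1% background efficiency
    flagged = gmm.score_samples(search) < threshold
    print("events flagged as anomalous:", int(flagged.sum()))
    ```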

  6. Causal gene identification using combinatorial V-structure search.

    PubMed

    Cai, Ruichu; Zhang, Zhenjie; Hao, Zhifeng

    2013-07-01

    With the advances in biomedical techniques in the last decade, the costs of human genomic sequencing and genomic activity monitoring are coming down rapidly. To support the huge genome-based business in the near future, researchers are eager to find killer applications based on human genome information. Causal gene identification is one of the most promising applications, which may help potential patients to estimate the risk of certain genetic diseases and locate the target gene for further genetic therapy. Unfortunately, existing pattern recognition techniques, such as Bayesian networks, cannot be directly applied to find the accurate causal relationship between genes and diseases. This is mainly due to the insufficient number of samples and the extremely high dimensionality of the gene space. In this paper, we present the first practical solution to causal gene identification, utilizing a new combinatorial formulation over V-Structures commonly used in conventional Bayesian networks, by exploring the combinations of significant V-Structures. We prove the NP-hardness of the combinatorial search problem under general settings of the significance measure on the V-Structures, and present a greedy algorithm to find sub-optimal results. Extensive experiments show that our proposal is both scalable and effective, particularly with interesting findings on the causal genes over real human genome data. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. Automation of energy demand forecasting

    NASA Astrophysics Data System (ADS)

    Siddique, Sanzad

    Automation of energy demand forecasting saves time and effort by searching automatically for an appropriate model in a candidate model space without manual intervention. This thesis introduces a search-based approach that improves the performance of the model searching process for econometrics models. Further improvements in the accuracy of the energy demand forecasting are achieved by integrating nonlinear transformations within the models. This thesis introduces machine learning techniques that are capable of modeling such nonlinearity. Algorithms for learning domain knowledge from time series data using the machine learning methods are also presented. The novel search based approach and the machine learning models are tested with synthetic data as well as with natural gas and electricity demand signals. Experimental results show that the model searching technique is capable of finding an appropriate forecasting model. Further experimental results demonstrate an improved forecasting accuracy achieved by using the novel machine learning techniques introduced in this thesis. This thesis presents an analysis of how the machine learning techniques learn domain knowledge. The learned domain knowledge is used to improve the forecast accuracy.

  8. Pattern Recognition for a Flight Dynamics Monte Carlo Simulation

    NASA Technical Reports Server (NTRS)

    Restrepo, Carolina; Hurtado, John E.

    2011-01-01

    The design, analysis, and verification and validation of a spacecraft rely heavily on Monte Carlo simulations. Modern computational techniques are able to generate large amounts of Monte Carlo data, but flight dynamics engineers lack the time and resources to analyze it all. The growing amounts of data combined with the diminished available time of engineers motivate the need to automate the analysis process. Pattern recognition algorithms are an innovative way of analyzing flight dynamics data efficiently. They can search large data sets for specific patterns and highlight critical variables so analysts can focus their analysis efforts. This work combines a few tractable pattern recognition algorithms with basic flight dynamics concepts to build a practical analysis tool for Monte Carlo simulations. Current results show that this tool can quickly and automatically identify individual design parameters, and most importantly, specific combinations of parameters that should be avoided in order to prevent specific system failures. The current version uses a kernel density estimation algorithm and a sequential feature selection algorithm combined with a k-nearest neighbor classifier to find and rank important design parameters. This provides an increased level of confidence in the analysis and saves a significant amount of time.
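    A minimal sketch of the feature-ranking stage described above, using scikit-learn: a sequential feature selector wrapped around a k-nearest-neighbour classifier picks the dispersed input parameters that best predict a pass/fail flag. The kernel density estimation stage and the flight-dynamics-specific data are not included; the failure criterion below is a toy stand-in.

    ```python
    import numpy as np
    from sklearn.feature_selection import SequentialFeatureSelector
    from sklearn.neighbors import KNeighborsClassifier

    rng = np.random.default_rng(1)
    X = rng.normal(size=(2000, 10))                          # 10 dispersed design parameters
    failure = (X[:, 2] + 0.8 * X[:, 7] > 2.0).astype(int)    # toy failure criterion

    knn = KNeighborsClassifier(n_neighbors=15)
    selector = SequentialFeatureSelector(knn, n_features_to_select=3, direction="forward")
    selector.fit(X, failure)
    print("parameters most predictive of failure:", np.flatnonzero(selector.get_support()))
    ```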

  9. Search Interview Techniques, Information Gain, and User Satisfaction with Online Bibliographic Retrieval Services.

    ERIC Educational Resources Information Center

    Auster, Ethel; Lawton, Stephen B.

    This research study involved a systematic investigation into the relationships among: (1) the techniques used by search analysts during preliminary interviews with users before engaging in online retrieval of bibliographic citations; (2) the amount of new information gained by the user as a result of the search; and (3) the user's ultimate…

  10. Teaching Web Search Skills: Techniques and Strategies of Top Trainers

    ERIC Educational Resources Information Center

    Notess, Greg R.

    2006-01-01

    Here is a unique and practical reference for anyone who teaches Web searching. Greg Notess shares his own techniques and strategies along with expert tips and advice from a virtual "who's who" of Web search training: Joe Barker, Paul Barron, Phil Bradley, John Ferguson, Alice Fulbright, Ran Hock, Jeff Humphrey, Diane Kovacs, Gary Price, Danny…

  11. Knowledge Discovery in Medical Mining by using Genetic Algorithms and Artificial Neural Networks

    NASA Astrophysics Data System (ADS)

    Srivathsa, P. K.

    2011-12-01

    Medical data mining can be thought of as the search for relationships and patterns within medical data, which facilitates the acquisition of useful knowledge for effective medical diagnosis. Consequently, disease prediction becomes more effective, and early detection facilitates timely patient care with focused treatment, economic feasibility, and improved cure rates. The present investigation is carried out on medical data (PIMA) using data mining and a GA-based neural network technique, and the results indicate that the methodology is not only reliable but also helps to further the scope of the subject.

  12. Fast, Inclusive Searches for Geographic Names Using Digraphs

    USGS Publications Warehouse

    Donato, David I.

    2008-01-01

    An algorithm specifies how to quickly identify names that approximately match any specified name when searching a list or database of geographic names. Based on comparisons of the digraphs (ordered letter pairs) contained in geographic names, this algorithmic technique identifies approximately matching names by applying an artificial but useful measure of name similarity. A digraph index enables computer name searches that are carried out using this technique to be fast enough for deployment in a Web application. This technique, which is a member of the class of n-gram algorithms, is related to, but distinct from, the soundex, PHONIX, and metaphone phonetic algorithms. Despite this technique's tendency to return some counterintuitive approximate matches, it is an effective aid for fast, inclusive searches for geographic names when the exact name sought, or its correct spelling, is unknown.
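    A minimal sketch of the digraph comparison described above: each name is reduced to its multiset of ordered letter pairs, and two names are scored by the overlap of those multisets. A Dice-style coefficient is used here for concreteness; the report defines its own similarity measure, so treat the exact formula as illustrative.

    ```python
    from collections import Counter

    def digraphs(name):
        """Ordered letter pairs of a name, case- and punctuation-insensitive."""
        letters = [c for c in name.lower() if c.isalpha()]
        return Counter(zip(letters, letters[1:]))

    def digraph_similarity(a, b):
        """Dice-style overlap of the two digraph multisets, in [0, 1]."""
        da, db = digraphs(a), digraphs(b)
        if not da or not db:
            return 0.0
        overlap = sum((da & db).values())
        return 2.0 * overlap / (sum(da.values()) + sum(db.values()))

    print(digraph_similarity("Pittsburgh", "Pittsburg"))     # high similarity
    print(digraph_similarity("Pittsburgh", "Plattsburgh"))   # lower, but still related
    ```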

  13. Plain Language to Communicate Physical Activity Information: A Website Content Analysis.

    PubMed

    Paige, Samantha R; Black, David R; Mattson, Marifran; Coster, Daniel C; Stellefson, Michael

    2018-04-01

    Plain language techniques are health literacy universal precautions intended to enhance health care system navigation and health outcomes. Physical activity (PA) is a popular topic on the Internet, yet it is unknown if information is communicated in plain language. This study examined how plain language techniques are included in PA websites, and if the use of plain language techniques varies according to search procedures (keyword, search engine) and website host source (government, commercial, educational/organizational). Three keywords ("physical activity," "fitness," and "exercise") were independently entered into three search engines (Google, Bing, and Yahoo) to locate a nonprobability sample of websites (N = 61). Fourteen plain language techniques were coded within each website to examine content formatting, clarity and conciseness, and multimedia use. Approximately half (M = 6.59; SD = 1.68) of the plain language techniques were included in each website. Keyword physical activity resulted in websites with fewer clear and concise plain language techniques (p < .05), whereas fitness resulted in websites with more clear and concise techniques (p < .01). Plain language techniques did not vary by search engine or the website host source. Accessing PA information that is easy to understand and behaviorally oriented may remain a challenge for users. Transdisciplinary collaborations are needed to optimize plain language techniques while communicating online PA information.

  14. Interest in tanning beds and sunscreen in German-speaking countries.

    PubMed

    Kirchberger, Michael C; Kirchberger, Laura F; Eigentler, Thomas K; Reinhard, Raphael; Berking, Carola; Schuler, Gerold; Heinzerling, Lucie; Heppt, Markus V

    2017-12-01

    The growing incidence of nearly all types of skin cancer can be attributed to increased exposure to natural or artificial ultraviolet (UV) radiation. However, there is a scarcity of statistical data on risk behavior or sunscreen use, which would be important for any prevention efforts. Using the search engine Google®, we analyzed search patterns for the terms Solarium (tanning bed), Sonnencreme (sunscreen), and Sonnenschutz (sun protection) in Germany, Austria, and Switzerland between 2004 and 2016, and compared them to search patterns worldwide. For this purpose, "normalized search volumes" (NSVs) were calculated for the various search queries. The corresponding polynomial functions were then compared with each other over the course of time. Since 2001, there has been a marked worldwide decrease in the search queries for tanning bed, whereas those for sunscreen have steadily increased. In German-speaking countries, on the other hand, there have - for years - consistently been more search queries for tanning bed than for sunscreen. There is an annual periodicity of the queries, with the highest NSVs for tanning bed between March and May and those for sunscreen in the summer months around June. In Germany, the city-states of Hamburg and Berlin have particularly high NSVs for tanning bed. Compared to the rest of the world, German-speaking countries show a strikingly unfavorable search pattern. There is still great need for education and prevention with respect to sunscreen use and avoidance of artificial UV exposure. © 2017 Deutsche Dermatologische Gesellschaft (DDG). Published by John Wiley & Sons Ltd.

  15. Efficient protein structure search using indexing methods

    PubMed Central

    2013-01-01

    Understanding functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, thus finding structurally similar proteins accurately and efficiently from a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, thus it is hard to efficiently process many simultaneous requests of structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns data points whose distance to the query point is less than θ. The results show that both iDistance and iKernel significantly enhance the searching speed. In top-k nearest neighbor search, the searching time is reduced by 69.6%, 77%, 77.4% and 87.9% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively. In θ-based nearest neighbor search, the searching time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively. PMID:23691543

  16. Efficient protein structure search using indexing methods.

    PubMed

    Kim, Sungchul; Sael, Lee; Yu, Hwanjo

    2013-01-01

    Understanding functions of proteins is one of the most important challenges in many studies of biological processes. The function of a protein can be predicted by analyzing the functions of structurally similar proteins, thus finding structurally similar proteins accurately and efficiently from a large set of proteins is crucial. A protein structure can be represented as a vector by the 3D-Zernike Descriptor (3DZD), which compactly represents the surface shape of the protein tertiary structure. This simplified representation accelerates the searching process. However, computing the similarity of two protein structures is still computationally expensive, thus it is hard to efficiently process many simultaneous requests of structurally similar protein search. This paper proposes indexing techniques which substantially reduce the search time to find structurally similar proteins. In particular, we first exploit two indexing techniques, i.e., iDistance and iKernel, on the 3DZDs. After that, we extend the techniques to further improve the search speed for protein structures. The extended indexing techniques build and utilize a reduced index constructed from the first few attributes of the 3DZDs of protein structures. To retrieve the top-k similar structures, the top-10 × k similar structures are first found using the reduced index, and the top-k structures are selected among them. We also modify the indexing techniques to support θ-based nearest neighbor search, which returns data points whose distance to the query point is less than θ. The results show that both iDistance and iKernel significantly enhance the searching speed. In top-k nearest neighbor search, the searching time is reduced by 69.6%, 77%, 77.4% and 87.9% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively. In θ-based nearest neighbor search, the searching time is reduced by 80%, 81%, 95.6% and 95.6% using iDistance, iKernel, the extended iDistance, and the extended iKernel, respectively.
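    A minimal sketch of the two-stage retrieval idea described in the two records above, using plain numpy in place of the iDistance/iKernel indexes: a coarse top-10×k candidate set is selected on the first few 3DZD attributes, and the exact distance on the full descriptor is computed only for those candidates.

    ```python
    import numpy as np

    def two_stage_search(descriptors, query, k=10, reduced_dims=12):
        """descriptors: (n_proteins, d) array of 3DZD vectors; query: (d,) vector.
        Returns indices of the k nearest proteins under full-descriptor distance."""
        coarse = np.linalg.norm(descriptors[:, :reduced_dims] - query[:reduced_dims], axis=1)
        candidates = np.argsort(coarse)[:10 * k]             # top-10*k via the reduced index
        exact = np.linalg.norm(descriptors[candidates] - query, axis=1)
        return candidates[np.argsort(exact)[:k]]             # re-rank with the full 3DZD
    ```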

  17. Loading Patterns of the Posterior Cruciate Ligament in the Healthy Knee: A Systematic Review

    PubMed Central

    List, Renate; Oberhofer, Katja; Fucentese, Sandro F.; Snedeker, Jess G.; Taylor, William R.

    2016-01-01

    Background: The posterior cruciate ligament (PCL) is the strongest ligament of the knee, serving as one of the major passive stabilizers of the tibio-femoral joint. However, despite a number of experimental and modelling approaches to understand the kinematics and kinetics of the ligament, the normal loading conditions of the PCL and its functional bundles are still controversially discussed. Objectives: This study aimed to generate science-based evidence for understanding the functional loading of the PCL, including the anterolateral and posteromedial bundles, in the healthy knee joint through systematic review and statistical analysis of the literature. Data sources: MEDLINE, EMBASE and CENTRAL. Eligibility criteria for selecting studies: Databases were searched for articles containing any numerical strain or force data on the healthy PCL and its functional bundles. Studied activities were as follows: passive flexion, flexion under 100 N and 134 N posterior tibial load, walking, stair ascent and descent, body-weight squatting and forward lunge. Method: Statistical analysis was performed on the reported load data, which was weighted according to the number of knees tested to extract average strain and force trends of the PCL and identify deviations from the norms. Results: From the 3577 articles retrieved by the initial electronic search, only 66 met all inclusion criteria. The results obtained by aggregating data reported in the eligible studies indicate that the loading patterns of the PCL vary with activity type and knee flexion angle, but importantly also with the technique used for assessment. Moreover, different fibres of the PCL exhibit different strain patterns during knee flexion, with higher strain magnitudes reported in the anterolateral bundle. While during passive flexion the posteromedial bundle is either lax or very slightly elongated, it experiences higher strain levels during forward lunge and has a synergetic relationship with the anterolateral bundle. The strain patterns obtained for virtual fibres that connect the origin and insertion of the bundles in a straight line show similar trends to those of the real bundles but with different magnitudes. Conclusion: This review represents what is now the best available understanding of the biomechanics of the PCL, and may help to improve programs for injury prevention, diagnosis methods as well as reconstruction and rehabilitation techniques. PMID:27880849

  18. Analysis of semantic search within the domains of uncertainty: using Keyword Effectiveness Indexing as an evaluation tool.

    PubMed

    Lorence, Daniel; Abraham, Joanna

    2006-01-01

    Medical and health-related searches pose a special case of risk when using the web as an information resource. Uninsured consumers, lacking access to a trained provider, will often rely on information from the internet for self-diagnosis and treatment. In areas where treatments are uncertain or controversial, most consumers lack the knowledge to make an informed decision. This exploratory technology assessment examines the use of Keyword Effectiveness Indexing (KEI) analysis as a potential tool for profiling information search and keyword retrieval patterns. Results demonstrate that the KEI methodology can be useful in identifying e-health search patterns, but is limited by semantic or text-based web environments.

  19. Adolescents Searching for Health Information on the Internet: An Observational Study

    PubMed Central

    Derry, Holly A; Resnick, Paul J; Richardson, Caroline R

    2003-01-01

    Background Adolescents' access to health information on the Internet is partly a function of their ability to search for and find answers to their health-related questions. Adolescents may have unique health and computer literacy needs. Although many surveys, interviews, and focus groups have been utilized to understand the information-seeking and information-retrieval behavior of adolescents looking for health information online, we were unable to locate observations of individual adolescents that have been conducted in this context. Objective This study was designed to understand how adolescents search for health information using the Internet and what implications this may have on access to health information. Methods A convenience sample of 12 students (age 12-17 years) from 1 middle school and 2 high schools in southeast Michigan were provided with 6 health-related questions and asked to look for answers using the Internet. Researchers recorded 68 specific searches using software that captured screen images as well as synchronized audio recordings. Recordings were reviewed later and specific search techniques and strategies were coded. A qualitative review of the verbal communication was also performed. Results Out of 68 observed searches, 47 (69%) were successful in that the adolescent found a correct and useful answer to the health question. The majority of sites that students attempted to access were retrieved directly from search engine results (77%) or a search engine's recommended links (10%); only a small percentage were directly accessed (5%) or linked from another site (7%). The majority (83%) of followed links from search engine results came from the first 9 results. Incorrect spelling (30 of 132 search terms), number of pages visited within a site (ranging from 1-15), and overall search strategy (eg, using a search engine versus directly accessing a site), were each important determinants of success. Qualitative analysis revealed that participants used a trial-and-error approach to formulate search strings, scanned pages randomly instead of systematically, and did not consider the source of the content when searching for health information. Conclusions This study provides a useful snapshot of current adolescent searching patterns. The results have implications for constructing realistic simulations of adolescent search behavior, improving distribution and usefulness of Web sites with health information relevant to adolescents, and enhancing educators' knowledge of what specific pitfalls students are likely to encounter. PMID:14713653

  20. LandEx - Fast, FOSS-Based Application for Query and Retrieval of Land Cover Patterns

    NASA Astrophysics Data System (ADS)

    Netzel, P.; Stepinski, T.

    2012-12-01

    The amount of satellite-based spatial data is continuously increasing, making the development of efficient data search tools a priority. The bulk of existing research on searching satellite-gathered data concentrates on images and is based on the concept of Content-Based Image Retrieval (CBIR); however, available solutions are not efficient and robust enough to be put to use as deployable web-based search tools. Here we report on the development of a practical, deployable tool that searches classified, rather than raw, imagery. LandEx (Landscape Explorer) is a GeoWeb-based tool for Content-Based Pattern Retrieval (CBPR) of land cover patterns contained within the National Land Cover Dataset 2006 (NLCD2006). The USGS-developed NLCD2006 is derived from Landsat multispectral images; it covers the entire conterminous U.S. with a resolution of 30 meters/pixel and it depicts 16 land cover classes. The size of NLCD2006 is about 10 Gpixels (161,000 x 100,000 pixels). LandEx is a multi-tier GeoWeb application based on Open Source Software. Main components are: GeoExt/OpenLayers (user interface), GeoServer (OGC WMS, WCS and WPS server), and GRASS (calculation engine). LandEx performs search using a query-by-example approach: the user selects a reference scene (exhibiting a chosen pattern of land cover classes) and the tool produces, in real time, a map indicating the degree of similarity between the reference pattern and all local patterns across the U.S. The scene pattern is encapsulated by a 2D histogram of classes and sizes of single-class clumps. Pattern similarity is based on the notion of mutual information. The resultant similarity map can be viewed and navigated in a web browser, or it can be downloaded as a GeoTIFF file for more in-depth analysis. LandEx is available at http://sil.uc.edu
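    A minimal sketch of the pattern signature described above: single-class clumps in a window of the classified raster are labelled, a 2D histogram over (class, clump-size bin) is built, and two windows are compared through a histogram-based score. Jensen-Shannon similarity is used here as a stand-in for the mutual-information measure used by LandEx, and the bin edges are arbitrary.

    ```python
    import numpy as np
    from scipy.ndimage import label
    from scipy.spatial.distance import jensenshannon

    def pattern_histogram(window, n_classes=16, size_bins=(1, 4, 16, 64, 256, 10**9)):
        """2-D histogram over (land-cover class, clump-size bin) for one window."""
        hist = np.zeros((n_classes, len(size_bins) - 1))
        for cls in range(n_classes):
            labelled, n_clumps = label(window == cls)        # connected single-class clumps
            if n_clumps == 0:
                continue
            sizes = np.bincount(labelled.ravel())[1:]        # clump sizes, skip background
            hist[cls], _ = np.histogram(sizes, bins=size_bins)
        return hist / hist.sum()

    def pattern_similarity(window_a, window_b):
        """1.0 for identical class/clump-size signatures, 0.0 for maximally different."""
        ha, hb = pattern_histogram(window_a), pattern_histogram(window_b)
        return 1.0 - jensenshannon(ha.ravel(), hb.ravel(), base=2)
    ```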

  1. A main path domain map as digital library interface

    NASA Astrophysics Data System (ADS)

    Demaine, Jeffrey

    2009-01-01

    The shift to electronic publishing of scientific journals is an opportunity for the digital library to provide non-traditional ways of accessing the literature. One method is to use citation metadata drawn from a collection of electronic journals to generate maps of science. These maps visualize the communication patterns in the collection, giving the user an easy-to-grasp view of the semantic structure underlying the scientific literature. For this visualization to be understandable, the complexity of the citation network must be reduced through an algorithm. This paper describes the Citation Pathfinder application and its integration into a prototype digital library. This application generates small-scale citation networks that expand upon the search results of the digital library. These domain maps are linked to the collection, creating an interface that is based on the communication patterns in science. The Main Path Analysis technique is employed to simplify these networks into linear, sequential structures. By identifying patterns that characterize the evolution of the research field, Citation Pathfinder uses citations to give users a deeper understanding of the scientific literature.

  2. Assessment of Severe Apnoea through Voice Analysis, Automatic Speech, and Speaker Recognition Techniques

    NASA Astrophysics Data System (ADS)

    Fernández Pozo, Rubén; Blanco Murillo, Jose Luis; Hernández Gómez, Luis; López Gonzalo, Eduardo; Alcázar Ramírez, José; Toledano, Doroteo T.

    2009-12-01

    This study is part of an ongoing collaborative effort between the medical and the signal processing communities to promote research on applying standard Automatic Speech Recognition (ASR) techniques for the automatic diagnosis of patients with severe obstructive sleep apnoea (OSA). Early detection of severe apnoea cases is important so that patients can receive early treatment. Effective ASR-based detection could dramatically cut medical testing time. Working with a carefully designed speech database of healthy and apnoea subjects, we describe an acoustic search for distinctive apnoea voice characteristics. We also study abnormal nasalization in OSA patients by modelling vowels in nasal and nonnasal phonetic contexts using Gaussian Mixture Model (GMM) pattern recognition on speech spectra. Finally, we present experimental findings regarding the discriminative power of GMMs applied to severe apnoea detection. We have achieved an 81% correct classification rate, which is very promising and underpins the interest in this line of inquiry.
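
    The GMM pattern-recognition step can be illustrated with the sketch below, which trains one Gaussian mixture per class and classifies by average log-likelihood. The synthetic feature vectors and model settings are placeholders, not the study's speech database or configuration.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        # Synthetic stand-ins for per-utterance spectral feature vectors (e.g. MFCCs);
        # the real study uses a purpose-built speech database of OSA and control speakers.
        rng = np.random.default_rng(1)
        X_healthy = rng.normal(0.0, 1.0, size=(200, 12))
        X_apnoea = rng.normal(0.6, 1.2, size=(200, 12))

        # One GMM per class, as in standard GMM pattern recognition.
        gmm_h = GaussianMixture(n_components=4, covariance_type='diag', random_state=0).fit(X_healthy)
        gmm_a = GaussianMixture(n_components=4, covariance_type='diag', random_state=0).fit(X_apnoea)

        def classify(frames):
            """Label a stack of feature frames by the higher average log-likelihood."""
            ll_h = gmm_h.score_samples(frames).mean()
            ll_a = gmm_a.score_samples(frames).mean()
            return ('apnoea' if ll_a > ll_h else 'healthy'), ll_a - ll_h

        test = rng.normal(0.6, 1.2, size=(50, 12))
        print(classify(test))   # expected: ('apnoea', positive log-likelihood difference)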

  3. Accelerated Bayesian model-selection and parameter-estimation in continuous gravitational-wave searches with pulsar-timing arrays

    NASA Astrophysics Data System (ADS)

    Taylor, Stephen; Ellis, Justin; Gair, Jonathan

    2014-11-01

    We describe several new techniques which accelerate Bayesian searches for continuous gravitational-wave emission from supermassive black-hole binaries using pulsar-timing arrays. These techniques mitigate the problematic increase of search dimensionality with the size of the pulsar array which arises from having to include an extra parameter per pulsar as the array is expanded. This extra parameter corresponds to searching over the phase of the gravitational wave as it propagates past each pulsar so that we can coherently include the pulsar term in our search strategies. Our techniques make the analysis tractable with powerful evidence-evaluation packages like MultiNest. We find good agreement of our techniques with the parameter-estimation and Bayes factor evaluation performed with full signal templates and conclude that these techniques make excellent first-cut tools for detection and characterization of continuous gravitational-wave signals with pulsar-timing arrays. Crucially, at low to moderate signal-to-noise ratios the factor by which the analysis is sped up can be ≳100 , permitting rigorous programs of systematic injection and recovery of signals to establish robust detection criteria within a Bayesian formalism.

  4. Iterative repair for scheduling and rescheduling

    NASA Technical Reports Server (NTRS)

    Zweben, Monte; Davis, Eugene; Deale, Michael

    1991-01-01

    An iterative repair search method called constraint-based simulated annealing is described. Simulated annealing is a hill-climbing search technique capable of escaping local minima. The utility of the constraint-based framework is shown by comparing search performance with and without the constraint framework on a suite of randomly generated problems. Results of applying the technique to the NASA Space Shuttle ground processing problem are also shown. These experiments show that the search method scales to complex, real-world problems and exhibits interesting anytime behavior.
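
    The repair-by-annealing idea can be sketched on a toy scheduling problem as follows; the task/resource model, the move operator and the cooling schedule are illustrative assumptions, not the constraint framework described in the report.

        import math
        import random

        # Toy iterative-repair scheduler: N_TASKS unit-duration tasks on a single
        # unit-capacity resource over HORIZON slots; a "violation" is two tasks
        # sharing a slot. Repair moves reassign one task's slot, and uphill moves
        # are accepted with the usual annealing probability.
        random.seed(0)
        N_TASKS, HORIZON = 30, 40

        def violations(schedule):
            """Count pairs of tasks assigned to the same slot (constraints to repair)."""
            counts = {}
            for t in schedule:
                counts[t] = counts.get(t, 0) + 1
            return sum(c * (c - 1) // 2 for c in counts.values())

        def repair_anneal(schedule, t0=2.0, cooling=0.995, steps=20000):
            cost = violations(schedule)
            temp = t0
            for _ in range(steps):
                if cost == 0:
                    break
                task = random.randrange(N_TASKS)            # pick a task to repair
                old = schedule[task]
                schedule[task] = random.randrange(HORIZON)   # repair move: reassign its slot
                new_cost = violations(schedule)
                delta = new_cost - cost
                if delta <= 0 or random.random() < math.exp(-delta / temp):
                    cost = new_cost                          # accept (possibly uphill) move
                else:
                    schedule[task] = old                     # reject and undo
                temp *= cooling
            return schedule, cost

        initial = [random.randrange(HORIZON) for _ in range(N_TASKS)]
        final, cost = repair_anneal(initial[:])
        print('initial violations:', violations(initial), '-> final violations:', cost)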

  5. Use of Cognitive and Metacognitive Strategies in Online Search: An Eye-Tracking Study

    ERIC Educational Resources Information Center

    Zhou, Mingming; Ren, Jing

    2016-01-01

    This study used eye-tracking technology to track students' eye movements while searching information on the web. The research question guiding this study was "Do students with different search performance levels have different visual attention distributions while searching information online? If yes, what are the patterns for high and low…

  6. The effects of cognitive style and emotional trade-off difficulty on information processing in decision-making.

    PubMed

    Wang, Dawei; Hao, Leilei; Maguire, Phil; Hu, Yixin

    2016-12-01

    This study investigated the effects of cognitive style and emotional trade-off difficulty (ETOD) on information processing in decision-making. Eighty undergraduates (73.75% female, M = 21.90), grouped according to their cognitive style (field-dependent or field-independent), conducted an Information Display Board (IDB) task, through which search time, search depth and search pattern were measured. Participants' emotional states were assessed both before and after the IDB task. The results showed that participants experienced significantly more negative emotion under high ETOD compared to those under low ETOD. While both cognitive style and ETOD had significant effects on search time and search depth, only ETOD significantly influenced search pattern; individuals in both cognitive style groups tended to use attribute-based processing under high ETOD and to use alternative-based processing under low ETOD. There was also a significant interaction between cognitive style and ETOD for search time and search depth. We propose that these results are best accounted for by the coping behaviour framework under high ETOD, and by the negative emotion hypothesis under low ETOD. © 2016 International Union of Psychological Science.

  7. The Alexander Technique and musicians: a systematic review of controlled trials.

    PubMed

    Klein, Sabine D; Bayard, Claudine; Wolf, Ursula

    2014-10-24

    Musculoskeletal disorders, stress and performance anxiety are common in musicians. Therefore, some use the Alexander Technique (AT), a psycho-physical method that helps to release unnecessary muscle tension and re-educates non-beneficial movement patterns through intentional inhibition of unwanted habitual behaviours. According to a recent review AT sessions may be effective for chronic back pain. This review aimed to evaluate the evidence for the effectiveness of AT sessions on musicians' performance, anxiety, respiratory function and posture. The following electronic databases were searched up to February 2014 for relevant publications: PUBMED, Google Scholar, CINAHL, EMBASE, AMED, PsycINFO and RILM. The search criteria were "Alexander Technique" AND "music*". References were searched, and experts and societies of AT or musicians' medicine contacted for further publications. 237 citations were assessed. 12 studies were included for further analysis, 5 of which were randomised controlled trials (RCTs), 5 controlled but not randomised (CTs), and 2 mixed methods studies. Main outcome measures in RCTs and CTs were music performance, respiratory function, performance anxiety, body use and posture. Music performance was judged by external experts and found to be improved by AT in 1 of 3 RCTs; in 1 RCT comparing neurofeedback (NF) to AT, only NF caused improvements. Respiratory function was investigated in 2 RCTs, but not improved by AT training. Performance anxiety was mostly assessed by questionnaires and decreased by AT in 2 of 2 RCTs and in 2 of 2 CTs. A variety of outcome measures has been used to investigate the effectiveness of AT sessions in musicians. Evidence from RCTs and CTs suggests that AT sessions may improve performance anxiety in musicians. Effects on music performance, respiratory function and posture yet remain inconclusive. Future trials with well-established study designs are warranted to further and more reliably explore the potential of AT in the interest of musicians.

  8. Environmental context explains Lévy and Brownian movement patterns of marine predators.

    PubMed

    Humphries, Nicolas E; Queiroz, Nuno; Dyer, Jennifer R M; Pade, Nicolas G; Musyl, Michael K; Schaefer, Kurt M; Fuller, Daniel W; Brunnschweiler, Juerg M; Doyle, Thomas K; Houghton, Jonathan D R; Hays, Graeme C; Jones, Catherine S; Noble, Leslie R; Wearmouth, Victoria J; Southall, Emily J; Sims, David W

    2010-06-24

    An optimal search theory, the so-called Lévy-flight foraging hypothesis, predicts that predators should adopt search strategies known as Lévy flights where prey is sparse and distributed unpredictably, but that Brownian movement is sufficiently efficient for locating abundant prey. Empirical studies have generated controversy because the accuracy of statistical methods that have been used to identify Lévy behaviour has recently been questioned. Consequently, whether foragers exhibit Lévy flights in the wild remains unclear. Crucially, moreover, it has not been tested whether observed movement patterns across natural landscapes having different expected resource distributions conform to the theory's central predictions. Here we use maximum-likelihood methods to test for Lévy patterns in relation to environmental gradients in the largest animal movement data set assembled for this purpose. Strong support was found for Lévy search patterns across 14 species of open-ocean predatory fish (sharks, tuna, billfish and ocean sunfish), with some individuals switching between Lévy and Brownian movement as they traversed different habitat types. We tested the spatial occurrence of these two principal patterns and found Lévy behaviour to be associated with less productive waters (sparser prey) and Brownian movements to be associated with productive shelf or convergence-front habitats (abundant prey). These results are consistent with the Lévy-flight foraging hypothesis, supporting the contention that organism search strategies naturally evolved in such a way that they exploit optimal Lévy patterns.
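
    The maximum-likelihood model comparison behind such tests can be sketched as below: fit a power-law (Lévy-like) and an exponential (Brownian-like) tail to observed move-step lengths and compare them by AIC. The simple Pareto and shifted-exponential forms and the synthetic step data are assumptions; the study's analysis (for example bounded power laws fitted per individual) is more elaborate.

        import numpy as np

        def fit_powerlaw(steps, xmin):
            """MLE exponent for a Pareto tail p(x) ~ x^(-mu), x >= xmin (Levy-like)."""
            x = steps[steps >= xmin]
            mu = 1.0 + len(x) / np.sum(np.log(x / xmin))
            loglik = len(x) * np.log((mu - 1) / xmin) - mu * np.sum(np.log(x / xmin))
            return mu, loglik

        def fit_exponential(steps, xmin):
            """MLE rate for an exponential tail p(x) ~ exp(-lam*(x-xmin)) (Brownian-like)."""
            x = steps[steps >= xmin]
            lam = 1.0 / np.mean(x - xmin)
            loglik = len(x) * np.log(lam) - lam * np.sum(x - xmin)
            return lam, loglik

        def aic(loglik, k):
            return 2 * k - 2 * loglik

        # Toy move-step data drawn from an exponential distribution, so the
        # Brownian-type model should win the AIC comparison.
        rng = np.random.default_rng(2)
        steps = rng.exponential(scale=5.0, size=2000) + 1.0
        xmin = 1.0
        mu, ll_pl = fit_powerlaw(steps, xmin)
        lam, ll_exp = fit_exponential(steps, xmin)
        print('power law:   mu=%.2f  AIC=%.1f' % (mu, aic(ll_pl, 1)))
        print('exponential: lambda=%.2f  AIC=%.1f' % (lam, aic(ll_exp, 1)))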

  9. Precise design-based defect characterization and root cause analysis

    NASA Astrophysics Data System (ADS)

    Xie, Qian; Venkatachalam, Panneerselvam; Lee, Julie; Chen, Zhijin; Zafar, Khurram

    2017-03-01

    As semiconductor manufacturing continues its march towards more advanced technology nodes, it becomes increasingly important to identify and characterize design weak points, which is typically done using a combination of inline inspection data and the physical layout (or design). However, the employed methodologies have been somewhat imprecise, relying greatly on statistical techniques to signal excursions. For example, defect location error that is inherent to inspection tools prevents them from reporting the true locations of defects. Therefore, common operations such as background-based binning that are designed to identify frequently failing patterns cannot reliably identify specific weak patterns. They can only identify an approximate set of possible weak patterns, but within these sets there are many perfectly good patterns. Additionally, characterizing the failure rate of a known weak pattern based on inline inspection data also has a lot of fuzziness due to coordinate uncertainty. SEM (Scanning Electron Microscope) Review attempts to come to the rescue by capturing high resolution images of the regions surrounding the reported defect locations, but SEM images are reviewed by human operators and the weak patterns revealed in those images must be manually identified and classified. Compounding the problem is the fact that a single Review SEM image may contain multiple defective patterns and several of those patterns might not appear defective to the human eye. In this paper we describe a significantly improved methodology that brings advanced computer image processing and design-overlay techniques to better address the challenges posed by today's leading technology nodes. Specifically, new software techniques allow the computer to analyze Review SEM images in detail, to overlay those images with the reference design to detect every defect that might be present in all regions of interest within the overlaid reference design (including several classes of defects that human operators will typically miss), to obtain the exact defect location on design, to compare all defective patterns thus detected against a library of known patterns, and to classify all defective patterns as either new or known. By applying the computer to these tasks, we automate the entire process from defective pattern identification to pattern classification with high precision, and we perform this operation en masse during R & D, ramp, and volume production. By adopting the methodology, whenever a specific weak pattern is identified, we are able to run a series of characterization operations to ultimately arrive at the root cause. These characterization operations can include (a) searching all pre-existing Review SEM images for the presence of the specific weak pattern to determine whether there is any spatial (within die or within wafer) or temporal (within any particular date range, before or after a mask revision, etc.) correlation, (b) understanding the failure rate of the specific weak pattern to prioritize the urgency of the problem, and (c) comparing the weak pattern against an OPC (Optical Proximity Correction) Verification report or a PWQ (Process Window Qualification)/FEM (Focus Exposure Matrix) result to assess the likelihood of it being a litho-sensitive pattern, etc. After resolving the specific weak pattern, we will categorize it as a known pattern, and the engineer will move forward with discovering new weak patterns.

  10. Health care public reporting utilization - user clusters, web trails, and usage barriers on Germany's public reporting portal Weisse-Liste.de.

    PubMed

    Pross, Christoph; Averdunk, Lars-Henrik; Stjepanovic, Josip; Busse, Reinhard; Geissler, Alexander

    2017-04-21

    Quality of care public reporting provides structural, process and outcome information to facilitate hospital choice and strengthen quality competition. Yet, evidence indicates that patients rarely use this information in their decision-making, due to limited awareness of the data and complex and conflicting information. While there is enthusiasm among policy makers for public reporting, clinicians and researchers doubt its overall impact. Almost no study has analyzed how users behave on public reporting portals, which information they seek out and when they abort their search. This study employs web-usage mining techniques on server log data of 17 million user actions from Germany's premier provider transparency portal Weisse-Liste.de (WL.de) between 2012 and 2015. Postal code and ICD search requests facilitate identification of geographical and treatment area usage patterns. User clustering helps to identify user types based on parameters like session length, referrer and page topic visited. First-level Markov chains illustrate common click paths and premature exits. In 2015, the WL.de Hospital Search portal had 2,750 daily users, with 25% mobile traffic, a bounce rate of 38% and 48% of users examining hospital quality information. From 2013 to 2015, user traffic grew at 38% annually. On average users spent 7 min on the portal, with 7.4 clicks and 54 s between clicks. Users request information for many oncologic and orthopedic conditions, for which no process or outcome quality indicators are available. Ten distinct user types, with particular usage patterns and interests, are identified. In particular, the different types of professional and non-professional users need to be addressed differently to avoid high premature exit rates at several key steps in the information search and view process. Of all users, 37% enter hospital information correctly upon entry, while 47% require support in their hospital search. Several onsite and offsite improvement options are identified. Public reporting needs to be directed at the interests of its users, with more outcome quality information for oncology and orthopedics. Customized reporting can cater to the different needs and skill levels of professional and non-professional users. Search engine optimization and hospital quality advocacy can increase website traffic.
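
    A first-level (one-step) Markov chain over click paths of this kind can be estimated as in the sketch below; the page topics and sessions are hypothetical, not WL.de data.

        from collections import Counter, defaultdict

        # Hypothetical session click paths over page topics; 'EXIT' marks the end of
        # a session so that premature exits show up as ordinary transitions.
        sessions = [
            ['home', 'search', 'results', 'hospital', 'quality', 'EXIT'],
            ['home', 'search', 'results', 'EXIT'],
            ['home', 'search', 'search', 'results', 'hospital', 'EXIT'],
            ['home', 'faq', 'EXIT'],
        ]

        counts = defaultdict(Counter)
        for path in sessions:
            for src, dst in zip(path, path[1:]):
                counts[src][dst] += 1

        # First-level Markov transition probabilities P(next page | current page).
        transitions = {
            src: {dst: c / sum(nxt.values()) for dst, c in nxt.items()}
            for src, nxt in counts.items()
        }

        for src, nxt in transitions.items():
            print(src, '->', {d: round(p, 2) for d, p in nxt.items()})
        # e.g. 'results' -> {'hospital': 0.67, 'EXIT': 0.33} flags where searches abort.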

  11. Evolutionary pattern search algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, W.E.

    1995-09-19

    This paper defines a class of evolutionary algorithms called evolutionary pattern search algorithms (EPSAs) and analyzes their convergence properties. This class of algorithms is closely related to evolutionary programming, evolution strategies and real-coded genetic algorithms. EPSAs are self-adapting systems that modify the step size of the mutation operator in response to the success of previous optimization steps. The rule used to adapt the step size can be used to provide a stationary point convergence theory for EPSAs on any continuous function. This convergence theory is based on an extension of the convergence theory for generalized pattern search methods. An experimental analysis of the performance of EPSAs demonstrates that these algorithms can perform a level of global search that is comparable to that of canonical EAs. We also describe a stopping rule for EPSAs, which reliably terminated near stationary points in our experiments. This is the first stopping rule for any class of EAs that can terminate at a given distance from stationary points.
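
    The core self-adaptation idea, contracting the mutation step size after an unsuccessful generation and stopping once it falls below a threshold, can be sketched as follows; this is an illustrative toy, not Hart's EPSA or its convergence-backed stopping rule.

        import numpy as np

        def sphere(x):
            return float(np.sum(x ** 2))

        def epsa_like(f, dim=5, pop=10, step=1.0, contract=0.5, tol=1e-4, seed=3):
            """Evolutionary-pattern-search-flavoured minimiser (illustrative only).

            Offspring are Gaussian mutations of the best point with a shared step size.
            If no offspring improves on the parent, the step size is contracted; the
            run stops once the step falls below `tol`, mimicking a stopping rule that
            terminates near stationary points.
            """
            rng = np.random.default_rng(seed)
            best = rng.uniform(-5, 5, dim)
            best_f = f(best)
            while step > tol:
                cand = best + step * rng.standard_normal((pop, dim))
                vals = np.array([f(c) for c in cand])
                i = int(np.argmin(vals))
                if vals[i] < best_f:
                    best, best_f = cand[i], vals[i]      # success: keep the step size
                else:
                    step *= contract                     # failure: contract the step size
            return best, best_f, step

        x, fx, final_step = epsa_like(sphere)
        print('f(best)=%.3e, final step=%.1e' % (fx, final_step))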

  12. Investigations into the formation of nanocrystalline quantum dot thin films by mist deposition process

    NASA Astrophysics Data System (ADS)

    Kshirsagar, Aditya

    Semiconductor nanocrystalline quantum dots (NQDs) have material properties remarkably different from those of bulk semiconductors with the same material composition. These NQDs have various novel applications in the electronic and photonic industry, such as light emitting diodes (LEDs) and flat-panel displays. In these applications, ultra-thin films of NQDs in the monolayer regime are needed to ensure optimal current transport properties and device efficiency. There is an ongoing search for a suitable method of depositing and patterning such ultra-thin quantum dot films with thicknesses of a few monolayers. Several competing approaches are available, each with its pros and cons. This study explores mist deposition as the technique to fill this void. In this study, ultra-thin films of quantum dots are deposited on diverse substrates and are characterized to understand the mechanics of mist deposition. Various applications of blanket deposited and patterned quantum dot films are studied. The results discussed here include atomic force microscopy analysis of the films to study surface morphology, fluorescence microscopy to study light emission and optical microscope images to study patterning techniques. These results demonstrate the ability of mist deposition to form uniform, defect-free patterned films 1-4 monolayers thick with root mean square (RMS) surface roughness less than 2 nm. LEDs fabricated using mist deposition show a peak luminescence greater than 500 cd/m2 for matched red, yellow and green devices using Alq3 as the electron transport layer, and over 9000 cd/m2 for red devices using ZnO as the electron transport layer. In addition to the experimental approach to study the process and explore potential applications, simulation and modeling are carried out to understand the various aspects of mist deposition. A mathematical model is presented which discusses the atomization process of the precursor solution, the physics involved during the deposition process, and the mechanics of film formation. Results of film morphology simulation using Monte Carlo techniques and process simulation using a multi-physics approach are discussed. Problems in pattern transfer due to electrostatic effects when using shadow masks are presented in a separate chapter.

  13. Pattern-based integer sample motion search strategies in the context of HEVC

    NASA Astrophysics Data System (ADS)

    Maier, Georg; Bross, Benjamin; Grois, Dan; Marpe, Detlev; Schwarz, Heiko; Veltkamp, Remco C.; Wiegand, Thomas

    2015-09-01

    The H.265/MPEG-H High Efficiency Video Coding (HEVC) standard provides a significant increase in coding efficiency compared to its predecessor, the H.264/MPEG-4 Advanced Video Coding (AVC) standard, which, however, comes at the cost of a high computational burden for a compliant encoder. Motion estimation (ME), which is a part of the inter-picture prediction process, typically consumes a high amount of computational resources, while significantly increasing the coding efficiency. In spite of the fact that both the H.265/MPEG-H HEVC and H.264/MPEG-4 AVC standards allow processing motion information on a fractional sample level, motion search algorithms based on the integer sample level remain an integral part of ME. In this paper, a flexible integer sample ME framework is proposed, allowing a significant reduction in ME computation time to be traded off against a coding-efficiency penalty in terms of bit-rate overhead. As a result, through extensive experimentation, an integer sample ME algorithm that provides a good trade-off is derived, incorporating a combination and optimization of known predictive, pattern-based and early termination techniques. The proposed ME framework is implemented on the basis of the HEVC Test Model (HM) reference software, further being compared to the state-of-the-art fast search algorithm, which is a native part of HM. It is observed that for high resolution sequences, the integer sample ME process can be sped up by factors varying from 3.2 to 7.6, resulting in bit-rate overheads of 1.5% and 0.6% for the Random Access (RA) and Low Delay P (LDP) configurations, respectively. In addition, a similar speed-up is observed for sequences with mainly Computer-Generated Imagery (CGI) content at a bit-rate overhead of up to 5.2%.
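
    A pattern-based integer sample search of the kind combined in such frameworks can be sketched as a simple diamond search with an early-termination threshold; the block size, thresholds and synthetic frames below are illustrative assumptions, not the HM implementation.

        import numpy as np

        def sad(cur, ref, bx, by, dx, dy, size):
            """Sum of absolute differences between a current block and a displaced reference block.
            Assumes the displaced block stays inside the frame."""
            blk = cur[by:by + size, bx:bx + size]
            cand = ref[by + dy:by + dy + size, bx + dx:bx + dx + size]
            return int(np.abs(blk.astype(int) - cand.astype(int)).sum())

        LARGE = [(0, 0), (2, 0), (-2, 0), (0, 2), (0, -2), (1, 1), (1, -1), (-1, 1), (-1, -1)]
        SMALL = [(0, 0), (1, 0), (-1, 0), (0, 1), (0, -1)]

        def diamond_search(cur, ref, bx, by, size=16, sr=16, early_exit=64):
            cx, cy = 0, 0
            best = sad(cur, ref, bx, by, 0, 0, size)
            while True:
                if best <= early_exit:                    # early termination on a good-enough match
                    return (cx, cy), best
                improved = False
                for ox, oy in LARGE:                      # large diamond step
                    dx, dy = cx + ox, cy + oy
                    if abs(dx) > sr or abs(dy) > sr:
                        continue
                    cost = sad(cur, ref, bx, by, dx, dy, size)
                    if cost < best:
                        best, cx, cy, improved = cost, dx, dy, True
                if not improved:
                    break
            for ox, oy in SMALL:                          # final small-diamond refinement
                cost = sad(cur, ref, bx, by, cx + ox, cy + oy, size)
                if cost < best:
                    best, cx, cy = cost, cx + ox, cy + oy
            return (cx, cy), best

        rng = np.random.default_rng(4)
        ref = rng.integers(0, 256, size=(128, 128), dtype=np.uint8)
        cur = np.roll(ref, shift=(3, -2), axis=(0, 1))    # current frame = reference shifted by (dy=3, dx=-2)
        print(diamond_search(cur, ref, bx=48, by=48))     # should settle at or near (dx, dy) = (2, -3)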

  14. Inferring consistent functional interaction patterns from natural stimulus FMRI data

    PubMed Central

    Sun, Jiehuan; Hu, Xintao; Huang, Xiu; Liu, Yang; Li, Kaiming; Li, Xiang; Han, Junwei; Guo, Lei

    2014-01-01

    There has been increasing interest in how the human brain responds to natural stimulus such as video watching in the neuroimaging field. Along this direction, this paper presents our effort in inferring consistent and reproducible functional interaction patterns under natural stimulus of video watching among known functional brain regions identified by task-based fMRI. Then, we applied and compared four statistical approaches, including Bayesian network modeling with searching algorithms: greedy equivalence search (GES), Peter and Clark (PC) analysis, independent multiple greedy equivalence search (IMaGES), and the commonly used Granger causality analysis (GCA), to infer consistent and reproducible functional interaction patterns among these brain regions. It is interesting that a number of reliable and consistent functional interaction patterns were identified by the GES, PC and IMaGES algorithms in different participating subjects when they watched multiple video shots of the same semantic category. These interaction patterns are meaningful given current neuroscience knowledge and are reasonably reproducible across different brains and video shots. In particular, these consistent functional interaction patterns are supported by structural connections derived from diffusion tensor imaging (DTI) data, suggesting the structural underpinnings of consistent functional interactions. Our work demonstrates that specific consistent patterns of functional interactions among relevant brain regions might reflect the brain's fundamental mechanisms of online processing and comprehension of video messages. PMID:22440644

  15. A novel tree-based algorithm to discover seismic patterns in earthquake catalogs

    NASA Astrophysics Data System (ADS)

    Florido, E.; Asencio-Cortés, G.; Aznarte, J. L.; Rubio-Escudero, C.; Martínez-Álvarez, F.

    2018-06-01

    A novel methodology is introduced in this research study to detect seismic precursors. Based on an existing approach, the new methodology searches for patterns in the historical data. Such patterns may contain statistical or soil dynamics information. It improves the original version in several aspects. First, new seismicity indicators have been used to characterize earthquakes. Second, a machine learning clustering algorithm has been applied in a very flexible way, thus allowing the discovery of new data groupings. Third, a novel search strategy is proposed in order to obtain non-overlapping patterns. And, fourth, arbitrary lengths of patterns are searched for, thus discovering long- and short-term behaviors that may influence the occurrence of medium-large earthquakes. The methodology has been applied to seven different datasets, from three different regions, namely the Iberian Peninsula, Chile and Japan. Reported results show a remarkable improvement with respect to the former version, in terms of all evaluated quality measures. In particular, the number of false positives has decreased and the positive predictive values increased, both in a very remarkable manner.

  16. SPIKE: AI scheduling techniques for Hubble Space Telescope

    NASA Astrophysics Data System (ADS)

    Johnston, Mark D.

    1991-09-01

    AI (Artificial Intelligence) scheduling techniques for HST are presented in the form of viewgraphs. The following subject areas are covered: domain; HST constraint timescales; HST scheduling; SPIKE overview; SPIKE architecture; constraint representation and reasoning; use of suitability functions by the scheduling agent; SPIKE screen example; advantages of the suitability function framework; limiting search and constraint propagation; scheduling search; stochastic search; repair methods; implementation; and status.

  17. Quantifying seascape structure: Extending terrestrial spatial pattern metrics to the marine realm

    USGS Publications Warehouse

    Wedding, L.M.; Christopher, L.A.; Pittman, S.J.; Friedlander, A.M.; Jorgensen, S.

    2011-01-01

    Spatial pattern metrics have routinely been applied to characterize and quantify structural features of terrestrial landscapes and have demonstrated great utility in landscape ecology and conservation planning. The important role of spatial structure in ecology and management is now commonly recognized, and recent advances in marine remote sensing technology have facilitated the application of spatial pattern metrics to the marine environment. However, it is not yet clear whether concepts, metrics, and statistical techniques developed for terrestrial ecosystems are relevant for marine species and seascapes. To address this gap in our knowledge, we reviewed, synthesized, and evaluated the utility and application of spatial pattern metrics in the marine science literature over the past 30 yr (1980 to 2010). In total, 23 studies characterized seascape structure, of which 17 quantified spatial patterns using a 2-dimensional patch-mosaic model and 5 used a continuously varying 3-dimensional surface model. Most seascape studies followed terrestrial-based studies in their search for ecological patterns and applied or modified existing metrics. Only 1 truly unique metric was found (hydrodynamic aperture applied to Pacific atolls). While there are still relatively few studies using spatial pattern metrics in the marine environment, they have suffered from similar misuse as reported for terrestrial studies, such as the lack of a priori considerations or the problem of collinearity between metrics. Spatial pattern metrics offer great potential for ecological research and environmental management in marine systems, and future studies should focus on (1) the dynamic boundary between the land and sea; (2) quantifying 3-dimensional spatial patterns; and (3) assessing and monitoring seascape change. © Inter-Research 2011.

  18. A k-Vector Approach to Sampling, Interpolation, and Approximation

    NASA Astrophysics Data System (ADS)

    Mortari, Daniele; Rogers, Jonathan

    2013-12-01

    The k-vector search technique is a method designed to perform extremely fast range searching of large databases at computational cost independent of the size of the database. k-vector search algorithms have historically found application in satellite star-tracker navigation systems which index very large star catalogues repeatedly in the process of attitude estimation. Recently, the k-vector search algorithm has been applied to numerous other problem areas including non-uniform random variate sampling, interpolation of 1-D or 2-D tables, nonlinear function inversion, and solution of systems of nonlinear equations. This paper presents algorithms in which the k-vector search technique is used to solve each of these problems in a computationally-efficient manner. In instances where these tasks must be performed repeatedly on a static (or nearly-static) data set, the proposed k-vector-based algorithms offer an extremely fast solution technique that outperforms standard methods.
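
    The sketch below illustrates the k-vector idea for 1-D range searching: precompute, against a straight line spanning the sorted data, how many elements fall below each line point, so that a query only maps its bounds onto the line analytically and filters a small candidate window. The class and parameter names are assumptions, and real k-vector implementations add further refinements.

        import numpy as np

        class KVector:
            """Minimal k-vector range-search table (illustrative sketch)."""

            def __init__(self, data, xi=1e-9):
                self.s = np.sort(np.asarray(data, dtype=float))
                n = len(self.s)
                self.q = self.s[0] - xi
                self.m = (self.s[-1] - self.s[0] + 2 * xi) / (n - 1)
                line = self.m * np.arange(n) + self.q
                self.k = np.searchsorted(self.s, line, side='right')   # k[i] = #elements <= line(i)

            def range(self, lo, hi):
                """All stored values in [lo, hi]: O(1) index arithmetic plus a small exact filter."""
                n = len(self.s)
                j_lo = int(np.clip(np.floor((lo - self.q) / self.m), 0, n - 1))
                j_hi = int(np.clip(np.ceil((hi - self.q) / self.m), 0, n - 1))
                start = max(self.k[j_lo] - 1, 0)            # widen by one to be safe at the edges
                stop = min(self.k[j_hi] + 1, n)
                cand = self.s[start:stop]                   # small superset of the true answer
                return cand[(cand >= lo) & (cand <= hi)]    # exact filter

        rng = np.random.default_rng(5)
        data = rng.uniform(0, 1000, size=100000)
        kv = KVector(data)
        hits = kv.range(250.0, 250.5)
        print(len(hits), bool(np.all((hits >= 250.0) & (hits <= 250.5))))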

  19. Solving search problems by strongly simulating quantum circuits

    PubMed Central

    Johnson, T. H.; Biamonte, J. D.; Clark, S. R.; Jaksch, D.

    2013-01-01

    Simulating quantum circuits using classical computers lets us analyse the inner workings of quantum algorithms. The most complete type of simulation, strong simulation, is believed to be generally inefficient. Nevertheless, several efficient strong simulation techniques are known for restricted families of quantum circuits and we develop an additional technique in this article. Further, we show that strong simulation algorithms perform another fundamental task: solving search problems. Efficient strong simulation techniques allow solutions to a class of search problems to be counted and found efficiently. This enhances the utility of strong simulation methods, known or yet to be discovered, and extends the class of search problems known to be efficiently simulable. Relating strong simulation to search problems also bounds the computational power of efficiently strongly simulable circuits; if they could solve all problems in P this would imply that all problems in NP and #P could be solved in polynomial time. PMID:23390585

  20. E12 sheet plastination: Techniques and applications.

    PubMed

    Ottone, Nicolas Ernesto; Baptista, Carlos A C; Latorre, Rafael; Bianchi, Homero Felipe; Del Sol, Mariano; Fuentes, Ramon

    2017-10-30

    Plastination is an anatomical technique that consists of replacing the liquids and fat of specimens with reactive polymers through forced impregnation in a vacuum. These are then polymerized to achieve the final result. E12 sheet plastination involves epoxy resin impregnation of thin (2-4 mm) and ultra-thin (<2 mm) tissue sheets, producing dry, transparent, odorless, non-toxic and long-lasting sheets. E12 sheet plastination techniques were reviewed using MEDLINE, EMBASE and SciELO databases, and manual searches. After searching, 616 records were found using the online and manual searches (MEDLINE, n: 207; EMBASE, n: 346; SciELO, n: 44; Manual search: 23). Finally, 96 records were included in this review (after duplicates and articles unrelated to the subject were excluded). The aim of this work was to review the E12 sheet plastination technique, searching for articles concerning it, identifying the different variants implemented by researchers since its creation by Gunther von Hagens, and identifying its applications from teaching and research in anatomy to morphological sciences. Clin. Anat., 2017. © 2017 Wiley Periodicals, Inc.

  1. Prediction of Compressional, Shear, and Stoneley Wave Velocities from Conventional Well Log Data Using a Committee Machine with Intelligent Systems

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa

    2012-01-01

    Measurement of compressional, shear, and Stoneley wave velocities, carried out by dipole sonic imager (DSI) logs, provides invaluable data in geophysical interpretation, geomechanical studies and hydrocarbon reservoir characterization. The presented study proposes an improved methodology for making a quantitative formulation between conventional well logs and sonic wave velocities. First, sonic wave velocities were predicted from conventional well logs using artificial neural network, fuzzy logic, and neuro-fuzzy algorithms. Subsequently, a committee machine with intelligent systems was constructed by virtue of a hybrid genetic algorithm-pattern search technique, while outputs of the artificial neural network, fuzzy logic and neuro-fuzzy models were used as inputs of the committee machine. It is capable of improving the accuracy of the final prediction by integrating the outputs of the aforementioned intelligent systems. The hybrid genetic algorithm-pattern search tool, embodied in the structure of the committee machine, assigns a weight factor to each individual intelligent system, indicating its involvement in the overall prediction of DSI parameters. This methodology was implemented in the Asmari formation, which is the major carbonate reservoir rock of an Iranian oil field. A group of 1,640 data points was used to construct the intelligent model, and a group of 800 data points was employed to assess the reliability of the proposed model. The results showed that the committee machine with intelligent systems performed more effectively than the individual intelligent systems acting alone.
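
    The weight-assignment step can be illustrated with a plain pattern (compass) search over ensemble weights, as sketched below; the paper uses a hybrid genetic algorithm-pattern search on real well-log data, whereas the experts and data here are synthetic stand-ins.

        import numpy as np

        rng = np.random.default_rng(6)
        truth = rng.normal(size=300)                          # stand-in for measured DSI velocities
        experts = np.stack([                                  # stand-ins for ANN / fuzzy / neuro-fuzzy outputs
            truth + rng.normal(0, 0.3, 300),
            truth + rng.normal(0, 0.5, 300),
            truth + rng.normal(0, 0.8, 300),
        ])

        def mse(weights):
            """Error of the weighted committee prediction (weights kept non-negative, sum 1)."""
            w = np.clip(weights, 0, None)
            if w.sum() == 0:
                return np.inf
            w = w / w.sum()
            return float(np.mean((w @ experts - truth) ** 2))

        def compass_search(f, x0, step=0.25, tol=1e-4):
            """Plain pattern (compass) search: poll +/- each coordinate, shrink the step on failure."""
            x, fx = np.array(x0, dtype=float), f(x0)
            while step > tol:
                improved = False
                for i in range(len(x)):
                    for delta in (step, -step):
                        trial = x.copy()
                        trial[i] += delta
                        ft = f(trial)
                        if ft < fx:
                            x, fx, improved = trial, ft, True
                if not improved:
                    step *= 0.5
            return x, fx

        w, err = compass_search(mse, np.ones(3) / 3)
        w = np.clip(w, 0, None); w = w / w.sum()
        print('weights:', np.round(w, 3), 'combined MSE:', round(err, 4))
        # the most accurate expert should end up with the largest weight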

  2. Beyond the ridge pattern: multi-informative analysis of latent fingermarks by MALDI mass spectrometry.

    PubMed

    Francese, S; Bradshaw, R; Ferguson, L S; Wolstenholme, R; Clench, M R; Bleay, S

    2013-08-07

    After over a century, fingerprints are still one of the most powerful means of biometric identification. The conventional forensic workflow for suspect identification consists of (i) recovering latent marks from crime scenes using the appropriate enhancement technique and (ii) obtaining an image of the mark to compare either against known suspect prints and/or to search in a Fingerprint Database. The suspect is identified through matching the ridge pattern and local characteristics of the ridge pattern (minutiae). However successful, there are a number of scenarios in which this process may fail; they include the recovery of partial, distorted or smudged marks, poor quality of the image resulting from inadequacy of the enhancement technique applied, extensive scarring/abrasion of the fingertips or absence of suspect's fingerprint records in the database. In all of these instances it would be very desirable to have a technology able to provide additional information from a fingermark exploiting its endogenous and exogenous chemical content. This opportunity could potentially provide new investigative leads, especially when the fingermark comparison and match process fails. We have demonstrated that Matrix Assisted Laser Desorption Ionisation Mass Spectrometry and Mass Spectrometry Imaging (MALDI MSI) can provide multiple images of the same fingermark in one analysis simultaneous with additional intelligence. Here, a review on the pioneering use and development of MALDI MSI for the analysis of latent fingermarks is presented along with the latest achievements on the forensic intelligence retrievable.

  3. Investigating User Search Tactic Patterns and System Support in Using Digital Libraries

    ERIC Educational Resources Information Center

    Joo, Soohyung

    2013-01-01

    This study aims to investigate users' search tactic application and system support in using digital libraries. A user study was conducted with sixty digital library users. The study was designed to answer three research questions: 1) How do users engage in a search process by applying different types of search tactics while conducting different…

  4. A Game of Hide and Seek: Expectations of Clumpy Resources Influence Hiding and Searching Patterns

    PubMed Central

    Wilke, Andreas; Minich, Steven; Panis, Megane; Langen, Tom A.; Skufca, Joseph D.; Todd, Peter M.

    2015-01-01

    Resources are often distributed in clumps or patches in space, unless an agent is trying to protect them from discovery and theft using a dispersed distribution. We uncover human expectations of such spatial resource patterns in collaborative and competitive settings via a sequential multi-person game in which participants hid resources for the next participant to seek. When collaborating, resources were mostly hidden in clumpy distributions, but when competing, resources were hidden in more dispersed (random or hyperdispersed) patterns to increase the searching difficulty for the other player. More dispersed resource distributions came at the cost of higher overall hiding (as well as searching) times, decreased payoffs, and an increased difficulty when the hider had to recall earlier hiding locations at the end of the experiment. Participants’ search strategies were also affected by their underlying expectations, using a win-stay lose-shift strategy appropriate for clumpy resources when searching for collaboratively-hidden items, but moving equally far after finding or not finding an item in competitive settings, as appropriate for dispersed resources. Thus participants showed expectations for clumpy versus dispersed spatial resources that matched the distributions commonly found in collaborative versus competitive foraging settings. PMID:26154661

  5. Soft-Decision Decoding of Binary Linear Block Codes Based on an Iterative Search Algorithm

    NASA Technical Reports Server (NTRS)

    Lin, Shu; Kasami, Tadao; Moorthy, H. T.

    1997-01-01

    This correspondence presents a suboptimum soft-decision decoding scheme for binary linear block codes based on an iterative search algorithm. The scheme uses an algebraic decoder to iteratively generate a sequence of candidate codewords one at a time using a set of test error patterns that are constructed based on the reliability information of the received symbols. When a candidate codeword is generated, it is tested based on an optimality condition. If it satisfies the optimality condition, then it is the most likely (ML) codeword and the decoding stops. If it fails the optimality test, a search for the ML codeword is conducted in a region which contains the ML codeword. The search region is determined by the current candidate codeword and the reliability of the received symbols. The search is conducted through a purged trellis diagram for the given code using the Viterbi algorithm. If the search fails to find the ML codeword, a new candidate is generated using a new test error pattern, and the optimality test and search are renewed. The process of testing and search continues until either the ML codeword is found or all the test error patterns are exhausted and the decoding process is terminated. Numerical results show that the proposed decoding scheme achieves either practically optimal performance or a performance only a fraction of a decibel away from the optimal maximum-likelihood decoding with a significant reduction in decoding complexity compared with the Viterbi decoding based on the full trellis diagram of the codes.
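
    The test-error-pattern ingredient can be sketched with a Chase-style list decoder on a toy (7,4) Hamming code, as below; the optimality test and the purged-trellis search of the actual scheme are not reproduced, and the code, noise level and pattern budget are illustrative assumptions.

        import itertools
        import numpy as np

        # Parity-check matrix of the (7,4) Hamming code: column j is the binary
        # representation of j+1, so a nonzero syndrome directly names the error position.
        H = np.array([[(j >> b) & 1 for j in range(1, 8)] for b in range(3)])

        def syndrome_decode(hard):
            """One-error algebraic decoder for the toy Hamming code."""
            s = H @ hard % 2
            pos = int(s[0] + 2 * s[1] + 4 * s[2])
            cw = hard.copy()
            if pos:
                cw[pos - 1] ^= 1
            return cw

        def chase_decode(soft, n_flip=3):
            """Chase-style list decoding driven by the reliability of received symbols.

            soft: real received values, BPSK mapping 0 -> +1, 1 -> -1 plus noise.
            Test error patterns flip subsets of the n_flip least reliable positions;
            each pattern is algebraically decoded and the candidate with the best
            soft correlation is returned.
            """
            hard = (soft < 0).astype(int)
            weak = np.argsort(np.abs(soft))[:n_flip]          # least reliable positions
            best_cw, best_corr = None, -np.inf
            for r in range(n_flip + 1):
                for flips in itertools.combinations(weak, r):  # one test error pattern
                    trial = hard.copy()
                    trial[list(flips)] ^= 1
                    cw = syndrome_decode(trial)
                    corr = float(np.sum(soft * (1 - 2 * cw)))  # correlation with the +/-1 codeword
                    if corr > best_corr:
                        best_cw, best_corr = cw, corr
            return best_cw

        rng = np.random.default_rng(7)
        codeword = syndrome_decode(rng.integers(0, 2, 7))      # any 7-bit vector snaps to a codeword
        received = (1 - 2 * codeword) + rng.normal(0, 0.6, 7)  # BPSK + noise
        print('sent   :', codeword)
        print('decoded:', chase_decode(received))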

  6. Independent operation of implicit working memory under cognitive load.

    PubMed

    Ji, Eunhee; Lee, Kyung Min; Kim, Min-Shik

    2017-10-01

    Implicit working memory (WM) has been known to operate non-consciously and unintentionally. The current study investigated whether implicit WM is a discrete mechanism from explicit WM in terms of cognitive resource. To induce cognitive resource competition, we used a conjunction search task (Experiment 1) and imposed spatial WM load (Experiment 2a and 2b). Each trial was composed of a set of five consecutive search displays. The location of the first four displays appeared as per pre-determined patterns, but the fifth display could follow the same pattern or not. If implicit WM can extract the moving pattern of stimuli, response times for the fifth target would be faster when it followed the pattern compared to when it did not. Our results showed implicit WM can operate when participants are searching for the conjunction target and even while maintaining spatial WM information. These results suggest that implicit WM is independent from explicit spatial WM. Copyright © 2017. Published by Elsevier Inc.

  7. Seeking health information on the web: positive hypothesis testing.

    PubMed

    Kayhan, Varol Onur

    2013-04-01

    The goal of this study is to investigate positive hypothesis testing among consumers of health information when they search the Web. After demonstrating the extent of positive hypothesis testing using Experiment 1, we conduct Experiment 2 to test the effectiveness of two debiasing techniques. A total of 60 undergraduate students searched a tightly controlled online database developed by the authors to test the validity of a hypothesis. The database had four abstracts that confirmed the hypothesis and three abstracts that disconfirmed it. Findings of Experiment 1 showed that the majority of participants (85%) exhibited positive hypothesis testing. In Experiment 2, we found that the recommendation technique was not effective in reducing positive hypothesis testing since none of the participants assigned to this server could retrieve disconfirming evidence. Experiment 2 also showed that the incorporation technique successfully reduced positive hypothesis testing since 75% of the participants could retrieve disconfirming evidence. Positive hypothesis testing on the Web is an understudied topic. More studies are needed to validate the effectiveness of the debiasing techniques discussed in this study and develop new techniques. Search engine developers should consider developing new options for users so that both confirming and disconfirming evidence can be presented in search results as users test hypotheses using search engines. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  8. Idiosyncratic Patterns of Representational Similarity in Prefrontal Cortex Predict Attentional Performance.

    PubMed

    Lee, Jeongmi; Geng, Joy J

    2017-02-01

    The efficiency of finding an object in a crowded environment depends largely on the similarity of nontargets to the search target. Models of attention theorize that the similarity is determined by representations stored within an "attentional template" held in working memory. However, the degree to which the contents of the attentional template are individually unique and where those idiosyncratic representations are encoded in the brain are unknown. We investigated this problem using representational similarity analysis of human fMRI data to measure the common and idiosyncratic representations of famous face morphs during an identity categorization task; data from the categorization task were then used to predict performance on a separate identity search task. We hypothesized that the idiosyncratic categorical representations of the continuous face morphs would predict their distractability when searching for each target identity. The results identified that patterns of activation in the lateral prefrontal cortex (LPFC) as well as in face-selective areas in the ventral temporal cortex were highly correlated with the patterns of behavioral categorization of face morphs and search performance that were common across subjects. However, the individually unique components of the categorization behavior were reliably decoded only in right LPFC. Moreover, the neural pattern in right LPFC successfully predicted idiosyncratic variability in search performance, such that reaction times were longer when distractors had a higher probability of being categorized as the target identity. These results suggest that the prefrontal cortex encodes individually unique components of categorical representations that are also present in attentional templates for target search. Everyone's perception of the world is uniquely shaped by personal experiences and preferences. Using functional MRI, we show that individual differences in the categorization of face morphs between two identities could be decoded from the prefrontal cortex and the ventral temporal cortex. Moreover, the individually unique representations in prefrontal cortex predicted idiosyncratic variability in attentional performance when looking for each identity in the "crowd" of another morphed face in a separate search task. Our results reveal that the representation of task-related information in prefrontal cortex is individually unique and preserved across categorization and search performance. This demonstrates the possibility of predicting individual behaviors across tasks with patterns of brain activity. Copyright © 2017 the authors 0270-6474/17/371257-12$15.00/0.

  9. How to keep your pants on: historic metamaterials and elasticity before the invention of elastic

    NASA Astrophysics Data System (ADS)

    Matsumoto, Elisabetta A.; Mahadevan, L.

    2015-03-01

    How do you create stretching from an inextensible material? Remarkably, the centuries-old embroidery technique known as smocking accomplishes just this. With the recent explosion of origami-based engineering, the search is on for a set of design principles to generate materials with prescribed mechanical properties. This quickly becomes a complex mathematical question due to the strict constraints of rigid origami imposed by the inextensibility of paper. Softening these constraints by considering woven fabrics, which have two orthogonal inextensible directions and a skewed soft shear mode, opens up a zoo of possible configurations. We explore the emergence of elastic properties in smocked fabrics as functions of both fabric elasticity and smocking pattern.

  10. A hybrid flower pollination algorithm based modified randomized location for multi-threshold medical image segmentation.

    PubMed

    Wang, Rui; Zhou, Yongquan; Zhao, Chengyan; Wu, Haizhou

    2015-01-01

    Multi-threshold image segmentation is a powerful image processing technique that is used for the preprocessing of pattern recognition and computer vision. However, traditional multilevel thresholding methods are computationally expensive because they involve exhaustively searching the optimal thresholds to optimize the objective functions. To overcome this drawback, this paper proposes a flower pollination algorithm with a randomized location modification. The proposed algorithm is used to find optimal threshold values for maximizing Otsu's objective functions with regard to eight medical grayscale images. When benchmarked against other state-of-the-art evolutionary algorithms, the new algorithm proves itself to be robust and effective through numerical experimental results including Otsu's objective values and standard deviations.
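
    A sketch of the underlying multilevel Otsu objective, paired with a simple random-perturbation search in place of the flower pollination algorithm, is given below; the histogram, class means and search parameters are synthetic assumptions.

        import numpy as np

        def otsu_objective(hist, thresholds):
            """Between-class variance for a set of thresholds over a 256-bin histogram."""
            p = hist / hist.sum()
            levels = np.arange(256)
            mu_total = float(np.sum(levels * p))
            edges = [0] + sorted(int(t) for t in thresholds) + [256]
            var = 0.0
            for lo, hi in zip(edges, edges[1:]):
                w = float(p[lo:hi].sum())
                if w <= 0:
                    continue
                mu = float(np.sum(levels[lo:hi] * p[lo:hi])) / w
                var += w * (mu - mu_total) ** 2
            return var

        def random_search_thresholds(hist, k=3, iters=5000, seed=8):
            """Simple randomized perturbation search over k-threshold vectors (a stand-in
            for the flower-pollination-style optimiser; exhaustive search is far slower)."""
            rng = np.random.default_rng(seed)
            best = np.sort(rng.integers(1, 255, size=k))
            best_val = otsu_objective(hist, best)
            for _ in range(iters):
                cand = np.sort(np.clip(best + rng.integers(-10, 11, size=k), 1, 254))
                val = otsu_objective(hist, cand)
                if val > best_val:
                    best, best_val = cand, val
            return best, best_val

        # Synthetic image with four grey-level populations, so k=3 thresholds are apt.
        rng = np.random.default_rng(9)
        img = np.concatenate([rng.normal(m, 8, 4096) for m in (40, 100, 160, 220)])
        hist, _ = np.histogram(np.clip(img, 0, 255), bins=256, range=(0, 256))
        print(random_search_thresholds(hist))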

  11. Context-Sensitive Grammar Transform: Compression and Pattern Matching

    NASA Astrophysics Data System (ADS)

    Maruyama, Shirou; Tanaka, Youhei; Sakamoto, Hiroshi; Takeda, Masayuki

    A context-sensitive grammar transform framework for speeding up compressed pattern matching (CPM) is proposed. A greedy compression algorithm with the transform model is presented as well as a Knuth-Morris-Pratt (KMP)-type compressed pattern matching algorithm. The compression ratio is comparable to that of gzip and Re-Pair, and the search speed of our CPM algorithm is almost twice as fast as the KMP-type CPM algorithm on Byte-Pair-Encoding by Shibata et al. [18], and, in the case of short patterns, faster than the Boyer-Moore-Horspool algorithm with the stopper encoding by Rautio et al. [14], which is regarded as one of the best combinations allowing practically fast search.
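
    For reference, the uncompressed-text KMP matcher that such compressed-domain algorithms emulate can be sketched as follows (the standard algorithm, not the paper's grammar-transform variant):

        def kmp_failure(pattern):
            """Longest proper border lengths used by the Knuth-Morris-Pratt matcher."""
            fail = [0] * len(pattern)
            k = 0
            for i in range(1, len(pattern)):
                while k > 0 and pattern[i] != pattern[k]:
                    k = fail[k - 1]
                if pattern[i] == pattern[k]:
                    k += 1
                fail[i] = k
            return fail

        def kmp_search(text, pattern):
            """Return all start positions of pattern in text in O(len(text) + len(pattern))."""
            fail, hits, k = kmp_failure(pattern), [], 0
            for i, c in enumerate(text):
                while k > 0 and c != pattern[k]:
                    k = fail[k - 1]
                if c == pattern[k]:
                    k += 1
                if k == len(pattern):
                    hits.append(i - k + 1)
                    k = fail[k - 1]
            return hits

        print(kmp_search("abracadabra abracadabra", "abra"))   # [0, 7, 12, 19]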

  12. Google vs. the Library (Part II): Student Search Patterns and Behaviors When Using Google and a Federated Search Tool

    ERIC Educational Resources Information Center

    Georgas, Helen

    2014-01-01

    This study examines the information-seeking behavior of undergraduate students within a research context. Student searches were recorded while the participants used Google and a library (federated) search tool to find sources (one book, two articles, and one other source of their choosing) for a selected topic. The undergraduates in this study…

  13. Bounding the Resource Availability of Partially Ordered Events with Constant Resource Impact

    NASA Technical Reports Server (NTRS)

    Frank, Jeremy

    2004-01-01

    We compare existing techniques to bound the resource availability of partially ordered events. We first show that, contrary to intuition, two existing techniques, one due to Laborie and one due to Muscettola, are not strictly comparable in terms of the size of the search trees generated under chronological search with a fixed heuristic. We describe a generalization of these techniques called the Flow Balance Constraint to tightly bound the amount of available resource for a set of partially ordered events with piecewise constant resource impact. We prove that the new technique generates smaller proof trees under chronological search with a fixed heuristic, at little increase in computational expense. We then show how to construct tighter resource bounds but at increased computational cost.

  14. Search Radar Track-Before-Detect Using the Hough Transform.

    DTIC Science & Technology

    1995-03-01

    …improved target detection scheme, applicable to search radars, using the Hough transform image processing technique. The system concept involves a track-before-detect processing method which allows previous data to help in target detection. The technique provides many advantages compared to…
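
    The Hough accumulation at the heart of such schemes can be sketched as below: each detection votes for all (rho, theta) lines passing through it, and a persistent track shows up as an accumulator peak. The grid sizes and the synthetic detections-plus-clutter are illustrative assumptions.

        import numpy as np

        def hough_lines(points, img_size, n_theta=180, n_rho=200):
            """Accumulate votes in (rho, theta) space for a set of detection points."""
            diag = np.hypot(*img_size)
            thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
            rhos = np.linspace(-diag, diag, n_rho)
            acc = np.zeros((n_rho, n_theta), dtype=int)
            for x, y in points:
                rho = x * np.cos(thetas) + y * np.sin(thetas)       # one sinusoid of votes per point
                idx = np.digitize(rho, rhos) - 1
                acc[idx, np.arange(n_theta)] += 1
            return acc, rhos, thetas

        # Noisy detections scattered around the line y = 0.5 * x + 10 plus clutter;
        # the accumulator peak recovers the line despite per-scan misses and clutter.
        rng = np.random.default_rng(10)
        xs = rng.uniform(0, 100, 60)
        track = np.column_stack([xs, 0.5 * xs + 10 + rng.normal(0, 0.5, 60)])
        clutter = rng.uniform(0, 100, size=(60, 2))
        acc, rhos, thetas = hough_lines(np.vstack([track, clutter]), img_size=(100, 100))
        i, j = np.unravel_index(np.argmax(acc), acc.shape)
        print('peak votes: %d  rho=%.1f  theta=%.2f rad' % (acc[i, j], rhos[i], thetas[j]))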

  15. Exhaustive search system and method using space-filling curves

    DOEpatents

    Spires, Shannon V.

    2003-10-21

    A search system and method for one agent or for multiple agents using a space-filling curve provides a way to control one or more agents to cover an area of any space of any dimensionality using an exhaustive search pattern. An example of the space-filling curve is a Hilbert curve. The search area can be a physical geography, a cyberspace search area, or an area searchable by computing resources. The search agent can be one or more physical agents, such as a robot, and can be software agents for searching cyberspace.
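
    A sketch of how a Hilbert curve yields an exhaustive coverage path is given below, using the standard distance-to-coordinate conversion; it illustrates the space-filling-curve idea only, not the patented system.

        def hilbert_d2xy(n, d):
            """Map distance d along a Hilbert curve to (x, y) on an n x n grid (n a power of two)."""
            x = y = 0
            s, t = 1, d
            while s < n:
                rx = 1 & (t // 2)
                ry = 1 & (t ^ rx)
                if ry == 0:                      # rotate the quadrant when needed
                    if rx == 1:
                        x, y = s - 1 - x, s - 1 - y
                    x, y = y, x
                x += s * rx
                y += s * ry
                t //= 4
                s *= 2
            return x, y

        # Exhaustive search pattern for one agent over an 8 x 8 area: visiting cells in
        # Hilbert order guarantees full coverage while keeping consecutive cells adjacent.
        n = 8
        path = [hilbert_d2xy(n, d) for d in range(n * n)]
        assert len(set(path)) == n * n                      # every cell visited exactly once
        assert all(abs(a[0] - b[0]) + abs(a[1] - b[1]) == 1 for a, b in zip(path, path[1:]))
        print(path[:6])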

  16. Integrating unified medical language system and association mining techniques into relevance feedback for biomedical literature search.

    PubMed

    Ji, Yanqing; Ying, Hao; Tran, John; Dews, Peter; Massanari, R Michael

    2016-07-19

    Finding highly relevant articles from biomedical databases is challenging not only because it is often difficult to accurately express a user's underlying intention through keywords but also because a keyword-based query normally returns a long list of hits with many citations being unwanted by the user. This paper proposes a novel biomedical literature search system, called BiomedSearch, which supports complex queries and relevance feedback. The system employed association mining techniques to build a k-profile representing a user's relevance feedback. More specifically, we developed a weighted interest measure and an association mining algorithm to find the strength of association between a query and each concept in the article(s) selected by the user as feedback. The top concepts were utilized to form a k-profile used for the next round of searching. BiomedSearch relies on Unified Medical Language System (UMLS) knowledge sources to map text files to standard biomedical concepts. It was designed to support queries with any level of complexity. A prototype of the BiomedSearch software was built and preliminarily evaluated using the Genomics data from the TREC (Text Retrieval Conference) 2006 Genomics Track. Initial experiment results indicated that BiomedSearch increased the mean average precision (MAP) for a set of queries. With UMLS and association mining techniques, BiomedSearch can effectively utilize users' relevance feedback to improve the performance of biomedical literature search.
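
    The k-profile construction can be sketched as below: score each concept appearing in the user-approved documents with a simple confidence-times-specificity association measure and keep the top k. The scoring formula, concept sets and k are assumptions; the paper's weighted interest measure and UMLS mapping are not reproduced.

        from collections import Counter

        def build_k_profile(feedback_docs, all_docs, k=3):
            """Rank concepts that co-occur with the user-approved (feedback) documents.

            Score = support in the feedback set weighted by how specific the concept is
            in the whole collection (a lift-like measure); the top-k concepts form the
            profile used to expand the next-round query.
            """
            n_all = len(all_docs)
            n_fb = len(feedback_docs)
            fb_counts = Counter(c for doc in feedback_docs for c in set(doc))
            all_counts = Counter(c for doc in all_docs for c in set(doc))
            scores = {
                c: (cnt / n_fb) * (n_all / all_counts[c])   # confidence x inverse collection frequency
                for c, cnt in fb_counts.items()
            }
            return sorted(scores, key=scores.get, reverse=True)[:k]

        # Hypothetical documents already mapped to concept identifiers (UMLS-like strings).
        collection = [
            {'myocardial infarction', 'aspirin', 'risk factor'},
            {'myocardial infarction', 'troponin', 'diagnosis'},
            {'diabetes', 'risk factor', 'diet'},
            {'aspirin', 'stroke', 'prevention'},
            {'troponin', 'assay', 'diagnosis'},
        ]
        feedback = collection[:2]      # documents the user marked as relevant
        print(build_k_profile(feedback, collection))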

  17. A modified three-term PRP conjugate gradient algorithm for optimization models.

    PubMed

    Wu, Yanlin

    2017-01-01

    The nonlinear conjugate gradient (CG) algorithm is a very effective method for optimization, especially for large-scale problems, because of its low memory requirement and simplicity. Zhang et al. (IMA J. Numer. Anal. 26:629-649, 2006) first proposed a three-term CG algorithm based on the well-known Polak-Ribière-Polyak (PRP) formula for unconstrained optimization, where their method has the sufficient descent property without any line search technique. They proved global convergence under the Armijo line search, but this fails under the Wolfe line search technique. Inspired by their method, we make a further study and give a modified three-term PRP CG algorithm. The presented method possesses the following features: (1) the sufficient descent property also holds without any line search technique; (2) the trust region property of the search direction is automatically satisfied; (3) the steplength is bounded from below; (4) global convergence is established under the Wolfe line search. Numerical results show that the new algorithm is more effective than the standard method.
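
    A minimal sketch of a three-term PRP direction with Armijo backtracking is given below (by construction the direction satisfies g'd = -||g||^2, i.e. sufficient descent); the safeguards and the Wolfe-line-search analysis of the modified algorithm in the paper are not reproduced.

        import numpy as np

        def rosenbrock(x):
            return float(100 * (x[1] - x[0] ** 2) ** 2 + (1 - x[0]) ** 2)

        def rosenbrock_grad(x):
            return np.array([-400 * x[0] * (x[1] - x[0] ** 2) - 2 * (1 - x[0]),
                             200 * (x[1] - x[0] ** 2)])

        def armijo(f, x, d, g, sigma=1e-4, rho=0.5):
            """Backtracking Armijo line search: largest rho^m satisfying the descent condition."""
            alpha = 1.0
            while f(x + alpha * d) > f(x) + sigma * alpha * float(g @ d):
                alpha *= rho
                if alpha < 1e-12:
                    break
            return alpha

        def three_term_prp(f, grad, x0, tol=1e-6, max_iter=5000):
            x = np.array(x0, dtype=float)
            g = grad(x)
            d = -g
            for _ in range(max_iter):
                if np.linalg.norm(g) < tol:
                    break
                alpha = armijo(f, x, d, g)
                x_new = x + alpha * d
                g_new = grad(x_new)
                y = g_new - g
                gg = float(g @ g)
                beta = float(g_new @ y) / gg
                theta = float(g_new @ d) / gg
                d = -g_new + beta * d - theta * y      # three-term PRP direction
                x, g = x_new, g_new
            return x, f(x)

        print(three_term_prp(rosenbrock, rosenbrock_grad, [-1.2, 1.0]))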

  18. Generalizing Backtrack-Free Search: A Framework for Search-Free Constraint Satisfaction

    NASA Technical Reports Server (NTRS)

    Jonsson, Ari K.; Frank, Jeremy

    2000-01-01

    Tractable classes of constraint satisfaction problems are of great importance in artificial intelligence. Identifying and taking advantage of such classes can significantly speed up constraint problem solving. In addition, tractable classes are utilized in applications where strict worst-case performance guarantees are required, such as constraint-based plan execution. In this work, we present a formal framework for search-free (backtrack-free) constraint satisfaction. The framework is based on general procedures, rather than specific propagation techniques, and thus generalizes existing techniques in this area. We also relate search-free problem solving to the notion of decision sets and use the result to provide a constructive criterion that is sufficient to guarantee search-free problem solving.

  19. Hyperspace geography: visualizing fitness landscapes beyond 4D.

    PubMed

    Wiles, Janet; Tonkes, Bradley

    2006-01-01

    Human perception is finely tuned to extract structure about the 4D world of time and space as well as properties such as color and texture. Developing intuitions about spatial structure beyond 4D requires exploiting other perceptual and cognitive abilities. One of the most natural ways to explore complex spaces is for a user to actively navigate through them, using local explorations and global summaries to develop intuitions about structure, and then testing the developing ideas by further exploration. This article provides a brief overview of a technique for visualizing surfaces defined over moderate-dimensional binary spaces, by recursively unfolding them onto a 2D hypergraph. We briefly summarize the uses of a freely available Web-based visualization tool, Hyperspace Graph Paper (HSGP), for exploring fitness landscapes and search algorithms in evolutionary computation. HSGP provides a way for a user to actively explore a landscape, from simple tasks such as mapping the neighborhood structure of different points, to seeing global properties such as the size and distribution of basins of attraction or how different search algorithms interact with landscape structure. It has been most useful for exploring recursive and repetitive landscapes, and its strength is that it allows intuitions to be developed through active navigation by the user, and exploits the visual system's ability to detect pattern and texture. The technique is most effective when applied to continuous functions over Boolean variables using 4 to 16 dimensions.

  20. Scavengers on the move: behavioural changes in foraging search patterns during the annual cycle.

    PubMed

    López-López, Pascual; Benavent-Corai, José; García-Ripollés, Clara; Urios, Vicente

    2013-01-01

    Optimal foraging theory predicts that animals will tend to maximize foraging success by optimizing search strategies. However, how organisms detect sparsely distributed food resources remains an open question. When targets are sparse and unpredictably distributed, a Lévy strategy should maximize foraging success. By contrast, when resources are abundant and regularly distributed, simple brownian random movement should be sufficient. Although very different groups of organisms exhibit Lévy motion, the shift from a Lévy to a brownian search strategy has been suggested to depend on internal and external factors such as sex, prey density, or environmental context. However, animal response at the individual level has received little attention. We used GPS satellite-telemetry data of Egyptian vultures Neophron percnopterus to examine movement patterns at the individual level during consecutive years, with particular interest in the variations in foraging search patterns during the different periods of the annual cycle (i.e. breeding vs. non-breeding). Our results show that vultures followed a brownian search strategy in their wintering sojourn in Africa, whereas they exhibited a more complex foraging search pattern at breeding grounds in Europe, including Lévy motion. Interestingly, our results showed that individuals shifted between search strategies within the same period of the annual cycle in successive years. Results could be primarily explained by the different environmental conditions in which foraging activities occur. However, the high degree of behavioural flexibility exhibited during the breeding period in contrast to the non-breeding period is challenging, suggesting that not only environmental conditions explain individuals' behaviour but also individuals' cognitive abilities (e.g., memory effects) could play an important role. Our results support the growing awareness about the role of behavioural flexibility at the individual level, adding new empirical evidence about how animals in general, and particularly scavengers, solve the problem of efficiently finding food resources.
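
    The contrast between the two movement models mentioned above can be sketched by the way step lengths are drawn: exponentially distributed steps give a Brownian-like walk, while steps from a truncated power law with exponent μ ≈ 2 give a Lévy-like walk. The minimal Python sketch below uses illustrative parameter values and is not the analysis applied to the vulture tracking data.

```python
import numpy as np

rng = np.random.default_rng(0)

def brownian_steps(n, mean_step=1.0):
    # Exponentially distributed step lengths -> Brownian-like random walk.
    return rng.exponential(mean_step, size=n)

def levy_steps(n, mu=2.0, l_min=1.0, l_max=1000.0):
    # Truncated power-law step lengths, p(l) ~ l**(-mu) on [l_min, l_max],
    # drawn by inverse-transform sampling (requires mu != 1).
    u = rng.uniform(size=n)
    a = 1.0 - mu
    return (l_min**a + u * (l_max**a - l_min**a)) ** (1.0 / a)

def walk(steps):
    # Turn step lengths into a 2D walk with uniformly random headings.
    angles = rng.uniform(0.0, 2.0 * np.pi, size=len(steps))
    return np.cumsum(np.c_[steps * np.cos(angles), steps * np.sin(angles)], axis=0)

print("Brownian net displacement:", np.linalg.norm(walk(brownian_steps(10_000))[-1]))
print("Levy net displacement:    ", np.linalg.norm(walk(levy_steps(10_000))[-1]))
```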

  1. Search query data to monitor interest in behavior change: application for public health.

    PubMed

    Carr, Lucas J; Dunsiger, Shira I

    2012-01-01

    There is a need for effective interventions and policies that target the leading preventable causes of death in the U.S. (e.g., smoking, overweight/obesity, physical inactivity). Such efforts could be aided by the use of publicly available, real-time search query data that illustrate times and locations of high and low public interest in behaviors related to preventable causes of death. This study explored patterns of search query activity for the terms 'weight', 'diet', 'fitness', and 'smoking' using Google Insights for Search. Search activity for 'weight', 'diet', 'fitness', and 'smoking' conducted within the United States via Google between January 4th, 2004 (first date data was available) and November 28th, 2011 (date of data download and analysis) were analyzed. Using a generalized linear model, we explored the effects of time (month) on mean relative search volume for all four terms. Models suggest a significant effect of month on mean search volume for all four terms. Search activity for all four terms was highest in January with observable declines throughout the remainder of the year. These findings demonstrate discernable temporal patterns of search activity for four areas of behavior change. These findings could be used to inform the timing, location and messaging of interventions, campaigns and policies targeting these behaviors.
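
    A hedged sketch of the kind of analysis described above: fitting a generalized linear model with month as a categorical predictor of relative search volume. The data here are simulated as a stand-in for the Google Insights for Search download, so the column names and the seasonal signal are assumptions of this example.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated weekly relative-search-volume series for one term (stand-in data).
rng = np.random.default_rng(1)
dates = pd.date_range("2004-01-04", "2011-11-28", freq="W")
volume = 60 + 20 * (dates.month == 1) - 1.0 * dates.month + rng.normal(0, 5, len(dates))
df = pd.DataFrame({"volume": volume, "month": dates.month})

# Generalized linear model: effect of month (categorical) on mean search volume.
fit = smf.glm("volume ~ C(month)", data=df, family=sm.families.Gaussian()).fit()
print(fit.summary())
```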

  2. Time Patterns in Remote OPAC Use.

    ERIC Educational Resources Information Center

    Lucas, Thomas A.

    1993-01-01

    Describes a transaction log analysis of the New York Public Library research libraries' OPAC (online public access catalog). Much of the remote searching occurred when the libraries were closed and was more evenly distributed than internal searching, demonstrating that remote searching could expand access and reduce peak system loads. (Contains…

  3. High activity and Levy searches: jellyfish can search the water column like fish.

    PubMed

    Hays, Graeme C; Bastian, Thomas; Doyle, Thomas K; Fossette, Sabrina; Gleiss, Adrian C; Gravenor, Michael B; Hobson, Victoria J; Humphries, Nicolas E; Lilley, Martin K S; Pade, Nicolas G; Sims, David W

    2012-02-07

    Over-fishing may lead to a decrease in fish abundance and a proliferation of jellyfish. Active movements and prey search might be thought to provide a competitive advantage for fish, but here we use data-loggers to show that the frequently occurring coastal jellyfish (Rhizostoma octopus) does not simply passively drift to encounter prey. Jellyfish (327 days of data from 25 jellyfish with depth collected every 1 min) showed very dynamic vertical movements, with their integrated vertical movement averaging 619.2 m per day, more than 60 times the water depth where they were tagged. The majority of movement patterns were best approximated by exponential models describing normal random walks. However, jellyfish also showed switching behaviour from exponential patterns to patterns best fitted by a truncated Lévy distribution with exponents (mean μ=1.96, range 1.2-2.9) close to the theoretical optimum for searching for sparse prey (μ_opt ≈ 2.0). Complex movements in these 'simple' animals may help jellyfish to compete effectively with fish for plankton prey, which may enhance their ability to increase in dominance in perturbed ocean systems.

  4. Googling in anatomy education: Can google trends inform educators of national online search patterns of anatomical syllabi?

    PubMed

    Phelan, Nigel; Davy, Shane; O'Keeffe, Gerard W; Barry, Denis S

    2017-03-01

    The role of e-learning platforms in anatomy education continues to expand as self-directed learning is promoted in higher education. Although a wide range of e-learning resources are available, determining student use of non-academic internet resources requires novel approaches. One such approach that may be useful is the Google Trends© web application. To determine the feasibility of Google Trends to gain insights into anatomy-related online searches, Google Trends data from the United States from January 2010 to December 2015 were analyzed. Data collected were based on the recurrence of keywords related to head and neck anatomy generated from the American Association of Clinical Anatomists and the Anatomical Society suggested anatomy syllabi. Relative search volume (RSV) data were analyzed for seasonal periodicity and their overall temporal trends. Following exclusions due to insufficient search volume data, 29 out of 36 search terms were analyzed. Significant seasonal patterns occurred in 23 search terms. Thirty-nine seasonal peaks were identified, mainly in October and April, coinciding with teaching periods in anatomy curricula. A positive correlation of RSV with time over the 6-year study period occurred in 25 out of 29 search terms. These data demonstrate how Google Trends may offer insights into the nature and timing of online search patterns of anatomical syllabi and may potentially inform the development and timing of targeted online supports to ensure that students of anatomy have the opportunity to engage with online content that is both accurate and fit for purpose. Anat Sci Educ 10: 152-159. © 2016 American Association of Anatomists.

  5. A hybrid, auto-adaptive and rule-based multi-agent approach using evolutionary algorithms for improved searching

    NASA Astrophysics Data System (ADS)

    Izquierdo, Joaquín; Montalvo, Idel; Campbell, Enrique; Pérez-García, Rafael

    2016-08-01

    Selecting the most appropriate heuristic for solving a specific problem is not easy, for many reasons. This article focuses on one of these reasons: traditionally, the solution search process has operated in a given manner regardless of the specific problem being solved, and the process has been the same regardless of the size, complexity and domain of the problem. To cope with this situation, search processes should mould the search into areas of the search space that are meaningful for the problem. This article builds on previous work in the development of a multi-agent paradigm using techniques derived from knowledge discovery (data-mining techniques) on databases of so-far visited solutions. The aim is to improve the search mechanisms, increase computational efficiency and use rules to enrich the formulation of optimization problems, while reducing the search space and catering to realistic problems.

  6. Knowledge discovery from data as a framework to decision support in medical domains

    PubMed Central

    Gibert, Karina

    2009-01-01

    Introduction Knowledge discovery from data (KDD) is a multidisciplinary discipline which appeared in 1996 for “the non-trivial identification of valid, novel, potentially useful, ultimately understandable patterns in data”. Pre-treatment of data and post-processing are as important as the data exploitation (Data Mining) itself. Different analysis techniques can be properly combined to produce explicit knowledge from data. Methods Hybrid KDD methodologies combining Artificial Intelligence with Statistics and visualization have been used to identify patterns in complex medical phenomena: experts provide prior knowledge (pK); it biases the search for distinguishable groups of homogeneous objects; support-interpretation tools (CPG) assist experts in the conceptualization and labelling of discovered patterns, consistently with the pK. Results Patterns of dependency in mental disabilities supported decision-making on legislation of the Spanish Dependency Law in Catalonia. Relationships between the type of neurorehabilitation treatment and patterns of response to brain damage are assessed. Patterns of perceived QOL over time are used in spinal cord lesion to improve social inclusion. Conclusion Reality is more and more complex, and classical data analyses are not powerful enough to model it. New methodologies are required that embrace multidisciplinarity and stress the production of understandable models. Interaction with the experts is critical for generating meaningful results that can really support decision-making; it is particularly valuable to transfer the pK to the system and to interpret results in close interaction with experts. KDD is a valuable paradigm, particularly when facing very complex domains that are not yet well understood, like many medical phenomena.

  7. Empirical tests of the role of disruptive coloration in reducing detectability

    PubMed Central

    Fraser, Stewart; Callahan, Alison; Klassen, Dana; Sherratt, Thomas N

    2007-01-01

    Disruptive patterning is a potentially universal camouflage technique that is thought to enhance concealment by rendering the detection of body shapes more difficult. In a recent series of field experiments, artificial moths with markings that extended to the edges of their ‘wings’ survived at higher rates than moths with the same edge patterns inwardly displaced. While this result seemingly indicates a benefit to obscuring edges, it is possible that the higher density markings of the inwardly displaced patterns concomitantly reduced their extent of background matching. Likewise, it has been suggested that the mealworm baits placed on the artificial moths could have created differential contrasts with different moth patterns. To address these concerns, we conducted controlled trials in which human subjects searched for computer-generated moth images presented against images of oak trees. Moths with edge-extended disruptive markings survived at higher rates, and took longer to find, than all other moth types, whether presented sequentially or simultaneously. However, moths with no edge markings and reduced interior pattern density survived better than their high-density counterparts, indicating that background matching may have played a so-far unrecognized role in the earlier experiments. Our disruptively patterned non-background-matching moths also had the lowest overall survivorship, indicating that disruptive coloration alone may not provide significant protection from predators. Collectively, our results provide independent support for the survival value of disruptive markings and demonstrate that there are common features in human and avian perception of camouflage. PMID:17360282

  8. Non-Lethal Weapons Program

    Science.gov Websites

    Marines of the 26th Marine Expeditionary Unit (MEU) practice non-lethal control techniques. U.S. Department of Defense Non-Lethal Weapons Program website.

  9. Block Architecture Problem with Depth First Search Solution and Its Application

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Abdullah, Dahlan; Simarmata, Janner; Pranolo, Andri; Saleh Ahmar, Ansari; Hidayat, Rahmat; Napitupulu, Darmawan; Nurdiyanto, Heri; Febriadi, Bayu; Zamzami, Z.

    2018-01-01

    Searching is a common process performed by many computer users. The Raita algorithm is one algorithm that can be used to match and find information in accordance with the patterns entered. The Raita algorithm was applied to a file search application using the Java programming language; testing showed that the file search is fast, gives accurate results, and supports many data types.

  10. Searching Process with Raita Algorithm and its Application

    NASA Astrophysics Data System (ADS)

    Rahim, Robbi; Saleh Ahmar, Ansari; Abdullah, Dahlan; Hartama, Dedy; Napitupulu, Darmawan; Putera Utama Siahaan, Andysah; Hasan Siregar, Muhammad Noor; Nasution, Nurliana; Sundari, Siti; Sriadhi, S.

    2018-04-01

    Searching is a common process performed by many computer users. The Raita algorithm is one algorithm that can be used to match and find information in accordance with the patterns entered. The Raita algorithm was applied to a file search application using the Java programming language; testing showed that the file search is fast, gives accurate results, and supports many data types.
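
    For readers unfamiliar with the Raita algorithm referenced above, the sketch below shows its core idea in Python: compare the last, first, and middle characters of the window before comparing the whole pattern, and shift using a Horspool-style bad-character table. This is an illustrative implementation, not the Java file-search application from the record.

```python
def raita_search(text, pattern):
    """Raita-style exact string matching (illustrative Python version)."""
    n, m = len(text), len(pattern)
    if m == 0 or m > n:
        return []
    # Horspool-style bad-character shift table.
    shift = {c: m for c in set(text)}
    for i, c in enumerate(pattern[:-1]):
        shift[c] = m - 1 - i
    first, middle, last = pattern[0], pattern[m // 2], pattern[-1]
    matches, i = [], 0
    while i <= n - m:
        window = text[i:i + m]
        # Raita comparison order: last, first, middle, then the whole window.
        if (window[-1] == last and window[0] == first
                and window[m // 2] == middle and window == pattern):
            matches.append(i)
        # Shift by the bad-character rule for the text character under
        # the last pattern position (window[-1] == text[i + m - 1]).
        i += shift.get(window[-1], m)
    return matches

print(raita_search("pattern search techniques match patterns", "pattern"))  # [0, 32]
```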

  11. Use of narrow-band imaging bronchoscopy in detection of lung cancer.

    PubMed

    Zaric, Bojan; Perin, Branislav

    2010-05-01

    Narrow-band imaging (NBI) is a new endoscopic technique designed for detection of pathologically altered submucosal and mucosal microvascular patterns. The combination of magnification videobronchoscopy and NBI showed great potential in the detection of precancerous and cancerous lesions of the bronchial mucosa. The preliminary studies confirmed supremacy of NBI over white-light videobronchoscopy in the detection of premalignant and malignant lesions. Pathological patterns of capillaries in bronchial mucosa are known as Shibuya's descriptors (dotted, tortuous and abrupt-ending blood vessels). Where respiratory endoscopy is concerned, the NBI is still a 'technology in search of proper indication'. More randomized trials are necessary to confirm the place of NBI in the diagnostic algorithm, and more trials are needed to evaluate the relation of NBI to autofluorescence videobronchoscopy and to white-light magnification videobronchoscopy. Considering the fact that NBI examination of the tracheo-bronchial tree is easy, reproducible and clear to interpret, it is certain that NBI videobronchoscopy will play a significant role in the future of lung cancer detection and staging.

  12. Nanometer-Scale Chemistry of a Calcite Biomineralization Template: Implications for Skeletal Composition and Nucleation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Branson, Oscar; Bonnin, Elisa A.; Perea, Daniel E.

    Biomineralizing organisms exhibit exquisite control over skeletal morphology and composition. The promise of understanding and harnessing this feat of natural engineering has motivated an intense search for the mechanisms that direct in vivo mineral self-assembly. We used atom probe tomography, a sub-nanometer 3D chemical mapping technique, to examine the chemistry of a buried organic-mineral interface in biomineral calcite from a marine foraminifer. The chemical patterns at this interface capture the processes of early biomineralization, when the shape, mineralogy, and orientation of skeletal growth are initially established. Sodium is enriched by a factor of nine on the organic side of the interface. Based on this pattern, we suggest that sodium plays an integral role in early biomineralization, potentially altering interfacial energy to promote crystal nucleation, and that interactions between organic surfaces and electrolytes other than calcium or carbonate could be a crucial aspect of CaCO3 biomineralization.

  13. Nanometer-Scale Chemistry of a Calcite Biomineralization Template: Implications for Skeletal Composition and Nucleation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Branson, Oscar; Bonnin, Elisa A.; Perea, Daniel E.

    2016-10-28

    Biomineralizing organisms exhibit exquisite control over skeletal morphology and composition. The promise of understanding and harnessing this feat of natural engineering has motivated an intense search for the mechanisms that direct in vivo mineral self-assembly. We used atom probe tomography, a sub-nanometer 3D chemical mapping technique, to examine the chemistry of a buried organic-mineral interface in biomineral calcite from a marine foraminifer. The chemical patterns at this interface capture the processes of early biomineralization, when the shape, mineralogy, and orientation of skeletal growth are initially established. Sodium is enriched by a factor of nine on the organic side of the interface. Based on this pattern, we suggest that sodium plays an integral role in early biomineralization, potentially altering interfacial energy to promote crystal nucleation, and that interactions between organic surfaces and electrolytes other than calcium or carbonate could be a crucial aspect of CaCO3 biomineralization.

  14. Nanometer-Scale Chemistry of a Calcite Biomineralization Template: Implications for Skeletal Composition and Nucleation

    DOE PAGES

    Branson, Oscar; Bonnin, Elisa A.; Perea, Daniel E.; ...

    2016-10-28

    Biomineralizing organisms exhibit exquisite control over skeletal morphology and composition. The promise of understanding and harnessing this feat of natural engineering has motivated an intense search for the mechanisms that direct in vivo mineral self-assembly. We used atom probe tomography, a sub-nanometer 3D chemical mapping technique, to examine the chemistry of a buried organic-mineral interface in biomineral calcite from a marine foraminifer. Here, the chemical patterns at this interface capture the processes of early biomineralization, when the shape, mineralogy, and orientation of skeletal growth are initially established. Sodium is enriched by a factor of nine on the organic side of the interface. Based on this pattern, we suggest that sodium plays an integral role in early biomineralization, potentially altering interfacial energy to promote crystal nucleation, and that interactions between organic surfaces and electrolytes other than calcium or carbonate could be a crucial aspect of CaCO3 biomineralization.

  15. Search the SLAC Web

    Science.gov Websites

    Search tips for SLAC web and Intranet searches, with techniques and examples for obtaining exact matches on words such as names of people, places, or organizations.

  16. Implementing the Army NetCentric Data Strategy in a ServiceOriented Environment

    DTIC Science & Technology

    2009-04-23

    Data discovery, data retrieval, data subscription, and data access services define common interfaces to search and retrieve data across the enterprise, including search, status, and receive patterns and federated search and artifact discovery services.

  17. Intrinsic Lévy behaviour in organisms - searching for a mechanism. Comment on "Liberating Lévy walk research from the shackles of optimal foraging" by A.M. Reynolds

    NASA Astrophysics Data System (ADS)

    Sims, David W.

    2015-09-01

    The seminal papers by Viswanathan and colleagues in the late 1990s [1,2] proposed not only that scale-free, superdiffusive Lévy walks can describe the free-ranging movement patterns observed in animals such as the albatross [1], but that the Lévy walk was optimal for searching for sparsely and randomly distributed resource targets [2]. This distinct advantage, now shown to be present over a much broader set of conditions than originally theorised [3], implied that the Lévy walk is a search strategy that should be found very widely in organisms [4]. In the years since there have been several influential empirical studies showing that Lévy walks can indeed be detected in the movement patterns of a very broad range of taxa, from jellyfish, insects, fish, reptiles, seabirds, humans [5-10], and even in the fossilised trails of extinct invertebrates [11]. The broad optimality and apparent deep evolutionary origin of movement (search) patterns that are well approximated by Lévy walks led to the development of the Lévy flight foraging (LFF) hypothesis [12], which states that "since Lévy flights and walks can optimize search efficiencies, therefore natural selection should have led to adaptations for Lévy flight foraging".

  18. B-tree search reinforcement learning for model based intelligent agent

    NASA Astrophysics Data System (ADS)

    Bhuvaneswari, S.; Vignashwaran, R.

    2013-03-01

    Agents trained by learning techniques provide a powerful approximation of active solutions compared with naive approaches. In this study, B-trees combined with reinforcement learning are used to moderate the data search for information retrieval, achieving accuracy with minimum search time. The impact of the variables and tactics applied in training is determined using reinforcement learning. Agents based on these techniques perform at a satisfactory baseline and act as finite agents, based on the predetermined model, against competitors.

  19. An improved CS-LSSVM algorithm-based fault pattern recognition of ship power equipments.

    PubMed

    Yang, Yifei; Tan, Minjia; Dai, Yuewei

    2017-01-01

    Fault-monitoring signals from ship power equipment usually provide few samples, and the data features are non-linear in practical situations. This paper adopts the least squares support vector machine (LSSVM) to deal with the problem of fault pattern identification in the case of small-sample data. Meanwhile, in order to avoid the local extrema and poor convergence precision induced by optimizing the kernel function parameter and penalty factor of the LSSVM, an improved Cuckoo Search (CS) algorithm is proposed for parameter optimization. Based on a dynamic adaptive strategy, the newly proposed algorithm improves the recognition probability and the search step length, which effectively addresses the slow search speed and low calculation accuracy of the standard CS algorithm. A benchmark example demonstrates that the CS-LSSVM algorithm can accurately and effectively identify the fault pattern types of ship power equipment.
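
    As a rough sketch of the optimization side of this approach, the Python code below implements a basic Cuckoo Search loop with Lévy-flight steps and a discovery probability pa. The objective is a toy stand-in for an LSSVM cross-validation error over two parameters (e.g., log kernel width and log penalty factor), and none of the paper's dynamic adaptive improvements are included.

```python
import numpy as np

rng = np.random.default_rng(2)

def levy_flight(size, beta=1.5):
    # Mantegna's algorithm for Levy-stable step lengths.
    from math import gamma, sin, pi
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0, sigma, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / beta)

def cuckoo_search(objective, bounds, n_nests=15, pa=0.25, n_iter=200):
    dim = len(bounds)
    lo, hi = np.array([b[0] for b in bounds]), np.array([b[1] for b in bounds])
    nests = rng.uniform(lo, hi, (n_nests, dim))
    fitness = np.array([objective(x) for x in nests])
    best = nests[fitness.argmin()].copy()
    for _ in range(n_iter):
        # Generate new solutions by Levy flights around the current best.
        step = 0.01 * levy_flight((n_nests, dim)) * (nests - best)
        new = np.clip(nests + step, lo, hi)
        new_fit = np.array([objective(x) for x in new])
        improved = new_fit < fitness
        nests[improved], fitness[improved] = new[improved], new_fit[improved]
        # Abandon a fraction pa of the nests and rebuild them randomly.
        abandon = rng.uniform(size=n_nests) < pa
        nests[abandon] = rng.uniform(lo, hi, (abandon.sum(), dim))
        fitness[abandon] = np.array([objective(x) for x in nests[abandon]])
        best = nests[fitness.argmin()].copy()
    return best, fitness.min()

# Stand-in objective: imagine this is an LSSVM cross-validation error as a
# function of (log10 gamma, log10 C); here a simple quadratic bowl is used.
objective = lambda x: (x[0] - 1.0) ** 2 + (x[1] - 2.0) ** 2
print(cuckoo_search(objective, bounds=[(-3, 3), (-3, 3)]))
```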

  20. Multi-Database Searching in the Behavioral Sciences--Part I: Basic Techniques and Core Databases.

    ERIC Educational Resources Information Center

    Angier, Jennifer J.; Epstein, Barbara A.

    1980-01-01

    Outlines practical searching techniques in seven core behavioral science databases accessing psychological literature: Psychological Abstracts, Social Science Citation Index, Biosis, Medline, Excerpta Medica, Sociological Abstracts, ERIC. Use of individual files is discussed and their relative strengths/weaknesses are compared. Appended is a list…

  1. Understanding and Mitigating Forum Spam

    ERIC Educational Resources Information Center

    Shin, Youngsang

    2011-01-01

    The Web is large and expanding, making it challenging to attract new visitors to websites. Website operators often use Search Engine Optimization (SEO) techniques to boost the search engine rankings of their sites, thereby maximizing the inflow of visitors. Malicious operators take SEO to the extreme through many unsavory techniques that are often…

  2. Visual search among items of different salience: removal of visual attention mimics a lesion in extrastriate area V4.

    PubMed

    Braun, J

    1994-02-01

    In more than one respect, visual search for the most salient or the least salient item in a display are different kinds of visual tasks. The present work investigated whether this difference is primarily one of perceptual difficulty, or whether it is more fundamental and relates to visual attention. Display items of different salience were produced by varying either size, contrast, color saturation, or pattern. Perceptual masking was employed and, on average, mask onset was delayed longer in search for the least salient item than in search for the most salient item. As a result, the two types of visual search presented comparable perceptual difficulty, as judged by psychophysical measures of performance, effective stimulus contrast, and stability of decision criterion. To investigate the role of attention in the two types of search, observers attempted to carry out a letter discrimination and a search task concurrently. To discriminate the letters, observers had to direct visual attention at the center of the display and, thus, leave unattended the periphery, which contained target and distractors of the search task. In this situation, visual search for the least salient item was severely impaired while visual search for the most salient item was only moderately affected, demonstrating a fundamental difference with respect to visual attention. A qualitatively identical pattern of results was encountered by Schiller and Lee (1991), who used similar visual search tasks to assess the effect of a lesion in extrastriate area V4 of the macaque.

  3. Molecular modeling of directed self-assembly of block copolymers: Fundamental studies of processing conditions and evolutionary pattern design

    NASA Astrophysics Data System (ADS)

    Khaira, Gurdaman Singh

    Rapid progress in the semi-conductor industry has pushed for smaller feature sizes on integrated electronic circuits. Current photo-lithographic techniques for nanofabrication have reached their technical limit and are problematic when printing features small enough to meet future industrial requirements. "Bottom-up" techniques, such as the directed self-assembly (DSA) of block copolymers (BCP), are the primary contenders to complement current "top-down" photo-lithography ones. For industrial requirements, the defect density from DSA needs to be less than 1 defect per 10 cm by 10 cm. Knowledge of both material synthesis and the thermodynamics of the self-assembly process is required before optimal operating conditions can be found to produce results adequate for industry. The work presented in this thesis is divided into three chapters, each discussing various aspects of DSA as studied via a molecular model that contains the essential physics of BCP self-assembly. Though there are various types of guiding fields that can be used to direct BCPs over large wafer areas with minimum defects, this study focuses only on chemically patterned substrates. The first chapter addresses optimal pattern design by describing a framework where molecular simulations of various complexities are coupled with an advanced optimization technique to find a pattern that directs a target morphology. It demonstrates the first ever study where BCP self-assembly on a patterned substrate is optimized using a three-dimensional description of the block-copolymers. For problems pertaining to DSA, the methodology is shown to converge much faster than the traditional random search approach. The second chapter discusses the metrology of BCP thin films using TEM tomography and X-ray scattering techniques, such as CDSAXS and GISAXS. X-ray scattering has the advantage of being able to quickly probe the average structure of BCP morphologies over large wafer areas; however, deducing the BCP morphology from the information in inverse space is a challenging task. Using the optimization techniques and molecular simulations discussed in the first chapter, a methodology to reconstruct BCP morphology from X-ray scattering data is described. It is shown that only a handful of simulation parameters that come directly from experiment are able to describe the morphologies observed from real X-ray scattering experiments. The last chapter focuses on the use of solvents to assist the self-assembly of BCPs. Additional functionality to capture the process of solvent annealing is also discussed. The bulk behavior of solvated mixtures of BCPs with solvents of various affinities is described, and the results are consistent with the experimentally observed behavior of BCPs in the presence of solvents.

  4. SIRW: A web server for the Simple Indexing and Retrieval System that combines sequence motif searches with keyword searches.

    PubMed

    Ramu, Chenna

    2003-07-01

    SIRW (http://sirw.embl.de/) is a World Wide Web interface to the Simple Indexing and Retrieval System (SIR) that is capable of parsing and indexing various flat file databases. In addition it provides a framework for doing sequence analysis (e.g. motif pattern searches) for selected biological sequences through keyword search. SIRW is an ideal tool for the bioinformatics community for searching as well as analyzing biological sequences of interest.

  5. Optimizing Search Patterns for Multiple Searchers Prosecuting a Single Contact In the South China Sea

    DTIC Science & Technology

    2016-09-01

    Search problems range from searching for lost car keys in a parking lot to prosecuting a submarine in the South China Sea. This research draws on oceanographic properties at 21N 119E to define the search area and to develop search radii for two surface ships prosecuting a single contact. Subject terms: search theory, undersea warfare, South China Sea, anti-submarine warfare.

  6. A biclustering algorithm for extracting bit-patterns from binary datasets.

    PubMed

    Rodriguez-Baena, Domingo S; Perez-Pulido, Antonio J; Aguilar-Ruiz, Jesus S

    2011-10-01

    Binary datasets represent a compact and simple way to store data about the relationships between a group of objects and their possible properties. In the last few years, different biclustering algorithms have been specially developed to be applied to binary datasets. Several approaches based on matrix factorization, suffix trees or divide-and-conquer techniques have been proposed to extract useful biclusters from binary data, and these approaches provide information about the distribution of patterns and intrinsic correlations. A novel approach to extracting biclusters from binary datasets, BiBit, is introduced here. The results obtained from different experiments with synthetic data reveal the excellent performance and the robustness of BiBit to density and size of input data. Also, BiBit is applied to a central nervous system embryonic tumor gene expression dataset to test the quality of the results. A novel gene expression preprocessing methodology, based on expression level layers, and the selective search performed by BiBit, based on a very fast bit-pattern processing technique, provide very satisfactory results in quality and computational cost. The power of biclustering in finding genes involved simultaneously in different cancer processes is also shown. Finally, a comparison with Bimax, one of the most cited binary biclustering algorithms, shows that BiBit is faster while providing essentially the same results. The source and binary codes, the datasets used in the experiments and the results can be found at: http://www.upo.es/eps/bigs/BiBit.html dsrodbae@upo.es Supplementary data are available at Bioinformatics online.
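
    The core bit-pattern idea behind a BiBit-style search can be sketched as follows: the bitwise AND of every pair of rows defines a candidate column pattern, and all rows containing that pattern form a bicluster. This simplified Python sketch omits the bit-level encoding and pruning that give the published algorithm its speed, and the toy matrix is an assumption of the example.

```python
import numpy as np
from itertools import combinations

def bitpattern_biclusters(data, min_rows=2, min_cols=2):
    """Pair-wise bit-pattern biclustering (core idea only, without BiBit's
    bit-level encoding and pruning)."""
    data = np.asarray(data, dtype=bool)
    seen, biclusters = set(), []
    for i, j in combinations(range(data.shape[0]), 2):
        pattern = data[i] & data[j]                 # candidate column pattern
        cols = tuple(map(int, np.flatnonzero(pattern)))
        if len(cols) < min_cols or cols in seen:
            continue
        seen.add(cols)
        # Every row that contains the full pattern joins the bicluster.
        rows = tuple(map(int, np.flatnonzero((data & pattern).sum(axis=1) == len(cols))))
        if len(rows) >= min_rows:
            biclusters.append((rows, cols))
    return biclusters

toy = [[1, 1, 0, 1],
       [1, 1, 1, 0],
       [1, 1, 0, 0],
       [0, 0, 1, 1]]
print(bitpattern_biclusters(toy))   # [((0, 1, 2), (0, 1))]
```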

  7. Searching the Internet for psychiatric disorders among Arab and Jewish Israelis: insights from a comprehensive infodemiological survey.

    PubMed

    Adawi, Mohammad; Amital, Howard; Mahamid, Mahmud; Amital, Daniela; Bisharat, Bishara; Mahroum, Naim; Sharif, Kassem; Guy, Adi; Adawi, Amin; Mahagna, Hussein; Abu Much, Arsalan; Watad, Samaa; Bragazzi, Nicola Luigi; Watad, Abdulla

    2018-01-01

    Israel represents a complex and pluralistic society comprising two major ethno-national groups, Israeli Jews and Israeli Arabs, which differ in terms of religious and cultural values as well as social constructs. According to the so-called "diversification hypothesis", within the framework of e-health and in the era of new information and communication technologies, seeking online health information could be a channel to increase health literacy, especially among disadvantaged groups. However, little is known concerning digital seeking behavior and, in particular, digital mental health literacy. This study was conducted in order to fill in this gap. Concerning raw figures, unadjusted for confounding variables (time, population size, Internet penetration index, disease rate), "depression" searched in Hebrew was characterized by 1.5 times higher search volumes, slightly declining throughout time, whereas relative search volumes (RSVs) related to "depression" searched in Arabic tended to increase over the years. Similar patterns could be detected for "phobia" (in Hebrew 1.4-fold higher than in Arabic) and for "anxiety" (with the searches performed in Hebrew 2.3 times higher than in Arabic). "Suicide" in Hebrew was searched 2.0-fold more than in Arabic (interestingly for both languages search volumes exhibited seasonal cyclic patterns). Eating disorders were searched more in Hebrew: 8.0-times more for "bulimia", whilst "anorexia" was searched in Hebrew only. When adjusting for confounding variables, the association between digital seeking behavior and ethnicity remained statistically significant (p-value < 0.0001) for all psychiatric disorders considered in the current investigation, except for "bulimia" (p = 0.989). In more detail, Israeli Arabs searched for mental health disorders less than Jews, apart from "depression". Arab and Jewish Israelis, besides differing in terms of language, religion, social and cultural values, have different patterns of usage of healthcare services and provisions, as well as e-healthcare services concerning mental health. Policy- and decision-makers should be aware of this and make their best efforts to promote digital health literacy among the Arab population in Israel.

  8. Searching the Internet for psychiatric disorders among Arab and Jewish Israelis: insights from a comprehensive infodemiological survey

    PubMed Central

    Adawi, Mohammad; Amital, Howard; Mahamid, Mahmud; Amital, Daniela; Bisharat, Bishara; Mahroum, Naim; Sharif, Kassem; Guy, Adi; Adawi, Amin; Mahagna, Hussein; Abu Much, Arsalan; Watad, Samaa; Watad, Abdulla

    2018-01-01

    Israel represents a complex and pluralistic society comprising two major ethno-national groups, Israeli Jews and Israeli Arabs, which differ in terms of religious and cultural values as well as social constructs. According to the so-called “diversification hypothesis”, within the framework of e-health and in the era of new information and communication technologies, seeking online health information could be a channel to increase health literacy, especially among disadvantaged groups. However, little is known concerning digital seeking behavior and, in particular, digital mental health literacy. This study was conducted in order to fill in this gap. Concerning raw figures, unadjusted for confounding variables (time, population size, Internet penetration index, disease rate), “depression” searched in Hebrew was characterized by 1.5 times higher search volumes, slightly declining throughout time, whereas relative search volumes (RSVs) related to “depression” searched in Arabic tended to increase over the years. Similar patterns could be detected for “phobia” (in Hebrew 1.4-fold higher than in Arabic) and for “anxiety” (with the searches performed in Hebrew 2.3 times higher than in Arabic). “Suicide” in Hebrew was searched 2.0-fold more than in Arabic (interestingly for both languages search volumes exhibited seasonal cyclic patterns). Eating disorders were searched more in Hebrew: 8.0-times more for “bulimia”, whilst “anorexia” was searched in Hebrew only. When adjusting for confounding variables, the association between digital seeking behavior and ethnicity remained statistically significant (p-value < 0.0001) for all psychiatric disorders considered in the current investigation, except for “bulimia” (p = 0.989). In more detail, Israeli Arabs searched for mental health disorders less than Jews, apart from “depression”. Arab and Jewish Israelis, besides differing in terms of language, religion, social and cultural values, have different patterns of usage of healthcare services and provisions, as well as e-healthcare services concerning mental health. Policy- and decision-makers should be aware of this and make their best efforts to promote digital health literacy among the Arab population in Israel. PMID:29576974

  9. A Search for Quasi-periodic Oscillations in the Blazar 1ES 1959+650

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Xiao-Pan; Luo, Yu-Hui; Yang, Hai-Yan

    We have searched for quasi-periodic oscillations (QPOs) in the 15 GHz light curve of the BL Lac object 1ES 1959+650 monitored by the Owens Valley Radio Observatory 40 m telescope during the period from 2008 January to 2016 February, using the Lomb–Scargle Periodogram, power spectral density (PSD), discrete autocorrelation function, and phase dispersion minimization (PDM) techniques. The red noise background has been established via the PSD method, and no QPO can be derived at the 3σ confidence level accounting for the impact of the red noise variability. We conclude that the light curve of 1ES 1959+650 can be explained by a stochastic red noise process that contributes greatly to the total observed variability amplitude, dominates the power spectrum, causes spurious bumps and wiggles in the autocorrelation function and can result in the variance of the folded light curve decreasing toward lower temporal frequencies when few-cycle, sinusoid-like patterns are present. Moreover, many early supposed periodicity claims for blazar light curves need to be reevaluated assuming red noise.
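
    A minimal sketch of the first step of such a search, assuming an unevenly sampled light curve: compute a Lomb-Scargle periodogram over trial periods and locate the strongest peak. The simulated data and period grid are assumptions of the example, and, as the record stresses, a peak is not a detection until it is tested against the red-noise background.

```python
import numpy as np
from scipy.signal import lombscargle

rng = np.random.default_rng(3)

# Hypothetical unevenly sampled radio light curve (arbitrary flux units).
t = np.sort(rng.uniform(0.0, 3000.0, 400))            # days
flux = 1.0 + 0.1 * np.sin(2 * np.pi * t / 620.0) + rng.normal(0, 0.1, t.size)

# Trial periods and the corresponding angular frequencies.
periods = np.linspace(50.0, 1500.0, 2000)             # days
omega = 2 * np.pi / periods
power = lombscargle(t, flux - flux.mean(), omega, normalize=True)

best = periods[np.argmax(power)]
print(f"strongest periodicity near {best:.0f} days")
# NOTE: a peak alone is not a QPO detection; its significance has to be
# judged against the red-noise power spectrum, e.g. by simulating many
# light curves with the same power-law PSD and comparing peak heights.
```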

  10. Computer-Aided Discovery Tools for Volcano Deformation Studies with InSAR and GPS

    NASA Astrophysics Data System (ADS)

    Pankratius, V.; Pilewskie, J.; Rude, C. M.; Li, J. D.; Gowanlock, M.; Bechor, N.; Herring, T.; Wauthier, C.

    2016-12-01

    We present a Computer-Aided Discovery approach that facilitates the cloud-scalable fusion of different data sources, such as GPS time series and Interferometric Synthetic Aperture Radar (InSAR), for the purpose of identifying the expansion centers and deformation styles of volcanoes. The tools currently developed at MIT allow the definition of alternatives for data processing pipelines that use various analysis algorithms. The Computer-Aided Discovery system automatically generates algorithmic and parameter variants to help researchers explore multidimensional data processing search spaces efficiently. We present first application examples of this technique using GPS data on volcanoes on the Aleutian Islands and work in progress on combined GPS and InSAR data in Hawaii. In the model search context, we also illustrate work in progress combining time series Principal Component Analysis with InSAR augmentation to constrain the space of possible model explanations on current empirical data sets and achieve a better identification of deformation patterns. This work is supported by NASA AIST-NNX15AG84G and NSF ACI-1442997 (PI: V. Pankratius).

  11. Status of LUMINEU program to search for neutrinoless double beta decay of 100Mo with cryogenic ZnMoO4 scintillating bolometers

    NASA Astrophysics Data System (ADS)

    Danevich, F. A.; Bergé, L.; Boiko, R. S.; Chapellier, M.; Chernyak, D. M.; Coron, N.; Devoyon, L.; Drillien, A.-A.; Dumoulin, L.; Enss, C.; Fleischmann, A.; Gastaldo, L.; Giuliani, A.; Gray, D.; Gros, M.; Hervé, S.; Humbert, V.; Ivanov, I. M.; Juillard, A.; Kobychev, V. V.; Koskas, F.; Loidl, M.; Magnier, P.; Makarov, E. P.; Mancuso, M.; de Marcillac, P.; Marnieros, S.; Marrache-Kikuchi, C.; Navick, X.-F.; Nones, C.; Olivieri, E.; Paul, B.; Penichot, Y.; Pessina, G.; Plantevin, O.; Poda, D. V.; Redon, T.; Rodrigues, M.; Shlegel, V. N.; Strazzer, O.; Tenconi, M.; Torres, L.; Tretyak, V. I.; Vasiliev, Ya. V.; Velazquez, M.; Viraphong, O.

    2015-10-01

    The LUMINEU program aims at performing a pilot experiment on 0ν2β decay of 100Mo using radiopure ZnMoO4 crystals enriched in 100Mo operated as cryogenic scintillating bolometers. Large volume ZnMoO4 crystal scintillators (∼0.3 kg) were developed and tested showing high performance in terms of radiopurity, energy resolution and α/β particle discrimination capability. Zinc molybdate crystal scintillators enriched in 100Mo were grown for the first time by the low-thermal-gradient Czochralski technique with a high crystal yield and an acceptable level of enriched molybdenum irrecoverable losses. A background level of ∼0.5 counts/(yr keV ton) in the region of interest can be reached in a large detector array thanks to the excellent detector radiopurity and particle discrimination capability, suppression of randomly coinciding events by pulse-shape analysis, and an anticoincidence cut. These results pave the way to future sensitive searches based on the LUMINEU technology, capable of approaching and exploring the inverted hierarchy region of the neutrino mass pattern.

  12. Machine learning for a Toolkit for Image Mining

    NASA Technical Reports Server (NTRS)

    Delanoy, Richard L.

    1995-01-01

    A prototype user environment is described that enables a user with very limited computer skills to collaborate with a computer algorithm to develop search tools (agents) that can be used for image analysis, creating metadata for tagging images, searching for images in an image database on the basis of image content, or as a component of computer vision algorithms. Agents are learned in an ongoing, two-way dialogue between the user and the algorithm. The user points to mistakes made in classification. The algorithm, in response, attempts to discover which image attributes are discriminating between objects of interest and clutter. It then builds a candidate agent and applies it to an input image, producing an 'interest' image highlighting features that are consistent with the set of objects and clutter indicated by the user. The dialogue repeats until the user is satisfied. The prototype environment, called the Toolkit for Image Mining (TIM), is currently capable of learning spectral and textural patterns. Learning exhibits rapid convergence to reasonable levels of performance and, when thoroughly trained, appears to be competitive in discrimination accuracy with other classification techniques.

  13. Chemical shift-based identification of monosaccharide spin-systems with NMR spectroscopy to complement untargeted glycomics.

    PubMed

    Klukowski, Piotr; Schubert, Mario

    2018-06-15

    A better understanding of oligosaccharides and their wide-ranging functions in almost every aspect of biology and medicine promises to uncover hidden layers of biology and will support the development of better therapies. Elucidating the chemical structure of an unknown oligosaccharide is still a challenge. Efficient tools are required for non-targeted glycomics. Chemical shifts are a rich source of information about the topology and configuration of biomolecules, whose potential, however, is not fully explored for oligosaccharides. We hypothesize that the chemical shifts of each monosaccharide are unique for each saccharide type with a certain linkage pattern, so that correlated data measured by NMR spectroscopy can be used to identify the chemical nature of a carbohydrate. We present here an efficient search algorithm, GlycoNMRSearch, that matches either a subset or the entire set of chemical shifts of an unidentified monosaccharide spin system to all spin systems in an NMR database. The search output is much more precise than that of earlier search functions, and highly similar matches suggest the chemical structure of the spin system within the oligosaccharide. Thus searching for connected chemical shift correlations within all electronically available NMR data of oligosaccharides is a very efficient way of identifying the chemical structure of unknown oligosaccharides. With an improved database in the future, GlycoNMRSearch will be even more efficient at deducing chemical structures of oligosaccharides, and there is a high chance that it will become an indispensable technique for glycomics. The search algorithm presented here, together with a graphical user interface, is available at http://glyconmrsearch.santos.pwr.edu.pl. Supplementary data are available at Bioinformatics online.

  14. Information Discovery and Retrieval Tools

    DTIC Science & Technology

    2004-12-01

    This session will focus on the various Internet search engines and directories, and on how to improve the user experience through the use of such techniques as metadata, meta-search engines, subject-specific search tools, and other developing technologies.

  15. Information Discovery and Retrieval Tools

    DTIC Science & Technology

    2003-04-01

    This session will focus on the various Internet search engines and directories, and on how to improve the user experience through the use of such techniques as metadata, meta-search engines, subject-specific search tools, and other developing technologies.

  16. Working with Data: Discovering Knowledge through Mining and Analysis; Systematic Knowledge Management and Knowledge Discovery; Text Mining; Methodological Approach in Discovering User Search Patterns through Web Log Analysis; Knowledge Discovery in Databases Using Formal Concept Analysis; Knowledge Discovery with a Little Perspective.

    ERIC Educational Resources Information Center

    Qin, Jian; Jurisica, Igor; Liddy, Elizabeth D.; Jansen, Bernard J; Spink, Amanda; Priss, Uta; Norton, Melanie J.

    2000-01-01

    These six articles discuss knowledge discovery in databases (KDD). Topics include data mining; knowledge management systems; applications of knowledge discovery; text and Web mining; text mining and information retrieval; user search patterns through Web log analysis; concept analysis; data collection; and data structure inconsistency. (LRW)

  17. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach.

    PubMed

    Perez, Susan L; Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-07-20

    Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant's information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites.

  18. Characterizing the Processes for Navigating Internet Health Information Using Real-Time Observations: A Mixed-Methods Approach

    PubMed Central

    Paterniti, Debora A; Wilson, Machelle; Bell, Robert A; Chan, Man Shan; Villareal, Chloe C; Nguyen, Hien Huy; Kravitz, Richard L

    2015-01-01

    Background Little is known about the processes people use to find health-related information on the Internet or the individual characteristics that shape selection of information-seeking approaches. Objective Our aim was to describe the processes by which users navigate the Internet for information about a hypothetical acute illness and to identify individual characteristics predictive of their information-seeking strategies. Methods Study participants were recruited from public settings and agencies. Interested individuals were screened for eligibility using an online questionnaire. Participants listened to one of two clinical scenarios—consistent with influenza or bacterial meningitis—and then conducted an Internet search. Screen-capture video software captured Internet search mouse clicks and keystrokes. Each step of the search was coded as hypothesis testing (etiology), evidence gathering (symptoms), or action/treatment seeking (behavior). The coded steps were used to form a step-by-step pattern of each participant’s information-seeking process. A total of 78 Internet health information seekers ranging from 21-35 years of age and who experienced barriers to accessing health care services participated. Results We identified 27 unique patterns of information seeking, which were grouped into four overarching classifications based on the number of steps taken during the search, whether a pattern consisted of developing a hypothesis and exploring symptoms before ending the search or searching an action/treatment, and whether a pattern ended with action/treatment seeking. Applying dual-processing theory, we categorized the four overarching pattern classifications as either System 1 (41%, 32/78), unconscious, rapid, automatic, and high capacity processing; or System 2 (59%, 46/78), conscious, slow, and deliberative processing. Using multivariate regression, we found that System 2 processing was associated with higher education and younger age. Conclusions We identified and classified two approaches to processing Internet health information. System 2 processing, a methodical approach, most resembles the strategies for information processing that have been found in other studies to be associated with higher-quality decisions. We conclude that the quality of Internet health-information seeking could be improved through consumer education on methodical Internet navigation strategies and the incorporation of decision aids into health information websites. PMID:26194787

  19. Searching social networks for subgraph patterns

    NASA Astrophysics Data System (ADS)

    Ogaard, Kirk; Kase, Sue; Roy, Heather; Nagi, Rakesh; Sambhoos, Kedar; Sudit, Moises

    2013-06-01

    Software tools for Social Network Analysis (SNA) are being developed which support various types of analysis of social networks extracted from social media websites (e.g., Twitter). Once extracted and stored in a database such social networks are amenable to analysis by SNA software. This data analysis often involves searching for occurrences of various subgraph patterns (i.e., graphical representations of entities and relationships). The authors have developed the Graph Matching Toolkit (GMT) which provides an intuitive Graphical User Interface (GUI) for a heuristic graph matching algorithm called the Truncated Search Tree (TruST) algorithm. GMT is a visual interface for graph matching algorithms processing large social networks. GMT enables an analyst to draw a subgraph pattern by using a mouse to select categories and labels for nodes and links from drop-down menus. GMT then executes the TruST algorithm to find the top five occurrences of the subgraph pattern within the social network stored in the database. GMT was tested using a simulated counter-insurgency dataset consisting of cellular phone communications within a populated area of operations in Iraq. The results indicated GMT (when executing the TruST graph matching algorithm) is a time-efficient approach to searching large social networks. GMT's visual interface to a graph matching algorithm enables intelligence analysts to quickly analyze and summarize the large amounts of data necessary to produce actionable intelligence.
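
    The kind of query GMT supports can be illustrated with an exact subgraph matching baseline (not the heuristic TruST algorithm) using the networkx isomorphism utilities, matching node categories and edge labels. The toy "persons and phones" data below are assumptions of the example.

```python
import networkx as nx
from networkx.algorithms import isomorphism as iso

# Toy social network with categorized nodes and labeled links (hypothetical data).
G = nx.Graph()
G.add_nodes_from([(1, {"category": "person"}), (2, {"category": "person"}),
                  (3, {"category": "phone"}),  (4, {"category": "phone"}),
                  (5, {"category": "person"})])
G.add_edges_from([(1, 3, {"label": "uses"}), (2, 4, {"label": "uses"}),
                  (3, 4, {"label": "calls"}), (1, 5, {"label": "knows"})])

# Subgraph pattern: two persons whose phones call each other.
P = nx.Graph()
P.add_nodes_from([("a", {"category": "person"}), ("b", {"category": "person"}),
                  ("pa", {"category": "phone"}), ("pb", {"category": "phone"})])
P.add_edges_from([("a", "pa", {"label": "uses"}), ("b", "pb", {"label": "uses"}),
                  ("pa", "pb", {"label": "calls"})])

matcher = iso.GraphMatcher(
    G, P,
    node_match=iso.categorical_node_match("category", None),
    edge_match=iso.categorical_edge_match("label", None))
for mapping in matcher.subgraph_isomorphisms_iter():
    print(mapping)   # maps data-graph nodes onto pattern nodes
```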

  20. Search Query Data to Monitor Interest in Behavior Change: Application for Public Health

    PubMed Central

    Carr, Lucas J.; Dunsiger, Shira I.

    2012-01-01

    There is a need for effective interventions and policies that target the leading preventable causes of death in the U.S. (e.g., smoking, overweight/obesity, physical inactivity). Such efforts could be aided by the use of publicly available, real-time search query data that illustrate times and locations of high and low public interest in behaviors related to preventable causes of death. Objectives This study explored patterns of search query activity for the terms ‘weight’, ‘diet’, ‘fitness’, and ‘smoking’ using Google Insights for Search. Methods Search activity for ‘weight’, ‘diet’, ‘fitness’, and ‘smoking’ conducted within the United States via Google between January 4th, 2004 (first date data was available) and November 28th, 2011 (date of data download and analysis) were analyzed. Using a generalized linear model, we explored the effects of time (month) on mean relative search volume for all four terms. Results Models suggest a significant effect of month on mean search volume for all four terms. Search activity for all four terms was highest in January with observable declines throughout the remainder of the year. Conclusions These findings demonstrate discernable temporal patterns of search activity for four areas of behavior change. These findings could be used to inform the timing, location and messaging of interventions, campaigns and policies targeting these behaviors. PMID:23110198

  1. Searches for millisecond pulsations in low-mass X-ray binaries

    NASA Technical Reports Server (NTRS)

    Wood, K. S.; Hertz, P.; Norris, J. P.; Vaughan, B. A.; Michelson, P. F.; Mitsuda, K.; Lewin, W. H. G.; Van Paradijs, J.; Penninx, W.; Van Der Klis, M.

    1991-01-01

    High-sensitivity search techniques for millisecond periods are presented and applied to data from the Japanese satellite Ginga and HEAO 1. The search is optimized for pulsed signals whose period, drift rate, and amplitude conform with what is expected for low-mass X-ray binary (LMXB) sources. Consideration is given to how the current understanding of LMXBs guides the search strategy and sets these parameter limits. An optimized one-parameter coherence recovery technique (CRT) developed for recovery of phase coherence is presented. This technique provides a large increase in sensitivity over the method of incoherent summation of Fourier power spectra. The range of spin periods expected from LMXB phenomenology is discussed, the necessary constraints on the application of CRT are described in terms of integration time and orbital parameters, and the residual power unrecovered by the quadratic approximation for realistic cases is estimated.

  2. A SOUND SOURCE LOCALIZATION TECHNIQUE TO SUPPORT SEARCH AND RESCUE IN LOUD NOISE ENVIRONMENTS

    NASA Astrophysics Data System (ADS)

    Yoshinaga, Hiroshi; Mizutani, Koichi; Wakatsuki, Naoto

    At some sites of earthquakes and other disasters, rescuers search for people buried under rubble by listening for the sounds which they make. Thus developing a technique to localize sound sources amidst loud noise will support such search and rescue operations. In this paper, we discuss an experiment performed to test an array signal processing technique which searches for unperceivable sound in loud noise environments. Two speakers simultaneously played the noise of a generator and a voice attenuated by 20 dB (= 1/100 of the power) relative to the generator noise at an outdoor space where cicadas were making noise. The sound signal was received by a horizontally set linear microphone array 1.05 m in length and consisting of 15 microphones. The direction and the distance of the voice were computed and the sound of the voice was extracted and played back as an audible sound by array signal processing.
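
    The abstract does not spell out the array-processing algorithm, so the following is a generic delay-and-sum sketch for estimating the arrival direction of a weak tone with a 15-microphone, 1.05 m linear array. The signal model, frequency, and noise level are invented, and the published method additionally estimates distance and extracts the voice as audio.

        # Delay-and-sum direction estimation for a weak source in loud noise
        # (illustrative only; not the authors' algorithm).
        import numpy as np

        c = 340.0                      # speed of sound, m/s
        fs = 16000                     # sample rate, Hz
        n_mics = 15
        mic_x = np.arange(n_mics) * (1.05 / (n_mics - 1))    # 1.05 m aperture

        # Synthetic data: a 2 kHz tone arriving from 30 degrees, buried in stronger noise.
        t = np.arange(0, 0.5, 1 / fs)
        true_delays = mic_x * np.sin(np.deg2rad(30.0)) / c
        rng = np.random.default_rng(1)
        signals = np.array([np.sin(2 * np.pi * 2000 * (t - d)) for d in true_delays])
        signals += 5.0 * rng.standard_normal(signals.shape)

        def steered_power(angle):
            """Delay-and-sum output power for a candidate steering angle."""
            shifts = mic_x * np.sin(angle) / c
            aligned = [np.interp(t, t - s, x) for s, x in zip(shifts, signals)]
            return np.mean(np.sum(aligned, axis=0) ** 2)

        angles = np.deg2rad(np.linspace(-90, 90, 181))
        powers = [steered_power(a) for a in angles]
        print("estimated direction:", np.rad2deg(angles[int(np.argmax(powers))]), "deg")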

  3. Dynamic model updating based on strain mode shape and natural frequency using hybrid pattern search technique

    NASA Astrophysics Data System (ADS)

    Guo, Ning; Yang, Zhichun; Wang, Le; Ouyang, Yan; Zhang, Xinping

    2018-05-01

    Aiming to provide a precise dynamic structural finite element (FE) model for dynamic strength evaluation in addition to dynamic analysis, a dynamic FE model updating method is presented to correct the uncertain parameters of the FE model of a structure using strain mode shapes and natural frequencies. The strain mode shape, which is sensitive to local changes in the structure, is used instead of the displacement mode shape to enhance model updating. The coordinate strain modal assurance criterion is developed to evaluate the correlation level at each coordinate between the experimental and the analytical strain mode shapes. Moreover, the natural frequencies, which provide global information about the structure, are used to guarantee the accuracy of the modal properties of the global model. The weighted summation of the natural frequency residual and the coordinate strain modal assurance criterion residual is then used as the objective function in the proposed dynamic FE model updating procedure. A hybrid genetic/pattern-search optimization algorithm is adopted to perform the updating. A numerical simulation and a model updating experiment for a clamped-clamped beam are performed to validate the feasibility and effectiveness of the present method. The results show that the proposed method can be used to update the uncertain parameters with good robustness, and that the updated dynamic FE model of the beam structure, which correctly predicts both the natural frequencies and the local dynamic strains, is reliable for subsequent dynamic analysis and dynamic strength evaluation.
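
    A small sketch of the kind of objective function described above, combining a relative natural-frequency residual with a strain-mode MAC residual in a weighted sum. The solver call, weights, and toy data are placeholders, the MAC here is the ordinary mode-level form rather than the paper's coordinate-wise criterion, and the hybrid genetic/pattern-search optimizer itself is not shown.

        # Weighted frequency + strain-MAC residual for model updating (illustrative).
        import numpy as np

        def strain_mac(phi_exp, phi_ana):
            """Modal assurance criterion between experimental and analytical strain modes."""
            return np.abs(phi_exp @ phi_ana) ** 2 / ((phi_exp @ phi_exp) * (phi_ana @ phi_ana))

        def objective(params, freqs_exp, modes_exp, fe_solver, w_freq=1.0, w_mac=1.0):
            """Residual minimized by the genetic/pattern-search hybrid (not shown)."""
            freqs_ana, modes_ana = fe_solver(params)           # hypothetical FE analysis call
            freq_res = np.sum(((freqs_ana - freqs_exp) / freqs_exp) ** 2)
            mac_res = sum(1.0 - strain_mac(e, a) for e, a in zip(modes_exp, modes_ana))
            return w_freq * freq_res + w_mac * mac_res

        # Toy stand-in "FE solver": frequencies and 3-point strain modes that depend
        # on a single stiffness-like parameter.
        def toy_solver(params):
            k = params[0]
            return (np.array([10.0, 25.0]) * np.sqrt(k),
                    [np.array([1.0, 2.0 * k, 1.0]), np.array([1.0, 0.0, -k])])

        freqs_exp = np.array([10.0, 25.0])
        modes_exp = [np.array([1.0, 2.0, 1.0]), np.array([1.0, 0.0, -1.0])]
        print(objective(np.array([1.1]), freqs_exp, modes_exp, toy_solver))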

  4. Evaluation of dynamically dimensioned search algorithm for optimizing SWAT by altering sampling distributions and searching range

    USDA-ARS?s Scientific Manuscript database

    The primary advantage of the Dynamically Dimensioned Search algorithm (DDS) is that it outperforms many other optimization techniques in both convergence speed and its ability to search for parameter sets that satisfy statistical guidelines while requiring only one algorithm parameter (perturbation f...
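
    For context, a generic sketch of the DDS idea as commonly described in the literature: greedily perturb a dynamically shrinking random subset of decision variables, with Gaussian perturbations scaled by the single algorithm parameter r and the variable ranges. The bounds, objective, and settings below are illustrative, not the SWAT calibration setup evaluated in this record.

        # Minimal Dynamically Dimensioned Search (DDS) sketch (greedy, bound-constrained).
        import numpy as np

        def dds(objective, lo, hi, max_iter=1000, r=0.2, seed=0):
            rng = np.random.default_rng(seed)
            lo, hi = np.asarray(lo, float), np.asarray(hi, float)
            x_best = lo + rng.random(lo.size) * (hi - lo)
            f_best = objective(x_best)
            for i in range(1, max_iter + 1):
                # probability of perturbing each dimension decreases with iteration
                p = 1.0 - np.log(i) / np.log(max_iter)
                mask = rng.random(lo.size) < p
                if not mask.any():
                    mask[rng.integers(lo.size)] = True
                x_new = x_best.copy()
                x_new[mask] += r * (hi[mask] - lo[mask]) * rng.standard_normal(mask.sum())
                x_new = np.clip(x_new, lo, hi)     # simple clipping instead of reflection
                f_new = objective(x_new)
                if f_new < f_best:                 # greedy acceptance
                    x_best, f_best = x_new, f_new
            return x_best, f_best

        best_x, best_f = dds(lambda x: np.sum((x - 3.0) ** 2), lo=[-10] * 5, hi=[10] * 5)
        print(best_x, best_f)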

  5. NMR Parameters Determination through ACE Committee Machine with Genetic Implanted Fuzzy Logic and Genetic Implanted Neural Network

    NASA Astrophysics Data System (ADS)

    Asoodeh, Mojtaba; Bagheripour, Parisa; Gholami, Amin

    2015-06-01

    Free fluid porosity and rock permeability, undoubtedly the most critical parameters of a hydrocarbon reservoir, can be obtained by processing the nuclear magnetic resonance (NMR) log. Unlike conventional well logs (CWLs), NMR logging is very expensive and time-consuming. Therefore, the idea of synthesizing the NMR log from CWLs holds great appeal for reservoir engineers. For this purpose, three optimization strategies are followed. Firstly, an artificial neural network (ANN) is optimized by virtue of the hybrid genetic algorithm-pattern search (GA-PS) technique; then fuzzy logic (FL) is optimized by means of GA-PS; and eventually an alternating conditional expectation (ACE) model is constructed using the concept of a committee machine to combine the outputs of the optimized and non-optimized FL and ANN models. Results indicated that optimization of the traditional ANN and FL models using the GA-PS technique significantly enhances their performance. Furthermore, the ACE committee of the aforementioned models produces more accurate and reliable results than any single model performing alone.

  6. The search for a hippocampal engram.

    PubMed

    Mayford, Mark

    2014-01-05

    Understanding the molecular and cellular changes that underlie memory, the engram, requires the identification, isolation and manipulation of the neurons involved. This presents a major difficulty for complex forms of memory, for example hippocampus-dependent declarative memory, where the participating neurons are likely to be sparse, anatomically distributed and unique to each individual brain and learning event. In this paper, I discuss several new approaches to this problem. In vivo calcium imaging techniques provide a means of assessing the activity patterns of large numbers of neurons over long periods of time with precise anatomical identification. This provides important insight into how the brain represents complex information and how this is altered with learning. The development of techniques for the genetic modification of neural ensembles based on their natural, sensory-evoked, activity along with optogenetics allows direct tests of the coding function of these ensembles. These approaches provide a new methodological framework in which to examine the mechanisms of complex forms of learning at the level of the neurons involved in a specific memory.

  7. The search for a hippocampal engram

    PubMed Central

    Mayford, Mark

    2014-01-01

    Understanding the molecular and cellular changes that underlie memory, the engram, requires the identification, isolation and manipulation of the neurons involved. This presents a major difficulty for complex forms of memory, for example hippocampus-dependent declarative memory, where the participating neurons are likely to be sparse, anatomically distributed and unique to each individual brain and learning event. In this paper, I discuss several new approaches to this problem. In vivo calcium imaging techniques provide a means of assessing the activity patterns of large numbers of neurons over long periods of time with precise anatomical identification. This provides important insight into how the brain represents complex information and how this is altered with learning. The development of techniques for the genetic modification of neural ensembles based on their natural, sensory-evoked, activity along with optogenetics allows direct tests of the coding function of these ensembles. These approaches provide a new methodological framework in which to examine the mechanisms of complex forms of learning at the level of the neurons involved in a specific memory. PMID:24298162

  8. RNAPattMatch: a web server for RNA sequence/structure motif detection based on pattern matching with flexible gaps

    PubMed Central

    Drory Retwitzer, Matan; Polishchuk, Maya; Churkin, Elena; Kifer, Ilona; Yakhini, Zohar; Barash, Danny

    2015-01-01

    Searching for RNA sequence-structure patterns is becoming an essential tool for RNA practitioners. Novel discoveries of regulatory non-coding RNAs in targeted organisms and the motivation to find them across a wide range of organisms have prompted the use of computational RNA pattern matching as an enhancement to sequence similarity. State-of-the-art programs differ in the flexibility of patterns allowed as queries and in their simplicity of use. In particular, no existing method is available as a user-friendly web server. A general program that searches for RNA sequence-structure patterns is RNA Structator. However, it is not available as a web server and does not provide the option of a flexible gap pattern representation with an upper bound on the gap length specified at any position in the sequence. Here, we introduce RNAPattMatch, a web-based application that is user friendly and makes sequence/structure RNA queries accessible to practitioners of various backgrounds and proficiency levels. It also extends RNA Structator and allows a more flexible representation of variable gaps, in addition to analysis of results using energy minimization methods. The RNAPattMatch service is available at http://www.cs.bgu.ac.il/rnapattmatch. A standalone version of the search tool is also available to download at the site. PMID:25940619
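
    At the sequence level, a bounded flexible gap can be written as a quantified character class in an ordinary regular expression, as in the toy query below. Real RNAPattMatch queries also carry secondary-structure constraints, which a plain regex cannot check; the motif and sequence here are invented.

        # Sequence pattern with a flexible gap of 0-8 nucleotides (toy example).
        import re

        pattern = re.compile(r"GGAC[ACGU]{0,8}G[ACGU][AG]A")   # GGAC ... GNRA-like motif
        sequence = "AUGGGACUUACGGCGAAUUAG"
        for m in pattern.finditer(sequence):
            print(m.start(), m.group())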

  9. Attention-Deficit / Hyperactivity Disorder (ADHD): Data and Statistics

    MedlinePlus

    ... claims to understand diagnosis and treatment patterns for Attention-Deficit/Hyperactivity Disorder (ADHD). On this page you ...

  10. Discovering discovery patterns with Predication-based Semantic Indexing.

    PubMed

    Cohen, Trevor; Widdows, Dominic; Schvaneveldt, Roger W; Davies, Peter; Rindflesch, Thomas C

    2012-12-01

    In this paper we utilize methods of hyperdimensional computing to mediate the identification of therapeutically useful connections for the purpose of literature-based discovery. Our approach, named Predication-based Semantic Indexing, is utilized to empirically identify sequences of relationships known as "discovery patterns", such as "drug x INHIBITS substance y, substance y CAUSES disease z", that link pharmaceutical substances to diseases they are known to treat. These sequences are derived from semantic predications extracted from the biomedical literature by the SemRep system, and subsequently utilized to direct the search for known treatments for a held-out set of diseases. Rapid and efficient inference is accomplished through the application of geometric operators in PSI space, allowing for both the derivation of discovery patterns from a large set of known TREATS relationships, and the application of these discovered patterns to constrain the search for therapeutic relationships at scale. Our results include the rediscovery of discovery patterns that have been constructed manually by other authors in previous research, as well as the discovery of a set of previously unrecognized patterns. The application of these patterns to direct search through PSI space results in better recovery of therapeutic relationships than is accomplished with models based on distributional statistics alone. These results demonstrate the utility of efficient approximate inference in geometric space as a means to identify therapeutic relationships, suggesting a role of these methods in drug repurposing efforts. In addition, the results provide strong support for the utility of the discovery pattern approach pioneered by Hristovski and his colleagues. Copyright © 2012 Elsevier Inc. All rights reserved.

  11. A Search for WIMP Dark Matter Using an Optimized Chi-square Technique on the Final Data from the Cryogenic Dark Matter Search Experiment (CDMS II)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manungu Kiveni, Joseph

    2012-12-01

    This dissertation describes the results of a WIMP search using CDMS II data sets accumulated at the Soudan Underground Laboratory in Minnesota. Results from the original analysis of these data were published in 2009; two events were observed in the signal region with an expected leakage of 0.9 events. Further investigation revealed an issue with the ionization-pulse reconstruction algorithm leading to a software upgrade and a subsequent reanalysis of the data. As part of the reanalysis, I performed an advanced discrimination technique to better distinguish (potential) signal events from backgrounds using a 5-dimensional chi-square method. This data-analysis technique combines the event information recorded for each WIMP-search event to derive a background-discrimination parameter capable of reducing the expected background to less than one event, while maintaining high efficiency for signal events. Furthermore, optimizing the cut positions of this 5-dimensional chi-square parameter for the 14 viable germanium detectors yields an improved expected sensitivity to WIMP interactions relative to previous CDMS results. This dissertation describes my improved (and optimized) discrimination technique and the results obtained from a blind application to the reanalyzed CDMS II WIMP-search data.
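
    As a toy illustration of a multi-dimensional chi-square discriminator of this general kind, the sketch below sums squared, normalized deviations of several event parameters from signal-like expectations and cuts on the result. The parameter values, expected spreads, and threshold are invented, not the CDMS II detector templates.

        # Toy multi-dimensional chi-square discrimination (invented numbers).
        import numpy as np

        signal_mean = np.array([0.3, 1.0, 0.0, 0.0, 1.0])     # expected signal-like values
        signal_sigma = np.array([0.05, 0.1, 0.5, 0.5, 0.2])   # expected spreads

        def chi2(event):
            return np.sum(((event - signal_mean) / signal_sigma) ** 2)

        events = np.array([
            [0.31, 1.05, 0.2, -0.1, 1.1],    # signal-like
            [0.70, 1.40, 2.0,  1.5, 0.4],    # background-like
        ])
        threshold = 15.0                      # would be tuned to the target leakage
        for e in events:
            print(round(chi2(e), 2), "accepted" if chi2(e) < threshold else "rejected")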

  12. Study of the Gray Scale, Polychromatic, Distortion Invariant Neural Networks Using the IPA Model.

    NASA Astrophysics Data System (ADS)

    Uang, Chii-Maw

    Research in the optical neural network field is primarily motivated by the fact that humans recognize objects better than conventional digital computers do, and by the massively parallel nature of optics. This research represents a continuing effort over the past several years to exploit neurocomputing for pattern recognition. Based on the interpattern association (IPA) model and the Hamming net model, many new systems and applications are introduced. A gray-level discrete associative memory based on object decomposition/composition is proposed for recognizing gray-level patterns. This technique extends the processing ability from binary mode to gray-level mode, and thus the information capacity is increased. Two polychromatic optical neural networks using color liquid crystal television (LCTV) panels for color pattern recognition are introduced. By introducing a color encoding technique in conjunction with the interpattern associative algorithm, a color associative memory was realized. Based on the color decomposition and composition technique, a color exemplar-based Hamming net was built for color image classification. A shift-invariant neural network is presented through use of the translation-invariant property of the modulus of the Fourier transform and the hetero-associative interpattern association (IPA) memory. To extract the main features, a quadrantal sampling method is used to sample the data, which then replace the training patterns, and the concept of hetero-associative memory is used to recall distorted objects. A shift- and rotation-invariant neural network using an interpattern hetero-association (IHA) model is presented. To preserve the shift and rotation invariance, a set of binarized-encoded circular harmonic expansion (CHE) functions in the Fourier domain is used as the training set. The shift and symmetry properties of the modulus of the Fourier spectrum are used to avoid the problem of centering the CHE functions. Almost all neural networks have both positive and negative weights, which increases the difficulty of optical implementation. A method to construct a unipolar IPA IWM is discussed: by searching for redundant interconnection links, an effective way to remove all negative links is obtained.

  13. Classification of Meteorological Influences Surrounding Extreme Precipitation Events in the United States using the MERRA-2 Reanalysis

    NASA Technical Reports Server (NTRS)

    Collow, Allie Marquardt; Bosilovich, Mike; Ullrich, Paul; Hoeck, Ian

    2017-01-01

    Extreme precipitation events can have a large impact on society through flooding that can result in property destruction, crop losses, economic losses, the spread of water-borne diseases, and fatalities. Observations indicate there has been a statistically significant increase in extreme precipitation events over the past 15 years in the Northeastern United States, and other localized regions of the country have become crippled by record flooding events, for example, the flooding that occurred in the Southeast United States associated with Hurricane Matthew in October 2016. Extreme precipitation events in the United States can be caused by various meteorological influences such as extratropical cyclones, tropical cyclones, mesoscale convective complexes, general air mass thunderstorms, upslope flow, fronts, and the North American Monsoon. Reanalyses, such as the Modern Era Retrospective Analysis for Research and Applications, version 2 (MERRA-2), have become a pivotal tool to study the meteorology surrounding extreme precipitation events. Using days classified as extreme precipitation events based on a combination of observational gauge and radar data, two techniques for the classification of these events are used to gather additional information that can be used to determine how events have changed over time using atmospheric data from MERRA-2. The first is self-organizing maps, an artificial neural network that uses unsupervised learning to cluster like patterns, and the second is an automated detection technique that searches for characteristics in the atmosphere that define a meteorological phenomenon. For example, the automated detection of tropical cyclones searches for a defined area of suppressed sea level pressure, alongside thickness anomalies aloft, indicating the presence of a warm core. These techniques are employed for extreme precipitation events in preselected regions that were chosen based on an analysis of the climatology of precipitation.

  14. Searching for Contracting Patterns over Time: Do Prime Contractor and Subcontractor Relations Follow Similar Patterns for Professional Services Provision?

    ERIC Educational Resources Information Center

    Ponomariov, Branco; Kingsley, Gordon; Boardman, Craig

    2011-01-01

    This paper compares over a 12-year period (1) patterns of contracting between a state transportation agency and its prime contractors providing engineering design services with (2) patterns between these prime contractors and their subcontractors. We find evidence of different contracting patterns at each level that emerge over time and coexist in…

  15. Correlation between National Influenza Surveillance Data and Search Queries from Mobile Devices and Desktops in South Korea

    PubMed Central

    Seo, Dong-Woo; Sohn, Chang Hwan; Kim, Sung-Hoon; Ryoo, Seung Mok; Lee, Yoon-Seon; Lee, Jae Ho; Kim, Won Young; Lim, Kyoung Soo

    2016-01-01

    Background Digital surveillance using internet search queries can improve both the sensitivity and timeliness of the detection of a health event, such as an influenza outbreak. While it has recently been estimated that the mobile search volume surpasses the desktop search volume and mobile search patterns differ from desktop search patterns, the previous digital surveillance systems did not distinguish mobile and desktop search queries. The purpose of this study was to compare the performance of mobile and desktop search queries in terms of digital influenza surveillance. Methods and Results The study period was from September 6, 2010 through August 30, 2014, which consisted of four epidemiological years. Influenza-like illness (ILI) and virologic surveillance data from the Korea Centers for Disease Control and Prevention were used. A total of 210 combined queries from our previous survey work were used for this study. Mobile and desktop weekly search data were extracted from Naver, which is the largest search engine in Korea. Spearman’s correlation analysis was used to examine the correlation of the mobile and desktop data with ILI and virologic data in Korea. We also performed lag correlation analysis. We observed that the influenza surveillance performance of mobile search queries matched or exceeded that of desktop search queries over time. The mean correlation coefficients of mobile search queries and the number of queries with an r-value of ≥ 0.7 equaled or became greater than those of desktop searches over the four epidemiological years. A lag correlation analysis of up to two weeks showed similar trends. Conclusion Our study shows that mobile search queries for influenza surveillance have equaled or even become greater than desktop search queries over time. In the future development of influenza surveillance using search queries, recognition of the changing trend in mobile search data may be necessary. PMID:27391028
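
    A compact sketch of the Spearman and lag-correlation computation described in the Methods, assuming synthetic weekly series in place of the Naver query volumes and the KCDC surveillance data.

        # Spearman and lag correlation between search volume and ILI (synthetic data).
        import numpy as np
        import pandas as pd
        from scipy.stats import spearmanr

        rng = np.random.default_rng(0)
        ili = pd.Series(np.abs(np.sin(np.linspace(0, 8 * np.pi, 208))) + rng.normal(0, 0.1, 208))
        search = ili.shift(-1).fillna(0) + rng.normal(0, 0.1, 208)   # searches lead ILI by ~1 week

        rho, p = spearmanr(search, ili)
        print("same-week Spearman r:", round(rho, 3))

        # lag correlation: shift the search series by 0-2 weeks and recompute
        for lag in range(3):
            rho_lag, _ = spearmanr(search.shift(lag).dropna(), ili.iloc[lag:])
            print("lag", lag, "weeks:", round(rho_lag, 3))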

  16. Correlation between National Influenza Surveillance Data and Search Queries from Mobile Devices and Desktops in South Korea.

    PubMed

    Shin, Soo-Yong; Kim, Taerim; Seo, Dong-Woo; Sohn, Chang Hwan; Kim, Sung-Hoon; Ryoo, Seung Mok; Lee, Yoon-Seon; Lee, Jae Ho; Kim, Won Young; Lim, Kyoung Soo

    2016-01-01

    Digital surveillance using internet search queries can improve both the sensitivity and timeliness of the detection of a health event, such as an influenza outbreak. While it has recently been estimated that the mobile search volume surpasses the desktop search volume and mobile search patterns differ from desktop search patterns, the previous digital surveillance systems did not distinguish mobile and desktop search queries. The purpose of this study was to compare the performance of mobile and desktop search queries in terms of digital influenza surveillance. The study period was from September 6, 2010 through August 30, 2014, which consisted of four epidemiological years. Influenza-like illness (ILI) and virologic surveillance data from the Korea Centers for Disease Control and Prevention were used. A total of 210 combined queries from our previous survey work were used for this study. Mobile and desktop weekly search data were extracted from Naver, which is the largest search engine in Korea. Spearman's correlation analysis was used to examine the correlation of the mobile and desktop data with ILI and virologic data in Korea. We also performed lag correlation analysis. We observed that the influenza surveillance performance of mobile search queries matched or exceeded that of desktop search queries over time. The mean correlation coefficients of mobile search queries and the number of queries with an r-value of ≥ 0.7 equaled or became greater than those of desktop searches over the four epidemiological years. A lag correlation analysis of up to two weeks showed similar trends. Our study shows that mobile search queries for influenza surveillance have equaled or even become greater than desktop search queries over time. In the future development of influenza surveillance using search queries, recognition of the changing trend in mobile search data may be necessary.

  17. Wide-Range Motion Estimation Architecture with Dual Search Windows for High Resolution Video Coding

    NASA Astrophysics Data System (ADS)

    Dung, Lan-Rong; Lin, Meng-Chun

    This paper presents a memory-efficient motion estimation (ME) technique for high-resolution video compression. The main objective is to reduce external memory accesses, especially when local memory resources are limited. Reducing memory accesses in turn lowers the associated power consumption. The key to reducing memory accesses is a center-biased algorithm, which performs the motion vector (MV) search with minimal search data. To exploit data reusability, the proposed dual-search-windowing (DSW) approach loads a secondary search window only when the search requires it. By doing so, the loading of search windows is alleviated, which reduces the required external memory bandwidth. The proposed techniques can save up to 81% of external memory bandwidth and require only 135 MBytes/s, while the quality degradation is less than 0.2 dB for 720p HDTV clips coded at 8 Mbits/s.

  18. Dynamic metrology and data processing for precision freeform optics fabrication and testing

    NASA Astrophysics Data System (ADS)

    Aftab, Maham; Trumper, Isaac; Huang, Lei; Choi, Heejoo; Zhao, Wenchuan; Graves, Logan; Oh, Chang Jin; Kim, Dae Wook

    2017-06-01

    Dynamic metrology holds the key to overcoming several challenging limitations of conventional optical metrology, especially with regards to precision freeform optical elements. We present two dynamic metrology systems: 1) adaptive interferometric null testing; and 2) instantaneous phase shifting deflectometry, along with an overview of a gradient data processing and surface reconstruction technique. The adaptive null testing method, utilizing a deformable mirror, adopts a stochastic parallel gradient descent search algorithm in order to dynamically create a null testing condition for unknown freeform optics. The single-shot deflectometry system implemented on an iPhone uses a multiplexed display pattern to enable dynamic measurements of time-varying optical components or optics in vibration. Experimental data, measurement accuracy / precision, and data processing algorithms are discussed.
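
    A minimal stochastic parallel gradient descent (SPGD) loop of the kind used to drive a deformable mirror toward a null: all actuators are perturbed at once with a random ± pattern, the merit function is measured twice, and the command is updated along the estimated gradient. The quadratic merit function below is a stand-in for the measured interferometric metric, and all parameter values are illustrative.

        # Stochastic parallel gradient descent toward a null (toy merit function).
        import numpy as np

        rng = np.random.default_rng(0)
        n_act = 32                                # deformable-mirror actuators
        target = rng.normal(0, 1, n_act)          # unknown freeform departure to null

        def merit(cmd):
            """Stand-in for the measured fringe/null metric (lower is better)."""
            return np.sum((cmd - target) ** 2)

        cmd = np.zeros(n_act)
        delta, gain = 0.05, 0.5
        for _ in range(2000):
            perturb = delta * rng.choice([-1.0, 1.0], n_act)   # parallel +/- perturbation
            dj = merit(cmd + perturb) - merit(cmd - perturb)
            cmd -= gain * dj * perturb                         # gradient-descent step
        print("residual merit:", merit(cmd))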

  19. Luggage and shipped goods.

    PubMed

    Vogel, H; Haller, D

    2007-08-01

    Controls of luggage and shipped goods are frequently carried out. The possibilities of X-ray technology are demonstrated here. There are different imaging techniques; the main concepts are transmission imaging, backscatter imaging, computed tomography, dual energy imaging, and combinations of these methods. The images come from manufacturers and personal collections. The search mainly concerns weapons, explosives, and drugs, and furthermore animals and stolen goods. Special problems are posed by the control of letters and by the detection of Improvised Explosive Devices (IEDs). One has to expect that controls will increase and that imaging with X-rays will play its part. Pattern recognition software will be used for analysis, driven by economic pressure and by the demand for higher efficiency; man and computer together will produce more security than man alone.

  20. Medical data mining: knowledge discovery in a clinical data warehouse.

    PubMed Central

    Prather, J. C.; Lobach, D. F.; Goodwin, L. K.; Hales, J. W.; Hage, M. L.; Hammond, W. E.

    1997-01-01

    Clinical databases have accumulated large quantities of information about patients and their medical conditions. Relationships and patterns within this data could provide new medical knowledge. Unfortunately, few methodologies have been developed and applied to discover this hidden knowledge. In this study, the techniques of data mining (also known as Knowledge Discovery in Databases) were used to search for relationships in a large clinical database. Specifically, data accumulated on 3,902 obstetrical patients were evaluated for factors potentially contributing to preterm birth using exploratory factor analysis. Three factors were identified by the investigators for further exploration. This paper describes the processes involved in mining a clinical database including data warehousing, data query and cleaning, and data analysis. PMID:9357597
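
    A bare-bones exploratory factor analysis step analogous to the one described above, using scikit-learn; random placeholder data stands in for the 3,902-patient clinical variables.

        # Exploratory factor analysis on placeholder clinical data.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        X = rng.normal(size=(3902, 12))            # 12 hypothetical clinical variables

        fa = FactorAnalysis(n_components=3, random_state=0).fit(X)
        print(fa.components_.shape)                # (3 factors, 12 variable loadings)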

  1. Influence of legislations and news on Indian internet search query patterns of e-cigarettes.

    PubMed

    Thavarajah, Rooban; Mohandoss, Anusa Arunachalam; Ranganathan, Kannan; Kondalsamy-Chennakesavan, Srinivas

    2017-01-01

    There is a paucity of data on the use of electronic nicotine delivery systems (ENDS) in India. In addition, the Indian internet search pattern for ENDS has not been studied. We aimed to address this lacuna. Moreover, the influence of tobacco legislations and news pieces on such search volume is not known. Given the fact that ENDS could cause oral lesions, these data are pertinent to dentists. Using a time series analysis, we examined the effect of tobacco-related legislations and news pieces on total search volume (TSV) from September 1, 2012, to August 31, 2016. TSV data were seasonally adjusted and analyzed using time series modeling. The TSV clocked during the months of legislations and news pieces was analyzed for its influence on the search pattern for ENDS. The overall mean ± standard deviation (range) TSV was 22273.75 ± 6784.01 (12310-40510) during the study, with seasonal variations. Individually, the best model for TSV with legislations and news pieces was an autoregressive integrated moving average model, and when the influence of legislations and news events was combined, it was the Winters additive model. In the legislation-alone model, the pre-event, event, and post-event month TSV was not a better indicator of the effect, except for the post-event month of the 2nd legislation, which involved pictorial warnings on packages during the study period. Similarly, a news piece on a pan-India ban on ENDS influenced the model in the news-piece model. When combined, no "events" emerged as significant. These findings suggest that searching for information on ENDS is increasing and that these tobacco control policies and news items, targeting tobacco usage reduction, have only a short-term effect on the rate of searching for information on ENDS.
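
    A sketch of fitting the two time-series models named above (an autoregressive integrated moving average model and a Winters additive seasonal model) to a synthetic monthly search-volume series with statsmodels; the orders, seasonal period, and data are placeholders.

        # ARIMA and Winters additive (Holt-Winters) fits to a synthetic monthly TSV series.
        import numpy as np
        import pandas as pd
        from statsmodels.tsa.arima.model import ARIMA
        from statsmodels.tsa.holtwinters import ExponentialSmoothing

        rng = np.random.default_rng(0)
        months = pd.date_range("2012-09-01", periods=48, freq="MS")
        tsv = pd.Series(20000 + 3000 * np.sin(np.arange(48) * 2 * np.pi / 12)
                        + rng.normal(0, 1500, 48), index=months)

        arima = ARIMA(tsv, order=(1, 1, 1)).fit()
        winters = ExponentialSmoothing(tsv, trend="add", seasonal="add",
                                       seasonal_periods=12).fit()
        print(arima.aic, winters.aic)              # compare models by information criterion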

  2. PepPat, a pattern-based oligopeptide homology search method and the identification of a novel tachykinin-like peptide.

    PubMed

    Jiang, Ying; Gao, Ge; Fang, Gang; Gustafson, Eric L; Laverty, Maureen; Yin, Yanbin; Zhang, Yong; Luo, Jingchu; Greene, Jonathan R; Bayne, Marvin L; Hedrick, Joseph A; Murgolo, Nicholas J

    2003-05-01

    PepPat, a hybrid method that combines pattern matching with similarity scoring, is described. We also report PepPat's application in the identification of a novel tachykinin-like peptide. PepPat takes as input a query peptide and a user-specified regular expression pattern within the peptide. It first performs a database pattern match and then ranks candidates on the basis of their similarity to the query peptide. PepPat calculates similarity over the pattern spanning region, enhancing PepPat's sensitivity for short query peptides. PepPat can also search for a user-specified number of occurrences of a repeated pattern within the target sequence. We illustrate PepPat's application in short peptide ligand mining. As a validation example, we report the identification of a novel tachykinin-like peptide, C14TKL-1, and show it is an NK1 (neurokinin receptor 1) agonist whose message is widely expressed in the human periphery. PepPat is offered online at: http://peppat.cbi.pku.edu.cn.

  3. Introducing Online Bibliographic Service to its Users: The Online Presentation

    ERIC Educational Resources Information Center

    Crane, Nancy B.; Pilachowski, David M.

    1978-01-01

    A description of techniques for introducing online services to new user groups includes discussion of terms and their definitions, evolution of online searching, advantages and disadvantages of online searching, production of the data bases, search strategies, Boolean logic, costs and charges, "do's and don'ts," and a user search questionnaire. (J…

  4. Internet search and krokodil in the Russian Federation: an infoveillance study.

    PubMed

    Zheluk, Andrey; Quinn, Casey; Meylakhs, Peter

    2014-09-18

    Krokodil is an informal term for a cheap injectable illicit drug domestically prepared from codeine-containing medication (CCM). The method of krokodil preparation may produce desomorphine as well as toxic reactants that cause extensive tissue necrosis. The first confirmed report of krokodil use in Russia took place in 2004. In 2012, reports of krokodil-related injection injuries began to appear beyond Russia in Western Europe and the United States. This exploratory study had two main objectives: (1) to determine if Internet search patterns could detect regularities in behavioral responses to Russian CCM policy at the population level, and (2) to determine if complementary data sources could explain the regularities we observed. First, we obtained krokodil-related search pattern data for each Russian subregion (oblast) between 2011 and 2012. Second, we analyzed several complementary data sources, including krokodil-related court cases and related search terms on both Google and Yandex, to evaluate the characteristics of terms accompanying krokodil-related search queries. In the 6 months preceding CCM sales restrictions, 21 of Russia's 83 oblasts had search rates higher than the national average (mean) of 16.67 searches per 100,000 population for terms associated with krokodil. In the 6 months following restrictions, mean national searches dropped to 9.65 per 100,000. Further, the number of oblasts recording a higher than average search rate dropped from 30 to 16. Second, we found krokodil-related court appearances were moderately positively correlated (Spearman correlation=.506, P≤.001) with behaviors consistent with an interest in the production and use of krokodil across Russia. Finally, Google Trends and Google and Yandex related terms suggested consistent public interest in the production and use of krokodil as well as in CCM as analgesic medication during the date range covered by this study. Illicit drug use data are generally regarded as difficult to obtain through traditional survey methods. Our analysis suggests it is plausible that Yandex search behavior served as a proxy for patterns of krokodil production and use during the date range we investigated. More generally, this study demonstrates the application of novel methods recently used by policy makers to both monitor illicit drug use and influence drug policy decision making.

  5. Evaluating random search strategies in three mammals from distinct feeding guilds.

    PubMed

    Auger-Méthé, Marie; Derocher, Andrew E; DeMars, Craig A; Plank, Michael J; Codling, Edward A; Lewis, Mark A

    2016-09-01

    Searching allows animals to find food, mates, shelter and other resources essential for survival and reproduction and is thus among the most important activities performed by animals. Theory predicts that animals will use random search strategies in highly variable and unpredictable environments. Two prominent models have been suggested for animals searching in sparse and heterogeneous environments: (i) the Lévy walk and (ii) the composite correlated random walk (CCRW) and its associated area-restricted search behaviour. Until recently, it was difficult to differentiate between the movement patterns of these two strategies. Using a new method that assesses whether movement patterns are consistent with these two strategies and two other common random search strategies, we investigated the movement behaviour of three species inhabiting sparse northern environments: woodland caribou (Rangifer tarandus caribou), barren-ground grizzly bear (Ursus arctos) and polar bear (Ursus maritimus). These three species vary widely in their diets and thus allow us to contrast the movement patterns of animals from different feeding guilds. Our results showed that although more traditional methods would have found evidence for the Lévy walk for some individuals, a comparison of the Lévy walk to CCRWs showed stronger support for the latter. While a CCRW was the best model for most individuals, there was a range of support for its absolute fit. A CCRW was sufficient to explain the movement of nearly half of herbivorous caribou and a quarter of omnivorous grizzly bears, but was insufficient to explain the movement of all carnivorous polar bears. Strong evidence for CCRW movement patterns suggests that many individuals may use a multiphasic movement strategy rather than one-behaviour strategies such as the Lévy walk. The fact that the best model was insufficient to describe the movement paths of many individuals suggests that some animals living in sparse environments may use strategies that are more complicated than those described by the standard random search models. Thus, our results indicate a need to develop movement models that incorporate factors such as the perceptual and cognitive capacities of animals. © 2016 The Authors. Journal of Animal Ecology © 2016 British Ecological Society.
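
    A toy simulation contrasting the two movement models compared above: a Lévy walk with heavy-tailed step lengths and uniform turning angles, and a two-state composite correlated random walk that alternates between tortuous area-restricted search and straighter travelling. All parameter values are arbitrary.

        # Lévy walk versus composite correlated random walk (toy parameters).
        import numpy as np

        rng = np.random.default_rng(0)

        def levy_walk(n, mu=2.0, x_min=1.0):
            steps = x_min * (1 - rng.random(n)) ** (-1 / (mu - 1))   # Pareto step lengths
            angles = rng.uniform(0, 2 * np.pi, n)
            return np.cumsum(steps * np.cos(angles)), np.cumsum(steps * np.sin(angles))

        def ccrw(n, p_switch=0.05):
            x = y = heading = 0.0
            xs, ys, state = [], [], 0                  # state 0: search, 1: travel
            for _ in range(n):
                if rng.random() < p_switch:
                    state = 1 - state
                step = rng.exponential(1.0 if state == 0 else 10.0)
                heading += rng.normal(0, 1.5 if state == 0 else 0.2)   # tortuosity by state
                x, y = x + step * np.cos(heading), y + step * np.sin(heading)
                xs.append(x)
                ys.append(y)
            return np.array(xs), np.array(ys)

        print(levy_walk(500)[0][-1], ccrw(500)[0][-1])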

  6. Correlation-coefficient-based fast template matching through partial elimination.

    PubMed

    Mahmood, Arif; Khan, Sohaib

    2012-04-01

    Partial computation elimination techniques are often used for fast template matching. At a particular search location, computations are prematurely terminated as soon as it is found that this location cannot compete with an already known best match location. Due to the nonmonotonic growth pattern of the correlation-based similarity measures, partial computation elimination techniques have been traditionally considered inapplicable to speed up these measures. In this paper, we show that partial elimination techniques may be applied to a correlation coefficient by using a monotonic formulation, and we propose basic-mode and extended-mode partial correlation elimination algorithms for fast template matching. The basic-mode algorithm is more efficient on small template sizes, whereas the extended mode is faster on medium and larger templates. We also propose a strategy to decide which algorithm to use for a given data set. To achieve a high speedup, elimination algorithms require an initial guess of the peak correlation value. We propose two initialization schemes including a coarse-to-fine scheme for larger templates and a two-stage technique for small- and medium-sized templates. Our proposed algorithms are exact, i.e., having exhaustive equivalent accuracy, and are compared with the existing fast techniques using real image data sets on a wide variety of template sizes. While the actual speedups are data dependent, in most cases, our proposed algorithms have been found to be significantly faster than the other algorithms.
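
    As a simplified illustration of partial computation elimination, the sketch below accumulates the normalized cross-correlation block by block and abandons a location once a Cauchy-Schwarz bound on the final score cannot beat the best score found so far. This is a toy monotone bound, not the exact correlation-coefficient formulation proposed in the paper.

        # Template matching with early termination via a Cauchy-Schwarz bound (toy version).
        import numpy as np

        def match_with_elimination(image, template, block=16):
            th, tw = template.shape
            t = (template - template.mean()).ravel()
            t_norm = np.linalg.norm(t)
            blocks = np.array_split(np.arange(t.size), max(1, t.size // block))
            best_score, best_loc = -np.inf, None
            for i in range(image.shape[0] - th + 1):
                for j in range(image.shape[1] - tw + 1):
                    w = image[i:i + th, j:j + tw].astype(float)
                    w = (w - w.mean()).ravel()
                    w_norm = np.linalg.norm(w) + 1e-12
                    partial = 0.0
                    for idx in blocks:
                        partial += np.dot(w[idx], t[idx])
                        rest = slice(idx[-1] + 1, None)
                        bound = partial + np.linalg.norm(w[rest]) * np.linalg.norm(t[rest])
                        if bound / (w_norm * t_norm) <= best_score:
                            break                    # this location cannot win: eliminate
                    else:
                        score = partial / (w_norm * t_norm)
                        if score > best_score:
                            best_score, best_loc = score, (i, j)
            return best_loc, best_score

        rng = np.random.default_rng(0)
        img = rng.random((64, 64))
        print(match_with_elimination(img, img[20:36, 30:46].copy()))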

  7. Reference ability neural networks and behavioral performance across the adult life span.

    PubMed

    Habeck, Christian; Eich, Teal; Razlighi, Ray; Gazes, Yunglin; Stern, Yaakov

    2018-05-15

    To better understand the impact of aging, along with other demographic and brain health variables, on the neural networks that support different aspects of cognitive performance, we applied a brute-force search technique based on Principal Components Analysis to derive 4 corresponding spatial covariance patterns (termed Reference Ability Neural Networks -RANNs) from a large sample of participants across the age range. 255 clinically healthy, community-dwelling adults, aged 20-77, underwent fMRI while performing 12 tasks, 3 tasks for each of the following cognitive reference abilities: Episodic Memory, Reasoning, Perceptual Speed, and Vocabulary. The derived RANNs (1) showed selective activation to their specific cognitive domain and (2) correlated with behavioral performance. Quasi out-of-sample replication with Monte-Carlo 5-fold cross validation was built into our approach, and all patterns indicated their corresponding reference ability and predicted performance in held-out data to a degree significantly greater than chance level. RANN-pattern expression for Episodic Memory, Reasoning and Vocabulary was selectively associated with age, while the pattern for Perceptual Speed showed no such age-related influences. For each participant we also looked at residual activity unaccounted for by the RANN-pattern derived for the cognitive reference ability. Higher residual activity was associated with poorer brain-structural health and older age, but, apart from Vocabulary, not with cognitive performance, indicating that older participants with worse brain-structural health might recruit alternative neural resources to maintain performance levels. Copyright © 2018 Elsevier Inc. All rights reserved.

  8. Gridless, pattern-driven point cloud completion and extension

    NASA Astrophysics Data System (ADS)

    Gravey, Mathieu; Mariethoz, Gregoire

    2016-04-01

    While satellites offer Earth observation with a wide coverage, other remote sensing techniques such as terrestrial LiDAR can acquire very high-resolution data on an area that is limited in extent and often discontinuous due to shadow effects. Here we propose a numerical approach to merge these two types of information, thereby reconstructing high-resolution data on a continuous large area. It is based on a pattern matching process that completes the areas where only low-resolution data is available, using bootstrapped high-resolution patterns. Currently, the most common approach to pattern matching is to interpolate the point data on a grid. While this approach is computationally efficient, it presents major drawbacks for point cloud processing because a significant part of the information is lost in the point-to-grid resampling, and a prohibitive amount of memory is needed to store large grids. To address these issues, we propose a gridless method that compares point cloud subsets without the need for a grid. On-the-fly interpolation involves a heavy computational load, which is met by using a highly optimized GPU implementation and a hierarchical pattern searching strategy. The method is illustrated using data from the Val d'Arolla, Swiss Alps, where high-resolution terrestrial LiDAR data are fused with lower-resolution Landsat and WorldView-3 acquisitions, such that the density of points is homogenized (data completion) and the coverage is extended to a larger area (data extension).

  9. Brain activity patterns induced by interrupting the cognitive processes with online advertising.

    PubMed

    Rejer, Izabela; Jankowski, Jarosław

    2017-11-01

    As a result of the increasing role of online advertising and strong competition among advertisers, intrusive techniques are commonly used to attract web users' attention. Moreover, since marketing content is usually delivered to the target audience when they are performing typical online tasks, like searching for information or reading online content, its delivery interrupts the web user's current cognitive process. The question posed by many researchers in the field of online advertising is: how should we measure the influence of interruption of cognitive processes on human behavior and emotional state? Much research has been conducted in this field; however, most of this research has focused on monitoring activity in the simulated environment, or processing declarative responses given by users in prepared questionnaires. In this paper, a more direct real-time approach is taken, and the effect of the interruption on a web user is analyzed directly by studying the activity of his brain. This paper presents the results of an experiment that was conducted to find the brain activity patterns associated with interruptions of the cognitive process by showing internet advertisements during a text-reading task. Three specific aspects were addressed in the experiment: individual patterns, the consistency of these patterns across trials, and the intra-subject correlation of the individual patterns. Two main effects were observed for most subjects: a drop in activity in the frontal and prefrontal cortical areas across all frequency bands, and significant changes in the frontal/prefrontal asymmetry index.

  10. Improving biomedical information retrieval by linear combinations of different query expansion techniques.

    PubMed

    Abdulla, Ahmed AbdoAziz Ahmed; Lin, Hongfei; Xu, Bo; Banbhrani, Santosh Kumar

    2016-07-25

    Biomedical literature retrieval is becoming increasingly complex, and there is a fundamental need for advanced information retrieval systems. Information Retrieval (IR) programs scour unstructured materials such as text documents in large reserves of data that are usually stored on computers. IR is related to the representation, storage, and organization of information items, as well as to access. One of the main problems in IR is to determine which documents are relevant to the user's needs and which are not. Under the current regime, users cannot construct queries precisely enough to retrieve particular pieces of data from large reserves of data, and basic information retrieval systems produce low-quality search results. In this paper we present a new technique that refines information retrieval searches to better represent the user's information need and thereby enhance retrieval performance, by using different query expansion techniques and applying linear combinations between them, where each combination linearly merges two expansion results at a time. Query expansions expand the search query, for example, by finding synonyms and reweighting original terms. They provide significantly more focused, particularized search results than do basic search queries. Retrieval performance is measured by variants of MAP (Mean Average Precision); according to our experimental results, the combination of the best query expansion results enhances the retrieved documents and outperforms our baseline by 21.06 %, and it even outperforms a previous study by 7.12 %. We propose several query expansion techniques and their linear combinations to make user queries more cognizable to search engines and to produce higher-quality search results.
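
    A minimal sketch of linearly combining two query-expansion result lists: each technique assigns a relevance score to documents, and the fused score is a convex combination of the two. Document IDs, scores, and the weight are invented; the paper tunes such combinations against MAP.

        # Linear (convex) combination of two query-expansion score lists (toy data).
        def fuse(scores_a, scores_b, alpha=0.6):
            docs = set(scores_a) | set(scores_b)
            return {d: alpha * scores_a.get(d, 0.0) + (1 - alpha) * scores_b.get(d, 0.0)
                    for d in docs}

        pseudo_relevance = {"doc1": 0.9, "doc2": 0.4, "doc5": 0.3}   # expansion technique A
        thesaurus_based = {"doc1": 0.7, "doc3": 0.6, "doc5": 0.2}    # expansion technique B

        fused = fuse(pseudo_relevance, thesaurus_based)
        print(sorted(fused, key=fused.get, reverse=True))            # fused ranking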

  11. Recent developments in MrBUMP: better search-model preparation, graphical interaction with search models, and solution improvement and assessment.

    PubMed

    Keegan, Ronan M; McNicholas, Stuart J; Thomas, Jens M H; Simpkin, Adam J; Simkovic, Felix; Uski, Ville; Ballard, Charles C; Winn, Martyn D; Wilson, Keith S; Rigden, Daniel J

    2018-03-01

    Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case.

  12. Recent developments in MrBUMP: better search-model preparation, graphical interaction with search models, and solution improvement and assessment

    PubMed Central

    Keegan, Ronan M.; McNicholas, Stuart J.; Thomas, Jens M. H.; Simpkin, Adam J.; Uski, Ville; Ballard, Charles C.

    2018-01-01

    Increasing sophistication in molecular-replacement (MR) software and the rapid expansion of the PDB in recent years have allowed the technique to become the dominant method for determining the phases of a target structure in macromolecular X-ray crystallography. In addition, improvements in bioinformatic techniques for finding suitable homologous structures for use as MR search models, combined with developments in refinement and model-building techniques, have pushed the applicability of MR to lower sequence identities and made weak MR solutions more amenable to refinement and improvement. MrBUMP is a CCP4 pipeline which automates all stages of the MR procedure. Its scope covers everything from the sourcing and preparation of suitable search models right through to rebuilding of the positioned search model. Recent improvements to the pipeline include the adoption of more sensitive bioinformatic tools for sourcing search models, enhanced model-preparation techniques including better ensembling of homologues, and the use of phase improvement and model building on the resulting solution. The pipeline has also been deployed as an online service through CCP4 online, which allows its users to exploit large bioinformatic databases and coarse-grained parallelism to speed up the determination of a possible solution. Finally, the molecular-graphics application CCP4mg has been combined with MrBUMP to provide an interactive visual aid to the user during the process of selecting and manipulating search models for use in MR. Here, these developments in MrBUMP are described with a case study to explore how some of the enhancements to the pipeline and to CCP4mg can help to solve a difficult case. PMID:29533225

  13. The Search for Extension: 7 Steps to Help People Find Research-Based Information on the Internet

    ERIC Educational Resources Information Center

    Hill, Paul; Rader, Heidi B.; Hino, Jeff

    2012-01-01

    For Extension's unbiased, research-based content to be found by people searching the Internet, it needs to be organized in a way conducive to the ranking criteria of a search engine. With proper web design and search engine optimization techniques, Extension's content can be found, recognized, and properly indexed by search engines and…

  14. A generalized approach to automated NMR peak list editing: application to reduced dimensionality triple resonance spectra.

    PubMed

    Moseley, Hunter N B; Riaz, Nadeem; Aramini, James M; Szyperski, Thomas; Montelione, Gaetano T

    2004-10-01

    We present an algorithm and program called Pattern Picker that performs editing of raw peak lists derived from multidimensional NMR experiments with characteristic peak patterns. Pattern Picker detects groups of correlated peaks within peak lists from reduced dimensionality triple resonance (RD-TR) NMR spectra, with high fidelity and high yield. With typical quality RD-TR NMR data sets, Pattern Picker performs almost as well as human analysis, and is very robust in discriminating real peak sets from noise and other artifacts in unedited peak lists. The program uses a depth-first search algorithm with short-circuiting to efficiently explore a search tree representing every possible combination of peaks forming a group. The Pattern Picker program is particularly valuable for creating an automated peak picking/editing process. The Pattern Picker algorithm can be applied to a broad range of experiments with distinct peak patterns including RD, G-matrix Fourier transformation (GFT) NMR spectra, and experiments to measure scalar and residual dipolar coupling, thus promoting the use of experiments that are typically harder for a human to analyze. Since the complexity of peak patterns becomes a benefit rather than a drawback, Pattern Picker opens new opportunities in NMR experiment design.
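
    A toy depth-first search with short-circuiting over a peak list: groups are extended one peak at a time, and a branch is abandoned as soon as the partial group violates a tolerance. The grouping rule here (peaks sharing one coordinate within a tolerance) is a stand-in for the RD-TR peak patterns Pattern Picker actually recognizes.

        # Depth-first grouping of correlated peaks with short-circuit pruning (toy rule).
        def find_groups(peaks, group_size, tol=0.05):
            groups = []

            def extend(group, start):
                if len(group) == group_size:
                    groups.append(list(group))
                    return
                for k in range(start, len(peaks)):
                    # short-circuit: skip subtrees whose shared coordinate drifts too far
                    if group and abs(peaks[k][0] - group[0][0]) > tol:
                        continue
                    group.append(peaks[k])
                    extend(group, k + 1)
                    group.pop()

            extend([], 0)
            return groups

        # (first coordinate, second coordinate) peak positions, arbitrary units
        peaks = [(8.20, 120.1), (8.21, 118.4), (8.60, 121.0), (8.19, 116.7)]
        print(find_groups(peaks, group_size=2))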

  15. Modulation of neuronal responses during covert search for visual feature conjunctions

    PubMed Central

    Buracas, Giedrius T.; Albright, Thomas D.

    2009-01-01

    While searching for an object in a visual scene, an observer's attentional focus and eye movements are often guided by information about object features and spatial locations. Both spatial and feature-specific attention are known to modulate neuronal responses in visual cortex, but little is known of the dynamics and interplay of these mechanisms as visual search progresses. To address this issue, we recorded from directionally selective cells in visual area MT of monkeys trained to covertly search for targets defined by a unique conjunction of color and motion features and to signal target detection with an eye movement to the putative target. Two patterns of response modulation were observed. One pattern consisted of enhanced responses to targets presented in the receptive field (RF). These modulations occurred at the end-stage of search and were more potent during correct target identification than during erroneous saccades to a distractor in RF, thus suggesting that this modulation is not a mere presaccadic enhancement. A second pattern of modulation was observed when RF stimuli were nontargets that shared a feature with the target. The latter effect was observed during early stages of search and is consistent with a global feature-specific mechanism. This effect often terminated before target identification, thus suggesting that it interacts with spatial attention. This modulation was exhibited not only for motion but also for color cue, although MT neurons are known to be insensitive to color. Such cue-invariant attentional effects may contribute to a feature binding mechanism acting across visual dimensions. PMID:19805385

  16. Modulation of neuronal responses during covert search for visual feature conjunctions.

    PubMed

    Buracas, Giedrius T; Albright, Thomas D

    2009-09-29

    While searching for an object in a visual scene, an observer's attentional focus and eye movements are often guided by information about object features and spatial locations. Both spatial and feature-specific attention are known to modulate neuronal responses in visual cortex, but little is known of the dynamics and interplay of these mechanisms as visual search progresses. To address this issue, we recorded from directionally selective cells in visual area MT of monkeys trained to covertly search for targets defined by a unique conjunction of color and motion features and to signal target detection with an eye movement to the putative target. Two patterns of response modulation were observed. One pattern consisted of enhanced responses to targets presented in the receptive field (RF). These modulations occurred at the end-stage of search and were more potent during correct target identification than during erroneous saccades to a distractor in RF, thus suggesting that this modulation is not a mere presaccadic enhancement. A second pattern of modulation was observed when RF stimuli were nontargets that shared a feature with the target. The latter effect was observed during early stages of search and is consistent with a global feature-specific mechanism. This effect often terminated before target identification, thus suggesting that it interacts with spatial attention. This modulation was exhibited not only for motion but also for color cue, although MT neurons are known to be insensitive to color. Such cue-invariant attentional effects may contribute to a feature binding mechanism acting across visual dimensions.

  17. Teaching Google Search Techniques in an L2 Academic Writing Context

    ERIC Educational Resources Information Center

    Han, Sumi; Shin, Jeong-Ah

    2017-01-01

    This mixed-method study examines the effectiveness of teaching Google search techniques (GSTs) to Korean EFL college students in an intermediate-level academic English writing course. 18 students participated in a 4-day GST workshop consisting of an overview session of the web as corpus and Google as a concordancer, and three training sessions…

  18. Liquid electrolyte informatics using an exhaustive search with linear regression.

    PubMed

    Sodeyama, Keitaro; Igarashi, Yasuhiko; Nakayama, Tomofumi; Tateyama, Yoshitaka; Okada, Masato

    2018-06-14

    Exploring new liquid electrolyte materials is a fundamental target for developing new high-performance lithium-ion batteries. In contrast to solid materials, disordered liquid solution properties have been less studied by data-driven information techniques. Here, we examined the estimation accuracy and efficiency of three information techniques, multiple linear regression (MLR), least absolute shrinkage and selection operator (LASSO), and exhaustive search with linear regression (ES-LiR), by using coordination energy and melting point as test liquid properties. We then confirmed that ES-LiR gives the most accurate estimation among the techniques. We also found that ES-LiR can provide the relationship between the "prediction accuracy" and "calculation cost" of the properties via a weight diagram of descriptors. This technique makes it possible to balance "accuracy" against "cost" when a search over a huge number of new materials is carried out.
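
    The following is a minimal sketch of the exhaustive-search-with-linear-regression idea: every subset of candidate descriptors is fitted by ordinary linear regression and scored by cross-validated error, yielding the accuracy-versus-number-of-descriptors trade-off described above. The synthetic data, descriptor count, and scoring choices are assumptions for illustration, not the authors' implementation.

```python
# Exhaustive search with linear regression (ES-LiR-style) on synthetic data:
# score every descriptor subset by cross-validated error, then report the
# best subset for each model size (the "accuracy vs. cost" curve).
from itertools import combinations

import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples, n_descriptors = 40, 6
X = rng.normal(size=(n_samples, n_descriptors))
# Hypothetical target property (e.g., a coordination energy) driven by two descriptors.
y = 2.0 * X[:, 0] - 1.5 * X[:, 3] + 0.1 * rng.normal(size=n_samples)

results = []  # (descriptor subset, mean cross-validated MSE)
for k in range(1, n_descriptors + 1):
    for subset in combinations(range(n_descriptors), k):
        scores = cross_val_score(LinearRegression(), X[:, subset], y,
                                 scoring="neg_mean_squared_error", cv=5)
        results.append((subset, -scores.mean()))

# Best subset for each number of descriptors.
for k in range(1, n_descriptors + 1):
    best = min((r for r in results if len(r[0]) == k), key=lambda r: r[1])
    print(f"{k} descriptors: subset={best[0]}, CV MSE={best[1]:.4f}")
```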

  19. Apriori Versions Based on MapReduce for Mining Frequent Patterns on Big Data.

    PubMed

    Luna, Jose Maria; Padillo, Francisco; Pechenizkiy, Mykola; Ventura, Sebastian

    2017-09-27

    Pattern mining is one of the most important tasks to extract meaningful and useful information from raw data. This task aims to extract item-sets that represent any type of homogeneity and regularity in data. Although many efficient algorithms have been developed in this regard, the growing volume of data has caused the performance of existing pattern mining techniques to drop. The goal of this paper is to propose new efficient pattern mining algorithms to work on big data. To this aim, a series of algorithms based on the MapReduce framework and the Hadoop open-source implementation have been proposed. The proposed algorithms can be divided into three main groups. First, two algorithms [Apriori MapReduce (AprioriMR) and iterative AprioriMR] with no pruning strategy are proposed, which extract any existing item-set in data. Second, two algorithms (space pruning AprioriMR and top AprioriMR) that prune the search space by means of the well-known anti-monotone property are proposed. Finally, a last algorithm (maximal AprioriMR) is also proposed for mining condensed representations of frequent patterns. To test the performance of the proposed algorithms, a varied collection of big data datasets has been considered, comprising up to 3 · 10¹⁸ transactions and more than 5 million distinct single items. The experimental stage includes comparisons against highly efficient and well-known pattern mining algorithms. Results reveal the advantage of applying the MapReduce versions when complex problems are considered, and also the unsuitability of this paradigm when dealing with small data.
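
    As a rough illustration of the structure described above, the sketch below simulates one MapReduce-style Apriori pass in plain Python: a "map" phase emits counts per candidate itemset, a "reduce" phase aggregates them, and the anti-monotone property prunes infrequent candidates before the next level. The toy transactions and support threshold are invented; the paper's algorithms run on Hadoop, not in-process Python.

```python
# MapReduce-flavoured Apriori sketch: map emits (candidate, 1) per transaction,
# reduce sums counts, and only frequent candidates seed the next level
# (anti-monotone pruning: no superset of an infrequent itemset can be frequent).
from collections import Counter

transactions = [
    {"bread", "milk"},
    {"bread", "beer", "eggs"},
    {"milk", "beer", "cola"},
    {"bread", "milk", "beer"},
    {"bread", "milk", "cola"},
]
MIN_SUPPORT = 3


def frequent_itemsets(transactions, min_support):
    """Grow frequent itemsets level by level, one map/reduce-like pass per size k."""
    frequent, k = {}, 1
    candidates = {frozenset([item]) for t in transactions for item in t}
    while candidates:
        # "Map": one count for every candidate contained in a transaction.
        mapped = (c for t in transactions for c in candidates if c <= t)
        # "Reduce": aggregate counts, keep only frequent candidates.
        counts = Counter(mapped)
        level = {c: n for c, n in counts.items() if n >= min_support}
        frequent.update(level)
        # Candidate generation for size k + 1 from frequent size-k itemsets.
        k += 1
        candidates = {a | b for a in level for b in level if len(a | b) == k}
    return frequent


for itemset, count in sorted(frequent_itemsets(transactions, MIN_SUPPORT).items(),
                             key=lambda kv: (-kv[1], sorted(kv[0]))):
    print(set(itemset), count)
```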

  20. Effect of thenardite on the direct detection of aromatic amino acids: implications for the search for life in the solar system

    NASA Astrophysics Data System (ADS)

    Doc Richardson, C.; Hinman, Nancy W.; Scott, Jill R.

    2009-10-01

    With the discovery of Na-sulphate minerals on Mars and Europa, recent studies using these minerals have focused on their ability to assist in the detection of bio/organic signatures. This study further investigates the ability of thenardite (Na2SO4) to effectively facilitate the ionization and identification of aromatic amino acids (phenylalanine, tyrosine and tryptophan) using a technique called geomatrix-assisted laser desorption/ionization in conjunction with Fourier transform ion cyclotron resonance mass spectrometry. This technique is based on the ability of a mineral host to facilitate desorption and ionization of bio/organic molecules for detection. Spectra obtained from each aromatic amino acid alone and in combination with thenardite show differences in ionization mechanism and fragmentation patterns. These differences are due to chemical and structural differences between the aromatic side chains of the respective amino acids. Tyrosine and tryptophan, when combined with thenardite, were observed to undergo cation attachment ([M+Na]+), due to the high alkali ion affinity of their aromatic side chains. In addition, substitution of the carboxyl group hydrogen by sodium led to formation of [M-H+Na]Na+ peaks. In contrast, phenylalanine mixed with thenardite showed no evidence of Na+ attachment. Understanding how co-deposition of amino acids with thenardite can affect the observed mass spectra is important for future exploration missions that are likely to use laser desorption mass spectrometry to search for bio/organic compounds in extraterrestrial environments.

  1. Effect of Thenardite on the Direct Detection of Aromatic Amino Acids: Implications for the Search for Life in the Solar System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    C. Doc Richardson; Nancy W. Hinman; Jill R. Scott

    2009-10-01

    With the discovery of Na-sulfate minerals on Mars and Europa, recent studies using these minerals have focused on their ability to assist in the detection of bio/organic signatures. This study further investigates the ability of thenardite (Na2SO4) to effectively facilitate the ionization and identification of aromatic amino acids (phenylalanine, tyrosine, and tryptophan) using a technique called geomatrix-assisted laser desorption/ionization (GALDI) in conjunction with Fourier transform ion cyclotron resonance mass spectrometry (FTICR-MS). This technique is based on the ability of a mineral host to facilitate the ionization and detection of bio/organic molecules. Spectra obtained from each aromatic amino acid alone and in combination with thenardite show differences in ionization mechanism and fragmentation patterns. These differences are due to chemical and structural differences between the aromatic side chains of the respective amino acids. Tyrosine and tryptophan, when combined with thenardite, were observed to undergo cation attachment ([M+Na]+), due to the high alkali affinity of their aromatic side chains. Subsequent cation substitution of the carboxyl group led to formation of double cation-attached peaks ([M-H+Na]Na+). In contrast, phenylalanine mixed with thenardite showed no evidence of Na+ interaction. Understanding how codeposition of amino acids with thenardite can affect the observed mass spectra is important for future exploration missions that are likely to use laser desorption mass spectrometry to search for bio/organic compounds in extraterrestrial environments.

  2. Implementing the Army NetCentric Data Strategy in a ServiceOriented Environment

    DTIC Science & Technology

    2009-04-23

    [Abstract not recoverable from the source presentation; the legible fragments refer to federated search, data search, data abstraction, adapter configuration, data services, and artifact/data discovery patterns within the Army net-centric data strategy.]

  3. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities

    PubMed Central

    Kwan, Paul; Welch, Mitchell

    2017-01-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human practice related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops. PMID:28875085

  4. Remote sensing and spatial statistical techniques for modelling Ommatissus lybicus (Hemiptera: Tropiduchidae) habitat and population densities.

    PubMed

    Al-Kindi, Khalifa M; Kwan, Paul; R Andrew, Nigel; Welch, Mitchell

    2017-01-01

    In order to understand the distribution and prevalence of Ommatissus lybicus (Hemiptera: Tropiduchidae), as well as to analyse their current biogeographical patterns and predict their future spread, comprehensive and detailed information on environmental, climatic, and agricultural practices is essential. Spatial analytical techniques, such as remote sensing and spatial statistics tools, can help detect and model spatial links and correlations between the presence, absence and density of O. lybicus in response to climatic, environmental, and human factors. The main objective of this paper is to review remote sensing and relevant analytical techniques that can be applied in mapping and modelling the habitat and population density of O. lybicus. An exhaustive search of related literature revealed that there are very limited studies linking location-based infestation levels of pests like O. lybicus with climatic, environmental, and human practice related variables. This review also highlights the accumulated knowledge and addresses the gaps in this area of research. Furthermore, it makes recommendations for future studies, and gives suggestions on monitoring and surveillance methods in designing both local and regional level integrated pest management strategies for palm trees and other affected cultivated crops.

  5. PCV: An Alignment Free Method for Finding Homologous Nucleotide Sequences and its Application in Phylogenetic Study.

    PubMed

    Kumar, Rajnish; Mishra, Bharat Kumar; Lahiri, Tapobrata; Kumar, Gautam; Kumar, Nilesh; Gupta, Rahul; Pal, Manoj Kumar

    2017-06-01

    Online retrieval of homologous nucleotide sequences from a given database is commonly carried out with existing alignment techniques. These techniques depend on local alignment and scoring matrices, whose reliability is limited by computational complexity and accuracy. Toward this end, this work offers a novel numerical representation of genes that can help divide the data space into smaller partitions and thereby form a search tree. In this context, this paper introduces a 36-dimensional Periodicity Count Value (PCV), which represents a particular nucleotide sequence and is created through adaptation of the stochastic model of Kolekar et al. (American Institute of Physics 1298:307-312, 2010. doi: 10.1063/1.3516320). The PCV construct uses information on the physicochemical properties of nucleotides and their positional distribution pattern within a gene. It is observed that the PCV representation of genes reduces the computational cost of calculating distances between pairs of genes while remaining consistent with existing methods. The validity of the PCV-based method was further tested through its use in molecular phylogeny construction, in comparison with existing sequence alignment methods.

  6. Localization Versus Abstraction: A Comparison of Two Search Reduction Techniques

    NASA Technical Reports Server (NTRS)

    Lansky, Amy L.

    1992-01-01

    There has been much recent work on the use of abstraction to improve planning behavior and cost. Another technique for dealing with the inherently explosive cost of planning is localization. This paper compares the relative strengths of localization and abstraction in reducing planning search cost. In particular, localization is shown to subsume abstraction. Localization techniques can model the various methods of abstraction that have been used, but also provide a much more flexible framework, with a broader range of benefits.

  7. Electrode channel selection based on backtracking search optimization in motor imagery brain-computer interfaces.

    PubMed

    Dai, Shengfa; Wei, Qingguo

    2017-01-01

    Common spatial pattern algorithm is widely used to estimate spatial filters in motor imagery based brain-computer interfaces. However, use of a large number of channels makes the common spatial pattern algorithm prone to over-fitting and the classification of electroencephalographic signals time-consuming. To overcome these problems, it is necessary to choose an optimal subset of the whole channels to save computational time and improve the classification accuracy. In this paper, a novel method named backtracking search optimization algorithm is proposed to automatically select the optimal channel set for common spatial pattern. Each individual in the population is an N-dimensional vector, with each component representing one channel. A population of binary codes is generated randomly at the beginning, and channels are then selected according to the evolution of these codes. The number and positions of 1's in the code denote the number and positions of chosen channels. The objective function of the backtracking search optimization algorithm is defined as the combination of classification error rate and relative number of channels. Experimental results suggest that higher classification accuracy can be achieved with much fewer channels compared to standard common spatial pattern with whole channels.
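
    The sketch below illustrates the binary-coded channel-selection idea: each individual is a 0/1 vector over channels, and the fitness combines a classification error estimate with the relative number of selected channels. The error function and the simple evolutionary loop here are placeholders for illustration; they are not the paper's CSP pipeline or the exact backtracking search optimization algorithm.

```python
# Binary channel-selection sketch: fitness = error_rate + lambda * (channels kept / total),
# optimised by a generic keep-the-better-half evolutionary loop (a simplified
# stand-in for the backtracking search optimization algorithm).
import numpy as np

rng = np.random.default_rng(1)
N_CHANNELS, POP, GENERATIONS, LAMBDA = 22, 30, 50, 0.3


def error_rate(mask: np.ndarray) -> float:
    """Placeholder for the CSP + classifier error on the selected channels.
    Here: a synthetic function that prefers a particular informative subset."""
    informative = np.zeros(N_CHANNELS)
    informative[[3, 7, 11, 15]] = 1.0
    missed = np.sum(informative * (1 - mask)) / informative.sum()
    return 0.1 + 0.4 * missed


def fitness(mask: np.ndarray) -> float:
    if mask.sum() == 0:
        return np.inf
    return error_rate(mask) + LAMBDA * mask.sum() / N_CHANNELS


pop = rng.integers(0, 2, size=(POP, N_CHANNELS))
for _ in range(GENERATIONS):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[: POP // 2]]    # keep the better half
    children = parents.copy()
    flip = rng.random(children.shape) < 1.0 / N_CHANNELS
    children[flip] ^= 1                              # bit-flip mutation
    pop = np.vstack([parents, children])

best = min(pop, key=fitness)
print("selected channels:", np.flatnonzero(best), "fitness:", round(float(fitness(best)), 4))
```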

  8. Discovering discovery patterns with predication-based Semantic Indexing

    PubMed Central

    Cohen, Trevor; Widdows, Dominic; Schvaneveldt, Roger W.; Davies, Peter; Rindflesch, Thomas C.

    2012-01-01

    In this paper we utilize methods of hyperdimensional computing to mediate the identification of therapeutically useful connections for the purpose of literature-based discovery. Our approach, named Predication-based Semantic Indexing, is utilized to identify empirically sequences of relationships known as “discovery patterns”, such as “drug x INHIBITS substance y, substance y CAUSES disease z” that link pharmaceutical substances to diseases they are known to treat. These sequences are derived from semantic predications extracted from the biomedical literature by the SemRep system, and subsequently utilized to direct the search for known treatments for a held out set of diseases. Rapid and efficient inference is accomplished through the application of geometric operators in PSI space, allowing for both the derivation of discovery patterns from a large set of known TREATS relationships, and the application of these discovered patterns to constrain search for therapeutic relationships at scale. Our results include the rediscovery of discovery patterns that have been constructed manually by other authors in previous research, as well as the discovery of a set of previously unrecognized patterns. The application of these patterns to direct search through PSI space results in better recovery of therapeutic relationships than is accomplished with models based on distributional statistics alone. These results demonstrate the utility of efficient approximate inference in geometric space as a means to identify therapeutic relationships, suggesting a role of these methods in drug repurposing efforts. In addition, the results provide strong support for the utility of the discovery pattern approach pioneered by Hristovski and his colleagues. PMID:22841748

  9. Rationalizing spatial exploration patterns of wild animals and humans through a temporal discounting framework.

    PubMed

    Namboodiri, Vijay Mohan K; Levy, Joshua M; Mihalas, Stefan; Sims, David W; Hussain Shuler, Marshall G

    2016-08-02

    Understanding the exploration patterns of foragers in the wild provides fundamental insight into animal behavior. Recent experimental evidence has demonstrated that path lengths (distances between consecutive turns) taken by foragers are well fitted by a power law distribution. Numerous theoretical contributions have posited that "Lévy random walks"-which can produce power law path length distributions-are optimal for memoryless agents searching a sparse reward landscape. It is unclear, however, whether such a strategy is efficient for cognitively complex agents, from wild animals to humans. Here, we developed a model to explain the emergence of apparent power law path length distributions in animals that can learn about their environments. In our model, the agent's goal during search is to build an internal model of the distribution of rewards in space that takes into account the cost of time to reach distant locations (i.e., temporally discounting rewards). For an agent with such a goal, we find that an optimal model of exploration in fact produces hyperbolic path lengths, which are well approximated by power laws. We then provide support for our model by showing that humans in a laboratory spatial exploration task search space systematically and modify their search patterns under a cost of time. In addition, we find that path length distributions in a large dataset obtained from free-ranging marine vertebrates are well described by our hyperbolic model. Thus, we provide a general theoretical framework for understanding spatial exploration patterns of cognitively complex foragers.

  10. Derivation of Optimal Cropping Pattern in Part of Hirakud Command using Cuckoo Search

    NASA Astrophysics Data System (ADS)

    Rath, Ashutosh; Biswal, Sudarsan; Samantaray, Sandeep; Swain, Prakash Chandra, PROF.

    2017-08-01

    The economic growth of a nation depends on agriculture, which relies on the obtainable water resources, available land and crops. The contribution of water in an appropriate quantity at the appropriate time plays a vital role in increasing agricultural production. Optimal utilization of available resources can be achieved by proper planning and management of water resources projects and adoption of appropriate technology. In the present work, the command area of the Sambalpur distributary system is taken up for investigation. Further, adoption of a fixed cropping pattern causes a reduction in yield. The present study aims at developing different crop planning strategies to increase the net benefit from the command area with minimum investment. Optimization models are developed for the Kharif season using LINDO and the Cuckoo Search (CS) algorithm for maximization of the net benefits. In developing the optimization model, factors such as cultivable land, seeds, fertilizers, manpower, and water cost are taken as constraints. The irrigation water needs of major crops and the total available water through canals in the command of the Sambalpur distributary are estimated. LINDO and Cuckoo Search models are formulated and used to derive the optimal cropping pattern yielding maximum net benefits. Net benefits of Rs. 585.0 lakhs in the Kharif season are obtained by adopting LINDO and Rs. 596.07 lakhs from Cuckoo Search, whereas net benefits of Rs. 447.0 lakhs are received by the farmers of the locality with the present cropping pattern.
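
    A minimal cuckoo search sketch follows: candidate crop-area allocations sit in "nests", new solutions are generated with Lévy flights (Mantegna's method), and a fraction of the worst nests is abandoned each generation. The benefit coefficients, land limit and penalty are invented for illustration and do not reproduce the Hirakud command-area model.

```python
# Cuckoo search sketch maximising a toy net-benefit function of crop areas
# under a single land constraint (handled as a penalty).
import math
import numpy as np

rng = np.random.default_rng(2)
BENEFIT = np.array([0.45, 0.60, 0.35])   # hypothetical net benefit per hectare, per crop
LAND_LIMIT = 1000.0                      # hypothetical cultivable area (ha)
N_NESTS, N_GEN, PA, BETA = 25, 300, 0.25, 1.5


def objective(x):
    """Net benefit with a penalty if the total area exceeds the land limit."""
    penalty = 1e3 * max(0.0, x.sum() - LAND_LIMIT)
    return BENEFIT @ x - penalty


def levy_step(size):
    """Levy-distributed step via Mantegna's algorithm."""
    sigma = (math.gamma(1 + BETA) * math.sin(math.pi * BETA / 2)
             / (math.gamma((1 + BETA) / 2) * BETA * 2 ** ((BETA - 1) / 2))) ** (1 / BETA)
    u = rng.normal(0, sigma, size)
    v = rng.normal(0, 1, size)
    return u / np.abs(v) ** (1 / BETA)


nests = rng.uniform(0, LAND_LIMIT / 3, size=(N_NESTS, len(BENEFIT)))
for _ in range(N_GEN):
    best = nests[np.argmax([objective(n) for n in nests])]
    for i in range(N_NESTS):
        candidate = np.clip(nests[i] + 0.01 * levy_step(len(BENEFIT)) * (nests[i] - best),
                            0, LAND_LIMIT)
        if objective(candidate) > objective(nests[i]):
            nests[i] = candidate
    # Abandon a fraction PA of the worst nests and replace them with random ones.
    worst = np.argsort([objective(n) for n in nests])[: int(PA * N_NESTS)]
    nests[worst] = rng.uniform(0, LAND_LIMIT / 3, size=(len(worst), len(BENEFIT)))

best = max(nests, key=objective)
print("areas (ha):", np.round(best, 1), "net benefit:", round(float(objective(best)), 1))
```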

  11. [Use of stimulation techniques in pain treatment].

    PubMed

    Rosted, Palle; Andersen, Claus

    2006-05-15

    Stimulation techniques (SB) include manipulation, acupuncture, acupressure, physiotherapy, transcutaneous electrical nerve stimulation, reflexotherapy, laser treatment and epidural stimulation technique. The purpose of this paper is to investigate the scientific evidence for these techniques. The Cochrane Library and Medline were searched for all techniques from 2000 to date. Only randomised controlled studies written in English were included. Search words such as acupuncture and neck pain, shoulder pain, etc. were used. In total, 587 papers were identified for the following diseases: headache, neck pain, shoulder pain, elbow pain, low back pain and knee pain. 415 papers were excluded, and the remaining 172 papers, covering a total of 20,431 patients, form the basis of this study. The effect of acupuncture and the epidural stimulation technique is scientifically well supported. For the remaining techniques, the scientific evidence is dubious.

  12. Optimal Refueling Pattern Search for a CANDU Reactor Using a Genetic Algorithm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quang Binh, DO; Gyuhong, ROH; Hangbok, CHOI

    2006-07-01

    This paper presents the results from the application of genetic algorithms to a refueling optimization of a Canada deuterium uranium (CANDU) reactor. This work aims at making a mathematical model of the refueling optimization problem including the objective function and constraints and developing a method based on genetic algorithms to solve the problem. The model of the optimization problem and the proposed method comply with the key features of the refueling strategy of the CANDU reactor which adopts an on-power refueling operation. In this study, a genetic algorithm combined with an elitism strategy was used to automatically search for the refueling patterns. The objective of the optimization was to maximize the discharge burn-up of the refueling bundles, minimize the maximum channel power, or minimize the maximum change in the zone controller unit (ZCU) water levels. A combination of these objectives was also investigated. The constraints include the discharge burn-up, maximum channel power, maximum bundle power, channel power peaking factor and the ZCU water level. A refueling pattern that represents the refueling rate and channels was coded by a one-dimensional binary chromosome, which is a string of binary numbers 0 and 1. A computer program was developed in FORTRAN 90 running on an HP 9000 workstation to conduct the search for the optimal refueling patterns for a CANDU reactor at the equilibrium state. The results showed that it was possible to apply genetic algorithms to automatically search for the refueling channels of the CANDU reactor. The optimal refueling patterns were compared with the solutions obtained from the AUTOREFUEL program and the results were consistent with each other. (authors)
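
    The sketch below shows the generic search machinery described above: a binary-chromosome genetic algorithm with elitism, tournament selection, one-point crossover and bit-flip mutation. The "refueling pattern" is just a bit string here, and the objective and constraint are toy stand-ins rather than any CANDU reactor physics.

```python
# Generic binary-chromosome GA with elitism (toy objective, not a reactor model).
import numpy as np

rng = np.random.default_rng(3)
N_BITS, POP, GENERATIONS, ELITE, P_MUT = 40, 60, 120, 2, 1.0 / 40


def objective(bits):
    """Toy objective: reward set bits (stand-in for discharge burn-up) but
    penalize patterns that select more than 12 'channels' (stand-in constraint)."""
    return bits.sum() - 5.0 * max(0, bits.sum() - 12)


def tournament(pop, scores, k=3):
    idx = rng.choice(len(pop), size=k, replace=False)
    return pop[idx[np.argmax(scores[idx])]]


pop = rng.integers(0, 2, size=(POP, N_BITS))
for _ in range(GENERATIONS):
    scores = np.array([objective(ind) for ind in pop])
    elite = pop[np.argsort(scores)[-ELITE:]]                # elitism: carry the best forward
    children = []
    while len(children) < POP - ELITE:
        p1, p2 = tournament(pop, scores), tournament(pop, scores)
        cut = rng.integers(1, N_BITS)                       # one-point crossover
        child = np.concatenate([p1[:cut], p2[cut:]])
        child ^= (rng.random(N_BITS) < P_MUT).astype(int)   # bit-flip mutation
        children.append(child)
    pop = np.vstack([elite, children])

best = max(pop, key=objective)
print("best pattern:", "".join(map(str, best)), "objective:", objective(best))
```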

  13. Practical Tips and Strategies for Finding Information on the Internet.

    ERIC Educational Resources Information Center

    Armstrong, Rhonda; Flanagan, Lynn

    This paper presents the most important concepts and techniques to use in successfully searching the major World Wide Web search engines and directories, explains the basics of how search engines work, and describes what is included in their indexes. Following an introduction that gives an overview of Web directories and search engines, the first…

  14. Quantum Associative Neural Network with Nonlinear Search Algorithm

    NASA Astrophysics Data System (ADS)

    Zhou, Rigui; Wang, Huian; Wu, Qian; Shi, Yang

    2012-03-01

    Based on an analysis of the properties of quantum linear superposition, and to overcome the complexity of the existing quantum associative memory proposed by Ventura, a new storage method for multiple patterns is proposed in this paper by constructing the quantum array with binary decision diagrams. Also, the adoption of the nonlinear search algorithm increases the pattern recall speed of this multi-pattern model to O(log₂ 2^(n-t)) = O(n - t) time complexity, where n is the number of quantum bits and t is the quantum information of the t quantum bit. Results of case analysis show that the associative neural network model proposed in this paper based on quantum learning is much better and more optimized than other researchers' counterparts, both in terms of avoiding additional qubits or extraordinary initial operators, and in storing patterns and improving recall speed.

  15. Library Searching: An Industrial User's Viewpoint.

    ERIC Educational Resources Information Center

    Hendrickson, W. A.

    1982-01-01

    Discusses library searching of chemical literature from an industrial user's viewpoint, focusing on differences between academic and industrial researcher's searching techniques of the same problem area. Indicates that industry users need more exposure to patents, work with abstracting services and continued improvement in computer searching…

  16. Women: Marriage, Career, and Job Satisfaction.

    ERIC Educational Resources Information Center

    Bisconti, Ann Stouffer

    Marriage, childbearing, and employment patterns of women initially surveyed during their entrance to college in 1961 and subsequently surveyed in 1965, 1971, and 1974-5 were investigated. Subjects were questioned on marital status, number and timing of children in the family, employment patterns and shifts, job search patterns, preferred housework…

  17. The Myth of the L.D. WISC-R Profile.

    ERIC Educational Resources Information Center

    Miller, Maurice; Walker, Kenneth P.

    1981-01-01

    The review cites methodological and statistical flaws in studies attempting to identify subtest patterns on the Wechsler Intelligence Scale for Children-Revised indicative of learning disabilities (LD) and concludes that no LD pattern has been found and the search for such a pattern is not justified. (Author/CL)

  18. Effect of gravito-inertial cues on the coding of orientation in pre-attentive vision.

    PubMed

    Stivalet, P; Marendaz, C; Barraclough, L; Mourareau, C

    1995-01-01

    To see if the spatial reference frame used by pre-attentive vision is specified in a retino-centered frame or in a reference frame integrating visual and nonvisual information (vestibular and somatosensory), subjects were centrifuged in a non-pendular cabin and were asked to search for a target distinguishable from distractors by difference in orientation (Treisman's "pop-out" paradigm [1]). In a control condition, in which subjects were sitting immobilized but not centrifuged, this task gave an asymmetric search pattern: Search was rapid and pre-attentional except when the target was aligned with the horizontal retinal/head axis, in which case search was slow and attentional (2). Results using a centrifuge showed that slow/serial search patterns were obtained when the target was aligned with the subjective horizontal axis (and not with the horizontal retinal/head axis). These data suggest that a multisensory reference frame is used in pre-attentive vision. The results are interpreted in terms of Riccio and Stoffregen's "ecological theory" of orientation in which the vertical and horizontal axes constitute independent reference frames (3).

  19. Autonomous change of behavior for environmental context: An intermittent search model with misunderstanding search pattern

    NASA Astrophysics Data System (ADS)

    Murakami, Hisashi; Gunji, Yukio-Pegio

    2017-07-01

    Although foraging patterns have long been predicted to adapt optimally to environmental conditions, empirical evidence has emerged only in recent years. This evidence suggests that the search strategy of animals is open to change, so that animals can respond flexibly to their environment. In this study, we began with a simple computational model that possesses the principal features of an intermittent strategy, i.e., careful local searches separated by longer relocation steps. The agent in the model follows a rule for switching between the two phases, but it can misunderstand this rule; that is, the agent follows an ambiguous switching rule. Thanks to this ambiguity, the agent's foraging strategy can change continuously. First, we demonstrate that our model can exhibit an optimal change of strategy from Brownian-type to Lévy-type depending on the prey density, and we investigate the distribution of time intervals for switching between the phases. Moreover, we show that the model can display higher search efficiency than a correlated random walk.
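
    As a rough sketch of the two-phase idea, the simulation below alternates between a slow, diffusive local-scanning phase and a fast, nearly straight relocation phase, and applies the phase-switching rule with a "misunderstanding" probability so the effective strategy can drift. All parameters are illustrative and not fitted to the paper's model.

```python
# Intermittent search with an ambiguous switching rule (toy parameters).
import numpy as np

rng = np.random.default_rng(4)
N_STEPS, SCAN_DURATION, RELOCATE_DURATION, P_MISUNDERSTAND = 20000, 20, 5, 0.1

phase, timer, heading = "scan", SCAN_DURATION, 0.0
step_lengths, current_run = [], 0.0

for _ in range(N_STEPS):
    if phase == "scan":                       # slow, diffusive local search
        step = rng.normal(0.0, 0.1, size=2)
    else:                                     # fast, nearly straight relocation
        heading += rng.normal(0.0, 0.05)
        step = np.array([np.cos(heading), np.sin(heading)])
    current_run += float(np.linalg.norm(step))
    timer -= 1
    # Nominal rule: switch phase when the timer expires. With probability
    # P_MISUNDERSTAND the agent misapplies the rule (switches early or late).
    should_switch = timer <= 0
    if rng.random() < P_MISUNDERSTAND:
        should_switch = not should_switch
    if should_switch:
        step_lengths.append(current_run)
        current_run = 0.0
        if phase == "scan":
            phase, timer = "relocate", RELOCATE_DURATION
            heading = rng.uniform(0.0, 2.0 * np.pi)   # pick a new travel direction
        else:
            phase, timer = "scan", SCAN_DURATION

print(f"{len(step_lengths)} runs, mean length {np.mean(step_lengths):.2f}, "
      f"longest {np.max(step_lengths):.2f}")
```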

  20. Privacy-preserving search for chemical compound databases.

    PubMed

    Shimizu, Kana; Nuida, Koji; Arai, Hiromi; Mitsunari, Shigeo; Attrapadung, Nuttapong; Hamada, Michiaki; Tsuda, Koji; Hirokawa, Takatsugu; Sakuma, Jun; Hanaoka, Goichiro; Asai, Kiyoshi

    2015-01-01

    Searching for similar compounds in a database is the most important process for in-silico drug screening. Since a query compound is an important starting point for the new drug, a query holder, who is afraid of the query being monitored by the database server, usually downloads all the records in the database and uses them in a closed network. However, a serious dilemma arises when the database holder also wants to output no information except for the search results, and such a dilemma prevents the use of many important data resources. In order to overcome this dilemma, we developed a novel cryptographic protocol that enables database searching while keeping both the query holder's privacy and database holder's privacy. Generally, the application of cryptographic techniques to practical problems is difficult because versatile techniques are computationally expensive while computationally inexpensive techniques can perform only trivial computation tasks. In this study, our protocol is successfully built only from an additive-homomorphic cryptosystem, which allows only addition performed on encrypted values but is computationally efficient compared with versatile techniques such as general purpose multi-party computation. In an experiment searching ChEMBL, which consists of more than 1,200,000 compounds, the proposed method was 36,900 times faster in CPU time and 12,000 times as efficient in communication size compared with general purpose multi-party computation. We proposed a novel privacy-preserving protocol for searching chemical compound databases. The proposed method, easily scaling for large-scale databases, may help to accelerate drug discovery research by making full use of unused but valuable data that includes sensitive information.
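
    The snippet below illustrates only the additive-homomorphic building block, not the authors' full protocol: the client encrypts its fingerprint bits, the server combines them with a database fingerprint using ciphertext addition and multiplication by plaintext constants, and the client decrypts just the aggregate intersection count needed for a Tanimoto score. It assumes the third-party "phe" (python-paillier) package as a stand-in cryptosystem.

```python
# Additive-homomorphic fingerprint intersection using python-paillier ("phe")
# as a stand-in cryptosystem. Illustrative only; not the paper's protocol.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# Client side: binary chemical fingerprint of the (secret) query compound.
query_bits = [1, 0, 1, 1, 0, 0, 1, 0]
encrypted_query = [public_key.encrypt(b) for b in query_bits]

# Server side: one database fingerprint (in a real search, many).
db_bits = [1, 1, 1, 0, 0, 0, 1, 1]

# Server computes sum_i Enc(q_i) * d_i = Enc(|query AND db|) without seeing q.
encrypted_intersection = sum(
    (eq * d for eq, d in zip(encrypted_query, db_bits) if d),
    public_key.encrypt(0),
)

# Client decrypts the aggregate and forms a Tanimoto similarity.
intersection = private_key.decrypt(encrypted_intersection)
tanimoto = intersection / (sum(query_bits) + sum(db_bits) - intersection)
print("intersection:", intersection, "Tanimoto:", round(tanimoto, 3))
```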

  1. Privacy-preserving search for chemical compound databases

    PubMed Central

    2015-01-01

    Background Searching for similar compounds in a database is the most important process for in-silico drug screening. Since a query compound is an important starting point for the new drug, a query holder, who is afraid of the query being monitored by the database server, usually downloads all the records in the database and uses them in a closed network. However, a serious dilemma arises when the database holder also wants to output no information except for the search results, and such a dilemma prevents the use of many important data resources. Results In order to overcome this dilemma, we developed a novel cryptographic protocol that enables database searching while keeping both the query holder's privacy and database holder's privacy. Generally, the application of cryptographic techniques to practical problems is difficult because versatile techniques are computationally expensive while computationally inexpensive techniques can perform only trivial computation tasks. In this study, our protocol is successfully built only from an additive-homomorphic cryptosystem, which allows only addition performed on encrypted values but is computationally efficient compared with versatile techniques such as general purpose multi-party computation. In an experiment searching ChEMBL, which consists of more than 1,200,000 compounds, the proposed method was 36,900 times faster in CPU time and 12,000 times as efficient in communication size compared with general purpose multi-party computation. Conclusion We proposed a novel privacy-preserving protocol for searching chemical compound databases. The proposed method, easily scaling for large-scale databases, may help to accelerate drug discovery research by making full use of unused but valuable data that includes sensitive information. PMID:26678650

  2. Tips, Techniques, and Words of Wisdom.

    ERIC Educational Resources Information Center

    Garman, Nancy, Comp.

    1990-01-01

    Presents suggestions from online searchers for using online services. Topics discussed include decreasing costs by using less expensive files; modifying searches on Dialog; use of controlled vocabularies and free text; using a variety of databases; the importance of search intermediaries understanding the topic; and patent searching. (LRW)

  3. Search strategies on the Internet: general and specific.

    PubMed

    Bottrill, Krys

    2004-06-01

    Some of the most up-to-date information on scientific activity is to be found on the Internet; for example, on the websites of academic and other research institutions and in databases of currently funded research studies provided on the websites of funding bodies. Such information can be valuable in suggesting new approaches and techniques that could be applicable in a Three Rs context. However, the Internet is a chaotic medium, not subject to the meticulous classification and organisation of classical information resources. At the same time, Internet search engines do not match the sophistication of search systems used by database hosts. Also, although some offer relatively advanced features, user awareness of these tends to be low. Furthermore, much of the information on the Internet is not accessible to conventional search engines, giving rise to the concept of the "Invisible Web". General strategies and techniques for Internet searching are presented, together with a comparative survey of selected search engines. The question of how the Invisible Web can be accessed is discussed, as well as how to keep up-to-date with Internet content and improve searching skills.

  4. Improved Search Techniques

    NASA Technical Reports Server (NTRS)

    Albornoz, Caleb Ronald

    2012-01-01

    Thousands of millions of documents are stored and updated daily in the World Wide Web. Most of the information is not efficiently organized to build knowledge from the stored data. Nowadays, search engines are mainly used by users who rely on their skills to look for the information needed. This paper presents different techniques search engine users can apply in Google Search to improve the relevancy of search results. According to the Pew Research Center, the average person spends eight hours a month searching for the right information. For instance, a company that employs 1000 employees wastes $2.5 million dollars on looking for nonexistent and/or not found information. The cost is very high because decisions are made based on the information that is readily available to use. Whenever the information necessary to formulate an argument is not available or found, poor decisions may be made and mistakes will be more likely to occur. Also, the survey indicates that only 56% of Google users feel confident with their current search skills. Moreover, just 76% of the information that is available on the Internet is accurate.

  5. Digging deeper for new physics in the LHC data

    NASA Astrophysics Data System (ADS)

    Asadi, Pouya; Buckley, Matthew R.; DiFranzo, Anthony; Monteux, Angelo; Shih, David

    2017-11-01

    In this paper, we describe a novel, model-independent technique of "rectangular aggregations" for mining the LHC data for hints of new physics. A typical (CMS) search now has hundreds of signal regions, which can obscure potentially interesting anomalies. Applying our technique to the two CMS jets+MET SUSY searches, we identify a set of previously overlooked ˜ 3 σ excesses. Among these, four excesses survive tests of inter-and intra-search compatibility, and two are especially interesting: they are largely overlappingbetween the jets+MET searches and are characterized by low jet multiplicity, zero b-jets, and low MET and H T . We find that resonant color-triplet production decaying to a quark plus an invisible particle provides an excellent fit to these two excesses and all other data — including the ATLAS jets+MET search, which actually sees a correlated excess. We discuss the additional constraints coming from dijet resonance searches, monojet searches and pair production. Based on these results, we believe the wide-spread view that the LHC data contains no interesting excesses is greatly exaggerated.

  6. The use of phylogeny to interpret cross-cultural patterns in plant use and guide medicinal plant discovery: an example from Pterocarpus (Leguminosae).

    PubMed

    Saslis-Lagoudakis, C Haris; Klitgaard, Bente B; Forest, Félix; Francis, Louise; Savolainen, Vincent; Williamson, Elizabeth M; Hawkins, Julie A

    2011-01-01

    The study of traditional knowledge of medicinal plants has led to discoveries that have helped combat diseases and improve healthcare. However, the development of quantitative measures that can assist our quest for new medicinal plants has not greatly advanced in recent years. Phylogenetic tools have entered many scientific fields in the last two decades to provide explanatory power, but have been overlooked in ethnomedicinal studies. Several studies show that medicinal properties are not randomly distributed in plant phylogenies, suggesting that phylogeny shapes ethnobotanical use. Nevertheless, empirical studies that explicitly combine ethnobotanical and phylogenetic information are scarce. In this study, we borrowed tools from community ecology phylogenetics to quantify significance of phylogenetic signal in medicinal properties in plants and identify nodes on phylogenies with high bioscreening potential. To do this, we produced an ethnomedicinal review from extensive literature research and a multi-locus phylogenetic hypothesis for the pantropical genus Pterocarpus (Leguminosae: Papilionoideae). We demonstrate that species used to treat a certain conditions, such as malaria, are significantly phylogenetically clumped and we highlight nodes in the phylogeny that are significantly overabundant in species used to treat certain conditions. These cross-cultural patterns in ethnomedicinal usage in Pterocarpus are interpreted in the light of phylogenetic relationships. This study provides techniques that enable the application of phylogenies in bioscreening, but also sheds light on the processes that shape cross-cultural ethnomedicinal patterns. This community phylogenetic approach demonstrates that similar ethnobotanical uses can arise in parallel in different areas where related plants are available. With a vast amount of ethnomedicinal and phylogenetic information available, we predict that this field, after further refinement of the techniques, will expand into similar research areas, such as pest management or the search for bioactive plant-based compounds.

  7. Methodological aspects of an adaptive multidirectional pattern search to optimize speech perception using three hearing-aid algorithms

    NASA Astrophysics Data System (ADS)

    Franck, Bas A. M.; Dreschler, Wouter A.; Lyzenga, Johannes

    2004-12-01

    In this study we investigated the reliability and convergence characteristics of an adaptive multidirectional pattern search procedure, relative to a nonadaptive multidirectional pattern search procedure. The procedure was designed to optimize three speech-processing strategies. These comprise noise reduction, spectral enhancement, and spectral lift. The search is based on a paired-comparison paradigm, in which subjects evaluated the listening comfort of speech-in-noise fragments. The procedural and nonprocedural factors that influence the reliability and convergence of the procedure are studied using various test conditions. The test conditions combine different tests, initial settings, background noise types, and step size configurations. Seven normal hearing subjects participated in this study. The results indicate that the reliability of the optimization strategy may benefit from the use of an adaptive step size. Decreasing the step size increases accuracy, while increasing the step size can be beneficial to create clear perceptual differences in the comparisons. The reliability also depends on starting point, stop criterion, step size constraints, background noise, algorithms used, as well as the presence of drifting cues and suboptimal settings. There appears to be a trade-off between reliability and convergence, i.e., when the step size is enlarged the reliability improves, but the convergence deteriorates.
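
    A generic compass/pattern search with an adaptive step size is sketched below. The quadratic "discomfort" function stands in for the paired-comparison listening judgments, and the three dimensions stand in for the noise-reduction, spectral-enhancement and spectral-lift settings; none of this reproduces the paper's procedure.

```python
# Compass (pattern) search with adaptive step size: poll +/- each coordinate,
# expand the step after a successful poll, shrink it after a failed one.
import numpy as np

TARGET = np.array([0.3, -0.2, 0.5])        # hypothetical "most comfortable" setting


def discomfort(x):
    return float(np.sum((np.asarray(x) - TARGET) ** 2))


def pattern_search(x0, step=0.5, expand=1.5, shrink=0.5, tol=1e-3, max_iter=200):
    x, f = np.asarray(x0, dtype=float), discomfort(x0)
    for _ in range(max_iter):
        improved = False
        for i in range(len(x)):            # poll +/- along each coordinate direction
            for sign in (+1, -1):
                trial = x.copy()
                trial[i] += sign * step
                f_trial = discomfort(trial)
                if f_trial < f:
                    x, f, improved = trial, f_trial, True
        # Adaptive step size: grow after success, shrink after a failed poll.
        step = step * expand if improved else step * shrink
        if step < tol:
            break
    return x, f


best, value = pattern_search([0.0, 0.0, 0.0])
print("best setting:", np.round(best, 3), "discomfort:", round(value, 5))
```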

  8. Category-Specific Neural Oscillations Predict Recall Organization During Memory Search

    PubMed Central

    Morton, Neal W.; Kahana, Michael J.; Rosenberg, Emily A.; Baltuch, Gordon H.; Litt, Brian; Sharan, Ashwini D.; Sperling, Michael R.; Polyn, Sean M.

    2013-01-01

    Retrieved-context models of human memory propose that as material is studied, retrieval cues are constructed that allow one to target particular aspects of past experience. We examined the neural predictions of these models by using electrocorticographic/depth recordings and scalp electroencephalography (EEG) to characterize category-specific oscillatory activity, while participants studied and recalled items from distinct, neurally discriminable categories. During study, these category-specific patterns predict whether a studied item will be recalled. In the scalp EEG experiment, category-specific activity during study also predicts whether a given item will be recalled adjacent to other same-category items, consistent with the proposal that a category-specific retrieval cue is used to guide memory search. Retrieved-context models suggest that integrative neural circuitry is involved in the construction and maintenance of the retrieval cue. Consistent with this hypothesis, we observe category-specific patterns that rise in strength as multiple same-category items are studied sequentially, and find that individual differences in this category-specific neural integration during study predict the degree to which a participant will use category information to organize memory search. Finally, we track the deployment of this retrieval cue during memory search: Category-specific patterns are stronger when participants organize their responses according to the category of the studied material. PMID:22875859

  9. Scaling laws of marine predator search behaviour.

    PubMed

    Sims, David W; Southall, Emily J; Humphries, Nicolas E; Hays, Graeme C; Bradshaw, Corey J A; Pitchford, Jonathan W; James, Alex; Ahmed, Mohammed Z; Brierley, Andrew S; Hindell, Mark A; Morritt, David; Musyl, Michael K; Righton, David; Shepard, Emily L C; Wearmouth, Victoria J; Wilson, Rory P; Witt, Matthew J; Metcalfe, Julian D

    2008-02-28

    Many free-ranging predators have to make foraging decisions with little, if any, knowledge of present resource distribution and availability. The optimal search strategy they should use to maximize encounter rates with prey in heterogeneous natural environments remains a largely unresolved issue in ecology. Lévy walks are specialized random walks giving rise to fractal movement trajectories that may represent an optimal solution for searching complex landscapes. However, the adaptive significance of this putative strategy in response to natural prey distributions remains untested. Here we analyse over a million movement displacements recorded from animal-attached electronic tags to show that diverse marine predators (sharks, bony fishes, sea turtles and penguins) exhibit Lévy-walk-like behaviour close to a theoretical optimum. Prey density distributions also display Lévy-like fractal patterns, suggesting response movements by predators to prey distributions. Simulations show that predators have higher encounter rates when adopting Lévy-type foraging in natural-like prey fields compared with purely random landscapes. This is consistent with the hypothesis that observed search patterns are adapted to observed statistical patterns of the landscape. This may explain why Lévy-like behaviour seems to be widespread among diverse organisms, from microbes to humans, as a 'rule' that evolved in response to patchy resource distributions.
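
    To make the contrast in step-length statistics concrete, the sketch below samples power-law (Lévy-like) step lengths by inverse-CDF sampling from a Pareto distribution and compares them with exponentially distributed (Brownian-like) steps. The exponent and scale are illustrative, not fitted to the tag data.

```python
# Power-law vs. exponential step lengths: P(l) ~ l^(-MU) for l >= L_MIN.
import numpy as np

rng = np.random.default_rng(5)
N, MU, L_MIN = 100_000, 2.0, 1.0

u = rng.random(N)
levy_steps = L_MIN * (1.0 - u) ** (-1.0 / (MU - 1.0))   # inverse-CDF sampling
brownian_steps = rng.exponential(scale=L_MIN, size=N)

for name, steps in [("Levy", levy_steps), ("Brownian", brownian_steps)]:
    print(f"{name:9s} mean={steps.mean():8.2f}  "
          f"95th pct={np.percentile(steps, 95):8.2f}  max={steps.max():12.2f}")
```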

  10. Engineering Your Job Search: A Job-Finding Resource for Engineering Professionals.

    ERIC Educational Resources Information Center

    1995

    This guide, which is intended for engineering professionals, explains how to use up-to-date job search techniques to design and conduct an effective job hunt. The first 11 chapters discuss the following steps in searching for a job: handling a job loss; managing time and financial resources while conducting a full-time job search; using objective…

  11. Influence of social presence on eye movements in visual search tasks.

    PubMed

    Liu, Na; Yu, Ruifeng

    2017-12-01

    This study employed an eye-tracking technique to investigate the influence of social presence on eye movements in visual search tasks. A total of 20 male subjects performed visual search tasks in a 2 (target presence: present vs. absent) × 2 (task complexity: complex vs. simple) × 2 (social presence: alone vs. a human audience) within-subject experiment. Results indicated that the presence of an audience could evoke a social facilitation effect on response time in visual search tasks. Compared with working alone, the participants made fewer and shorter fixations, larger saccades and shorter scan path in simple search tasks and more and longer fixations, smaller saccades and longer scan path in complex search tasks when working with an audience. The saccade velocity and pupil diameter in the audience-present condition were larger than those in the working-alone condition. No significant change in target fixation number was observed between two social presence conditions. Practitioner Summary: This study employed an eye-tracking technique to examine the influence of social presence on eye movements in visual search tasks. Results clarified the variation mechanism and characteristics of oculomotor scanning induced by social presence in visual search.

  12. Search for free fractional electric charge elementary particles using an automated millikan oil drop technique

    PubMed

    Halyo; Kim; Lee; Lee; Loomba; Perl

    2000-03-20

    We have carried out a direct search in bulk matter for free fractional electric charge elementary particles using the largest mass single sample ever studied, about 17.4 mg of silicone oil. The search used an improved and highly automated Millikan oil drop technique. No evidence for fractional charge particles was found. The concentration of particles with fractional charge more than 0.16e (e being the magnitude of the electron charge) from the nearest integer charge is less than 4.71×10⁻²² particles per nucleon with 95% confidence.

  13. Four-Dimensional Golden Search

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fenimore, Edward E.

    2015-02-25

    The Golden search technique is a method to search a multiple-dimension space to find the minimum. It basically subdivides the possible ranges of parameters until it brackets, to within an arbitrarily small distance, the minimum. It has the advantages that (1) the function to be minimized can be non-linear, (2) it does not require derivatives of the function, (3) the convergence criterion does not depend on the magnitude of the function. Thus, if the function is a goodness of fit parameter such as chi-square, the convergence does not depend on the noise being correctly estimated or the function correctly following the chi-square statistic. And, (4) the convergence criterion does not depend on the shape of the function. Thus, long shallow surfaces can be searched without the problem of premature convergence. As with many methods, the Golden search technique can be confused by surfaces with multiple minima.
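
    For reference, the classic one-dimensional golden-section bracketing step is sketched below; a multi-dimensional version can apply this bracketing along each parameter in turn. This is a textbook illustration, not the author's four-dimensional code.

```python
# Golden-section search in 1-D: shrink the bracket [a, b] by the golden ratio
# each iteration, without derivatives and independent of the function's scale.
import math

INV_PHI = (math.sqrt(5) - 1) / 2          # 1/phi, about 0.618


def golden_section_minimize(f, a, b, tol=1e-6):
    c, d = b - INV_PHI * (b - a), a + INV_PHI * (b - a)
    fc, fd = f(c), f(d)
    while (b - a) > tol:
        if fc < fd:                        # minimum lies in [a, d]
            b, d, fd = d, c, fc
            c = b - INV_PHI * (b - a)
            fc = f(c)
        else:                              # minimum lies in [c, b]
            a, c, fc = c, d, fd
            d = a + INV_PHI * (b - a)
            fd = f(d)
    return (a + b) / 2


# Example: minimize a chi-square-like 1-D profile.
chi2 = lambda x: (x - 1.7) ** 2 + 3.0
print("minimum near x =", round(golden_section_minimize(chi2, 0.0, 5.0), 6))
```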

  14. Investigating the Non-Covalent Functionalization and Chemical Transformation of Graphene

    NASA Astrophysics Data System (ADS)

    Sham, Chun-Hong

    The trend toward device miniaturization demands the capability to produce rationally designed patterns at ever-shrinking length scales. The research community has examined various techniques to push current lithography resolution to the sub-10 nm scale. One idea is to utilize the natural nanoscale patterns of molecular assemblies. In this thesis, the self-assembly of a photoactive molecule on epitaxial graphene (EG) grown on SiC was discussed. This molecular assembly enables manipulation of chemical contrast at the nanoscale through UV exposure or atomic layer deposition. Future development of the nanoelectronics industry will be fueled by innovations in electronic materials, which could be discovered through covalent modification of graphene. In a study reported in this thesis, silicon is deposited onto EG. After annealing, a new surface reconstruction, identified as (3x3)-SiC, was formed. Raman spectroscopy finds no signature of graphene after annealing, indicating a complete chemical transformation of the graphene. DFT calculations reveal a possible conversion mechanism. Overall, these studies provide insights for future device miniaturization, contribute to the search for novel materials, and help bridge the gap between graphene and current silicon-based industrial infrastructure.

  15. Service patterns related to successful employment outcomes of persons with traumatic brain injury in vocational rehabilitation.

    PubMed

    Catalano, Denise; Pereira, Ana Paula; Wu, Ming-Yi; Ho, Hanson; Chan, Fong

    2006-01-01

    This study analyzed the Rehabilitation Services Administration (RSA) case service report (RSA-911) data for fiscal year 2004 to examine effects of demographic characteristics, work disincentives, and vocational rehabilitation services patterns on employment outcomes of persons with traumatic brain injuries (TBI). The results indicated that European Americans (53%) had appreciably higher competitive employment rates than Native American (50%), Asian Americans (44%), African Americans (42%), and Hispanic/Latino Americans (41%). Clients without co-occurring psychiatric disabilities had a higher employment rate (51%) than those with psychiatric disabilities (45%). Clients without work disincentives showed better employment outcomes (58%) than those with disincentives (45%). An important finding from this analysis was the central role of job search assistance, job placement assistance, and on-the-job support services for persons with TBI in predicting employment outcomes. A data mining technique, the exhaustive CHAID analysis, was used to examine the interaction effects of race, gender, work disincentives and service variables on employment outcomes. The results indicated that the TBI clients in this study could be segmented into 29 homogeneous subgroups with employment rates ranging from a low of 11% to a high of 82%, and these differences can be explained by differences in work disincentives, race, and rehabilitation service patterns.

  16. Is Google Trends a reliable tool for digital epidemiology? Insights from different clinical settings.

    PubMed

    Cervellin, Gianfranco; Comelli, Ivan; Lippi, Giuseppe

    2017-09-01

    Internet-derived information has been recently recognized as a valuable tool for epidemiological investigation. Google Trends, a Google Inc. portal, generates data on geographical and temporal patterns according to specified keywords. The aim of this study was to compare the reliability of Google Trends in different clinical settings, for both common diseases with lower media coverage, and for less common diseases attracting major media coverage. We carried out a search in Google Trends using the keywords "renal colic", "epistaxis", and "mushroom poisoning", selected on the basis of available and reliable epidemiological data. Besides this search, we carried out a second search for three clinical conditions (i.e., "meningitis", "Legionella Pneumophila pneumonia", and "Ebola fever"), which recently received major focus by the Italian media. In our analysis, no correlation was found between data captured from Google Trends and epidemiology of renal colics, epistaxis and mushroom poisoning. Only when searching for the term "mushroom" alone the Google Trends search generated a seasonal pattern which almost overlaps with the epidemiological profile, but this was probably mostly due to searches for harvesting and cooking rather than to for poisoning. The Google Trends data also failed to reflect the geographical and temporary patterns of disease for meningitis, Legionella Pneumophila pneumonia and Ebola fever. The results of our study confirm that Google Trends has modest reliability for defining the epidemiology of relatively common diseases with minor media coverage, or relatively rare diseases with higher audience. Overall, Google Trends seems to be more influenced by the media clamor than by true epidemiological burden. Copyright © 2017 Ministry of Health, Saudi Arabia. Published by Elsevier Ltd. All rights reserved.

  17. Wanna know about vaping? Patterns of message exposure, seeking and sharing information about e-cigarettes across media platforms.

    PubMed

    Emery, Sherry L; Vera, Lisa; Huang, Jidong; Szczypka, Glen

    2014-07-01

    Awareness and use of electronic cigarettes has rapidly grown in the USA recently, in step with increased product marketing. Using responses to a population survey of US adults, we analysed demographic patterns of exposure to, searching for and sharing of e-cigarette-related information across media platforms. An online survey of 17,522 US adults was conducted in 2013. The nationally representative sample was drawn from GfK Group's KnowledgePanel plus off-panel recruitment. Fixed effects logit models were applied to analyse relationships between exposure to, searching for and sharing of e-cigarette-related information and demographic characteristics, e-cigarette and tobacco use, and media behaviours. High levels of awareness about e-cigarettes were indicated (86% aware; 47% heard through media channels). Exposure to e-cigarette-related information was associated with tobacco use, age, gender, more education, social media use and time spent online. Although relatively small proportions of the sample had searched for (∼5%) or shared (∼2%) e-cigarette information, our analyses indicated demographic patterns to those behaviours. Gender, high income and using social media were associated with searching for e-cigarette information; lesbian, gay and bisexual and less education were associated with sharing. Current tobacco use, age, being Hispanic and time spent online were associated with both searching and sharing. US adults are widely exposed to e-cigarette marketing through the media; such marketing may differentially target specific demographic groups. Further research should longitudinally examine how exposure to, searching for and sharing of e-cigarette information relate to subsequent use of e-cigarettes and/or combustible tobacco. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.

  18. Wanna know about vaping? Patterns of message exposure, seeking and sharing information about e-cigarettes across media platforms

    PubMed Central

    Emery, Sherry L; Vera, Lisa; Huang, Jidong; Szczypka, Glen

    2014-01-01

    Background Awareness and use of electronic cigarettes has rapidly grown in the USA recently, in step with increased product marketing. Using responses to a population survey of US adults, we analysed demographic patterns of exposure to, searching for and sharing of e-cigarette-related information across media platforms. Methods An online survey of 17 522 US adults was conducted in 2013. The nationally representative sample was drawn from GfK Group's KnowledgePanel plus off-panel recruitment. Fixed effects logit models were applied to analyse relationships between exposure to, searching for and sharing of e-cigarette-related information and demographic characteristics, e-cigarette and tobacco use, and media behaviours. Results High levels of awareness about e-cigarettes were indicated (86% aware; 47% heard through media channels). Exposure to e-cigarette-related information was associated with tobacco use, age, gender, more education, social media use and time spent online. Although relatively small proportions of the sample had searched for (∼5%) or shared (∼2%) e-cigarette information, our analyses indicated demographic patterns to those behaviours. Gender, high income and using social media were associated with searching for e-cigarette information; lesbian, gay and bisexual and less education were associated with sharing. Current tobacco use, age, being Hispanic and time spent online were associated with both searching and sharing. Conclusions US adults are widely exposed to e-cigarette marketing through the media; such marketing may differentially target specific demographic groups. Further research should longitudinally examine how exposure to, searching for and sharing of e-cigarette information relate to subsequent use of e-cigarettes and/or combustible tobacco. PMID:24935893

  19. GEsture: an online hand-drawing tool for gene expression pattern search.

    PubMed

    Wang, Chunyan; Xu, Yiqing; Wang, Xuelin; Zhang, Li; Wei, Suyun; Ye, Qiaolin; Zhu, Youxiang; Yin, Hengfu; Nainwal, Manoj; Tanon-Reyes, Luis; Cheng, Feng; Yin, Tongming; Ye, Ning

    2018-01-01

    Gene expression profiling data provide useful information for the investigation of biological function and processes. However, identifying a specific expression pattern in extensive time-series gene expression data is not an easy task. Clustering is often used to group genes with similar expression; however, genes with a 'desirable' or 'user-defined' pattern cannot be efficiently detected by clustering methods. To address these limitations, we developed an online tool called GEsture. Users can draw a curve using a mouse instead of inputting abstract clustering parameters. Taking the drawn gene expression curve as input, GEsture searches time-series datasets for genes showing similar, opposite and time-delayed expression patterns. We present three examples that illustrate the capacity of GEsture for gene hunting while following users' requirements. GEsture also provides visualization tools (such as expression pattern figures, heat maps and correlation networks) to display the search results. The outputs may provide useful information for researchers to understand the targets, function and biological processes of the genes involved.
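
    As a rough illustration of the kind of matching such a tool performs (not GEsture's actual implementation), the sketch below scores time-series expression profiles against a drawn curve with Pearson correlation; lagged correlations stand in for time-delayed patterns, and the absolute value lets opposite patterns score highly as well. Array names, the lag limit and the toy data are assumptions.

    ```python
    import numpy as np

    def match_curve(drawn, expression, max_lag=2):
        """drawn: 1D array sampled at the experiment's time points;
        expression: (n_genes, n_timepoints) matrix; returns a per-gene score."""
        scores = []
        for profile in expression:
            best = 0.0
            for lag in range(-max_lag, max_lag + 1):
                if lag >= 0:
                    a, b = drawn[lag:], profile[:len(profile) - lag]
                else:
                    a, b = drawn[:lag], profile[-lag:]
                if len(a) > 2 and np.std(a) > 0 and np.std(b) > 0:
                    r = np.corrcoef(a, b)[0, 1]
                    best = max(best, abs(r))  # |r| near 1 covers similar and opposite patterns
            scores.append(best)
        return np.array(scores)

    # usage: rank genes by similarity to a toy drawn curve
    drawn = np.array([0.1, 0.5, 0.9, 0.7, 0.3])
    expr = np.random.rand(100, 5)
    top = np.argsort(match_curve(drawn, expr))[::-1][:10]
    print(top)
    ```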

  20. Hybrid Stochastic Search Technique based Suboptimal AGC Regulator Design for Power System using Constrained Feedback Control Strategy

    NASA Astrophysics Data System (ADS)

    Ibraheem; Omveer; Hasan, N.

    2010-10-01

    A new hybrid stochastic search technique is proposed for the design of a suboptimal AGC regulator for a two-area interconnected non-reheat thermal power system incorporating a DC link in parallel with the AC tie-line. The regulator is based on a hybrid of a Genetic Algorithm (GA) and Simulated Annealing (SA). This GA-SA hybrid has been successfully applied to constrained feedback control problems where other PI-based techniques have often failed. The main idea in this scheme is to seek a feasible PI-based suboptimal solution at each sampling time: the feasible solution decreases the cost function rather than minimizing it.
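
    The abstract does not give the GASA algorithm itself; the following is only a generic sketch of how a genetic-algorithm population update can be combined with a simulated-annealing acceptance rule. The fitness function, operators and parameter values are illustrative assumptions, not the authors' design.

    ```python
    import random, math

    def hybrid_ga_sa(fitness, bounds, pop_size=30, generations=200, t0=1.0, cooling=0.95):
        pop = [[random.uniform(*b) for b in bounds] for _ in range(pop_size)]
        temp = t0
        best = min(pop, key=fitness)
        for _ in range(generations):
            children = []
            for _ in range(pop_size):
                p1, p2 = random.sample(pop, 2)
                # blend crossover plus Gaussian mutation, clipped to the bounds
                child = [(a + b) / 2 + random.gauss(0, 0.1 * (hi - lo))
                         for a, b, (lo, hi) in zip(p1, p2, bounds)]
                children.append([min(max(x, lo), hi) for x, (lo, hi) in zip(child, bounds)])
            # SA-style acceptance: a worse child survives with probability exp(-dE/T)
            next_pop = []
            for parent, child in zip(pop, children):
                dE = fitness(child) - fitness(parent)
                next_pop.append(child if dE < 0 or random.random() < math.exp(-dE / temp) else parent)
            pop = next_pop
            temp *= cooling
            best = min(pop + [best], key=fitness)
        return best

    # usage: minimize a simple quadratic cost in two decision variables
    print(hybrid_ga_sa(lambda x: (x[0] - 3) ** 2 + (x[1] + 1) ** 2,
                       bounds=[(-10, 10), (-10, 10)]))
    ```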

  1. Multidimensional scaling for evolutionary algorithms--visualization of the path through search space and solution space using Sammon mapping.

    PubMed

    Pohlheim, Hartmut

    2006-01-01

    Multidimensional scaling is presented as a technique for displaying high-dimensional data with standard visualization techniques. The technique used is often known as Sammon mapping. We explain the mathematical foundations of multidimensional scaling and its robust calculation. We also demonstrate the use of this technique in the area of evolutionary algorithms. First, we present the visualization of the path through the search space of the best individuals during an optimization run. We then apply multidimensional scaling to the comparison of multiple runs regarding the variables of individuals and multi-criteria objective values (path through the solution space).
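
    For readers unfamiliar with Sammon mapping, the sketch below shows one common formulation, plain gradient descent on the Sammon stress, projecting points (for example, the best individuals of an optimization run) into two dimensions. It is an illustrative implementation under simple assumptions, not the author's code.

    ```python
    import numpy as np

    def sammon(X, n_iter=500, lr=0.3, eps=1e-9):
        # pairwise distances in the original high-dimensional space
        D = np.sqrt(((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)) + eps
        Y = np.random.default_rng(0).normal(size=(X.shape[0], 2))  # 2D layout
        c = D[np.triu_indices_from(D, 1)].sum()
        for _ in range(n_iter):
            d = np.sqrt(((Y[:, None, :] - Y[None, :, :]) ** 2).sum(-1)) + eps
            # gradient of the Sammon stress with respect to each 2D point
            W = (D - d) / (D * d)
            np.fill_diagonal(W, 0.0)
            grad = -2.0 / c * (W[:, :, None] * (Y[:, None, :] - Y[None, :, :])).sum(axis=1)
            Y -= lr * grad
        return Y

    # usage: project a random 10-dimensional point cloud to 2D
    Y2 = sammon(np.random.default_rng(1).normal(size=(50, 10)))
    print(Y2[:3])
    ```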

  2. Data Recommender: An Alternative Way to Discover Open Scientific Datasets

    NASA Astrophysics Data System (ADS)

    Klump, J. F.; Devaraju, A.; Williams, G.; Hogan, D.; Davy, R.; Page, J.; Singh, D.; Peterson, N.

    2017-12-01

    Over the past few years, institutions and government agencies have adopted policies to openly release their data, which has resulted in huge amounts of open data becoming available on the web. When trying to discover the data, users face two challenges: an overload of choice and the limitations of the existing data search tools. On the one hand, there are too many datasets to choose from, and therefore, users need to spend considerable effort to find the datasets most relevant to their research. On the other hand, data portals commonly offer keyword and faceted search, which depend fully on the user queries to search and rank relevant datasets. Consequently, keyword and faceted search may return loosely related or irrelevant results, although the results may contain the query terms. They may also return highly specific results that depend more on how well metadata was authored. They do not account well for variance in metadata due to variance in author styles and preferences. The top-ranked results may also come from the same data collection, and users are unlikely to discover new and interesting datasets. These search modes mainly suit users who can express their information needs in terms of the structure and terminology of the data portals, but may pose a challenge otherwise. The above challenges reflect that we need a solution that delivers the most relevant (i.e., similar and serendipitous) datasets to users, beyond the existing search functionalities on the portals. A recommender system is an information filtering system that presents users with relevant and interesting content based on users' context and preferences. Delivering data recommendations to users can make data discovery easier, and as a result may enhance user engagement with the portal. We developed a hybrid data recommendation approach for the CSIRO Data Access Portal. The approach leverages existing recommendation techniques (e.g., content-based filtering and item co-occurrence) to produce similar and serendipitous data recommendations. It measures the relevance between datasets based on their properties, and search and download patterns. We evaluated the recommendation approach in a user study, and the obtained user judgments revealed the ability of the approach to accurately quantify the relevance of the datasets.
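
    A minimal sketch of the content-based-filtering side of such a recommender is shown below: dataset metadata text is vectorized with TF-IDF and datasets are ranked by cosine similarity. The dataset identifiers and descriptions are invented, and the co-occurrence signals from search and download logs used in the actual hybrid approach are omitted.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.metrics.pairwise import cosine_similarity

    datasets = {
        "ds1": "coastal water temperature time series Tasmania",
        "ds2": "sea surface temperature satellite observations",
        "ds3": "soil moisture grids agricultural regions",
    }
    ids = list(datasets)
    tfidf = TfidfVectorizer(stop_words="english").fit_transform(datasets.values())
    sim = cosine_similarity(tfidf)

    def recommend(dataset_id, k=2):
        i = ids.index(dataset_id)
        ranked = sorted(((sim[i, j], ids[j]) for j in range(len(ids)) if j != i), reverse=True)
        return [d for _, d in ranked[:k]]

    print(recommend("ds1"))  # 'ds2' should rank first because of shared temperature terms
    ```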

  3. Discovering biomedical semantic relations in PubMed queries for information retrieval and database curation

    PubMed Central

    Huang, Chung-Chi; Lu, Zhiyong

    2016-01-01

    Identifying relevant papers from the literature is a common task in biocuration. Most current biomedical literature search systems primarily rely on matching user keywords. Semantic search, on the other hand, seeks to improve search accuracy by understanding the entities and contextual relations in user keywords. However, past research has mostly focused on semantically identifying biological entities (e.g. chemicals, diseases and genes) with little effort on discovering semantic relations. In this work, we aim to discover biomedical semantic relations in PubMed queries in an automated and unsupervised fashion. Specifically, we focus on extracting and understanding the contextual information (or context patterns) that is used by PubMed users to represent semantic relations between entities such as ‘CHEMICAL-1 compared to CHEMICAL-2.’ With the advances in automatic named entity recognition, we first tag entities in PubMed queries and then use tagged entities as knowledge to recognize pattern semantics. More specifically, we transform PubMed queries into context patterns involving participating entities, which are subsequently projected to latent topics via latent semantic analysis (LSA) to avoid the data sparseness and specificity issues. Finally, we mine semantically similar contextual patterns or semantic relations based on LSA topic distributions. Our two separate evaluation experiments of chemical-chemical (CC) and chemical–disease (CD) relations show that the proposed approach significantly outperforms a baseline method, which simply measures pattern semantics by similarity in participating entities. The highest performance achieved by our approach is nearly 0.9 and 0.85 respectively for the CC and CD task when compared against the ground truth in terms of normalized discounted cumulative gain (nDCG), a standard measure of ranking quality. These results suggest that our approach can effectively identify and return related semantic patterns in a ranked order covering diverse bio-entity relations. To assess the potential utility of our automated top-ranked patterns of a given relation in semantic search, we performed a pilot study on frequently sought semantic relations in PubMed and observed improved literature retrieval effectiveness based on post-hoc human relevance evaluation. Further investigation in larger tests and in real-world scenarios is warranted. PMID:27016698
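
    The sketch below illustrates the core projection step under simple assumptions: a few invented entity-tagged context patterns are vectorized, reduced to latent topics with truncated SVD (LSA), and compared by cosine similarity, so that patterns expressing the same relation land close together. It is not the authors' pipeline or data.

    ```python
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.decomposition import TruncatedSVD
    from sklearn.metrics.pairwise import cosine_similarity

    patterns = [
        "CHEMICAL compared to CHEMICAL",
        "CHEMICAL versus CHEMICAL",
        "CHEMICAL induced DISEASE",
        "DISEASE treated with CHEMICAL",
    ]
    X = TfidfVectorizer(ngram_range=(1, 2)).fit_transform(patterns)
    topics = TruncatedSVD(n_components=2, random_state=0).fit_transform(X)  # LSA projection
    sim = cosine_similarity(topics)
    # patterns 0 and 1 express the same comparison relation and should be close in topic space
    print(sim[0, 1], sim[0, 2])
    ```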

  4. IMAAAGINE: a webserver for searching hypothetical 3D amino acid side chain arrangements in the Protein Data Bank

    PubMed Central

    Nadzirin, Nurul; Willett, Peter; Artymiuk, Peter J.; Firdaus-Raih, Mohd

    2013-01-01

    We describe a server that allows the interrogation of the Protein Data Bank for hypothetical 3D side chain patterns that are not limited to known patterns from existing 3D structures. A minimal side chain description allows a variety of side chain orientations to exist within the pattern, and generic side chain types such as acid, base and hydroxyl-containing can be additionally deployed in the search query. Moreover, only a subset of distances between the side chains need be specified. We illustrate these capabilities in case studies involving arginine stacks, serine-acid group arrangements and multiple catalytic triad-like configurations. The IMAAAGINE server can be accessed at http://mfrlab.org/grafss/imaaagine/. PMID:23716645

  5. The history of couple therapy: a millennial review.

    PubMed

    Gurman, Alan S; Fraenkel, Peter

    2002-01-01

    In this article, we review the major conceptual and clinical influences and trends in the history of couple therapy to date, and also chronicle the history of research on couple therapy. The evolving patterns in theory and practice are reviewed as having progressed through four distinctive phases: Phase I--Atheoretical Marriage Counseling Formation (1930-1963); Phase II--Psychoanalytic Experimentation (1931-1966); Phase III--Family Therapy Incorporation (1963-1985); and Phase IV--Refinement, Extension, Diversification, and Integration (1986-present). The history of research in the field is described as having passed through three phases: Phase I--A Technique in Search of Some Data (1930-1974), Phase II--Irrational(?) Exuberance (1975-1992), and Phase III--Caution and Extension (1993-present). The article concludes with the identification of Four Great Historical Ironies in the History of Couple Therapy.

  6. System of Indicators in the Innovation Management: Business Intelligence Applied to Tourism

    NASA Astrophysics Data System (ADS)

    Lozada, Dayana; Araque, Francisco; Castillo, Jose Manuel; Salguero, Alberto; Delgado, Cecilia; Noda, Marcia; Hernández, Gilberto

    The work presents an approach to studying mechanisms that allow managers to measure Innovation Management (IM). Its main motivation is the analysis of patterns for the design of an integral system of indicators. A methodology that integrates the thought process, focusing on Business Intelligence and the Balanced Scorecard, is presented. A group of indexes based on the multidimensionality of IM in tourism-sector organizations is proposed. To address this, the theories, models and systems used in our approach must be contextualized to the conditions of sectoral operation. Intervention methods such as expert criteria, consensus-search techniques based on surveys, document consultation, and statistical methods such as principal component analysis were used.

  7. Port closure techniques.

    PubMed

    Shaher, Z

    2007-08-01

    Laparoscopic trocars do create wounds. This article aims to review and list different techniques used for closure of the fascia incision at trocar sites. A literature search was performed for articles dealing with closure techniques. The author searched this subject in English on Medline by combining the words "trocar" and "hernia," as well as "Deschamps" and "Reverdin." All articles reporting techniques with their references were reviewed. The articles described many techniques in addition to classical closure using curved needles, including Grice needle, Maciol needles, Endoclose device, Carter-Thomason device, Tahoe ligature device, Endo-Judge device, eXit puncture closure device, Lowsley retractor, spinal cord needles, dual hemostat, suture carrier, Reverdin and Deschamps needles, and Gore-Tex closure device. Three main groups of techniques were found, with extracorporeal manipulation under direct visualization being favored. Old methods are sufficient and cost-effective.

  8. Decision-making in information seeking on texts: an eye-fixation-related potentials investigation.

    PubMed

    Frey, Aline; Ionescu, Gelu; Lemaire, Benoit; López-Orozco, Francisco; Baccino, Thierry; Guérin-Dugué, Anne

    2013-01-01

    Reading on a web page is known to be non-linear, and people need to make fast decisions about whether or not to stop reading. In such a context, reading and decision-making processes are intertwined, and this experiment attempts to separate them through electrophysiological patterns provided by the Eye-Fixation-Related Potentials (EFRPs) technique. We conducted an experiment in which EFRPs were recorded while participants read blocks of text that were semantically highly related, moderately related, and unrelated to a given goal. Participants had to decide as fast as possible whether or not the text was related to the semantic goal given at a prior stage. Decision making (stopping the information search) may occur when the paragraph is highly related to the goal (positive decision) or when it is unrelated to the goal (negative decision). EFRPs were analyzed on and around typical eye fixations: either on words belonging to the goal (target), subject to a high rate of positive decisions, or on low-frequency unrelated words (incongruent), subject to a high rate of negative decisions. In both cases, we found EFRP-specific patterns (amplitude peaking between 51 and 120 ms after fixation onset) spreading out over the next words following the goal word and the second fixation after an incongruent word, in parietal and occipital areas. We interpreted these results as delayed late components (P3b and N400), reflecting the decision to stop information searching. Indeed, we show a clear spill-over effect: the effect on word N spreads to words N + 1 and N + 2.

  9. A search for stratiform massive-sulfide exploration targets in Appalachian Devonian rocks; a case study using computer-assisted attribute-coincidence mapping

    USGS Publications Warehouse

    Wedow, Helmuth

    1983-01-01

    The empirical model for sediment-associated, stratiform, exhalative, massive-sulfide deposits presented by D. Large in 1979 and 1980 has been redesigned to permit its use in a computer-assisted search for exploration-target areas in Devonian rocks of the Appalachian region using attribute-coincidence mapping (ACM). Some 36 gridded-data maps and selected maps derived therefrom were developed to show the orthogonal patterns, using the 7-1/2 minute quadrangle as an information cell, of geologic data patterns relevant to the empirical model. From these map and data files, six attribute-coincidence maps were prepared to illustrate both variation in the application of ACM techniques and the extent of possible significant exploration-target areas. As a result of this preliminary work in ACM, four major (and some lesser) exploration-target areas needing further study and analysis have been defined as follows: 1) in western and central New York in the outcrop area of lowermost Upper Devonian rocks straddling the Clarendon-Linden fault; 2) in western Virginia and eastern West Virginia in an area largely coincident with the well-known 'Oriskany' Mn-Fe ores; 3) an area in West Virginia, Maryland, and Virginia along and nearby the trend of the Alabama-New York lineament of King and Zietz approximately between 38- and 40-degrees N. latitude; and 4) an area in northeastern Ohio overlying an area coincident with a significant thickness of Silurian salt and high modern seismic activity. Some lesser, smaller areas suggested by relatively high coincidence may also be worthy of further study.
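
    Attribute-coincidence mapping can be pictured as stacking binary attribute layers over the quadrangle grid and counting, per cell, how many attributes of the empirical model coincide. The sketch below uses random layers, uniform weights and an arbitrary threshold purely for illustration; it is not the study's data or weighting scheme.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_attributes, rows, cols = 6, 20, 30
    layers = rng.random((n_attributes, rows, cols)) > 0.7   # 1 = attribute present in that cell
    weights = np.ones(n_attributes)                          # could reflect model importance

    # coincidence map = per-cell (optionally weighted) count of favourable attributes
    coincidence = np.tensordot(weights, layers.astype(float), axes=1)
    targets = np.argwhere(coincidence >= 4)                  # cells with high coincidence
    print(f"{len(targets)} candidate cells out of {rows * cols}")
    ```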

  10. Decision-making in information seeking on texts: an eye-fixation-related potentials investigation

    PubMed Central

    Frey, Aline; Ionescu, Gelu; Lemaire, Benoit; López-Orozco, Francisco; Baccino, Thierry; Guérin-Dugué, Anne

    2013-01-01

    Reading on a web page is known to be non-linear, and people need to make fast decisions about whether or not to stop reading. In such a context, reading and decision-making processes are intertwined, and this experiment attempts to separate them through electrophysiological patterns provided by the Eye-Fixation-Related Potentials (EFRPs) technique. We conducted an experiment in which EFRPs were recorded while participants read blocks of text that were semantically highly related, moderately related, and unrelated to a given goal. Participants had to decide as fast as possible whether or not the text was related to the semantic goal given at a prior stage. Decision making (stopping the information search) may occur when the paragraph is highly related to the goal (positive decision) or when it is unrelated to the goal (negative decision). EFRPs were analyzed on and around typical eye fixations: either on words belonging to the goal (target), subject to a high rate of positive decisions, or on low-frequency unrelated words (incongruent), subject to a high rate of negative decisions. In both cases, we found EFRP-specific patterns (amplitude peaking between 51 and 120 ms after fixation onset) spreading out over the next words following the goal word and the second fixation after an incongruent word, in parietal and occipital areas. We interpreted these results as delayed late components (P3b and N400), reflecting the decision to stop information searching. Indeed, we show a clear spill-over effect: the effect on word N spreads to words N + 1 and N + 2. PMID:23966913

  11. Supervised learning of tools for content-based search of image databases

    NASA Astrophysics Data System (ADS)

    Delanoy, Richard L.

    1996-03-01

    A computer environment, called the Toolkit for Image Mining (TIM), is being developed with the goal of enabling users with diverse interests and varied computer skills to create search tools for content-based image retrieval and other pattern matching tasks. Search tools are generated using a simple paradigm of supervised learning that is based on the user pointing at mistakes of classification made by the current search tool. As mistakes are identified, a learning algorithm uses the identified mistakes to build up a model of the user's intentions, construct a new search tool, apply the search tool to a test image, display the match results as feedback to the user, and accept new inputs from the user. Search tools are constructed in the form of functional templates, which are generalized matched filters capable of knowledge-based image processing. The ability of this system to learn the user's intentions from experience contrasts with other existing approaches to content-based image retrieval that base searches on the characteristics of a single input example or on a predefined and semantically-constrained textual query. Currently, TIM is capable of learning spectral and textural patterns, but should be adaptable to the learning of shapes, as well. Possible applications of TIM include not only content-based image retrieval, but also quantitative image analysis, the generation of metadata for annotating images, data prioritization or data reduction in bandwidth-limited situations, and the construction of components for larger, more complex computer vision algorithms.

  12. Search for dark matter with the bolometric technique

    NASA Astrophysics Data System (ADS)

    Giuliani, Andrea

    2014-07-01

    After a concise introduction about the dark matter issue and a discussion of the problematics related to its direct detection, the bolometric technique is presented in this context, with a special focus on double-readout devices. The bolometric experiments for the search for dark matter are then described and reviewed. Their present and future roles are discussed, arguing about pros and cons of this technology.

  13. GeNemo: a search engine for web-based functional genomic data.

    PubMed

    Zhang, Yongqing; Cao, Xiaoyi; Zhong, Sheng

    2016-07-08

    A set of new data types emerged from functional genomic assays, including ChIP-seq, DNase-seq, FAIRE-seq and others. The results are typically stored as genome-wide intensities (WIG/bigWig files) or functional genomic regions (peak/BED files). These data types present new challenges to big data science. Here, we present GeNemo, a web-based search engine for functional genomic data. GeNemo searches user-input data against online functional genomic datasets, including the entire collection of ENCODE and mouse ENCODE datasets. Unlike text-based search engines, GeNemo's searches are based on pattern matching of functional genomic regions. This distinguishes GeNemo from text or DNA sequence searches. The user can input any complete or partial functional genomic dataset, for example, a binding intensity file (bigWig) or a peak file. GeNemo reports any genomic regions, ranging from hundred bases to hundred thousand bases, from any of the online ENCODE datasets that share similar functional (binding, modification, accessibility) patterns. This is enabled by a Markov Chain Monte Carlo-based maximization process, executed on up to 24 parallel computing threads. By clicking on a search result, the user can visually compare her/his data with the found datasets and navigate the identified genomic regions. GeNemo is available at www.genemo.org. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.

  14. Searching for pulsars using image pattern recognition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhu, W. W.; Berndsen, A.; Madsen, E. C.

    In the modern era of big data, many fields of astronomy are generating huge volumes of data, the analysis of which can sometimes be the limiting factor in research. Fortunately, computer scientists have developed powerful data-mining techniques that can be applied to various fields. In this paper, we present a novel artificial intelligence (AI) program that identifies pulsars from recent surveys by using image pattern recognition with deep neural nets—the PICS (Pulsar Image-based Classification System) AI. The AI mimics human experts and distinguishes pulsars from noise and interference by looking for patterns from candidate plots. Different from other pulsar selection programs that search for expected patterns, the PICS AI is taught the salient features of different pulsars from a set of human-labeled candidates through machine learning. The training candidates are collected from the Pulsar Arecibo L-band Feed Array (PALFA) survey. The information from each pulsar candidate is synthesized in four diagnostic plots, which consist of image data with up to thousands of pixels. The AI takes these data from each candidate as its input and uses thousands of such candidates to train its ∼9000 neurons. The deep neural networks in this AI system grant it superior ability to recognize various types of pulsars as well as their harmonic signals. The trained AI's performance has been validated with a large set of candidates from a different pulsar survey, the Green Bank North Celestial Cap survey. In this completely independent test, the PICS ranked 264 out of 277 pulsar-related candidates, including all 56 previously known pulsars and 208 of their harmonics, in the top 961 (1%) of 90,008 test candidates, missing only 13 harmonics. The first non-pulsar candidate appears at rank 187, following 45 pulsars and 141 harmonics. In other words, 100% of the pulsars were ranked in the top 1% of all candidates, while 80% were ranked higher than any noise or interference. The performance of this system can be improved over time as more training data are accumulated. This AI system has been integrated into the PALFA survey pipeline and has discovered six new pulsars to date.
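
    PICS combines several deep networks over four diagnostic plot types; as a much-simplified illustration only, the sketch below (assuming the PyTorch library and an invented 64x64 plot size) shows a small convolutional classifier that maps a single candidate plot image to pulsar/non-pulsar logits.

    ```python
    import torch
    import torch.nn as nn

    class CandidateCNN(nn.Module):
        def __init__(self):
            super().__init__()
            self.features = nn.Sequential(
                nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            )
            self.classifier = nn.Sequential(
                nn.Flatten(), nn.Linear(16 * 16 * 16, 64), nn.ReLU(), nn.Linear(64, 2),
            )

        def forward(self, x):              # x: (batch, 1, 64, 64) candidate plot images
            return self.classifier(self.features(x))

    logits = CandidateCNN()(torch.randn(4, 1, 64, 64))  # pulsar vs non-pulsar scores
    print(logits.shape)
    ```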

  15. Searching for Pulsars Using Image Pattern Recognition

    NASA Astrophysics Data System (ADS)

    Zhu, W. W.; Berndsen, A.; Madsen, E. C.; Tan, M.; Stairs, I. H.; Brazier, A.; Lazarus, P.; Lynch, R.; Scholz, P.; Stovall, K.; Ransom, S. M.; Banaszak, S.; Biwer, C. M.; Cohen, S.; Dartez, L. P.; Flanigan, J.; Lunsford, G.; Martinez, J. G.; Mata, A.; Rohr, M.; Walker, A.; Allen, B.; Bhat, N. D. R.; Bogdanov, S.; Camilo, F.; Chatterjee, S.; Cordes, J. M.; Crawford, F.; Deneva, J. S.; Desvignes, G.; Ferdman, R. D.; Freire, P. C. C.; Hessels, J. W. T.; Jenet, F. A.; Kaplan, D. L.; Kaspi, V. M.; Knispel, B.; Lee, K. J.; van Leeuwen, J.; Lyne, A. G.; McLaughlin, M. A.; Siemens, X.; Spitler, L. G.; Venkataraman, A.

    2014-02-01

    In the modern era of big data, many fields of astronomy are generating huge volumes of data, the analysis of which can sometimes be the limiting factor in research. Fortunately, computer scientists have developed powerful data-mining techniques that can be applied to various fields. In this paper, we present a novel artificial intelligence (AI) program that identifies pulsars from recent surveys by using image pattern recognition with deep neural nets—the PICS (Pulsar Image-based Classification System) AI. The AI mimics human experts and distinguishes pulsars from noise and interference by looking for patterns from candidate plots. Different from other pulsar selection programs that search for expected patterns, the PICS AI is taught the salient features of different pulsars from a set of human-labeled candidates through machine learning. The training candidates are collected from the Pulsar Arecibo L-band Feed Array (PALFA) survey. The information from each pulsar candidate is synthesized in four diagnostic plots, which consist of image data with up to thousands of pixels. The AI takes these data from each candidate as its input and uses thousands of such candidates to train its ~9000 neurons. The deep neural networks in this AI system grant it superior ability to recognize various types of pulsars as well as their harmonic signals. The trained AI's performance has been validated with a large set of candidates from a different pulsar survey, the Green Bank North Celestial Cap survey. In this completely independent test, the PICS ranked 264 out of 277 pulsar-related candidates, including all 56 previously known pulsars and 208 of their harmonics, in the top 961 (1%) of 90,008 test candidates, missing only 13 harmonics. The first non-pulsar candidate appears at rank 187, following 45 pulsars and 141 harmonics. In other words, 100% of the pulsars were ranked in the top 1% of all candidates, while 80% were ranked higher than any noise or interference. The performance of this system can be improved over time as more training data are accumulated. This AI system has been integrated into the PALFA survey pipeline and has discovered six new pulsars to date.

  16. Heat Transfer Search Algorithm for Non-convex Economic Dispatch Problems

    NASA Astrophysics Data System (ADS)

    Hazra, Abhik; Das, Saborni; Basu, Mousumi

    2018-06-01

    This paper presents the Heat Transfer Search (HTS) algorithm for the non-linear economic dispatch problem. The HTS algorithm is based on the laws of thermodynamics and heat transfer. The proficiency of the suggested technique has been demonstrated on three different complicated economic dispatch problems with valve-point effect; prohibited operating zones; and multiple fuels with valve-point effect. Test results obtained with the suggested technique for the economic dispatch problem have been compared with those obtained from other reported evolutionary techniques. It has been observed that the suggested HTS produces superior solutions.
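
    The non-convexity in such problems typically comes from the valve-point term added to the quadratic fuel cost. The sketch below shows that commonly used cost model; the coefficients and operating points are illustrative and are not tied to the specific test systems of the paper.

    ```python
    import math

    def fuel_cost(P, a, b, c, e, f, P_min):
        # quadratic cost plus the rectified-sine valve-point term
        return a * P**2 + b * P + c + abs(e * math.sin(f * (P_min - P)))

    def total_cost(P_units, coeffs):
        return sum(fuel_cost(P, *k) for P, k in zip(P_units, coeffs))

    # illustrative coefficients (a, b, c, e, f, P_min) for two generating units
    coeffs = [(0.0016, 7.92, 561, 300, 0.0315, 100), (0.0048, 7.97, 78, 150, 0.063, 50)]
    print(total_cost([300, 125], coeffs))
    ```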

  17. Heat Transfer Search Algorithm for Non-convex Economic Dispatch Problems

    NASA Astrophysics Data System (ADS)

    Hazra, Abhik; Das, Saborni; Basu, Mousumi

    2018-03-01

    This paper presents the Heat Transfer Search (HTS) algorithm for the non-linear economic dispatch problem. The HTS algorithm is based on the laws of thermodynamics and heat transfer. The proficiency of the suggested technique has been demonstrated on three different complicated economic dispatch problems with valve-point effect; prohibited operating zones; and multiple fuels with valve-point effect. Test results obtained with the suggested technique for the economic dispatch problem have been compared with those obtained from other reported evolutionary techniques. It has been observed that the suggested HTS produces superior solutions.

  18. Expertise in complex decision making: the role of search in chess 70 years after de Groot.

    PubMed

    Connors, Michael H; Burns, Bruce D; Campitelli, Guillermo

    2011-01-01

    One of the most influential studies in all expertise research is de Groot's (1946) study of chess players, which suggested that pattern recognition, rather than search, was the key determinant of expertise. Many changes have occurred in the chess world since de Groot's study, leading some authors to argue that the cognitive mechanisms underlying expertise have also changed. We decided to replicate de Groot's study to empirically test these claims and to examine whether the trends in the data have changed over time. Six Grandmasters, five International Masters, six Experts, and five Class A players completed the think-aloud procedure for two chess positions. Findings indicate that Grandmasters and International Masters search more quickly than Experts and Class A players, and that both groups today search substantially faster than players in previous studies. The findings, however, support de Groot's overall conclusions and are consistent with predictions made by pattern recognition models. Copyright © 2011 Cognitive Science Society, Inc.

  19. Spectroscopic vector analysis for fast pattern quality monitoring

    NASA Astrophysics Data System (ADS)

    Sohn, Younghoon; Ryu, Sungyoon; Lee, Chihoon; Yang, Yusin

    2018-03-01

    In the semiconductor industry, fast and effective measurement of pattern variation has been a key challenge for assuring mass-production quality. Pattern measurement techniques such as conventional CD-SEMs or optical CD metrology have been used extensively, but these techniques are increasingly limited in terms of measurement throughput and the time spent in modeling. In this paper we propose a time-effective pattern monitoring method based on a direct spectrum-based approach. In this technique, a wavelength band sensitive to a specific pattern change is selected from the spectroscopic ellipsometry signal scattered by the pattern to be measured, and the amplitude and phase variations in that band are analyzed as a measurement index of the pattern change. This pattern-change measurement technique is applied to several process steps and its applicability is verified. Owing to its fast and simple analysis, the method can be adapted to large-scale process variation monitoring, maximizing measurement throughput.

  20. Anti-aliasing techniques in photon-counting depth imaging using GHz clock rates

    NASA Astrophysics Data System (ADS)

    Krichel, Nils J.; McCarthy, Aongus; Collins, Robert J.; Buller, Gerald S.

    2010-04-01

    Single-photon detection technologies in conjunction with low laser illumination powers allow for the eye-safe acquisition of time-of-flight range information on non-cooperative target surfaces. We previously presented a photon-counting depth imaging system designed for the rapid acquisition of three-dimensional target models by steering a single scanning pixel across the field angle of interest. To minimise the per-pixel dwelling times required to obtain sufficient photon statistics for accurate distance resolution, periodic illumination at multi-MHz repetition rates was applied. Modern time-correlated single-photon counting (TCSPC) hardware allowed for depth measurements with sub-mm precision. Resolving the absolute target range with a fast periodic signal is only possible at sufficiently short distances: if the round-trip time towards an object is extended beyond the timespan between two trigger pulses, the return signal cannot be assigned to an unambiguous range value. Whereas constructing a precise depth image based on relative results may still be possible, problems emerge for large or unknown pixel-by-pixel separations or in applications with a wide range of possible scene distances. We introduce a technique to avoid range ambiguity effects in time-of-flight depth imaging systems at high average pulse rates. A long pseudo-random bitstream is used to trigger the illuminating laser. A cyclic, fast-Fourier supported analysis algorithm is used to search for the pattern within return photon events. We demonstrate this approach at base clock rates of up to 2 GHz with varying pattern lengths, allowing for unambiguous distances of several kilometers. Scans at long stand-off distances and of scenes with large pixel-to-pixel range differences are presented. Numerical simulations are performed to investigate the relative merits of the technique.
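
    The unambiguous-range idea can be illustrated independently of the hardware: when the laser is triggered by a pseudo-random bit pattern, the histogram of photon arrival times can be circularly correlated with that pattern via the FFT, and the correlation peak identifies the delay. The sketch below uses invented numbers and a simple Poisson noise model, not the authors' analysis code.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    pattern = rng.integers(0, 2, 4096)             # pseudo-random trigger bitstream
    true_delay = 1234                              # round-trip delay in clock bins
    # noisy photon counts: the transmitted pattern shifted by the delay, with Poisson counts
    returns = np.roll(pattern, true_delay) * rng.poisson(3, pattern.size)

    # circular cross-correlation via the FFT; the peak sits at the delay (mod pattern length)
    corr = np.fft.ifft(np.fft.fft(returns) * np.conj(np.fft.fft(pattern))).real
    print(int(np.argmax(corr)))                    # recovers 1234
    ```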

  1. High School Students, Libraries, and the Search Process. An Analysis of Student Materials and Facilities Usage Patterns in Delaware Following Introduction of Online Bibliographic Database Searching.

    ERIC Educational Resources Information Center

    Mancall, Jacqueline C.; Deskins, Dreama

    This report assesses the impact of instruction in online bibliographic database searching on high school students' use of library materials and facilities in three Delaware secondary schools (one public, one parochial, and one private) during the spring of 1984. Most students involved in the analysis were given a brief explanation of online…

  2. Going beyond Google for Faster and Smarter Web Searching

    ERIC Educational Resources Information Center

    Vine, Rita

    2004-01-01

    With more than 4 billion web pages in its database, Google is suitable for many different kinds of searches. When you know what you are looking for, Google can be a pretty good first choice, as long as you want to search a word pattern that can be expected to appear on any results pages. The problem starts when you don't know exactly what you're…

  3. Acoustic tweezers: patterning cells and microparticles using standing surface acoustic waves (SSAW).

    PubMed

    Shi, Jinjie; Ahmed, Daniel; Mao, Xiaole; Lin, Sz-Chin Steven; Lawit, Aitan; Huang, Tony Jun

    2009-10-21

    Here we present an active patterning technique named "acoustic tweezers" that utilizes standing surface acoustic wave (SSAW) to manipulate and pattern cells and microparticles. This technique is capable of patterning cells and microparticles regardless of shape, size, charge or polarity. Its power intensity, approximately 5×10^5 times lower than that of optical tweezers, compares favorably with those of other active patterning methods. Flow cytometry studies have revealed it to be non-invasive. The aforementioned advantages, along with this technique's simple design and ability to be miniaturized, render the "acoustic tweezers" technique a promising tool for various applications in biology, chemistry, engineering, and materials science.

  4. Interactive Information Organization: Techniques and Evaluation

    DTIC Science & Technology

    2001-05-01

    information search and access. Locating interesting information on the World Wide Web is the main task of on-line search engines. Such engines accept a...likelihood of being relevant to the user's request. The majority of today's Web search engines follow this scenario. The ordering of documents in the

  5. Disease Monitoring and Health Campaign Evaluation Using Google Search Activities for HIV and AIDS, Stroke, Colorectal Cancer, and Marijuana Use in Canada: A Retrospective Observational Study.

    PubMed

    Ling, Rebecca; Lee, Joon

    2016-10-12

    Infodemiology can offer practical and feasible health research applications through the practice of studying information available on the Web. Google Trends provides publicly accessible information regarding search behaviors in a population, which may be studied and used for health campaign evaluation and disease monitoring. Additional studies examining the use and effectiveness of Google Trends for these purposes remain warranted. The objective of our study was to explore the use of infodemiology in the context of health campaign evaluation and chronic disease monitoring. It was hypothesized that following a launch of a campaign, there would be an increase in information seeking behavior on the Web. Second, increasing and decreasing disease patterns in a population would be associated with search activity patterns. This study examined 4 different diseases: human immunodeficiency virus (HIV) infection, stroke, colorectal cancer, and marijuana use. Using Google Trends, relative search volume data were collected throughout the period of February 2004 to January 2015. Campaign information and disease statistics were obtained from governmental publications. Search activity trends were graphed and assessed with disease trends and the campaign interval. Pearson product correlation statistics and joinpoint methodology analyses were used to determine significance. Disease patterns and online activity across all 4 diseases were significantly correlated: HIV infection (r=.36, P<.001), stroke (r=.40, P<.001), colorectal cancer (r= -.41, P<.001), and substance use (r=.64, P<.001). Visual inspection and the joinpoint analysis showed significant correlations for the campaigns on colorectal cancer and marijuana use in stimulating search activity. No significant correlations were observed for the campaigns on stroke and HIV regarding search activity. The use of infoveillance shows promise as an alternative and inexpensive solution to disease surveillance and health campaign evaluation. Further research is needed to understand Google Trends as a valid and reliable tool for health research.
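
    As a toy illustration of the study's correlation step (with invented numbers rather than the study's data), relative search volume and a disease indicator can be compared with a Pearson correlation:

    ```python
    from scipy.stats import pearsonr

    search_volume = [42, 55, 61, 47, 70, 66, 73, 80]           # Google Trends relative values
    new_cases = [110, 130, 150, 120, 180, 170, 190, 205]       # illustrative surveillance counts

    r, p = pearsonr(search_volume, new_cases)
    print(f"r = {r:.2f}, p = {p:.3f}")
    ```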

  6. [Selective attention and schizophrenia before the administration of neuroleptics].

    PubMed

    Lussier, I; Stip, E

    1999-01-01

    In recent years, the presence of attention deficits has been recognized as a key feature of schizophrenia. Past studies reveal that selective attention, or the ability to select relevant information while ignoring simultaneously irrelevant information, is disturbed in schizophrenic patients. According to Treisman's feature-integration theory of selective attention, visual search for conjunctive targets (e.g., shape and color) requires controlled processes that necessitate attention and operate in a serial manner. Reaction times (RTs) are therefore a function of the number of stimuli in the display. When subjects are asked to detect the presence or absence of a target in an array of a variable number of stimuli, different performance patterns are expected for positive (present target) and negative trials (absent target). For positive trials, a self-terminating search is triggered, that is, the search is ended when the target is encountered. For negative trials, an exhaustive search strategy is displayed, where each stimulus is examined before the search can end; the RT slope pattern is thus double that of the positive trials. To assess the integrity of these processes, thirteen drug-naive schizophrenic patients were compared to twenty normal control subjects. Neuroleptic-naive patients were chosen as subjects to avoid the potential influence of medication and chronicity-related factors on performance. The subjects had to specify as fast as possible the presence or absence of the target in an array of a variable number of stimuli presented in a circular display that did or did not contain the target. Results showed that the patients can use self-terminating search strategies as well as normal control subjects. However, their ability to trigger exhaustive search strategies is impaired. Not only were patients slower than controls, but their pattern of RT results was different. These results argue in favor of an early impairment in selective attention capacities in schizophrenia, which appears before the introduction of neuroleptics. Attention performance was also shown to be somewhat associated with clinical symptoms.

  7. Disease Monitoring and Health Campaign Evaluation Using Google Search Activities for HIV and AIDS, Stroke, Colorectal Cancer, and Marijuana Use in Canada: A Retrospective Observational Study

    PubMed Central

    2016-01-01

    Background Infodemiology can offer practical and feasible health research applications through the practice of studying information available on the Web. Google Trends provides publicly accessible information regarding search behaviors in a population, which may be studied and used for health campaign evaluation and disease monitoring. Additional studies examining the use and effectiveness of Google Trends for these purposes remain warranted. Objective The objective of our study was to explore the use of infodemiology in the context of health campaign evaluation and chronic disease monitoring. It was hypothesized that following a launch of a campaign, there would be an increase in information seeking behavior on the Web. Second, increasing and decreasing disease patterns in a population would be associated with search activity patterns. This study examined 4 different diseases: human immunodeficiency virus (HIV) infection, stroke, colorectal cancer, and marijuana use. Methods Using Google Trends, relative search volume data were collected throughout the period of February 2004 to January 2015. Campaign information and disease statistics were obtained from governmental publications. Search activity trends were graphed and assessed with disease trends and the campaign interval. Pearson product correlation statistics and joinpoint methodology analyses were used to determine significance. Results Disease patterns and online activity across all 4 diseases were significantly correlated: HIV infection (r=.36, P<.001), stroke (r=.40, P<.001), colorectal cancer (r= −.41, P<.001), and substance use (r=.64, P<.001). Visual inspection and the joinpoint analysis showed significant correlations for the campaigns on colorectal cancer and marijuana use in stimulating search activity. No significant correlations were observed for the campaigns on stroke and HIV regarding search activity. Conclusions The use of infoveillance shows promise as an alternative and inexpensive solution to disease surveillance and health campaign evaluation. Further research is needed to understand Google Trends as a valid and reliable tool for health research. PMID:27733330

  8. Fringe pattern demodulation with a two-dimensional digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

    A novel technique called a two-dimensional digital phase-locked loop (DPLL) for fringe pattern demodulation is presented. This algorithm is more suitable for demodulation of fringe patterns with varying phase in two directions than the existing DPLL techniques that assume that the phase of the fringe patterns varies only in one direction. The two-dimensional DPLL technique assumes that the phase of a fringe pattern is continuous in both directions and takes advantage of the phase continuity; consequently, the algorithm has better noise performance than the existing DPLL schemes. The two-dimensional DPLL algorithm is also suitable for demodulation of fringe patterns with low sampling rates, and it outperforms the Fourier fringe analysis technique in this aspect.

  9. In Vivo Measurement of Glenohumeral Joint Contact Patterns

    NASA Astrophysics Data System (ADS)

    Bey, Michael J.; Kline, Stephanie K.; Zauel, Roger; Kolowich, Patricia A.; Lock, Terrence R.

    2009-12-01

    The objectives of this study were to describe a technique for measuring in-vivo glenohumeral joint contact patterns during dynamic activities and to demonstrate application of this technique. The experimental technique calculated joint contact patterns by combining CT-based 3D bone models with joint motion data that were accurately measured from biplane x-ray images. Joint contact patterns were calculated for the repaired and contralateral shoulders of 20 patients who had undergone rotator cuff repair. Significant differences in joint contact patterns were detected due to abduction angle and shoulder condition (i.e., repaired versus contralateral). Abduction angle had a significant effect on the superior/inferior contact center position, with the average joint contact center of the repaired shoulder 12.1% higher on the glenoid than the contralateral shoulder. This technique provides clinically relevant information by calculating in-vivo joint contact patterns during dynamic conditions and overcomes many limitations associated with conventional techniques for quantifying joint mechanics.

  10. Internet Search and Krokodil in the Russian Federation: An Infoveillance Study

    PubMed Central

    2014-01-01

    Background Krokodil is an informal term for a cheap injectable illicit drug domestically prepared from codeine-containing medication (CCM). The method of krokodil preparation may produce desomorphine as well as toxic reactants that cause extensive tissue necrosis. The first confirmed report of krokodil use in Russia took place in 2004. In 2012, reports of krokodil-related injection injuries began to appear beyond Russia in Western Europe and the United States. Objective This exploratory study had two main objectives: (1) to determine if Internet search patterns could detect regularities in behavioral responses to Russian CCM policy at the population level, and (2) to determine if complementary data sources could explain the regularities we observed. Methods First, we obtained krokodil-related search pattern data for each Russia subregion (oblast) between 2011 and 2012. Second, we analyzed several complementary data sources included krokodil-related court cases, and related search terms on both Google and Yandex to evaluate the characteristics of terms accompanying krokodil-related search queries. Results In the 6 months preceding CCM sales restrictions, 21 of Russia's 83 oblasts had search rates higher than the national average (mean) of 16.67 searches per 100,000 population for terms associated with krokodil. In the 6 months following restrictions, mean national searches dropped to 9.65 per 100,000. Further, the number of oblasts recording a higher than average search rate dropped from 30 to 16. Second, we found krokodil-related court appearances were moderately positively correlated (Spearman correlation=.506, P≤.001) with behaviors consistent with an interest in the production and use of krokodil across Russia. Finally, Google Trends and Google and Yandex related terms suggested consistent public interest in the production and use of krokodil as well as for CCM as analgesic medication during the date range covered by this study. Conclusions Illicit drug use data are generally regarded as difficult to obtain through traditional survey methods. Our analysis suggests it is plausible that Yandex search behavior served as a proxy for patterns of krokodil production and use during the date range we investigated. More generally, this study demonstrates the application of novel methods recently used by policy makers to both monitor illicit drug use and influence drug policy decision making. PMID:25236385

  11. Circulating tumor cell isolation during resection of colorectal cancer lung and liver metastases: a prospective trial with different detection techniques.

    PubMed

    Kaifi, Jussuf T; Kunkel, Miriam; Das, Avisnata; Harouaka, Ramdane A; Dicker, David T; Li, Guangfu; Zhu, Junjia; Clawson, Gary A; Yang, Zhaohai; Reed, Michael F; Gusani, Niraj J; Kimchi, Eric T; Staveley-O'Carroll, Kevin F; Zheng, Si-Yang; El-Deiry, Wafik S

    2015-01-01

    Colorectal cancer (CRC) metastasectomy improves survival; however, most patients develop recurrences. Circulating tumor cells (CTCs) are an independent prognostic marker in stage IV CRC. We hypothesized that CTCs can be enriched during metastasectomy using different isolation techniques. 25 CRC patients undergoing liver (16 (64%)) or lung (9 (36%)) metastasectomy were prospectively enrolled (clinicaltrial.gov identifier: NCT01722903). Central venous (liver) or radial artery (lung) tumor outflow blood (7.5 ml) was collected at incision, during resection, 30 min after resection, and on postoperative day (POD) 1. CTCs were quantified with (1) the EpCAM-based CellSearch® system and (2) size-based isolation with a novel filter device (FMSA). CTCs were immunohistochemically identified using CellSearch®'s criteria (cytokeratin 8/18/19+, CD45- cells containing a nucleus (DAPI+)). CTCs were also enriched with a centrifugation technique (OncoQuick®). CTC numbers peaked during the resection with the FMSA in contrast to CellSearch® (mean CTC number during resection: FMSA: 22.56 (SEM 7.48) (p = 0.0281), CellSearch®: 0.87 (SEM ± 0.44) (p = 0.3018)). Comparing the 2 techniques, CTC quantity was significantly higher with the FMSA device (range 0-101) than CellSearch® (range 0-9) at each of the 4 time points examined (P < 0.05). Immunofluorescence staining of cultured CTCs revealed that CTCs have a combined epithelial (CK8/18/19) and macrophage (CD45/CD14) phenotype. Blood sampling during CRC metastasis resection is an opportunity to increase CTC capture efficiency. CTC isolation with the FMSA yields more CTCs than the CellSearch® system. Future studies should focus on characterization of single CTCs to identify targets for molecular therapy and immune escape mechanisms of cancer cells.

  12. Rationalizing spatial exploration patterns of wild animals and humans through a temporal discounting framework

    PubMed Central

    Namboodiri, Vijay Mohan K.; Levy, Joshua M.; Mihalas, Stefan; Sims, David W.; Hussain Shuler, Marshall G.

    2016-01-01

    Understanding the exploration patterns of foragers in the wild provides fundamental insight into animal behavior. Recent experimental evidence has demonstrated that path lengths (distances between consecutive turns) taken by foragers are well fitted by a power law distribution. Numerous theoretical contributions have posited that “Lévy random walks”—which can produce power law path length distributions—are optimal for memoryless agents searching a sparse reward landscape. It is unclear, however, whether such a strategy is efficient for cognitively complex agents, from wild animals to humans. Here, we developed a model to explain the emergence of apparent power law path length distributions in animals that can learn about their environments. In our model, the agent’s goal during search is to build an internal model of the distribution of rewards in space that takes into account the cost of time to reach distant locations (i.e., temporally discounting rewards). For an agent with such a goal, we find that an optimal model of exploration in fact produces hyperbolic path lengths, which are well approximated by power laws. We then provide support for our model by showing that humans in a laboratory spatial exploration task search space systematically and modify their search patterns under a cost of time. In addition, we find that path length distributions in a large dataset obtained from free-ranging marine vertebrates are well described by our hyperbolic model. Thus, we provide a general theoretical framework for understanding spatial exploration patterns of cognitively complex foragers. PMID:27385831

  13. An efficient interior-point algorithm with new non-monotone line search filter method for nonlinear constrained programming

    NASA Astrophysics Data System (ADS)

    Wang, Liwei; Liu, Xinggao; Zhang, Zeyin

    2017-02-01

    An efficient primal-dual interior-point algorithm using a new non-monotone line search filter method is presented for nonlinear constrained programming, which is widely applied in engineering optimization. The new non-monotone line search technique is introduced to lead to relaxed step acceptance conditions and improved convergence performance. It can also avoid the choice of the upper bound on the memory, which brings obvious disadvantages to traditional techniques. Under mild assumptions, the global convergence of the new non-monotone line search filter method is analysed, and fast local convergence is ensured by second order corrections. The proposed algorithm is applied to the classical alkylation process optimization problem and the results illustrate its effectiveness. Some comprehensive comparisons to existing methods are also presented.
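
    As a rough illustration of the non-monotone idea in isolation (the interior-point and filter machinery are omitted), the sketch below relaxes the Armijo sufficient-decrease test by comparing against the largest of the last M objective values; the memory length M and the constants are illustrative assumptions, not the paper's settings.

    ```python
    import numpy as np

    def nonmonotone_line_search(f, grad, x, d, history, M=5, c=1e-4, shrink=0.5):
        """history: recent objective values; returns an accepted step length alpha."""
        f_ref = max(history[-M:])                 # relaxed (non-monotone) reference value
        slope = float(np.dot(grad(x), d))
        alpha = 1.0
        while f(x + alpha * d) > f_ref + c * alpha * slope:
            alpha *= shrink                       # backtrack until sufficient decrease holds
            if alpha < 1e-12:
                break
        return alpha

    # usage on a quadratic: minimize f(x) = ||x||^2 along the steepest-descent direction
    f = lambda x: float(np.dot(x, x))
    grad = lambda x: 2 * x
    x0 = np.array([3.0, -4.0])
    print(nonmonotone_line_search(f, grad, x0, -grad(x0), history=[f(x0)]))
    ```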

  14. Interactive design of generic chemical patterns.

    PubMed

    Schomburg, Karen T; Wetzer, Lars; Rarey, Matthias

    2013-07-01

    Every medicinal chemist has to create chemical patterns occasionally for querying databases, applying filters or describing functional groups. However, the representations of chemical patterns have been so far limited to languages with highly complex syntax, handicapping the application of patterns. Graphic pattern editors similar to chemical editors can facilitate the work with patterns. In this article, we review the interfaces of frequently used web search engines for chemical patterns. We take a look at pattern editing concepts of standalone chemical editors and finally present a completely new, unpublished graphical approach to pattern design, the SMARTSeditor. Copyright © 2013 Elsevier Ltd. All rights reserved.
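
    Once designed, such a pattern is typically expressed in SMARTS and matched against molecules. The sketch below, assuming the open-source RDKit package rather than the SMARTSeditor itself, shows a phenol-like pattern matched against p-cresol.

    ```python
    from rdkit import Chem

    pattern = Chem.MolFromSmarts("c1ccccc1[OX2H]")      # aromatic ring bearing a hydroxyl group
    molecule = Chem.MolFromSmiles("Cc1ccc(O)cc1")       # p-cresol

    print(molecule.HasSubstructMatch(pattern))          # True
    print(molecule.GetSubstructMatches(pattern))        # atom indices of each match
    ```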

  15. qPMS9: An Efficient Algorithm for Quorum Planted Motif Search

    NASA Astrophysics Data System (ADS)

    Nicolae, Marius; Rajasekaran, Sanguthevar

    2015-01-01

    Discovering patterns in biological sequences is a crucial problem. For example, the identification of patterns in DNA sequences has resulted in the determination of open reading frames, identification of gene promoter elements, intron/exon splicing sites, and shRNAs, location of RNA degradation signals, identification of alternative splicing sites, etc. In protein sequences, patterns have led to domain identification, location of protease cleavage sites, identification of signal peptides, protein interactions, determination of protein degradation elements, identification of protein trafficking elements, discovery of short functional motifs, etc. In this paper we focus on the identification of an important class of patterns, namely, motifs. We study the (l, d) motif search problem or Planted Motif Search (PMS). PMS receives as input n strings and two integers l and d. It returns all sequences M of length l that occur in each input string, where each occurrence differs from M in at most d positions. Another formulation is quorum PMS (qPMS), where the motif appears in at least q% of the strings. We introduce qPMS9, a parallel exact qPMS algorithm that offers significant runtime improvements on DNA and protein datasets. qPMS9 solves the challenging DNA (l, d)-instances (28, 12) and (30, 13). The source code is available at https://code.google.com/p/qpms9/.
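
    For very small instances, the (l, d) problem can be stated as a brute-force search over all length-l strings, which also makes the quorum variant easy to see. The sketch below is purely illustrative and nowhere near the efficiency of qPMS9; the toy strings and parameters are made up.

    ```python
    from itertools import product

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def pms(strings, l, d, quorum=1.0):
        """Return all length-l motifs present (within d mismatches) in >= quorum of the strings."""
        need = int(quorum * len(strings))
        motifs = set()
        for candidate in ("".join(p) for p in product("ACGT", repeat=l)):
            hits = sum(
                any(hamming(candidate, s[i:i + l]) <= d for i in range(len(s) - l + 1))
                for s in strings
            )
            if hits >= need:
                motifs.add(candidate)
        return motifs

    print(pms(["ACGTACGT", "TTACGTTT", "GGACGAGG"], l=5, d=1))
    ```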

  16. Fringe pattern demodulation with a two-frame digital phase-locked loop algorithm.

    PubMed

    Gdeisat, Munther A; Burton, David R; Lalor, Michael J

    2002-09-10

    A novel technique called a two-frame digital phase-locked loop for fringe pattern demodulation is presented. In this scheme, two fringe patterns with different spatial carrier frequencies are grabbed for an object. A digital phase-locked loop algorithm tracks and demodulates the phase difference between both fringe patterns by employing the wrapped phase components of one of the fringe patterns as a reference to demodulate the second fringe pattern. The desired phase information can be extracted from the demodulated phase difference. We tested the algorithm experimentally using real fringe patterns. The technique is shown to be suitable for noncontact measurement of objects with rapid surface variations, and it outperforms the Fourier fringe analysis technique in this aspect. Phase maps produced with this algorithm are noisy in comparison with phase maps generated with the Fourier fringe analysis technique.

  17. SearchGUI: An open-source graphical user interface for simultaneous OMSSA and X!Tandem searches.

    PubMed

    Vaudel, Marc; Barsnes, Harald; Berven, Frode S; Sickmann, Albert; Martens, Lennart

    2011-03-01

    The identification of proteins by mass spectrometry is a standard technique in the field of proteomics, relying on search engines to perform the identifications of the acquired spectra. Here, we present a user-friendly, lightweight and open-source graphical user interface called SearchGUI (http://searchgui.googlecode.com), for configuring and running the freely available OMSSA (open mass spectrometry search algorithm) and X!Tandem search engines simultaneously. Freely available under the permissible Apache2 license, SearchGUI is supported on Windows, Linux and OSX. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. A review of estimation of distribution algorithms in bioinformatics

    PubMed Central

    Armañanzas, Rubén; Inza, Iñaki; Santana, Roberto; Saeys, Yvan; Flores, Jose Luis; Lozano, Jose Antonio; Peer, Yves Van de; Blanco, Rosa; Robles, Víctor; Bielza, Concha; Larrañaga, Pedro

    2008-01-01

    Evolutionary search algorithms have become an essential asset in the algorithmic toolbox for solving high-dimensional optimization problems across a broad range of bioinformatics applications. Genetic algorithms, the most well-known and representative evolutionary search technique, have been the subject of the major part of such applications. Estimation of distribution algorithms (EDAs) offer a novel evolutionary paradigm that constitutes a natural and attractive alternative to genetic algorithms. They make use of a probabilistic model, learnt from the promising solutions, to guide the search process. In this paper, we set out a basic taxonomy of EDA techniques, underlining the nature and complexity of the probabilistic model of each EDA variant. We review a set of innovative works that make use of EDA techniques to solve challenging bioinformatics problems, emphasizing the EDA paradigm's potential for further research in this domain. PMID:18822112
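    A minimal sketch of the EDA idea, assuming a univariate (UMDA-style) model and a toy OneMax objective, is given below; it is a generic illustration of the paradigm, not code from any of the reviewed works.

```python
# Minimal univariate EDA (UMDA-style) on a toy OneMax problem, to illustrate the
# "learn a probabilistic model from promising solutions" idea. Illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_bits, pop_size, n_select, n_gen = 30, 100, 30, 40
p = np.full(n_bits, 0.5)                                   # independent Bernoulli model

for gen in range(n_gen):
    pop = (rng.random((pop_size, n_bits)) < p).astype(int) # sample from the model
    fitness = pop.sum(axis=1)                              # OneMax fitness (count of ones)
    elite = pop[np.argsort(fitness)[-n_select:]]           # promising solutions
    p = elite.mean(axis=0).clip(0.05, 0.95)                # re-estimate the model

print("best fitness:", fitness.max(), "of", n_bits)
```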

  19. Using pattern recognition as a method for predicting extreme events in natural and socio-economic systems

    NASA Astrophysics Data System (ADS)

    Intriligator, M.

    2011-12-01

    Vladimir (Volodya) Keilis-Borok has pioneered the use of pattern recognition as a technique for analyzing and forecasting developments in natural as well as socio-economic systems. His work as a leading geophysicist on predicting earthquakes and landslides using this technique has been recognized around the world. Keilis-Borok has also been a world leader in the application of pattern recognition techniques to the analysis and prediction of socio-economic systems. He worked with Allan Lichtman of American University in using such techniques to predict presidential elections in the U.S. Keilis-Borok and I have worked together with others on the use of pattern recognition techniques to analyze and predict socio-economic systems. We have used this technique to study the pattern of macroeconomic indicators that would predict the end of an economic recession in the U.S. We have also worked with officers in the Los Angeles Police Department to use this technique to predict surges of homicides in Los Angeles.

  20. Forensic quest for age determination of bloodstains.

    PubMed

    Bremmer, Rolf H; de Bruin, Karla G; van Gemert, Martin J C; van Leeuwen, Ton G; Aalders, Maurice C G

    2012-03-10

    Bloodstains at crime scenes are among the most important types of evidence for forensic investigators. They can be used for DNA profiling to verify a suspect's identity or for pattern analysis to reconstruct the crime. However, using bloodstains to determine the time elapsed since the crime was committed is still not possible. From a criminalistic point of view, an accurate estimate of when the crime was committed enables investigators to verify witnesses' statements, limit the number of suspects and assess alibis. Despite numerous attempts and the exploration of many technologies over the past century, no method has materialized into forensic practice. This review gives an overview of an extensive search of the scientific literature for techniques that address the quest for age determination of bloodstains. We found that most techniques are complementary to each other for both short-term and long-term age determination. The techniques are compared with respect to their sensitivity to short- and long-term ageing of bloodstains and their potential applicability at a crime scene. In addition, experimental challenges such as substrate variation, inter-donor variation and environmental influences are addressed. Comparison of these techniques contributes to our knowledge of the physics and biochemistry of an ageing bloodstain. Further improvement and incorporation of environmental factors are necessary for age determination of bloodstains to be acceptable in court. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  1. Performances of JEM-EUSO: angular reconstruction. The JEM-EUSO Collaboration

    NASA Astrophysics Data System (ADS)

    Adams, J. H.; Ahmad, S.; Albert, J.-N.; Allard, D.; Anchordoqui, L.; Andreev, V.; Anzalone, A.; Arai, Y.; Asano, K.; Ave Pernas, M.; Baragatti, P.; Barrillon, P.; Batsch, T.; Bayer, J.; Bechini, R.; Belenguer, T.; Bellotti, R.; Belov, K.; Berlind, A. A.; Bertaina, M.; Biermann, P. L.; Biktemerova, S.; Blaksley, C.; Blanc, N.; Błȩcki, J.; Blin-Bondil, S.; Blümer, J.; Bobik, P.; Bogomilov, M.; Bonamente, M.; Briggs, M. S.; Briz, S.; Bruno, A.; Cafagna, F.; Campana, D.; Capdevielle, J.-N.; Caruso, R.; Casolino, M.; Cassardo, C.; Castellinic, G.; Catalano, C.; Catalano, G.; Cellino, A.; Chikawa, M.; Christl, M. J.; Cline, D.; Connaughton, V.; Conti, L.; Cordero, G.; Crawford, H. J.; Cremonini, R.; Csorna, S.; Dagoret-Campagne, S.; de Castro, A. J.; De Donato, C.; de la Taille, C.; De Santis, C.; del Peral, L.; Dell'Oro, A.; De Simone, N.; Di Martino, M.; Distratis, G.; Dulucq, F.; Dupieux, M.; Ebersoldt, A.; Ebisuzaki, T.; Engel, R.; Falk, S.; Fang, K.; Fenu, F.; Fernández-Gómez, I.; Ferrarese, S.; Finco, D.; Flamini, M.; Fornaro, C.; Franceschi, A.; Fujimoto, J.; Fukushima, M.; Galeotti, P.; Garipov, G.; Geary, J.; Gelmini, G.; Giraudo, G.; Gonchar, M.; González Alvarado, C.; Gorodetzky, P.; Guarino, F.; Guzmán, A.; Hachisu, Y.; Harlov, B.; Haungs, A.; Hernández Carretero, J.; Higashide, K.; Ikeda, D.; Ikeda, H.; Inoue, N.; Inoue, S.; Insolia, A.; Isgrò, F.; Itow, Y.; Joven, E.; Judd, E. G.; Jung, A.; Kajino, F.; Kajino, T.; Kaneko, I.; Karadzhov, Y.; Karczmarczyk, J.; Karus, M.; Katahira, K.; Kawai, K.; Kawasaki, Y.; Keilhauer, B.; Khrenov, B. A.; Kim, J.-S.; Kim, S.-W.; Kim, S.-W.; Kleifges, M.; Klimov, P. A.; Kolev, D.; Kreykenbohm, I.; Kudela, K.; Kurihara, Y.; Kusenko, A.; Kuznetsov, E.; Lacombe, M.; Lachaud, C.; Lee, J.; Licandro, J.; Lim, H.; López, F.; Maccarone, M. C.; Mannheim, K.; Maravilla, D.; Marcelli, L.; Marini, A.; Martinez, O.; Masciantonio, G.; Mase, K.; Matev, R.; Medina-Tanco, G.; Mernik, T.; Miyamoto, H.; Miyazaki, Y.; Mizumoto, Y.; Modestino, G.; Monaco, A.; Monnier-Ragaigne, D.; Morales de los Ríos, J. A.; Moretto, C.; Morozenko, V. S.; Mot, B.; Murakami, T.; Murakami, M. Nagano; Nagata, M.; Nagataki, S.; Nakamura, T.; Napolitano, T.; Naumov, D.; Nava, R.; Neronov, A.; Nomoto, K.; Nonaka, T.; Ogawa, T.; Ogio, S.; Ohmori, H.; Olinto, A. V.; Orleański, P.; Osteria, G.; Panasyuk, M. I.; Parizot, E.; Park, I. H.; Park, H. W.; Pastircak, B.; Patzak, T.; Paul, T.; Pennypacker, C.; Perez Cano, S.; Peter, T.; Picozza, P.; Pierog, T.; Piotrowski, L. W.; Piraino, S.; Plebaniak, Z.; Pollini, A.; Prat, P.; Prévôt, G.; Prieto, H.; Putis, M.; Reardon, P.; Reyes, M.; Ricci, M.; Rodríguez, I.; Rodríguez Frías, M. D.; Ronga, F.; Roth, M.; Rothkaehl, H.; Roudil, G.; Rusinov, I.; Rybczyński, M.; Sabau, M. D.; Sáez-Cano, G.; Sagawa, H.; Saito, A.; Sakaki, N.; Sakata, M.; Salazar, H.; Sánchez, S.; Santangelo, A.; Santiago Crúz, L.; Sanz Palomino, M.; Saprykin, O.; Sarazin, F.; Sato, H.; Sato, M.; Schanz, T.; Schieler, H.; Scotti, V.; Segreto, A.; Selmane, S.; Semikoz, D.; Serra, M.; Sharakin, S.; Shibata, T.; Shimizu, H. M.; Shinozaki, K.; Shirahama, T.; Siemieniec-Oziȩbło, G.; Silva López, H. 
H.; Sledd, J.; Słomińska, K.; Sobey, A.; Sugiyama, T.; Supanitsky, D.; Suzuki, M.; Szabelska, B.; Szabelski, J.; Tajima, F.; Tajima, N.; Tajima, T.; Takahashi, Y.; Takami, H.; Takeda, M.; Takizawa, Y.; Tenzer, C.; Tibolla, O.; Tkachev, L.; Tokuno, H.; Tomida, T.; Tone, N.; Toscano, S.; Trillaud, F.; Tsenov, R.; Tsunesada, Y.; Tsuno, K.; Tymieniecka, T.; Uchihori, Y.; Unger, M.; Vaduvescu, O.; Valdés-Galicia, J. F.; Vallania, P.; Valore, L.; Vankova, G.; Vigorito, C.; Villaseñor, L.; von Ballmoos, P.; Wada, S.; Watanabe, J.; Watanabe, S.; Watts, J.; Weber, M.; Weiler, T. J.; Wibig, T.; Wiencke, L.; Wille, M.; Wilms, J.; Włodarczyk, Z.; Yamamoto, T.; Yamamoto, Y.; Yang, J.; Yano, H.; Yashin, I. V.; Yonetoku, D.; Yoshida, K.; Yoshida, S.; Young, R.; Zotov, M. Yu.; Zuccaro Marchi, A.

    2015-11-01

    Mounted on the International Space Station (ISS), the Extreme Universe Space Observatory, on board the Japanese Experimental Module (JEM-EUSO), relies on the well-established fluorescence technique to observe Extensive Air Showers (EAS) developing in the Earth's atmosphere. Focusing on the detection of Ultra High Energy Cosmic Rays (UHECR) in the decade of 10^20 eV, JEM-EUSO will face new challenges by applying this technique from space. The EUSO Simulation and Analysis Framework (ESAF) has been developed in this context to provide a full end-to-end simulation frame, and assess the overall performance of the detector. Within ESAF, angular reconstruction can be separated into two conceptually different steps. The first step is pattern recognition, or filtering, of the signal to separate it from the background. The second step is to perform different types of fitting in order to search for the relevant geometrical parameters that best describe the previously selected signal. In this paper, we discuss some of the techniques we have implemented in ESAF to perform the geometrical reconstruction of EAS seen by JEM-EUSO. We also conduct thorough tests to assess the performances of these techniques in conditions which are relevant to the scope of the JEM-EUSO mission. We conclude by showing the expected angular resolution in the energy range that JEM-EUSO is expected to observe.

  2. Optically Remote Noncontact Heart Rates Sensing Technique

    NASA Astrophysics Data System (ADS)

    Thongkongoum, W.; Boonduang, S.; Limsuwan, P.

    2017-09-01

    Heart rate monitoring via an optically remote, noncontact technique is reported in this research. A green laser (5 mW, 532±10 nm) was projected onto the left carotid artery. The laser light reflected onto a screen carried the deviation of the interference patterns, which were recorded by a digital camera. The recorded videos of the interference patterns were analysed frame by frame using two standard digital image processing (DIP) techniques, block matching (BM) and optical flow (OF). The region of interest (ROI) pixels within the interference patterns were analysed for periodic changes of the interference patterns due to the heart's pumping action. The results of both the BM and OF techniques were compared with a reference medical heart rate monitor, a contact device based on the pulse transit technique. The result obtained from the BM technique was 74.67 bpm (beats per minute) and from the OF technique 75.95 bpm. Compared with the reference value of 75.43±1 bpm, the errors were 1.01% and 0.69%, respectively.
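    A hedged sketch of the optical-flow half of such an analysis is shown below: dense optical flow is computed within a region of interest and the dominant frequency of the resulting signal is converted to beats per minute. The video file name, ROI coordinates and flow parameters are placeholders, not values from the paper; OpenCV and NumPy are assumed to be available.

```python
# Sketch of an optical-flow (OF) based heart-rate estimate from a speckle-pattern video.
# File name, ROI and parameters are hypothetical placeholders. Requires OpenCV (cv2).
import cv2
import numpy as np

cap = cv2.VideoCapture("speckle_video.avi")      # placeholder file name
fps = cap.get(cv2.CAP_PROP_FPS) or 30.0          # fall back if metadata is missing
ok, prev = cap.read()
if not ok:
    raise SystemExit("could not read the (placeholder) video file")
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

signal = []
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # dense optical flow between consecutive frames (Farneback method)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    roi = flow[100:200, 100:200]                 # hypothetical region of interest
    signal.append(np.linalg.norm(roi, axis=2).mean())
    prev_gray = gray

# dominant frequency of the periodic flow signal, converted to beats per minute
signal = np.asarray(signal) - np.mean(signal)
freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
spectrum = np.abs(np.fft.rfft(signal))
print("estimated heart rate (bpm):", 60.0 * freqs[spectrum.argmax()])
```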

  3. An Adaptive Image Enhancement Technique by Combining Cuckoo Search and Particle Swarm Optimization Algorithm

    PubMed Central

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique that uses a modified quality measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by a global transformation of the input intensities; the method employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques, such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization, in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper. PMID:25784928

  4. An adaptive image enhancement technique by combining cuckoo search and particle swarm optimization algorithm.

    PubMed

    Ye, Zhiwei; Wang, Mingwei; Hu, Zhengbing; Liu, Wei

    2015-01-01

    Image enhancement is an important procedure in image processing and analysis. This paper presents a new technique that uses a modified quality measure and a blend of cuckoo search and particle swarm optimization (CS-PSO) to enhance low-contrast images adaptively. Contrast enhancement is obtained by a global transformation of the input intensities; the method employs the incomplete Beta function as the transformation function and a novel criterion for measuring image quality that considers three factors: threshold, entropy value, and gray-level probability density of the image. The enhancement process is a nonlinear optimization problem with several constraints. CS-PSO is utilized to maximize the objective fitness criterion in order to enhance the contrast and detail in an image by adapting the parameters of a novel extension to a local enhancement technique. The performance of the proposed method has been compared with other existing techniques, such as linear contrast stretching, histogram equalization, and evolutionary-computing-based image enhancement methods like the backtracking search algorithm, differential search algorithm, genetic algorithm, and particle swarm optimization, in terms of processing time and image quality. Experimental results demonstrate that the proposed method is robust and adaptive and exhibits better performance than the other methods considered in the paper.
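    A minimal sketch of the incomplete-Beta intensity transform described above is given below. The (a, b) shape parameters would normally be selected by the CS-PSO optimizer; here they are fixed, hypothetical values, and the optimization loop is omitted.

```python
# Sketch of the incomplete-Beta global intensity transform. The (a, b) shape
# parameters are fixed, hypothetical values; in the paper they would be chosen
# by the CS-PSO optimizer against the image-quality criterion. Illustrative only.
import numpy as np
from scipy.special import betainc

def beta_transform(image, a, b):
    """Map normalized intensities through the regularized incomplete Beta function."""
    x = (image - image.min()) / (image.max() - image.min() + 1e-12)
    y = betainc(a, b, x)            # monotone mapping of [0, 1] onto [0, 1]
    return (y * 255).astype(np.uint8)

low_contrast = np.clip(np.random.normal(120, 15, (64, 64)), 0, 255)
enhanced = beta_transform(low_contrast, a=2.0, b=2.0)
print(enhanced.min(), enhanced.max())   # intensity range is stretched
```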

  5. Science on the Web: Secondary School Students' Navigation Patterns and Preferred Pages' Characteristics

    ERIC Educational Resources Information Center

    Dimopoulos, Kostas; Asimakopoulos, Apostolos

    2010-01-01

    This study aims to explore navigation patterns and preferred pages' characteristics of ten secondary school students searching the web for information about cloning. The students navigated the Web for as long as they wished in a context of minimum support of teaching staff. Their navigation patterns were analyzed using audit trail data software.…

  6. Emotionality in response to aircraft noise: A report of development work

    NASA Technical Reports Server (NTRS)

    Klaus, P. A.

    1975-01-01

    A literature search and pilot study conducted to investigate the topic of emotional response to aircraft noise are described. A Tell-A-Story Technique was developed for use in the pilot study which required respondents to make up stories for a series of aircraft-related and non-aircraft-related pictures. A content analysis of these stories was made. The major finding was that response patterns varied among three groups of respondents - those currently living near airports, those who had lived near airports in the past, and those who had never lived near airports. Negative emotional feelings toward aircraft were greatest among respondents who had lived near airports in the past but no longer did. A possible explanation offered for this finding was that people currently living near airports might adapt to the situation by denying some of their negative feelings, which they might feel more free to express after they had moved away from the situation. Other techniques used in the pilot study are also described, including group interviews and a word association task.

  7. Can genetic algorithms help virus writers reshape their creations and avoid detection?

    NASA Astrophysics Data System (ADS)

    Abu Doush, Iyad; Al-Saleh, Mohammed I.

    2017-11-01

    Different attack and defence techniques have evolved over time as actions and reactions between the black-hat and white-hat communities. Encryption, polymorphism, metamorphism and obfuscation are among the techniques used by attackers to bypass security controls. On the other hand, pattern matching, algorithmic scanning, emulation and heuristics are used by the defence team. The Antivirus (AV) is a vital security control that is used against a variety of threats. The AV mainly scans data against its database of virus signatures and, basically, claims a virus if a match is found. This paper seeks to find the minimal possible changes that can be made to a virus so that it will appear normal when scanned by the AV. Brute-force search through all possible changes can be a computationally expensive task. Alternatively, this paper applies a Genetic Algorithm to the problem. The proposed algorithm is tested on seven different malware instances. The results show that, for all the tested malware instances, only a small change to each instance was enough to bypass the AV.

  8. Creating the Perfect Umbilicus: A Systematic Review of Recent Literature.

    PubMed

    Joseph, Walter J; Sinno, Sammy; Brownstone, Nicholas D; Mirrer, Joshua; Thanik, Vishal D

    2016-06-01

    The aim of this study was to perform an updated systematic review of the literature over the last 10 years, analyzing and comparing the many published techniques with the hope of providing plastic surgeons with a new standard in creating the perfect umbilicus in the setting of both abdominoplasty and abdominally based free-flap breast reconstruction. An initial search using the PubMed online database with the keyword "umbilicoplasty" was performed. These results were filtered to only include articles published within the last 10 years. The remaining articles were thoroughly reviewed by the authors and only those pertaining to techniques for umbilicoplasty in the setting of abdominoplasty and abdominally based free flap were included. Of the 10 unique techniques yielded by our search, 9/10 (90 %) initially incised the native umbilicus with a round, oval, or vertical ellipse pattern. Of the 9 techniques that initially perform a round incision, 4 of them (44.4 %) later modify the round umbilicus with either an inferior or superior excision to create either a "U"- or "inverted U"-shaped umbilicus. In terms of the shape of the incision made in the abdominal flap for umbilical reinsertion, the most common were either a round incision or an inverted "V" or "U," both of which accounted for 4/10 (40 %) and 3/10 (30 %), respectively. Almost all of the studies (8/10; 80 %) describe "defatting" or trimming of the subcutaneous adipose tissue around the incision to create a periumbilical concavity following inset of the umbilicus. 4/10 (40 %) of the techniques describe suturing the dermis of the umbilical skin to rectus fascia. Furthermore, 3/10 (30 %) advise that stalk plication is a necessary step to their technique. 7/9 techniques (77.8 %) preferred nondissolvable sutures for skin closure, with nylon being the most common suture material used. Only 2/9 (22.2 %) used dissolvable sutures. Although future studies are necessary, it is our hope that this systematic review better elucidates the techniques and provides some guidance to both aesthetic and reconstructive plastic surgeons in the pursuit of creating the perfect umbilicus following abdominoplasty and TRAM/DIEP breast reconstruction. This journal requires that authors assign a level of evidence to each article. For a full description of these Evidence-Based Medicine ratings, please refer to the Table of Contents or the online Instructions to Authors www.springer.com/00266 .

  9. Using Search Engines to Investigate Shared Migraine Experiences.

    PubMed

    Burns, Sara M; Turner, Dana P; Sexton, Katherine E; Deng, Hao; Houle, Timothy T

    2017-09-01

    To investigate migraine patterns in the United States using Google search data and utilize this information to better understand societal-level trends. Additionally, we aimed to evaluate time-series relationships between migraines and social factors. Extensive research has been done on clinical factors associated with migraines, yet population-level social factors have not been widely explored. Migraine internet search data may provide insight into migraine trends beyond information that can be gleaned from other sources. In this longitudinal analysis of open access data, we performed a time-series analysis in which about 12 years of Google Trends data (January 1, 2004 to August 15, 2016) were assessed. Data points were captured at a daily level and Google's 0-100 adjusted scale was used as the primary outcome to enable the comparison of relative popularity in the migraine search term. We hypothesized that the volume of relative migraine Google searches would be affected by societal aspects such as day of the week, holidays, and novel social events. Several recurrent social factors that drive migraine searches were identified. Of these, day of the week had the most significant impact on the volume of Google migraine searches. On average, Mondays accumulated 13.31 higher relative search volume than Fridays (95% CI: 11.12-15.51, P ≤ .001). Surprisingly, holidays were associated with lower relative migraine search volumes. Christmas Day had 13.84 lower relative search volumes (95% CI: 6.26-21.43, P ≤ .001) and Thanksgiving had 20.18 lower relative search volumes (95% CI: 12.55-27.82, P ≤ .001) than days that were not holidays. Certain novel social events and extreme weather also appear to be associated with relative migraine Google search volume. Social factors play a crucial role in explaining population level migraine patterns, and thus, warrant further exploration. © 2017 American Headache Society.

  10. Pro-eating disorder search patterns: the possible influence of celebrity eating disorder stories in the media.

    PubMed

    Lewis, Stephen P; Klauninger, Laura; Marcincinova, Ivana

    2016-01-01

    Pro eating disorder websites often contain celebrity-focused content (e.g., images) used as thinspiration to engage in unhealthy eating disorder behaviours. The current study was conducted to examine whether news media stories covering eating disorder disclosures of celebrities corresponded with increases in Internet searches for pro eating disorder material. Results indicated that search volumes for pro eating disorder terms spiked in the month immediately following such news coverage but only for particularly high-profile celebrities. Hence, there may be utility in providing recovery-oriented resources within the search results for pro-eating disorder Internet searches and within news stories of this nature.

  11. Astronomical polarization studies at radio and infrared wavelengths. Part 1: Gravitational deflection of polarized radiation

    NASA Technical Reports Server (NTRS)

    Dennison, B. K.

    1976-01-01

    The gravitational field is probed in a search for polarization dependence in the light bending. This involves searching for a splitting of a source image into orthogonal polarizations as the radiation passes through the solar gravitational field. This search was carried out using the techniques of very long and intermediate baseline interferometry, and by seeking a relative phase delay in orthogonal polarizations of microwaves passing through the solar gravitational field. In this last technique a change in the total polarization of the Helios 1 carrier wave was sought as the spacecraft passed behind the sun. No polarization splitting was detected.

  12. Walking the Filament of Feasibility: Global Optimization of Highly-Constrained, Multi-Modal Interplanetary Trajectories Using a Novel Stochastic Search Technique

    NASA Technical Reports Server (NTRS)

    Englander, Arnold C.; Englander, Jacob A.

    2017-01-01

    Interplanetary trajectory optimization problems are highly complex and are characterized by a large number of decision variables and equality and inequality constraints as well as many locally optimal solutions. Stochastic global search techniques, coupled with a large-scale NLP solver, have been shown to solve such problems but are inadequately robust when the problem constraints become very complex. In this work, we present a novel search algorithm that takes advantage of the fact that equality constraints effectively collapse the solution space to lower dimensionality. This new approach walks the "filament" of feasibility to efficiently find the global optimal solution.

  13. From Data to Knowledge – Promising Analytical Tools and Techniques for Capture and Reuse of Corporate Knowledge and to Aid in the State Evaluation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Danielson, Gary R.; Augustenborg, Elsa C.; Beck, Andrew E.

    2010-10-29

    The IAEA is challenged with limited availability of human resources for inspection and data analysis while proliferation threats increase. PNNL has a variety of IT solutions and techniques (at varying levels of maturity and development) that take raw data closer to useful knowledge, thereby assisting with and standardizing the analytical processes. This paper highlights some PNNL tools and techniques which are applicable to the international safeguards community, including: • Intelligent in-situ triage of data prior to reliable transmission to an analysis center resulting in the transmission of smaller and more relevant data sets • Capture of expert knowledge in re-usable search strings tailored to specific mission outcomes • Image based searching fused with text based searching • Use of gaming to discover unexpected proliferation scenarios • Process modeling (e.g. Physical Model) as the basis for an information integration portal, which links to data storage locations along with analyst annotations, categorizations, geographic data, search strings and visualization outputs.

  14. Improvement of sub-20nm pattern quality with dose modulation technique for NIL template production

    NASA Astrophysics Data System (ADS)

    Yagawa, Keisuke; Ugajin, Kunihiro; Suenaga, Machiko; Kanamitsu, Shingo; Motokawa, Takeharu; Hagihara, Kazuki; Arisawa, Yukiyasu; Kobayashi, Sachiko; Saito, Masato; Ito, Masamitsu

    2016-04-01

    Nanoimprint lithography (NIL) technology is in the spotlight as a next-generation semiconductor manufacturing technique for integrated circuits at 22 nm and beyond. NIL is an unmagnified (1:1) lithography technique that uses templates replicated from master templates, which are currently fabricated by electron-beam (EB) lithography[1]. In the near future, finer patterns of less than 15 nm will be required on master templates, and EB data volume will increase exponentially, so we are confronted with a difficult challenge: a higher-resolution EB mask writer and a high-performance fabrication process will be required. In our previous study, we investigated the potential of the photomask fabrication process for finer patterning and achieved a 15.5 nm line-and-space (L/S) pattern on a template by using a VSB (Variable Shaped Beam) type EB mask writer and a chemically amplified resist. In contrast, we found that contrast loss due to backscattering degrades the performance of finer patterning. For semiconductor device manufacturing, we must fabricate complicated patterns that include high- and low-density regions simultaneously, not only consecutive L/S patterns, so it is quite important to develop a technique for forming patterns of various sizes or coverages all at once. In this study, a small-feature pattern was experimentally formed on a master template with a dose modulation technique, which makes it possible to apply the appropriate exposure dose for each pattern size. As a result, we succeeded in improving the performance of finer patterning in the bright-field area. These results show that the current EB lithography process has the potential to fabricate NIL templates.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perl, M.L.

    This paper is based upon lectures in which I have described and explored the ways in which experimenters can try to find answers, or at least clues toward answers, to some of the fundamental questions of elementary particle physics. All of these experimental techniques and directions have been discussed fully in other papers, for example: searches for heavy charged leptons, tests of quantum chromodynamics, searches for Higgs particles, searches for particles predicted by supersymmetric theories, searches for particles predicted by technicolor theories, searches for proton decay, searches for neutrino oscillations, monopole searches, studies of low transfer momentum hadron physics at very high energies, and elementary particle studies using cosmic rays. Each of these subjects requires several lectures by itself to do justice to the large amount of experimental work and theoretical thought which has been devoted to these subjects. My approach in these tutorial lectures is to describe general ways to experiment beyond the standard model. I will use some of the topics listed to illustrate these general ways. Also, in these lectures I present some dreams and challenges about new techniques in experimental particle physics and accelerator technology; I call these Experimental Needs. 92 references.

  16. A Different Web-Based Geocoding Service Using Fuzzy Techniques

    NASA Astrophysics Data System (ADS)

    Pahlavani, P.; Abbaspour, R. A.; Zare Zadiny, A.

    2015-12-01

    Geocoding - the process of finding a position based on descriptive data such as an address or postal code - is considered one of the most commonly used spatial analyses. Many online map providers such as Google Maps, Bing Maps and Yahoo Maps offer geocoding as one of their basic capabilities. Despite the diversity of geocoding services, users usually face some limitations when they use available online geocoding services. In existing geocoding services, the concepts of proximity and nearness are not modelled appropriately, and these services find an address only by matching it against descriptive data. In addition, there are limitations in how search results are displayed. Resolving these limitations can enhance the efficiency of existing geocoding services. This paper proposes the idea of integrating fuzzy techniques with the geocoding process to resolve these limitations. In order to implement the proposed method, a web-based system is designed. In the proposed method, nearness to places is defined by fuzzy membership functions and multiple fuzzy distance maps are created. These fuzzy distance maps are then integrated using a fuzzy overlay technique to obtain the results. The proposed method provides different capabilities for users, such as the ability to search multi-part addresses, to search for places based on their location, non-point representation of results, as well as the display of search results based on their priority.
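    A minimal sketch of the fuzzy-nearness idea is given below: a membership function over distance and a simple fuzzy overlay using the minimum operator. The breakpoints and example distances are hypothetical, not taken from the paper's web-based system.

```python
# Sketch of fuzzy nearness and fuzzy overlay for geocoding-style queries.
# Breakpoints and distances are hypothetical values, for illustration only.
import numpy as np

def near(distance_m, full=200.0, zero=1000.0):
    """Membership 1 within `full` metres, falling linearly to 0 at `zero` metres."""
    return np.clip((zero - distance_m) / (zero - full), 0.0, 1.0)

# distances of three candidate locations to a school and to a park (metres)
d_school = np.array([150.0, 400.0, 900.0])
d_park = np.array([300.0, 250.0, 100.0])

# fuzzy overlay: a location must be near both features (minimum operator)
score = np.minimum(near(d_school), near(d_park))
print(score.round(2))   # higher score = better match for "near a school and a park"
```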

  17. Configural learning in contextual cuing of visual search.

    PubMed

    Beesley, Tom; Vadillo, Miguel A; Pearson, Daniel; Shanks, David R

    2016-08-01

    Two experiments were conducted to explore the role of configural representations in contextual cuing of visual search. Repeating patterns of distractors (contexts) were trained incidentally as predictive of the target location. Training participants with repeating contexts of consistent configurations led to stronger contextual cuing than when participants were trained with contexts of inconsistent configurations. Computational simulations with an elemental associative learning model of contextual cuing demonstrated that purely elemental representations could not account for the results. However, a configural model of associative learning was able to simulate the ordinal pattern of data. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  18. Skull base lesions: extracranial origins.

    PubMed

    Mosier, Kristine M

    2013-10-01

    A number of extracranial anatomical sites, including the nasopharynx, paranasal sinuses, and masticator space, may give rise to lesions involving the skull base. Given the invasive nature of these lesions, the majority are malignant. Accordingly, for optimal patient outcomes and treatment planning, it is imperative to include a search pattern for extracranial sites and to assess accurately the character and extent of these diverse lesions. Of particular importance to radiologists are the lesions arising from each extracranial site, the corresponding search patterns, and the relevant information to convey to the referring clinician. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. A novel high-frequency encoding algorithm for image compression

    NASA Astrophysics Data System (ADS)

    Siddeq, Mohammed M.; Rodrigues, Marcos A.

    2017-12-01

    In this paper, a new method for image compression is proposed whose quality is demonstrated through accurate 3D reconstruction from 2D images. The method is based on the discrete cosine transform (DCT) together with a high-frequency minimization encoding algorithm at compression stage and a new concurrent binary search algorithm at decompression stage. The proposed compression method consists of five main steps: (1) divide the image into blocks and apply DCT to each block; (2) apply a high-frequency minimization method to the AC-coefficients reducing each block by 2/3 resulting in a minimized array; (3) build a look up table of probability data to enable the recovery of the original high frequencies at decompression stage; (4) apply a delta or differential operator to the list of DC-components; and (5) apply arithmetic encoding to the outputs of steps (2) and (4). At decompression stage, the look up table and the concurrent binary search algorithm are used to reconstruct all high-frequency AC-coefficients while the DC-components are decoded by reversing the arithmetic coding. Finally, the inverse DCT recovers the original image. We tested the technique by compressing and decompressing 2D images including images with structured light patterns for 3D reconstruction. The technique is compared with JPEG and JPEG2000 through 2D and 3D RMSE. Results demonstrate that the proposed compression method is perceptually superior to JPEG with equivalent quality to JPEG2000. Concerning 3D surface reconstruction from images, it is demonstrated that the proposed method is superior to both JPEG and JPEG2000.
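    A hedged skeleton of steps (1) and (4) of this pipeline, 8x8 block DCT followed by differential coding of the DC components, is sketched below. The high-frequency minimization, look-up table and concurrent binary search steps are the paper's own contributions and are not reproduced; the block size and toy image are illustrative.

```python
# Skeleton of steps (1) and (4): 8x8 block DCT and delta coding of DC components.
# The high-frequency minimization, look-up table and concurrent binary search
# steps are specific to the paper and are not reproduced here. Illustrative only.
import numpy as np
from scipy.fft import dctn

def block_dct(image, block=8):
    h, w = image.shape
    coeffs = np.zeros_like(image, dtype=float)
    dc = []
    for i in range(0, h, block):
        for j in range(0, w, block):
            c = dctn(image[i:i + block, j:j + block], norm="ortho")
            coeffs[i:i + block, j:j + block] = c
            dc.append(c[0, 0])
    dc = np.asarray(dc)
    dc_delta = np.diff(dc, prepend=dc[:1])   # differential (delta) coding of DC terms
    return coeffs, dc_delta

image = np.random.randint(0, 256, (64, 64)).astype(float)
coeffs, dc_delta = block_dct(image)
print(dc_delta[:5])
```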

  20. Recognition of Facially Expressed Emotions and Visual Search Strategies in Adults with Asperger Syndrome

    ERIC Educational Resources Information Center

    Falkmer, Marita; Bjallmark, Anna; Larsson, Matilda; Falkmer, Torbjorn

    2011-01-01

    Can the disadvantages persons with Asperger syndrome frequently experience with reading facially expressed emotions be attributed to a different visual perception, affecting their scanning patterns? Visual search strategies, particularly regarding the importance of information from the eye area, and the ability to recognise facially expressed…

  1. Coherent correlator and equalizer using a reconfigurable all-optical tapped delay line.

    PubMed

    Chitgarha, Mohammad Reza; Khaleghi, Salman; Yilmaz, Omer F; Tur, Moshe; Haney, Michael W; Langrock, Carsten; Fejer, Martin M; Willner, Alan E

    2013-07-01

    We experimentally demonstrate a reconfigurable optical tapped delay line, in conjunction with coherent detection, to search for multiple patterns among quadrature phase shift keying (QPSK) symbols in a 20 Gbaud data channel and also to equalize 20 and 31 Gbaud QPSK, 20 Gbaud 8 phase shift keying (PSK), and 16 QAM signals. Multiple patterns are searched successfully on QPSK signals, and correlation peaks are obtained at the matched patterns. QPSK, 8 PSK, and 16 QAM signals are also successfully recovered after 25 km of SMF-28 with average EVMs of 8.3%, 8.9%, and 7.8%. An optical signal-to-noise-ratio penalty of <1 dB is achieved for a 20 Gbaud QPSK signal distorted by up to 400 ps/nm of dispersion.

  2. Bomb Threats and Bomb Search Techniques.

    ERIC Educational Resources Information Center

    Department of the Treasury, Washington, DC.

    This pamphlet explains how to be prepared and plan for bomb threats and describes procedures to follow once a call has been received. The content covers (1) preparation for bomb threats, (2) evacuation procedures, (3) room search methods, (4) procedures to follow once a bomb has been located, and (5) typical problems that search teams will…

  3. User Practices in Keyword and Boolean Searching on an Online Public Access Catalog.

    ERIC Educational Resources Information Center

    Ensor, Pat

    1992-01-01

    Discussion of keyword and Boolean searching techniques in online public access catalogs (OPACs) focuses on a study conducted at Indiana State University that examined users' attitudes toward searching on NOTIS (Northwestern Online Total Integrated System). Relevant literature is reviewed, and implications for library instruction are suggested. (17…

  4. Modeling the role of parallel processing in visual search.

    PubMed

    Cave, K R; Wolfe, J M

    1990-04-01

    Treisman's Feature Integration Theory and Julesz's Texton Theory explain many aspects of visual search. However, these theories require that parallel processing mechanisms not be used in many visual searches for which they would be useful, and they imply that visual processing should be much slower than it is. Most importantly, they cannot account for recent data showing that some subjects can perform some conjunction searches very efficiently. Feature Integration Theory can be modified so that it accounts for these data and helps to answer these questions. In this new theory, which we call Guided Search, the parallel stage guides the serial stage as it chooses display elements to process. A computer simulation of Guided Search produces the same general patterns as human subjects in a number of different types of visual search.
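    A toy illustration of the Guided Search idea, with invented feature coding and weights, is sketched below: a parallel stage assigns each display item an activation (top-down match to the target's features plus bottom-up contrast with the display), and the serial stage inspects items in decreasing order of activation.

```python
# Toy illustration of the Guided Search concept. Feature coding, weights and the
# display are invented for illustration; this is not the published simulation.
import numpy as np

rng = np.random.default_rng(3)

def guided_search(items, target):
    # top-down activation: how many of the target's features each item shares
    top_down = np.array([sum(f == t for f, t in zip(it, target)) for it in items])
    # bottom-up activation: how much each item differs from the average display item
    mean_item = np.mean(items, axis=0)
    bottom_up = np.abs(items - mean_item).sum(axis=1)
    activation = top_down + 0.5 * bottom_up + rng.normal(0, 0.1, len(items))
    # serial stage: inspect items from highest to lowest activation
    for step, idx in enumerate(np.argsort(-activation), start=1):
        if tuple(items[idx]) == tuple(target):
            return step
    return None

# features: (color, orientation) coded as integers; target is the single (1, 1) item
items = [(0, 0)] * 10 + [(1, 0)] * 10 + [(1, 1)]   # conjunction display
print("items inspected before finding target:", guided_search(np.array(items), (1, 1)))
```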

  5. eHealth Search Patterns: A Comparison of Private and Public Health Care Markets Using Online Panel Data

    PubMed Central

    2017-01-01

    Background Patient and consumer access to eHealth information is of crucial importance because of its role in patient-centered medicine and to improve knowledge about general aspects of health and medical topics. Objectives The objectives were to analyze and compare eHealth search patterns in a private (United States) and a public (United Kingdom) health care market. Methods A new taxonomy of eHealth websites is proposed to organize the largest eHealth websites. An online measurement framework is developed that provides a precise and detailed measurement system. Online panel data are used to accurately track and analyze detailed search behavior across 100 of the largest eHealth websites in the US and UK health care markets. Results The health, medical, and lifestyle categories account for approximately 90% of online activity, and e-pharmacies, social media, and professional categories account for the remaining 10% of online activity. Overall search penetration of eHealth websites is significantly higher in the private (United States) than the public market (United Kingdom). Almost twice the number of eHealth users in the private market have adopted online search in the health and lifestyle categories and also spend more time per website than those in the public market. The use of medical websites for specific conditions is almost identical in both markets. The allocation of search effort across categories is similar in both the markets. For all categories, the vast majority of eHealth users only access one website within each category. Those that conduct a search of two or more websites display very narrow search patterns. All users spend relatively little time on eHealth, that is, 3-7 minutes per website. Conclusions The proposed online measurement framework exploits online panel data to provide a powerful and objective method of analyzing and exploring eHealth behavior. The private health care system does appear to have an influence on eHealth search behavior in terms of search penetration and time spent per website in the health and lifestyle categories. Two explanations are offered: (1) the personal incentive of medical costs in the private market incentivizes users to conduct online search; and (2) health care information is more easily accessible through health care professionals in the United Kingdom compared with the United States. However, the use of medical websites is almost identical, suggesting that patients interested in a specific condition have a motivation to search and evaluate health information, irrespective of the health care market. The relatively low level of search in terms of the number of websites accessed and the average time per website raise important questions about the actual level of patient informedness in both the markets. Areas for future research are outlined. PMID:28408362

  6. Collaborative search in electronic health records.

    PubMed

    Zheng, Kai; Mei, Qiaozhu; Hanauer, David A

    2011-05-01

    A full-text search engine can be a useful tool for augmenting the reuse value of unstructured narrative data stored in electronic health records (EHR). A prominent barrier to the effective utilization of such tools originates from users' lack of search expertise and/or medical-domain knowledge. To mitigate the issue, the authors experimented with a 'collaborative search' feature through a homegrown EHR search engine that allows users to preserve their search knowledge and share it with others. This feature was inspired by the success of many social information-foraging techniques used on the web that leverage users' collective wisdom to improve the quality and efficiency of information retrieval. The authors conducted an empirical evaluation study over a 4-year period. The user sample consisted of 451 academic researchers, medical practitioners, and hospital administrators. The data were analyzed using a social-network analysis to delineate the structure of the user collaboration networks that mediated the diffusion of knowledge of search. The users embraced the concept with considerable enthusiasm. About half of the EHR searches processed by the system (0.44 million) were based on stored search knowledge; 0.16 million utilized shared knowledge made available by other users. The social-network analysis results also suggest that the user-collaboration networks engendered by the collaborative search feature played an instrumental role in enabling the transfer of search knowledge across people and domains. Applying collaborative search, a social information-foraging technique popularly used on the web, may provide the potential to improve the quality and efficiency of information retrieval in healthcare.

  7. Habitat Selection and Foraging Behavior of Southern Elephant Seals in the Western Antarctic Peninsula

    NASA Astrophysics Data System (ADS)

    Huckstadt, L.; Costa, D. P.; McDonald, B. I.; Tremblay, Y.; Crocker, D. E.; Goebel, M. E.; Fedak, M. E.

    2006-12-01

    We examined the foraging behavior of 18 southern elephant seals foraging over two seasons in the Western Antarctic Peninsula. The foraging behavior and habitat utilization of 7 females in 2005 and 12 in 2006 were followed using satellite-linked Satellite Relay Data Loggers that measured diving behavior as well as collected salinity and temperature profiles as the animals dove. Animals were tagged after the annual molt during February at Cape Shirreff, Livingston Island, South Shetland Islands. There was significant interannual variation in the regions of the Southern Ocean used by seals from Livingston Island. In 2005, one of the 7 tagged animals foraged 4700 km due west of the Antarctic Peninsula, going as far as 150° W. The remaining females headed south along the Western Antarctic Peninsula, bypassing Marguerite Bay and moving south along Alexander Island. Three of these animals continued to forage in the pack ice as it developed. On their return trip all females swam past Livingston Island, continuing on to South Georgia Island, where they apparently bred in the austral spring. One animal returned to Cape Shirreff to molt and her tag was recovered. During 2006, animals initially followed a similar migratory pattern going south along the Antarctic Peninsula, but unlike 2005, when the majority of the animals remained in the immediate vicinity of the Western Antarctic Peninsula, most of the animals in 2006 moved well to the west, foraging as far as the Amundsen Sea. We compared the area-restricted search (focal foraging) areas of these animals using a newly developed fractal landscape technique that identifies and quantifies areas of intensive search. The fractal analysis of area-restricted search shows that the area, distance and coverage (fractal D) searched were not different between years, while the time spent in the search areas was higher in 2005. Further analysis will examine how the physical properties of the water column, as determined from the CTD data derived from the tags, compare across these different focal foraging areas.

  8. Search automation of the generalized method of device operational characteristics improvement

    NASA Astrophysics Data System (ADS)

    Petrova, I. Yu; Puchkova, A. A.; Zaripova, V. M.

    2017-01-01

    The article presents brief results of an analysis of existing methods for searching for the closest patents, which can be applied to determine generalized methods of improving device operational characteristics. The most widespread clustering algorithms, and the metrics used for determining the degree of proximity between two documents, are reviewed. The article proposes a technique for determining generalized methods; it has two implementation variants and consists of 7 steps. This technique has been implemented in the “Patents search” subsystem of the “Intellect” system. The article also gives an example of the use of the proposed technique.
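    One commonly used document-proximity metric in this line of work is cosine similarity over TF-IDF vectors; a generic sketch is given below. It is not the actual implementation of the “Patents search” subsystem, and the example texts are invented.

```python
# Generic document-proximity metric: cosine similarity over TF-IDF vectors.
# Example patent descriptions are invented; not the "Patents search" subsystem.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

patents = [
    "device for improving thermal characteristics of a sensor",
    "sensor housing with improved heat dissipation",
    "method of encoding audio signals",
]

tfidf = TfidfVectorizer().fit_transform(patents)
similarity = cosine_similarity(tfidf)
print(similarity.round(2))   # pairwise proximity; higher means closer documents
```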

  9. Prospects and limitations of citizen science in invasive species management: A case study with Burmese pythons in Everglades National Park

    USGS Publications Warehouse

    Falk, Bryan; Snow, Raymond W.; Reed, Robert

    2016-01-01

    Citizen-science programs have the potential to contribute to the management of invasive species, including Python molurus bivittatus (Burmese Python) in Florida. We characterized citizen-science–generated Burmese Python information from Everglades National Park (ENP) to explore how citizen science may be useful in this effort. As an initial step, we compiled and summarized records of Burmese Python observations and removals collected by both professional and citizen scientists in ENP during 2000–2014 and found many patterns of possible significance, including changes in annual observations and in demographic composition after a cold event. These patterns are difficult to confidently interpret because the records lack search-effort information, however, and differences among years may result from differences in search effort. We began collecting search-effort information in 2014 by leveraging an ongoing citizen-science program in ENP. Program participation was generally low, with most authorized participants in 2014 not searching for the snakes at all. We discuss the possible explanations for low participation, especially how the low likelihood of observing pythons weakens incentives to search. The monthly rate of Burmese Python observations for 2014 averaged ~1 observation for every 8 h of searching, but during several months, the rate was 1 python per >40 h of searching. These low observation-rates are a natural outcome of the snakes’ low detectability—few Burmese Pythons are likely to be observed even if many are present. The general inaccessibility of the southern Florida landscape also severely limits the effectiveness of using visual searches to find and remove pythons for the purposes of population control. Instead, and despite the difficulties in incentivizing voluntary participation, the value of citizen-science efforts in the management of the Burmese Python population is in collecting search-effort information.

  10. A search for Earth-crossing asteroids, supplement

    NASA Technical Reports Server (NTRS)

    Taff, L. G.; Sorvari, J. M.; Kostishack, D. F.

    1984-01-01

    The ground-based electro-optical deep space surveillance program involves a network of computer-controlled 40-inch (1 m) telescopes equipped with large-format, low-light-level television cameras of the intensified silicon diode array type, which is to replace the Baker-Nunn photographic camera system for artificial satellite tracking. A prototype observatory was constructed where distant artificial satellites are discriminated from stars in real time on the basis of the satellites' proper motion. Hardware was modified and the technique was used to observe and search for minor planets. Asteroids are now routinely observed and searched for. The complete observing cycle, including the 2"-3" measurement of position, requires about four minutes at present. The commonality of asteroid and artificial satellite observing, searching, data reduction, and orbital analysis is stressed. Improvements to the hardware and software as well as operational techniques are considered.

  11. The CTBTO Link to the database of the International Seismological Centre (ISC)

    NASA Astrophysics Data System (ADS)

    Bondar, I.; Storchak, D. A.; Dando, B.; Harris, J.; Di Giacomo, D.

    2011-12-01

    The CTBTO Link to the database of the International Seismological Centre (ISC) is a project to provide access to seismological data sets maintained by the ISC using specially designed interactive tools. The Link is open to National Data Centres and to the CTBTO. By means of graphical interfaces and database queries tailored to the needs of the monitoring community, the users are given access to a multitude of products. These include the ISC and ISS bulletins, covering the seismicity of the Earth since 1904; nuclear and chemical explosions; the EHB bulletin; the IASPEI Reference Event list (ground truth database); and the IDC Reviewed Event Bulletin. The searches are divided into three main categories: The Area Based Search (a spatio-temporal search based on the ISC Bulletin), the REB search (a spatio-temporal search based on specific events in the REB) and the IMS Station Based Search (a search for historical patterns in the reports of seismic stations close to a particular IMS seismic station). The outputs are HTML based web-pages with a simplified version of the ISC Bulletin showing the most relevant parameters with access to ISC, GT, EHB and REB Bulletins in IMS1.0 format for single or multiple events. The CTBTO Link offers a tool to view REB events in context within the historical seismicity, look at observations reported by non-IMS networks, and investigate station histories and residual patterns for stations registered in the International Seismographic Station Registry.

  12. Search prefilters to assist in library searching of infrared spectra of automotive clear coats.

    PubMed

    Lavine, Barry K; Fasasi, Ayuba; Mirjankar, Nikhil; White, Collin; Sandercock, Mark

    2015-01-01

    Clear coat searches of the infrared (IR) spectral library of the paint data query (PDQ) forensic database often generate an unusable number of hits that span multiple manufacturers, assembly plants, and years. To improve the accuracy of the hit list, pattern recognition methods have been used to develop search prefilters (i.e., principal component models) that differentiate between similar but non-identical IR spectra of clear coats on the basis of manufacturer (e.g., General Motors, Ford, Chrysler) or assembly plant. A two step procedure to develop these search prefilters was employed. First, the discrete wavelet transform was used to decompose each IR spectrum into wavelet coefficients to enhance subtle but significant features in the spectral data. Second, a genetic algorithm for IR spectral pattern recognition was employed to identify wavelet coefficients characteristic of the manufacturer or assembly plant of the vehicle. Even in challenging trials where the paint samples evaluated were all from the same manufacturer (General Motors) within a limited production year range (2000-2006), the respective assembly plant of the vehicle was correctly identified. Search prefilters to identify assembly plants were successfully validated using 10 blind samples provided by the Royal Canadian Mounted Police (RCMP) as part of a study to populate PDQ to current production years, whereas the search prefilter to discriminate among automobile manufacturers was successfully validated using IR spectra obtained directly from the PDQ database. Copyright © 2014 Elsevier B.V. All rights reserved.
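    A hedged sketch of the two-step idea, wavelet decomposition of each spectrum followed by a principal component model over selected coefficients, is given below. Synthetic spectra stand in for the clear-coat IR data, and the genetic-algorithm coefficient selection is replaced by a simple variance ranking purely for illustration; PyWavelets and scikit-learn are assumed to be available.

```python
# Sketch of the two-step prefilter idea: discrete wavelet transform of each IR
# spectrum, then a principal component model over selected coefficients. The GA
# coefficient selection is replaced by a crude variance ranking. Illustrative only.
import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(1)
spectra = rng.random((20, 1024))                  # stand-in for clear-coat IR spectra

def wavelet_features(spectrum, wavelet="db4", level=4):
    coeffs = pywt.wavedec(spectrum, wavelet, level=level)
    return np.concatenate(coeffs)

features = np.array([wavelet_features(s) for s in spectra])
top = np.argsort(features.var(axis=0))[-50:]      # crude surrogate for GA selection
scores = PCA(n_components=2).fit_transform(features[:, top])
print(scores.shape)                               # (20, 2) principal component scores
```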

  13. Data classification using metaheuristic Cuckoo Search technique for Levenberg Marquardt back propagation (CSLM) algorithm

    NASA Astrophysics Data System (ADS)

    Nawi, Nazri Mohd.; Khan, Abdullah; Rehman, M. Z.

    2015-05-01

    Nature-inspired metaheuristic techniques provide derivative-free solutions to complex problems. One of the latest additions to this group of nature-inspired optimization procedures is the Cuckoo Search (CS) algorithm. Artificial Neural Network (ANN) training is an optimization task, since the goal is to find an optimal set of weights for the network during the training process. Traditional training algorithms have limitations such as getting trapped in local minima and slow convergence. This study proposes a new technique, CSLM, which combines the best features of two known algorithms, back-propagation (BP) and the Levenberg-Marquardt (LM) algorithm, to improve the convergence speed of ANN training and avoid the local minima problem. Selected benchmark classification datasets are used for simulation. The experimental results show that the proposed cuckoo search with Levenberg-Marquardt algorithm performs better than the other algorithms used in this study.
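    A minimal sketch of the cuckoo search metaheuristic itself, using Lévy flights on a toy sphere function, is given below; the paper's CSLM hybrid (the coupling with Levenberg-Marquardt back-propagation training) is not reproduced, and all parameter values are illustrative.

```python
# Minimal cuckoo search via Levy flights on a toy sphere function. Parameters
# are illustrative; the CSLM coupling with Levenberg-Marquardt is not shown.
import numpy as np
from math import gamma, sin, pi

rng = np.random.default_rng(0)

def sphere(x):
    return float(np.sum(x ** 2))

def levy_step(dim, beta=1.5):
    # Mantegna's algorithm for Levy-flight step lengths
    sigma = (gamma(1 + beta) * sin(pi * beta / 2) /
             (gamma((1 + beta) / 2) * beta * 2 ** ((beta - 1) / 2))) ** (1 / beta)
    u = rng.normal(0.0, sigma, dim)
    v = rng.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / beta)

n_nests, dim, pa, iters = 15, 5, 0.25, 200
nests = rng.uniform(-5, 5, (n_nests, dim))
fitness = np.array([sphere(n) for n in nests])

for _ in range(iters):
    best = nests[fitness.argmin()]
    for i in range(n_nests):
        # new solution via a Levy flight biased toward the current best nest
        candidate = nests[i] + 0.01 * levy_step(dim) * (nests[i] - best)
        if sphere(candidate) < fitness[i]:
            nests[i], fitness[i] = candidate, sphere(candidate)
    # abandon a fraction pa of the worst nests and rebuild them at random
    worst = fitness.argsort()[-int(pa * n_nests):]
    nests[worst] = rng.uniform(-5, 5, (len(worst), dim))
    fitness[worst] = [sphere(n) for n in nests[worst]]

print("best objective value found:", fitness.min())
```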

  14. Near Real-Time Imaging of the Galactic Plane with BATSE

    NASA Technical Reports Server (NTRS)

    Harmon, B. A.; Zhang, S. N.; Robinson, C. R.; Paciesas, W. S.; Barret, D.; Grindlay, J.; Bloser, P.; Monnelly, C.

    1997-01-01

    The discovery of new transient or persistent sources in the hard X-ray regime with the BATSE Earth occultation technique has previously been limited to bright sources of about 200 mCrab or more. While monitoring known source locations is not a problem down to a daily limiting sensitivity of about 75 mCrab, the lack of a reliable background model forces us to use more computationally intensive techniques to find weak, previously unknown emission from hard X-ray/gamma-ray sources. The combination of Radon transform imaging of the galactic plane in 10 by 10 degree fields and the Harvard/CfA-developed Image Search (CBIS) allows us to straightforwardly search the sky for candidate sources in a +/- 20 degree latitude band along the plane. This procedure has been operating routinely on a weekly basis since spring 1997. We briefly describe the procedure, then concentrate on the performance aspects of the technique and candidate source results from the search.
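    As a hedged, generic illustration of Radon-transform imaging and reconstruction (not the BATSE Earth-occultation pipeline), the following sketch projects and reconstructs a standard test phantom, assuming scikit-image is available:

```python
# Generic Radon-transform projection and reconstruction on a test phantom.
# Illustrative only; this is not the BATSE occultation imaging pipeline.
import numpy as np
from skimage.data import shepp_logan_phantom
from skimage.transform import radon, iradon, rescale

image = rescale(shepp_logan_phantom(), 0.25, mode="reflect")
angles = np.linspace(0.0, 180.0, max(image.shape), endpoint=False)
sinogram = radon(image, theta=angles)            # forward projection
reconstruction = iradon(sinogram, theta=angles)  # filtered back-projection
print("mean reconstruction error:", float(np.abs(reconstruction - image).mean()))
```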

  15. Searching for chemical classes among metal-poor stars using medium-resolution spectroscopy

    NASA Astrophysics Data System (ADS)

    Cruz, Monique A.; Cogo-Moreira, Hugo; Rossi, Silvia

    2018-04-01

    Astronomy is in the era of large spectroscopy surveys, with the spectra of hundreds of thousands of stars in the Galaxy being collected. Although most of these surveys have low or medium resolution, which makes precise abundance measurements not possible, there is still important information to be extracted from the available data. Our aim is to identify chemically distinct classes among metal-poor stars, observed by the Sloan Digital Sky Survey, using line indices. The present work focused on carbon-enhanced metal-poor (CEMP) stars and their subclasses. We applied the latent profile analysis technique to line indices for carbon, barium, iron and europium, in order to separate the sample into classes with similar chemical signatures. This technique provides not only the number of possible groups but also the probability of each object to belong to each class. The method was able to distinguish at least two classes among the observed sample, with one of them being probable CEMP stars enriched in s-process elements. However, it was not able to separate CEMP-no stars from the rest of the sample. Latent profile analysis is a powerful model-based tool to be used in the identification of patterns in astrophysics. Our tests show the potential of the technique for the attainment of additional chemical information from `poor' data.
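
    A minimal sketch of the idea, assuming a Gaussian mixture model as a stand-in for latent profile analysis and synthetic line-index values in place of the survey data; the class count is chosen by BIC and per-star membership probabilities are read off the fitted model.

      # Sketch: approximating latent profile analysis with a Gaussian mixture model
      # over stellar line indices. Index values below are synthetic placeholders.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(2)
      # columns: carbon, barium, iron, europium line indices (synthetic)
      indices = np.vstack([rng.normal([1.2, 0.8, -2.5, 0.1], 0.2, (50, 4)),
                           rng.normal([0.3, 0.2, -2.8, 0.0], 0.2, (50, 4))])

      # choose the number of classes by BIC, then read off membership probabilities
      models = [GaussianMixture(k, random_state=0).fit(indices) for k in range(1, 5)]
      best = min(models, key=lambda m: m.bic(indices))
      print("classes:", best.n_components)
      print("P(class | star) for the first star:", best.predict_proba(indices[:1]).round(3))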

  16. Pattern recognition technique

    NASA Technical Reports Server (NTRS)

    Hong, J. P.

    1971-01-01

    Technique operates regardless of pattern rotation, translation or magnification and successfully detects out-of-register patterns. It improves accuracy and reduces cost of various optical character recognition devices and page readers and provides data input to computer.

  17. Association between Stock Market Gains and Losses and Google Searches

    PubMed Central

    Arditi, Eli; Yechiam, Eldad; Zahavi, Gal

    2015-01-01

    Experimental studies in the area of Psychology and Behavioral Economics have suggested that people change their search pattern in response to positive and negative events. Using Internet search data provided by Google, we investigated the relationship between stock-specific events and related Google searches. We studied daily data from 13 stocks from the Dow-Jones and NASDAQ100 indices, over a period of 4 trading years. Focusing on periods in which stocks were extensively searched (Intensive Search Periods), we found a correlation between the magnitude of stock returns at the beginning of the period and the volume, peak, and duration of search generated during the period. This relation between magnitudes of stock returns and subsequent searches was considerably magnified in periods following negative stock returns. Yet, we did not find that intensive search periods following losses were associated with more Google searches than periods following gains. Thus, rather than increasing search, losses improved the fit between people’s search behavior and the extent of real-world events triggering the search. The findings demonstrate the robustness of the attentional effect of losses. PMID:26513371

  18. Are There Patterns of Bruising in Childhood Which Are Diagnostic or Suggestive of Abuse? A Systematic Review

    ERIC Educational Resources Information Center

    Maguire, S.; Mann, M. K.; Sibert, J.; Kemp, A.

    2005-01-01

    Aims: To investigate what patterns of bruising are diagnostic or suggestive of child abuse by means of a systematic review. Methods: All language literature search 1951-2004. Included: studies that defined patterns of bruising in non-abused or abused children <18 years. Excluded: personal practice, review articles, single case reports, inadequate…

  19. Using a Search Engine-Based Mutually Reinforcing Approach to Assess the Semantic Relatedness of Biomedical Terms

    PubMed Central

    Hsu, Yi-Yu; Chen, Hung-Yu; Kao, Hung-Yu

    2013-01-01

    Background Determining the semantic relatedness of two biomedical terms is an important task for many text-mining applications in the biomedical field. Previous studies, such as those using ontology-based and corpus-based approaches, measured semantic relatedness by using information from the structure of biomedical literature, but these methods are limited by the small size of training resources. To increase the size of training datasets, the outputs of search engines have been used extensively to analyze the lexical patterns of biomedical terms. Methodology/Principal Findings In this work, we propose the Mutually Reinforcing Lexical Pattern Ranking (ReLPR) algorithm for learning and exploring the lexical patterns of synonym pairs in biomedical text. ReLPR employs lexical patterns and their pattern containers to assess the semantic relatedness of biomedical terms. By combining sentence structures and the linking activities between containers and lexical patterns, our algorithm can explore the correlation between two biomedical terms. Conclusions/Significance The average correlation coefficient of the ReLPR algorithm was 0.82 for various datasets. The results of the ReLPR algorithm were significantly superior to those of previous methods. PMID:24348899

  20. Survey of optimization techniques for nonlinear spacecraft trajectory searches

    NASA Technical Reports Server (NTRS)

    Wang, Tseng-Chan; Stanford, Richard H.; Sunseri, Richard F.; Breckheimer, Peter J.

    1988-01-01

    Mathematical analysis of the optimal search of a nonlinear spacecraft trajectory to arrive at a set of desired targets is presented. A high precision integrated trajectory program and several optimization software libraries are used to search for a converged nonlinear spacecraft trajectory. Several examples for the Galileo Jupiter Orbiter and the Ocean Topography Experiment (TOPEX) are presented that illustrate a variety of the optimization methods used in nonlinear spacecraft trajectory searches.

  1. A diagnostic technique used to obtain cross range radiation centers from antenna patterns

    NASA Technical Reports Server (NTRS)

    Lee, T. H.; Burnside, W. D.

    1988-01-01

    A diagnostic technique to obtain cross range radiation centers based on antenna radiation patterns is presented. This method is similar to the synthetic aperture processing of scattered fields in the radar application. Coherent processing of the radiated fields is used to determine the various radiation centers associated with the far-zone pattern of an antenna for a given radiation direction. This technique can be used to identify an unexpected radiation center that creates an undesired effect in a pattern; on the other hand, it can improve a numerical simulation of the pattern by identifying other significant mechanisms. Cross range results for two 8' reflector antennas are presented to illustrate as well as validate that technique.

  2. Have Users Changed Their Style? A Survey of CD-ROM vs. OPAC Product Usage.

    ERIC Educational Resources Information Center

    Anderson, Judy

    1995-01-01

    A survey of online search techniques of 50 undergraduate and graduate students at Arizona State University's Hayden Library revealed heavy reliance on simple subject searching. The analysis included search types, use of library personnel and online help screens, exposure to library instruction, and length of time at the terminal for citation…

  3. Taking It to the Top: A Lesson in Search Engine Optimization

    ERIC Educational Resources Information Center

    Frydenberg, Mark; Miko, John S.

    2011-01-01

    Search engine optimization (SEO), the promoting of a Web site so it achieves optimal position with a search engine's rankings, is an important strategy for organizations and individuals in order to promote their brands online. Techniques for achieving SEO are relevant to students of marketing, computing, media arts, and other disciplines, and many…

  4. Internet Power Searching: The Advanced Manual. 2nd Edition. Neal-Schuman NetGuide Series.

    ERIC Educational Resources Information Center

    Bradley, Phil

    This handbook provides information on how Internet search engines and related software and utilities work and how to use them in order to improve search techniques. The book begins with an introduction to the Internet. Part 1 contains the following chapters that cover mining the Internet for information: "An Introduction to Search…

  5. Collaborative search in electronic health records

    PubMed Central

    Mei, Qiaozhu; Hanauer, David A

    2011-01-01

    Objective A full-text search engine can be a useful tool for augmenting the reuse value of unstructured narrative data stored in electronic health records (EHR). A prominent barrier to the effective utilization of such tools originates from users' lack of search expertise and/or medical-domain knowledge. To mitigate the issue, the authors experimented with a ‘collaborative search’ feature through a homegrown EHR search engine that allows users to preserve their search knowledge and share it with others. This feature was inspired by the success of many social information-foraging techniques used on the web that leverage users' collective wisdom to improve the quality and efficiency of information retrieval. Design The authors conducted an empirical evaluation study over a 4-year period. The user sample consisted of 451 academic researchers, medical practitioners, and hospital administrators. The data were analyzed using a social-network analysis to delineate the structure of the user collaboration networks that mediated the diffusion of knowledge of search. Results The users embraced the concept with considerable enthusiasm. About half of the EHR searches processed by the system (0.44 million) were based on stored search knowledge; 0.16 million utilized shared knowledge made available by other users. The social-network analysis results also suggest that the user-collaboration networks engendered by the collaborative search feature played an instrumental role in enabling the transfer of search knowledge across people and domains. Conclusion Applying collaborative search, a social information-foraging technique popularly used on the web, may provide the potential to improve the quality and efficiency of information retrieval in healthcare. PMID:21486887

  6. Locating underwater objects. [technology transfer

    NASA Technical Reports Server (NTRS)

    Grice, C. F.

    1974-01-01

    Underwater search operations are considered to be engineering and operational problems. A process for proper definition of the problem and selection of instrumentation and operational procedures is described. An outline of underwater search instrumentation and techniques is given.

  7. Bio-inspired computational heuristics to study Lane-Emden systems arising in astrophysics model.

    PubMed

    Ahmad, Iftikhar; Raja, Muhammad Asif Zahoor; Bilal, Muhammad; Ashraf, Farooq

    2016-01-01

    This study reports novel hybrid computational methods for solving the nonlinear singular Lane-Emden type differential equations arising in astrophysics models by exploiting the strength of unsupervised neural network models and stochastic optimization techniques. In the scheme, the neural network, a sub-part of the large field called soft computing, is exploited for modelling the equation in an unsupervised manner. The proposed approximate solutions of the higher-order ordinary differential equation are calculated with the weights of neural networks trained with a genetic algorithm, and with pattern search hybridized with sequential quadratic programming for rapid local convergence. The results of the proposed solvers for the nonlinear singular systems are in good agreement with the standard solutions. Accuracy and convergence of the design schemes are demonstrated by statistical performance measures based on a sufficiently large number of independent runs.
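
    A compact sketch of the general approach, under several assumptions: a small tanh network represents the trial solution, derivatives are taken by finite differences, and SciPy's differential evolution and Nelder-Mead stand in for the paper's genetic-algorithm and pattern-search/SQP stages.

      # Sketch: unsupervised neural-network solution of a Lane-Emden equation
      #   y'' + (2/x) y' + y^m = 0,  y(0)=1, y'(0)=0   (here m = 0, exact y = 1 - x^2/6).
      import numpy as np
      from scipy.optimize import differential_evolution, minimize

      x = np.linspace(0.05, 1.0, 20)          # collocation points (avoid the x = 0 singularity)
      m = 0

      def trial(xv, w):
          a, b, c = w[:4], w[4:8], w[8:12]     # 4 hidden tanh units
          net = np.tanh(np.outer(xv, a) + b) @ c
          return 1.0 + xv**2 * net             # satisfies y(0)=1, y'(0)=0 by construction

      def fitness(w, h=1e-4):
          y = trial(x, w)
          yp = (trial(x + h, w) - trial(x - h, w)) / (2 * h)
          ypp = (trial(x + h, w) - 2 * y + trial(x - h, w)) / h**2
          return np.mean((ypp + 2.0 / x * yp + y**m) ** 2)   # mean squared ODE residual

      coarse = differential_evolution(fitness, [(-3, 3)] * 12, maxiter=60, seed=0)
      best = minimize(fitness, coarse.x, method="Nelder-Mead")
      print("residual:", best.fun, " max error vs exact:",
            np.max(np.abs(trial(x, best.x) - (1 - x**2 / 6))))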

  8. Marker Registration Technique for Handwritten Text Marker in Augmented Reality Applications

    NASA Astrophysics Data System (ADS)

    Thanaborvornwiwat, N.; Patanukhom, K.

    2018-04-01

    Marker registration is a fundamental process for estimating camera poses in marker-based Augmented Reality (AR) systems. We developed an AR system that creates corresponding virtual objects on handwritten text markers. This paper presents a new registration method that is robust to low-content text markers, variation in camera poses, and variation in handwriting styles. The proposed method uses Maximally Stable Extremal Regions (MSER) and polygon simplification for feature point extraction. The experiment shows that extracting only five feature points per image provides the best registration results. An exhaustive search is used to find the best matching pattern of the feature points in two images. We also compared the performance of the proposed method to some existing registration methods and found that the proposed method provides better accuracy and time efficiency.
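
    The sketch below shows MSER region detection followed by polygon simplification on a synthetically rendered word, assuming OpenCV; the rendered image, thresholds, and the way candidate points are reported are illustrative choices rather than the paper's pipeline.

      # Sketch: MSER-based feature-point extraction from a handwritten-text-like marker,
      # followed by polygon simplification of each region outline.
      import cv2
      import numpy as np

      img = np.full((200, 400), 255, np.uint8)
      cv2.putText(img, "marker", (30, 120), cv2.FONT_HERSHEY_SIMPLEX, 2.5, 0, 6)

      mser = cv2.MSER_create()
      regions, _ = mser.detectRegions(img)

      points = []
      for region in regions:
          hull = cv2.convexHull(region.reshape(-1, 1, 2))
          eps = 0.02 * cv2.arcLength(hull, True)
          poly = cv2.approxPolyDP(hull, eps, True)      # simplify the region outline
          points.extend(poly.reshape(-1, 2).tolist())

      # The paper keeps only a handful of points (five per image); here we simply
      # report how many candidates the detector produced.
      print("candidate feature points:", len(points))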

  9. High Altitude Platforms for Disaster Recovery: Capabilities, Strategies, and Techniques for Providing Emergency Telecommunications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Juan D. Deaton

    2008-05-01

    Natural disasters and terrorist acts have significant potential to disrupt emergency communication systems. These emergency communication networks include first-responder, cellular, landline, and emergency answering services such as 911, 112, or 999. Without these essential emergency communications capabilities, search, rescue, and recovery operations during a catastrophic event will be severely debilitated. High altitude platforms could be fitted with telecommunications equipment and used to support these critical communications missions once the catastrophic event occurs. With the ability to be continuously on station, HAPs provide excellent options for providing emergency coverage over high-risk areas before catastrophic incidents occur. HAPs could also provide enhanced 911 capabilities using either GPS or reference stations. This paper proposes a potential emergency communications architecture and presents a method for estimating emergency communications systems traffic patterns for a catastrophic event.

  10. Pattern recognition of electronic bit-sequences using a semiconductor mode-locked laser and spatial light modulators

    NASA Astrophysics Data System (ADS)

    Bhooplapur, Sharad; Akbulut, Mehmetkan; Quinlan, Franklyn; Delfyett, Peter J.

    2010-04-01

    A novel scheme for recognition of electronic bit-sequences is demonstrated. Two electronic bit-sequences that are to be compared are each mapped to a unique code from a set of Walsh-Hadamard codes. The codes are then encoded in parallel on the spectral phase of the frequency comb lines from a frequency-stabilized mode-locked semiconductor laser. Phase encoding is achieved by using two independent spatial light modulators based on liquid crystal arrays. Encoded pulses are compared using interferometric pulse detection and differential balanced photodetection. Orthogonal codes eight bits long are compared, and matched codes are successfully distinguished from mismatched codes with very low error rates of around 10^-18. This technique has potential for high-speed, high accuracy recognition of bit-sequences, with applications in keyword searches and internet protocol packet routing.
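
    The following sketch models only the code algebra: bit-sequences are mapped to rows of a Hadamard matrix and compared by normalized correlation. The optical spectral-phase encoding and interferometric detection are not represented, and the bit-to-row mapping is a hypothetical choice.

      # Sketch: Walsh-Hadamard codes and matched vs. mismatched comparison.
      import numpy as np
      from scipy.linalg import hadamard

      H = hadamard(8)                      # eight orthogonal length-8 codes (+1/-1)
      code_for = lambda bit_sequence: H[int(bit_sequence, 2) % 8]

      a = code_for("101")                  # hypothetical bit-sequences mapped to rows of H
      b = code_for("101")
      c = code_for("010")

      corr = lambda u, v: float(u @ v) / len(u)
      print("matched:", corr(a, b))        # 1.0
      print("mismatched:", corr(a, c))     # 0.0 (orthogonal codes)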

  11. Using linear algebra for protein structural comparison and classification

    PubMed Central

    2009-01-01

    In this article, we describe a novel methodology to extract semantic characteristics from protein structures using linear algebra in order to compose structural signature vectors which may be used efficiently to compare and classify protein structures into fold families. These signatures are built from the pattern of hydrophobic intrachain interactions using Singular Value Decomposition (SVD) and Latent Semantic Indexing (LSI) techniques. Considering proteins as documents and contacts as terms, we have built a retrieval system which is able to find conserved contacts in samples of myoglobin fold family and to retrieve these proteins among proteins of varied folds with precision of up to 80%. The classifier is a web tool available at our laboratory website. Users can search for similar chains from a specific PDB, view and compare their contact maps and browse their structures using a JMol plug-in. PMID:21637532

  12. Using linear algebra for protein structural comparison and classification.

    PubMed

    Gomide, Janaína; Melo-Minardi, Raquel; Dos Santos, Marcos Augusto; Neshich, Goran; Meira, Wagner; Lopes, Júlio César; Santoro, Marcelo

    2009-07-01

    In this article, we describe a novel methodology to extract semantic characteristics from protein structures using linear algebra in order to compose structural signature vectors which may be used efficiently to compare and classify protein structures into fold families. These signatures are built from the pattern of hydrophobic intrachain interactions using Singular Value Decomposition (SVD) and Latent Semantic Indexing (LSI) techniques. Considering proteins as documents and contacts as terms, we have built a retrieval system which is able to find conserved contacts in samples of myoglobin fold family and to retrieve these proteins among proteins of varied folds with precision of up to 80%. The classifier is a web tool available at our laboratory website. Users can search for similar chains from a specific PDB, view and compare their contact maps and browse their structures using a JMol plug-in.
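
    A minimal sketch of the document/term analogy described above, assuming a random binary contact matrix in place of real contact maps: truncated SVD produces the structural signature vectors and cosine similarity ranks chains against a query.

      # Sketch: proteins as "documents", hydrophobic contacts as "terms", LSI signatures.
      import numpy as np
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity

      rng = np.random.default_rng(3)
      # rows = proteins, columns = possible residue-residue contacts (binary occurrence)
      contacts = (rng.random((30, 500)) < 0.05).astype(float)

      signatures = TruncatedSVD(n_components=10, random_state=0).fit_transform(contacts)

      # retrieve the structures most similar to a query chain (protein 0)
      sims = cosine_similarity(signatures[:1], signatures)[0]
      print("top matches:", np.argsort(-sims)[:5])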

  13. Advantages and Disadvantages of Transtibial, Anteromedial Portal, and Outside-In Femoral Tunnel Drilling in Single-Bundle Anterior Cruciate Ligament Reconstruction: A Systematic Review.

    PubMed

    Robin, Brett N; Jani, Sunil S; Marvil, Sean C; Reid, John B; Schillhammer, Carl K; Lubowitz, James H

    2015-07-01

    Controversy exists regarding the best method for creating the knee anterior cruciate ligament (ACL) femoral tunnel or socket. The purpose of this study was to systematically review the risks, benefits, advantages, and disadvantages of the endoscopic transtibial (TT) technique, anteromedial portal technique, outside-in technique, and outside-in retrograde drilling technique for creating the ACL femoral tunnel. A PubMed search of English-language studies published between January 1, 2000, and February 17, 2014, was performed using the following keywords: "anterior cruciate ligament" AND "femoral tunnel." Included were studies reporting risks, benefits, advantages, and/or disadvantages of any ACL femoral technique. In addition, references of included articles were reviewed to identify potential studies missed in the original search. A total of 27 articles were identified through the search. TT technique advantages include familiarity and proven long-term outcomes; disadvantages include the risk of nonanatomic placement because of constrained (TT) drilling. Anteromedial portal technique advantages include unconstrained anatomic placement; disadvantages include technical challenges, short tunnels or sockets, and posterior-wall blowout. Outside-in technique advantages include unconstrained anatomic placement; disadvantages include the need for 2 incisions. Retrograde drilling technique advantages include unconstrained anatomic placement, as well as all-epiphyseal drilling in skeletally immature patients; disadvantages include the need for fluoroscopy for all-epiphyseal drilling. There is no one, single, established "gold-standard" technique for creation of the ACL femoral socket. Four accepted techniques show diverse and subjective advantages, disadvantages, risks, and benefits. Level V, systematic review of Level II through V evidence. Copyright © 2015 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  14. Improve Data Mining and Knowledge Discovery Through the Use of MatLab

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Martin, Dawn (Elliott); Beil, Robert

    2011-01-01

    Data mining is widely used to mine business, engineering, and scientific data. Data mining uses pattern-based queries, searches, or other analyses of one or more electronic databases/datasets in order to discover or locate a predictive pattern or anomaly indicative of system failure, criminal or terrorist activity, etc. There are various algorithms, techniques and methods used to mine data, including neural networks, genetic algorithms, decision trees, the nearest neighbor method, rule induction, association analysis, slice and dice, segmentation, and clustering. These algorithms, techniques, and methods for detecting patterns in a dataset have been used in the development of numerous open source and commercially available products and technologies for data mining. Data mining is best realized when latent information in a large quantity of stored data is discovered. No one technique solves all data mining problems; the challenge is to select algorithms or methods appropriate for strengthening data/text mining and trending within given datasets. In recent years, throughout industry, academia and government agencies, thousands of data systems have been designed and tailored to serve specific engineering and business needs. Many of these systems use databases with relational algebra and structured query language to categorize and retrieve data. In these systems, data analyses are limited and require prior explicit knowledge of metadata and database relations, and lack exploratory data mining and discovery of latent information. This presentation introduces MatLab(R) (MATrix LABoratory), an engineering and scientific data analysis tool for performing data mining. MatLab was originally intended to perform purely numerical calculations (a glorified calculator). Now, in addition to having hundreds of mathematical functions, it is a programming language with hundreds of built-in standard functions and numerous available toolboxes. MatLab's ease of data processing and visualization and its enormous range of built-in functionalities and toolboxes make it suitable for numerical computations and simulations as well as for data mining. Engineers and scientists can take advantage of the readily available functions/toolboxes to gain wider insight into their respective data mining experiments.

  15. Improve Data Mining and Knowledge Discovery through the use of MatLab

    NASA Technical Reports Server (NTRS)

    Shaykhian, Gholam Ali; Martin, Dawn Elliott; Beil, Robert

    2011-01-01

    Data mining is widely used to mine business, engineering, and scientific data. Data mining uses pattern-based queries, searches, or other analyses of one or more electronic databases/datasets in order to discover or locate a predictive pattern or anomaly indicative of system failure, criminal or terrorist activity, etc. There are various algorithms, techniques and methods used to mine data, including neural networks, genetic algorithms, decision trees, the nearest neighbor method, rule induction, association analysis, slice and dice, segmentation, and clustering. These algorithms, techniques, and methods for detecting patterns in a dataset have been used in the development of numerous open source and commercially available products and technologies for data mining. Data mining is best realized when latent information in a large quantity of stored data is discovered. No one technique solves all data mining problems; the challenge is to select algorithms or methods appropriate for strengthening data/text mining and trending within given datasets. In recent years, throughout industry, academia and government agencies, thousands of data systems have been designed and tailored to serve specific engineering and business needs. Many of these systems use databases with relational algebra and structured query language to categorize and retrieve data. In these systems, data analyses are limited and require prior explicit knowledge of metadata and database relations, and lack exploratory data mining and discovery of latent information. This presentation introduces MatLab(TM) (MATrix LABoratory), an engineering and scientific data analysis tool for performing data mining. MatLab was originally intended to perform purely numerical calculations (a glorified calculator). Now, in addition to having hundreds of mathematical functions, it is a programming language with hundreds of built-in standard functions and numerous available toolboxes. MatLab's ease of data processing and visualization and its enormous range of built-in functionalities and toolboxes make it suitable for numerical computations and simulations as well as for data mining. Engineers and scientists can take advantage of the readily available functions/toolboxes to gain wider insight into their respective data mining experiments.

  16. Search Regimes and the Industrial Dynamics of Science

    ERIC Educational Resources Information Center

    Bonaccorsi, Andrea

    2008-01-01

    The article addresses the issue of dynamics of science, in particular of new sciences born in twentieth century and developed after the Second World War (information science, materials science, life science). The article develops the notion of search regime as an abstract characterization of dynamic patterns, based on three dimensions: the rate of…

  17. A Semiotic Analysis of Icons on the World Wide Web.

    ERIC Educational Resources Information Center

    Ma, Yan

    The World Wide Web allows users to interact with a graphic interface to search information in a hypermedia and multimedia environment. Graphics serve as reference points on the World Wide Web for searching and retrieving information. This study analyzed the culturally constructed syntax patterns, or codes, embedded in the icons of library…

  18. Children, Technology, and Instruction: A Case Study of Elementary School Children Using an Online Public Access Catalog (OPAC).

    ERIC Educational Resources Information Center

    Solomon, Paul

    1994-01-01

    Examines elementary school students' use of an online public access catalog to investigate the interaction between children, technology, curriculum, instruction, and learning. Highlights include patterns of successes and breakdowns; search strategies; instructional approaches and childrens' interests; structure of interaction; search terms; and…

  19. Cooperative Control of UAVs for Localization of Intermittently Emitting Mobile Targets

    DTIC Science & Technology

    2009-08-01

    as lawn-mower serpentine patterns [21]. Second, due to the limited energy supplies intrinsic to UAV applications, it is also important that the search...Robotic Embedded Systems Laboratory, Univ. Southern Calif., Los Angeles, CA, 2002. Tech. Rep. [21] J. Ousingsawat and M. G. Earl, “Modified lawn-mower search

  20. Application of multivariable search techniques to structural design optimization

    NASA Technical Reports Server (NTRS)

    Jones, R. T.; Hague, D. S.

    1972-01-01

    Multivariable optimization techniques are applied to a particular class of minimum weight structural design problems: the design of an axially loaded, pressurized, stiffened cylinder. Minimum weight designs are obtained by a variety of search algorithms: first- and second-order, elemental perturbation, and randomized techniques. An exterior penalty function approach to constrained minimization is employed. Some comparisons are made with solutions obtained by an interior penalty function procedure. In general, it would appear that an interior penalty function approach may not be as well suited to the class of design problems considered as the exterior penalty function approach. It is also shown that a combination of search algorithms will tend to arrive at an extremal design in a more reliable manner than a single algorithm. The effect of incorporating realistic geometrical constraints on stiffener cross-sections is investigated. A limited comparison is made between minimum weight cylinders designed on the basis of a linear stability analysis and cylinders designed on the basis of empirical buckling data. Finally, a technique for locating more than one extremal is demonstrated.
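
    A small sketch of the exterior penalty idea used here, with a toy objective and a single inequality constraint standing in for the structural weight and buckling margins; the penalty factor grows between successive unconstrained minimizations.

      # Sketch of the exterior penalty approach to constrained minimization:
      # constraint violations are added to the objective with a growing penalty factor.
      import numpy as np
      from scipy.optimize import minimize

      def weight(x):                 # stand-in for structural weight
          return x[0]**2 + 2 * x[1]**2

      def g(x):                      # stand-in constraint g(x) >= 0 (e.g. a buckling margin)
          return x[0] + x[1] - 1.0

      def penalized(x, r):
          return weight(x) + r * min(0.0, g(x))**2   # exterior: penalize only violations

      x, r = np.array([5.0, 5.0]), 1.0
      for _ in range(6):             # increase the penalty factor between unconstrained solves
          x = minimize(lambda z: penalized(z, r), x, method="Nelder-Mead").x
          r *= 10.0
      print("design:", x.round(4), "constraint:", round(g(x), 6))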

  1. Machine learning approaches to analysing textual injury surveillance data: a systematic review.

    PubMed

    Vallmuur, Kirsten

    2015-06-01

    To synthesise recent research on the use of machine learning approaches to mining textual injury surveillance data. Systematic review. The electronic databases which were searched included PubMed, Cinahl, Medline, Google Scholar, and Proquest. The bibliography of all relevant articles was examined and associated articles were identified using a snowballing technique. For inclusion, articles were required to meet the following criteria: (a) used a health-related database, (b) focused on injury-related cases, AND used machine learning approaches to analyse textual data. The papers identified through the search were screened resulting in 16 papers selected for review. Articles were reviewed to describe the databases and methodology used, the strength and limitations of different techniques, and quality assurance approaches used. Due to heterogeneity between studies meta-analysis was not performed. Occupational injuries were the focus of half of the machine learning studies and the most common methods described were Bayesian probability or Bayesian network based methods to either predict injury categories or extract common injury scenarios. Models were evaluated through either comparison with gold standard data or content expert evaluation or statistical measures of quality. Machine learning was found to provide high precision and accuracy when predicting a small number of categories, was valuable for visualisation of injury patterns and prediction of future outcomes. However, difficulties related to generalizability, source data quality, complexity of models and integration of content and technical knowledge were discussed. The use of narrative text for injury surveillance has grown in popularity, complexity and quality over recent years. With advances in data mining techniques, increased capacity for analysis of large databases, and involvement of computer scientists in the injury prevention field, along with more comprehensive use and description of quality assurance methods in text mining approaches, it is likely that we will see a continued growth and advancement in knowledge of text mining in the injury field. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Discovering biomedical semantic relations in PubMed queries for information retrieval and database curation.

    PubMed

    Huang, Chung-Chi; Lu, Zhiyong

    2016-01-01

    Identifying relevant papers from the literature is a common task in biocuration. Most current biomedical literature search systems primarily rely on matching user keywords. Semantic search, on the other hand, seeks to improve search accuracy by understanding the entities and contextual relations in user keywords. However, past research has mostly focused on semantically identifying biological entities (e.g. chemicals, diseases and genes) with little effort on discovering semantic relations. In this work, we aim to discover biomedical semantic relations in PubMed queries in an automated and unsupervised fashion. Specifically, we focus on extracting and understanding the contextual information (or context patterns) that is used by PubMed users to represent semantic relations between entities such as 'CHEMICAL-1 compared to CHEMICAL-2' With the advances in automatic named entity recognition, we first tag entities in PubMed queries and then use tagged entities as knowledge to recognize pattern semantics. More specifically, we transform PubMed queries into context patterns involving participating entities, which are subsequently projected to latent topics via latent semantic analysis (LSA) to avoid the data sparseness and specificity issues. Finally, we mine semantically similar contextual patterns or semantic relations based on LSA topic distributions. Our two separate evaluation experiments of chemical-chemical (CC) and chemical-disease (CD) relations show that the proposed approach significantly outperforms a baseline method, which simply measures pattern semantics by similarity in participating entities. The highest performance achieved by our approach is nearly 0.9 and 0.85 respectively for the CC and CD task when compared against the ground truth in terms of normalized discounted cumulative gain (nDCG), a standard measure of ranking quality. These results suggest that our approach can effectively identify and return related semantic patterns in a ranked order covering diverse bio-entity relations. To assess the potential utility of our automated top-ranked patterns of a given relation in semantic search, we performed a pilot study on frequently sought semantic relations in PubMed and observed improved literature retrieval effectiveness based on post-hoc human relevance evaluation. Further investigation in larger tests and in real-world scenarios is warranted. Published by Oxford University Press 2016. This work is written by US Government employees and is in the public domain in the US.
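
    A minimal sketch of the LSA projection step, assuming a handful of hypothetical context patterns in place of tagged PubMed query logs: patterns are vectorized, projected to a low-dimensional topic space, and ranked by cosine similarity to a query pattern.

      # Sketch: latent semantic analysis over query context patterns.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD
      from sklearn.metrics.pairwise import cosine_similarity
      import numpy as np

      patterns = [
          "CHEMICAL compared to CHEMICAL",
          "CHEMICAL versus CHEMICAL",
          "CHEMICAL induced DISEASE",
          "DISEASE caused by CHEMICAL",
          "CHEMICAL treatment of DISEASE",
      ]

      tfidf = TfidfVectorizer().fit_transform(patterns)
      topics = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

      query = 0   # "CHEMICAL compared to CHEMICAL"
      sims = cosine_similarity(topics[query:query + 1], topics)[0]
      for i in np.argsort(-sims):
          print(f"{sims[i]:.2f}  {patterns[i]}")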

  3. The effect of four user interface concepts on visual scan pattern similarity and information foraging in a complex decision making task.

    PubMed

    Starke, Sandra D; Baber, Chris

    2018-07-01

    User interface (UI) design can affect the quality of decision making, where decisions based on digitally presented content are commonly informed by visually sampling information through eye movements. Analysis of the resulting scan patterns - the order in which people visually attend to different regions of interest (ROIs) - gives an insight into information foraging strategies. In this study, we quantified scan pattern characteristics for participants engaging with conceptually different user interface designs. Four interfaces were modified along two dimensions relating to effort in accessing information: data presentation (either alpha-numerical data or colour blocks), and information access time (all information sources readily available or sequential revealing of information required). The aim of the study was to investigate whether a) people develop repeatable scan patterns and b) different UI concepts affect information foraging and task performance. Thirty-two participants (eight for each UI concept) were given the task to correctly classify 100 credit card transactions as normal or fraudulent based on nine transaction attributes. Attributes varied in their usefulness of predicting the correct outcome. Conventional and more recent (network analysis- and bioinformatics-based) eye tracking metrics were used to quantify visual search. Empirical findings were evaluated in context of random data and possible accuracy for theoretical decision making strategies. Results showed short repeating sequence fragments within longer scan patterns across participants and conditions, comprising a systematic and a random search component. The UI design concept showing alpha-numerical data in full view resulted in most complete data foraging, while the design concept showing colour blocks in full view resulted in the fastest task completion time. Decision accuracy was not significantly affected by UI design. Theoretical calculations showed that the difference in achievable accuracy between very complex and simple decision making strategies was small. We conclude that goal-directed search of familiar information results in repeatable scan pattern fragments (often corresponding to information sources considered particularly important), but no repeatable complete scan pattern. The underlying concept of the UI affects how visual search is performed, and a decision making strategy develops. This should be taken in consideration when designing for applied domains. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Idiosyncratic characteristics of saccadic eye movements when viewing different visual environments.

    PubMed

    Andrews, T J; Coppola, D M

    1999-08-01

    Eye position was recorded in different viewing conditions to assess whether the temporal and spatial characteristics of saccadic eye movements in different individuals are idiosyncratic. Our aim was to determine the degree to which oculomotor control is based on endogenous factors. A total of 15 naive subjects viewed five visual environments: (1) The absence of visual stimulation (i.e. a dark room); (2) a repetitive visual environment (i.e. simple textured patterns); (3) a complex natural scene; (4) a visual search task; and (5) reading text. Although differences in visual environment had significant effects on eye movements, idiosyncrasies were also apparent. For example, the mean fixation duration and size of an individual's saccadic eye movements when passively viewing a complex natural scene covaried significantly with those same parameters in the absence of visual stimulation and in a repetitive visual environment. In contrast, an individual's spatio-temporal characteristics of eye movements during active tasks such as reading text or visual search covaried together, but did not correlate with the pattern of eye movements detected when viewing a natural scene, simple patterns or in the dark. These idiosyncratic patterns of eye movements in normal viewing reveal an endogenous influence on oculomotor control. The independent covariance of eye movements during different visual tasks shows that saccadic eye movements during active tasks like reading or visual search differ from those engaged during the passive inspection of visual scenes.

  5. Getting to the top of Google: search engine optimization.

    PubMed

    Maley, Catherine; Baum, Neil

    2010-01-01

    Search engine optimization is the process of making your Web site appear at or near the top of popular search engines such as Google, Yahoo, and MSN. This is not done by luck or knowing someone working for the search engines but by understanding the process of how search engines select Web sites for placement on top or on the first page. This article will review the process and provide methods and techniques to use to have your site rated at the top or very near the top.

  6. Variable neighborhood search for reverse engineering of gene regulatory networks.

    PubMed

    Nicholson, Charles; Goodwin, Leslie; Clark, Corey

    2017-01-01

    A new search heuristic, Divided Neighborhood Exploration Search, designed to be used with inference algorithms such as Bayesian networks to improve on the reverse engineering of gene regulatory networks is presented. The approach systematically moves through the search space to find topologies representative of gene regulatory networks that are more likely to explain microarray data. In empirical testing it is demonstrated that the novel method is superior to the widely employed greedy search techniques in both the quality of the inferred networks and computational time. Copyright © 2016 Elsevier Inc. All rights reserved.
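
    The sketch below is a generic variable-neighborhood-search loop over bit-vector network topologies with a toy scoring function; it is not the authors' Divided Neighborhood Exploration Search, and no Bayesian scoring is included.

      # Generic variable-neighborhood search over candidate edge sets (bit vectors).
      import numpy as np

      rng = np.random.default_rng(4)
      n_edges = 30
      target = rng.integers(0, 2, n_edges)          # hidden "true" topology (toy)
      score = lambda s: -np.sum(s == target)        # lower is better

      def flip(s, k):
          t = s.copy()
          t[rng.choice(n_edges, k, replace=False)] ^= 1
          return t

      def local_search(s):
          improved = True
          while improved:
              improved = False
              for i in range(n_edges):
                  t = s.copy(); t[i] ^= 1
                  if score(t) < score(s):
                      s, improved = t, True
          return s

      s = rng.integers(0, 2, n_edges)
      k, k_max = 1, 5
      while k <= k_max:
          cand = local_search(flip(s, k))           # shake in neighborhood k, then descend
          if score(cand) < score(s):
              s, k = cand, 1                        # improvement: restart from neighborhood 1
          else:
              k += 1                                # otherwise widen the neighborhood
      print("recovered edges:", int(np.sum(s == target)), "of", n_edges)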

  7. Neglect assessment as an application of virtual reality.

    PubMed

    Broeren, J; Samuelsson, H; Stibrant-Sunnerhagen, K; Blomstrand, C; Rydmark, M

    2007-09-01

    In this study a cancellation task in a virtual environment was applied to describe the pattern of search and the kinematics of hand movements in eight patients with right hemisphere stroke. Four of these patients had visual neglect and four had recovered clinically from initial symptoms of neglect. The performance of the patients was compared with that of a control group consisting of eight subjects with no history of neurological deficits. Patients with neglect as well as patients clinically recovered from neglect showed aberrant search performance in the virtual reality (VR) task, such as mixed search pattern, repeated target pressures and deviating hand movements. The results indicate that in patients with a right hemispheric stroke, this VR application can provide an additional tool for assessment that can identify small variations otherwise not detectable with standard paper-and-pencil tests. VR technology seems to be well suited for the assessment of visually guided manual exploration in space.

  8. Investigating Intrinsic and Extrinsic Variables During Simulated Internet Search

    NASA Technical Reports Server (NTRS)

    Liechty, Molly M.; Madhavan, Poornima

    2011-01-01

    Using an eye tracker we examined decision-making processes during an internet search task. Twenty experienced homebuyers and twenty-five undergraduates from Old Dominion University viewed homes on a simulated real estate website. Several of the homes included physical properties that had the potential to negatively impact individual perceptions. These negative externalities were either easy to change (Level 1) or impossible to change (Level 2). Eye movements were analyzed to examine the relationship between participants' "stated preferences" [verbalized preferences], "revealed preferences" [actual decisions], and experience. Dwell times, fixation durations/counts, and saccade counts/amplitudes were analyzed. Results revealed that experienced homebuyers demonstrated a more refined search pattern than novice searchers. Experienced homebuyers were also less impacted by negative externalities. Furthermore, stated preferences were discrepant from revealed preferences; although participants initially stated they liked/disliked a graphic, their eye movement patterns did not reflect this trend. These results have important implications for design of user-friendly web interfaces.

  9. A Computer-Aided Instruction Program for Teaching the TOPS20-MM Facility on the DDN (Defense Data Network)

    DTIC Science & Technology

    1988-06-01

    Computer Assisted Instruction; Artificial Intelligence ...while he/she tries to perform given tasks. Means-ends analysis, a classic technique for solving search problems in Artificial Intelligence, has been used

  10. Implementation of GAMMON - An efficient load balancing strategy for a local computer system

    NASA Technical Reports Server (NTRS)

    Baumgartner, Katherine M.; Kling, Ralph M.; Wah, Benjamin W.

    1989-01-01

    GAMMON (Global Allocation from Maximum to Minimum in cONstant time), an efficient load-balancing algorithm, is described. GAMMON uses the available broadcast capability of multiaccess networks to implement an efficient search technique for finding hosts with maximal and minimal loads. The search technique has an average overhead which is independent of the number of participating stations. The transition from the theoretical concept to a practical, reliable, and efficient implementation is described.

  11. Task demands determine the specificity of the search template.

    PubMed

    Bravo, Mary J; Farid, Hany

    2012-01-01

    When searching for an object, an observer holds a representation of the target in mind while scanning the scene. If the observer repeats the search, performance may become more efficient as the observer hones this target representation, or "search template," to match the specific demands of the search task. An effective search template must have two characteristics: It must reliably discriminate the target from the distractors, and it must tolerate variability in the appearance of the target. The present experiment examined how the tolerance of the search template is affected by the search task. Two groups of 18 observers trained on the same set of stimuli blocked either by target image (block-by-image group) or by target category (block-by-category group). One or two days after training, both groups were tested on a related search task. The pattern of test results revealed that the two groups of observers had developed different search templates, and that the templates of the block-by-category observers better captured the general characteristics of the category. These results demonstrate that observers match their search templates to the demands of the search task.

  12. Computer assisted analysis of auroral images obtained from high altitude polar satellites

    NASA Technical Reports Server (NTRS)

    Samadani, Ramin; Flynn, Michael

    1993-01-01

    Automatic techniques that allow the extraction of physically significant parameters from auroral images were developed. This allows the processing of a much larger number of images than is currently possible with manual techniques. Our techniques were applied to diverse auroral image datasets. These results were made available to geophysicists at NASA and at universities in the form of a software system that performs the analysis. After some feedback from users, an upgraded system was transferred to NASA and to two universities. The feasibility of user-trained search and retrieval of large amounts of data using our automatically derived parameter indices was demonstrated. Techniques based on classification and regression trees (CART) were developed and applied to broaden the types of images to which the automated search and retrieval may be applied. Our techniques were tested with DE-1 auroral images.
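
    As a loose illustration of the CART-based indexing, the sketch below trains a small decision tree on hypothetical per-image parameters; the features, labels, and values are invented and not derived from DE-1 imagery.

      # Sketch: a CART-style classifier over per-image feature vectors.
      import numpy as np
      from sklearn.tree import DecisionTreeClassifier

      rng = np.random.default_rng(5)
      # hypothetical features per image: mean brightness, oval boundary latitude, arc width
      X = np.vstack([rng.normal([0.2, 65, 2], [0.05, 3, 0.5], (40, 3)),
                     rng.normal([0.6, 70, 6], [0.05, 3, 0.5], (40, 3))])
      y = np.array(["quiet"] * 40 + ["substorm"] * 40)

      tree = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
      print(tree.predict([[0.55, 69, 5.5]]))        # query by derived parameter index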

  13. Microwave Technique for Detecting and Locating Concealed Weapons

    DOT National Transportation Integrated Search

    1971-12-01

    The subject of this report is the evaluation of a microwave technique for detecting and locating weapons concealed under clothing. The principal features of this technique are: persons subjected to search are not exposed to 'objectional' microwave ra...

  14. Technological innovations for a sustainable business model in the semiconductor industry

    NASA Astrophysics Data System (ADS)

    Levinson, Harry J.

    2014-09-01

    Increasing costs of wafer processing, particularly for lithographic processes, have made it increasingly difficult to achieve simultaneous reductions in cost-per-function and area per device. Multiple patterning techniques have made possible the fabrication of circuit layouts below the resolution limit of single optical exposures but have led to significant increases in the costs of patterning. Innovative techniques, such as self-aligned double patterning (SADP) have enabled good device performance when using less expensive patterning equipment. Other innovations have directly reduced the cost of manufacturing. A number of technical challenges must be overcome to enable a return to single-exposure patterning using short wavelength optical techniques, such as EUV patterning.

  15. Profiling and Quantifying Differential Gene Transcription Provide Insights into Ganoderic Acid Biosynthesis in Ganoderma lucidum in Response to Methyl Jasmonate

    PubMed Central

    Shi, Liang; Mu, Da-Shuai; Jiang, Ai-Liang; Han, Qin; Zhao, Ming-Wen

    2013-01-01

    Ganoderma lucidum is a mushroom with traditional medicinal properties that has been widely used in China and other countries in Eastern Asia. Ganoderic acids (GA) produced by G. lucidum exhibit important pharmacological activities. Previous studies have demonstrated that methyl jasmonate (MeJA) is a potent inducer of GA biosynthesis and the expression of genes involved in the GA biosynthesis pathway in G. lucidum. To further explore the mechanism of GA biosynthesis, cDNA-Amplified Fragment Length Polymorphism (cDNA-AFLP) was used to identify genes that are differentially expressed in response to MeJA. Using 64 primer combinations, over 3910 transcriptionally derived fragments (TDFs) were obtained. Reliable sequence data were obtained for 390 of 458 selected TDFs. Ninety of these TDFs were annotated with known functions through BLASTX searching of the GenBank database, and 12 annotated TDFs were assigned to secondary metabolic pathways by searching the KEGG PATHWAY database. Twenty-five TDFs were selected for qRT-PCR analysis to confirm the expression patterns observed with cDNA-AFLP. The qRT-PCR results were consistent with the altered patterns of gene expression revealed by the cDNA-AFLP technique. Additionally, the transcript levels of 10 genes were measured at the mycelium, primordia, and fruiting body developmental stages of G. lucidum. The greatest expression levels were reached during the primordia stage for all of the genes except cytochrome b2, which reached its highest expression level in the mycelium stage. This study not only identifies new candidate genes involved in the regulation of GA biosynthesis but also provides further insight into MeJA-induced gene expression and secondary metabolic response in G. lucidum. PMID:23762280

  16. In Search of Search Engine Marketing Strategy Amongst SME's in Ireland

    NASA Astrophysics Data System (ADS)

    Barry, Chris; Charleton, Debbie

    Researchers have identified the Web as a searcher's first port of call for locating information. Search Engine Marketing (SEM) strategies have been noted as a key consideration when developing, maintaining and managing Websites. A study presented here of SEM practices of Irish small to medium enterprises (SMEs) reveals that they plan to spend more resources on SEM in the future. Most firms utilize an informal SEM strategy, where Website optimization is perceived as most effective in attracting traffic. Respondents cite the use of ‘keywords in title and description tags’ as the most used SEM technique, followed by the use of ‘keywords throughout the whole Website’, while ‘Pay for Placement’ was the most widely used Paid Search technique. In concurrence with the literature, measuring SEM performance remains a significant challenge, with many firms unsure if they measure it effectively. An encouraging finding is that Irish SMEs adopt a positive ethical posture when undertaking SEM.

  17. Logic-Based Retrieval: Technology for Content-Oriented and Analytical Querying of Patent Data

    NASA Astrophysics Data System (ADS)

    Klampanos, Iraklis Angelos; Wu, Hengzhi; Roelleke, Thomas; Azzam, Hany

    Patent searching is a complex retrieval task. An initial document search is only the starting point of a chain of searches and decisions that need to be made by patent searchers. Keyword-based retrieval is adequate for document searching, but it is not suitable for modelling comprehensive retrieval strategies. DB-like and logical approaches are the state-of-the-art techniques to model strategies, reasoning and decision making. In this paper we present the application of logical retrieval to patent searching. The two grand challenges are expressiveness and scalability, where high degree of expressiveness usually means a loss in scalability. In this paper we report how to maintain scalability while offering the expressiveness of logical retrieval required for solving patent search tasks. We present logical retrieval background, and how to model data-source selection and results' fusion. Moreover, we demonstrate the modelling of a retrieval strategy, a technique by which patent professionals are able to express, store and exchange their strategies and rationales when searching patents or when making decisions. An overview of the architecture and technical details complement the paper, while the evaluation reports preliminary results on how query processing times can be guaranteed, and how quality is affected by trading off responsiveness.

  18. Guided Text Search Using Adaptive Visual Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steed, Chad A; Symons, Christopher T; Senter, James K

    This research demonstrates the promise of augmenting interactive visualizations with semi-supervised machine learning techniques to improve the discovery of significant associations and insights in the search and analysis of textual information. More specifically, we have developed a system called Gryffin that hosts a unique collection of techniques that facilitate individualized investigative search pertaining to an ever-changing set of analytical questions over an indexed collection of open-source documents related to critical national infrastructure. The Gryffin client hosts dynamic displays of the search results via focus+context record listings, temporal timelines, term-frequency views, and multiple coordinate views. Furthermore, as the analyst interacts with the display, the interactions are recorded and used to label the search records. These labeled records are then used to drive semi-supervised machine learning algorithms that re-rank the unlabeled search records such that potentially relevant records are moved to the top of the record listing. Gryffin is described in the context of the daily tasks encountered at the US Department of Homeland Security's Fusion Center, with whom we are collaborating in its development. The resulting system is capable of addressing the analyst's information overload that can be directly attributed to the deluge of information that must be addressed in the search and investigative analysis of textual information.
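
    One way to mimic the re-ranking idea is sketched below with a label-spreading model: a few records are labeled through user interactions, the rest are scored, and results are re-ordered by predicted relevance. Documents, labels, and the learner are assumptions, not the Gryffin implementation.

      # Sketch: semi-supervised re-ranking of search records from a few interaction labels
      # (1 = relevant, 0 = not, -1 = unlabeled).
      import numpy as np
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.semi_supervised import LabelSpreading

      docs = [
          "pipeline valve failure report", "water treatment plant inspection",
          "pipeline corrosion monitoring", "annual budget summary",
          "valve maintenance schedule", "office relocation memo",
      ]
      labels = np.array([1, -1, -1, 0, -1, -1])     # inferred from clicks or dwell time, say

      X = TfidfVectorizer().fit_transform(docs).toarray()
      model = LabelSpreading(kernel="knn", n_neighbors=3).fit(X, labels)
      scores = model.predict_proba(X)[:, list(model.classes_).index(1)]

      for i in np.argsort(-scores):                 # re-ranked listing, most relevant first
          print(f"{scores[i]:.2f}  {docs[i]}")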

  19. Finding "hard to find" literature on hard to find groups: A novel technique to search grey literature on refugees and asylum seekers.

    PubMed

    Enticott, Joanne; Buck, Kimberly; Shawyer, Frances

    2018-03-01

    There is a lack of information on how to execute effective searches of the grey literature on refugee and asylum seeker groups for inclusion in systematic reviews. High-quality government reports and other grey literature relevant to refugees may not always be identified in conventional literature searches. During the process of conducting a recent systematic review, we developed a novel strategy for systematically searching international refugee and asylum seeker-related grey literature. The approach targets governmental health departments and statistical agencies, who have considerable access to refugee and asylum seeker populations for research purposes but typically do not publish findings in academic forums. Compared to a conventional grey literature search strategy, our novel technique yielded an eightfold increase in relevant high-quality grey sources that provided valuable content in informing our review. Incorporating a search of the grey literature into systematic reviews of refugee and asylum seeker research is essential to providing a more complete view of the evidence. Our novel strategy offers a practical and feasible method of conducting systematic grey literature searches that may be adaptable to a range of research questions, contexts, and resource constraints. Copyright © 2017 John Wiley & Sons, Ltd.

  20. Pattern Search in Multi-structure Data: A Framework for the Next-Generation Evidence-based Medicine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sukumar, Sreenivas R; Ainsworth, Keela C

    With the advent of personalized and evidence-based medicine, the need for a framework to analyze/interpret quantitative measurements (blood work, toxicology, etc.) with qualitative descriptions (specialist reports after reading images, bio-medical knowledge-bases) to predict diagnostic risks is fast emerging. Addressing this need, we pose and address the following questions: (i) How can we jointly analyze both qualitative and quantitative data? (ii) Is the fusion of multi-structure data expected to provide better insights than either of them individually? We present experiments on two bio-medical data sets - mammography and traumatic brain studies - to demonstrate architectures and tools for evidence-pattern search.
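
    A minimal sketch of one fusion approach, assuming hypothetical lab values, report text, and a risk label: numeric and textual columns are transformed separately and fed to a single classifier.

      # Sketch: jointly analyzing quantitative measurements and qualitative report text.
      import pandas as pd
      from sklearn.compose import ColumnTransformer
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import Pipeline
      from sklearn.preprocessing import StandardScaler

      data = pd.DataFrame({
          "glucose":    [5.1, 9.8, 4.9, 11.2],
          "creatinine": [70, 160, 65, 180],
          "report":     ["no acute findings", "diffuse axonal injury suspected",
                         "normal study", "midline shift with contusion"],
          "high_risk":  [0, 1, 0, 1],
      })

      features = ColumnTransformer([
          ("labs", StandardScaler(), ["glucose", "creatinine"]),   # quantitative columns
          ("text", TfidfVectorizer(), "report"),                   # qualitative column
      ])
      model = Pipeline([("features", features), ("clf", LogisticRegression())])
      model.fit(data[["glucose", "creatinine", "report"]], data["high_risk"])
      print(model.predict(pd.DataFrame({"glucose": [10.5], "creatinine": [150],
                                        "report": ["possible contusion noted"]})))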

Top