Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-21
... RAILROAD RETIREMENT BOARD Computer Matching and Privacy Protection Act of 1988; Report of Matching...: Notice of a renewal of an existing computer matching program due to expire on May 24, 2013. SUMMARY: As... of its intent to renew an ongoing computer matching program. In this match, we provide certain...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-05
... RAILROAD RETIREMENT BOARD Computer Matching and Privacy Protection Act of 1988; Report of Matching.... General The Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503), amended the Privacy... of an existing computer matching program due to expire on August 12, 2012. SUMMARY: The Privacy Act...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-24
...: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching... above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988... computer matching involving the Federal government could be performed and adding certain protections for...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0067] Privacy Act of 1974; Computer Matching... Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching program... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-21
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2013-0059] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare & Medicaid Services (CMS))--Match Number 1076 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-14
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2011-0022] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare & Medicaid Services (CMS))--Match Number 1076 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-01
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2011-0089] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Homeland Security (DHS))--Match Number 1010 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching program that...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-21
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2013-0010] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Railroad Retirement Board (RRB))--Match Number 1006 AGENCY: Social Security Administration. ACTION: Notice of a renewal of an existing computer matching program that will expire on...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-03
... 1021 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer.... SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub... computer matching involving the Federal government could be performed and adding certain protections for...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-17
...: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer-matching... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0021] Privacy Act of 1974, as Amended...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-29
... Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching program that... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0002] Privacy Act of 1974, as Amended...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-08
...: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer matching... protections for such persons. The Privacy Act, as amended, regulates the use of computer matching by Federal... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0010] Privacy Act of 1974, as Amended...
Computers Launch Faster, Better Job Matching
ERIC Educational Resources Information Center
Stevenson, Gloria
1976-01-01
Employment Security Automation Project (ESAP), a five-year program sponsored by the Employment and Training Administration, features an innovative computer-assisted job matching system and instantaneous computer-assisted service for unemployment insurance claimants. ESAP will also consolidate existing automated employment security systems to…
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-01
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2009-0043] Privacy Act of 1974, as Amended; Computer Matching Program (Social Security Administration/Railroad Retirement Board (SSA/RRB))-- Match Number 1308 AGENCY: Social Security Administration (SSA). ACTION: Notice of renewal of an existing...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-18
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0055] Privacy Act of 1974, as Amended; Computer Matching Program (Social Security Administration (SSA)/Office of Personnel Management (OPM))--Match Number 1307 AGENCY: Social Security Administration. ACTION: Notice of a renewal of an existing...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-15
... 1021 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of existing computer... above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0073] Privacy Act of 1974, as Amended...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-02-21
... 1310 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer..., as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2013-0007] Privacy Act of 1974, as Amended...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-18
... 1310 AGENCY: Social Security Administration (SSA) ACTION: Notice of a renewal of an existing computer..., as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0035] Privacy Act of 1974, as Amended...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-20
... 1016 AGENCY: Social Security Administration (SSA). ACTION: Notice of a renewal of an existing computer... above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2013-0022] Privacy Act of 1974, as Amended...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-03-31
... program for the purpose of income verifications and computer matching. DATES: Effective Date: The... additional verification to identify inappropriate (excess or insufficient) rental assistance, and perhaps... Act, the Native American Housing Assistance and Self-Determination Act of 1996, and the Quality...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-05
... provides an updated cost/benefit analysis providing an assessment of the benefits attained by HUD through... the scope of the existing computer matching program to now include the updated cost/ benefit analysis... change, and find a continued favorable examination of benefit/cost results; and (2) All parties certify...
Joint histogram-based cost aggregation for stereo matching.
Min, Dongbo; Lu, Jiangbo; Do, Minh N
2013-10-01
This paper presents a novel method for performing efficient cost aggregation in stereo matching. The cost aggregation problem is reformulated from the perspective of a histogram, giving us the potential to reduce the complexity of the cost aggregation in stereo matching significantly. Unlike previous methods, which have tried to reduce the complexity in terms of the size of an image and a matching window, our approach focuses on reducing the computational redundancy that exists across the search range, caused by repeated filtering for all the hypotheses. Moreover, we also reduce the complexity of the window-based filtering through an efficient sampling scheme inside the matching window. The tradeoff between accuracy and complexity is extensively investigated by varying the parameters used in the proposed method. Experimental results show that the proposed method provides high-quality disparity maps with low complexity and outperforms existing local methods. This paper also provides new insights into complexity-constrained stereo-matching algorithm design.
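As a point of reference for the abstract above, here is a minimal NumPy sketch of the conventional window-based aggregation it improves on: an absolute-difference cost volume is box-filtered once per disparity hypothesis, which is exactly the per-hypothesis redundancy the joint-histogram reformulation removes. The image sizes, window size, and disparity range are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import uniform_filter

BIG = 1e6  # large finite cost for columns with no valid match

def aggregate_costs(left, right, max_disp=16, win=9):
    """Baseline local stereo matching: absolute-difference cost volume,
    box-filter aggregation per disparity hypothesis, winner-take-all."""
    h, w = left.shape
    cost = np.full((max_disp, h, w), BIG)
    for d in range(max_disp):
        # pixel-wise matching cost for disparity hypothesis d
        c = np.full((h, w), BIG)
        c[:, d:] = np.abs(left[:, d:] - right[:, :w - d])
        # window filtering repeated for every hypothesis -- exactly the
        # redundancy the joint-histogram reformulation is meant to remove
        cost[d] = uniform_filter(c, size=win, mode='nearest')
    return np.argmin(cost, axis=0)  # winner-take-all disparity map

rng = np.random.default_rng(0)
left = rng.random((60, 80))
right = np.roll(left, -3, axis=1)            # synthetic 3-pixel disparity
print(aggregate_costs(left, right)[30, 40])  # expected: 3
```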
Nonparametric Bayesian Modeling for Automated Database Schema Matching
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferragut, Erik M; Laska, Jason A
2015-01-01
The problem of merging databases arises in many government and commercial applications. Schema matching, a common first step, identifies equivalent fields between databases. We introduce a schema matching framework that builds nonparametric Bayesian models for each field and compares them by computing the probability that a single model could have generated both fields. Our experiments show that our method is more accurate and faster than the existing instance-based matching algorithms in part because of the use of nonparametric Bayesian models.
Azad, Ariful; Buluç, Aydın
2016-05-16
We describe parallel algorithms for computing maximal cardinality matching in a bipartite graph on distributed-memory systems. Unlike traditional algorithms that match one vertex at a time, our algorithms process many unmatched vertices simultaneously using a matrix-algebraic formulation of maximal matching. This generic matrix-algebraic framework is used to develop three efficient maximal matching algorithms with minimal changes. The newly developed algorithms have two benefits over existing graph-based algorithms. First, unlike existing parallel algorithms, cardinality of matching obtained by the new algorithms stays constant with increasing processor counts, which is important for predictable and reproducible performance. Second, relying on bulk-synchronous matrix operations,more » these algorithms expose a higher degree of parallelism on distributed-memory platforms than existing graph-based algorithms. We report high-performance implementations of three maximal matching algorithms using hybrid OpenMP-MPI and evaluate the performance of these algorithm using more than 35 real and randomly generated graphs. On real instances, our algorithms achieve up to 200 × speedup on 2048 cores of a Cray XC30 supercomputer. Even higher speedups are obtained on larger synthetically generated graphs where our algorithms show good scaling on up to 16,384 cores.« less
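For orientation, the sketch below shows the textbook serial greedy algorithm for maximal (not maximum) bipartite matching that the abstract contrasts with; the paper's contribution is a matrix-algebraic formulation that processes many unmatched vertices at once on distributed memory, which this single-threaded sketch does not attempt to reproduce.

```python
def greedy_maximal_matching(edges, n_left, n_right):
    """Serial greedy maximal matching in a bipartite graph.

    `edges` is an iterable of (u, v) pairs with u in the left part and
    v in the right part.  Every edge is examined once and kept if both
    endpoints are still free, so the result is maximal (no further edge
    can be added) but not necessarily maximum-cardinality.
    """
    mate_left = [-1] * n_left
    mate_right = [-1] * n_right
    for u, v in edges:
        if mate_left[u] == -1 and mate_right[v] == -1:
            mate_left[u], mate_right[v] = v, u
    return mate_left

# toy usage: 3 left vertices, 3 right vertices
edges = [(0, 0), (0, 1), (1, 0), (2, 2)]
print(greedy_maximal_matching(edges, 3, 3))  # -> [0, -1, 2]
```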
Fast template matching with polynomials.
Omachi, Shinichiro; Omachi, Masako
2007-08-01
Template matching is widely used for many applications in image and signal processing. This paper proposes a novel template matching algorithm, called algebraic template matching. Given a template and an input image, algebraic template matching efficiently calculates similarities between the template and the partial images of the input image, for various widths and heights. The partial image most similar to the template image is detected from the input image for any location, width, and height. In the proposed algorithm, a polynomial that approximates the template image is used to match the input image instead of the template image. The proposed algorithm is effective especially when the width and height of the template image differ from the partial image to be matched. An algorithm using the Legendre polynomial is proposed for efficient approximation of the template image. This algorithm not only reduces computational costs, but also improves the quality of the approximated image. It is shown theoretically and experimentally that the computational cost of the proposed algorithm is much smaller than the existing methods.
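A minimal 1-D illustration of the polynomial-approximation idea is sketched below using NumPy's Legendre routines: the template is replaced by a low-degree Legendre expansion that can be re-evaluated at any sampling density, which is what makes comparison against partial images of differing width convenient. The toy signal and the chosen degree are assumptions for the sketch, not details from the paper.

```python
import numpy as np
from numpy.polynomial import legendre as L

# Approximate a 1-D template with a low-degree Legendre expansion.
x = np.linspace(-1.0, 1.0, 64)
template = np.exp(-8 * x**2) + 0.1 * np.sin(6 * x)   # toy template profile

deg = 8
coeffs = L.legfit(x, template, deg)      # least-squares Legendre coefficients
approx = L.legval(x, coeffs)             # evaluate the expansion

rmse = np.sqrt(np.mean((template - approx) ** 2))
print(f"degree {deg} Legendre approximation, RMSE = {rmse:.4f}")

# Because the expansion is analytic, it can be re-evaluated on a grid of a
# different size, which is what makes matching against partial images of
# varying width and height convenient:
x_wide = np.linspace(-1.0, 1.0, 100)
resampled = L.legval(x_wide, coeffs)
print(resampled.shape)                   # (100,)
```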
A Simple and Robust Method for Partially Matched Samples Using the P-Values Pooling Approach
Kuan, Pei Fen; Huang, Bo
2013-01-01
This paper focuses on statistical analyses in scenarios where some samples from the matched pairs design are missing, resulting in partially matched samples. Motivated by the idea of meta-analysis, we recast the partially matched samples as coming from two experimental designs, and propose a simple yet robust approach based on the weighted Z-test to integrate the p-values computed from these two designs. We show that the proposed approach achieves better operating characteristics in simulations and a case study, compared to existing methods for partially matched samples. PMID:23417968
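A minimal sketch of the weighted Z-test pooling step is given below, assuming square-root-of-sample-size weights (a common convention; the paper's exact weighting may differ) and one-sided p-values.

```python
import numpy as np
from scipy.stats import norm

def pool_pvalues(p_matched, n_matched, p_unmatched, n_unmatched):
    """Combine one-sided p-values from the matched-pairs analysis and the
    independent two-sample analysis with a weighted Z-test (Stouffer-style).

    Square-root-of-sample-size weights are a common choice; the exact
    weighting used in the paper may differ.
    """
    z1, z2 = norm.isf(p_matched), norm.isf(p_unmatched)
    w1, w2 = np.sqrt(n_matched), np.sqrt(n_unmatched)
    z = (w1 * z1 + w2 * z2) / np.sqrt(w1**2 + w2**2)
    return norm.sf(z)  # pooled one-sided p-value

# toy usage: 30 complete pairs, 20 samples whose partner is missing
print(pool_pvalues(p_matched=0.04, n_matched=30,
                   p_unmatched=0.10, n_unmatched=20))
```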
Myers, E W; Mount, D W
1986-01-01
We describe a program which may be used to find approximate matches to a short predefined DNA sequence in a larger target DNA sequence. The program predicts the usefulness of specific DNA probes and sequencing primers and finds nearly identical sequences that might represent the same regulatory signal. The program is written in the C programming language and will run on virtually any computer system with a C compiler, such as the IBM/PC and other computers running under the MS/DOS and UNIX operating systems. The program has been integrated into an existing software package for the IBM personal computer (see article by Mount and Conrad, this volume). Some examples of its use are given. PMID:3753785
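The original program is written in C; the sketch below is a minimal Python re-implementation of the underlying idea, semi-global dynamic programming that reports every position in the target where the probe matches within a given number of edits. The probe, target, and mismatch budget are arbitrary examples.

```python
def best_approximate_matches(probe, target, max_mismatches):
    """Find positions in `target` where `probe` matches with at most
    `max_mismatches` edits (substitutions, insertions, deletions).

    Classic semi-global dynamic programming: the match may start anywhere
    in the target for free, so column j of the DP table holds the best
    edit distance of the probe against any substring ending at position j.
    """
    m = len(probe)
    prev = [0] * (len(target) + 1)          # row 0: free start anywhere
    for i in range(1, m + 1):
        curr = [i] + [0] * len(target)
        for j in range(1, len(target) + 1):
            cost = 0 if probe[i - 1] == target[j - 1] else 1
            curr[j] = min(prev[j - 1] + cost,   # align probe[i-1] with target[j-1]
                          prev[j] + 1,          # probe character unmatched
                          curr[j - 1] + 1)      # extra target character
        prev = curr
    return [(j, prev[j]) for j in range(1, len(target) + 1)
            if prev[j] <= max_mismatches]       # (end position, edit distance)

print(best_approximate_matches("GATTACA", "CCGATTTACAGG", max_mismatches=1))
```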
Sketch Matching on Topology Product Graph.
Liang, Shuang; Luo, Jun; Liu, Wenyin; Wei, Yichen
2015-08-01
Sketch matching is the fundamental problem in sketch based interfaces. After years of study, it remains challenging when there exist large irregularities and variations in hand-drawn sketch shapes. While most existing works exploit topology relations and graph representations for this problem, they are usually limited by the coarse topology exploration and heuristic (thus suboptimal) similarity metrics between graphs. We present a new sketch matching method with two novel contributions. We introduce a comprehensive definition of topology relations, which results in a rich and informative graph representation of sketches. For graph matching, we propose topology product graph that retains the full correspondence for matching two graphs. Based on it, we derive an intuitive sketch similarity metric whose exact solution is easy to compute. In addition, the graph representation and new metric naturally support partial matching, an important practical problem that received less attention in the literature. Extensive experimental results on a real challenging dataset and the superior performance of our method show that it outperforms the state-of-the-art.
Jung, HaRim; Song, MoonBae; Youn, Hee Yong; Kim, Ung Mo
2015-09-18
A content-matched (CM) range monitoring query over moving objects continually retrieves the moving objects (i) whose non-spatial attribute values are matched to given non-spatial query values; and (ii) that are currently located within a given spatial query range. In this paper, we propose a new query indexing structure, called the group-aware query region tree (GQR-tree) for efficient evaluation of CM range monitoring queries. The primary role of the GQR-tree is to help the server leverage the computational capabilities of moving objects in order to improve the system performance in terms of the wireless communication cost and server workload. Through a series of comprehensive simulations, we verify the superiority of the GQR-tree method over the existing methods.
Estimation of High-Dimensional Graphical Models Using Regularized Score Matching
Lin, Lina; Drton, Mathias; Shojaie, Ali
2017-01-01
Graphical models are widely used to model stochastic dependences among large collections of variables. We introduce a new method of estimating undirected conditional independence graphs based on the score matching loss, introduced by Hyvärinen (2005), and subsequently extended in Hyvärinen (2007). The regularized score matching method we propose applies to settings with continuous observations and allows for computationally efficient treatment of possibly non-Gaussian exponential family models. In the well-explored Gaussian setting, regularized score matching avoids issues of asymmetry that arise when applying the technique of neighborhood selection, and compared to existing methods that directly yield symmetric estimates, the score matching approach has the advantage that the considered loss is quadratic and gives piecewise linear solution paths under ℓ1 regularization. Under suitable irrepresentability conditions, we show that ℓ1-regularized score matching is consistent for graph estimation in sparse high-dimensional settings. Through numerical experiments and an application to RNAseq data, we confirm that regularized score matching achieves state-of-the-art performance in the Gaussian case and provides a valuable tool for computationally efficient estimation in non-Gaussian graphical models. PMID:28638498
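For reference, the score matching loss of Hyvärinen (2005) on which the estimator is built, together with a schematic of its ℓ1-penalized empirical form, can be written as:

```latex
% Score matching loss (Hyvarinen, 2005); \Delta_x denotes \sum_k \partial^2/\partial x_k^2.
J(\theta) \;=\; \mathbb{E}_{x \sim p_{\mathrm{data}}}\!\left[
    \tfrac{1}{2}\,\bigl\|\nabla_x \log p_\theta(x)\bigr\|_2^2
    \;+\; \Delta_x \log p_\theta(x) \right] \;+\; \mathrm{const}

% Regularized score matching (schematic): the empirical loss plus an l1 penalty
% with tuning parameter \lambda.
\hat{\theta} \;=\; \operatorname*{arg\,min}_{\theta}\; \hat{J}_n(\theta) \;+\; \lambda\,\lVert\theta\rVert_1
```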
Deformable registration of CT and cone-beam CT with local intensity matching.
Park, Seyoun; Plishker, William; Quon, Harry; Wong, John; Shekhar, Raj; Lee, Junghoon
2017-02-07
Cone-beam CT (CBCT) is a widely used intra-operative imaging modality in image-guided radiotherapy and surgery. A short scan followed by a filtered-backprojection is typically used for CBCT reconstruction. While data on the mid-plane (plane of source-detector rotation) is complete, off-mid-planes undergo different information deficiency and the computed reconstructions are approximate. This causes different reconstruction artifacts at off-mid-planes depending on slice locations, and therefore impedes accurate registration between CT and CBCT. In this paper, we propose a method to accurately register CT and CBCT by iteratively matching local CT and CBCT intensities. We correct CBCT intensities by matching local intensity histograms slice by slice in conjunction with intensity-based deformable registration. The correction-registration steps are repeated in an alternating way until the result image converges. We integrate the intensity matching into three different deformable registration methods, B-spline, demons, and optical flow that are widely used for CT-CBCT registration. All three registration methods were implemented on a graphics processing unit for efficient parallel computation. We tested the proposed methods on twenty five head and neck cancer cases and compared the performance with state-of-the-art registration methods. Normalized cross correlation (NCC), structural similarity index (SSIM), and target registration error (TRE) were computed to evaluate the registration performance. Our method produced overall NCC of 0.96, SSIM of 0.94, and TRE of 2.26 → 2.27 mm, outperforming existing methods by 9%, 12%, and 27%, respectively. Experimental results also show that our method performs consistently and is more accurate than existing algorithms, and also computationally efficient.
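A minimal NumPy sketch of the slice-wise local intensity matching step is given below: CBCT slice intensities are mapped onto the CT slice's distribution by quantile (CDF) matching. In the paper this correction alternates with B-spline, demons, or optical-flow deformable registration until convergence; the toy data here are assumptions.

```python
import numpy as np

def match_slice_histogram(cbct_slice, ct_slice, n_quantiles=256):
    """Map CBCT slice intensities so their distribution matches the CT slice.

    Simple quantile (CDF) mapping; the paper applies this kind of local
    intensity correction slice by slice, alternating with deformable
    registration until the result converges.
    """
    q = np.linspace(0.0, 1.0, n_quantiles)
    cbct_q = np.quantile(cbct_slice, q)          # source quantiles
    ct_q = np.quantile(ct_slice, q)              # reference quantiles
    corrected = np.interp(cbct_slice.ravel(), cbct_q, ct_q)
    return corrected.reshape(cbct_slice.shape)

# toy usage: CBCT slice with a global intensity offset and scaling error
rng = np.random.default_rng(1)
ct = rng.normal(0.0, 1.0, size=(128, 128))
cbct = 0.7 * ct + 150.0                          # same anatomy, shifted values
fixed = match_slice_histogram(cbct, ct)
print(float(fixed.mean()), float(fixed.std()))   # ~0, ~1 after correction
```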
Deformable registration of CT and cone-beam CT with local intensity matching
NASA Astrophysics Data System (ADS)
Park, Seyoun; Plishker, William; Quon, Harry; Wong, John; Shekhar, Raj; Lee, Junghoon
2017-02-01
Cone-beam CT (CBCT) is a widely used intra-operative imaging modality in image-guided radiotherapy and surgery. A short scan followed by a filtered-backprojection is typically used for CBCT reconstruction. While data on the mid-plane (plane of source-detector rotation) is complete, off-mid-planes undergo different information deficiency and the computed reconstructions are approximate. This causes different reconstruction artifacts at off-mid-planes depending on slice locations, and therefore impedes accurate registration between CT and CBCT. In this paper, we propose a method to accurately register CT and CBCT by iteratively matching local CT and CBCT intensities. We correct CBCT intensities by matching local intensity histograms slice by slice in conjunction with intensity-based deformable registration. The correction-registration steps are repeated in an alternating way until the result image converges. We integrate the intensity matching into three different deformable registration methods, B-spline, demons, and optical flow that are widely used for CT-CBCT registration. All three registration methods were implemented on a graphics processing unit for efficient parallel computation. We tested the proposed methods on twenty five head and neck cancer cases and compared the performance with state-of-the-art registration methods. Normalized cross correlation (NCC), structural similarity index (SSIM), and target registration error (TRE) were computed to evaluate the registration performance. Our method produced overall NCC of 0.96, SSIM of 0.94, and TRE of 2.26 → 2.27 mm, outperforming existing methods by 9%, 12%, and 27%, respectively. Experimental results also show that our method performs consistently and is more accurate than existing algorithms, and also computationally efficient.
Jung, HaRim; Song, MoonBae; Youn, Hee Yong; Kim, Ung Mo
2015-01-01
A content-matched (CM) range monitoring query over moving objects continually retrieves the moving objects (i) whose non-spatial attribute values are matched to given non-spatial query values; and (ii) that are currently located within a given spatial query range. In this paper, we propose a new query indexing structure, called the group-aware query region tree (GQR-tree) for efficient evaluation of CM range monitoring queries. The primary role of the GQR-tree is to help the server leverage the computational capabilities of moving objects in order to improve the system performance in terms of the wireless communication cost and server workload. Through a series of comprehensive simulations, we verify the superiority of the GQR-tree method over the existing methods. PMID:26393613
Social computing for image matching
Rivas, Alberto; Sánchez-Torres, Ramiro; Rodríguez, Sara
2018-01-01
One of the main technological trends in the last five years is mass data analysis. This trend is due in part to the emergence of concepts such as social networks, which generate a large volume of data that can provide added value through their analysis. This article is focused on a business and employment-oriented social network. More specifically, it focuses on the analysis of information provided by different users in image form. The images are analyzed to detect whether other existing users have posted or talked about the same image, even if the image has undergone some type of modification such as watermarks or color filters. This makes it possible to establish new connections among unknown users by detecting what they are posting or whether they are talking about the same images. The proposed solution consists of an image matching algorithm, which is based on the rapid calculation and comparison of hashes. However, there is a computationally expensive component in charge of reverting possible image transformations. As a result, the image matching process is supported by a distributed forecasting system that enables or disables nodes to serve all the possible requests. The proposed system has shown promising results for matching modified images, especially when compared with other existing systems. PMID:29813082
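The article does not specify the hash function, so the sketch below uses a simple average hash with a Hamming-distance comparison as an illustrative stand-in for the "rapid calculation and comparison of hashes"; the file names and the distance threshold are placeholders.

```python
import numpy as np
from PIL import Image

def average_hash(path, hash_size=8):
    """64-bit perceptual 'average hash': shrink, grayscale, threshold at mean.

    Modified copies of an image (mild color filters, recompression, small
    watermarks) tend to keep most bits, so a small Hamming distance between
    hashes suggests the same underlying picture.
    """
    img = Image.open(path).convert("L").resize((hash_size, hash_size))
    pixels = np.asarray(img, dtype=np.float32)
    return (pixels > pixels.mean()).flatten()

def hamming(h1, h2):
    return int(np.count_nonzero(h1 != h2))

# usage (file names and threshold are placeholders):
# h1 = average_hash("post_a.jpg")
# h2 = average_hash("post_b.jpg")
# print("probable match" if hamming(h1, h2) <= 10 else "different images")
```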
Equasions for Curriculum Improvement.
ERIC Educational Resources Information Center
Eckenrod, James S.
1986-01-01
Describes the Technology in Curriculum (TIC) program resource guides which will be distributed to California schools in the fall of 1986. These guides match available instructional television programs and computer software to existing California curriculum guides in order to facilitate teachers' classroom use. (JDH)
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-05
.... Description of the Matching Program A. General The Computer Matching and Privacy Protection Act of 1988 (Pub... 1974: CMS Computer Matching Program Match No. 2013-01; HHS Computer Matching Program Match No. 1312...). ACTION: Notice of Computer Matching Program (CMP). SUMMARY: In accordance with the requirements of the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-02
... of the Matching Program A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L.... 100-503, the Computer Matching and Privacy Protection Act (CMPPA) of 1988), the Office of Management... 1974; CMS Computer Match No. 2010-03, HHS Computer Match No. 1003, SSA Computer Match No. 1048, IRS...
Large-scale computational drug repositioning to find treatments for rare diseases.
Govindaraj, Rajiv Gandhi; Naderi, Misagh; Singha, Manali; Lemoine, Jeffrey; Brylinski, Michal
2018-01-01
Rare, or orphan, diseases are conditions afflicting a small subset of people in a population. Although these disorders collectively pose significant health care problems, drug companies require government incentives to develop drugs for rare diseases due to extremely limited individual markets. Computer-aided drug repositioning, i.e., finding new indications for existing drugs, is a cheaper and faster alternative to traditional drug discovery offering a promising venue for orphan drug research. Structure-based matching of drug-binding pockets is among the most promising computational techniques to inform drug repositioning. In order to find new targets for known drugs ultimately leading to drug repositioning, we recently developed eMatchSite, a new computer program to compare drug-binding sites. In this study, eMatchSite is combined with virtual screening to systematically explore opportunities to reposition known drugs to proteins associated with rare diseases. The effectiveness of this integrated approach is demonstrated for a kinase inhibitor, which is a confirmed candidate for repositioning to synapsin Ia. The resulting dataset comprises 31,142 putative drug-target complexes linked to 980 orphan diseases. The modeling accuracy is evaluated against the structural data recently released for tyrosine-protein kinase HCK. To illustrate how potential therapeutics for rare diseases can be identified, we discuss a possibility to repurpose a steroidal aromatase inhibitor to treat Niemann-Pick disease type C. Overall, the exhaustive exploration of the drug repositioning space exposes new opportunities to combat orphan diseases with existing drugs. DrugBank/Orphanet repositioning data are freely available to research community at https://osf.io/qdjup/.
Window-based method for approximating the Hausdorff in three-dimensional range imagery
Koch, Mark W [Albuquerque, NM]
2009-06-02
One approach to pattern recognition is to use a template from a database of objects and match it to a probe image containing the unknown. Accordingly, the Hausdorff distance can be used to measure the similarity of two sets of points. In particular, the Hausdorff can measure the goodness of a match in the presence of occlusion, clutter, and noise. However, existing 3D algorithms for calculating the Hausdorff are computationally intensive, making them impractical for pattern recognition that requires scanning of large databases. The present invention is directed to a new method that can efficiently, in time and memory, compute the Hausdorff for 3D range imagery. The method uses a window-based approach.
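For contrast with the patented window-based approximation, a brute-force NumPy computation of the exact Hausdorff distance between two 3-D point sets is sketched below; its O(|A||B|) cost is precisely what makes scanning large template databases impractical.

```python
import numpy as np

def hausdorff(a, b):
    """Exact (symmetric) Hausdorff distance between two 3-D point sets.

    Brute-force O(|A||B|) baseline -- the computational cost the
    window-based approximation is designed to avoid on large range images.
    """
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)  # pairwise distances
    directed_ab = d.min(axis=1).max()   # farthest A point from its nearest B point
    directed_ba = d.min(axis=0).max()   # farthest B point from its nearest A point
    return max(directed_ab, directed_ba)

rng = np.random.default_rng(2)
template = rng.random((500, 3))
probe = template + rng.normal(0.0, 0.01, size=template.shape)  # noisy copy
print(hausdorff(template, probe))   # small value -> good match
```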
Lombaert, Herve; Grady, Leo; Polimeni, Jonathan R.; Cheriet, Farida
2013-01-01
Existing methods for surface matching are limited by the trade-off between precision and computational efficiency. Here we present an improved algorithm for dense vertex-to-vertex correspondence that uses direct matching of features defined on a surface and improves it by using spectral correspondence as a regularization. This algorithm has the speed of both feature matching and spectral matching while exhibiting greatly improved precision (distance errors of 1.4%). The method, FOCUSR, incorporates implicitly such additional features to calculate the correspondence and relies on the smoothness of the lowest-frequency harmonics of a graph Laplacian to spatially regularize the features. In its simplest form, FOCUSR is an improved spectral correspondence method that nonrigidly deforms spectral embeddings. We provide here a full realization of spectral correspondence where virtually any feature can be used as additional information using weights on graph edges, but also on graph nodes and as extra embedded coordinates. As an example, the full power of FOCUSR is demonstrated in a real case scenario with the challenging task of brain surface matching across several individuals. Our results show that combining features and regularizing them in a spectral embedding greatly improves the matching precision (to a sub-millimeter level) while performing at much greater speed than existing methods. PMID:23868776
Graph edit distance from spectral seriation.
Robles-Kelly, Antonio; Hancock, Edwin R
2005-03-01
This paper is concerned with computing graph edit distance. One of the criticisms that can be leveled at existing methods for computing graph edit distance is that they lack some of the formality and rigor of the computation of string edit distance. Hence, our aim is to convert graphs to string sequences so that string matching techniques can be used. To do this, we use a graph spectral seriation method to convert the adjacency matrix into a string or sequence order. We show how the serial ordering can be established using the leading eigenvector of the graph adjacency matrix. We pose the problem of graph-matching as a maximum a posteriori probability (MAP) alignment of the seriation sequences for pairs of graphs. This treatment leads to an expression in which the edit cost is the negative logarithm of the a posteriori sequence alignment probability. We compute the edit distance by finding the sequence of string edit operations which minimizes the cost of the path traversing the edit lattice. The edit costs are determined by the components of the leading eigenvectors of the adjacency matrix and by the edge densities of the graphs being matched. We demonstrate the utility of the edit distance on a number of graph clustering problems.
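A heavily simplified sketch of the pipeline is given below: vertices are seriated by the leading eigenvector of the adjacency matrix, each graph becomes a sequence of degree labels, and the sequences are compared with a plain unit-cost Levenshtein distance. The paper's MAP-derived, eigenvector- and edge-density-weighted edit costs are not reproduced here.

```python
import numpy as np

def seriation_string(adj):
    """Order vertices by the leading eigenvector of the (symmetric)
    adjacency matrix and emit the corresponding degree sequence."""
    vals, vecs = np.linalg.eigh(adj)
    lead = vecs[:, np.argmax(vals)]
    if lead.sum() < 0:                     # fix the eigenvector sign ambiguity
        lead = -lead
    order = np.argsort(-lead)              # serial ordering of the vertices
    degrees = adj.sum(axis=1).astype(int)
    return [int(degrees[i]) for i in order]

def edit_distance(s, t):
    """Plain Levenshtein distance (unit costs) between two label sequences."""
    dp = np.arange(len(t) + 1)
    for i, a in enumerate(s, 1):
        prev, dp[0] = dp[0], i
        for j, b in enumerate(t, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,        # deletion
                                     dp[j - 1] + 1,    # insertion
                                     prev + (a != b))  # (mis)match
    return int(dp[-1])

# toy usage: a 4-cycle versus a 4-path
cycle = np.array([[0,1,0,1],[1,0,1,0],[0,1,0,1],[1,0,1,0]], dtype=float)
path  = np.array([[0,1,0,0],[1,0,1,0],[0,1,0,1],[0,0,1,0]], dtype=float)
print(edit_distance(seriation_string(cycle), seriation_string(path)))
```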
Further studies using matched filter theory and stochastic simulation for gust loads prediction
NASA Technical Reports Server (NTRS)
Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd III
1993-01-01
This paper describes two analysis methods -- one deterministic, the other stochastic -- for computing maximized and time-correlated gust loads for aircraft with nonlinear control systems. The first method is based on matched filter theory; the second is based on stochastic simulation. The paper summarizes the methods, discusses the selection of gust intensity for each method and presents numerical results. A strong similarity between the results from the two methods is seen to exist for both linear and nonlinear configurations.
RMP: Reduced-set matching pursuit approach for efficient compressed sensing signal reconstruction.
Abdel-Sayed, Michael M; Khattab, Ahmed; Abu-Elyazeed, Mohamed F
2016-11-01
Compressed sensing enables the acquisition of sparse signals at a rate that is much lower than the Nyquist rate. Compressed sensing initially adopted ℓ1 minimization for signal reconstruction, which is computationally expensive. Several greedy recovery algorithms have been recently proposed for signal reconstruction at a lower computational complexity compared to the optimal ℓ1 minimization, while maintaining a good reconstruction accuracy. In this paper, the Reduced-set Matching Pursuit (RMP) greedy recovery algorithm is proposed for compressed sensing. Unlike existing approaches which either select too many or too few values per iteration, RMP aims at selecting the most sufficient number of correlation values per iteration, which improves both the reconstruction time and error. Furthermore, RMP prunes the estimated signal, and hence, excludes the incorrectly selected values. The RMP algorithm achieves a higher reconstruction accuracy at a significantly lower computational complexity compared to existing greedy recovery algorithms. It is even superior to ℓ1 minimization in terms of the normalized time-error product, a new metric introduced to measure the trade-off between the reconstruction time and error. RMP's superior performance is illustrated with both noiseless and noisy samples.
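The RMP algorithm itself is not reproduced here; as a baseline for comparison, the sketch below implements classic Orthogonal Matching Pursuit, representative of the greedy recovery family the abstract refers to, which selects a single correlation value per iteration and does no pruning.

```python
import numpy as np

def omp(A, y, k, tol=1e-9):
    """Classic Orthogonal Matching Pursuit: pick the single most correlated
    column per iteration, then re-fit the selected support by least squares.
    (RMP instead keeps a reduced set of correlation values per iteration and
    prunes wrongly selected atoms.)"""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        correlations = A.T @ residual
        support.append(int(np.argmax(np.abs(correlations))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coef
    return x

# toy usage: recover a 3-sparse signal from 40 random measurements
rng = np.random.default_rng(3)
n, m, k = 128, 40, 3
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n); x_true[[5, 40, 90]] = [1.0, -2.0, 0.5]
y = A @ x_true
x_hat = omp(A, y, k)
print(np.nonzero(x_hat)[0], np.round(x_hat[np.nonzero(x_hat)], 3))
```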
Citation Matching in Sanskrit Corpora Using Local Alignment
NASA Astrophysics Data System (ADS)
Prasad, Abhinandan S.; Rao, Shrisha
Citation matching is the problem of finding which citation occurs in a given textual corpus. Most existing citation matching work is done on scientific literature. The goal of this paper is to present methods for performing citation matching on Sanskrit texts. Exact matching and approximate matching are the two methods for performing citation matching. The exact matching method checks for exact occurrence of the citation with respect to the textual corpus. Approximate matching is a fuzzy string-matching method which computes a similarity score between an individual line of the textual corpus and the citation. The Smith-Waterman-Gotoh algorithm for local alignment, which is generally used in bioinformatics, is used here for calculating the similarity score. This similarity score is a measure of the closeness between the text and the citation. The exact- and approximate-matching methods are evaluated and compared. The methods presented can be easily applied to corpora in other Indic languages like Kannada, Tamil, etc. The approximate-matching method can in particular be used in the compilation of critical editions and plagiarism detection in a literary work.
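A minimal sketch of the local-alignment scoring step is given below, using basic Smith-Waterman with a linear gap penalty (the Gotoh affine-gap refinement used in the paper is omitted); the transliterated strings are arbitrary examples.

```python
def local_alignment_score(citation, line, match=2, mismatch=-1, gap=-1):
    """Smith-Waterman local alignment score between two strings.

    Linear gap penalty for brevity; the Gotoh variant adds affine gap costs.
    The maximum cell value is the similarity score between the citation and
    one line of the textual corpus.
    """
    rows, cols = len(citation) + 1, len(line) + 1
    prev = [0] * cols
    best = 0
    for i in range(1, rows):
        curr = [0] * cols
        for j in range(1, cols):
            score = match if citation[i - 1] == line[j - 1] else mismatch
            curr[j] = max(0,                    # local alignment: restart
                          prev[j - 1] + score,  # (mis)match
                          prev[j] + gap,        # gap in the corpus line
                          curr[j - 1] + gap)    # gap in the citation
            best = max(best, curr[j])
        prev = curr
    return best

citation = "dharmakshetre kurukshetre"
line = "dharmaksetre kuruksetre samaveta yuyutsavah"
print(local_alignment_score(citation, line))   # high score => likely citation
```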
Heinrich, Andreas; Güttler, Felix; Wendt, Sebastian; Schenkl, Sebastian; Hubig, Michael; Wagner, Rebecca; Mall, Gita; Teichgräber, Ulf
2018-06-18
In forensic odontology the comparison between antemortem and postmortem panoramic radiographs (PRs) is a reliable method for person identification. The purpose of this study was to improve and automate identification of unknown people by comparison between antemortem and postmortem PR using computer vision. The study includes 43 467 PRs from 24 545 patients (46 % females/54 % males). All PRs were filtered and evaluated with Matlab R2014b including the toolboxes image processing and computer vision system. The matching process used the SURF feature to find the corresponding points between two PRs (unknown person and database entry) out of the whole database. From 40 randomly selected persons, 34 persons (85 %) could be reliably identified by corresponding PR matching points between an already existing scan in the database and the most recent PR. The systematic matching yielded a maximum of 259 points for a successful identification between two different PRs of the same person and a maximum of 12 corresponding matching points for other non-identical persons in the database. Hence 12 matching points are the threshold for reliable assignment. Operating with an automatic PR system and computer vision could be a successful and reliable tool for identification purposes. The applied method distinguishes itself by virtue of its fast and reliable identification of persons by PR. This identification method is suitable even if dental characteristics were removed or added in the past. The system seems to be robust for large amounts of data. Key points: computer vision allows an automated antemortem and postmortem comparison of PRs for person identification; the present method is able to find identical matching partners among huge datasets (big data) in a short computing time; and the identification method is suitable even if dental characteristics were removed or added.
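The study's matching was done with SURF features in Matlab; since SURF is patent-encumbered and absent from stock OpenCV builds, the sketch below substitutes ORB features with brute-force Hamming matching to illustrate the same count-the-corresponding-points workflow. The file names, descriptor-distance cutoff, and decision threshold are placeholders, and ORB match counts are not numerically comparable to the 12-point SURF threshold reported above.

```python
import cv2

def count_corresponding_points(path_unknown, path_db):
    """Count feature correspondences between two panoramic radiographs.

    ORB keypoints + brute-force Hamming matching with cross-check; the
    study itself used SURF features, so absolute counts here are not
    comparable to its reported threshold.
    """
    img1 = cv2.imread(path_unknown, cv2.IMREAD_GRAYSCALE)
    img2 = cv2.imread(path_db, cv2.IMREAD_GRAYSCALE)
    orb = cv2.ORB_create(nfeatures=2000)
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    if des1 is None or des2 is None:
        return 0
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    # keep only reasonably close descriptor pairs (cutoff is a placeholder)
    return sum(1 for m in matches if m.distance < 40)

# usage (file names are placeholders; the threshold would need calibration):
# n = count_corresponding_points("postmortem_pr.png", "antemortem_pr.png")
# print("candidate identity" if n >= 50 else "no match")
```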
NASA Technical Reports Server (NTRS)
Hamilton, George S.; Williams, Jermaine C.
1998-01-01
This paper describes the methods, rationale, and comparative results of the conversion of an intravehicular (IVA) 3D human computer model (HCM) to extravehicular (EVA) use and compares the converted model to an existing model on another computer platform. The task of accurately modeling a spacesuited human figure in software is daunting: the suit restricts the human's joint range of motion (ROM) and does not have joints collocated with human joints. The modeling of the variety of materials needed to construct a space suit (e.g., metal bearings, rigid fiberglass torso, flexible cloth limbs and rubber coated gloves) attached to a human figure is currently out of reach of desktop computer hardware and software. Therefore a simplified approach was taken. The HCM's body parts were enlarged and the joint ROM was restricted to match the existing spacesuit model. This basic approach could be used to model other restrictive environments in industry such as chemical or fire protective clothing. In summary, the approach provides a moderate fidelity, usable tool which will run on current notebook computers.
A novel image registration approach via combining local features and geometric invariants
Lu, Yan; Gao, Kun; Zhang, Tinghua; Xu, Tingfa
2018-01-01
Image registration is widely used in many fields, but the adaptability of the existing methods is limited. This work proposes a novel image registration method with high precision for various complex applications. In this framework, the registration problem is divided into two stages. First, we detect and describe scale-invariant feature points using modified computer vision-oriented fast and rotated brief (ORB) algorithm, and a simple method to increase the performance of feature points matching is proposed. Second, we develop a new local constraint of rough selection according to the feature distances. Evidence shows that the existing matching techniques based on image features are insufficient for the images with sparse image details. Then, we propose a novel matching algorithm via geometric constraints, and establish local feature descriptions based on geometric invariances for the selected feature points. Subsequently, a new price function is constructed to evaluate the similarities between points and obtain exact matching pairs. Finally, we employ the progressive sample consensus method to remove wrong matches and calculate the space transform parameters. Experimental results on various complex image datasets verify that the proposed method is more robust and significantly reduces the rate of false matches while retaining more high-quality feature points. PMID:29293595
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-28
...; Computer Matching Program (SSA Internal Match)--Match Number 1014 AGENCY: Social Security Administration... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching....C. 552a, as amended, and the provisions of the Computer Matching and Privacy Protection Act of 1988...
7 CFR 272.12 - Computer matching requirements.
Code of Federal Regulations, 2014 CFR
2014-01-01
... 7 Agriculture 4 2014-01-01 2014-01-01 false Computer matching requirements. 272.12 Section 272.12... Computer matching requirements. (a) General purpose. The Computer Matching and Privacy Protection Act (CMA) of 1988, as amended, addresses the use of information from computer matching programs that involve a...
7 CFR 272.12 - Computer matching requirements.
Code of Federal Regulations, 2013 CFR
2013-01-01
... 7 Agriculture 4 2013-01-01 2013-01-01 false Computer matching requirements. 272.12 Section 272.12... Computer matching requirements. (a) General purpose. The Computer Matching and Privacy Protection Act (CMA) of 1988, as amended, addresses the use of information from computer matching programs that involve a...
Learning Optimized Local Difference Binaries for Scalable Augmented Reality on Mobile Devices.
Xin Yang; Kwang-Ting Cheng
2014-06-01
The efficiency, robustness and distinctiveness of a feature descriptor are critical to the user experience and scalability of a mobile augmented reality (AR) system. However, existing descriptors are either too computationally expensive to achieve real-time performance on a mobile device such as a smartphone or tablet, or not sufficiently robust and distinctive to identify correct matches from a large database. As a result, current mobile AR systems still only have limited capabilities, which greatly restrict their deployment in practice. In this paper, we propose a highly efficient, robust and distinctive binary descriptor, called Learning-based Local Difference Binary (LLDB). LLDB directly computes a binary string for an image patch using simple intensity and gradient difference tests on pairwise grid cells within the patch. To select an optimized set of grid cell pairs, we densely sample grid cells from an image patch and then leverage a modified AdaBoost algorithm to automatically extract a small set of critical ones with the goal of maximizing the Hamming distance between mismatches while minimizing it between matches. Experimental results demonstrate that LLDB is extremely fast to compute and to match against a large database due to its high robustness and distinctiveness. Compared to the state-of-the-art binary descriptors, primarily designed for speed, LLDB has similar efficiency for descriptor construction, while achieving a greater accuracy and faster matching speed when matching over a large database with 2.3M descriptors on mobile devices.
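A simplified sketch of an LDB-style grid-cell descriptor is shown below: mean intensity and gradient values are computed per grid cell, and pairwise difference tests produce the bits. LLDB's AdaBoost-based selection of an optimized subset of cell pairs is deliberately omitted, and the patch, grid size, and noise level are arbitrary.

```python
import numpy as np
from itertools import combinations

def grid_binary_descriptor(patch, grid=4):
    """Simplified LDB-style binary descriptor for a square image patch.

    The patch is divided into grid x grid cells; for every cell we take the
    mean intensity and mean x/y gradients, then emit one bit per cell pair
    and per channel (value_i > value_j).  LLDB additionally learns which
    cell pairs to keep with a boosting step, which is omitted here.
    """
    h, w = patch.shape
    gy, gx = np.gradient(patch.astype(np.float32))
    cells = []
    for ci in range(grid):
        for cj in range(grid):
            sl = (slice(ci * h // grid, (ci + 1) * h // grid),
                  slice(cj * w // grid, (cj + 1) * w // grid))
            cells.append((patch[sl].mean(), gx[sl].mean(), gy[sl].mean()))
    cells = np.asarray(cells)
    bits = []
    for i, j in combinations(range(len(cells)), 2):
        bits.extend(cells[i] > cells[j])          # 3 bits per cell pair
    return np.asarray(bits, dtype=np.uint8)

def hamming(d1, d2):
    return int(np.count_nonzero(d1 != d2))

rng = np.random.default_rng(4)
patch = rng.random((32, 32))
noisy = patch + rng.normal(0, 0.02, patch.shape)   # slightly perturbed view
print(hamming(grid_binary_descriptor(patch), grid_binary_descriptor(noisy)))
```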
32 CFR 806b.50 - Computer matching.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 6 2013-07-01 2013-07-01 false Computer matching. 806b.50 Section 806b.50... PROGRAM Disclosing Records to Third Parties § 806b.50 Computer matching. Computer matching programs... on forms used in applying for benefits. Coordinate computer matching statements on forms with Air...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2013 CFR
2013-07-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2012 CFR
2012-01-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
32 CFR 806b.50 - Computer matching.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 6 2014-07-01 2014-07-01 false Computer matching. 806b.50 Section 806b.50... PROGRAM Disclosing Records to Third Parties § 806b.50 Computer matching. Computer matching programs... on forms used in applying for benefits. Coordinate computer matching statements on forms with Air...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2014 CFR
2014-01-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
32 CFR 806b.50 - Computer matching.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 6 2012-07-01 2012-07-01 false Computer matching. 806b.50 Section 806b.50... PROGRAM Disclosing Records to Third Parties § 806b.50 Computer matching. Computer matching programs... on forms used in applying for benefits. Coordinate computer matching statements on forms with Air...
76 FR 50198 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-12
... DEPARTMENT OF EDUCATION Privacy Act of 1974; Computer Matching Program AGENCY: Office of the Inspector General, U.S. Department of Education. ACTION: Notice of computer matching between the U.S... conduct of computer matching programs, notice is hereby given of the establishment of a computer matching...
32 CFR 806b.50 - Computer matching.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 6 2010-07-01 2010-07-01 false Computer matching. 806b.50 Section 806b.50... PROGRAM Disclosing Records to Third Parties § 806b.50 Computer matching. Computer matching programs... on forms used in applying for benefits. Coordinate computer matching statements on forms with Air...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2010 CFR
2010-07-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
32 CFR 806b.50 - Computer matching.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 6 2011-07-01 2011-07-01 false Computer matching. 806b.50 Section 806b.50... PROGRAM Disclosing Records to Third Parties § 806b.50 Computer matching. Computer matching programs... on forms used in applying for benefits. Coordinate computer matching statements on forms with Air...
41 CFR 105-64.110 - When may GSA establish computer matching programs?
Code of Federal Regulations, 2011 CFR
2011-01-01
... computer matching programs? 105-64.110 Section 105-64.110 Public Contracts and Property Management Federal... GSA establish computer matching programs? (a) System managers will establish computer matching... direction of the GSA Data Integrity Board that will be established when and if computer matching programs...
75 FR 28252 - Notice of a Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-20
... GENERAL SERVICES ADMINISTRATION Notice of a Computer Matching Program AGENCY: General Services... providing notice of a proposed computer match. The purpose of this match is to identify individuals who are... providing notice of a proposed computer match. The purpose of this match is to identify individuals who are...
NASA Astrophysics Data System (ADS)
Yang, Chang-Ying Joseph; Huang, Weidong
2009-02-01
Computed radiography (CR) is considered a drop-in addition or replacement for traditional screen-film (SF) systems in digital mammography. Unlike other technologies, CR has the advantage of being compatible with existing mammography units. One of the challenges, however, is to properly configure the automatic exposure control (AEC) on existing mammography units for CR use. Unlike analogue systems, the capture and display of digital CR images are decoupled. The function of AEC is changed from ensuring proper and consistent optical density of the captured image on film to balancing image quality with the patient dose needed for CR. One of the preferences when acquiring CR images under AEC is to use the same patient dose as SF systems. The challenge is whether the existing AEC design and calibration processes, most of them proprietary to the X-ray system manufacturers and tailored specifically for SF response properties, can be adapted for CR cassettes in order to compensate for their response and attenuation differences. This paper describes the methods for configuring the AEC of three different mammography unit models to match the patient dose used for CR with that used for a KODAK MIN-R 2000 SF System. Based on phantom test results, these methods provide the dose level under AEC for the CR systems to match the dose of the SF systems. These methods can be used in clinical environments that require the acquisition of CR images under AEC at the same dose levels as those used for SF systems.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-15
... RAILROAD RETIREMENT BOARD Computer Matching and Privacy Protection Act of 1988; Report of Matching... Railroad Retirement Act. SUMMARY: As required by the Computer Matching and Privacy Protection Act of [[Page...: Under certain circumstances, the Computer Matching and Privacy Protection Act of 1988, Public Law 100...
Correlation-coefficient-based fast template matching through partial elimination.
Mahmood, Arif; Khan, Sohaib
2012-04-01
Partial computation elimination techniques are often used for fast template matching. At a particular search location, computations are prematurely terminated as soon as it is found that this location cannot compete with an already known best match location. Due to the nonmonotonic growth pattern of the correlation-based similarity measures, partial computation elimination techniques have been traditionally considered inapplicable to speed up these measures. In this paper, we show that partial elimination techniques may be applied to a correlation coefficient by using a monotonic formulation, and we propose basic-mode and extended-mode partial correlation elimination algorithms for fast template matching. The basic-mode algorithm is more efficient on small template sizes, whereas the extended mode is faster on medium and larger templates. We also propose a strategy to decide which algorithm to use for a given data set. To achieve a high speedup, elimination algorithms require an initial guess of the peak correlation value. We propose two initialization schemes including a coarse-to-fine scheme for larger templates and a two-stage technique for small- and medium-sized templates. Our proposed algorithms are exact, i.e., having exhaustive equivalent accuracy, and are compared with the existing fast techniques using real image data sets on a wide variety of template sizes. While the actual speedups are data dependent, in most cases, our proposed algorithms have been found to be significantly faster than the other algorithms.
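The monotonic correlation-coefficient bound itself is not reproduced here; the sketch below illustrates the underlying partial-elimination idea with the simpler SSD measure, whose partial sums grow monotonically, so a candidate location can be abandoned as soon as it can no longer beat the best match found so far.

```python
import numpy as np

def ssd_match_with_elimination(image, template):
    """Exhaustive-equivalent template matching with partial elimination.

    Uses the sum of squared differences (SSD), whose partial sums grow
    monotonically, so a candidate location is abandoned as soon as its
    running cost exceeds the best cost found so far.  The paper achieves
    the same kind of early termination for the correlation coefficient by
    first rewriting it in a monotonic form.
    """
    ih, iw = image.shape
    th, tw = template.shape
    best_cost, best_loc = np.inf, None
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            cost = 0.0
            for r in range(th):                       # accumulate row by row
                diff = image[y + r, x:x + tw] - template[r]
                cost += float(diff @ diff)
                if cost >= best_cost:                 # cannot beat current best
                    break
            else:
                best_cost, best_loc = cost, (y, x)
    return best_loc, best_cost

rng = np.random.default_rng(5)
image = rng.random((80, 80))
template = image[30:46, 20:36].copy()                 # embedded 16x16 template
print(ssd_match_with_elimination(image, template))    # ((30, 20), ~0.0)
```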
Verifying speculative multithreading in an application
Felton, Mitchell D
2014-12-09
Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including insuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.
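The claim describes a straightforward check: produce a serial reference result, re-run the same test instructions across threads, and flag an error when the two disagree. The Python sketch below illustrates that comparison with toy "test instructions"; it does not model the patent's handling of data dependencies among instructions, which here are assumed to be independent.

```python
import threading

def run_serially(test_instructions, data):
    """Execute the test instructions one after another (the reference run)."""
    result = 0
    for instr in test_instructions:
        result += instr(data)
    return result

def run_speculatively(test_instructions, data):
    """Execute the same instructions concurrently and combine their outputs."""
    partials = [None] * len(test_instructions)

    def worker(idx, instr):
        partials[idx] = instr(data)

    threads = [threading.Thread(target=worker, args=(i, instr))
               for i, instr in enumerate(test_instructions)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sum(partials)

# toy "test instructions": independent reductions over a shared data set
data = list(range(1000))
instructions = [lambda d: sum(d), lambda d: sum(x * x for x in d), lambda d: max(d)]

serial_result = run_serially(instructions, data)
speculative_result = run_speculatively(instructions, data)
print("speculative multithreading error exists:", serial_result != speculative_result)
```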
Verifying speculative multithreading in an application
Felton, Mitchell D
2014-11-18
Verifying speculative multithreading in an application executing in a computing system, including: executing one or more test instructions serially thereby producing a serial result, including insuring that all data dependencies among the test instructions are satisfied; executing the test instructions speculatively in a plurality of threads thereby producing a speculative result; and determining whether a speculative multithreading error exists including: comparing the serial result to the speculative result and, if the serial result does not match the speculative result, determining that a speculative multithreading error exists.
Curvelet-domain multiple matching method combined with cubic B-spline function
NASA Astrophysics Data System (ADS)
Wang, Tong; Wang, Deli; Tian, Mi; Hu, Bin; Liu, Chengming
2018-05-01
Since the large amount of surface-related multiples present in marine data would seriously influence the results of data processing and interpretation, many researchers have attempted to develop effective methods to remove them. The most successful surface-related multiple elimination method was proposed based on data-driven theory. However, the elimination effect was unsatisfactory due to the existence of amplitude and phase errors. Although the subsequent curvelet-domain multiple-primary separation method achieved better results, poor computational efficiency prevented its application. In this paper, we adopt the cubic B-spline function to improve the traditional curvelet multiple matching method. First, we select a small number of unknowns as the basis points of the matching coefficient; second, we apply the cubic B-spline function to these basis points to reconstruct the matching array; third, we build the constraint solving equation from the relationships among the predicted multiples, the matching coefficients, and the actual data; finally, we use the BFGS algorithm to iterate and realize a fast sparse-constrained multiple matching algorithm. Moreover, the soft-threshold method is used to make the method perform better. With the cubic B-spline function, the differences between the predicted multiples and the original data diminish, which results in less processing time to obtain optimal solutions and fewer iterative loops in the solving procedure based on the L1 norm constraint. Applications to both synthetic and field data validate the practicability and validity of the method.
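A minimal illustration of the B-spline reconstruction step (the second step in the abstract) is sketched below with SciPy: a handful of basis-point coefficients are expanded into a densely sampled matching array by a cubic B-spline. The basis-point values are arbitrary stand-ins for what the BFGS iteration would actually optimize.

```python
import numpy as np
from scipy.interpolate import make_interp_spline

# A small number of unknowns (matching coefficients at a few basis points)
# are expanded with a cubic B-spline into a smooth, densely sampled
# matching array.  The values below are arbitrary placeholders.
basis_t = np.linspace(0.0, 1.0, 8)                 # 8 basis points in time
basis_coeff = np.array([1.0, 1.1, 0.9, 1.05, 1.2, 0.95, 1.0, 1.1])

spline = make_interp_spline(basis_t, basis_coeff, k=3)   # cubic B-spline
dense_t = np.linspace(0.0, 1.0, 1000)              # one value per time sample
matching_array = spline(dense_t)

print(matching_array.shape)                        # (1000,)
# The dense matching array would then scale the predicted multiples before
# they are subtracted from the recorded data inside the L1-constrained inversion.
```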
Existence conditions for unknown input functional observers
NASA Astrophysics Data System (ADS)
Fernando, T.; MacDougall, S.; Sreeram, V.; Trinh, H.
2013-01-01
This article presents necessary and sufficient conditions for the existence and design of an unknown input Functional observer. The existence of the observer can be verified by computing a nullspace of a known matrix and testing some matrix rank conditions. The existence of the observer does not require the satisfaction of the observer matching condition (i.e. Equation (16) in Hou and Muller 1992, 'Design of Observers for Linear Systems with Unknown Inputs', IEEE Transactions on Automatic Control, 37, 871-875), is not limited to estimating scalar functionals and allows for arbitrary pole placement. The proposed observer always exists when a state observer exists for the unknown input system, and furthermore, the proposed observer can exist even in some instances when an unknown input state observer does not exist.
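A generic sketch of the kind of existence test described above is given below: compute the null space of a known matrix and compare ranks of stacked matrices. The matrices here are random placeholders; the paper derives the specific matrices from the system data and the functional to be estimated.

```python
import numpy as np
from scipy.linalg import null_space

# Placeholder matrices standing in for the "known matrix" built from the
# system data (A, B, C, D) and the functional to be estimated.
rng = np.random.default_rng(6)
M1 = rng.normal(size=(6, 4))
M2 = rng.normal(size=(3, 4))

N = null_space(M1)                       # basis of the null space of M1
stacked = np.vstack([M1, M2])

# A typical rank test: the rows of M2 must lie in the row space of M1.
rank_condition = (np.linalg.matrix_rank(stacked) == np.linalg.matrix_rank(M1))
print("null space dimension:", N.shape[1])
print("rank condition satisfied:", bool(rank_condition))
```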
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 2 2013-07-01 2013-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 2 2014-07-01 2014-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 2 2012-07-01 2012-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 2 2010-07-01 2010-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
32 CFR 310.53 - Computer matching agreements (CMAs).
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 2 2011-07-01 2011-07-01 false Computer matching agreements (CMAs). 310.53... (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.53 Computer.... (3) Justification and expected results. Explain why computer matching as opposed to some other...
Choi, Koeun; Kirkorian, Heather L; Pempek, Tiffany A
2017-04-17
Researchers tested the impact of contextual mismatch, proactive interference, and working memory (WM) on toddlers' transfer across contexts. Forty-two toddlers (27-34 months) completed four object-retrieval trials, requiring memory updating on Trials 2-4. Participants watched hiding events on a tablet computer. Search performance was tested using another tablet (match) or a felt board (mismatch). WM was assessed. On earlier search trials, WM predicted transfer in both conditions, and toddlers in the match condition outperformed those in the mismatch condition; however, the benefit of contextual match and WM decreased over trials. Contextual match apparently increased proactive interference on later trials. Findings are interpreted within existing accounts of the transfer deficit, and a combined account is proposed. © 2017 The Authors. Child Development © 2017 Society for Research in Child Development, Inc.
Perceptual expertise in forensic facial image comparison
White, David; Phillips, P. Jonathon; Hahn, Carina A.; Hill, Matthew; O'Toole, Alice J.
2015-01-01
Forensic facial identification examiners are required to match the identity of faces in images that vary substantially, owing to changes in viewing conditions and in a person's appearance. These identifications affect the course and outcome of criminal investigations and convictions. Despite calls for research on sources of human error in forensic examination, existing scientific knowledge of face matching accuracy is based, almost exclusively, on people without formal training. Here, we administered three challenging face matching tests to a group of forensic examiners with many years' experience of comparing face images for law enforcement and government agencies. Examiners outperformed untrained participants and computer algorithms, thereby providing the first evidence that these examiners are experts at this task. Notably, computationally fusing responses of multiple experts produced near-perfect performance. Results also revealed qualitative differences between expert and non-expert performance. First, examiners' superiority was greatest at longer exposure durations, suggestive of more entailed comparison in forensic examiners. Second, experts were less impaired by image inversion than non-expert students, contrasting with face memory studies that show larger face inversion effects in high performers. We conclude that expertise in matching identity across unfamiliar face images is supported by processes that differ qualitatively from those supporting memory for individual faces. PMID:26336174
40 CFR 51.361 - Motorist compliance enforcement.
Code of Federal Regulations, 2012 CFR
2012-07-01
... programs and computer-matching programs. States that did not adopt an I/M program for any area of the State... approved. An enhanced I/M area may use an existing alternative if it demonstrates that the alternative has... alternative” only in States that, for some area in the State, had an I/M program with that mechanism in...
40 CFR 51.361 - Motorist compliance enforcement.
Code of Federal Regulations, 2013 CFR
2013-07-01
... programs and computer-matching programs. States that did not adopt an I/M program for any area of the State... approved. An enhanced I/M area may use an existing alternative if it demonstrates that the alternative has... alternative” only in States that, for some area in the State, had an I/M program with that mechanism in...
40 CFR 51.361 - Motorist compliance enforcement.
Code of Federal Regulations, 2014 CFR
2014-07-01
... programs and computer-matching programs. States that did not adopt an I/M program for any area of the State... approved. An enhanced I/M area may use an existing alternative if it demonstrates that the alternative has... alternative” only in States that, for some area in the State, had an I/M program with that mechanism in...
NASA Astrophysics Data System (ADS)
Thomsen, M.; Ghaisas, S. V.; Madhukar, A.
1987-07-01
A previously developed computer simulation of molecular beam epitaxial growth of III-V semiconductors based on the configuration dependent reactive incorporation (CDRI) model is extended to allow for two different cation species. Attention is focused on examining the nature of interfaces formed in lattice-matched quantum well structures of the form AC/BC/AC(100). We consider cation species with substantially different effective diffusion lengths, as is the case with Al and Ga during the growth of their respective As compounds. The degree of intermixing occurring at the interface is seen to depend upon, among other growth parameters, the pressure of the group V species during growth. Examination of an intraplanar order parameter at the interfaces reveals the existence of short-range clustering of the cation species.
76 FR 14669 - Privacy Act of 1974; CMS Computer Match No. 2011-02; HHS Computer Match No. 1007
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-17
... (CMS); and Department of Defense (DoD), Manpower Data Center (DMDC), Defense Enrollment and Eligibility... the results of the computer match and provide the information to TMA for use in its matching program... under TRICARE. DEERS will receive the results of the computer match and provide the information provided...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-21
... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching... savings securities. C. Authority for Conducting the Matching Program This computer matching agreement sets... amended by the Computer Matching and Privacy Protection Act of 1988, as amended, and the regulations and...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-05-31
... Matching and Privacy Protection Act of 1988 (Pub. L. 100-503) and the Computer Matching and Privacy... DEPARTMENT OF EDUCATION Privacy Act of 1974, as Amended; Renewal of Computer Matching Program.... ACTION: Notice. SUMMARY: This document provides notice of the renewal of the computer matching program...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-17
...; Computer Matching Program (SSA/ Law Enforcement Agencies (LEA)) Match Number 5001 AGENCY: Social Security... protections for such persons. The Privacy Act, as amended, regulates the use of computer matching by Federal... accordance with the Privacy Act of 1974, as amended by the Computer Matching and Privacy Protection Act of...
77 FR 38610 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-28
... DEPARTMENT OF EDUCATION Privacy Act of 1974; Computer Matching Program AGENCY: Department of Education. ACTION: Notice--Computer matching agreement between the Department of Education and the Department of Defense. SUMMARY: This document provides notice of the continuation of the computer matching...
NASA Technical Reports Server (NTRS)
Thompson, R. A.; Sutton, Kenneth
1987-01-01
A computational analysis, modification, and preliminary redesign study was performed on the nozzle contour of the Langley Hypersonic CF4 Tunnel. This study showed that the existing nozzle was contoured incorrectly for the design operating condition, and this error was shown to produce the measured disturbances in the exit flow field. A modified contour was designed for the current nozzle downstream of the maximum turning point that would provide a uniform exit flow. New nozzle contours were also designed for an exit Mach number and Reynolds number combination which matches that attainable in the Langley 20-Inch Mach 6 Tunnel. Two nozzle contours were designed: one having the same exit radius but a larger mass flow rate than that of the existing CF4 Tunnel, and the other having the same mass flow rate but a smaller exit radius than that of the existing CF4 Tunnel.
Doi, Takahiro; Fujita, Ichiro
2014-01-01
Three-dimensional visual perception requires correct matching of images projected to the left and right eyes. The matching process is faced with an ambiguity: part of one eye's image can be matched to multiple parts of the other eye's image. This stereo correspondence problem is complicated for random-dot stereograms (RDSs), because dots with an identical appearance produce numerous potential matches. Despite such complexity, human subjects can perceive a coherent depth structure. A coherent solution to the correspondence problem does not exist for anticorrelated RDSs (aRDSs), in which luminance contrast is reversed in one eye. Neurons in the visual cortex reduce disparity selectivity for aRDSs progressively along the visual processing hierarchy. A disparity-energy model followed by threshold nonlinearity (threshold energy model) can account for this reduction, providing a possible mechanism for the neural matching process. However, the essential computation underlying the threshold energy model is not clear. Here, we propose that a nonlinear modification of cross-correlation, which we term “cross-matching,” represents the essence of the threshold energy model. We placed half-wave rectification within the cross-correlation of the left-eye and right-eye images. The disparity tuning derived from cross-matching was attenuated for aRDSs. We simulated a psychometric curve as a function of graded anticorrelation (graded mixture of aRDS and normal RDS); this simulated curve reproduced the match-based psychometric function observed in human near/far discrimination. The dot density was 25% for both simulation and observation. We predicted that as the dot density increased, the performance for aRDSs should decrease below chance (i.e., reversed depth), and the level of anticorrelation that nullifies depth perception should also decrease. We suggest that cross-matching serves as a simple computation underlying the match-based disparity signals in stereoscopic depth perception. PMID:25360107
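A minimal sketch of the "cross-matching" idea follows: half-wave rectification is placed inside the cross-correlation of the left- and right-eye images. The ON/OFF channel split below is one plausible reading of the description, not necessarily the authors' exact model, and the random-dot stimulus is synthetic.

```python
# Hedged sketch of cross-matching: half-wave rectified correlation of the two
# eyes' images, computed over candidate disparities.
import numpy as np

def halfwave(x):
    return np.maximum(x, 0.0)

def cross_matching(left, right, max_disp):
    """Disparity tuning from rectified (ON/OFF) correlation of two 1-D images."""
    disparities = np.arange(-max_disp, max_disp + 1)
    tuning = []
    for d in disparities:
        shifted = np.roll(right, d)
        on = np.sum(halfwave(left) * halfwave(shifted))      # same-sign (bright) matches
        off = np.sum(halfwave(-left) * halfwave(-shifted))    # same-sign (dark) matches
        tuning.append(on + off)
    return disparities, np.array(tuning)

# Anticorrelation (contrast reversal in one eye) attenuates the tuning peak.
rng = np.random.default_rng(1)
left = rng.choice([-1.0, 1.0], size=512)
right = np.roll(left, 5)                      # correlated RDS shifted by 5 samples
_, tuning_corr = cross_matching(left, right, 16)
_, tuning_anti = cross_matching(left, -right, 16)
```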
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 2 2012-07-01 2012-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 2 2014-07-01 2014-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 2 2013-07-01 2013-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 5 2012-07-01 2012-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 5 2014-07-01 2014-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 5 2013-07-01 2013-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 5 2010-07-01 2010-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 701.125 - Computer matching program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 5 2011-07-01 2011-07-01 false Computer matching program. 701.125 Section 701... OF THE NAVY DOCUMENTS AFFECTING THE PUBLIC DON Privacy Program § 701.125 Computer matching program. The DPO has responsibility for coordinating the approval of DOD's participation in Computer Matching...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 2 2010-07-01 2010-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
32 CFR 310.52 - Computer matching publication and review requirements.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 2 2011-07-01 2011-07-01 false Computer matching publication and review... OF DEFENSE (CONTINUED) PRIVACY PROGRAM DOD PRIVACY PROGRAM Computer Matching Program Procedures § 310.52 Computer matching publication and review requirements. (a) DoD Components shall identify the...
Integration of prior knowledge into dense image matching for video surveillance
NASA Astrophysics Data System (ADS)
Menze, M.; Heipke, C.
2014-08-01
Three-dimensional information from dense image matching is a valuable input for a broad range of vision applications. While reliable approaches exist for dedicated stereo setups, they do not easily generalize to more challenging camera configurations. In the context of video surveillance, the typically large spatial extent of the region of interest and repetitive structures in the scene make dense image matching a challenging task. In this paper we present an approach that derives strong prior knowledge from a planar approximation of the scene. This information is integrated into a graph-cut based image matching framework that treats the assignment of optimal disparity values as a labelling task. Introducing the planar prior substantially reduces ambiguities and the search space, and increases computational efficiency. The results provide a proof of concept of the proposed approach: it allows the reconstruction of dense point clouds in more general surveillance camera setups with wider stereo baselines.
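As a rough illustration of how a planar prior can bias a matching cost volume before the labelling step, here is a hedged sketch. The plane model, the linear deviation penalty, and the winner-takes-all stand-in for the graph-cut optimization are all assumptions for illustration, not the paper's formulation.

```python
# Hedged sketch: derive a prior disparity map from a planar scene approximation
# d(x, y) = a*x + b*y + c and bias a matching cost volume toward it.
import numpy as np

def planar_prior(width, height, a, b, c):
    xs, ys = np.meshgrid(np.arange(width), np.arange(height))
    return a * xs + b * ys + c

def bias_cost_volume(cost, prior, weight=0.1):
    """cost: (H, W, D) matching costs; penalize labels far from the planar prior."""
    disparities = np.arange(cost.shape[2])
    deviation = np.abs(disparities[None, None, :] - prior[:, :, None])
    return cost + weight * deviation

H, W, D = 48, 64, 32
cost = np.random.rand(H, W, D)                           # placeholder matching costs
prior = np.clip(planar_prior(W, H, a=0.2, b=0.0, c=4.0), 0, D - 1)
biased = bias_cost_volume(cost, prior)
labels = biased.argmin(axis=2)   # WTA stand-in for the graph-cut labelling
```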
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-06
...: Computer Matching Program AGENCY: Treasury Inspector General for Tax Administration, Treasury. ACTION... Internal Revenue Service (IRS) concerning the conduct of TIGTA's computer matching program. DATES... INFORMATION: TIGTA's computer matching program assists in the detection and deterrence of fraud, waste, and...
76 FR 11435 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-02
... Security Administration. SUMMARY: Pursuant to the Computer Matching and Privacy Protection Act of 1988, Public Law 100-503, the Computer Matching and Privacy Protections Amendments of 1990, Pub. L. 101-508... Interpreting the Provisions of Public Law 100-503, the Computer Matching and Privacy Protection Act of 1988...
78 FR 50146 - Privacy Act of 1974: Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-16
... DEPARTMENT OF VETERANS AFFAIRS Privacy Act of 1974: Computer Matching Program AGENCY: Department of Veterans Affairs. ACTION: Notice of Computer Match Program. SUMMARY: Pursuant to 5 U.S.C. 552a... to conduct a computer matching program with the Internal Revenue Service (IRS). Data from the...
76 FR 47299 - Privacy Act of 1974: Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-04
... DEPARTMENT OF VETERANS AFFAIRS Privacy Act of 1974: Computer Matching Program AGENCY: Department of Veterans Affairs. ACTION: Notice of Computer Match Program. SUMMARY: Pursuant to 5 U.S.C. 552a... to conduct a computer matching program with the Internal Revenue Service (IRS). Data from the...
CP-ABE Based Privacy-Preserving User Profile Matching in Mobile Social Networks
Cui, Weirong; Du, Chenglie; Chen, Jinchao
2016-01-01
Privacy-preserving profile matching, a challenging task in mobile social networks, has received increasing attention in recent years. In this paper, we propose a novel scheme based on ciphertext-policy attribute-based encryption to tackle this problem. In our scheme, a user can submit a preference profile and search for users with matching profiles in decentralized mobile social networks. In this process, neither any participant's profile nor the submitted preference profile is exposed. Meanwhile, a secure communication channel can be established between each pair of successfully matched users. In contrast to existing related schemes, which are mainly based on secure multi-party computation, our scheme provides verifiability (neither the initiator nor any unmatched user can cheat the other into believing a match occurred) and requires few interactions among users. We provide thorough security analysis and performance evaluation of our scheme, and show its advantages in terms of security, efficiency and usability over state-of-the-art schemes. PMID:27337001
CP-ABE Based Privacy-Preserving User Profile Matching in Mobile Social Networks.
Cui, Weirong; Du, Chenglie; Chen, Jinchao
2016-01-01
Privacy-preserving profile matching, a challenging task in mobile social networks, has received increasing attention in recent years. In this paper, we propose a novel scheme based on ciphertext-policy attribute-based encryption to tackle this problem. In our scheme, a user can submit a preference profile and search for users with matching profiles in decentralized mobile social networks. In this process, neither any participant's profile nor the submitted preference profile is exposed. Meanwhile, a secure communication channel can be established between each pair of successfully matched users. In contrast to existing related schemes, which are mainly based on secure multi-party computation, our scheme provides verifiability (neither the initiator nor any unmatched user can cheat the other into believing a match occurred) and requires few interactions among users. We provide thorough security analysis and performance evaluation of our scheme, and show its advantages in terms of security, efficiency and usability over state-of-the-art schemes.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-05
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0052] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Labor (DOL))--Match Number 1003 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection...
Integrating image quality in 2nu-SVM biometric match score fusion.
Vatsa, Mayank; Singh, Richa; Noore, Afzel
2007-10-01
This paper proposes an intelligent 2nu-support vector machine based match score fusion algorithm to improve the performance of face and iris recognition by integrating the quality of images. The proposed algorithm applies redundant discrete wavelet transform to evaluate the underlying linear and non-linear features present in the image. A composite quality score is computed to determine the extent of smoothness, sharpness, noise, and other pertinent features present in each subband of the image. The match score and the corresponding quality score of an image are fused using 2nu-support vector machine to improve the verification performance. The proposed algorithm is experimentally validated using the FERET face database and the CASIA iris database. The verification performance and statistical evaluation show that the proposed algorithm outperforms existing fusion algorithms.
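To illustrate the fusion step only, here is a hedged sketch in which a standard nu-SVM (scikit-learn's NuSVC) stands in for the paper's 2nu-SVM, and the match and quality scores are synthetic. Each sample concatenates face and iris match scores with their quality scores and is classified as genuine or impostor.

```python
# Hedged sketch of quality-augmented match-score fusion. NuSVC is a stand-in
# for the 2nu-SVM described above; all scores below are synthetic.
import numpy as np
from sklearn.svm import NuSVC

rng = np.random.default_rng(0)
n = 400
genuine = np.column_stack([
    rng.normal(0.8, 0.10, n), rng.normal(0.70, 0.10, n),   # face, iris match scores
    rng.normal(0.9, 0.05, n), rng.normal(0.90, 0.05, n),   # face, iris quality scores
])
impostor = np.column_stack([
    rng.normal(0.4, 0.10, n), rng.normal(0.35, 0.10, n),
    rng.normal(0.9, 0.05, n), rng.normal(0.90, 0.05, n),
])
X = np.vstack([genuine, impostor])
y = np.concatenate([np.ones(n), np.zeros(n)])               # 1 = genuine, 0 = impostor

clf = NuSVC(nu=0.1, kernel="rbf", gamma="scale").fit(X, y)
decisions = clf.predict(X[:5])
```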
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-21
... DEPARTMENT OF EDUCATION Privacy Act of 1974; Computer Matching Program Between the Department of... document provides notice of the continuation of a computer matching program between the Department of... 5301, the Department of Justice and the Department of Education implemented a computer matching program...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
... notice of its renewal of an ongoing computer-matching program with the Social Security Administration... computer-matching program with the Committee on Homeland Security and Governmental Affairs of the Senate... RAILROAD RETIREMENT BOARD Privacy Act of 1974, as amended; Notice of Computer Matching Program...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-10
... notice of its renewal of an ongoing computer-matching program with the Social Security Administration... computer-matching program with the Committee on Homeland Security and Governmental Affairs of the Senate... RAILROAD RETIREMENT BOARD Privacy Act of 1974, as Amended; Notice of Computer Matching Program...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
...; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1038 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection... containing SSNs extracted from the Supplemental Security Record database. Exchanges for this computer...
77 FR 74020 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-12
... 29, 2012, sent a report of a Computer Matching Program to the Committee on Homeland Security and... Support Enforcement; Privacy Act of 1974; Computer Matching Agreement AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a Computer Matching Program. SUMMARY: In accordance with the...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-30
... report of this computer-matching program with the Committee on Homeland Security and Governmental Affairs... INFORMATION: A. General The Computer-Matching and Privacy Protection Act of 1988, (Pub. L. 100-503), amended... RAILROAD RETIREMENT BOARD Privacy Act of 1974, as Amended; Notice of Computer-Matching Program...
13 CFR 102.40 - Computer matching.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Protection of Privacy and Access to Individual Records Under the Privacy Act of 1974 § 102.40 Computer...) Matching agreements. SBA will comply with the Computer Matching and Privacy Protection Act of 1988 (5 U.S.C... 13 Business Credit and Assistance 1 2013-01-01 2013-01-01 false Computer matching. 102.40 Section...
76 FR 77015 - Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-09
... 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection Act of 1988 (Pub. L... DEPARTMENT OF JUSTICE [AAG/A Order No. 001/2011] Privacy Act of 1974; Computer Matching Agreement AGENCY: Department of Justice. ACTION: Notice--computer matching between the Department of Justice and...
13 CFR 102.40 - Computer matching.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Protection of Privacy and Access to Individual Records Under the Privacy Act of 1974 § 102.40 Computer...) Matching agreements. SBA will comply with the Computer Matching and Privacy Protection Act of 1988 (5 U.S.C... 13 Business Credit and Assistance 1 2014-01-01 2014-01-01 false Computer matching. 102.40 Section...
13 CFR 102.40 - Computer matching.
Code of Federal Regulations, 2010 CFR
2010-01-01
... Protection of Privacy and Access to Individual Records Under the Privacy Act of 1974 § 102.40 Computer...) Matching agreements. SBA will comply with the Computer Matching and Privacy Protection Act of 1988 (5 U.S.C... 13 Business Credit and Assistance 1 2010-01-01 2010-01-01 false Computer matching. 102.40 Section...
13 CFR 102.40 - Computer matching.
Code of Federal Regulations, 2011 CFR
2011-01-01
... Protection of Privacy and Access to Individual Records Under the Privacy Act of 1974 § 102.40 Computer...) Matching agreements. SBA will comply with the Computer Matching and Privacy Protection Act of 1988 (5 U.S.C... 13 Business Credit and Assistance 1 2011-01-01 2011-01-01 false Computer matching. 102.40 Section...
13 CFR 102.40 - Computer matching.
Code of Federal Regulations, 2012 CFR
2012-01-01
... Protection of Privacy and Access to Individual Records Under the Privacy Act of 1974 § 102.40 Computer...) Matching agreements. SBA will comply with the Computer Matching and Privacy Protection Act of 1988 (5 U.S.C... 13 Business Credit and Assistance 1 2012-01-01 2012-01-01 false Computer matching. 102.40 Section...
75 FR 8311 - Privacy Act of 1974; Notice of a Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-24
...; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, DoD. ACTION: Notice of a... hereby giving notice to the record subjects of a computer matching program between the Department of... conduct a computer matching program between the agencies. The purpose of this agreement is to verify an...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-25
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2011-0084] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Labor (DOL))--Match Number 1003 AGENCY: Social Security... above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-02-08
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2011-0102] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ the States); Match 6000 and 6003 AGENCY: Social Security Administration..., as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-25
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2011-0083] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Department of Labor (DOL))--Match Number 1015 AGENCY: Social Security... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching...
Perceptual expertise in forensic facial image comparison.
White, David; Phillips, P Jonathon; Hahn, Carina A; Hill, Matthew; O'Toole, Alice J
2015-09-07
Forensic facial identification examiners are required to match the identity of faces in images that vary substantially, owing to changes in viewing conditions and in a person's appearance. These identifications affect the course and outcome of criminal investigations and convictions. Despite calls for research on sources of human error in forensic examination, existing scientific knowledge of face matching accuracy is based, almost exclusively, on people without formal training. Here, we administered three challenging face matching tests to a group of forensic examiners with many years' experience of comparing face images for law enforcement and government agencies. Examiners outperformed untrained participants and computer algorithms, thereby providing the first evidence that these examiners are experts at this task. Notably, computationally fusing responses of multiple experts produced near-perfect performance. Results also revealed qualitative differences between expert and non-expert performance. First, examiners' superiority was greatest at longer exposure durations, suggestive of more entailed comparison in forensic examiners. Second, experts were less impaired by image inversion than non-expert students, contrasting with face memory studies that show larger face inversion effects in high performers. We conclude that expertise in matching identity across unfamiliar face images is supported by processes that differ qualitatively from those supporting memory for individual faces. © 2015 The Author(s).
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-09
... of a Computer Matching Program Between HUD and the United States Department of Veterans Affairs (VA) AGENCY: Office of the Chief Information Officer, HUD. ACTION: Notice of a computer matching program... the Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503), and the Office of...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-29
... DEPARTMENT OF EDUCATION Privacy Act of 1974; Computer Matching Program between the U.S. Department.... ACTION: Notice. SUMMARY: Notice is hereby given of the renewal of the computer matching program between... (VA) (source agency). After the ED and VA Data Integrity Boards approve a new computer matching...
77 FR 2299 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-17
... Support Enforcement; Privacy Act of 1974; Computer Matching Agreement AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a Computer Matching Program. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 522a), as amended, OCSE is publishing notice of a computer matching program...
77 FR 74019 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-12
... Support Enforcement; Privacy Act of 1974; Computer Matching Agreement AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a Computer Matching Program. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 522a), as amended, OCSE is publishing notice of a computer matching program...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-11-27
... will file a report of this computer-matching program with the Committee on Homeland Security and... . SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988, (Pub. L. 100-503... RAILROAD RETIREMENT BOARD Privacy Act of 1974, as Amended; Notice of Computer Matching Program...
75 FR 29774 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-27
... Support Enforcement; Privacy Act of 1974; Computer Matching Agreement AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a computer matching program. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 522a), as amended, OCSE is publishing notice of a computer matching program...
75 FR 31457 - Office of Child Support Enforcement; Privacy Act of 1974; Computer Matching Agreement
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-03
... Support Enforcement; Privacy Act of 1974; Computer Matching Agreement AGENCY: Office of Child Support Enforcement (OCSE), ACF, HHS. ACTION: Notice of a Computer Matching Program. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 522a), as amended, OCSE is publishing notice of a computer matching program...
Inoue, Kentaro; Shimozono, Shinichi; Yoshida, Hideaki; Kurata, Hiroyuki
2012-01-01
Background For visualizing large-scale biochemical network maps, it is important to calculate the coordinates of molecular nodes quickly and to enhance their understandability and traceability. The grid layout is effective in drawing compact, orderly, balanced network maps with node label spaces, but existing grid layout algorithms often require a high computational cost because they have to consider complicated positional constraints throughout the entire optimization process. Results We propose a hybrid grid layout algorithm that consists of a non-grid, fast layout (preprocessor) algorithm and an approximate pattern matching algorithm that distributes the preprocessed nodes on square grid points. To demonstrate the feasibility of the hybrid layout algorithm, it is characterized in terms of calculation time, numbers of edge-edge and node-edge crossings, relative edge lengths, and F-measures. The proposed algorithm achieves outstanding performance compared with other existing grid layouts. Conclusions Use of an approximate pattern matching algorithm quickly redistributes the nodes laid out by the fast, non-grid algorithm onto square grid points while preserving the topological relationships among the nodes. The proposed algorithm is a novel use of pattern matching, thereby providing a breakthrough for grid layout. This application program can be freely downloaded from http://www.cadlive.jp/hybridlayout/hybridlayout.html. PMID:22679486
Inoue, Kentaro; Shimozono, Shinichi; Yoshida, Hideaki; Kurata, Hiroyuki
2012-01-01
For visualizing large-scale biochemical network maps, it is important to calculate the coordinates of molecular nodes quickly and to enhance their understandability and traceability. The grid layout is effective in drawing compact, orderly, balanced network maps with node label spaces, but existing grid layout algorithms often require a high computational cost because they have to consider complicated positional constraints throughout the entire optimization process. We propose a hybrid grid layout algorithm that consists of a non-grid, fast layout (preprocessor) algorithm and an approximate pattern matching algorithm that distributes the preprocessed nodes on square grid points. To demonstrate the feasibility of the hybrid layout algorithm, it is characterized in terms of calculation time, numbers of edge-edge and node-edge crossings, relative edge lengths, and F-measures. The proposed algorithm achieves outstanding performance compared with other existing grid layouts. Use of an approximate pattern matching algorithm quickly redistributes the nodes laid out by the fast, non-grid algorithm onto square grid points while preserving the topological relationships among the nodes. The proposed algorithm is a novel use of pattern matching, thereby providing a breakthrough for grid layout. This application program can be freely downloaded from http://www.cadlive.jp/hybridlayout/hybridlayout.html.
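The redistribution step can be pictured with the hedged sketch below: preprocessed (non-grid) node coordinates are snapped onto free square grid cells. A greedy nearest-free-cell search stands in for the paper's approximate pattern matching algorithm; the function name and grid spacing are illustrative.

```python
# Minimal sketch of the grid redistribution step: snap non-grid node positions
# onto free square grid points without overlaps (greedy stand-in for the
# approximate pattern matching described above).
import numpy as np

def snap_to_grid(positions, grid_size=1.0):
    """positions: (N, 2) float coordinates -> (N, 2) integer grid cells, no overlaps."""
    occupied = set()
    snapped = np.empty_like(positions, dtype=int)
    for i, (x, y) in enumerate(positions):
        gx, gy = int(round(x / grid_size)), int(round(y / grid_size))
        r = 0
        while True:
            # ring of cells at Chebyshev radius r around the nearest grid cell
            candidates = [(gx + dx, gy + dy)
                          for dx in range(-r, r + 1) for dy in range(-r, r + 1)
                          if max(abs(dx), abs(dy)) == r
                          and (gx + dx, gy + dy) not in occupied]
            if candidates:
                cx, cy = min(candidates, key=lambda c: (c[0] - x) ** 2 + (c[1] - y) ** 2)
                occupied.add((cx, cy))
                snapped[i] = (cx, cy)
                break
            r += 1
    return snapped

layout = np.random.rand(20, 2) * 10          # output of a fast non-grid layout
grid_positions = snap_to_grid(layout)
```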
Discovering gene annotations in biomedical text databases
Cakmak, Ali; Ozsoyoglu, Gultekin
2008-01-01
Background Genes and gene products are frequently annotated with Gene Ontology concepts based on the evidence provided in genomics articles. Manually locating and curating information about a genomic entity from the biomedical literature requires vast amounts of human effort. Hence, there is clearly a need for automated computational tools to annotate genes and gene products with Gene Ontology concepts by computationally capturing the related knowledge embedded in textual data. Results In this article, we present an automated genomic entity annotation system, GEANN, which extracts information about the characteristics of genes and gene products in article abstracts from PubMed, and translates the discovered knowledge into Gene Ontology (GO) concepts, a widely used standardized vocabulary of genomic traits. GEANN utilizes textual "extraction patterns" and a semantic matching framework to locate phrases matching a pattern and produce Gene Ontology annotations for genes and gene products. In our experiments, GEANN reached a precision level of 78% at a recall level of 61%. On a select set of Gene Ontology concepts, GEANN either outperforms or is comparable to two other automated annotation studies. Use of WordNet for semantic pattern matching improves the precision and recall by 24% and 15%, respectively, and the improvement due to semantic pattern matching becomes more apparent as the Gene Ontology terms become more general. Conclusion GEANN is useful for two distinct purposes: (i) automating the annotation of genomic entities with Gene Ontology concepts, and (ii) providing existing annotations with additional "evidence articles" from the literature. The use of textual extraction patterns that are constructed based on the existing annotations achieves high precision. The semantic pattern matching framework provides a more flexible pattern matching scheme than "exact matching", with the advantage of locating approximate pattern occurrences with similar semantics. The relatively low recall performance of our pattern-based approach may be enhanced either by employing a probabilistic annotation framework based on the annotation neighbourhoods in textual data, or, alternatively, by adjusting the statistical enrichment threshold to lower values for applications that put more value on achieving higher recall. PMID:18325104
Discovering gene annotations in biomedical text databases.
Cakmak, Ali; Ozsoyoglu, Gultekin
2008-03-06
Genes and gene products are frequently annotated with Gene Ontology concepts based on the evidence provided in genomics articles. Manually locating and curating information about a genomic entity from the biomedical literature requires vast amounts of human effort. Hence, there is clearly a need for automated computational tools to annotate genes and gene products with Gene Ontology concepts by computationally capturing the related knowledge embedded in textual data. In this article, we present an automated genomic entity annotation system, GEANN, which extracts information about the characteristics of genes and gene products in article abstracts from PubMed, and translates the discovered knowledge into Gene Ontology (GO) concepts, a widely used standardized vocabulary of genomic traits. GEANN utilizes textual "extraction patterns" and a semantic matching framework to locate phrases matching a pattern and produce Gene Ontology annotations for genes and gene products. In our experiments, GEANN reached a precision level of 78% at a recall level of 61%. On a select set of Gene Ontology concepts, GEANN either outperforms or is comparable to two other automated annotation studies. Use of WordNet for semantic pattern matching improves the precision and recall by 24% and 15%, respectively, and the improvement due to semantic pattern matching becomes more apparent as the Gene Ontology terms become more general. GEANN is useful for two distinct purposes: (i) automating the annotation of genomic entities with Gene Ontology concepts, and (ii) providing existing annotations with additional "evidence articles" from the literature. The use of textual extraction patterns that are constructed based on the existing annotations achieves high precision. The semantic pattern matching framework provides a more flexible pattern matching scheme than "exact matching", with the advantage of locating approximate pattern occurrences with similar semantics. The relatively low recall performance of our pattern-based approach may be enhanced either by employing a probabilistic annotation framework based on the annotation neighbourhoods in textual data, or, alternatively, by adjusting the statistical enrichment threshold to lower values for applications that put more value on achieving higher recall.
78 FR 15730 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... 1974; Computer Matching Program AGENCY: U.S. Citizenship and Immigration Services, Department of... Matching Program between the Department of Homeland Security, U.S. Citizenship and Immigration Services and... computer matching program between the Department of Homeland Security, U.S. Citizenship and Immigration...
78 FR 32711 - Privacy Act of 1974: Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-31
... DEPARTMENT OF VETERANS AFFAIRS Privacy Act of 1974: Computer Matching Program AGENCY: Department of Veterans Affairs. ACTION: Notice. SUMMARY: The Department of Veterans Affairs (VA) provides notice that it intends to conduct a recurring computer-matching program matching Internal Revenue Service (IRS...
Fast and accurate genotype imputation in genome-wide association studies through pre-phasing
Howie, Bryan; Fuchsberger, Christian; Stephens, Matthew; Marchini, Jonathan; Abecasis, Gonçalo R.
2013-01-01
Sequencing efforts, including the 1000 Genomes Project and disease-specific efforts, are producing large collections of haplotypes that can be used for genotype imputation in genome-wide association studies (GWAS). Imputing from these reference panels can help identify new risk alleles, but the use of large panels with existing methods imposes a high computational burden. To keep imputation broadly accessible, we introduce a strategy called “pre-phasing” that maintains the accuracy of leading methods while cutting computational costs by orders of magnitude. In brief, we first statistically estimate the haplotypes for each GWAS individual (“pre-phasing”) and then impute missing genotypes into these estimated haplotypes. This reduces the computational cost because: (i) the GWAS samples must be phased only once, whereas standard methods would implicitly re-phase with each reference panel update; (ii) it is much faster to match a phased GWAS haplotype to one reference haplotype than to match unphased GWAS genotypes to a pair of reference haplotypes. This strategy will be particularly valuable for repeated imputation as reference panels evolve. PMID:22820512
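The computational argument for pre-phasing can be shown with a toy comparison: a phased haplotype is scored against each of H reference haplotypes, whereas unphased genotypes must be scored against roughly H^2/2 haplotype pairs. The sketch below is an illustration of that count only, using a naive mismatch score; it is not the IMPUTE2/minimac imputation algorithm.

```python
# Illustrative sketch of why pre-phasing is cheaper: H haplotype comparisons
# versus ~H^2/2 pair comparisons. Scoring here is a naive mismatch count.
import numpy as np
from itertools import combinations_with_replacement

rng = np.random.default_rng(0)
L, H = 200, 50
ref = rng.integers(0, 2, size=(H, L))            # reference haplotypes (0/1 alleles)

hap = ref[7]                                     # a pre-phased GWAS haplotype
best_hap = np.argmin([(hap != r).sum() for r in ref])          # H comparisons

geno = ref[7] + ref[23]                          # unphased genotypes (0/1/2)
pairs = list(combinations_with_replacement(range(H), 2))       # ~H^2/2 pairs
best_pair = min(pairs, key=lambda p: np.abs(geno - (ref[p[0]] + ref[p[1]])).sum())
```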
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-09
... of a Computer Matching Program Between the Department of Housing and Urban Development (HUD) and the.... ACTION: Notice of a computer matching program between the HUD and the USDA. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-05
... of a Computer Matching Program Between the Department of Housing and Urban Development (HUD) and the...: Notice of a computer matching program between the HUD and ED. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection Act of 1988 (Pub...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-11-03
... of a Computer Matching Program Between the Department of Housing and Urban Development (HUD) and the.... ACTION: Notice of a computer matching program between the HUD and the SBA. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as amended by the Computer Matching and Privacy Protection...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-07
...), Defense Manpower Data Center (DMDC) and the Office of the Assistant Secretary of Defense (Health Affairs.../TRICARE. DMDC will receive the results of the computer match and provide the information to TMA for use in...
Semi-Global Matching with Self-Adjusting Penalties
NASA Astrophysics Data System (ADS)
Karkalou, E.; Stentoumis, C.; Karras, G.
2017-02-01
The demand for 3D models of various scales and precisions is strong for a wide range of applications, among which cultural heritage recording is particularly important and challenging. In this context, dense image matching is a fundamental task for processes which involve image-based reconstruction of 3D models. Despite the existence of commercial software, the need for complete and accurate results under different conditions, as well as for computational efficiency under a variety of hardware, has kept image-matching algorithms as one of the most active research topics. Semi-global matching (SGM) is among the most popular optimization algorithms due to its accuracy, computational efficiency, and simplicity. A challenging aspect in SGM implementation is the determination of smoothness constraints, i.e. penalties P1, P2 for disparity changes and discontinuities. In fact, penalty adjustment is needed for every particular stereo-pair and cost computation. In this work, a novel formulation of self-adjusting penalties is proposed: SGM penalties can be estimated solely from the statistical properties of the initial disparity space image. The proposed method of self-adjusting penalties (SGM-SAP) is evaluated using typical cost functions on stereo-pairs from the recent Middlebury dataset of interior scenes, as well as from the EPFL Herz-Jesu architectural scenes. Results are competitive against the original SGM estimates. The significant aspects of self-adjusting penalties are: (i) the time-consuming tuning process is avoided; (ii) SGM can be used in image collections with limited number of stereo-pairs; and (iii) no heuristic user intervention is needed.
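The self-adjusting idea can be sketched as follows: the smoothness penalties P1 and P2 are derived from statistics of the initial disparity space image rather than hand-tuned. The mean/standard-deviation heuristic and the scale factors below are assumptions for illustration; the SGM-SAP paper defines its own statistical estimates.

```python
# Hedged sketch: derive SGM penalties P1 < P2 from simple statistics of the
# initial disparity space image (cost volume). The heuristic is illustrative.
import numpy as np

def self_adjusting_penalties(dsi, k1=0.5, k2=2.0):
    """dsi: (H, W, D) matching costs. Returns (P1, P2) with P2 > P1."""
    mu, sigma = dsi.mean(), dsi.std()
    P1 = k1 * sigma
    P2 = max(mu + k2 * sigma, P1 + 1e-6)   # enforce P2 > P1
    return P1, P2

dsi = np.random.rand(64, 96, 48).astype(np.float32)   # placeholder matching costs
P1, P2 = self_adjusting_penalties(dsi)
```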
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2013 CFR
2013-07-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2012 CFR
2012-01-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2014 CFR
2014-01-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2014 CFR
2014-01-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2012 CFR
2012-01-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2013 CFR
2013-07-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
78 FR 1275 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-08
... Social Security Administration (Computer Matching Agreement 1071). SUMMARY: In accordance with the... of its new computer matching program with the Social Security Administration (SSA). DATES: OPM will... conditions under which SSA will disclose Social Security benefit data to OPM via direct computer link. OPM...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2011 CFR
2011-01-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.027 - Centralized salary offset computer match.
Code of Federal Regulations, 2010 CFR
2010-07-01
... offset computer match. 105-56.027 Section 105-56.027 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2010 CFR
2010-07-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
41 CFR 105-56.017 - Centralized salary offset computer match.
Code of Federal Regulations, 2011 CFR
2011-01-01
... offset computer match. 105-56.017 Section 105-56.017 Public Contracts and Property Management Federal... computer match. (a) Delinquent debt records will be compared with Federal employee records maintained by... a delegation of authority from the Secretary, has waived certain requirements of the Computer...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-06
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2012-0015] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Centers for Medicare and Medicaid Services (CMS))--Match Number 1094 AGENCY: Social Security Administration (SSA). ACTION: Notice of a new computer matching program that will expire...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-09
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2009-0077] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Office of Personnel Management (OPM))--Match 1307 AGENCY: Social Security... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-09
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2009-0066] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Internal Revenue Service (IRS))--Match 1305 AGENCY: Social Security... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Public Law (Pub. L.) 100-503...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-07
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0034] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Bureau of the Public Debt (BPD))--Match Number 1304 AGENCY: Social Security... as shown above. SUPPLEMENTARY INFORMATION: A. General The Computer Matching and Privacy Protection...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-12
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0015] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Internal Revenue Service (IRS))--Match Number 1016 AGENCY: Social Security... regarding protections for such persons. The Privacy Act, as amended, regulates the use of computer matching...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-28
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2010-0040] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ Railroad Retirement Board (RRB))--Match Number 1006 AGENCY: Social Security...: A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L.) 100-503), amended the...
78 FR 45513 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-29
...; Computer Matching Program AGENCY: Defense Manpower Data Center (DMDC), DoD. ACTION: Notice of a Computer... individual's privacy, and would result in additional delay in determining eligibility and, if applicable, the... Defense. NOTICE OF A COMPUTER MATCHING PROGRAM AMONG THE DEFENSE MANPOWER DATA CENTER, THE DEPARTMENT OF...
76 FR 1410 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-10
...; Computer Matching Program AGENCY: Defense Manpower Data Center (DMDC), DoD. ACTION: Notice of a Computer... administrative burden, constitute a greater intrusion of the individual's privacy, and would result in additional... Liaison Officer, Department of Defense. Notice of a Computer Matching Program Among the Defense Manpower...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 32 National Defense 3 2013-07-01 2013-07-01 false Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 32 National Defense 3 2012-07-01 2009-07-01 true Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 32 National Defense 3 2014-07-01 2014-07-01 false Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 32 National Defense 3 2011-07-01 2009-07-01 true Computer Matching Agreement Program. 505.13... AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a) General provisions. (1) Pursuant to the Privacy Act and this part, DA records may be subject to computer...
Efficient Iris Recognition Based on Optimal Subfeature Selection and Weighted Subregion Fusion
Deng, Ning
2014-01-01
In this paper, we propose three discriminative feature selection strategies and a weighted subregion matching method to improve the performance of an iris recognition system. Firstly, we introduce the process of feature extraction and representation based on the scale invariant feature transform (SIFT) in detail. Secondly, three strategies are described: an orientation probability distribution function (OPDF) based strategy to delete redundant feature keypoints, a magnitude probability distribution function (MPDF) based strategy to reduce the dimensionality of feature elements, and a compound strategy combining OPDF and MPDF to further select an optimal subfeature set. Thirdly, to make matching more effective, this paper proposes a novel matching method based on weighted subregion matching fusion. Particle swarm optimization is utilized to efficiently obtain the different subregions' weights, and the weighted subregion matching scores are then combined to generate the final decision. The experimental results, on three public and renowned iris databases (CASIA-V3 Interval, Lamp, and MMU-V1), demonstrate that our proposed methods outperform some of the existing methods in terms of correct recognition rate, equal error rate, and computational complexity. PMID:24683317
Efficient iris recognition based on optimal subfeature selection and weighted subregion fusion.
Chen, Ying; Liu, Yuanning; Zhu, Xiaodong; He, Fei; Wang, Hongye; Deng, Ning
2014-01-01
In this paper, we propose three discriminative feature selection strategies and a weighted subregion matching method to improve the performance of an iris recognition system. Firstly, we introduce the process of feature extraction and representation based on the scale invariant feature transform (SIFT) in detail. Secondly, three strategies are described: an orientation probability distribution function (OPDF) based strategy to delete redundant feature keypoints, a magnitude probability distribution function (MPDF) based strategy to reduce the dimensionality of feature elements, and a compound strategy combining OPDF and MPDF to further select an optimal subfeature set. Thirdly, to make matching more effective, this paper proposes a novel matching method based on weighted subregion matching fusion. Particle swarm optimization is utilized to efficiently obtain the different subregions' weights, and the weighted subregion matching scores are then combined to generate the final decision. The experimental results, on three public and renowned iris databases (CASIA-V3 Interval, Lamp, and MMU-V1), demonstrate that our proposed methods outperform some of the existing methods in terms of correct recognition rate, equal error rate, and computational complexity.
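The fusion step itself reduces to a weighted combination of per-subregion match scores. The sketch below uses a fixed weight vector as a stand-in for the PSO-learned weights, and the scores and threshold are synthetic.

```python
# Minimal sketch of weighted subregion match-score fusion: each iris subregion
# contributes a match score; a weight vector (learned by PSO in the paper,
# fixed here for illustration) combines them into one decision score.
import numpy as np

def fused_score(subregion_scores, weights):
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()               # normalize to sum to 1
    return float(np.dot(weights, subregion_scores))

scores = np.array([0.82, 0.40, 0.77, 0.65])          # e.g. upper/lower/left/right regions
weights = np.array([0.35, 0.10, 0.30, 0.25])          # less weight where eyelids occlude
decision = fused_score(scores, weights) > 0.6         # illustrative acceptance threshold
```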
ERIC Educational Resources Information Center
Olive, G.; And Others
A selective dissemination of information service based on computer scanning of Nuclear Science Abstracts tapes has operated at the Atomic Energy Research Establishment, Harwell, England since October, 1968. The performance of the mechanized SDI service has been compared with that of the pre-existing current awareness service which is based on…
Massanes, Francesc; Cadennes, Marie; Brankov, Jovan G.
2012-01-01
In this paper we describe and evaluate a fast implementation of a classical block matching motion estimation algorithm for multiple Graphical Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) computing engine. The implemented block matching algorithm (BMA) uses summed absolute difference (SAD) error criterion and full grid search (FS) for finding optimal block displacement. In this evaluation we compared the execution time of a GPU and CPU implementation for images of various sizes, using integer and non-integer search grids. The results show that use of a GPU card can shorten computation time by a factor of 200 times for integer and 1000 times for a non-integer search grid. The additional speedup for the non-integer search grid comes from the fact that the GPU has built-in hardware for image interpolation. Further, when using multiple GPU cards, the presented evaluation shows the importance of the data splitting method across multiple cards, but an almost linear speedup with the number of cards is achievable. In addition we compared the execution time of the proposed FS GPU implementation with two existing, highly optimized non-full grid search CPU-based motion estimation methods, namely the implementation of the Pyramidal Lucas Kanade Optical Flow algorithm in OpenCV and the Simplified Unsymmetrical multi-Hexagon search in the H.264/AVC standard. In these comparisons, the FS GPU implementation still showed modest improvement even though the computational complexity of the FS GPU implementation is substantially higher than the non-FS CPU implementations. We also demonstrated that for an image sequence of 720×480 pixels in resolution, commonly used in video surveillance, the proposed GPU implementation is sufficiently fast for real-time motion estimation at 30 frames-per-second using two NVIDIA C1060 Tesla GPU cards. PMID:22347787
Massanes, Francesc; Cadennes, Marie; Brankov, Jovan G
2011-07-01
In this paper we describe and evaluate a fast implementation of a classical block matching motion estimation algorithm for multiple Graphical Processing Units (GPUs) using the Compute Unified Device Architecture (CUDA) computing engine. The implemented block matching algorithm (BMA) uses summed absolute difference (SAD) error criterion and full grid search (FS) for finding optimal block displacement. In this evaluation we compared the execution time of a GPU and CPU implementation for images of various sizes, using integer and non-integer search grids. The results show that use of a GPU card can shorten computation time by a factor of 200 times for integer and 1000 times for a non-integer search grid. The additional speedup for the non-integer search grid comes from the fact that the GPU has built-in hardware for image interpolation. Further, when using multiple GPU cards, the presented evaluation shows the importance of the data splitting method across multiple cards, but an almost linear speedup with the number of cards is achievable. In addition we compared the execution time of the proposed FS GPU implementation with two existing, highly optimized non-full grid search CPU-based motion estimation methods, namely the implementation of the Pyramidal Lucas Kanade Optical Flow algorithm in OpenCV and the Simplified Unsymmetrical multi-Hexagon search in the H.264/AVC standard. In these comparisons, the FS GPU implementation still showed modest improvement even though the computational complexity of the FS GPU implementation is substantially higher than the non-FS CPU implementations. We also demonstrated that for an image sequence of 720×480 pixels in resolution, commonly used in video surveillance, the proposed GPU implementation is sufficiently fast for real-time motion estimation at 30 frames-per-second using two NVIDIA C1060 Tesla GPU cards.
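For a concrete reference point, the following is a minimal CPU-only sketch of the full-search SAD block matching algorithm that the two records above accelerate on GPUs; it is plain NumPy on an integer search grid and is not the CUDA implementation described in the abstracts.

```python
import numpy as np

def sad_full_search(ref, cur, block=16, radius=8):
    """Full-search block matching with the SAD criterion.

    For every block of the current frame, exhaustively test all integer
    displacements within +/- radius in the reference frame and keep the one
    with the smallest summed absolute difference.
    Returns an array of (dy, dx) motion vectors, one per block.
    """
    h, w = cur.shape
    mv = np.zeros((h // block, w // block, 2), dtype=int)
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            cur_blk = cur[by:by + block, bx:bx + block].astype(np.int32)
            best = (np.inf, 0, 0)
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    y, x = by + dy, bx + dx
                    if y < 0 or x < 0 or y + block > h or x + block > w:
                        continue
                    ref_blk = ref[y:y + block, x:x + block].astype(np.int32)
                    sad = np.abs(cur_blk - ref_blk).sum()
                    if sad < best[0]:
                        best = (sad, dy, dx)
            mv[by // block, bx // block] = best[1:]
    return mv

# tiny synthetic example: the second frame is the first shifted by (2, 3) pixels
rng = np.random.default_rng(1)
f0 = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
f1 = np.roll(f0, shift=(2, 3), axis=(0, 1))
print(sad_full_search(f0, f1, block=16, radius=4)[1, 1])  # expect about [-2 -3]
```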
Using maximum topology matching to explore differences in species distribution models
Poco, Jorge; Doraiswamy, Harish; Talbert, Marian; Morisette, Jeffrey; Silva, Claudio
2015-01-01
Species distribution models (SDM) are used to help understand what drives the distribution of various plant and animal species. These models are typically high dimensional scalar functions, where the dimensions of the domain correspond to predictor variables of the model algorithm. Understanding and exploring the differences between models help ecologists understand areas where their data or understanding of the system is incomplete and will help guide further investigation in these regions. These differences can also indicate an important source of model to model uncertainty. However, it is cumbersome and often impractical to perform this analysis using existing tools, which allows for manual exploration of the models usually as 1-dimensional curves. In this paper, we propose a topology-based framework to help ecologists explore the differences in various SDMs directly in the high dimensional domain. In order to accomplish this, we introduce the concept of maximum topology matching that computes a locality-aware correspondence between similar extrema of two scalar functions. The matching is then used to compute the similarity between two functions. We also design a visualization interface that allows ecologists to explore SDMs using their topological features and to study the differences between pairs of models found using maximum topological matching. We demonstrate the utility of the proposed framework through several use cases using different data sets and report the feedback obtained from ecologists.
The prediction in computer color matching of dentistry based on GA+BP neural network.
Li, Haisheng; Lai, Long; Chen, Li; Lu, Cheng; Cai, Qiang
2015-01-01
Although the use of computer color matching can reduce the influence of subjective factors by technicians, matching the color of a natural tooth with a ceramic restoration is still one of the most challenging topics in esthetic prosthodontics. The back propagation neural network (BPNN) has already been introduced into computer color matching in dentistry, but it has disadvantages such as instability and low accuracy. In our study, we adopt a genetic algorithm (GA) to optimize the initial weights and threshold values in the BPNN to improve the matching precision. To our knowledge, this is the first work to combine a BPNN with a GA for computer color matching in dentistry. Extensive experiments demonstrate that the proposed method improves the precision and prediction robustness of color matching in restorative dentistry.
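The abstract only outlines the GA+BP idea, so the following toy sketch shows, under invented data and a hypothetical network size, how a genetic algorithm can evolve the initial weights of a small feed-forward network before gradient-descent (back-propagation) training takes over; it makes no claim to reproduce the dental color matching setup of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# toy data: map 3 colour-measurement inputs to 1 shade parameter (hypothetical)
X = rng.normal(size=(100, 3))
y = (np.sin(X[:, 0]) + 0.5 * X[:, 1] - 0.3 * X[:, 2]).reshape(-1, 1)

H = 8                                    # hidden units (arbitrary choice)
n_w = 3 * H + H + H * 1 + 1              # total number of weights and biases

def unpack(w):
    i = 0
    W1 = w[i:i + 3 * H].reshape(3, H); i += 3 * H
    b1 = w[i:i + H]; i += H
    W2 = w[i:i + H].reshape(H, 1); i += H
    b2 = w[i:]
    return W1, b1, W2, b2

def loss(w):
    W1, b1, W2, b2 = unpack(w)
    pred = np.tanh(X @ W1 + b1) @ W2 + b2
    return float(np.mean((pred - y) ** 2))

# --- GA stage: evolve candidate initial weight vectors ---------------------
pop = rng.normal(scale=0.5, size=(40, n_w))
for gen in range(30):
    fit = np.array([loss(w) for w in pop])
    parents = pop[np.argsort(fit)[:10]]            # elitist selection
    children = []
    for _ in range(30):
        a, b = parents[rng.integers(10, size=2)]
        mask = rng.random(n_w) < 0.5               # uniform crossover
        children.append(np.where(mask, a, b) + rng.normal(scale=0.05, size=n_w))
    pop = np.vstack([parents] + children)
best = pop[np.argmin([loss(w) for w in pop])]

# --- BP stage: plain gradient descent started from the GA's best individual --
w = best.copy()
lr = 0.05
for step in range(2000):
    W1, b1, W2, b2 = unpack(w)
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2
    err = 2 * (pred - y) / len(X)                  # dL/dpred for the MSE loss
    gW2 = h.T @ err
    gb2 = err.sum(axis=0)
    dh = (err @ W2.T) * (1 - h ** 2)               # backprop through tanh
    gW1 = X.T @ dh
    gb1 = dh.sum(axis=0)
    w -= lr * np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])
print("final training MSE:", loss(w))
```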
libFLASM: a software library for fixed-length approximate string matching.
Ayad, Lorraine A K; Pissis, Solon P P; Retha, Ahmad
2016-11-10
Approximate string matching is the problem of finding all factors of a given text that are at a distance at most k from a given pattern. Fixed-length approximate string matching is the problem of finding all factors of a text of length n that are at a distance at most k from any factor of length ℓ of a pattern of length m. There exist bit-vector techniques to solve the fixed-length approximate string matching problem in time [Formula: see text] and space [Formula: see text] under the edit and Hamming distance models, where w is the size of the computer word; as such these techniques are independent of the distance threshold k or the alphabet size. Fixed-length approximate string matching is a generalisation of approximate string matching and, hence, has numerous direct applications in computational molecular biology and elsewhere. We present and make available libFLASM, a free open-source C++ software library for solving fixed-length approximate string matching under both the edit and the Hamming distance models. Moreover we describe how fixed-length approximate string matching is applied to solve real problems by incorporating libFLASM into established applications for multiple circular sequence alignment as well as single and structured motif extraction. Specifically, we describe how it can be used to improve the accuracy of multiple circular sequence alignment in terms of the inferred likelihood-based phylogenies; and we also describe how it is used to efficiently find motifs in molecular sequences representing regulatory or functional regions. The comparison of the performance of the library to other algorithms shows that it is competitive, especially with increasing distance thresholds. Fixed-length approximate string matching is a generalisation of the classic approximate string matching problem. We present libFLASM, a free open-source C++ software library for solving fixed-length approximate string matching. The extensive experimental results presented here suggest that other applications could benefit from using libFLASM, and thus further maintenance and development of libFLASM is desirable.
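To make the problem statement concrete, here is a deliberately naive reference sketch of fixed-length approximate string matching under the Hamming distance model: it compares every length-ℓ factor of the text against every length-ℓ factor of the pattern and reports pairs within distance k. It is quadratic and is not libFLASM's bit-vector algorithm; it only clarifies what the library computes.

```python
def hamming(u, v):
    """Hamming distance between two equal-length strings."""
    return sum(a != b for a, b in zip(u, v))

def fixed_length_approx_matches(text, pattern, ell, k):
    """Naive fixed-length approximate string matching (Hamming model).

    Yields (i, j, d): the text factor text[i:i+ell] is at distance d <= k
    from the pattern factor pattern[j:j+ell].
    """
    for i in range(len(text) - ell + 1):
        t_factor = text[i:i + ell]
        for j in range(len(pattern) - ell + 1):
            d = hamming(t_factor, pattern[j:j + ell])
            if d <= k:
                yield i, j, d

# toy example on a DNA-like alphabet
text = "ACGTACGTTACG"
pattern = "TTACGA"
for hit in fixed_length_approx_matches(text, pattern, ell=4, k=1):
    print(hit)
```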
78 FR 40541 - Privacy Act of 1974, as Amended; Computer Matching Program (SSA)-Match Number 1014
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-05
... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA 2013-0019] Privacy Act of 1974, as Amended; Computer Matching Program (SSA)--Match Number 1014 AGENCY: Social Security Administration (SSA).
Computational solutions to large-scale data management and analysis
Schadt, Eric E.; Linderman, Michael D.; Sorenson, Jon; Lee, Lawrence; Nolan, Garry P.
2011-01-01
Today we can generate hundreds of gigabases of DNA and RNA sequencing data in a week for less than US$5,000. The astonishing rate of data generation by these low-cost, high-throughput technologies in genomics is being matched by that of other technologies, such as real-time imaging and mass spectrometry-based flow cytometry. Success in the life sciences will depend on our ability to properly interpret the large-scale, high-dimensional data sets that are generated by these technologies, which in turn requires us to adopt advances in informatics. Here we discuss how we can master the different types of computational environments that exist — such as cloud and heterogeneous computing — to successfully tackle our big data problems. PMID:20717155
A neural substrate for object permanence in monkey inferotemporal cortex
Puneeth, N. C.; Arun, S. P.
2016-01-01
We take it for granted that objects continue to exist after being occluded. This knowledge – known as object permanence – is present even in childhood, but its neural basis is not fully understood. Here, we show that monkey inferior temporal (IT) neurons carry potential signals of object permanence even in animals that received no explicit behavioral training. We compared two conditions with identical visual stimulation: the same object emerged from behind an occluder as expected following its occlusion, or unexpectedly after occlusion of a different object. Some neurons produced a larger (surprise) signal when the object emerged unexpectedly, whereas other neurons produced a larger (match) signal when the object reappeared as expected. Neurons carrying match signals also reinstated selective delay period activity just before the object emerged. Thus, signals related to object permanence are present in IT neurons and may arise through an interplay of memory and match computations. PMID:27484111
A neural substrate for object permanence in monkey inferotemporal cortex.
Puneeth, N C; Arun, S P
2016-08-03
We take it for granted that objects continue to exist after being occluded. This knowledge - known as object permanence - is present even in childhood, but its neural basis is not fully understood. Here, we show that monkey inferior temporal (IT) neurons carry potential signals of object permanence even in animals that received no explicit behavioral training. We compared two conditions with identical visual stimulation: the same object emerged from behind an occluder as expected following its occlusion, or unexpectedly after occlusion of a different object. Some neurons produced a larger (surprise) signal when the object emerged unexpectedly, whereas other neurons produced a larger (match) signal when the object reappeared as expected. Neurons carrying match signals also reinstated selective delay period activity just before the object emerged. Thus, signals related to object permanence are present in IT neurons and may arise through an interplay of memory and match computations.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Secretary, has waived certain requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U... process known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...
76 FR 50460 - Privacy Act of 1974; Notice of a Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-15
...; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, Department of Defense (DoD). ACTION: Notice of a Computer Matching Program. SUMMARY: Subsection (e)(12) of the Privacy Act of 1974, as amended, (5 U.S.C. 552a) requires agencies to publish advance notice of any proposed or revised computer...
76 FR 77811 - Privacy Act of 1974; Notice of a Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-12-14
...; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, Department of Defense (DoD). ACTION: Notice of a Computer Matching Program. SUMMARY: Subsection (e)(12) of the Privacy Act of 1974, as amended, (5 U.S.C. 552a) requires agencies to publish advance notice of any proposed or revised computer...
Systematic comparison of the behaviors produced by computational models of epileptic neocortex.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Warlaumont, A. S.; Lee, H. C.; Benayoun, M.
2010-12-01
Two existing models of brain dynamics in epilepsy, one detailed (i.e., realistic) and one abstract (i.e., simplified) are compared in terms of behavioral range and match to in vitro mouse recordings. A new method is introduced for comparing across computational models that may have very different forms. First, high-level metrics were extracted from model and in vitro output time series. A principal components analysis was then performed over these metrics to obtain a reduced set of derived features. These features define a low-dimensional behavior space in which quantitative measures of behavioral range and degree of match to real data can be obtained. The detailed and abstract models and the mouse recordings overlapped considerably in behavior space. Both the range of behaviors and similarity to mouse data were similar between the detailed and abstract models. When no high-level metrics were used and principal components analysis was computed over raw time series, the models overlapped minimally with the mouse recordings. The method introduced here is suitable for comparing across different kinds of model data and across real brain recordings. It appears that, despite differences in form and computational expense, detailed and abstract models do not necessarily differ in their behaviors.
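A compact sketch of the comparison pipeline described above, with made-up metrics and synthetic time series standing in for the model outputs and mouse recordings; the principal components analysis is done with a plain SVD.

```python
import numpy as np

def metrics(ts):
    """A few illustrative high-level metrics of a 1-D time series
    (stand-ins for the metrics used in the study)."""
    ac = np.corrcoef(ts[:-1], ts[1:])[0, 1]                # lag-1 autocorrelation
    dom = np.argmax(np.abs(np.fft.rfft(ts - ts.mean())))   # dominant frequency bin
    return np.array([ts.mean(), ts.std(), ac, dom], dtype=float)

def behavior_space(groups, n_components=2):
    """Stack per-trace metrics from several groups (models, recordings),
    z-score them, and project onto the leading principal components."""
    X = np.vstack([metrics(ts) for grp in groups for ts in grp])
    labels = np.concatenate([[i] * len(grp) for i, grp in enumerate(groups)])
    X = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    Z = X @ Vt[:n_components].T                            # coordinates in behavior space
    return Z, labels

# synthetic stand-ins: "detailed model", "abstract model", "recordings"
rng = np.random.default_rng(0)
groups = [[np.sin(np.linspace(0, 20, 500) * f) + rng.normal(0, s, 500)
           for f in rng.uniform(0.5, 2.0, 20)]
          for s in (0.1, 0.2, 0.3)]
Z, labels = behavior_space(groups)
for i in range(3):
    pts = Z[labels == i]
    print(i, "centroid:", pts.mean(axis=0).round(2), "spread:", pts.std(axis=0).round(2))
```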
Stewart, Terrence C; Eliasmith, Chris
2013-06-01
Quantum probability (QP) theory can be seen as a type of vector symbolic architecture (VSA): mental states are vectors storing structured information and manipulated using algebraic operations. Furthermore, the operations needed by QP match those in other VSAs. This allows existing biologically realistic neural models to be adapted to provide a mechanistic explanation of the cognitive phenomena described in the target article by Pothos & Busemeyer (P&B).
NASA Technical Reports Server (NTRS)
Chen, Shu-Po
1999-01-01
This paper presents software for handling non-conforming fluid-structure interfaces in aeroelastic simulation. It reviews the interpolation and integration algorithms, and highlights the flexibility and user-friendly features that allow the user to select existing structural and fluid packages, such as NASTRAN and CFL3D, to perform the simulation. The presented software is validated by computing the High Speed Civil Transport model.
78 FR 15734 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2013-0010] Privacy Act of 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... computer matching program between the Department of Homeland Security/U.S. Citizenship and Immigration...
78 FR 15733 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2013-0008] Privacy Act of 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... computer matching program between the Department of Homeland Security/U.S. Citizenship and Immigration...
78 FR 15731 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2013-0011] Privacy Act of 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and Immigration Services. ACTION: Notice. Overview Information: Privacy Act of 1974; Computer Matching Program...
78 FR 15732 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-03-12
... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2013-0007] Privacy Act of 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and Immigration Services. ACTION: Notice. Overview Information: Privacy Act of 1974; Computer Matching Program...
Code of Federal Regulations, 2010 CFR
2010-07-01
... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...
Code of Federal Regulations, 2011 CFR
2011-07-01
... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...
Code of Federal Regulations, 2012 CFR
2012-07-01
... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...
Code of Federal Regulations, 2013 CFR
2013-07-01
... requirements of the Computer Matching and Privacy Protection Act of 1988, 5 U.S.C. 552a, as amended, for... known as centralized salary offset computer matching, identify Federal employees who owe delinquent nontax debt to the United States. Centralized salary offset computer matching is the computerized...
Matching pursuit parallel decomposition of seismic data
NASA Astrophysics Data System (ADS)
Li, Chuanhui; Zhang, Fanchang
2017-07-01
In order to improve the computation speed of matching pursuit decomposition of seismic data, a matching pursuit parallel algorithm is designed in this paper. In every iteration, we pick a fixed number of envelope peaks from the current signal according to the number of compute nodes and distribute them evenly across the nodes, which then search for the optimal Morlet wavelets in parallel. With the help of parallel computer systems and the Message Passing Interface, the parallel algorithm exploits the advantages of parallel computing to significantly improve the computation speed of the matching pursuit decomposition, and it also scales well. Moreover, having each compute node search for only one optimal Morlet wavelet per iteration is the most efficient implementation.
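The sketch below shows the serial matching pursuit loop with Morlet-like atoms that each compute node would perform; the paper's parallel scheme distributes the candidate envelope peaks across MPI ranks, which is only indicated by a comment here, and the atom parameterization is a simplified stand-in.

```python
import numpy as np

def morlet(t, t0, f, sigma):
    """Real Morlet-like atom centered at t0 with frequency f and width sigma."""
    g = np.exp(-0.5 * ((t - t0) / sigma) ** 2) * np.cos(2 * np.pi * f * (t - t0))
    n = np.linalg.norm(g)
    return g / n if n > 0 else g

def matching_pursuit(signal, t, n_iter=5):
    """Greedy matching pursuit with a small brute-force Morlet dictionary.

    In the parallel version each MPI rank would scan a different subset of
    candidate centers (envelope peaks) and the best atom overall would be
    chosen by a reduction; here everything runs on one process.
    """
    residual = signal.astype(float).copy()
    atoms = []
    # crude candidate centers: largest-amplitude samples stand in for envelope peaks
    centers = t[np.argsort(np.abs(residual))[-20:]]
    freqs = np.linspace(5.0, 60.0, 12)
    sigmas = np.array([0.01, 0.02, 0.04])
    for _ in range(n_iter):
        best = (0.0, None)
        for t0 in centers:
            for f in freqs:
                for s in sigmas:
                    g = morlet(t, t0, f, s)
                    c = float(residual @ g)        # correlation with the residual
                    if abs(c) > abs(best[0]):
                        best = (c, g)
        c, g = best
        residual -= c * g                          # subtract the projection
        atoms.append((c, g))
    return atoms, residual

# synthetic seismic-like trace: two wavelets plus noise
t = np.linspace(0, 1, 1000)
sig = morlet(t, 0.3, 25, 0.02) * 3 + morlet(t, 0.7, 40, 0.02) * 2
sig += np.random.default_rng(0).normal(0, 0.05, t.size)
atoms, res = matching_pursuit(sig, t, n_iter=4)
print("residual energy:", round(float(res @ res), 4))
```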
The architecture of human kin detection
Lieberman, Debra; Tooby, John; Cosmides, Leda
2012-01-01
Evolved mechanisms for assessing genetic relatedness have been found in many species, but their existence in humans has been a matter of controversy. Here we report three converging lines of evidence, drawn from siblings, that support the hypothesis that kin detection mechanisms exist in humans. These operate by computing, for each familiar individual, a unitary regulatory variable (the kinship index) that corresponds to a pairwise estimate of genetic relatedness between self and other. The cues that the system uses were identified by quantitatively matching individual exposure to potential cues of relatedness to variation in three outputs relevant to the system’s evolved functions: sibling altruism, aversion to personally engaging in sibling incest, and moral opposition to third party sibling incest. As predicted, the kin detection system uses two distinct, ancestrally valid cues to compute relatedness: the familiar other’s perinatal association with the individual’s biological mother, and duration of sibling coresidence. PMID:17301784
78 FR 38724 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-27
... DEPARTMENT OF HOMELAND SECURITY Office of the Secretary [Docket No. DHS-2013-0006] Privacy Act of 1974; Computer Matching Program AGENCY: Department of Homeland Security/U.S. Citizenship and... Agreement that establishes a computer matching program between the Department of Homeland Security/U.S...
Feasibility of Equivalent Dipole Models for Electroencephalogram-Based Brain Computer Interfaces.
Schimpf, Paul H
2017-09-15
This article examines the localization errors of equivalent dipolar sources inverted from the surface electroencephalogram in order to determine the feasibility of using their location as classification parameters for non-invasive brain computer interfaces. Inverse localization errors are examined for two head models: a model represented by four concentric spheres and a realistic model based on medical imagery. It is shown that the spherical model results in localization ambiguity such that a number of dipolar sources, with different azimuths and varying orientations, provide a near match to the electroencephalogram of the best equivalent source. No such ambiguity exists for the elevation of inverted sources, indicating that for spherical head models, only the elevation of inverted sources (and not the azimuth) can be expected to provide meaningful classification parameters for brain-computer interfaces. In a realistic head model, all three parameters of the inverted source location are found to be reliable, providing a more robust set of parameters. In both cases, the residual error hypersurfaces demonstrate local minima, indicating that a search for the best-matching sources should be global. Source localization error vs. signal-to-noise ratio is also demonstrated for both head models.
Empirical constrained Bayes predictors accounting for non-detects among repeated measures.
Moore, Reneé H; Lyles, Robert H; Manatunga, Amita K
2010-11-10
When the prediction of subject-specific random effects is of interest, constrained Bayes predictors (CB) have been shown to reduce the shrinkage of the widely accepted Bayes predictor while still maintaining desirable properties, such as optimizing mean-square error subsequent to matching the first two moments of the random effects of interest. However, occupational exposure and other epidemiologic (e.g. HIV) studies often present a further challenge because data may fall below the measuring instrument's limit of detection. Although methodology exists in the literature to compute Bayes estimates in the presence of non-detects (Bayes(ND)), CB methodology has not been proposed in this setting. By combining methodologies for computing CBs and Bayes(ND), we introduce two novel CBs that accommodate an arbitrary number of observable and non-detectable measurements per subject. Based on application to real data sets (e.g. occupational exposure, HIV RNA) and simulation studies, these CB predictors are markedly superior to the Bayes predictor and to alternative predictors computed using ad hoc methods in terms of meeting the goal of matching the first two moments of the true random effects distribution. Copyright © 2010 John Wiley & Sons, Ltd.
32 CFR 505.13 - Computer Matching Agreement Program.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 32 National Defense 3 2010-07-01 2010-07-01 true Computer Matching Agreement Program. 505.13 Section 505.13 National Defense Department of Defense (Continued) DEPARTMENT OF THE ARMY AID OF CIVIL AUTHORITIES AND PUBLIC RELATIONS ARMY PRIVACY ACT PROGRAM § 505.13 Computer Matching Agreement Program. (a...
77 FR 74518 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-12-14
... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program AGENCY: Office of Personnel Management. ACTION: Notice--computer matching between the Office of Personnel Management and the Social Security Administration. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as...
78 FR 35647 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-06-13
... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program AGENCY: Office of Personnel Management. ACTION: Notice of computer matching between the Office of Personnel Management and the Social Security Administration (CMA 1045). SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C...
75 FR 17788 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-07
... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program AGENCY: Office of Personnel Management. ACTION: Notice--computer matching between the Office of Personnel Management and the Social Security Administration. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as...
75 FR 31819 - Privacy Act of 1974; Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-04
... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program AGENCY: Office of Personnel Management. ACTION: Notice--computer matching between the Office of Personnel Management and the Social Security Administration. SUMMARY: In accordance with the Privacy Act of 1974 (5 U.S.C. 552a), as...
75 FR 54162 - Privacy Act of 1974
Federal Register 2010, 2011, 2012, 2013, 2014
2010-09-03
... Program A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503), amended the... DEPARTMENT OF HEALTH AND HUMAN SERVICES Centers for Medicare and Medicaid Services [CMS Computer Match No. 2010-01; HHS Computer Match No. 1006] Privacy Act of 1974 AGENCY: Department of Health and...
Supervised learning of tools for content-based search of image databases
NASA Astrophysics Data System (ADS)
Delanoy, Richard L.
1996-03-01
A computer environment, called the Toolkit for Image Mining (TIM), is being developed with the goal of enabling users with diverse interests and varied computer skills to create search tools for content-based image retrieval and other pattern matching tasks. Search tools are generated using a simple paradigm of supervised learning that is based on the user pointing at mistakes of classification made by the current search tool. As mistakes are identified, a learning algorithm uses the identified mistakes to build up a model of the user's intentions, construct a new search tool, apply the search tool to a test image, display the match results as feedback to the user, and accept new inputs from the user. Search tools are constructed in the form of functional templates, which are generalized matched filters capable of knowledge-based image processing. The ability of this system to learn the user's intentions from experience contrasts with other existing approaches to content-based image retrieval that base searches on the characteristics of a single input example or on a predefined and semantically-constrained textual query. Currently, TIM is capable of learning spectral and textural patterns, but should be adaptable to the learning of shapes, as well. Possible applications of TIM include not only content-based image retrieval, but also quantitative image analysis, the generation of metadata for annotating images, data prioritization or data reduction in bandwidth-limited situations, and the construction of components for larger, more complex computer vision algorithms.
Rajaraman, Prathish K; Manteuffel, T A; Belohlavek, M; Heys, Jeffrey J
2017-01-01
A new approach has been developed for combining and enhancing the results from an existing computational fluid dynamics model with experimental data using the weighted least-squares finite element method (WLSFEM). Development of the approach was motivated by the existence of both limited experimental blood velocity in the left ventricle and inexact numerical models of the same flow. Limitations of the experimental data include measurement noise and having data only along a two-dimensional plane. Most numerical modeling approaches do not provide the flexibility to assimilate noisy experimental data. We previously developed an approach that could assimilate experimental data into the process of numerically solving the Navier-Stokes equations, but the approach was limited because it required the use of specific finite element methods for solving all model equations and did not support alternative numerical approximation methods. The new approach presented here allows virtually any numerical method to be used for approximately solving the Navier-Stokes equations, and then the WLSFEM is used to combine the experimental data with the numerical solution of the model equations in a final step. The approach dynamically adjusts the influence of the experimental data on the numerical solution so that more accurate data are more closely matched by the final solution and less accurate data are not closely matched. The new approach is demonstrated on different test problems and provides significantly reduced computational costs compared with many previous methods for data assimilation. Copyright © 2016 John Wiley & Sons, Ltd.
78 FR 25785 - Privacy Act of 1974; Report of Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-02
... of Veterans Affairs. ACTION: Notice of Computer Matching Program. SUMMARY: The Department of Veterans Affairs (VA) provides notice that it intends to conduct a recurring computer-matching program matching Social Security Administration (SSA) Master Beneficiary Records (MBR) and Self-Employment Income System...
A Surrogate-based Adaptive Sampling Approach for History Matching and Uncertainty Quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Weixuan; Zhang, Dongxiao; Lin, Guang
A critical procedure in reservoir simulations is history matching (or data assimilation in a broader sense), which calibrates model parameters such that the simulation results are consistent with field measurements, and hence improves the credibility of the predictions given by the simulations. Often there exist non-unique combinations of parameter values that all yield the simulation results matching the measurements. For such ill-posed history matching problems, Bayesian theorem provides a theoretical foundation to represent different solutions and to quantify the uncertainty with the posterior PDF. Lacking an analytical solution in most situations, the posterior PDF may be characterized with a sample of realizations, each representing a possible scenario. A novel sampling algorithm is presented here for the Bayesian solutions to history matching problems. We aim to deal with two commonly encountered issues: 1) as a result of the nonlinear input-output relationship in a reservoir model, the posterior distribution could be in a complex form, such as multimodal, which violates the Gaussian assumption required by most of the commonly used data assimilation approaches; 2) a typical sampling method requires intensive model evaluations and hence may cause unaffordable computational cost. In the developed algorithm, we use a Gaussian mixture model as the proposal distribution in the sampling process, which is simple but also flexible to approximate non-Gaussian distributions and is particularly efficient when the posterior is multimodal. Also, a Gaussian process is utilized as a surrogate model to speed up the sampling process. Furthermore, an iterative scheme of adaptive surrogate refinement and re-sampling ensures sampling accuracy while keeping the computational cost at a minimum level. The developed approach is demonstrated with an illustrative example and shows its capability in handling the above-mentioned issues. The multimodal posterior of the history matching problem is captured and used to give a reliable production prediction with uncertainty quantification. The new algorithm reveals a great improvement in terms of computational efficiency compared with previously studied approaches for the sample problem.
Matching and correlation computations in stereoscopic depth perception.
Doi, Takahiro; Tanabe, Seiji; Fujita, Ichiro
2011-03-02
A fundamental task of the visual system is to infer depth by using binocular disparity. To encode binocular disparity, the visual cortex performs two distinct computations: one detects matched patterns in paired images (matching computation); the other constructs the cross-correlation between the images (correlation computation). How the two computations are used in stereoscopic perception is unclear. We dissociated their contributions in near/far discrimination by varying the magnitude of the disparity across separate sessions. For small disparity (0.03°), subjects performed at chance level to a binocularly opposite-contrast (anti-correlated) random-dot stereogram (RDS) but improved their performance with the proportion of contrast-matched (correlated) dots. For large disparity (0.48°), the direction of perceived depth reversed with an anti-correlated RDS relative to that for a correlated one. Neither reversed nor normal depth was perceived when anti-correlation was applied to half of the dots. We explain the decision process as a weighted average of the two computations, with the relative weight of the correlation computation increasing with the disparity magnitude. We conclude that matching computation dominates fine depth perception, while both computations contribute to coarser depth perception. Thus, stereoscopic depth perception recruits different computations depending on the disparity magnitude.
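A toy numerical illustration (not the psychophysical model fitted in the paper) of why the two computations diverge for anticorrelated stereograms: the correlation score is the signed cross-correlation of the two eyes' dot contrasts at a candidate disparity, while the matching score counts only contrast-matched dots, so inverting the contrast in one eye flips the former but merely abolishes the latter.

```python
import numpy as np

rng = np.random.default_rng(2)

def rds_pair(n_dots=2000, disparity=5, anticorrelated=False):
    """1-D random-dot stereogram: the right eye's pattern is the left eye's,
    shifted by `disparity` samples, optionally with inverted contrast."""
    left = rng.choice([-1.0, 1.0], size=n_dots)        # dot contrasts (dark/bright)
    right = np.roll(left, disparity)
    if anticorrelated:
        right = -right
    return left, right

def correlation_score(left, right, d):
    """Signed cross-correlation at candidate disparity d (correlation computation)."""
    return float(np.mean(left * np.roll(right, -d)))

def matching_score(left, right, d):
    """Fraction of dots whose contrast polarity matches at disparity d
    (a crude stand-in for the matching computation)."""
    return float(np.mean(left == np.roll(right, -d)))

for anti in (False, True):
    L, R = rds_pair(disparity=5, anticorrelated=anti)
    print("anticorrelated" if anti else "correlated    ",
          "corr@5:", round(correlation_score(L, R, 5), 2),
          "match@5:", round(matching_score(L, R, 5), 2))
# correlated: corr ~ +1 and match ~ 1; anticorrelated: corr ~ -1 (sign reversed)
# while match ~ 0 (no contrast-matched dots), mirroring the dissociation above.
```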
77 FR 34941 - Privacy Act of 1974; Notice of a Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-12
...; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, DoD. ACTION: Notice of a... computer matching program are the Department of Veterans Affairs (VA) and the Defense Manpower Data Center... identified as DMDC 01, entitled ``Defense Manpower Data Center Data Base,'' last published in the Federal...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-13
... the Defense Manpower Data Center, Department of Defense AGENCY: Postal Service TM . ACTION: Notice of Computer Matching Program--United States Postal Service and the Defense Manpower Data Center, Department of... as the recipient agency in a computer matching program with the Defense Manpower Data Center (DMDC...
22 CFR 1101.4 - Reports on new systems of records; computer matching programs.
Code of Federal Regulations, 2012 CFR
2012-04-01
... 22 Foreign Relations 2 2012-04-01 2009-04-01 true Reports on new systems of records; computer matching programs. 1101.4 Section 1101.4 Foreign Relations INTERNATIONAL BOUNDARY AND WATER COMMISSION... records; computer matching programs. (a) Before establishing any new systems of records, or making any...
22 CFR 1101.4 - Reports on new systems of records; computer matching programs.
Code of Federal Regulations, 2014 CFR
2014-04-01
... 22 Foreign Relations 2 2014-04-01 2014-04-01 false Reports on new systems of records; computer matching programs. 1101.4 Section 1101.4 Foreign Relations INTERNATIONAL BOUNDARY AND WATER COMMISSION... records; computer matching programs. (a) Before establishing any new systems of records, or making any...
22 CFR 1101.4 - Reports on new systems of records; computer matching programs.
Code of Federal Regulations, 2013 CFR
2013-04-01
... 22 Foreign Relations 2 2013-04-01 2009-04-01 true Reports on new systems of records; computer matching programs. 1101.4 Section 1101.4 Foreign Relations INTERNATIONAL BOUNDARY AND WATER COMMISSION... records; computer matching programs. (a) Before establishing any new systems of records, or making any...
22 CFR 1101.4 - Reports on new systems of records; computer matching programs.
Code of Federal Regulations, 2011 CFR
2011-04-01
... 22 Foreign Relations 2 2011-04-01 2009-04-01 true Reports on new systems of records; computer matching programs. 1101.4 Section 1101.4 Foreign Relations INTERNATIONAL BOUNDARY AND WATER COMMISSION... records; computer matching programs. (a) Before establishing any new systems of records, or making any...
Mo, Yun; Zhang, Zhongzhao; Meng, Weixiao; Ma, Lin; Wang, Yao
2014-01-01
Indoor positioning systems based on the fingerprint method are widely used due to the large number of existing devices with a wide range of coverage. However, extensive positioning regions with a massive fingerprint database may cause high computational complexity and error margins; therefore, clustering methods are widely applied as a solution. However, traditional clustering methods in positioning systems can only measure the similarity of the Received Signal Strength without being concerned with the continuity of physical coordinates. In addition, outages of access points can result in asymmetric matching problems which severely affect the fine positioning procedure. To solve these issues, in this paper we propose a positioning system based on the Spatial Division Clustering (SDC) method for clustering the fingerprint dataset subject to physical distance constraints. With the Genetic Algorithm and Support Vector Machine techniques, SDC can achieve higher coarse positioning accuracy than traditional clustering algorithms. In terms of fine localization, based on the Kernel Principal Component Analysis method, the proposed positioning system outperforms its counterparts based on other feature extraction methods in low dimensionality. Apart from balancing the online matching computational burden, the new positioning system exhibits advantageous performance on radio map clustering, and also shows better robustness and adaptability in the asymmetric matching problem aspect. PMID:24451470
A Community Publication and Dissemination System for Hydrology Education Materials
NASA Astrophysics Data System (ADS)
Ruddell, B. L.
2015-12-01
Hosted by CUAHSI and the Science Education Resource Center (SERC), federated by the National Science Digital Library (NSDL), and allied with the Water Data Center (WDC), Hydrologic Information System (HIS), and HydroShare projects, a simple cyberinfrastructure has been launched for the publication and dissemination of data and model driven university hydrology education materials. This lightweight system's metadata describes learning content as a data-driven module with defined data inputs and outputs. This structure allows a user to mix and match modules to create sequences of content that teach both hydrology and computer learning outcomes. Importantly, this modular infrastructure allows an instructor to substitute a module based on updated computer methods for one based on outdated computer methods, hopefully solving the problem of rapid obsolescence that has hampered previous community efforts. The prototype system is now available from CUAHSI and SERC, with some example content. The system is designed to catalog, link to, make visible, and make accessible the existing and future contributions of the community; this system does not create content. Submissions from hydrology educators are eagerly solicited, especially for existing content.
Local coding based matching kernel method for image classification.
Song, Yan; McLoughlin, Ian Vince; Dai, Li-Rong
2014-01-01
This paper mainly focuses on how to effectively and efficiently measure visual similarity for local feature based representation. Among existing methods, metrics based on Bag of Visual Word (BoV) techniques are efficient and conceptually simple, at the expense of effectiveness. By contrast, kernel based metrics are more effective, but at the cost of greater computational complexity and increased storage requirements. We show that a unified visual matching framework can be developed to encompass both BoV and kernel based metrics, in which local kernel plays an important role between feature pairs or between features and their reconstruction. Generally, local kernels are defined using Euclidean distance or its derivatives, based either explicitly or implicitly on an assumption of Gaussian noise. However, local features such as SIFT and HoG often follow a heavy-tailed distribution which tends to undermine the motivation behind Euclidean metrics. Motivated by recent advances in feature coding techniques, a novel efficient local coding based matching kernel (LCMK) method is proposed. This exploits the manifold structures in Hilbert space derived from local kernels. The proposed method combines advantages of both BoV and kernel based metrics, and achieves a linear computational complexity. This enables efficient and scalable visual matching to be performed on large scale image sets. To evaluate the effectiveness of the proposed LCMK method, we conduct extensive experiments with widely used benchmark datasets, including 15-Scenes, Caltech101/256, PASCAL VOC 2007 and 2011 datasets. Experimental results confirm the effectiveness of the relatively efficient LCMK method.
78 FR 47830 - Privacy Act of 1974; Report of Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-06
... of Veterans Affairs. ACTION: Notice of Computer Matching Program. SUMMARY: The Department of Veterans Affairs (VA) provides notice that it intends to conduct a recurring computer matching program matching... necessary information from RRB-26: Payment, Rate, and Entitlement History File, published at 75 FR 43729...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-07
... Wesolowski, Director, Verifications Policy & Operations Branch, Division of Eligibility and Enrollment Policy..., electronic interfaces and an on-line system for the verification of eligibility. PURPOSE(S) OF THE MATCHING... Security number (SSN) verifications, (2) a death indicator, (3) an indicator of a finding of disability by...
76 FR 56744 - Privacy Act of 1974; Notice of a Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-14
...; Notice of a Computer Matching Program AGENCY: Defense Manpower Data Center, Department of Defense (DoD... (SSA) and DoD Defense Manpower Data Center (DMDC) that their records are being matched by computer. The... intrusion of the individual's privacy and would result in additional delay in the eventual SSI payment and...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-26
... INFORMATION: A. General The Computer Matching and Privacy Protection Act of 1988 (Pub. L. 100-503), amended... SOCIAL SECURITY ADMINISTRATION [Docket No. SSA-2009-0052] Privacy Act of 1974, as Amended; Computer Matching Program (SSA/ U.S. Department of Health and Human Services (HHS), Administration for...
eduCRATE--a Virtual Hospital architecture.
Stoicu-Tivadar, Lăcrimioara; Stoicu-Tivadar, Vasile; Berian, Dorin; Drăgan, Simona; Serban, Alexandru; Serban, Corina
2014-01-01
eduCRATE is a complex project proposal which aims to develop a virtual learning environment offering interactive digital content through original and integrated solutions using cloud computing, complex multimedia systems in virtual space, and personalized design with avatars. Compared to existing similar products, the project brings the novelty of using languages for medical guidelines in order to ensure maximum flexibility. The Virtual Hospital simulations will create interactive clinical scenarios for which students will find solutions for positive diagnosis and therapeutic management. The solution based on cloud computing and immersive multimedia is an attractive option in education because it is economical and it matches the current working style of the young generation to whom it is addressed.
Sanad, Mohamed; Hassan, Noha
2014-01-01
A dual resonant antenna configuration is developed for multistandard multifunction mobile handsets and portable computers. Only two wideband resonant antennas can cover most of the LTE spectrum in portable communication equipment. The bandwidth that can be covered by each antenna exceeds 70% without using any matching or tuning circuits, with efficiencies that reach 80%. Thus, a dual configuration of them is capable of covering up to 39 LTE (4G) bands besides the existing 2G and 3G bands. 2×2 MIMO configurations have also been developed for the two wideband antennas, with maximum isolation and a minimum correlation coefficient between the primary and the diversity antennas.
The effect of opponent type on human performance in a three-alternative choice task.
Lie, Celia; Baxter, Jennifer; Alsop, Brent
2013-10-01
Adult participants played computerised games of "Paper Scissors Rock". Participants in one group were told that they were playing against the computer, and those in the other group were told that they were playing against another participant in the adjacent room. The participant who won the most games would receive a $50 prize. For both groups however, the opponent's responses (paper, scissors, or rock) were generated by the computer, and the distribution of these responses was varied across four blocks of 126 trials. Results were analysed using the generalised matching law for the three possible pairs of alternatives (paper vs. scissors, paper vs. rock, and scissors vs. rock) across all participants in each group. Overall, significantly higher estimates of sensitivity to the distribution of opponent's responses were obtained from participants who were told their opponent was a computer compared to participants who were told their opponent was another participant. While adding to the existing literature showing that the generalised matching law is an adequate descriptor of human three-alternative choice behaviour, these findings show that external factors such as perceived opponent type can affect the efficacy of reinforcer contingencies on human behaviour. This suggests that generalising the results from tasks performed against a computer to real-life human-to-human interactions warrants some caution. Copyright © 2013 Elsevier B.V. All rights reserved.
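For readers unfamiliar with the analysis, the sketch below estimates generalized matching law parameters for one pair of alternatives by ordinary least squares on the log response and log reinforcer (opponent-response) ratios; the counts are invented, and the study fitted each of the three pairwise comparisons per participant.

```python
import numpy as np

def matching_law_fit(b1, b2, r1, r2):
    """Fit the generalized matching law log(B1/B2) = a * log(R1/R2) + log c.

    b1, b2 : response counts for the two alternatives, per block
    r1, r2 : counts of opponent responses that reinforce each alternative, per block
    Returns (sensitivity a, bias log c) estimated by least squares.
    """
    y = np.log(np.asarray(b1, dtype=float) / np.asarray(b2, dtype=float))
    x = np.log(np.asarray(r1, dtype=float) / np.asarray(r2, dtype=float))
    A = np.column_stack([x, np.ones_like(x)])
    (a, log_c), *_ = np.linalg.lstsq(A, y, rcond=None)
    return a, log_c

# hypothetical counts for the "paper vs scissors" pair across four 126-trial blocks;
# paper wins when the opponent plays rock, scissors wins when the opponent plays paper
paper     = [70, 55, 40, 25]
scissors  = [30, 45, 60, 75]
opp_rock  = [80, 60, 40, 20]   # opponent responses that reinforce "paper"
opp_paper = [20, 40, 60, 80]   # opponent responses that reinforce "scissors"

a, log_c = matching_law_fit(paper, scissors, opp_rock, opp_paper)
print("sensitivity:", round(a, 2), "bias:", round(log_c, 2))
```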
Model reductions using a projection formulation
NASA Technical Reports Server (NTRS)
De Villemagne, Christian; Skelton, Robert E.
1987-01-01
A new methodology for model reduction of MIMO systems exploits the notion of an oblique projection. A reduced model is uniquely defined by a projector whose range space and the orthogonal complement of whose null space are chosen among the ranges of generalized controllability and observability matrices. The reduced-order models match various combinations (chosen by the designer) of four types of parameters of the full-order system associated with (1) low frequency response, (2) high frequency response, (3) low frequency power spectral density, and (4) high frequency power spectral density. Thus, the proposed method is a computationally simple substitute for many existing methods, has extreme flexibility to embrace combinations of existing methods, and offers some new features.
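A minimal numerical sketch in the spirit of the oblique-projection idea above: the projector's range is a block-controllability (Krylov) subspace and the orthogonal complement of its null space is a block-observability subspace, so the reduced model matches the leading Markov parameters (high-frequency response parameters). This is generic projection algebra under those assumptions, not the authors' specific algorithm.

```python
import numpy as np

def oblique_projection_reduction(A, B, C, q):
    """Reduce x' = Ax + Bu, y = Cx using V = [B, AB, ..., A^(q-1)B] and
    W = [C', A'C', ..., (A')^(q-1)C'].  With E = W'V nonsingular, the reduced
    model matches the first 2q Markov parameters C A^k B (k = 0..2q-1)."""
    V = np.hstack([np.linalg.matrix_power(A, k) @ B for k in range(q)])
    W = np.hstack([np.linalg.matrix_power(A.T, k) @ C.T for k in range(q)])
    E = W.T @ V                                  # assumed invertible (generic case)
    Ar = np.linalg.solve(E, W.T @ A @ V)
    Br = np.linalg.solve(E, W.T @ B)
    Cr = C @ V
    return Ar, Br, Cr

# random stable SISO example; compare the leading Markov parameters
rng = np.random.default_rng(3)
n = 8
A = rng.normal(size=(n, n))
A /= 1.5 * np.max(np.abs(np.linalg.eigvals(A)))  # scale for stability (optional)
B = rng.normal(size=(n, 1))
C = rng.normal(size=(1, n))
Ar, Br, Cr = oblique_projection_reduction(A, B, C, q=3)
for k in range(6):
    full = (C @ np.linalg.matrix_power(A, k) @ B)[0, 0]
    red = (Cr @ np.linalg.matrix_power(Ar, k) @ Br)[0, 0]
    print(k, round(full, 6), round(red, 6))      # should agree for k = 0..5
```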
Code of Federal Regulations, 2013 CFR
2013-10-01
... 45 Public Welfare 2 2013-10-01 2012-10-01 true Must States do computer matching of data records... for Other Program Penalties? § 264.10 Must States do computer matching of data records under IEVS to... Internal Revenue Service (IRS), the State Wage Information Collections Agency (SWICA), the Social Security...
Code of Federal Regulations, 2014 CFR
2014-10-01
... 45 Public Welfare 2 2014-10-01 2012-10-01 true Must States do computer matching of data records... for Other Program Penalties? § 264.10 Must States do computer matching of data records under IEVS to... Internal Revenue Service (IRS), the State Wage Information Collections Agency (SWICA), the Social Security...
Code of Federal Regulations, 2010 CFR
2010-10-01
... 45 Public Welfare 2 2010-10-01 2010-10-01 false Must States do computer matching of data records... for Other Program Penalties? § 264.10 Must States do computer matching of data records under IEVS to... Internal Revenue Service (IRS), the State Wage Information Collections Agency (SWICA), the Social Security...
Code of Federal Regulations, 2011 CFR
2011-10-01
... 45 Public Welfare 2 2011-10-01 2011-10-01 false Must States do computer matching of data records... for Other Program Penalties? § 264.10 Must States do computer matching of data records under IEVS to... Internal Revenue Service (IRS), the State Wage Information Collections Agency (SWICA), the Social Security...
Code of Federal Regulations, 2012 CFR
2012-10-01
... 45 Public Welfare 2 2012-10-01 2012-10-01 false Must States do computer matching of data records... for Other Program Penalties? § 264.10 Must States do computer matching of data records under IEVS to... Internal Revenue Service (IRS), the State Wage Information Collections Agency (SWICA), the Social Security...
A FPGA-based architecture for real-time image matching
NASA Astrophysics Data System (ADS)
Wang, Jianhui; Zhong, Sheng; Xu, Wenhui; Zhang, Weijun; Cao, Zhiguo
2013-10-01
Image matching is a fundamental task in computer vision. It is used to establish correspondence between two images taken at different viewpoints or at different times from the same scene. However, its large computational complexity has been a challenge for most embedded systems. This paper proposes a single-FPGA image matching system, which consists of SIFT feature detection, BRIEF descriptor extraction, and BRIEF matching. It optimizes the FPGA architecture for SIFT feature detection to reduce FPGA resource utilization. Moreover, we also implement BRIEF description and matching on the FPGA. The proposed system can perform image matching at 30 fps (frames per second) for 1280x720 images. Its processing speed can meet the demand of most real-life computer vision applications.
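As a software reference for the descriptor and matching half of the pipeline (not the FPGA design itself), the sketch below builds BRIEF-style binary descriptors from random pixel-pair intensity comparisons around given keypoints and matches them by Hamming distance; keypoint coordinates are supplied by hand rather than detected with SIFT.

```python
import numpy as np

rng = np.random.default_rng(0)
PATCH = 31                                   # patch size around each keypoint
HALF = PATCH // 2
N_BITS = 256                                 # descriptor length in bits
# fixed random test pattern of pixel-pair offsets, shared by all descriptors
PAIRS = rng.integers(-HALF, HALF + 1, size=(N_BITS, 4))

def brief_descriptor(img, kp):
    """BRIEF-style binary descriptor: bit i is 1 if the intensity at offset
    (dy1, dx1) is smaller than at offset (dy2, dx2), for the i-th random pair."""
    y, x = kp
    bits = np.empty(N_BITS, dtype=np.uint8)
    for i, (dy1, dx1, dy2, dx2) in enumerate(PAIRS):
        bits[i] = img[y + dy1, x + dx1] < img[y + dy2, x + dx2]
    return bits

def match(desc_a, desc_b, max_dist=60):
    """Brute-force matching by Hamming distance with a simple threshold."""
    matches = []
    for i, da in enumerate(desc_a):
        dists = [int(np.sum(da != db)) for db in desc_b]
        j = int(np.argmin(dists))
        if dists[j] <= max_dist:
            matches.append((i, j, dists[j]))
    return matches

# synthetic test: the second image is a shifted copy of the first
img1 = rng.integers(0, 256, size=(200, 200)).astype(np.uint8)
img2 = np.roll(img1, shift=(0, 3), axis=(0, 1))
kps1 = [(60, 60), (100, 120), (150, 80)]
kps2 = [(y, x + 3) for (y, x) in kps1]       # corresponding keypoints in img2
d1 = [brief_descriptor(img1, kp) for kp in kps1]
d2 = [brief_descriptor(img2, kp) for kp in kps2]
print(match(d1, d2))                          # expect (0, 0, 0), (1, 1, 0), (2, 2, 0)
```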
The Neural Representations Underlying Human Episodic Memory.
Xue, Gui
2018-06-01
A fundamental question of human episodic memory concerns the cognitive and neural representations and processes that give rise to the neural signals of memory. By integrating behavioral tests, formal computational models, and neural measures of brain activity patterns, recent studies suggest that memory signals not only depend on the neural processes and representations during encoding and retrieval, but also on the interaction between encoding and retrieval (e.g., transfer-appropriate processing), as well as on the interaction between the tested events and all other events in the episodic memory space (e.g., global matching). In addition, memory signals are also influenced by the compatibility of the event with the existing long-term knowledge (e.g., schema matching). These studies highlight the interactive nature of human episodic memory. Copyright © 2018 Elsevier Ltd. All rights reserved.
A diagram retrieval method with multi-label learning
NASA Astrophysics Data System (ADS)
Fu, Songping; Lu, Xiaoqing; Liu, Lu; Qu, Jingwei; Tang, Zhi
2015-01-01
In recent years, the retrieval of plane geometry figures (PGFs) has attracted increasing attention in the fields of mathematics education and computer science. However, the high cost of matching complex PGF features leads to the low efficiency of most retrieval systems. This paper proposes an indirect classification method based on multi-label learning, which improves retrieval efficiency by reducing the scope of the comparison operation from the whole database to small candidate groups. Label correlations among PGFs are taken into account for the multi-label classification task. The primitive feature selection for multi-label learning and the feature description of visual geometric elements are conducted individually to match similar PGFs. The experiment results show the competitive performance of the proposed method compared with existing PGF retrieval methods in terms of both time consumption and retrieval quality.
Gomez, Luis J; Goetz, Stefan M; Peterchev, Angel V
2018-08-01
Transcranial magnetic stimulation (TMS) is a noninvasive brain stimulation technique used for research and clinical applications. Existent TMS coils are limited in their precision of spatial targeting (focality), especially for deeper targets. This paper presents a methodology for designing TMS coils to achieve optimal trade-off between the depth and focality of the induced electric field (E-field), as well as the energy required by the coil. A multi-objective optimization technique is used for computationally designing TMS coils that achieve optimal trade-offs between E-field focality, depth, and energy (fdTMS coils). The fdTMS coil winding(s) maximize focality (minimize the volume of the brain region with E-field above a given threshold) while reaching a target at a specified depth and not exceeding predefined peak E-field strength and required coil energy. Spherical and MRI-derived head models are used to compute the fundamental depth-focality trade-off as well as focality-energy trade-offs for specific target depths. Across stimulation target depths of 1.0-3.4 cm from the brain surface, the suprathreshold volume can be theoretically decreased by 42%-55% compared to existing TMS coil designs. The suprathreshold volume of a figure-8 coil can be decreased by 36%, 44%, or 46%, for matched, doubled, or quadrupled energy. For matched focality and energy, the depth of a figure-8 coil can be increased by 22%. Computational design of TMS coils could enable more selective targeting of the induced E-field. The presented results appear to be the first significant advancement in the depth-focality trade-off of TMS coils since the introduction of the figure-8 coil three decades ago, and likely represent the fundamental physical limit.
Miller, Vonda H; Jansen, Ben H
2008-12-01
Computer algorithms that match human performance in recognizing written text or spoken conversation remain elusive. The reasons why the human brain far exceeds any existing recognition scheme to date in the ability to generalize and to extract invariant characteristics relevant to category matching are not clear. However, it has been postulated that the dynamic distribution of brain activity (spatiotemporal activation patterns) is the mechanism by which stimuli are encoded and matched to categories. This research focuses on supervised learning for category discrimination in an oscillatory neural network model, in which classification is accomplished using a trajectory-based distance metric. Since the distance metric is differentiable, a supervised learning algorithm based on gradient descent is demonstrated. Classification of spatiotemporal frequency transitions and their relation to a priori assessed categories is shown, along with the improved classification results after supervised training. The results indicate that this spatiotemporal representation of stimuli and the associated distance metric are useful for simple pattern recognition tasks and that supervised learning improves classification results.
NASA Astrophysics Data System (ADS)
Wong-Loya, J. A.; Santoyo, E.; Andaverde, J. A.; Quiroz-Ruiz, A.
2015-12-01
A Web-Based Computer System (RPM-WEBBSYS) has been developed for the application of the Rational Polynomial Method (RPM) to estimate static formation temperatures (SFT) of geothermal and petroleum wells. The system is also capable of reproducing the full thermal recovery process that occurs during well completion. RPM-WEBBSYS has been programmed using current information technology to compute SFT more efficiently. RPM-WEBBSYS can be easily and rapidly run on any computing device (e.g., personal computers and portable devices such as tablets or smartphones) with Internet access and a web browser. The computer system was validated using bottomhole temperature (BHT) measurements logged in a synthetic heat transfer experiment, where a good match between predicted and true SFT was achieved. RPM-WEBBSYS was finally applied to BHT logs collected from well drilling and shut-in operations, where the typical under- and over-estimation of SFT exhibited by most existing analytical methods was effectively corrected.
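To illustrate the general idea behind fitting a rational polynomial to build-up BHT data (a simplified stand-in, not the exact RPM formulation implemented in RPM-WEBBSYS), the sketch below fits T(t) = (a + b t)/(1 + c t) to synthetic shut-in temperatures with SciPy and takes the t → infinity asymptote b/c as the static formation temperature estimate.

```python
import numpy as np
from scipy.optimize import curve_fit

def rational_T(t, a, b, c):
    """Simple first-order rational polynomial for shut-in temperature build-up."""
    return (a + b * t) / (1.0 + c * t)

# synthetic BHT build-up toward a true SFT of 120 C (generated in the assumed rational form)
t = np.array([2.0, 4.0, 6.0, 8.0, 12.0, 18.0, 24.0])       # shut-in time, hours
bht = 120.0 - 40.0 / (1.0 + 0.5 * t)                        # synthetic recovery curve
bht += np.random.default_rng(4).normal(0.0, 0.2, t.size)    # measurement noise

p0 = (bht[0], 0.2 * bht[-1], 0.2)                           # rough initial guess
(a, b, c), _ = curve_fit(rational_T, t, bht, p0=p0, maxfev=10000)
print("estimated SFT:", round(b / c, 1), "C  (true value: 120.0 C)")
```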
Nonreflective Conditions for Perfectly Matched Layer in Computational Aeroacoustics
NASA Astrophysics Data System (ADS)
Choung, Hanahchim; Jang, Seokjong; Lee, Soogab
2018-05-01
In computational aeroacoustics, boundary conditions such as radiation, outflow, or absorbing boundary conditions are critical issues in that they can affect the entire solution of the computation. Among these boundary conditions, the perfectly matched layer (PML), which has been widely used in computational fluid dynamics and computational aeroacoustics, is constructed by augmenting the original governing equations with an additional absorption term so that outgoing waves are absorbed stably. Even though the perfectly matched layer is analytically a perfectly nonreflective boundary condition, spurious waves occur at the interface because the analysis is performed in discretized space. Hence, this study focuses on the factors that affect numerical errors from the perfectly matched layer in order to find the optimum conditions for a nonreflective PML. Through a mathematical approach, a minimum width of the perfectly matched layer and an optimum absorption coefficient are suggested. To validate the predictions of the analysis, numerical simulations are performed in a generalized coordinate system as well as in a Cartesian coordinate system.
We look like our names: The manifestation of name stereotypes in facial appearance.
Zwebner, Yonat; Sellier, Anne-Laure; Rosenfeld, Nir; Goldenberg, Jacob; Mayo, Ruth
2017-04-01
Research demonstrates that facial appearance affects social perceptions. The current research investigates the reverse possibility: Can social perceptions influence facial appearance? We examine a social tag that is associated with us early in life-our given name. The hypothesis is that name stereotypes can be manifested in facial appearance, producing a face-name matching effect, whereby both a social perceiver and a computer are able to accurately match a person's name to his or her face. In 8 studies we demonstrate the existence of this effect, as participants examining an unfamiliar face accurately select the person's true name from a list of several names, significantly above chance level. We replicate the effect in 2 countries and find that it extends beyond the limits of socioeconomic cues. We also find the effect using a computer-based paradigm and 94,000 faces. In our exploration of the underlying mechanism, we show that existing name stereotypes produce the effect, as its occurrence is culture-dependent. A self-fulfilling prophecy seems to be at work, as initial evidence shows that facial appearance regions that are controlled by the individual (e.g., hairstyle) are sufficient to produce the effect, and socially using one's given name is necessary to generate the effect. Together, these studies suggest that facial appearance represents social expectations of how a person with a specific name should look. In this way a social tag may influence one's facial appearance. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Shade matching assisted by digital photography and computer software.
Schropp, Lars
2009-04-01
To evaluate the efficacy of digital photographs and graphic computer software for color matching compared to conventional visual matching. The shade of a tab from a shade guide (Vita 3D-Master Guide) placed in a phantom head was matched to a second guide of the same type by nine observers. This was done for twelve selected shade tabs (tests). The shade-matching procedure was performed visually in a simulated clinic environment and with digital photographs, and the time spent for both procedures was recorded. An alternative arrangement of the shade tabs was used in the digital photographs. In addition, a graphic software program was used for color analysis. Hue, chroma, and lightness values of the test tab and all tabs of the second guide were derived from the digital photographs. According to the CIE L*C*h* color system, the color differences between the test tab and tabs of the second guide were calculated. The shade guide tab that deviated least from the test tab was determined to be the match. Shade matching performance by means of graphic software was compared with the two visual methods and tested by Chi-square tests (alpha= 0.05). Eight of twelve test tabs (67%) were matched correctly by the computer software method. This was significantly better (p < 0.02) than the performance of the visual shade matching methods conducted in the simulated clinic (32% correct match) and with photographs (28% correct match). No correlation between time consumption for the visual shade matching methods and frequency of correct match was observed. Shade matching assisted by digital photographs and computer software was significantly more reliable than by conventional visual methods.
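For reference, the color-difference computation implied above can be written in the CIE L*C*h* terms the study reports. The exact formula variant used in the paper is not stated, so the standard CIE ΔE*ab expression is shown here as a plausible reading, with subscripts 1 and 2 denoting the test tab and a guide tab:

    \Delta E^{*}_{ab} = \sqrt{(\Delta L^{*})^{2} + (\Delta C^{*})^{2} + (\Delta H^{*})^{2}},
    \qquad (\Delta H^{*})^{2} = 2\, C^{*}_{1} C^{*}_{2} \bigl(1 - \cos \Delta h \bigr),

and the guide tab minimizing ΔE*ab against the test tab is declared the computer-selected match.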
78 FR 49525 - Privacy Act of 1974; CMS Computer Match No. 2013-06; HHS Computer Match No. 1308
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-14
... Care Act of 2010 (Pub. L. 111-148), as amended by the Health Care and Education Reconciliation Act of..., 2009). INCLUSIVE DATES OF THE MATCH: The CMP will become effective no sooner than 40 days after the...
78 FR 49524 - Privacy Act of 1974; CMS Computer Match No. 2013-08; HHS Computer Match No. 1309
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-14
... by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111-152) (collectively, the ACA...). INCLUSIVE DATES OF THE MATCH: The CMP will become effective no sooner than 40 days after the report of the...
78 FR 50419 - Privacy Act of 1974; CMS Computer Match No. 2013-10; HHS Computer Match No. 1310
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-19
... (Pub. L. 111- 148), as amended by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111... Entitlements Program System of Records Notice, 77 FR 47415 (August 8, 2012). Inclusive Dates of the Match: The...
Interhemispheric Resource Sharing: Decreasing Benefits with Increasing Processing Efficiency
ERIC Educational Resources Information Center
Maertens, M.; Pollmann, S.
2005-01-01
Visual matches are sometimes faster when stimuli are presented across visual hemifields, compared to within-field matching. Using a cued geometric figure matching task, we investigated the influence of computational complexity vs. processing efficiency on this bilateral distribution advantage (BDA). Computational complexity was manipulated by…
Njeh, Ines; Sallemi, Lamia; Ayed, Ismail Ben; Chtourou, Khalil; Lehericy, Stephane; Galanaud, Damien; Hamida, Ahmed Ben
2015-03-01
This study investigates a fast distribution-matching, data-driven algorithm for 3D multimodal MRI brain glioma tumor and edema segmentation in different modalities. We learn non-parametric model distributions which characterize the normal regions in the current data. Then, we state our segmentation problems as the optimization of several cost functions of the same form, each containing two terms: (i) a distribution matching prior, which evaluates a global similarity between distributions, and (ii) a smoothness prior to avoid the occurrence of small, isolated regions in the solution. Obtained following recent bound-relaxation results, the optima of the cost functions yield the complement of the tumor region or edema region in nearly real time. Based on global rather than pixel-wise information, the proposed algorithm does not require external learning from a large, manually segmented training set, as is the case for existing methods. Therefore, the ensuing results are independent of the choice of a training set. Quantitative evaluations over the publicly available training and testing data set from the MICCAI multimodal brain tumor segmentation challenge (BraTS 2012) demonstrated that our algorithm yields a highly competitive performance for complete edema and tumor segmentation, among nine existing competing methods, with a competitive execution time (less than 0.5 s per image). Copyright © 2014 Elsevier Ltd. All rights reserved.
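The "global similarity between distributions" used as the matching prior is not spelled out in the abstract; a common choice for such a term is the Bhattacharyya coefficient between intensity histograms, sketched below purely as an assumed illustration (array names and parameter values are hypothetical, not taken from the paper).

    # Bhattacharyya coefficient between two intensity distributions: close to 1
    # when a candidate region's histogram matches the learned normal-region model.
    import numpy as np

    def bhattacharyya(p_samples, q_samples, bins=64, value_range=(0, 255)):
        p, _ = np.histogram(p_samples, bins=bins, range=value_range, density=True)
        q, _ = np.histogram(q_samples, bins=bins, range=value_range, density=True)
        width = (value_range[1] - value_range[0]) / bins
        return float(np.sum(np.sqrt(p * q)) * width)

    rng = np.random.default_rng(0)
    normal_region = rng.normal(120, 15, 5000)   # stand-in for the learned normal model
    candidate = rng.normal(125, 18, 3000)       # stand-in for a candidate region
    print("distribution similarity:", bhattacharyya(normal_region, candidate))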
Data-Driven Neural Network Model for Robust Reconstruction of Automobile Casting
NASA Astrophysics Data System (ADS)
Lin, Jinhua; Wang, Yanjie; Li, Xin; Wang, Lu
2017-09-01
In computer vision systems, it is a challenging task to robustly reconstruct the complex 3D geometries of automobile castings. 3D scanning data are usually corrupted by noise and the scanning resolution is low; these effects normally lead to incomplete matching and drift. In order to solve these problems, a data-driven local geometric learning model is proposed to achieve robust reconstruction of automobile castings. To reduce the interference of sensor noise and to remain compatible with incomplete scanning data, a 3D convolutional neural network is established to match the local geometric features of automobile castings. The proposed network combines the geometric feature representation with a correlation metric function to robustly match local correspondences. We use the truncated distance field (TDF) around each key point to represent the 3D surface of the casting geometry, so that the model can be directly embedded into 3D space to learn the geometric feature representation. Finally, the training labels are generated automatically for deep learning based on an existing RGB-D reconstruction algorithm that accesses the same global key matching descriptor. The experimental results show that the matching accuracy of our network is 92.2% for automobile castings and the closed-loop rate is about 74.0% when the matching tolerance threshold τ is 0.2. The matching descriptors performed well and retained 81.6% matching accuracy at a 95% closed-loop rate. For sparse casting geometries with initial matching failure, the 3D matching object can be reconstructed robustly by training the key descriptors. Our method performs robust 3D reconstruction for complex automobile castings.
Human Expertise Helps Computer Classify Images
NASA Technical Reports Server (NTRS)
Rorvig, Mark E.
1991-01-01
Two-domain method of computational classification of images requires less computation than other methods for computational recognition, matching, or classification of images or patterns. Does not require explicit computational matching of features, and incorporates human expertise without requiring translation of mental processes of classification into language comprehensible to computer. Conceived to "train" computer to analyze photomicrographs of microscope-slide specimens of leucocytes from human peripheral blood to distinguish between specimens from healthy and specimens from traumatized patients.
From serological to computer cross-matching in nine hospitals.
Georgsen, J; Kristensen, T
1998-01-01
In 1991 it was decided to reorganise the transfusion service of the County of Funen. The aims were to standardise and improve the quality of blood components, laboratory procedures and the transfusion service and to reduce the number of outdated blood units. Part of the efficiency gains was reinvested in a dedicated computer system making it possible--among other things--to change the cross-match procedures from serological to computer cross-matching according to the ABCD-concept. This communication describes how this transition was performed in terms of laboratory techniques, education of personnel as well as implementation of the computer system, and indicates the results obtained. The Funen Transfusion Service has by now performed more than 100,000 red cell transfusions based on ABCD-cross-matching and has not encountered any problems. Major results are the significant reductions in cross-match procedures and blood grouping, as well as in the number of outdated blood components.
78 FR 39730 - Privacy Act of 1974; CMS Computer Match No. 2013-11; HHS Computer Match No. 1302
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-02
... (Pub. L. 111-148), as amended by the Health Care and Education Reconciliation Act of 2010 (Pub. L. 111... 78 FR 32256 on May 29, 2013. Inclusive Dates of the Match: The CMP shall become effective no sooner...
Sanad, Mohamed; Hassan, Noha
2014-01-01
A dual resonant antenna configuration is developed for multistandard multifunction mobile handsets and portable computers. Only two wideband resonant antennas can cover most of the LTE spectrums in portable communication equipment. The bandwidth that can be covered by each antenna exceeds 70% without using any matching or tuning circuits, with efficiencies that reach 80%. Thus, a dual configuration of them is capable of covering up to 39 LTE (4G) bands besides the existing 2G and 3G bands. 2 × 2 MIMO configurations have been also developed for the two wideband antennas with a maximum isolation and a minimum correlation coefficient between the primary and the diversity antennas. PMID:24558322
Pricing the Computing Resources: Reading Between the Lines and Beyond
NASA Technical Reports Server (NTRS)
Nakai, Junko; Veronico, Nick (Editor); Thigpen, William W. (Technical Monitor)
2001-01-01
Distributed computing systems have the potential to increase the usefulness of existing facilities for computation without adding anything physical, but that is realized only when necessary administrative features are in place. In a distributed environment, the best match is sought between a computing job to be run and a computer to run the job (global scheduling), which is a function that has not been required by conventional systems. Viewing the computers as 'suppliers' and the users as 'consumers' of computing services, markets for computing services/resources have been examined as one of the most promising mechanisms for global scheduling. We first establish why economics can contribute to scheduling. We further define the criterion for a scheme to qualify as an application of economics. Many studies to date have claimed to have applied economics to scheduling. If their scheduling mechanisms do not utilize economics, contrary to their claims, their favorable results do not contribute to the assertion that markets provide the best framework for global scheduling. We examine the well-known scheduling schemes, which concern pricing and markets, using our criterion of what application of economics is. Our conclusion is that none of the schemes examined makes full use of economics.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-09-14
... require additional verification to identify inappropriate or inaccurate rental assistance, and may provide... Affordable Housing Act, the Native American Housing Assistance and Self-Determination Act of 1996, and the... matching activities. The computer matching program will also provide for the verification of social...
LOCAL ORTHOGONAL CUTTING METHOD FOR COMPUTING MEDIAL CURVES AND ITS BIOMEDICAL APPLICATIONS
Einstein, Daniel R.; Dyedov, Vladimir
2010-01-01
Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method called local orthogonal cutting (LOC) for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques and result in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods. PMID:20628546
On Stable Marriages and Greedy Matchings
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manne, Fredrik; Naim, Md; Lerring, Hakon
2016-12-11
Research on stable marriage problems has a long and mathematically rigorous history, while that of exploiting greedy matchings in combinatorial scientific computing is a younger and less developed research field. In this paper we consider the relationships between these two areas. In particular we show that several problems related to computing greedy matchings can be formulated as stable marriage problems and as a consequence several recently proposed algorithms for computing greedy matchings are in fact special cases of well known algorithms for the stable marriage problem. However, in terms of implementations and practical scalable solutions on modern hardware, the greedy matching community has made considerable progress. We show that due to the strong relationship between these two fields many of these results are also applicable for solving stable marriage problems.
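For readers unfamiliar with the stable marriage side of this comparison, the classical Gale-Shapley propose-and-reject procedure is sketched below; this is the textbook algorithm, not any of the specific greedy-matching or stable-marriage variants analyzed in the paper, and the preference lists are made up.

    # Textbook Gale-Shapley sketch: proposers and reviewers each rank the other
    # side; the returned assignment is a stable matching.
    def gale_shapley(proposer_prefs, reviewer_prefs):
        free = list(proposer_prefs)                   # proposers not yet matched
        next_choice = {p: 0 for p in proposer_prefs}  # index of next reviewer to try
        engaged = {}                                  # reviewer -> proposer
        rank = {r: {p: i for i, p in enumerate(prefs)}
                for r, prefs in reviewer_prefs.items()}
        while free:
            p = free.pop()
            r = proposer_prefs[p][next_choice[p]]
            next_choice[p] += 1
            if r not in engaged:
                engaged[r] = p
            elif rank[r][p] < rank[r][engaged[r]]:    # r prefers the new proposer
                free.append(engaged[r])
                engaged[r] = p
            else:
                free.append(p)
        return {p: r for r, p in engaged.items()}

    pairs = gale_shapley({"a": ["x", "y"], "b": ["x", "y"]},
                         {"x": ["b", "a"], "y": ["a", "b"]})
    print(pairs)  # {'b': 'x', 'a': 'y'}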
Federal Register 2010, 2011, 2012, 2013, 2014
2010-07-26
... safeguards for disclosure of Social Security benefit information to OPM via direct computer link for the... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program Between the Office of Personnel Management and Social Security Administration AGENCY: Office of Personnel Management...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-16
... Security benefit information to OPM via direct computer link for the administration of certain programs by... OFFICE OF PERSONNEL MANAGEMENT Privacy Act of 1974; Computer Matching Program Between the Office Of Personnel Management and Social Security Administration AGENCY: Office of Personnel Management...
HOPI: on-line injection optimization program
DOE Office of Scientific and Technical Information (OSTI.GOV)
LeMaire, J L
1977-10-26
A method of matching the beam from the 200 MeV linac to the AGS without the necessity of making emittance measurements is presented. An on-line computer program written on the PDP10 computer performs the matching by modifying independently the horizontal and vertical emittance. Experimental results show success with this method, which can be applied to any matching section.
1986-05-01
Tight Bounds for Minimax Grid Matching, with Applications to the Average Case Analysis of Algorithms (MIT/LCS/TM-298). MIT Laboratory for Computer Science, Cambridge, Massachusetts. Interim research report, May 1986.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wayne F. Boyer; Gurdeep S. Hura
2005-09-01
The problem of obtaining an optimal matching and scheduling of interdependent tasks in distributed heterogeneous computing (DHC) environments is well known to be an NP-hard problem. In a DHC system, task execution time is dependent on the machine to which it is assigned and task precedence constraints are represented by a directed acyclic graph. Recent research in evolutionary techniques has shown that genetic algorithms usually obtain more efficient schedules than other known algorithms. We propose a non-evolutionary random scheduling (RS) algorithm for efficient matching and scheduling of inter-dependent tasks in a DHC system. RS is a succession of randomized task orderings and a heuristic mapping from task order to schedule. Randomized task ordering is effectively a topological sort where the outcome may be any possible task order for which the task precedence constraints are maintained. A detailed comparison to existing evolutionary techniques (GA and PSGA) shows the proposed algorithm is less complex than evolutionary techniques, computes schedules in less time, requires less memory and fewer tuning parameters. Simulation results show that the average schedules produced by RS are approximately as efficient as PSGA schedules for all cases studied and clearly more efficient than PSGA for certain cases. The standard formulation for the scheduling problem addressed in this paper is Rm|prec|Cmax.
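A minimal sketch of the randomized topological ordering at the core of the RS idea described above is shown below; the heuristic mapping from task order to machine schedule, and any notion of execution times, are omitted, and the precedence DAG is a made-up example.

    # Randomized topological ordering over a precedence DAG: any order consistent
    # with the precedence constraints may be produced.
    import random

    def random_topological_order(succ):
        # succ: dict task -> list of tasks that depend on it (directed acyclic graph)
        indeg = {t: 0 for t in succ}
        for deps in succ.values():
            for d in deps:
                indeg[d] += 1
        ready = [t for t, k in indeg.items() if k == 0]
        order = []
        while ready:
            t = ready.pop(random.randrange(len(ready)))  # random choice among ready tasks
            order.append(t)
            for d in succ[t]:
                indeg[d] -= 1
                if indeg[d] == 0:
                    ready.append(d)
        return order

    dag = {"t1": ["t3"], "t2": ["t3"], "t3": ["t4"], "t4": []}
    print(random_topological_order(dag))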
NASA Astrophysics Data System (ADS)
Geng, Weihua; Zhao, Shan
2017-12-01
We present a new Matched Interface and Boundary (MIB) regularization method for treating charge singularity in solvated biomolecules whose electrostatics are described by the Poisson-Boltzmann (PB) equation. In a regularization method, by decomposing the potential function into two or three components, the singular component can be analytically represented by the Green's function, while the other components possess a higher regularity. Our new regularization combines the efficiency of two-component schemes with the accuracy of three-component schemes. Based on this regularization, a new MIB finite difference algorithm is developed for solving both linear and nonlinear PB equations, where the nonlinearity is handled by using the inexact-Newton's method. Compared with the existing MIB PB solver based on a three-component regularization, the present algorithm is simpler to implement by circumventing the work to solve a boundary value Poisson equation inside the molecular interface and to compute related interface jump conditions numerically. Moreover, the new MIB algorithm is computationally less expensive, while maintaining the same second-order accuracy. This is numerically verified by calculating the electrostatic potential and solvation energy on the Kirkwood sphere, on which the analytical solutions are available, and on a series of proteins with various sizes.
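In generic form (the abstract does not give the exact equations), a two-component regularization of this kind splits the potential so that the point-charge singularity is carried by a Coulomb (Green's function) term, with charges q_i at positions x_i and solute dielectric constant ε_m:

    \phi(\mathbf{x}) = \phi_{\mathrm{reg}}(\mathbf{x}) + \phi_{\mathrm{sing}}(\mathbf{x}),
    \qquad
    \phi_{\mathrm{sing}}(\mathbf{x}) = \sum_{i=1}^{N_c} \frac{q_i}{4\pi\,\epsilon_m\,\lvert \mathbf{x}-\mathbf{x}_i\rvert},

so that the regular component satisfies a PB-type equation with interface jump conditions but without point-charge source terms.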
Computer-aided diagnosis of cavernous malformations in brain MR images.
Wang, Huiquan; Ahmed, S Nizam; Mandal, Mrinal
2018-06-01
Cavernous malformation or cavernoma is one of the most common epileptogenic lesions. It is a type of brain vessel abnormality that can cause serious symptoms such as seizures, intracerebral hemorrhage, and various neurological disorders. Manual detection of cavernomas by physicians in a large set of brain MRI slices is a time-consuming and labor-intensive task and often delays diagnosis. In this paper, we propose a computer-aided diagnosis (CAD) system for cavernomas based on T2-weighted axial plane MRI image analysis. The proposed technique first extracts the brain area based on atlas registration and active contour model, and then performs template matching to obtain candidate cavernoma regions. Texture, the histogram of oriented gradients and local binary pattern features of each candidate region are calculated, and principal component analysis is applied to reduce the feature dimensionality. Support vector machines (SVMs) are finally used to classify each region into cavernoma or non-cavernoma so that most of the false positives (obtained by template matching) are eliminated. The performance of the proposed CAD system is evaluated and experimental results show that it provides superior performance in cavernoma detection compared to existing techniques. Copyright © 2018 Elsevier Ltd. All rights reserved.
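The classification stage described above (dimensionality reduction followed by an SVM) can be sketched as follows; this is an assumed illustration on synthetic feature arrays, not the authors' implementation, and the brain-extraction and template-matching stages are not shown.

    # Sketch of the candidate-classification stage only: PCA for dimensionality
    # reduction followed by an RBF-kernel SVM.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 120))    # stand-in feature vectors (texture + HOG + LBP)
    y = rng.integers(0, 2, size=200)   # 1 = cavernoma candidate, 0 = false positive

    clf = make_pipeline(StandardScaler(), PCA(n_components=20), SVC(kernel="rbf"))
    clf.fit(X[:150], y[:150])
    print("held-out accuracy:", clf.score(X[150:], y[150:]))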
Features and the ‘primal sketch’
Morgan, Michael J.
2014-01-01
This review is concerned primarily with psychophysical and physiological evidence relevant to the question of the existence of spatial features or spatial primitives in human vision. The review will be almost exclusively confined to features defined in the luminance domain. The emphasis will be on the experimental and computational methods that have been used for revealing features, rather than on a detailed comparison between different models of feature extraction. Color and texture fall largely outside the scope of the review, though the principles may be similar. Stereo matching and motion matching are also largely excluded because they are covered in other contributions to this volume, although both have addressed the question of the spatial primitives involved in matching. Similarities between different psychophysically-based models will be emphasized rather than minor differences. All the models considered in the review are based on the extraction of directional spatial derivatives of the luminance profile, typically the first and second, but in one case the third order, and all have some form of non-linearity, be it rectification or thresholding. PMID:20696182
20 CFR 10.527 - Does OWCP verify reports of earnings?
Code of Federal Regulations, 2012 CFR
2012-04-01
... to computer matches with the Office of Personnel Management and inquiries to the Social Security Administration. Also, OWCP may perform computer matches with records of State agencies, including but not limited...
20 CFR 10.527 - Does OWCP verify reports of earnings?
Code of Federal Regulations, 2011 CFR
2011-04-01
... to computer matches with the Office of Personnel Management and inquiries to the Social Security Administration. Also, OWCP may perform computer matches with records of State agencies, including but not limited...
20 CFR 10.527 - Does OWCP verify reports of earnings?
Code of Federal Regulations, 2014 CFR
2014-04-01
... to computer matches with the Office of Personnel Management and inquiries to the Social Security Administration. Also, OWCP may perform computer matches with records of State agencies, including but not limited...
20 CFR 10.527 - Does OWCP verify reports of earnings?
Code of Federal Regulations, 2010 CFR
2010-04-01
... to computer matches with the Office of Personnel Management and inquiries to the Social Security Administration. Also, OWCP may perform computer matches with records of State agencies, including but not limited...
20 CFR 10.527 - Does OWCP verify reports of earnings?
Code of Federal Regulations, 2013 CFR
2013-04-01
... to computer matches with the Office of Personnel Management and inquiries to the Social Security Administration. Also, OWCP may perform computer matches with records of State agencies, including but not limited...
Towards Open-World Person Re-Identification by One-Shot Group-Based Verification.
Zheng, Wei-Shi; Gong, Shaogang; Xiang, Tao
2016-03-01
Solving the problem of matching people across non-overlapping multi-camera views, known as person re-identification (re-id), has received increasing interest in computer vision. In a real-world application scenario, a watch-list (gallery set) of a handful of known target people is provided with very few (in many cases only a single) image(s) (shots) per target. Existing re-id methods are largely unsuitable to address this open-world re-id challenge because they are designed for (1) a closed-world scenario where the gallery and probe sets are assumed to contain exactly the same people, (2) person-wise identification whereby the model attempts to verify exhaustively against each individual in the gallery set, and (3) learning a matching model using multi-shots. In this paper, a novel transfer local relative distance comparison (t-LRDC) model is formulated to address the open-world person re-identification problem by one-shot group-based verification. The model is designed to mine and transfer useful information from a labelled open-world non-target dataset. Extensive experiments demonstrate that the proposed approach outperforms both non-transfer learning and existing transfer learning based re-id methods.
Chen, Chi-Jim; Pai, Tun-Wen; Cheng, Mox
2015-01-01
A sweeping fingerprint sensor converts fingerprints on a row by row basis through image reconstruction techniques. However, a built fingerprint image might appear truncated and distorted when the finger is swept across the fingerprint sensor at a non-linear speed. If truncated fingerprint images were enrolled as reference targets and collected by an automated fingerprint identification system (AFIS), successful prediction rates for fingerprint matching applications would decrease significantly. In this paper, a novel and effective methodology with low computational time complexity was developed for detecting truncated fingerprints in real time. Several filtering rules were implemented to validate the existence of truncated fingerprints. In addition, a machine learning method, the support vector machine (SVM), based on the principle of structural risk minimization, was applied to reject pseudo truncated fingerprints containing characteristics similar to truncated ones. The experimental results show that an accuracy rate of 90.7% was achieved by successfully identifying truncated fingerprint images from testing images before AFIS enrollment procedures. The proposed effective and efficient methodology can be extensively applied to all existing fingerprint matching systems as a preliminary quality control prior to the construction of fingerprint templates. PMID:25835186
Accuracy and robustness evaluation in stereo matching
NASA Astrophysics Data System (ADS)
Nguyen, Duc M.; Hanca, Jan; Lu, Shao-Ping; Schelkens, Peter; Munteanu, Adrian
2016-09-01
Stereo matching has received a lot of attention from the computer vision community, thanks to its wide range of applications. Despite of the large variety of algorithms that have been proposed so far, it is not trivial to select suitable algorithms for the construction of practical systems. One of the main problems is that many algorithms lack sufficient robustness when employed in various operational conditions. This problem is due to the fact that most of the proposed methods in the literature are usually tested and tuned to perform well on one specific dataset. To alleviate this problem, an extensive evaluation in terms of accuracy and robustness of state-of-the-art stereo matching algorithms is presented. Three datasets (Middlebury, KITTI, and MPEG FTV) representing different operational conditions are employed. Based on the analysis, improvements over existing algorithms have been proposed. The experimental results show that our improved versions of cross-based and cost volume filtering algorithms outperform the original versions with large margins on Middlebury and KITTI datasets. In addition, the latter of the two proposed algorithms ranks itself among the best local stereo matching approaches on the KITTI benchmark. Under evaluations using specific settings for depth-image-based-rendering applications, our improved belief propagation algorithm is less complex than MPEG's FTV depth estimation reference software (DERS), while yielding similar depth estimation performance. Finally, several conclusions on stereo matching algorithms are also presented.
Van Stan, Jarrad H.; Mehta, Daryush D.; Zeitels, Steven M.; Burns, James A.; Barbu, Anca M.; Hillman, Robert E.
2015-01-01
Objectives Clinical management of phonotraumatic vocal fold lesions (nodules, polyps) is based largely on assumptions that abnormalities in habitual levels of sound pressure level (SPL), fundamental frequency (f0), and/or amount of voice use play a major role in lesion development and chronic persistence. This study used ambulatory voice monitoring to evaluate if significant differences in voice use exist between patients with phonotraumatic lesions and normal matched controls. Methods Subjects were 70 adult females: 35 with vocal fold nodules or polyps and 35 age-, sex-, and occupation-matched normal individuals. Weeklong summary statistics of voice use were computed from anterior neck surface acceleration recorded using a smartphone-based ambulatory voice monitor. Results Paired t-tests and Kolmogorov-Smirnov tests resulted in no statistically significant differences between patients and matched controls regarding average measures of SPL, f0, vocal dose measures, and voicing/voice rest periods. Paired t-tests comparing f0 variability between the groups resulted in statistically significant differences with moderate effect sizes. Conclusions Individuals with phonotraumatic lesions did not exhibit differences in average ambulatory measures of vocal behavior when compared with matched controls. More refined characterizations of underlying phonatory mechanisms and other potentially contributing causes are warranted to better understand risk factors associated with phonotraumatic lesions. PMID:26024911
Drory Retwitzer, Matan; Polishchuk, Maya; Churkin, Elena; Kifer, Ilona; Yakhini, Zohar; Barash, Danny
2015-01-01
Searching for RNA sequence-structure patterns is becoming an essential tool for RNA practitioners. Novel discoveries of regulatory non-coding RNAs in targeted organisms and the motivation to find them across a wide range of organisms have prompted the use of computational RNA pattern matching as an enhancement to sequence similarity. State-of-the-art programs differ by the flexibility of patterns allowed as queries and by their simplicity of use. In particular—no existing method is available as a user-friendly web server. A general program that searches for RNA sequence-structure patterns is RNA Structator. However, it is not available as a web server and does not provide the option to allow flexible gap pattern representation with an upper bound of the gap length being specified at any position in the sequence. Here, we introduce RNAPattMatch, a web-based application that is user friendly and makes sequence/structure RNA queries accessible to practitioners of various background and proficiency. It also extends RNA Structator and allows a more flexible variable gaps representation, in addition to analysis of results using energy minimization methods. RNAPattMatch service is available at http://www.cs.bgu.ac.il/rnapattmatch. A standalone version of the search tool is also available to download at the site. PMID:25940619
Konheim, Jeremy A; Kon, Zachary N; Pasrija, Chetan; Luo, Qingyang; Sanchez, Pablo G; Garcia, Jose P; Griffith, Bartley P; Jeudy, Jean
2016-04-01
Size matching for lung transplantation is widely accomplished using height comparisons between donors and recipients. This gross approximation allows for wide variation in lung size and, potentially, size mismatch. Three-dimensional computed tomography (3D-CT) volumetry comparisons could offer more accurate size matching. Although recipient CT scans are universally available, donor CT scans are rarely performed. Therefore, predicted donor lung volumes could be used for comparison to measured recipient lung volumes, but no such predictive equations exist. We aimed to use 3D-CT volumetry measurements from a normal patient population to generate equations for predicted total lung volume (pTLV), predicted right lung volume (pRLV), and predicted left lung volume (pLLV), for size-matching purposes. Chest CT scans of 400 normal patients were retrospectively evaluated. 3D-CT volumetry was performed to measure total lung volume, right lung volume, and left lung volume of each patient, and predictive equations were generated. The fitted model was tested in a separate group of 100 patients. The model was externally validated by comparison of total lung volume with total lung capacity from pulmonary function tests in a subset of those patients. Age, gender, height, and race were independent predictors of lung volume. In the test group, there were strong linear correlations between predicted and actual lung volumes measured by 3D-CT volumetry for pTLV (r = 0.72), pRLV (r = 0.72), and pLLV (r = 0.69). A strong linear correlation was also observed when comparing pTLV and total lung capacity (r = 0.82). We successfully created a predictive model for pTLV, pRLV, and pLLV. These may serve as reference standards and predict donor lung volume for size matching in lung transplantation. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
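The published predictive equations themselves are not reproduced in the abstract; the sketch below only illustrates the kind of multivariable least-squares fit involved, using synthetic covariates, and its coefficients have no clinical meaning.

    # Illustrative least-squares fit of total lung volume against age, sex, and
    # height; the published pTLV/pRLV/pLLV equations are not reproduced here.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 400
    age = rng.uniform(20, 80, n)
    male = rng.integers(0, 2, n)                  # 1 = male, 0 = female
    height_cm = rng.normal(170, 10, n)
    tlv = 0.06 * height_cm + 1.2 * male - 0.01 * age + rng.normal(0, 0.4, n)  # synthetic

    X = np.column_stack([np.ones(n), age, male, height_cm])
    coef, *_ = np.linalg.lstsq(X, tlv, rcond=None)
    print("intercept, age, sex, height coefficients:", coef)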
Conditional Random Field-Based Offline Map Matching for Indoor Environments
Bataineh, Safaa; Bahillo, Alfonso; Díez, Luis Enrique; Onieva, Enrique; Bataineh, Ikram
2016-01-01
In this paper, we present an offline map matching technique designed for indoor localization systems based on conditional random fields (CRF). The proposed algorithm can refine the results of existing indoor localization systems and match them with the map, using loose coupling between the existing localization system and the proposed map matching technique. The purpose of this research is to investigate the efficiency of using the CRF technique in offline map matching problems for different scenarios and parameters. The algorithm was applied to several real and simulated trajectories of different lengths. The results were then refined and matched with the map using the CRF algorithm. PMID:27537892
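The decoding step of a linear-chain model of this kind can be illustrated with a small Viterbi routine: emission scores stand in for how well each position fix agrees with each road segment, and transition scores for segment connectivity. The scores below are made-up placeholders, not the CRF feature functions from the paper.

    # Viterbi-style decoding of the most likely road-segment sequence, the kind
    # of inference a linear-chain CRF map matcher performs offline.
    import numpy as np

    def viterbi(emission, transition):
        # emission: (T, S) log-scores of each fix against each road segment
        # transition: (S, S) log-scores for moving between segments
        T, S = emission.shape
        score = emission[0].copy()
        back = np.zeros((T, S), dtype=int)
        for t in range(1, T):
            cand = score[:, None] + transition           # (S, S): previous -> current
            back[t] = np.argmax(cand, axis=0)
            score = cand[back[t], np.arange(S)] + emission[t]
        path = [int(np.argmax(score))]
        for t in range(T - 1, 0, -1):
            path.append(int(back[t][path[-1]]))
        return path[::-1]

    emission = np.log(np.array([[0.7, 0.3], [0.4, 0.6], [0.2, 0.8]]))
    transition = np.log(np.array([[0.8, 0.2], [0.2, 0.8]]))
    print(viterbi(emission, transition))  # most likely segment index per fix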
Zhou, Y.; Ojeda-May, P.; Nagaraju, M.; Pu, J.
2016-01-01
Adenosine triphosphate (ATP)-binding cassette (ABC) transporters are ubiquitous ATP-dependent membrane proteins involved in translocations of a wide variety of substrates across cellular membranes. To understand the chemomechanical coupling mechanism as well as functional asymmetry in these systems, a quantitative description of how ABC transporters hydrolyze ATP is needed. Complementary to experimental approaches, computer simulations based on combined quantum mechanical and molecular mechanical (QM/MM) potentials have provided new insights into the catalytic mechanism in ABC transporters. Quantitatively reliable determination of the free energy requirement for enzymatic ATP hydrolysis, however, requires substantial statistical sampling on QM/MM potential. A case study shows that brute force sampling of ab initio QM/MM (AI/MM) potential energy surfaces is computationally impractical for enzyme simulations of ABC transporters. On the other hand, existing semiempirical QM/MM (SE/MM) methods, although affordable for free energy sampling, are unreliable for studying ATP hydrolysis. To close this gap, a multiscale QM/MM approach named reaction path–force matching (RP–FM) has been developed. In RP–FM, specific reaction parameters for a selected SE method are optimized against AI reference data along reaction paths by employing the force matching technique. The feasibility of the method is demonstrated for a proton transfer reaction in the gas phase and in solution. The RP–FM method may offer a general tool for simulating complex enzyme systems such as ABC transporters. PMID:27498639
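In generic form, the force-matching step that gives RP-FM its name tunes the semiempirical parameters θ so that SE/MM forces on the QM atoms reproduce the AI/MM reference forces at structures R_k sampled along the reaction path (the exact weighting and parameter set used in the cited work are not given here):

    \theta^{*} = \arg\min_{\theta} \sum_{k=1}^{K} \sum_{i \in \mathrm{QM}}
    \bigl\lVert \mathbf{F}^{\mathrm{AI/MM}}_{i}(\mathbf{R}_{k})
    - \mathbf{F}^{\mathrm{SE/MM}}_{i}(\mathbf{R}_{k}; \theta) \bigr\rVert^{2}.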
Dudding-Byth, Tracy; Baxter, Anne; Holliday, Elizabeth G; Hackett, Anna; O'Donnell, Sheridan; White, Susan M; Attia, John; Brunner, Han; de Vries, Bert; Koolen, David; Kleefstra, Tjitske; Ratwatte, Seshika; Riveros, Carlos; Brain, Steve; Lovell, Brian C
2017-12-19
Massively parallel genetic sequencing allows rapid testing of known intellectual disability (ID) genes. However, the discovery of novel syndromic ID genes requires molecular confirmation in at least a second or a cluster of individuals with an overlapping phenotype or similar facial gestalt. Using computer face-matching technology we report an automated approach to matching the faces of non-identical individuals with the same genetic syndrome within a database of 3681 images [1600 images of one of 10 genetic syndrome subgroups together with 2081 control images]. Using the leave-one-out method, two research questions were specified: 1) Using two-dimensional (2D) photographs of individuals with one of 10 genetic syndromes within a database of images, did the technology correctly identify more than expected by chance: i) a top match? ii) at least one match within the top five matches? or iii) at least one in the top 10 with an individual from the same syndrome subgroup? 2) Was there concordance between correct technology-based matches and whether two out of three clinical geneticists would have considered the diagnosis based on the image alone? The computer face-matching technology correctly identifies a top match, at least one correct match in the top five and at least one in the top 10 more than expected by chance (P < 0.00001). There was low agreement between the technology and clinicians, with higher accuracy of the technology when results were discordant (P < 0.01) for all syndromes except Kabuki syndrome. Although the accuracy of the computer face-matching technology was tested on images of individuals with known syndromic forms of intellectual disability, the results of this pilot study illustrate the potential utility of face-matching technology within deep phenotyping platforms to facilitate the interpretation of DNA sequencing data for individuals who remain undiagnosed despite testing the known developmental disorder genes.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-07-15
... with the Department of Defense (DoD), Defense Manpower Data Center (DMDC). We have provided background... & Medicaid Services and the Department of Defense, Defense Manpower Data Center for the Determination of...), Centers for Medicare & Medicaid Services (CMS), and Department of Defense (DoD), Defense Manpower Data...
Unconstrained and contactless hand geometry biometrics.
de-Santos-Sierra, Alberto; Sánchez-Ávila, Carmen; Del Pozo, Gonzalo Bailador; Guerra-Casanova, Javier
2011-01-01
This paper presents a hand biometric system for contact-less, platform-free scenarios, proposing innovative methods in feature extraction, template creation and template matching. The evaluation of the proposed method considers both the use of three contact-less publicly available hand databases, and the comparison of the performance to two competitive pattern recognition techniques existing in the literature: namely support vector machines (SVM) and k-nearest neighbour (k-NN). Results highlight the fact that the proposed method outperforms existing approaches in the literature in terms of computational cost, accuracy in human identification, number of extracted features and number of samples for template creation. The proposed method is a suitable solution for human identification in contact-less scenarios based on hand biometrics, providing a feasible solution to devices with limited hardware requirements like mobile devices.
NASA Astrophysics Data System (ADS)
Hoffman, John; VanderPlas, Jake; Hartman, Joel; Bakos, Gáspár
2017-09-01
This proceedings contribution presents a novel, non-linear extension to the Lomb-Scargle periodogram that allows periodograms to be generated for arbitrary signal shapes. Such periodograms are already known as "template periodograms" or "periodic matched filters," but current implementations are computationally inefficient. The "fast template periodogram" presented here improves existing techniques by a factor of ~a few for small test cases (O(10) observations), and over three orders of magnitude for lightcurves containing O(10^4) observations. The fast template periodogram scales asymptotically as O(H N_f log(H N_f) + H^4 N_f), where H denotes the number of harmonics required to adequately approximate the template and N_f is the number of trial frequencies. Existing implementations scale as O(N_obs N_f), where N_obs is the number of observations in the lightcurve. An open source Python implementation is available on GitHub.
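For context, the ordinary sinusoidal Lomb-Scargle periodogram that the template periodogram generalizes can be computed with astropy as below; this is not the fast template periodogram implementation referenced above, and the signal is synthetic.

    # Standard sinusoidal Lomb-Scargle via astropy, shown only for context.
    import numpy as np
    from astropy.timeseries import LombScargle

    rng = np.random.default_rng(42)
    t = np.sort(rng.uniform(0, 100, 300))             # irregular observation times
    y = 0.8 * np.sin(2 * np.pi * t / 7.3) + rng.normal(0, 0.3, 300)

    frequency, power = LombScargle(t, y).autopower()
    print("best period:", 1.0 / frequency[np.argmax(power)])   # close to 7.3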
Order of stimulus presentation influences children's acquisition in receptive identification tasks.
Petursdottir, Anna Ingeborg; Aguilar, Gabriella
2016-03-01
Receptive identification is usually taught in matching-to-sample format, which entails the presentation of an auditory sample stimulus and several visual comparison stimuli in each trial. Conflicting recommendations exist regarding the order of stimulus presentation in matching-to-sample trials. The purpose of this study was to compare acquisition in receptive identification tasks under 2 conditions: when the sample was presented before the comparisons (sample first) and when the comparisons were presented before the sample (comparison first). Participants included 4 typically developing kindergarten-age boys. Stimuli, which included birds and flags, were presented on a computer screen. Acquisition in the 2 conditions was compared in an adapted alternating-treatments design combined with a multiple baseline design across stimulus sets. All participants took fewer trials to meet the mastery criterion in the sample-first condition than in the comparison-first condition. © 2015 Society for the Experimental Analysis of Behavior.
Patterns across multiple memories are identified over time.
Richards, Blake A; Xia, Frances; Santoro, Adam; Husse, Jana; Woodin, Melanie A; Josselyn, Sheena A; Frankland, Paul W
2014-07-01
Memories are not static but continue to be processed after encoding. This is thought to allow the integration of related episodes via the identification of patterns. Although this idea lies at the heart of contemporary theories of systems consolidation, it has yet to be demonstrated experimentally. Using a modified water-maze paradigm in which platforms are drawn stochastically from a spatial distribution, we found that mice were better at matching platform distributions 30 d compared to 1 d after training. Post-training time-dependent improvements in pattern matching were associated with increased sensitivity to new platforms that conflicted with the pattern. Increased sensitivity to pattern conflict was reduced by pharmacogenetic inhibition of the medial prefrontal cortex (mPFC). These results indicate that pattern identification occurs over time, which can lead to conflicts between new information and existing knowledge that must be resolved, in part, by computations carried out in the mPFC.
Reflection symmetry detection using locally affine invariant edge correspondence.
Wang, Zhaozhong; Tang, Zesheng; Zhang, Xiao
2015-04-01
Reflection symmetry detection has received increasing attention in recent years. State-of-the-art algorithms mainly use the matching of intensity-based features (such as SIFT) within a single image to find symmetry axes. This paper proposes a novel approach by establishing the correspondence of locally affine invariant edge-based features, which are superior to intensity-based features in that they are insensitive to illumination variations and applicable to textureless objects. The locally affine invariance is achieved by simple linear algebra for efficient and robust computations, making the algorithm suitable for detection under object distortions like perspective projection. Commonly used edge detectors and a voting process are, respectively, used before and after the edge description and matching steps to form a complete reflection detection pipeline. Experiments are performed using synthetic and real-world images with both multiple and single reflection symmetry axes. The test results are compared with existing algorithms to validate the proposed method.
A Lightweight Radio Propagation Model for Vehicular Communication in Road Tunnels.
Qureshi, Muhammad Ahsan; Noor, Rafidah Md; Shamim, Azra; Shamshirband, Shahaboddin; Raymond Choo, Kim-Kwang
2016-01-01
Radio propagation models (RPMs) are generally employed in Vehicular Ad Hoc Networks (VANETs) to predict path loss in multiple operating environments (e.g. modern road infrastructure such as flyovers, underpasses and road tunnels). For example, different RPMs have been developed to predict propagation behaviour in road tunnels. However, most existing RPMs for road tunnels are computationally complex and are based on field measurements in frequency band not suitable for VANET deployment. Furthermore, in tunnel applications, consequences of moving radio obstacles, such as large buses and delivery trucks, are generally not considered in existing RPMs. This paper proposes a computationally inexpensive RPM with minimal set of parameters to predict path loss in an acceptable range for road tunnels. The proposed RPM utilizes geometric properties of the tunnel, such as height and width along with the distance between sender and receiver, to predict the path loss. The proposed RPM also considers the additional attenuation caused by the moving radio obstacles in road tunnels, while requiring a negligible overhead in terms of computational complexity. To demonstrate the utility of our proposed RPM, we conduct a comparative summary and evaluate its performance. Specifically, an extensive data gathering campaign is carried out in order to evaluate the proposed RPM. The field measurements use the 5 GHz frequency band, which is suitable for vehicular communication. The results demonstrate that a close match exists between the predicted values and measured values of path loss. In particular, an average accuracy of 94% is found with R^2 = 0.86.
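The paper's RPM itself is not reproduced here; as a heavily simplified stand-in for the structure it describes (a geometry-driven loss plus an additional attenuation per moving obstacle), a log-distance model with a per-obstacle penalty might look like the following, with every numeric value an assumed placeholder.

    # Assumed log-distance form with a per-obstacle penalty; not the RPM proposed
    # in the paper.
    import math

    def path_loss_db(distance_m, freq_hz=5.0e9, exponent=2.0,
                     n_obstacles=0, obstacle_loss_db=3.0, d0=1.0):
        c = 3.0e8
        pl_d0 = 20 * math.log10(4 * math.pi * d0 * freq_hz / c)  # free-space loss at d0
        return (pl_d0
                + 10 * exponent * math.log10(distance_m / d0)
                + n_obstacles * obstacle_loss_db)

    print(round(path_loss_db(150.0, n_obstacles=2), 1), "dB")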
Luboz, Vincent; Chabanas, Matthieu; Swider, Pascal; Payan, Yohan
2005-08-01
This paper addresses an important issue raised for the clinical relevance of Computer-Assisted Surgical applications, namely the methodology used to automatically build patient-specific finite element (FE) models of anatomical structures. From this perspective, a method is proposed, based on a technique called the mesh-matching method, followed by a process that corrects mesh irregularities. The mesh-matching algorithm generates patient-specific volume meshes from an existing generic model. The mesh regularization process is based on the Jacobian matrix transform related to the FE reference element and the current element. This method for generating patient-specific FE models is first applied to computer-assisted maxillofacial surgery, and more precisely, to the FE elastic modelling of patient facial soft tissues. For each patient, the planned bone osteotomies (mandible, maxilla, chin) are used as boundary conditions to deform the FE face model, in order to predict the aesthetic outcome of the surgery. Seven FE patient-specific models were successfully generated by our method. For one patient, the prediction of the FE model is qualitatively compared with the patient's post-operative appearance, measured from a computer tomography scan. Then, our methodology is applied to computer-assisted orbital surgery. It is, therefore, evaluated for the generation of 11 patient-specific FE poroelastic models of the orbital soft tissues. These models are used to predict the consequences of the surgical decompression of the orbit. More precisely, an average law is extrapolated from the simulations carried out for each patient model. This law links the size of the osteotomy (i.e. the surgical gesture) and the backward displacement of the eyeball (the consequence of the surgical gesture).
Participant, Rater, and Computer Measures of Coherence in Posttraumatic Stress Disorder
Rubin, David C.; Deffler, Samantha A.; Ogle, Christin M.; Dowell, Nia M.; Graesser, Arthur C.; Beckham, Jean C.
2015-01-01
We examined the coherence of trauma memories in a trauma-exposed community sample of 30 adults with and 30 without PTSD. The groups had similar categories of traumas and were matched on multiple factors that could affect the coherence of memories. We compared the transcribed oral trauma memories of participants with their most important and most positive memories. A comprehensive set of 28 measures of coherence including 3 ratings by the participants, 7 ratings by outside raters, and 18 computer-scored measures, provided a variety of approaches to defining and measuring coherence. A MANOVA indicated differences in coherence among the trauma, important, and positive memories, but not between the diagnostic groups or their interaction with these memory types. Most differences were small in magnitude; in some cases, the trauma memories were more, rather than less, coherent than the control memories. Where differences existed, the results agreed with the existing literature, suggesting that factors other than the incoherence of trauma memories are most likely to be central to the maintenance of PTSD and thus its treatment. PMID:26523945
Calculation of Crystallographic Texture of BCC Steels During Cold Rolling
NASA Astrophysics Data System (ADS)
Das, Arpan
2017-05-01
BCC alloys commonly develop strong fibre textures, which are often represented as isointensity diagrams in φ1 sections or by fibre diagrams. The alpha fibre in bcc steels is generally characterised by a <110> crystallographic axis parallel to the rolling direction. The objective of the present research is to correlate carbon content, carbide dispersion, rolling reduction, and the Euler angle ϕ (with φ1 = 0° and φ2 = 45° along the alpha fibre) with the resulting alpha-fibre texture orientation intensity. Bayesian neural computation has been employed to model these correlations and is compared comprehensively with an existing feed-forward neural network model. Other researchers have already reported an excellent match to measured texture data within the bounding box of the texture training data set using the feed-forward neural network model; outside those bounds, feed-forward predictions deviated from the expected values. The Bayesian computation applied here confirms that its predictions are reasonable in the context of basic metallurgical principles and match better outside the bounds of the training texture data set than the reported feed-forward neural network. Bayesian computation puts error bars on predicted values and allows the significance of each individual parameter to be estimated. It also makes it possible to estimate the isolated influence of a particular variable, such as carbon concentration, which in practice cannot be varied independently. This shows the ability of the Bayesian neural network to examine new phenomena in situations where data cannot be obtained through experiments.
A computer program for automated flutter solution and matched point determination
NASA Technical Reports Server (NTRS)
Bhatia, K. G.
1973-01-01
The use of a digital computer program (MATCH) for automated determination of the flutter velocity and the matched-point flutter density is described. The program is based on the use of the modified Laguerre iteration formula to converge to a flutter crossing or a matched-point density. A general description of the computer program is included and the purpose of all subroutines used is stated. The input required by the program and various input options are detailed, and the output description is presented. The program can solve flutter equations formulated with up to 12 vibration modes and obtain flutter solutions for up to 10 air densities. The program usage is illustrated by a sample run, and the FORTRAN program listing is included.
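For reference, the classical Laguerre iteration for a degree-n polynomial p is given below; the modified form used by the program is not spelled out in the report summary, so only the standard formula is shown:

    x_{k+1} = x_k - \frac{n}{G \pm \sqrt{(n-1)\,(nH - G^{2})}}, \qquad
    G = \frac{p'(x_k)}{p(x_k)}, \qquad
    H = G^{2} - \frac{p''(x_k)}{p(x_k)},

with the sign chosen to make the magnitude of the denominator as large as possible.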
Research on three-dimensional reconstruction method based on binocular vision
NASA Astrophysics Data System (ADS)
Li, Jinlin; Wang, Zhihui; Wang, Minjun
2018-03-01
As a hot and difficult issue in computer vision, binocular stereo vision is an important form of computer vision with broad application prospects in many fields, such as aerial mapping, vision navigation, motion analysis and industrial inspection. In this paper, research is done into binocular stereo camera calibration, image feature extraction and stereo matching. In the camera calibration module, the internal parameters of a single camera are obtained using Zhang Zhengyou's checkerboard calibration method. For image feature extraction and stereo matching, the SURF operator (a local feature operator) and the SGBM algorithm (a global matching algorithm) are adopted respectively, and their performance is compared. After the feature points are matched, the correspondence between matching points and 3D object points can be established using the calibrated camera parameters, which yields the 3D information.
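A minimal disparity computation with the SGBM matcher mentioned above might look like the following OpenCV sketch; the file names and parameter values are assumptions, and the rectification and SURF-based comparison from the paper are not shown.

    # Minimal OpenCV SGBM disparity computation, assuming a rectified grayscale
    # stereo pair on disk (left.png / right.png are placeholder names).
    import cv2

    left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
    right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

    matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5,
                                    P1=8 * 5 * 5, P2=32 * 5 * 5)
    disparity = matcher.compute(left, right).astype("float32") / 16.0  # fixed-point scale
    print("disparity range:", disparity.min(), disparity.max())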
NASA Astrophysics Data System (ADS)
Park, Sang-Gon; Jeong, Dong-Seok
2000-12-01
In this paper, we propose a fast adaptive diamond search algorithm (FADS) for block matching motion estimation. Many fast motion estimation algorithms reduce the computational complexity through the UESA (Unimodal Error Surface Assumption), under which the matching error increases monotonically as the search moves away from the global minimum point. Recently, many fast BMAs (Block Matching Algorithms) have also exploited the fact that global minimum points in real-world video sequences are centered at the position of zero motion. These BMAs, however, are easily trapped in local minima, especially for large motion, resulting in poor matching accuracy. We therefore propose a new motion estimation algorithm that uses the spatial correlation among neighboring blocks: the search origin is moved according to the motion vectors of the spatially neighboring blocks and their MAEs (Mean Absolute Errors). Computer simulation shows that the proposed algorithm has almost the same computational complexity as DS (Diamond Search) but a higher PSNR. Moreover, the proposed algorithm gives almost the same PSNR as FS (Full Search), even for large motion, with half the computational load.
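To make the search pattern concrete, the sketch below implements the baseline diamond search (DS) block matcher with an MAE cost. The adaptive origin shifting that distinguishes FADS is noted in a comment but not implemented; block size and cost handling are illustrative.

```python
# Minimal sketch of the baseline diamond search (DS) block matcher using an MAE
# cost; the paper's FADS additionally moves the search origin using the motion
# vectors of neighbouring blocks, which is not shown here.
import numpy as np

LDSP = [(0, 0), (0, 2), (0, -2), (2, 0), (-2, 0), (1, 1), (1, -1), (-1, 1), (-1, -1)]
SDSP = [(0, 0), (0, 1), (0, -1), (1, 0), (-1, 0)]

def mae(cur, ref, bx, by, dx, dy, bs):
    y, x = by + dy, bx + dx
    if y < 0 or x < 0 or y + bs > ref.shape[0] or x + bs > ref.shape[1]:
        return np.inf                               # candidate falls outside the frame
    return np.mean(np.abs(cur[by:by+bs, bx:bx+bs].astype(float) -
                          ref[y:y+bs, x:x+bs].astype(float)))

def diamond_search(cur, ref, bx, by, bs=16):
    cx, cy = 0, 0                                   # motion vector, starting at zero motion
    while True:
        costs = [mae(cur, ref, bx, by, cx + dx, cy + dy, bs) for dx, dy in LDSP]
        best = int(np.argmin(costs))
        if best == 0:                               # minimum at the centre: switch to SDSP
            break
        cx, cy = cx + LDSP[best][0], cy + LDSP[best][1]
    costs = [mae(cur, ref, bx, by, cx + dx, cy + dy, bs) for dx, dy in SDSP]
    best = int(np.argmin(costs))
    return cx + SDSP[best][0], cy + SDSP[best][1]
```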
Fast Legendre moment computation for template matching
NASA Astrophysics Data System (ADS)
Li, Bing C.
2017-05-01
Normalized cross correlation (NCC) based template matching is insensitive to intensity changes and has many applications in image processing, object detection, video tracking and pattern recognition. However, NCC is computationally expensive since it involves both correlation computation and normalization. In this paper, we propose a Legendre moment approach for fast normalized cross correlation and show that the computational cost of the proposed approach is independent of the template mask size, making it significantly faster than traditional mask-size-dependent approaches, especially for large mask templates. Legendre polynomials have been widely used in solving the Laplace equation in electrodynamics in spherical coordinate systems and in solving the Schrödinger equation in quantum mechanics. In this paper, we extend Legendre polynomials from physics to the computer vision and pattern recognition fields, and demonstrate that they can reduce the computational cost of NCC-based template matching significantly.
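For reference, the sketch below shows the conventional, mask-size-dependent NCC template matching that the Legendre-moment approach accelerates (the moment-based speedup itself is not reproduced); the file names are hypothetical.

```python
# Sketch of conventional NCC template matching (the mask-size-dependent baseline
# that the Legendre-moment approach accelerates); file names are hypothetical.
import cv2

image = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)

# TM_CCOEFF_NORMED is OpenCV's normalized cross-correlation score in [-1, 1].
response = cv2.matchTemplate(image, template, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(response)
print("best match at", max_loc, "score", max_val)
```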
CometQuest: A Rosetta Adventure
NASA Technical Reports Server (NTRS)
Leon, Nancy J.; Fisher, Diane K.; Novati, Alexander; Chmielewski, Artur B.; Fitzpatrick, Austin J.; Angrum, Andrea
2012-01-01
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2012-01-01
This software is a higher-performance implementation of tiled WMS, with integral support for KML and time-varying data. This software is compliant with the Open Geospatial WMS standard, and supports KML natively as a WMS return type, including support for the time attribute. Regionated KML wrappers are generated that match the existing tiled WMS dataset. PNG and JPEG formats are supported, and the software is implemented as an Apache 2.0 module that supports a threading execution model that is capable of supporting very high request rates. The module intercepts and responds to WMS requests that match certain patterns and returns the existing tiles. If a KML format that matches an existing pyramid and tile dataset is requested, regionated KML is generated and returned to the requesting application. In addition, KML requests that do not match the existing tile datasets generate a KML response that includes the corresponding JPG WMS request, effectively adding KML support to a backing WMS server.
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-13
... false positive match rate of 10 percent. Making the match mandatory for the States who did not perform... number of prisoners from 1995 to 2013 and assumed a 10 percent false positive match rate. Finally, we... matches are false positives. We estimate that mandatory matches at certification will identify an...
Ronald E. Coleman
1977-01-01
SEMTAP (Serpentine End Match TApe Program) is an easy and inexpensive method of programming a numerically controlled router for the manufacture of SEM (Serpentine End Matching) joints. The SEMTAP computer program allows the user to issue commands that will accurately direct a numerically controlled router along any SEM path. The user need not be a computer programmer to...
An algorithm for automating the registration of USDA segment ground data to LANDSAT MSS data
NASA Technical Reports Server (NTRS)
Graham, M. H. (Principal Investigator)
1981-01-01
The algorithm is referred to as the Automatic Segment Matching Algorithm (ASMA). The ASMA uses control points or the annotation record of a P-format LANDSAT computer compatible tape as the initial registration to relate latitude and longitude to LANDSAT rows and columns. It searches a given area of LANDSAT data with a 2x2 sliding window and computes gradient values for bands 5 and 7 to match the segment boundaries. The gradient values are held in memory during the shifting (or matching) process. The reconstructed segment array, containing ones (1's) for boundaries and zeros elsewhere, is then compared by computer to the LANDSAT array and the best match is computed. Initial testing of the ASMA indicates that it has good potential for replacing the manual technique.
NASA Astrophysics Data System (ADS)
Park, Jonghee; Yoon, Kuk-Jin
2015-02-01
We propose a real-time line matching method for stereo systems. To achieve real-time performance while retaining a high level of matching precision, we first propose a nonparametric transform to represent the spatial relations between neighboring lines and nearby textures as a binary stream. Since the length of a line can vary across images, the matching costs between lines are computed within an overlap area (OA) based on the binary stream. The OA is determined for each line pair by employing the properties of a rectified image pair. Finally, the line correspondence is determined using a winner-takes-all method with a left-right consistency check. To reduce the computational time requirements further, we filter out unreliable matching candidates in advance based on their rectification properties. The performance of the proposed method was compared with state-of-the-art methods in terms of the computational time, matching precision, and recall. The proposed method required 47 ms to match lines from an image pair in the KITTI dataset with an average precision of 95%. We also verified the proposed method under image blur, illumination variation, and viewpoint changes.
Toothguide Trainer tests with color vision deficiency simulation monitor.
Borbély, Judit; Varsányi, Balázs; Fejérdy, Pál; Hermann, Péter; Jakstat, Holger A
2010-01-01
The aim of this study was to evaluate whether simulated severe red and green color vision deficiency (CVD) influenced color matching results and to investigate whether training with the Toothguide Trainer (TT) computer program enabled better color matching results. A total of 31 color-normal dental students participated in the study. Every participant had to pass the Ishihara Test; participants with a red/green color vision deficiency were excluded. A lecture on tooth color matching was given, and individual training with TT was performed. To measure individual tooth color matching results in normal and color-deficient display modes, the TT final exam was displayed on a calibrated monitor that served as a hardware-based method of simulating protanopy and deuteranopy. Data from the TT final exams were collected in normal and in severe red and green CVD-simulating monitor display modes. Color difference values for each participant in each display mode were computed (ΣΔE*ab), and the respective means and standard deviations were calculated. Student's t-test was used in the statistical evaluation. Participants made larger ΔE*ab errors in the severe color vision deficient display modes than in the normal monitor mode. TT tests showed a significant (p<0.05) difference in the tooth color matching results of the severe green color vision deficiency simulation mode compared to the normal vision mode. Students' shade matching results were significantly better after training (p=0.009). Computer-simulated severe color vision deficiency mode resulted in significantly worse color matching quality compared to normal color vision mode. The Toothguide Trainer computer program improved color matching results.
Signal processing: opportunities for superconductive circuits
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ralston, R.W.
1985-03-01
Prime motivators in the evolution of increasingly sophisticated communication and detection systems are the needs for handling ever wider signal bandwidths and higher data-processing speeds. These same needs drive the development of electronic device technology. Until recently the superconductive community has been tightly focused on digital devices for high speed computers. The purpose of this paper is to describe opportunities and challenges which exist for both analog and digital devices in a less familiar area, that of wideband signal processing. The function and purpose of analog signal-processing components, including matched filters, correlators and Fourier transformers, will be described and examples of superconductive implementations given. A canonic signal-processing system is then configured using these components and digital output circuits to highlight the important issues of dynamic range, accuracy and equivalent computation rate. (Reprints)
Parsa, Azin; Ibrahim, Norliza; Hassan, Bassam; Motroni, Alessandro; van der Stelt, Paul; Wismeijer, Daniel
2012-01-01
To assess the reliability of cone beam computed tomography (CBCT) voxel gray value measurements using Hounsfield units (HU) derived from multislice computed tomography (MSCT) as a clinical reference (gold standard). Ten partially edentulous human mandibular cadavers were scanned by two types of computed tomography (CT) modalities: multislice CT and cone beam CT. On MSCT scans, eight regions of interest (ROI) designating the site for preoperative implant placement were selected in each mandible. The datasets from both CT systems were matched using a three-dimensional (3D) registration algorithm. The mean voxel gray values of the region around the implant sites were compared between MSCT and CBCT. Significant differences between the mean gray values obtained by CBCT and HU by MSCT were found. In all the selected ROIs, CBCT showed higher mean values than MSCT. A strong correlation (R=0.968) between mean voxel gray values of CBCT and mean HU of MSCT was determined. Voxel gray values from CBCT deviate from actual HU units. However, a strong linear correlation exists, which may permit deriving actual HU units from CBCT using linear regression models.
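Since the abstract reports a strong linear correlation between CBCT gray values and MSCT Hounsfield units, the sketch below shows how such a linear mapping could be fitted and applied; the ROI values are illustrative, not the study's data.

```python
# Sketch of deriving HU from CBCT voxel gray values with a linear model, as the
# strong linear correlation (R = 0.968) suggests; the numbers below are illustrative.
import numpy as np

cbct_gray = np.array([420., 510., 640., 300., 780., 560., 610., 470.])   # hypothetical ROI means
msct_hu   = np.array([350., 430., 545., 240., 660., 470., 520., 395.])   # matched MSCT HU values

slope, intercept = np.polyfit(cbct_gray, msct_hu, 1)   # least-squares straight line
predicted_hu = slope * cbct_gray + intercept
r = np.corrcoef(cbct_gray, msct_hu)[0, 1]
print(f"HU ~ {slope:.2f} * gray + {intercept:.1f}   (r = {r:.3f})")
```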
Local Orthogonal Cutting Method for Computing Medial Curves and Its Biomedical Applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiao, Xiangmin; Einstein, Daniel R.; Dyedov, Volodymyr
2010-03-24
Medial curves have a wide range of applications in geometric modeling and analysis (such as shape matching) and biomedical engineering (such as morphometry and computer assisted surgery). The computation of medial curves poses significant challenges, both in terms of theoretical analysis and practical efficiency and reliability. In this paper, we propose a definition and analysis of medial curves and also describe an efficient and robust method for computing medial curves. Our approach is based on three key concepts: a local orthogonal decomposition of objects into substructures, a differential geometry concept called the interior center of curvature (ICC), and integrated stability and consistency tests. These concepts lend themselves to robust numerical techniques including eigenvalue analysis, weighted least squares approximations, and numerical minimization, resulting in an algorithm that is efficient and noise resistant. We illustrate the effectiveness and robustness of our approach with some highly complex, large-scale, noisy biomedical geometries derived from medical images, including lung airways and blood vessels. We also present comparisons of our method with some existing methods.
Aerodynamic Performance of an Active Flow Control Configuration Using Unstructured-Grid RANS
NASA Technical Reports Server (NTRS)
Joslin, Ronald D.; Viken, Sally A.
2001-01-01
This research is focused on assessing the value of the Reynolds-Averaged Navier-Stokes (RANS) methodology for active flow control applications. An experimental flow control database exists for a TAU0015 airfoil, which is a modification of a NACA0015 airfoil. The airfoil has discontinuities at the leading edge, due to the implementation of a fluidic actuator, and aft of mid chord on the upper surface. This paper documents two- and three-dimensional computational results for the baseline wing configuration (no control) compared with the experimental results. The two-dimensional results suggest that the mid-chord discontinuity does not affect the aerodynamics of the wing and can be ignored for more efficient computations. The leading-edge discontinuity significantly affects the lift and drag; hence, the integrity of the leading-edge notch discontinuity must be maintained in the computations to achieve a good match with the experimental data. The three-dimensional integrated performance results are in good agreement with the experiments in spite of some convergence and grid resolution issues.
Ab initio calculations of the concentration dependent band gap reduction in dilute nitrides
NASA Astrophysics Data System (ADS)
Rosenow, Phil; Bannow, Lars C.; Fischer, Eric W.; Stolz, Wolfgang; Volz, Kerstin; Koch, Stephan W.; Tonner, Ralf
2018-02-01
While being of persistent interest for the integration of lattice-matched laser devices with silicon circuits, the electronic structure of dilute nitride III/V-semiconductors has presented a challenge to ab initio computational approaches. The origin of the computational problems is the strong distortion exerted by the N atoms on most host materials. Here, these issues are resolved by combining density functional theory calculations based on the meta-GGA functional presented by Tran and Blaha (TB09) with a supercell approach for the dilute nitride Ga(NAs). Exploring the requirements posed to supercells, it is shown that the distortion field of a single N atom must be allowed to decrease so far that it does not overlap with its periodic images. This also prevents spurious electronic interactions between translational symmetric atoms, allowing us to compute band gaps in very good agreement with experimentally derived reference values. In addition to existing approaches, these results offer a promising ab initio avenue to the electronic structure of dilute nitride semiconductor compounds.
KeyWare: an open wireless distributed computing environment
NASA Astrophysics Data System (ADS)
Shpantzer, Isaac; Schoenfeld, Larry; Grindahl, Merv; Kelman, Vladimir
1995-12-01
Deployment of distributed applications in the wireless domain lacks the equivalent tools, methodologies, architectures, and network management that exist for LAN-based applications. A wireless distributed computing environment (KeyWareTM) based on intelligent agents within a multiple-client multiple-server scheme was developed to resolve this problem. KeyWare renders concurrent application services to wireline and wireless client nodes encapsulated in multiple paradigms such as message delivery, database access, e-mail, and file transfer. These services and paradigms are optimized to cope with temporal and spatial radio coverage, high latency, limited throughput and transmission costs. A unified network management paradigm for both wireless and wireline nodes facilitates seamless extension of LAN-based management tools to include wireless nodes. A set of object-oriented tools and methodologies enables direct asynchronous invocation of agent-based services, supplemented by tool-sets matched to the supported KeyWare paradigms. The open architecture embodiment of KeyWare enables a wide selection of client node computing platforms, operating systems, transport protocols, radio modems and infrastructures while maintaining application portability.
39 CFR 266.10 - Computer matching.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Service UNITED STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION PRIVACY OF INFORMATION § 266.10... matching proposals. A proposal must include information required for the matching agreement discussed in... matching proposals, whether from postal organizations or other government agencies, must be mailed directly...
39 CFR 266.10 - Computer matching.
Code of Federal Regulations, 2010 CFR
2010-07-01
... Service UNITED STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION PRIVACY OF INFORMATION § 266.10... matching proposals. A proposal must include information required for the matching agreement discussed in... matching proposals, whether from postal organizations or other government agencies, must be mailed directly...
39 CFR 266.10 - Computer matching.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Service UNITED STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION PRIVACY OF INFORMATION § 266.10... matching proposals. A proposal must include information required for the matching agreement discussed in... matching proposals, whether from postal organizations or other government agencies, must be mailed directly...
39 CFR 266.10 - Computer matching.
Code of Federal Regulations, 2011 CFR
2011-07-01
... Service UNITED STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION PRIVACY OF INFORMATION § 266.10... matching proposals. A proposal must include information required for the matching agreement discussed in... matching proposals, whether from postal organizations or other government agencies, must be mailed directly...
39 CFR 266.10 - Computer matching.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Service UNITED STATES POSTAL SERVICE ORGANIZATION AND ADMINISTRATION PRIVACY OF INFORMATION § 266.10... matching proposals. A proposal must include information required for the matching agreement discussed in... matching proposals, whether from postal organizations or other government agencies, must be mailed directly...
Clinical Study of Orthogonal-View Phase-Matched Digital Tomosynthesis for Lung Tumor Localization.
Zhang, You; Ren, Lei; Vergalasova, Irina; Yin, Fang-Fang
2017-01-01
Compared to cone-beam computed tomography, digital tomosynthesis imaging has the benefits of shorter scanning time, less imaging dose, and better mechanical clearance for tumor localization in radiation therapy. However, for lung tumors, the localization accuracy of the conventional digital tomosynthesis technique is affected by the lack of depth information and the existence of lung tumor motion. This study investigates the clinical feasibility of using an orthogonal-view phase-matched digital tomosynthesis technique to improve the accuracy of lung tumor localization. The proposed orthogonal-view phase-matched digital tomosynthesis technique benefits from 2 major features: (1) it acquires orthogonal-view projections to improve the depth information in reconstructed digital tomosynthesis images and (2) it applies respiratory phase-matching to incorporate patient motion information into the synthesized reference digital tomosynthesis sets, which helps to improve the localization accuracy of moving lung tumors. A retrospective study enrolling 14 patients was performed to evaluate the accuracy of the orthogonal-view phase-matched digital tomosynthesis technique. Phantom studies were also performed using an anthropomorphic phantom to investigate the feasibility of using intratreatment aggregated kV and beams' eye view cine MV projections for orthogonal-view phase-matched digital tomosynthesis imaging. The localization accuracy of the orthogonal-view phase-matched digital tomosynthesis technique was compared to that of the single-view digital tomosynthesis techniques and the digital tomosynthesis techniques without phase-matching. The orthogonal-view phase-matched digital tomosynthesis technique outperforms the other digital tomosynthesis techniques in tumor localization accuracy for both the patient study and the phantom study. For the patient study, the orthogonal-view phase-matched digital tomosynthesis technique localizes the tumor to an average (± standard deviation) error of 1.8 (0.7) mm for a 30° total scan angle. For the phantom study using aggregated kV-MV projections, the orthogonal-view phase-matched digital tomosynthesis localizes the tumor to an average error within 1 mm for varying magnitudes of scan angles. The pilot clinical study shows that the orthogonal-view phase-matched digital tomosynthesis technique enables fast and accurate localization of moving lung tumors.
Standard model anatomy of WIMP dark matter direct detection. I. Weak-scale matching
NASA Astrophysics Data System (ADS)
Hill, Richard J.; Solon, Mikhail P.
2015-02-01
We present formalism necessary to determine weak-scale matching coefficients in the computation of scattering cross sections for putative dark matter candidates interacting with the Standard Model. We pay particular attention to the heavy-particle limit. A consistent renormalization scheme in the presence of nontrivial residual masses is implemented. Two-loop diagrams appearing in the matching to gluon operators are evaluated. Details are given for the computation of matching coefficients in the universal limit of WIMP-nucleon scattering for pure states of arbitrary quantum numbers, and for singlet-doublet and doublet-triplet mixed states.
Parameterizing by the Number of Numbers
NASA Astrophysics Data System (ADS)
Fellows, Michael R.; Gaspers, Serge; Rosamond, Frances A.
The usefulness of parameterized algorithmics has often depended on what Niedermeier has called "the art of problem parameterization". In this paper we introduce and explore a novel but general form of parameterization: the number of numbers. Several classic numerical problems, such as Subset Sum, Partition, 3-Partition, Numerical 3-Dimensional Matching, and Numerical Matching with Target Sums, have multisets of integers as input. We initiate the study of parameterizing these problems by the number of distinct integers in the input. We rely on an FPT result for Integer Linear Programming Feasibility to show that all the above-mentioned problems are fixed-parameter tractable when parameterized in this way. In various applied settings, problem inputs often consist in part of multisets of integers or multisets of weighted objects (such as edges in a graph, or jobs to be scheduled). Such number-of-numbers parameterized problems often reduce to subproblems about transition systems of various kinds, parameterized by the size of the system description. We consider several core problems of this kind relevant to number-of-numbers parameterization. Our main hardness result considers the problem: given a non-deterministic Mealy machine M (a finite state automaton outputting a letter on each transition), an input word x, and a census requirement c for the output word specifying how many times each letter of the output alphabet should be written, decide whether there exists a computation of M reading x that outputs a word y that meets the requirement c. We show that this problem is hard for W[1]. If the question is whether there exists an input word x such that a computation of M on x outputs a word that meets c, the problem becomes fixed-parameter tractable.
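To make the main hardness result concrete, the sketch below spells out the census question for a small, hypothetical nondeterministic Mealy machine using a brute-force depth-first search; the exponential search is only meant to illustrate the definition, since the paper shows the problem is W[1]-hard.

```python
# Brute-force sketch of the census-matching question for a nondeterministic Mealy
# machine: does some run on input word x emit each output letter exactly as often
# as the census c requires?  The machine below is hypothetical.
from collections import Counter

# delta maps (state, input_letter) -> list of (next_state, output_letter) choices.
delta = {("q0", "a"): [("q0", "x"), ("q1", "y")],
         ("q1", "a"): [("q0", "y")],
         ("q1", "b"): [("q1", "x")]}

def census_reachable(start, word, census):
    def dfs(state, i, counts):
        if i == len(word):
            return +counts == census        # unary + drops zero entries before comparing
        for nxt, out in delta.get((state, word[i]), []):
            counts[out] += 1
            # prune as soon as some output letter already exceeds its census quota
            if counts[out] <= census.get(out, 0) and dfs(nxt, i + 1, counts):
                return True
            counts[out] -= 1
        return False
    return dfs(start, 0, Counter())

print(census_reachable("q0", "aab", Counter({"x": 2, "y": 1})))   # True for this machine
```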
NASA Technical Reports Server (NTRS)
Gordy, R. S.
1972-01-01
An improved broadband impedance matching technique was developed. The technique is capable of resolving points in the waveguide which generate reflected energy. A version of the comparison reflectometer was developed and fabricated to determine the mean amplitude of the reflection coefficient excited at points in the guide as a function of distance, and the complex reflection coefficient of a specific discontinuity in the guide as a function of frequency. An impedance matching computer program was developed which is capable of impedance matching the characteristics of each disturbance independent of other reflections in the guide. The characteristics of four standard matching elements were compiled, and their associated curves of reflection coefficient and shunt susceptance as a function of frequency are presented. It is concluded that an economical, fast, and reliable impedance matching technique has been established which can provide broadband impedance matches.
Send-side matching of data communications messages
Archer, Charles J.; Blocksome, Michael A.; Ratterman, Joseph D.; Smith, Brian E.
2014-06-17
Send-side matching of data communications messages in a distributed computing system comprising a plurality of compute nodes, including: issuing by a receiving node to source nodes a receive message that specifies receipt of a single message to be sent from any source node, the receive message including message matching information, a specification of a hardware-level mutual exclusion device, and an identification of a receive buffer; matching by two or more of the source nodes the receive message with pending send messages in the two or more source nodes; operating by one of the source nodes having a matching send message the mutual exclusion device, excluding messages from other source nodes with matching send messages and identifying to the receiving node the source node operating the mutual exclusion device; and sending to the receiving node from the source node operating the mutual exclusion device a matched pending message.
NASA Technical Reports Server (NTRS)
hoelzer, H. D.; Fourroux, K. A.; Rickman, D. L.; Schrader, C. M.
2011-01-01
Figures of Merit (FoMs) and the FoM software provide a method for quantitatively evaluating the quality of a regolith simulant by comparing the simulant to a reference material. FoMs may be used for comparing a simulant to actual regolith material, for specification (by stating the value a simulant's FoMs must attain to be suitable for a given application), and for comparing simulants from different vendors or production runs. FoMs may even be used to compare different simulants to each other. A single FoM is conceptually an algorithm that computes a single number quantifying the similarity or difference of a single characteristic of a simulant material and a reference material, providing a clear measure of how well the simulant and reference material match. FoMs have been constructed to lie between zero and 1, with zero indicating a poor or no match and 1 indicating a perfect match. FoMs are defined for modal composition, particle size distribution, particle shape distribution (aspect ratio and angularity), and density. This TM covers the mathematics, use, installation, and licensing for the existing FoM code in detail.
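As a worked illustration of a figure of merit bounded between 0 and 1, the sketch below scores a particle size distribution against a reference by comparing cumulative curves; this is a plausible construction for illustration only, not the exact formula defined in the TM, and the sieve data are hypothetical.

```python
# Illustrative figure of merit for particle size distribution, scaled to [0, 1]
# with 1 meaning a perfect match; this is a plausible construction, not the exact
# formula defined in the TM.
import numpy as np

def size_distribution_fom(sim_cdf, ref_cdf):
    """Both arguments: cumulative mass fraction passing, sampled on the same sieve sizes."""
    sim_cdf = np.asarray(sim_cdf, dtype=float)
    ref_cdf = np.asarray(ref_cdf, dtype=float)
    # Mean absolute difference between CDFs is at most 1, so 1 minus it lies in [0, 1].
    return 1.0 - np.mean(np.abs(sim_cdf - ref_cdf))

reference = [0.05, 0.20, 0.45, 0.70, 0.90, 1.00]     # hypothetical regolith reference
simulant  = [0.08, 0.25, 0.50, 0.72, 0.88, 1.00]     # hypothetical simulant batch
print(f"particle size FoM = {size_distribution_fom(simulant, reference):.3f}")
```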
Castillo, Edward; Castillo, Richard; White, Benjamin; Rojo, Javier; Guerrero, Thomas
2012-01-01
Compressible flow based image registration operates under the assumption that the mass of the imaged material is conserved from one image to the next. Depending on how the mass conservation assumption is modeled, the performance of existing compressible flow methods is limited by factors such as image quality, noise, large magnitude voxel displacements, and computational requirements. The Least Median of Squares Filtered Compressible Flow (LFC) method introduced here is based on a localized, nonlinear least squares, compressible flow model that describes the displacement of a single voxel that lends itself to a simple grid search (block matching) optimization strategy. Spatially inaccurate grid search point matches, corresponding to erroneous local minimizers of the nonlinear compressible flow model, are removed by a novel filtering approach based on least median of squares fitting and the forward search outlier detection method. The spatial accuracy of the method is measured using ten thoracic CT image sets and large samples of expert determined landmarks (available at www.dir-lab.com). The LFC method produces an average error within the intra-observer error on eight of the ten cases, indicating that the method is capable of achieving a high spatial accuracy for thoracic CT registration. PMID:22797602
A Lightweight Radio Propagation Model for Vehicular Communication in Road Tunnels
Shamim, Azra; Shamshirband, Shahaboddin; Raymond Choo, Kim-Kwang
2016-01-01
Radio propagation models (RPMs) are generally employed in Vehicular Ad Hoc Networks (VANETs) to predict path loss in multiple operating environments (e.g. modern road infrastructure such as flyovers, underpasses and road tunnels). For example, different RPMs have been developed to predict propagation behaviour in road tunnels. However, most existing RPMs for road tunnels are computationally complex and are based on field measurements in frequency band not suitable for VANET deployment. Furthermore, in tunnel applications, consequences of moving radio obstacles, such as large buses and delivery trucks, are generally not considered in existing RPMs. This paper proposes a computationally inexpensive RPM with minimal set of parameters to predict path loss in an acceptable range for road tunnels. The proposed RPM utilizes geometric properties of the tunnel, such as height and width along with the distance between sender and receiver, to predict the path loss. The proposed RPM also considers the additional attenuation caused by the moving radio obstacles in road tunnels, while requiring a negligible overhead in terms of computational complexity. To demonstrate the utility of our proposed RPM, we conduct a comparative summary and evaluate its performance. Specifically, an extensive data gathering campaign is carried out in order to evaluate the proposed RPM. The field measurements use the 5 GHz frequency band, which is suitable for vehicular communication. The results demonstrate that a close match exists between the predicted values and measured values of path loss. In particular, an average accuracy of 94% is found with R2 = 0.86. PMID:27031989
Dietterich, Hannah; Lev, Einat; Chen, Jiangzhi; Richardson, Jacob A.; Cashman, Katharine V.
2017-01-01
Numerical simulations of lava flow emplacement are valuable for assessing lava flow hazards, forecasting active flows, designing flow mitigation measures, interpreting past eruptions, and understanding the controls on lava flow behavior. Existing lava flow models vary in simplifying assumptions, physics, dimensionality, and the degree to which they have been validated against analytical solutions, experiments, and natural observations. In order to assess existing models and guide the development of new codes, we conduct a benchmarking study of computational fluid dynamics (CFD) models for lava flow emplacement, including VolcFlow, OpenFOAM, FLOW-3D, COMSOL, and MOLASSES. We model viscous, cooling, and solidifying flows over horizontal planes, sloping surfaces, and into topographic obstacles. We compare model results to physical observations made during well-controlled analogue and molten basalt experiments, and to analytical theory when available. Overall, the models accurately simulate viscous flow with some variability in flow thickness where flows intersect obstacles. OpenFOAM, COMSOL, and FLOW-3D can each reproduce experimental measurements of cooling viscous flows, and OpenFOAM and FLOW-3D simulations with temperature-dependent rheology match results from molten basalt experiments. We assess the goodness-of-fit of the simulation results and the computational cost. Our results guide the selection of numerical simulation codes for different applications, including inferring emplacement conditions of past lava flows, modeling the temporal evolution of ongoing flows during eruption, and probabilistic assessment of lava flow hazard prior to eruption. Finally, we outline potential experiments and desired key observational data from future flows that would extend existing benchmarking data sets.
Kekre, Natasha; Antin, Joseph H
2014-07-17
Most patients who require allogeneic stem cell transplantation do not have a matched sibling donor, and many patients do not have a matched unrelated donor. In an effort to increase the applicability of transplantation, alternative donors such as mismatched adult unrelated donors, haploidentical related donors, and umbilical cord blood stem cell products are frequently used when a well matched donor is unavailable. We do not yet have the benefit of randomized trials comparing alternative donor stem cell sources to inform the choice of donor; however, the existing data allow some inferences to be made on the basis of existing observational and phase 2 studies. All 3 alternative donor sources can provide effective lymphohematopoietic reconstitution, but time to engraftment, graft failure rate, graft-versus-host disease, transplant-related mortality, and relapse risk vary by donor source. These factors all contribute to survival outcomes and an understanding of them should help guide clinicians when choosing among alternative donor sources when a matched related or matched unrelated donor is not available.
41 CFR 105-56.024 - Purpose and scope.
Code of Federal Regulations, 2010 CFR
2010-07-01
... offset computer matching, identify Federal employees who owe delinquent non-tax debt to the United States. Centralized salary offset computer matching is the computerized comparison of delinquent debt records with...) administrative offset program, to collect delinquent debts owed to the Federal Government. This process is known...
39 CFR 262.5 - Systems (Privacy).
Code of Federal Regulations, 2010 CFR
2010-07-01
..., partnerships or corporations. A business firm identified by the name of one or more persons is not an... computer matches are specifically excluded from the term “matching program”: (i) Statistical matches whose purpose is solely to produce aggregate data stripped of personal identifiers. (ii) Statistical matches...
Technical Note: spektr 3.0-A computational tool for x-ray spectrum modeling and analysis.
Punnoose, J; Xu, J; Sisniega, A; Zbijewski, W; Siewerdsen, J H
2016-08-01
A computational toolkit (spektr 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS) algorithm, updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP) spectral model. The toolkit includes a matlab (The Mathworks, Natick, MA) function library and improved user interface (UI) along with an optimization algorithm to match calculated beam quality with measurements. The spektr code generates x-ray spectra (photons/mm(2)/mAs at 100 cm from the source) using TASMICS as default (with TASMIP as an option) in 1 keV energy bins over beam energies 20-150 kV, extensible to 640 kV using the TASMICS spectra. An optimization tool was implemented to compute the added filtration (Al and W) that provides a best match between calculated and measured x-ray tube output (mGy/mAs or mR/mAs) for individual x-ray tubes that may differ from that assumed in TASMICS or TASMIP and to account for factors such as anode angle. The median percent difference in photon counts for a TASMICS and TASMIP spectrum was 4.15% for tube potentials in the range 30-140 kV with the largest percentage difference arising in the low and high energy bins due to measurement errors in the empirically based TASMIP model and inaccurate polynomial fitting. The optimization tool reported a close agreement between measured and calculated spectra with a Pearson coefficient of 0.98. The computational toolkit, spektr, has been updated to version 3.0, validated against measurements and existing models, and made available as open source code. Video tutorials for the spektr function library, UI, and optimization tool are available.
Wee, Leonard; Hackett, Sara Lyons; Jones, Andrew; Lim, Tee Sin; Harper, Christopher Stirling
2013-01-01
This study evaluated the agreement of fiducial marker localization between two modalities — an electronic portal imaging device (EPID) and cone‐beam computed tomography (CBCT) — using a low‐dose, half‐rotation scanning protocol. Twenty‐five prostate cancer patients with implanted fiducial markers were enrolled. Before each daily treatment, EPID and half‐rotation CBCT images were acquired. Translational shifts were computed for each modality and two marker‐matching algorithms, seed‐chamfer and grey‐value, were performed for each set of CBCT images. The localization offsets, and systematic and random errors from both modalities were computed. Localization performances for both modalities were compared using Bland‐Altman limits of agreement (LoA) analysis, Deming regression analysis, and Cohen's kappa inter‐rater analysis. The differences in the systematic and random errors between the modalities were within 0.2 mm in all directions. The LoA analysis revealed a 95% agreement limit of the modalities of 2 to 3.5 mm in any given translational direction. Deming regression analysis demonstrated that constant biases existed in the shifts computed by the modalities in the superior–inferior (SI) direction, but no significant proportional biases were identified in any direction. Cohen's kappa analysis showed good agreement between the modalities in prescribing translational corrections of the couch at 3 and 5 mm action levels. Images obtained from EPID and half‐rotation CBCT showed acceptable agreement for registration of fiducial markers. The seed‐chamfer algorithm for tracking of fiducial markers in CBCT datasets yielded better agreement than the grey‐value matching algorithm with EPID‐based registration. PACS numbers: 87.55.km, 87.55.Qr PMID:23835391
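The Bland-Altman limits-of-agreement calculation reported above reduces to a bias plus or minus 1.96 standard deviations of the paired shift differences; the sketch below shows it on hypothetical shift values (mm) in one translational direction.

```python
# Sketch of the Bland-Altman limits-of-agreement computation used to compare the
# EPID and half-rotation CBCT shifts; the shift values below are hypothetical.
import numpy as np

epid_shift = np.array([1.2, -0.5, 0.8, 2.1, -1.4, 0.3, 1.0, -0.2])   # mm, one direction
cbct_shift = np.array([1.5, -0.9, 1.1, 1.7, -1.0, 0.6, 0.4, -0.8])   # mm, same fractions

diff = epid_shift - cbct_shift
bias = diff.mean()
sd = diff.std(ddof=1)
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.2f} mm, 95% limits of agreement: [{loa_low:.2f}, {loa_high:.2f}] mm")
```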
Fast group matching for MR fingerprinting reconstruction.
Cauley, Stephen F; Setsompop, Kawin; Ma, Dan; Jiang, Yun; Ye, Huihui; Adalsteinsson, Elfar; Griswold, Mark A; Wald, Lawrence L
2015-08-01
MR fingerprinting (MRF) is a technique for quantitative tissue mapping using pseudorandom measurements. To estimate tissue properties such as T1 , T2 , proton density, and B0 , the rapidly acquired data are compared against a large dictionary of Bloch simulations. This matching process can be a very computationally demanding portion of MRF reconstruction. We introduce a fast group matching algorithm (GRM) that exploits inherent correlation within MRF dictionaries to create highly clustered groupings of the elements. During matching, a group specific signature is first used to remove poor matching possibilities. Group principal component analysis (PCA) is used to evaluate all remaining tissue types. In vivo 3 Tesla brain data were used to validate the accuracy of our approach. For a trueFISP sequence with over 196,000 dictionary elements, 1000 MRF samples, and image matrix of 128 × 128, GRM was able to map MR parameters within 2s using standard vendor computational resources. This is an order of magnitude faster than global PCA and nearly two orders of magnitude faster than direct matching, with comparable accuracy (1-2% relative error). The proposed GRM method is a highly efficient model reduction technique for MRF matching and should enable clinically relevant reconstruction accuracy and time on standard vendor computational resources. © 2014 Wiley Periodicals, Inc.
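The sketch below gives a simplified picture of group-based dictionary matching: the dictionary is clustered, clusters whose mean ("signature") fingerprint correlates poorly with the measurement are discarded, and the exhaustive inner-product match is run only within the surviving clusters. It is a schematic of the idea using real-valued signals and k-means grouping, not the published GRM / group-PCA implementation.

```python
# Simplified sketch of group-based dictionary matching for MRF.  Not the published
# GRM / group-PCA implementation; real MRF fingerprints are complex-valued.
import numpy as np
from sklearn.cluster import KMeans

def build_groups(dictionary, n_groups=64):
    # Normalize atoms, cluster them, and store one normalized signature per group.
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    labels = KMeans(n_clusters=n_groups, n_init=4, random_state=0).fit_predict(d)
    signatures = np.stack([d[labels == g].mean(axis=0) for g in range(n_groups)])
    signatures /= np.linalg.norm(signatures, axis=1, keepdims=True)
    return d, labels, signatures

def match(signal, d, labels, signatures, keep=4):
    s = signal / np.linalg.norm(signal)
    best_groups = np.argsort(signatures @ s)[-keep:]          # keep most promising clusters
    candidates = np.flatnonzero(np.isin(labels, best_groups)) # prune the rest of the dictionary
    scores = d[candidates] @ s                                 # exhaustive match within survivors
    return candidates[int(np.argmax(scores))]                  # index of best dictionary atom
```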
Leite, Harlei Miguel de Arruda; de Carvalho, Sarah Negreiros; Costa, Thiago Bulhões da Silva; Attux, Romis; Hornung, Heiko Horst; Arantes, Dalton Soares
2018-01-01
This paper presents a systematic analysis of a game controlled by a Brain-Computer Interface (BCI) based on Steady-State Visually Evoked Potentials (SSVEP). The objective is to understand BCI systems from the Human-Computer Interface (HCI) point of view, by observing how the users interact with the game and evaluating how the interface elements influence the system performance. The interactions of 30 volunteers with our computer game, named "Get Coins," through a BCI based on SSVEP, have generated a database of brain signals and the corresponding responses to a questionnaire about various perceptual parameters, such as visual stimulation, acoustic feedback, background music, visual contrast, and visual fatigue. Each one of the volunteers played one match using the keyboard and four matches using the BCI, for comparison. In all matches using the BCI, the volunteers achieved the goals of the game. Eight of them achieved a perfect score in at least one of the four matches, showing the feasibility of the direct communication between the brain and the computer. Despite this successful experiment, adaptations and improvements should be implemented to make this innovative technology accessible to the end user.
Initial constructs for patient-centered outcome measures to evaluate brain-computer interfaces
Andresen, Elena M.; Fried-Oken, Melanie; Peters, Betts; Patrick, Donald L.
2016-01-01
Purpose The authors describe preliminary work toward the creation of patient-centered outcome (PCO) measures to evaluate brain-computer interface (BCI) as an assistive technology for individuals with severe speech and physical impairments (SSPI). Method In Phase 1, 591 items from 15 existing measures were mapped to the International Classification of Functioning, Disability and Health (ICF). In Phase 2, qualitative interviews were conducted with eight people with SSPI and seven caregivers. Resulting text data were coded in an iterative analysis. Results Most items (79%) mapped to the ICF environmental domain; over half (53%) mapped to more than one domain. The ICF framework was well suited for mapping items related to body functions and structures, but less so for items in other areas, including personal factors. Two constructs emerged from qualitative data: Quality of Life (QOL) and Assistive Technology. Component domains and themes were identified for each. Conclusions Preliminary constructs, domains, and themes were generated for future PCO measures relevant to BCI. Existing instruments are sufficient for initial items but do not adequately match the values of people with SSPI and their caregivers. Field methods for interviewing people with SSPI were successful, and support the inclusion of these individuals in PCO research. PMID:25806719
Cosmic Reionization on Computers: Properties of the Post-reionization IGM
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gnedin, Nickolay Y.; Becker, George D.; Fan, Xiaohui
Here, we present a comparison between several observational tests of the post-reionization IGM and the numerical simulations of reionization completed under the Cosmic Reionization On Computers (CROC) project. The CROC simulations match the gap distribution reasonably well, and also provide a good match for the distribution of peak heights, but there is a notable lack of wide peaks in the simulated spectra and the flux PDFs are poorly matched in the narrow redshift interval 5.5 < z < 5.7, with the match at other redshifts being significantly better, albeit not exact. Both discrepancies are related: simulations show more opacity than the data.
Cosmic Reionization on Computers: Properties of the Post-reionization IGM
Gnedin, Nickolay Y.; Becker, George D.; Fan, Xiaohui
2017-05-19
Here, we present a comparison between several observational tests of the post-reionization IGM and the numerical simulations of reionization completed under the Cosmic Reionization On Computers (CROC) project. The CROC simulations match the gap distribution reasonably well, and also provide a good match for the distribution of peak heights, but there is a notable lack of wide peaks in the simulated spectra and the flux PDFs are poorly matched in the narrow redshift interval 5.5 < z < 5.7, with the match at other redshifts being significantly better, albeit not exact. Both discrepancies are related: simulations show more opacity than the data.
Geradts, Z J; Bijhold, J; Hermsen, R; Murtagh, F
2001-06-01
Several systems exist on the market for collecting spent ammunition data for forensic investigation. These databases store images of cartridge cases and the marks on them. Image matching is used to create hit lists that show which marks on a cartridge case are most similar to those on another cartridge case. The research in this paper is focused on the different methods of feature selection and pattern recognition that can be used to optimize the results of image matching. The images are acquired with side light for the breech face marks and with ring light for the firing pin impression, and a standard way of digitizing the images is used. For both the side light and ring light images this means that the user has to position the cartridge case in the same position according to a protocol. The positioning is important for the side light images, since the image obtained of a striation mark depends heavily on the angle of incidence of the light; in practice, the user positions the cartridge case with roughly +/-10 degrees accuracy. We tested our algorithms using 49 cartridge cases from 19 different firearms, for which the examiner had determined which were shot with the same firearm. For testing, these images were mixed with a database of approximately 4900 images of different calibers that were available from the Drugfire database. In cases where the registration and the light conditions among the matching pairs were good, a simple computation of the standard deviation of the subtracted gray levels delivered the best-matched images. For images that were rotated and shifted, we implemented a "brute force" registration: the images are translated and rotated until the minimum of the standard deviation of the difference is found. This method did not place all relevant matches in the top position, because shadows and highlights are compared directly in intensity; since the angle of incidence of the light gives a different intensity profile, this method is not optimal. For this reason the images were preprocessed, and the third scale of the "à trous" wavelet transform gave the best results in combination with brute force, since matching the contents of the images is less sensitive to the variation of the lighting. The problem with the brute-force method, however, is that comparing the 49 cartridge cases among themselves takes over one month of computing time on a 333 MHz Pentium II computer. For this reason a faster approach was implemented: correlation in log-polar coordinates. This gave results similar to the brute-force calculation, but was computed within 24 h for the complete database of 4900 images. A fast pre-selection method based on signatures derived from the Kanade-Lucas-Tomasi (KLT) equation was also carried out, in which the positions of the points computed with this method are compared. In this way, 11 of the 49 images were placed in the top position in combination with the third scale of the à trous transform. Whether correct matches are found in the top-ranked position depends, however, on the light conditions and the prominence of the marks; all images were retrieved within the top 5% of the database. This method takes only a few minutes for the complete database, and can be optimized to perform comparisons in seconds if the locations of the points are stored in files.
For further improvement, it is useful to add a refinement in which the user selects the areas of the cartridge case that carry the relevant marks. This is necessary if the cartridge case is damaged and bears other marks that do not originate from the firearm.
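A minimal sketch of the log-polar correlation idea is given below: rotation (and scale) between two mark images become translations in log-polar space, which phase correlation recovers cheaply. It uses OpenCV's warpPolar and phaseCorrelate with hypothetical file names, assumes both images are the same size and roughly centred, and is an illustration of the technique rather than the authors' implementation.

```python
# Sketch of registering two mark images by correlation in log-polar coordinates:
# rotation and scale between the originals become translations in log-polar space.
# File names and sizes are hypothetical.
import cv2
import numpy as np

a = cv2.imread("mark_a.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)
b = cv2.imread("mark_b.png", cv2.IMREAD_GRAYSCALE).astype(np.float32)

h, w = a.shape
center = (w / 2.0, h / 2.0)
max_radius = min(center)
flags = cv2.WARP_POLAR_LOG + cv2.INTER_LINEAR

lp_a = cv2.warpPolar(a, (w, h), center, max_radius, flags)
lp_b = cv2.warpPolar(b, (w, h), center, max_radius, flags)

# A vertical shift between the log-polar images corresponds to a rotation of the
# originals; a horizontal shift corresponds to a change of scale.
(shift_x, shift_y), response = cv2.phaseCorrelate(lp_a, lp_b)
rotation_deg = 360.0 * shift_y / h
print(f"estimated rotation: {rotation_deg:.1f} deg (correlation response {response:.2f})")
```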
78 FR 73851 - Privacy Act of 1974; Notice of a Computer Matching Program
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-09
... system for the match: Compensation, Pension, Education, and Vocational Rehabilitation and Employment... spent time on active duty while a member of the Reserve Forces. F. Inclusive Dates of the Matching...
Visual salience metrics for image inpainting
NASA Astrophysics Data System (ADS)
Ardis, Paul A.; Singhal, Amit
2009-01-01
Quantitative metrics for successful image inpainting currently do not exist, with researchers instead relying upon qualitative human comparisons to evaluate their methodologies and techniques. In an attempt to rectify this situation, we propose two new metrics to capture the notions of noticeability and visual intent in order to evaluate inpainting results. The proposed metrics use a quantitative measure of visual salience based upon a computational model of human visual attention. We demonstrate how these two metrics repeatably correlate with qualitative opinion in a human observer study, correctly identify the optimum uses for exemplar-based inpainting (as specified in the original publication), and match qualitative opinion in published examples.
Inverse Tone Mapping Based upon Retina Response
Huo, Yongqing; Yang, Fan; Brost, Vincent
2014-01-01
The development of high dynamic range (HDR) display arouses the research of inverse tone mapping methods, which expand dynamic range of the low dynamic range (LDR) image to match that of HDR monitor. This paper proposed a novel physiological approach, which could avoid artifacts occurred in most existing algorithms. Inspired by the property of the human visual system (HVS), this dynamic range expansion scheme performs with a low computational complexity and a limited number of parameters and obtains high-quality HDR results. Comparisons with three recent algorithms in the literature also show that the proposed method reveals more important image details and produces less contrast loss and distortion. PMID:24744678
Public response to carpooling programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kidder, A.E.; Morgan, B.; Saltzman, A.
1976-08-01
This paper reviews the progress of several cities' campaigns to stimulate carpooling among workers, answering such questions as: (1) how many persons have sent in the information necessary to computer- or manually-match potential carpoolers; (2) of those who submitted the necessary information, how many were "matchable" by origin and destination; (3) of those matched, how many added to or reorganized existing carpools; (4) of what duration were the carpools, and how successful were the pools in consolidating work trips; (5) what variables are associated with high rates of carpool formation; and (6) in general, what is the public response to the various forms of carpool campaigns. The report also provides a detailed description of the several carpooling projects launched in smaller cities, particularly those of North Carolina. The Transportation Institute of North Carolina A and T State University worked cooperatively with 15 companies in Greensboro to assess corporate response to the move toward carpooling on a company-wide and on a city-wide basis.
Numerical Studies into Flow Profiles in Confined Lubricant
NASA Astrophysics Data System (ADS)
di Mare, Luca; Ponjavic, Aleks; Wong, Janet
2013-03-01
This paper documents a computational study of flow profiles in confined fluids. The study is motivated by experimental evidence for deviation from Couette flow found by one of the authors (JSW). The computational study examines several possible stress-strain relations. Since a linear profile is the only possible solution for a constant-stress layer, even in the presence of a power-law rheology, the study introduces a functional dependence of the fluid viscosity on the distance from the wall. Based on this dependence, a family of scaling laws for the velocity profile near the wall is derived which matches the measured profiles. The existence of this scaling law requires the viscosity of the fluid to increase at least linearly away from the wall. This behaviour is explained at a microscopic level by considerations on the mobility of long molecules near a wall, and is reminiscent of the variation of eddy length scales in near-wall turbulence.
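As a worked illustration of why a wall-distance-dependent viscosity breaks the linear Couette profile, consider a constant-stress layer with a viscosity that grows linearly away from the wall; the specific law below is illustrative and not taken from the paper.

```latex
% Illustrative derivation (not from the paper): constant shear stress \tau across a
% thin layer whose viscosity grows linearly with wall distance y.
\[
  \mu(y) = \mu_0\left(1 + \frac{y}{\lambda}\right), \qquad
  \tau = \mu(y)\,\frac{\mathrm{d}u}{\mathrm{d}y} = \text{const}
\]
\[
  \frac{\mathrm{d}u}{\mathrm{d}y} = \frac{\tau}{\mu_0\,(1 + y/\lambda)}
  \quad\Longrightarrow\quad
  u(y) = \frac{\tau\,\lambda}{\mu_0}\,\ln\!\left(1 + \frac{y}{\lambda}\right),
\]
% i.e. a logarithmic, rather than linear (Couette), velocity profile near the wall.
```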
Improved Force Fields for Peptide Nucleic Acids with Optimized Backbone Torsion Parameters.
Jasiński, Maciej; Feig, Michael; Trylska, Joanna
2018-06-06
Peptide nucleic acids are promising nucleic acid analogs for antisense therapies as they can form stable duplex and triplex structures with DNA and RNA. Computational studies of PNA-containing duplexes and triplexes are an important component for guiding their design, yet existing force fields have not been well validated and parametrized with modern computational capabilities. We present updated CHARMM and Amber force fields for PNA that greatly improve the stability of simulated PNA-containing duplexes and triplexes in comparison with experimental structures and allow such systems to be studied on microsecond time scales. The force field modifications focus on reparametrized PNA backbone torsion angles to match high-level quantum mechanics reference energies for a model compound. The microsecond simulations of PNA-PNA, PNA-DNA, PNA-RNA, and PNA-DNA-PNA complexes also allowed a comprehensive analysis of hydration and ion interactions with such systems.
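Torsion reparametrization of the kind described above is typically a least-squares fit of a periodic cosine series to a quantum-mechanical dihedral scan; the sketch below shows such a fit on a synthetic scan (the reference energies and the choice of three periodicities are hypothetical, not the paper's data).

```python
# Sketch of fitting a periodic torsion term,
#   E(phi) = sum_n k_n * (1 + cos(n*phi - delta_n)),
# to quantum-mechanical reference energies for a model compound.  The reference
# scan and the choice of three periodicities are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

phi = np.deg2rad(np.arange(0, 360, 15))                          # scanned dihedral angles
qm_energy = 1.2 * (1 + np.cos(phi)) + 0.4 * (1 + np.cos(2 * phi - 0.5))  # stand-in QM scan

def torsion(phi, k1, d1, k2, d2, k3, d3):
    return (k1 * (1 + np.cos(phi - d1)) +
            k2 * (1 + np.cos(2 * phi - d2)) +
            k3 * (1 + np.cos(3 * phi - d3)))

params, _ = curve_fit(torsion, phi, qm_energy, p0=[1, 0, 0.5, 0, 0.1, 0])
print("fitted (k, delta) pairs:", np.round(params, 3))
```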
Database crime to crime match rate calculation.
Buckleton, John; Bright, Jo-Anne; Walsh, Simon J
2009-06-01
Guidance exists on how to count matches between samples in a crime sample database but we are unable to locate a definition of how to estimate a match rate. We propose a method that does not proceed from the match counting definition but which has a strong logic.
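For contrast with the counting definition the note starts from, the sketch below computes the naive per-pair match rate (observed matches divided by the number of distinct profile pairs compared); the counts are hypothetical and this is not the estimator the authors propose.

```python
# Minimal sketch of the naive counting estimate: observed crime-to-crime matches
# divided by the number of distinct pairs compared.  Counts are hypothetical; the
# paper's proposed estimator differs.
from math import comb

n_profiles = 15000          # crime-sample profiles in the database (hypothetical)
n_matches = 27              # observed crime-to-crime matches (hypothetical)

pairs_compared = comb(n_profiles, 2)
match_rate = n_matches / pairs_compared
print(f"{n_matches} matches over {pairs_compared} pairs -> rate {match_rate:.2e} per pair")
```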
SAM: The "Search and Match" Computer Program of the Escherichia coli Genetic Stock Center
ERIC Educational Resources Information Center
Bachmann, B. J.; And Others
1973-01-01
Describes a computer program used at a genetic stock center to locate particular strains of bacteria. The program can match up to 30 strain descriptions requested by a researcher with the records on file. Uses of this particular program can be made in many fields. (PS)
A Computer-Based Program to Teach Braille Reading to Sighted Individuals
ERIC Educational Resources Information Center
Scheithauer, Mindy C.; Tiger, Jeffrey H.
2012-01-01
Instructors of the visually impaired need efficient braille-training methods. This study conducted a preliminary evaluation of a computer-based program intended to teach the relation between braille characters and English letters using a matching-to-sample format with 4 sighted college students. Each participant mastered matching visual depictions…
24 CFR 5.234 - Requests for information from SWICAs and Federal agencies; restrictions on use.
Code of Federal Regulations, 2013 CFR
2013-04-01
...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers... obtained through computer matching agreements between HUD and a SWICA or Federal agency, or between a PHA... Privacy Act notice is required, as follows: (1) When HUD requests the computer match, the processing...
24 CFR 5.234 - Requests for information from SWICAs and Federal agencies; restrictions on use.
Code of Federal Regulations, 2010 CFR
2010-04-01
...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers... obtained through computer matching agreements between HUD and a SWICA or Federal agency, or between a PHA... Privacy Act notice is required, as follows: (1) When HUD requests the computer match, the processing...
24 CFR 5.234 - Requests for information from SWICAs and Federal agencies; restrictions on use.
Code of Federal Regulations, 2011 CFR
2011-04-01
...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers... obtained through computer matching agreements between HUD and a SWICA or Federal agency, or between a PHA... Privacy Act notice is required, as follows: (1) When HUD requests the computer match, the processing...
24 CFR 5.234 - Requests for information from SWICAs and Federal agencies; restrictions on use.
Code of Federal Regulations, 2012 CFR
2012-04-01
...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers... obtained through computer matching agreements between HUD and a SWICA or Federal agency, or between a PHA... Privacy Act notice is required, as follows: (1) When HUD requests the computer match, the processing...
24 CFR 5.234 - Requests for information from SWICAs and Federal agencies; restrictions on use.
Code of Federal Regulations, 2014 CFR
2014-04-01
...; WAIVERS Disclosure and Verification of Social Security Numbers and Employer Identification Numbers... obtained through computer matching agreements between HUD and a SWICA or Federal agency, or between a PHA... Privacy Act notice is required, as follows: (1) When HUD requests the computer match, the processing...
45 CFR 205.56 - Requirements governing the use of income and eligibility information.
Code of Federal Regulations, 2011 CFR
2011-10-01
...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...
45 CFR 205.56 - Requirements governing the use of income and eligibility information.
Code of Federal Regulations, 2014 CFR
2014-10-01
...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...
45 CFR 205.56 - Requirements governing the use of income and eligibility information.
Code of Federal Regulations, 2013 CFR
2013-10-01
...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...
45 CFR 205.56 - Requirements governing the use of income and eligibility information.
Code of Federal Regulations, 2010 CFR
2010-10-01
...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...
45 CFR 205.56 - Requirements governing the use of income and eligibility information.
Code of Federal Regulations, 2012 CFR
2012-10-01
...) of the Social Security Act must provide that: (a) The State agency will use the information obtained... received from the Internal Revenue Service, and earnings information received from the Social Security... Federal computer matching program that is subject to the requirements in the Computer Matching and Privacy...
Computer-automated tinnitus assessment: noise-band matching, maskability, and residual inhibition.
Henry, James A; Roberts, Larry E; Ellingson, Roger M; Thielman, Emily J
2013-06-01
Psychoacoustic measures of tinnitus typically include loudness and pitch match, minimum masking level (MML), and residual inhibition (RI). We previously developed and documented a computer-automated tinnitus evaluation system (TES) capable of subject-guided loudness and pitch matching. The TES was further developed to conduct computer-aided, subject-guided testing for noise-band matching (NBM), MML, and RI. The purpose of the present study was to document the capability of the upgraded TES to obtain measures of NBM, MML, and RI, and to determine the test-retest reliability of the responses obtained. Three subject-guided, computer-automated testing protocols were developed to conduct NBM. For MML and RI testing, a 2-12 kHz band of noise was used. All testing was repeated during a second session. Subjects meeting study criteria were selected from those who had previously been tested for loudness and pitch matching in our laboratory. A total of 21 subjects completed testing, including seven females and 14 males. The upgraded TES was found to be fairly time efficient. Subjects were generally reliable, both within and between sessions, with respect to the type of stimulus they chose as the best match to their tinnitus. Matching to bandwidth was more variable between measurements, with greater consistency seen for subjects reporting tonal tinnitus or wide-band noisy tinnitus than intermediate types. Between-session repeated MMLs were within 10 dB of each other for all but three of the subjects. Subjects who experienced RI during Session 1 tended to be those who experienced it during Session 2. This study may represent the first time that NBM, MML, and RI audiometric testing results have been obtained entirely through a self-contained, computer-automated system designed specifically for use in the clinic. Future plans include refinements to achieve greater testing efficiency. American Academy of Audiology.
Gun bore flaw image matching based on improved SIFT descriptor
NASA Astrophysics Data System (ADS)
Zeng, Luan; Xiong, Wei; Zhai, You
2013-01-01
In order to increase the operation speed and matching ability of the SIFT algorithm, the SIFT descriptor and matching strategy are improved. First, a method of constructing the feature descriptor based on sector areas is proposed. By computing gradient histograms over location bins partitioned into 6 sector areas, a descriptor with 48 dimensions is constructed. This reduces the dimension of the feature vector and decreases the complexity of building the descriptor. Second, a strategy is introduced that partitions the circular region into 6 identical sector areas starting from the dominant orientation. Consequently, the computational complexity is reduced because the rotation operation for the area is no longer needed. The experimental results indicate that, compared with the OpenCV SIFT implementation, the average matching speed of the new method increases by about 55.86%. The matching accuracy can be increased even under some variation of viewpoint, illumination, rotation, scale and defocus. The new method achieved satisfactory results in gun bore flaw image matching. Keywords: Metrology, Flaw image matching, Gun bore, Feature descriptor
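As a rough illustration of the sector-area idea described above, the sketch below builds a 48-dimensional descriptor from gradient-orientation histograms accumulated over 6 angular sectors of a patch (6 sectors x 8 orientation bins). It is a hedged approximation, not the authors' implementation: the patch, bin counts and normalization are assumptions, and the alignment of the sectors to the dominant orientation that gives the method its rotation invariance is omitted.

```python
# Hypothetical sketch of a sector-area descriptor (6 sectors x 8 orientation bins
# = 48 dimensions), in the spirit of the paper but not the authors' exact code.
import numpy as np

def sector_descriptor(patch, n_sectors=6, n_orient=8):
    """Gradient-orientation histograms over angular sectors of a square patch."""
    gy, gx = np.gradient(patch.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx)                      # gradient orientation in [-pi, pi]
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    theta = np.arctan2(yy - cy, xx - cx)          # angular position of each pixel
    # note: sectors are NOT aligned to a dominant orientation in this sketch
    sector = ((theta + np.pi) / (2 * np.pi) * n_sectors).astype(int) % n_sectors
    obin = ((ang + np.pi) / (2 * np.pi) * n_orient).astype(int) % n_orient
    desc = np.zeros((n_sectors, n_orient))
    np.add.at(desc, (sector.ravel(), obin.ravel()), mag.ravel())
    desc = desc.ravel()
    norm = np.linalg.norm(desc)
    return desc / norm if norm > 0 else desc      # 48-dimensional vector

patch = np.random.rand(32, 32)
print(sector_descriptor(patch).shape)             # (48,)
```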
On Parallel Push-Relabel based Algorithms for Bipartite Maximum Matching
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langguth, Johannes; Azad, Md Ariful; Halappanavar, Mahantesh
2014-07-01
We study multithreaded push-relabel based algorithms for computing maximum cardinality matching in bipartite graphs. Matching is a fundamental combinatorial (graph) problem with applications in a wide variety of problems in science and engineering. We are motivated by its use in the context of sparse linear solvers for computing maximum transversal of a matrix. We implement and test our algorithms on several multi-socket multicore systems and compare their performance to state-of-the-art augmenting path-based serial and parallel algorithms using a testset comprised of a wide range of real-world instances. Building on several heuristics for enhancing performance, we demonstrate good scaling for the parallel push-relabel algorithm. We show that it is comparable to the best augmenting path-based algorithms for bipartite matching. To the best of our knowledge, this is the first extensive study of multithreaded push-relabel based algorithms. In addition to a direct impact on the applications using matching, the proposed algorithmic techniques can be extended to preflow-push based algorithms for computing maximum flow in graphs.
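For readers unfamiliar with the baseline the paper compares against, here is a minimal serial sketch of augmenting-path bipartite maximum-cardinality matching (Kuhn's algorithm). It is not the paper's parallel push-relabel code; the adjacency structure and function names are illustrative only.

```python
# Minimal serial sketch of the augmenting-path baseline (Kuhn's algorithm) for
# maximum cardinality bipartite matching; the parallel push-relabel algorithms
# studied in the paper are not reproduced here.
def max_bipartite_matching(adj, n_left, n_right):
    """adj[u] lists right-side vertices reachable from left vertex u."""
    match_right = [-1] * n_right        # match_right[v] = left vertex matched to v

    def try_augment(u, visited):
        for v in adj[u]:
            if not visited[v]:
                visited[v] = True
                if match_right[v] == -1 or try_augment(match_right[v], visited):
                    match_right[v] = u
                    return True
        return False

    matched = 0
    for u in range(n_left):
        if try_augment(u, [False] * n_right):
            matched += 1
    return matched, match_right

adj = {0: [0, 1], 1: [0], 2: [1, 2]}
print(max_bipartite_matching(adj, 3, 3))   # (3, [1, 0, 2]) -- a maximum matching
```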
A Parallel Point Matching Algorithm for Landmark Based Image Registration Using Multicore Platform
Yang, Lin; Gong, Leiguang; Zhang, Hong; Nosher, John L.; Foran, David J.
2013-01-01
Point matching is crucial for many computer vision applications. Establishing the correspondence between a large number of data points is a computationally intensive process. Some point matching related applications, such as medical image registration, require real time or near real time performance if applied to critical clinical applications like image assisted surgery. In this paper, we report a new multicore platform based parallel algorithm for fast point matching in the context of landmark based medical image registration. We introduced a non-regular data partition algorithm which utilizes the K-means clustering algorithm to group the landmarks based on the number of available processing cores, which optimizes memory usage and data transfer. We have tested our method using the IBM Cell Broadband Engine (Cell/B.E.) platform. The results demonstrated a significant speed up over its sequential implementation. The proposed data partition and parallelization algorithm, though tested only on one multicore platform, is generic by its design. Therefore, the parallel algorithm can be extended to other computing platforms, as well as other point matching related applications. PMID:24308014
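The sketch below illustrates the non-regular data partition idea in a hedged way: landmarks are clustered with K-means into one group per available core and each group is matched independently. A thread pool and a k-d tree stand in for the Cell/B.E. cores and the matching kernel of the paper; all data and parameters are assumptions.

```python
# Illustrative sketch of the K-means data partition idea: group landmarks into
# one cluster per available core, then match each group independently.
# (A thread pool and nearest-neighbour lookup stand in for the paper's
# Cell/B.E. cores and matching kernel.)
import numpy as np
from sklearn.cluster import KMeans
from scipy.spatial import cKDTree
from concurrent.futures import ThreadPoolExecutor

n_cores = 4
landmarks = np.random.rand(1000, 2)          # points in the fixed image
candidates = np.random.rand(1000, 2)         # points in the moving image
labels = KMeans(n_clusters=n_cores, n_init=10).fit_predict(landmarks)
tree = cKDTree(candidates)

def match_group(group_idx):
    pts = landmarks[labels == group_idx]
    dist, nearest = tree.query(pts)          # nearest-neighbour correspondence
    return group_idx, nearest

with ThreadPoolExecutor(max_workers=n_cores) as pool:
    results = dict(pool.map(match_group, range(n_cores)))
print({g: idx.shape for g, idx in results.items()})
```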
NASA Astrophysics Data System (ADS)
Yllanes, David
2013-03-01
Spin glasses are a longstanding model for the sluggish dynamics that appears at the glass transition. They enjoy a privileged status in this context, as they provide the simplest model system both for theoretical and experimental studies of glassy dynamics. However, in spite of forty years of intensive investigation, spin glasses still pose a formidable challenge to theoretical, computational and experimental physics. The main difficulty lies in their incredibly slow dynamics. A recent breakthrough has been made possible by our custom-built computer, Janus, designed and built in a collaboration formed by five universities in Spain and Italy. By employing a purpose-driven architecture, capable of fully exploiting the parallelization possibilities intrinsic to these simulations, Janus outperforms conventional computers by several orders of magnitude. After a brief introduction to spin glasses, the talk will focus on the new physics unearthed by Janus. In particular, we recall our numerical study of the nonequilibrium dynamics of the Edwards-Anderson Ising Spin Glass, for a time that spans eleven orders of magnitude, thus approaching the experimentally relevant scale (i.e. seconds). We have also studied the equilibrium properties of the spin-glass phase, with an emphasis on the quantitative matching between non-equilibrium and equilibrium correlation functions, through a time-length dictionary. Last but not least, we have clarified the existence of a glass transition in the presence of a magnetic field for a finite-range spin glass (the so-called de Almeida-Thouless line). We will finally mention some of the currently ongoing work of the collaboration, such as the characterization of the non-equilibrium dynamics in a magnetic field and the existence of a statics-dynamics dictionary in these conditions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dr Marvin Adams
2002-03-01
OAK 270 - The DOE Matching Grant Program provided $50,000.00 to the Dept of N.E. at TAMU, matching a gift of $50,000.00 from TXU Electric. The $100,000.00 total was spent on scholarships, departmental labs, and computing network.
Does Matching Quality Matter in Mode Comparison Studies?
ERIC Educational Resources Information Center
Zeng, Ji; Yin, Ping; Shedden, Kerby A.
2015-01-01
This article provides a brief overview and comparison of three matching approaches in forming comparable groups for a study comparing test administration modes (i.e., computer-based tests [CBT] and paper-and-pencil tests [PPT]): (a) a propensity score matching approach proposed in this article, (b) the propensity score matching approach used by…
Multisensory Information Boosts Numerical Matching Abilities in Young Children
ERIC Educational Resources Information Center
Jordan, Kerry E.; Baker, Joseph
2011-01-01
This study presents the first evidence that preschool children perform more accurately in a numerical matching task when given multisensory rather than unisensory information about number. Three- to 5-year-old children learned to play a numerical matching game on a touchscreen computer, which asked them to match a sample numerosity with a…
Design of a fault tolerant airborne digital computer. Volume 1: Architecture
NASA Technical Reports Server (NTRS)
Wensley, J. H.; Levitt, K. N.; Green, M. W.; Goldberg, J.; Neumann, P. G.
1973-01-01
This volume is concerned with the architecture of a fault tolerant digital computer for an advanced commercial aircraft. All of the computations of the aircraft, including those presently carried out by analogue techniques, are to be carried out in this digital computer. Among the important qualities of the computer are the following: (1) The capacity is to be matched to the aircraft environment. (2) The reliability is to be selectively matched to the criticality and deadline requirements of each of the computations. (3) The system is to be readily expandable and contractible. (4) The design is to be appropriate to post-1975 technology. Three candidate architectures are discussed and assessed in terms of the above qualities. Of the three candidates, a newly conceived architecture, Software Implemented Fault Tolerance (SIFT), provides the best match to the above qualities. In addition, SIFT is particularly simple and believable. The other candidates, the Bus Checker System (BUCS), also newly conceived in this project, and the Hopkins multiprocessor, are potentially more efficient than SIFT in the use of redundancy, but otherwise are not as attractive.
Some Problems and Solutions in Transferring Ecosystem Simulation Codes to Supercomputers
NASA Technical Reports Server (NTRS)
Skiles, J. W.; Schulbach, C. H.
1994-01-01
Many computer codes for the simulation of ecological systems have been developed in the last twenty-five years. This development took place initially on main-frame computers, then mini-computers, and more recently, on micro-computers and workstations. Recent recognition of ecosystem science as a High Performance Computing and Communications Program Grand Challenge area emphasizes supercomputers (both parallel and distributed systems) as the next set of tools for ecological simulation. Transferring ecosystem simulation codes to such systems is not a matter of simply compiling and executing existing code on the supercomputer since there are significant differences in the system architectures of sequential, scalar computers and parallel and/or vector supercomputers. To more appropriately match the application to the architecture (necessary to achieve reasonable performance), the parallelism (if it exists) of the original application must be exploited. We discuss our work in transferring a general grassland simulation model (developed on a VAX in the FORTRAN computer programming language) to a Cray Y-MP. We show the Cray shared-memory vector-architecture, and discuss our rationale for selecting the Cray. We describe porting the model to the Cray and executing and verifying a baseline version, and we discuss the changes we made to exploit the parallelism in the application and to improve code execution. As a result, the Cray executed the model 30 times faster than the VAX 11/785 and 10 times faster than a Sun 4 workstation. We achieved an additional speed-up of approximately 30 percent over the original Cray run by using the compiler's vectorizing capabilities and the machine's ability to put subroutines and functions "in-line" in the code. With the modifications, the code still runs at only about 5% of the Cray's peak speed because it makes ineffective use of the vector processing capabilities of the Cray. We conclude with a discussion and future plans.
The Use of Computer-Generated Fading Materials to Teach Visual-Visual Non-Identity Matching Tasks
ERIC Educational Resources Information Center
Murphy, Colleen; Figueroa, Maria; Martin, Garry L.; Yu, C. T.; Figueroa, Josue
2008-01-01
Many everyday matching tasks taught to persons with developmental disabilities are visual-visual non-identity matching (VVNM) tasks, such as matching the printed word DOG to a picture of a dog, or matching a sock to a shoe. Research has shown that, for participants who have failed a VVNM prototype task, it is very difficult to teach them various…
NASA Technical Reports Server (NTRS)
Choudhary, Alok Nidhi; Leung, Mun K.; Huang, Thomas S.; Patel, Janak H.
1989-01-01
Computer vision systems employ a sequence of vision algorithms in which the output of an algorithm is the input of the next algorithm in the sequence. Algorithms that constitute such systems exhibit vastly different computational characteristics, and therefore, require different data decomposition techniques and efficient load balancing techniques for parallel implementation. However, since the input data for a task is produced as the output data of the previous task, this information can be exploited to perform knowledge based data decomposition and load balancing. Presented here are algorithms for a motion estimation system. The motion estimation is based on the point correspondence between the involved images which are a sequence of stereo image pairs. Researchers propose algorithms to obtain point correspondences by matching feature points among stereo image pairs at any two consecutive time instants. Furthermore, the proposed algorithms employ non-iterative procedures, which results in saving considerable amounts of computation time. The system consists of the following steps: (1) extraction of features; (2) stereo match of images in one time instant; (3) time match of images from consecutive time instants; (4) stereo match to compute final unambiguous points; and (5) computation of motion parameters.
Cleary, Anne M; Ryals, Anthony J; Wagner, Samantha R
2016-01-01
Research suggests that a feature-matching process underlies cue familiarity-detection when cued recall with graphemic cues fails. When a test cue (e.g., potchbork) overlaps in graphemic features with multiple unrecalled studied items (e.g., patchwork, pitchfork, pocketbook, pullcork), higher cue familiarity ratings are given during recall failure of all of the targets than when the cue overlaps in graphemic features with only one studied target and that target fails to be recalled (e.g., patchwork). The present study used semantic feature production norms (McRae et al., Behavior Research Methods, Instruments, & Computers, 37, 547-559, 2005) to examine whether the same holds true when the cues are semantic in nature (e.g., jaguar is used to cue cheetah). Indeed, test cues (e.g., cedar) that overlapped in semantic features (e.g., a_tree, has_bark, etc.) with four unretrieved studied items (e.g., birch, oak, pine, willow) received higher cue familiarity ratings during recall failure than test cues that overlapped in semantic features with only two (also unretrieved) studied items (e.g., birch, oak), which in turn received higher familiarity ratings during recall failure than cues that did not overlap in semantic features with any studied items. These findings suggest that the feature-matching theory of recognition during recall failure can accommodate recognition of semantic cues during recall failure, providing a potential mechanism for conceptually-based forms of cue recognition during target retrieval failure. They also provide converging evidence for the existence of the semantic features envisaged in feature-based models of semantic knowledge representation and for those more concretely specified by the production norms of McRae et al. (Behavior Research Methods, Instruments, & Computers, 37, 547-559, 2005).
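A toy sketch of the feature-matching account is given below: the summed feature overlap between a test cue and the studied items is larger when the cue shares features with four studied items than with two, and zero when it shares none. The feature sets are invented stand-ins, not the McRae et al. production norms.

```python
# Toy sketch of the feature-matching idea: count how many semantic features a
# test cue shares with the studied items. The feature lists are invented
# stand-ins for the McRae et al. production norms.
features = {
    "cedar":  {"a_tree", "has_bark", "has_needles", "smells_good"},
    "birch":  {"a_tree", "has_bark", "has_white_bark"},
    "oak":    {"a_tree", "has_bark", "has_acorns"},
    "pine":   {"a_tree", "has_bark", "has_needles"},
    "willow": {"a_tree", "has_bark", "droops"},
    "jaguar": {"an_animal", "is_fast", "has_spots"},
}

def feature_overlap(cue, studied_items):
    """Summed feature overlap between the cue and all studied items."""
    cue_feats = features[cue]
    return sum(len(cue_feats & features[item]) for item in studied_items)

print(feature_overlap("cedar", ["birch", "oak", "pine", "willow"]))  # high overlap
print(feature_overlap("cedar", ["birch", "oak"]))                    # lower overlap
print(feature_overlap("cedar", ["jaguar"]))                          # no overlap
```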
22 CFR 1101.4 - Reports on new systems of records; computer matching programs.
Code of Federal Regulations, 2010 CFR
2010-04-01
... 22 Foreign Relations 2 2010-04-01 2010-04-01 true Reports on new systems of records; computer matching programs. 1101.4 Section 1101.4 Foreign Relations INTERNATIONAL BOUNDARY AND WATER COMMISSION, UNITED STATES AND MEXICO, UNITED STATES SECTION PRIVACY ACT OF 1974 § 1101.4 Reports on new systems of...
Developing a multimodal biometric authentication system using soft computing methods.
Malcangi, Mario
2015-01-01
Robust personal authentication is becoming ever more important in computer-based applications. Among a variety of methods, biometric offers several advantages, mainly in embedded system applications. Hard and soft multi-biometric, combined with hard and soft computing methods, can be applied to improve the personal authentication process and to generalize the applicability. This chapter describes the embedded implementation of a multi-biometric (voiceprint and fingerprint) multimodal identification system based on hard computing methods (DSP) for feature extraction and matching, an artificial neural network (ANN) for soft feature pattern matching, and a fuzzy logic engine (FLE) for data fusion and decision.
Algorithms for computing the geopotential using a simple density layer
NASA Technical Reports Server (NTRS)
Morrison, F.
1976-01-01
Several algorithms have been developed for computing the potential and attraction of a simple density layer. These are numerical cubature, Taylor series, and a mixed analytic and numerical integration using a singularity-matching technique. A computer program has been written to combine these techniques for computing the disturbing acceleration on an artificial earth satellite. A total of 1640 equal-area, constant surface density blocks on an oblate spheroid are used. The singularity-matching algorithm is used in the subsatellite region, Taylor series in the surrounding zone, and numerical cubature on the rest of the earth.
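A hedged sketch of the numerical-cubature branch alone is shown below: the potential at a field point is approximated as U ≈ G Σ σ_i A_i / r_i over surface blocks. A sphere with a crude equal-angle grid stands in for the paper's 1640 equal-area blocks on an oblate spheroid, and the Taylor-series and singularity-matching branches near the subsatellite point are not reproduced.

```python
# Hedged sketch of the numerical-cubature branch only: the potential at a field
# point from constant-density surface blocks, U ~ G * sum(sigma_i * A_i / r_i).
# A sphere stands in for the paper's oblate spheroid; the Taylor-series and
# singularity-matching algorithms are not shown.
import numpy as np

G = 6.674e-11                    # gravitational constant, m^3 kg^-1 s^-2
R = 6.371e6                      # mean Earth radius, m

# crude equal-angle grid of surface blocks (not truly equal-area)
lats = np.linspace(-85, 85, 35) * np.pi / 180
lons = np.linspace(0, 350, 36) * np.pi / 180
lat_g, lon_g = np.meshgrid(lats, lons, indexing="ij")
dlat, dlon = np.diff(lats)[0], np.diff(lons)[0]
area = R**2 * np.cos(lat_g) * dlat * dlon          # block areas
sigma = np.full_like(area, 1.0)                    # assumed surface density, kg/m^2

blocks = R * np.stack([np.cos(lat_g) * np.cos(lon_g),
                       np.cos(lat_g) * np.sin(lon_g),
                       np.sin(lat_g)], axis=-1)

def layer_potential(field_point):
    r = np.linalg.norm(blocks - field_point, axis=-1)
    return G * np.sum(sigma * area / r)

satellite = np.array([R + 500e3, 0.0, 0.0])        # field point at 500 km altitude
print(layer_potential(satellite))
```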
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-06
..., Education, and Vocational Rehabilitation and Employment Records-VA'' (58VA21/22/28), published at 74 FR.... Inclusive Dates of the Matching Program The effective date of this matching program is October 2, 2012...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-22
..., Education, and Vocational Rehabilitation and Employment Records-VA (58 VA 21/22/28),'' first published at 74.... Inclusive Dates of the Matching Program The matching program will become effective no sooner than 40 days...
Cumming, Bruce G.
2016-01-01
In order to extract retinal disparity from a visual scene, the brain must match corresponding points in the left and right retinae. This computationally demanding task is known as the stereo correspondence problem. The initial stage of the solution to the correspondence problem is generally thought to consist of a correlation-based computation. However, recent work by Doi et al suggests that human observers can see depth in a class of stimuli where the mean binocular correlation is 0 (half-matched random dot stereograms). Half-matched random dot stereograms are made up of an equal number of correlated and anticorrelated dots, and the binocular energy model—a well-known model of V1 binocular complex cells—fails to signal disparity here. This has led to the proposition that a second, match-based computation must be extracting disparity in these stimuli. Here we show that a straightforward modification to the binocular energy model—adding a point output nonlinearity—is by itself sufficient to produce cells that are disparity-tuned to half-matched random dot stereograms. We then show that a simple decision model using this single mechanism can reproduce psychometric functions generated by human observers, including reduced performance to large disparities and rapidly updating dot patterns. The model makes predictions about how performance should change with dot size in half-matched stereograms and temporal alternation in correlation, which we test in human observers. We conclude that a single correlation-based computation, based directly on already-known properties of V1 neurons, can account for the literature on mixed correlation random dot stereograms. PMID:27196696
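To make the proposed modification concrete, the toy sketch below implements a one-dimensional binocular energy unit (a quadrature pair of Gabor filters applied to left and right inputs) followed by an expansive point output nonlinearity. The filter parameters and stimuli are invented; this is an illustration of the mechanism, not the authors' fitted model or decision stage.

```python
# Toy sketch of a one-dimensional binocular energy unit with an added expansive
# point output nonlinearity (the modification discussed in the abstract).
import numpy as np

x = np.linspace(-2, 2, 401)
sigma, freq = 0.5, 2.0
gabor_even = np.exp(-x**2 / (2 * sigma**2)) * np.cos(2 * np.pi * freq * x)
gabor_odd  = np.exp(-x**2 / (2 * sigma**2)) * np.sin(2 * np.pi * freq * x)

def energy_response(left, right, expansion=2.0):
    """Binocular energy followed by a power-law output nonlinearity."""
    e = (left @ gabor_even + right @ gabor_even) ** 2 \
      + (left @ gabor_odd  + right @ gabor_odd) ** 2
    return e ** expansion          # expansion=1 recovers the standard energy model

rng = np.random.default_rng(0)
left = rng.standard_normal(x.size)
print(energy_response(left, left),                         # binocularly matched input
      energy_response(left, rng.standard_normal(x.size)))  # uncorrelated input
```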
Robust stereo matching with trinary cross color census and triple image-based refinements
NASA Astrophysics Data System (ADS)
Chang, Ting-An; Lu, Xiao; Yang, Jar-Ferr
2017-12-01
For future 3D TV broadcasting systems and navigation applications, it is necessary to have accurate stereo matching that can precisely estimate the depth map from two spatially separated cameras. In this paper, we first suggest a trinary cross color (TCC) census transform, which helps to achieve an accurate disparity raw matching cost with low computational cost. A two-pass cost aggregation (TPCA) is formed to compute the aggregation cost, then the disparity map can be obtained by a range winner-take-all (RWTA) process and a white hole filling procedure. To further enhance the accuracy performance, a range left-right checking (RLRC) method is proposed to classify the results as correct, mismatched, or occluded pixels. Then, image-based refinements for the mismatched and occluded pixels are proposed to refine the classified errors. Finally, image-based cross voting and a median filter are employed to complete the fine depth estimation. Experimental results show that the proposed semi-global stereo matching system achieves considerably accurate disparity maps with reasonable computation cost.
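As a baseline illustration of the census-based raw cost stage, the sketch below computes a standard grayscale census transform and a per-pixel Hamming matching cost. The trinary cross color census, aggregation and refinement stages of the paper are not reproduced; the window size, disparity range and test images are assumptions.

```python
# Baseline sketch: standard grayscale census transform and Hamming matching
# cost, illustrating the raw-cost stage only; the paper's trinary cross color
# (TCC) census and later aggregation/refinement stages are not reproduced.
import numpy as np

def census_transform(img, win=5):
    """Encode each pixel as a bit string of comparisons with its neighbours (wrapping at borders)."""
    h, w = img.shape
    r = win // 2
    codes = np.zeros((h, w), dtype=np.uint64)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            codes = (codes << np.uint64(1)) | (shifted < img).astype(np.uint64)
    return codes

def hamming_cost(left_codes, right_codes, d):
    """Per-pixel Hamming distance for a candidate disparity d."""
    xor = left_codes ^ np.roll(right_codes, d, axis=1)
    return np.unpackbits(xor.view(np.uint8), axis=-1).reshape(*xor.shape, -1).sum(-1)

left = np.random.randint(0, 255, (48, 64)).astype(np.uint8)
right = np.roll(left, -3, axis=1)                # synthetic 3-pixel disparity
cl, cr = census_transform(left), census_transform(right)
costs = np.stack([hamming_cost(cl, cr, d) for d in range(8)])
print(costs.mean(axis=(1, 2)).argmin())          # should be close to 3
```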
Moving health promotion communities online: a review of the literature.
Sunderland, Naomi; Beekhuyzen, Jenine; Kendall, Elizabeth; Wolski, Malcom
There is a need to enhance the effectiveness and reach of complex health promotion initiatives by providing opportunities for diverse health promotion practitioners and others to interact in online settings. This paper reviews the existing literature on how to take health promotion communities and networks into online settings. A scoping review of relevant bodies of literature and empirical evidence was undertaken to provide an interpretive synthesis of existing knowledge on the topic. Sixteen studies were identified between 1986 and 2007. Relatively little research has been conducted on the process of taking existing offline communities and networks into online settings. However, more research has focused on offline (i.e. not mediated via computer networks); 'virtual' (purely online with no offline interpersonal contact); and 'multiplex' communities (i.e. those that interact across both online and offline settings). Results are summarised under three themes: characteristics of communities in online and offline settings; issues in moving offline communities online, and designing online communities to match community needs. Existing health promotion initiatives can benefit from online platforms that promote community building and knowledge sharing. Online e-health promotion settings and communities can successfully integrate with existing offline settings and communities to form 'multiplex' communities (i.e. communities that operate fluently across both online and offline settings).
Available pressure amplitude of linear compressor based on phasor triangle model
NASA Astrophysics Data System (ADS)
Duan, C. X.; Jiang, X.; Zhi, X. Q.; You, X. K.; Qiu, L. M.
2017-12-01
The linear compressor for cryocoolers possesses the advantages of long-life operation, high efficiency, low vibration and compact structure. It is significant to study the match mechanisms between the compressor and the cold finger, which determine the working efficiency of the cryocooler. However, the output characteristics of the linear compressor are complicated since they are affected by many interacting parameters. The existing matching methods are simplified and mainly focus on the compressor efficiency and output acoustic power, while neglecting the important output parameter of pressure amplitude. In this study, a phasor triangle model based on an analysis of the forces on the piston is proposed. It can be used to predict not only the output acoustic power and the efficiency, but also the pressure amplitude of the linear compressor. Calculated results agree well with the measurement results of the experiment. With this phasor triangle model, the theoretical maximum output pressure amplitude of the linear compressor can be calculated simply from a known charging pressure and operating frequency. Compared with the mechanical and electrical model of the linear compressor, the new model provides an intuitive understanding of the match mechanism with a faster computational process. The model can also explain the experimental observation of a proportional relationship between the output pressure amplitude and the piston displacement. By further model analysis, this phenomenon is confirmed to be an expression of the unmatched design of the compressor. The phasor triangle model may provide an alternative method for compressor design and matching with the cold finger.
Trajectory Segmentation Map-Matching Approach for Large-Scale, High-Resolution GPS Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Lei; Holden, Jacob R.; Gonder, Jeffrey D.
With the development of smartphones and portable GPS devices, large-scale, high-resolution GPS data can be collected. Map matching is a critical step in studying vehicle driving activity and recognizing network traffic conditions from the data. A new trajectory segmentation map-matching algorithm is proposed to deal accurately and efficiently with large-scale, high-resolution GPS trajectory data. The new algorithm separated the GPS trajectory into segments. It found the shortest path for each segment in a scientific manner and ultimately generated a best-matched path for the entire trajectory. The similarity of a trajectory segment and its matched path is described by a similarity score system based on the longest common subsequence. The numerical experiment indicated that the proposed map-matching algorithm was very promising in relation to accuracy and computational efficiency. Large-scale data set applications verified that the proposed method is robust and capable of dealing with real-world, large-scale GPS data in a computationally efficient and accurate manner.
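The similarity scoring idea can be sketched as follows: the longest common subsequence between a GPS segment and a candidate path, where two points "match" when they lie within a distance tolerance, normalized by the segment length. The tolerance and scoring details are assumptions, not the authors' implementation.

```python
# Illustrative sketch of an LCS-based similarity score between a GPS trajectory
# segment and a candidate map path: two points "match" when they lie within a
# distance tolerance. The tolerance and the scoring are assumptions.
import math

def lcs_similarity(trajectory, path, tol=15.0):
    """Return LCS length / trajectory length, with a point-distance tolerance (metres)."""
    def close(p, q):
        return math.dist(p, q) <= tol
    n, m = len(trajectory), len(path)
    dp = [[0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            if close(trajectory[i - 1], path[j - 1]):
                dp[i][j] = dp[i - 1][j - 1] + 1
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])
    return dp[n][m] / n

traj = [(0, 0), (10, 2), (20, 1), (30, 40)]       # last fix is an outlier
path = [(0, 0), (10, 0), (20, 0), (30, 0)]
print(lcs_similarity(traj, path))                  # 0.75
```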
Wide baseline stereo matching based on double topological relationship consistency
NASA Astrophysics Data System (ADS)
Zou, Xiaohong; Liu, Bin; Song, Xiaoxue; Liu, Yang
2009-07-01
Stereo matching is one of the most important branches in computer vision. In this paper, an algorithm is proposed for wide-baseline stereo vision matching. A novel scheme called double topological relationship consistency (DCTR) is presented. The combination of double topological configurations includes the consistency of the first topological relationship (CFTR) and the consistency of the second topological relationship (CSTR). It not only sets up a more advanced matching model, but also discards mismatches by iteratively computing the fitness of the feature matches, and it overcomes many problems of traditional methods that depend on strong invariance to changes in scale, rotation or illumination across large view changes and even occlusions. Experimental examples are shown where the two cameras have been located in very different orientations. Also, epipolar geometry can be recovered using RANSAC, by far the most widely adopted method. With this method, correspondences with high precision can be obtained on wide-baseline matching problems. Finally, the effectiveness and reliability of the method are demonstrated in wide-baseline experiments on the image pairs.
Lightness of an object under two illumination levels.
Zdravković, Suncica; Economou, Elias; Gilchrist, Alan
2006-01-01
Anchoring theory (Gilchrist et al, 1999 Psychological Review 106 795-834) predicts a wide range of lightness errors, including failures of constancy in multi-illumination scenes and a long list of well-known lightness illusions seen under homogeneous illumination. Lightness values are computed both locally and globally and then averaged together. Local values are computed within a given region of homogeneous illumination. Thus, for an object that extends through two different illumination levels, anchoring theory produces two values, one for the patch in brighter illumination and one for the patch in dimmer illumination. Observers can give matches for these patches separately, but they can also give a single match for the whole object. Anchoring theory in its current form is unable to predict these object matches. We report eight experiments in which we studied the relationship between patch matches and object matches. The results show that the object match represents a compromise between the match for the patch in the field of highest illumination and the patch in the largest field of illumination. These two principles are parallel to the rules found for anchoring lightness: highest luminance rule and area rule.
A comparison of semiglobal and local dense matching algorithms for surface reconstruction
NASA Astrophysics Data System (ADS)
Dall'Asta, E.; Roncella, R.
2014-06-01
Encouraged by the growing interest in automatic 3D image-based reconstruction, the development and improvement of robust stereo matching techniques has been one of the most investigated research topics of recent years in photogrammetry and computer vision. The paper is focused on the comparison of several stereo matching algorithms (local and global) that are very popular both in photogrammetry and computer vision. In particular, the Semi-Global Matching (SGM), which performs pixel-wise matching and relies on the application of consistency constraints during the matching cost aggregation, will be discussed. The results of some tests performed on real and simulated stereo image datasets, evaluating in particular the accuracy of the obtained digital surface models, will be presented. Several algorithms and different implementations are considered in the comparison, using freeware software codes like MICMAC and OpenCV, commercial software (e.g. Agisoft PhotoScan) and proprietary codes implementing Least Squares and Semi-Global Matching algorithms. The comparisons will also consider the completeness and the level of detail within fine structures, and the reliability and repeatability of the obtainable data.
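Since the comparison includes the OpenCV implementation, a minimal usage example of OpenCV's semi-global (block) matcher is sketched below. Parameter values are generic defaults rather than the settings used in the paper's tests, and the image pair is a synthetic placeholder.

```python
# Minimal OpenCV Semi-Global (Block) Matching example; parameter values are
# generic defaults, not the settings used in the paper's comparison, and the
# image pair here is a synthetic placeholder.
import cv2
import numpy as np

left = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
right = np.roll(left, -4, axis=1)                     # fake 4-pixel disparity

sgbm = cv2.StereoSGBM_create(
    minDisparity=0,
    numDisparities=64,          # must be divisible by 16
    blockSize=5,
    P1=8 * 5 * 5,               # smoothness penalties for small/large disparity jumps
    P2=32 * 5 * 5,
    uniquenessRatio=10,
    speckleWindowSize=100,
    speckleRange=2,
)
disparity = sgbm.compute(left, right).astype(np.float32) / 16.0   # fixed-point output
print(disparity.shape, float(np.median(disparity[:, 64:])))
```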
NASA Technical Reports Server (NTRS)
Bradley, D. B.; Irwin, J. D.
1974-01-01
A computer simulation model for a multiprocessor computer is developed that is useful for studying the problem of matching a multiprocessor's memory space, memory bandwidth, and numbers and speeds of processors with aggregate job set characteristics. The model assumes an input work load of a set of recurrent jobs. The model includes a feedback scheduler/allocator which attempts to improve system performance through higher memory bandwidth utilization by matching individual job requirements for space and bandwidth with space availability and estimates of bandwidth availability at the times of memory allocation. The simulation model includes provisions for specifying precedence relations among the jobs in a job set, and provisions for specifying precedence execution of TMR (Triple Modular Redundant) and SIMPLEX (non-redundant) jobs.
Improved Feature Matching for Mobile Devices with IMU.
Masiero, Andrea; Vettore, Antonio
2016-08-05
Thanks to the recent diffusion of low-cost high-resolution digital cameras and to the development of mostly automated procedures for image-based 3D reconstruction, the popularity of photogrammetry for environment surveys has been constantly increasing in recent years. Automatic feature matching is an important step in order to successfully complete the photogrammetric 3D reconstruction: this step is the fundamental basis for the subsequent estimation of the geometry of the scene. This paper reconsiders the feature matching problem when dealing with smart mobile devices (e.g., when using the standard camera embedded in a smartphone as the imaging sensor). More specifically, this paper aims at exploiting the information on camera movements provided by the inertial navigation system (INS) in order to make the feature matching step more robust and, possibly, computationally more efficient. First, a revised version of the affine scale-invariant feature transform (ASIFT) is considered: this version reduces the computational complexity of the original ASIFT, while still ensuring an increase of correct feature matches with respect to the SIFT. Furthermore, a new two-step procedure for the estimation of the essential matrix E (and the camera pose) is proposed in order to increase its estimation robustness and computational efficiency.
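The pose-from-matches stage mentioned above can be sketched with OpenCV as follows: estimating the essential matrix E with RANSAC from already-matched points and recovering the relative camera pose. The revised ASIFT and the INS-aided two-step scheme of the paper are not reproduced; the intrinsics and correspondences below are synthetic.

```python
# Sketch of the pose-from-matches stage only: estimating the essential matrix E
# and recovering relative camera pose with OpenCV RANSAC from already-matched
# keypoints. The paper's revised ASIFT and INS-aided two-step scheme are not
# reproduced; the intrinsics and correspondences are synthetic.
import cv2
import numpy as np

K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])   # assumed intrinsics

rng = np.random.default_rng(1)
pts3d = np.hstack([rng.uniform(-1, 1, (100, 2)), rng.uniform(4, 8, (100, 1))])
R_true, _ = cv2.Rodrigues(np.array([[0.0], [0.1], [0.0]]))    # small yaw between views
t_true = np.array([[0.2], [0.0], [0.0]])

def project(P, R, t):
    cam = (R @ P.T + t).T
    uv = (K @ cam.T).T
    return uv[:, :2] / uv[:, 2:]

pts1 = project(pts3d, np.eye(3), np.zeros((3, 1)))
pts2 = project(pts3d, R_true, t_true)

E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
n_good, R_est, t_est, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
print(n_good, np.round(R_est, 3))          # R_est should approximate R_true
```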
An electronic stroll through the global village
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chew, J.
1992-09-01
This paper is a semi-random walk through Usenet News, a bulletin board system that exists on the vast Internet computer network. Interaction in such a medium is an interesting hybrid of speech and writing, of monologue and dialogue and sometimes an open shouting match in a crowded room. Those who are intrigued by these matters will be able to see a number of research areas exposed in this frankly anecdotal paper. In addition to being anecdotal, this paper is a work of participatory observation. In fact, I occasionally let it be known that I was observing the sociology and rhetoric of the newsgroups. The natives appeared unimpressed.
Quantum computation in the analysis of hyperspectral data
NASA Astrophysics Data System (ADS)
Gomez, Richard B.; Ghoshal, Debabrata; Jayanna, Anil
2004-08-01
Recent research on the topic of quantum computation provides us with some quantum algorithms of higher efficiency and speedup compared to their classical counterparts. In this paper, it is our intent to provide the results of our investigation of several applications of such quantum algorithms, especially Grover's Search algorithm, in the analysis of hyperspectral data. We found many parallels with Grover's method in existing data processing work that makes use of classical spectral matching algorithms. Our efforts also included the study of several methods dealing with hyperspectral image analysis in which classical computation methods involving large data sets could be replaced with quantum computation methods. The crux of the computational problem involving a hyperspectral image data cube is to convert the large amount of data in high dimensional space into real information. Currently, using the classical model, several time-consuming methods and steps are necessary to analyze these data: animation, the Minimum Noise Fraction transform, the Pixel Purity Index algorithm, N-dimensional scatter plots, and identification of endmember spectra are such steps. If a quantum model of computation involving hyperspectral image data can be developed and formalized, it is highly likely that information retrieval from hyperspectral image data cubes would be a much easier process and the final information content would be much more meaningful and timely. In this case, dimensionality would not be a curse, but a blessing.
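As a toy illustration of the Grover's-search idea invoked above, the sketch below simulates the algorithm's state vector for a search over 16 "reference spectra" indices. It demonstrates only the O(sqrt(N)) amplification of a marked index; it is not a hyperspectral processing pipeline and it runs on no quantum hardware.

```python
# Toy state-vector simulation of Grover's search for the index of a "matching"
# reference spectrum. This illustrates the algorithmic idea only, not a
# hyperspectral processing pipeline or real quantum hardware.
import numpy as np

n_qubits = 4
N = 2 ** n_qubits                 # 16 reference spectra, indexed 0..15
target = 11                       # index of the best spectral match (known to the oracle)

state = np.full(N, 1 / np.sqrt(N))                  # uniform superposition

oracle = np.eye(N)
oracle[target, target] = -1                         # phase-flip the matching index

diffusion = 2 * np.full((N, N), 1 / N) - np.eye(N)  # inversion about the mean

iterations = int(round(np.pi / 4 * np.sqrt(N)))     # ~O(sqrt(N)) Grover iterations
for _ in range(iterations):
    state = diffusion @ (oracle @ state)

probabilities = state ** 2
print(iterations, probabilities.argmax(), round(float(probabilities[target]), 3))
# 3 iterations, argmax == 11, probability ~0.96
```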
Federal Register 2010, 2011, 2012, 2013, 2014
2012-04-13
... (OMB). The dates for the matching program will be effective as indicated in ``E. Inclusive Dates of the... Compensation, Pension, Education, and Vocational Rehabilitation and Employment Records--VA (58VA21/22/28... addresses, etc. E. Inclusive Dates of the Matching Program The effective date of the matching agreement and...
Initial constructs for patient-centered outcome measures to evaluate brain-computer interfaces.
Andresen, Elena M; Fried-Oken, Melanie; Peters, Betts; Patrick, Donald L
2016-10-01
The authors describe preliminary work toward the creation of patient-centered outcome (PCO) measures to evaluate brain-computer interface (BCI) as an assistive technology (AT) for individuals with severe speech and physical impairments (SSPI). In Phase 1, 591 items from 15 existing measures were mapped to the International Classification of Functioning, Disability and Health (ICF). In Phase 2, qualitative interviews were conducted with eight people with SSPI and seven caregivers. Resulting text data were coded in an iterative analysis. Most items (79%) were mapped to the ICF environmental domain; over half (53%) were mapped to more than one domain. The ICF framework was well suited for mapping items related to body functions and structures, but less so for items in other areas, including personal factors. Two constructs emerged from qualitative data: quality of life (QOL) and AT. Component domains and themes were identified for each. Preliminary constructs, domains and themes were generated for future PCO measures relevant to BCI. Existing instruments are sufficient for initial items but do not adequately match the values of people with SSPI and their caregivers. Field methods for interviewing people with SSPI were successful, and support the inclusion of these individuals in PCO research. Implications for Rehabilitation Adapted interview methods allow people with severe speech and physical impairments to participate in patient-centered outcomes research. Patient-centered outcome measures are needed to evaluate the clinical implementation of brain-computer interface as an assistive technology.
Caetano, Tibério S; McAuley, Julian J; Cheng, Li; Le, Quoc V; Smola, Alex J
2009-06-01
As a fundamental problem in pattern recognition, graph matching has applications in a variety of fields, from computer vision to computational biology. In graph matching, patterns are modeled as graphs and pattern recognition amounts to finding a correspondence between the nodes of different graphs. Many formulations of this problem can be cast in general as a quadratic assignment problem, where a linear term in the objective function encodes node compatibility and a quadratic term encodes edge compatibility. The main research focus in this theme is about designing efficient algorithms for approximately solving the quadratic assignment problem, since it is NP-hard. In this paper we turn our attention to a different question: how to estimate compatibility functions such that the solution of the resulting graph matching problem best matches the expected solution that a human would manually provide. We present a method for learning graph matching: the training examples are pairs of graphs and the 'labels' are matches between them. Our experimental results reveal that learning can substantially improve the performance of standard graph matching algorithms. In particular, we find that simple linear assignment with such a learning scheme outperforms Graduated Assignment with bistochastic normalisation, a state-of-the-art quadratic assignment relaxation algorithm.
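A hedged sketch of the "simple linear assignment" formulation is given below: node-to-node compatibility is a weighted similarity of node features and the correspondence is the optimal linear assignment. The weight vector stands in for the learned parameters; the structured-learning procedure of the paper is not reproduced.

```python
# Sketch of "linear assignment with learned compatibilities": node-to-node
# compatibility is a weighted similarity of node features and the match is the
# optimal linear assignment. The weight vector is a stand-in for learned
# parameters; the paper's learning procedure is not reproduced.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
feat_g1 = rng.normal(size=(5, 8))                          # node features of graph 1
perm = rng.permutation(5)
feat_g2 = feat_g1[perm] + 0.05 * rng.normal(size=(5, 8))   # permuted, noisy copy

w = np.ones(8)                               # "learned" per-feature weights (assumed)

# compatibility[i, j] = weighted similarity between node i of G1 and node j of G2
compatibility = (feat_g1 * w) @ feat_g2.T
row, col = linear_sum_assignment(-compatibility)   # maximize total compatibility
print(col, np.argsort(perm))   # should agree when the compatibilities are informative
```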
Markkula, Gustav; Boer, Erwin; Romano, Richard; Merat, Natasha
2018-06-01
A conceptual and computational framework is proposed for modelling of human sensorimotor control and is exemplified for the sensorimotor task of steering a car. The framework emphasises control intermittency and extends on existing models by suggesting that the nervous system implements intermittent control using a combination of (1) motor primitives, (2) prediction of sensory outcomes of motor actions, and (3) evidence accumulation of prediction errors. It is shown that approximate but useful sensory predictions in the intermittent control context can be constructed without detailed forward models, as a superposition of simple prediction primitives, resembling neurobiologically observed corollary discharges. The proposed mathematical framework allows straightforward extension to intermittent behaviour from existing one-dimensional continuous models in the linear control and ecological psychology traditions. Empirical data from a driving simulator are used in model-fitting analyses to test some of the framework's main theoretical predictions: it is shown that human steering control, in routine lane-keeping and in a demanding near-limit task, is better described as a sequence of discrete stepwise control adjustments, than as continuous control. Results on the possible roles of sensory prediction in control adjustment amplitudes, and of evidence accumulation mechanisms in control onset timing, show trends that match the theoretical predictions; these warrant further investigation. The results for the accumulation-based model align with other recent literature, in a possibly converging case against the type of threshold mechanisms that are often assumed in existing models of intermittent control.
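The core loop of the proposed framework can be caricatured as follows: an unresolved steering error feeds a noisy, leaky evidence accumulator, and each threshold crossing triggers a discrete stepwise control adjustment, after which the accumulator resets. All parameters and the error signal below are invented for illustration; this is not the authors' fitted model.

```python
# Toy sketch of the framework's core loop: accumulate evidence from a steering
# error signal and, on threshold crossing, issue a discrete stepwise control
# adjustment scaled by the current error. All parameters are invented.
import numpy as np

rng = np.random.default_rng(3)
dt, T = 0.01, 10.0
steps = int(T / dt)
threshold, gain, noise_sd, leak = 1.0, 0.8, 0.3, 0.5

heading_error = 0.5 * np.sin(0.4 * 2 * np.pi * np.arange(steps) * dt)  # stand-in error signal
accumulator, steering = 0.0, 0.0
adjustment_times = []

for k in range(steps):
    evidence = heading_error[k] - steering          # unresolved error drives accumulation
    accumulator += dt * (gain * evidence - leak * accumulator) \
                   + np.sqrt(dt) * noise_sd * rng.standard_normal()
    if abs(accumulator) > threshold:
        steering += evidence                        # discrete, stepwise adjustment
        accumulator = 0.0                           # reset after each adjustment
        adjustment_times.append(k * dt)

print(len(adjustment_times), adjustment_times[:5])
```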
Document Image Parsing and Understanding using Neuromorphic Architecture
2015-03-01
...developed to reduce the processing speed at different layers. In the pattern matching layer, the computing power of multicore processors is explored... cortex where the complex data is reduced to abstract representations. The abstract representation is compared to stored patterns in massively parallel...
ERIC Educational Resources Information Center
Paney, Andrew S.; Kay, Ann C.
2015-01-01
The purpose of this study was to measure the effect of concurrent visual feedback on pitch-matching skill development in third-grade students. Participants played a computer game, "SingingCoach," which scored the accuracy of their singing of the song "America." They followed the contour of the melody on the screen as the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doak, J. E.; Prasad, Lakshman
2002-01-01
This paper discusses the use of Python in a computer vision (CV) project. We begin by providing background information on the specific approach to CV employed by the project. This includes a brief discussion of Constrained Delaunay Triangulation (CDT), the Chordal Axis Transform (CAT), shape feature extraction and syntactic characterization, and normalization of strings representing objects. (The terms 'object' and 'blob' are used interchangeably, both referring to an entity extracted from an image.) The rest of the paper focuses on the use of Python in three critical areas: (1) interactions with a MySQL database, (2) rapid prototyping of algorithms, and (3) gluing together all components of the project, including existing C and C++ modules. For (1), we provide a schema definition and discuss how the various tables interact to represent objects in the database as tree structures. (2) focuses on an algorithm to create a hierarchical representation of an object, given its string representation, and an algorithm to match unknown objects against objects in a database. And finally, (3) discusses the use of Boost Python to interact with the pre-existing C and C++ code that creates the CDTs and CATs, performs shape feature extraction and syntactic characterization, and normalizes object strings. The paper concludes with a vision of the future use of Python for the CV project.
Scheirer, Walter J; de Rezende Rocha, Anderson; Sapkota, Archana; Boult, Terrance E
2013-07-01
To date, almost all experimental evaluations of machine learning-based recognition algorithms in computer vision have taken the form of "closed set" recognition, whereby all testing classes are known at training time. A more realistic scenario for vision applications is "open set" recognition, where incomplete knowledge of the world is present at training time, and unknown classes can be submitted to an algorithm during testing. This paper explores the nature of open set recognition and formalizes its definition as a constrained minimization problem. The open set recognition problem is not well addressed by existing algorithms because it requires strong generalization. As a step toward a solution, we introduce a novel "1-vs-set machine," which sculpts a decision space from the marginal distances of a 1-class or binary SVM with a linear kernel. This methodology applies to several different applications in computer vision where open set recognition is a challenging problem, including object recognition and face verification. We consider both in this work, with large scale cross-dataset experiments performed over the Caltech 256 and ImageNet sets, as well as face matching experiments performed over the Labeled Faces in the Wild set. The experiments highlight the effectiveness of machines adapted for open set evaluation compared to existing 1-class and binary SVMs for the same tasks.
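For contrast with the paper's 1-vs-set machine, the sketch below shows the kind of open-set baseline it is compared against: a one-class SVM rejects samples far from all known classes and a binary SVM labels the accepted ones. The data, kernels and thresholds are invented.

```python
# Sketch of an open-set baseline of the kind the paper compares against: a
# one-class SVM rejects samples far from all known classes, and a binary SVM
# labels the samples that are accepted. This is not the 1-vs-set machine itself.
import numpy as np
from sklearn.svm import LinearSVC, OneClassSVM

rng = np.random.default_rng(0)
known_a = rng.normal([0, 0], 0.5, size=(100, 2))      # two known classes
known_b = rng.normal([4, 4], 0.5, size=(100, 2))
X = np.vstack([known_a, known_b])
y = np.array([0] * 100 + [1] * 100)

closed_set = LinearSVC(C=1.0).fit(X, y)               # classic closed-set classifier
novelty = OneClassSVM(gamma=1.0, nu=0.05).fit(X)      # accepts only regions seen in training

def open_set_predict(points):
    labels = closed_set.predict(points)
    labels[novelty.predict(points) == -1] = -1         # -1 means "unknown class"
    return labels

test = np.array([[0.1, -0.2], [3.9, 4.2], [2.0, 2.0], [10.0, -6.0]])
print(open_set_predict(test))      # expected roughly: [0, 1, -1, -1]
```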
Scalable Nearest Neighbor Algorithms for High Dimensional Data.
Muja, Marius; Lowe, David G
2014-11-01
For many computer vision and machine learning problems, large training sets are key for good performance. However, the most computationally expensive part of many computer vision and machine learning algorithms consists of finding nearest neighbor matches to high dimensional vectors that represent the training data. We propose new algorithms for approximate nearest neighbor matching and evaluate and compare them with previous algorithms. For matching high dimensional features, we find two algorithms to be the most efficient: the randomized k-d forest and a new algorithm proposed in this paper, the priority search k-means tree. We also propose a new algorithm for matching binary features by searching multiple hierarchical clustering trees and show it outperforms methods typically used in the literature. We show that the optimal nearest neighbor algorithm and its parameters depend on the data set characteristics and describe an automated configuration procedure for finding the best algorithm to search a particular data set. In order to scale to very large data sets that would otherwise not fit in the memory of a single machine, we propose a distributed nearest neighbor matching framework that can be used with any of the algorithms described in the paper. All this research has been released as an open source library called fast library for approximate nearest neighbors (FLANN), which has been incorporated into OpenCV and is now one of the most popular libraries for nearest neighbor matching.
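A minimal usage example of the FLANN-based matcher as exposed through OpenCV is sketched below, using ORB binary descriptors with the commonly cited LSH index parameters and Lowe's ratio test. The images are placeholders and the parameters are generic, not tuned values from the paper.

```python
# Minimal OpenCV FLANN usage example: ORB binary descriptors matched through
# FLANN's LSH index, followed by Lowe's ratio test. Index parameters follow the
# commonly used recipe from the OpenCV documentation; images are placeholders.
import cv2
import numpy as np

img1 = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
img2 = cv2.warpAffine(img1, np.float32([[1, 0, 5], [0, 1, 3]]), (320, 240))

orb = cv2.ORB_create(nfeatures=500)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

FLANN_INDEX_LSH = 6
index_params = dict(algorithm=FLANN_INDEX_LSH, table_number=6,
                    key_size=12, multi_probe_level=1)
search_params = dict(checks=50)
flann = cv2.FlannBasedMatcher(index_params, search_params)

matches = flann.knnMatch(des1, des2, k=2)
good = []
for pair in matches:
    # LSH may return fewer than k matches for some queries
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])
print(len(kp1), len(kp2), len(good))
```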
Carpool and buspool matching guide. Fourth edition
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pratsch, L.
1975-01-01
The operation and management of numerous successful carpool, buspool, and vanpool programs throughout the U.S. are discussed. The matching of the time and location requirements of the riders with vehicle availability and routing is described. The guide for data collection procedures and computer programs for carpool matching is presented. (LCL)
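The computer-matching step described in the guide can be caricatured as bucketing riders by origin cell, destination cell and departure-time window, as in the toy sketch below. The grid size, time window and rider data are invented; this is not the matching program distributed with the guide.

```python
# Toy sketch of the computer-matching step: riders are grouped when their
# origins and destinations fall in the same grid cell and their departure times
# fall in the same half-hour window. Grid size and window are invented.
from collections import defaultdict

commuters = [
    # name, (home_x, home_y) miles, (work_x, work_y) miles, departure (minutes after midnight)
    ("Avery", (1.2, 3.4), (10.1, 2.2), 7 * 60 + 45),
    ("Blake", (1.4, 3.1), (10.3, 2.4), 7 * 60 + 50),
    ("Casey", (6.0, 0.5), (10.2, 2.3), 8 * 60 + 0),
    ("Devon", (1.3, 3.3), (10.0, 2.1), 7 * 60 + 40),
]

def match_key(home, work, depart, cell=1.0, window=30):
    """Bucket by origin cell, destination cell, and departure-time window."""
    return (int(home[0] // cell), int(home[1] // cell),
            int(work[0] // cell), int(work[1] // cell),
            depart // window)

pools = defaultdict(list)
for name, home, work, depart in commuters:
    pools[match_key(home, work, depart)].append(name)

for key, riders in pools.items():
    if len(riders) > 1:
        print("potential carpool:", riders)     # e.g. Avery, Blake, Devon
```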
Dual boundary conditions in 3d SCFT's
NASA Astrophysics Data System (ADS)
Dimofte, Tudor; Gaiotto, Davide; Paquette, Natalie M.
2018-05-01
We propose matching pairs of half-BPS boundary conditions related by IR dualities of 3d N=2 gauge theories. From these matching pairs we construct duality interfaces. We test our proposals by anomaly matching and the computation of supersymmetric indices. Examples include basic abelian dualities, level-rank dualities, and Aharony dualities.
NASA Technical Reports Server (NTRS)
Choudhary, Alok Nidhi; Leung, Mun K.; Huang, Thomas S.; Patel, Janak H.
1989-01-01
Several techniques for performing static and dynamic load balancing in vision systems are presented. These techniques are novel in the sense that they capture the computational requirements of a task by examining the data when it is produced. Furthermore, they can be applied to many vision systems because many algorithms in different systems are either the same or have similar computational characteristics. These techniques are evaluated by applying them to a parallel implementation of the algorithms in a motion estimation system on a hypercube multiprocessor system. The motion estimation system consists of the following steps: (1) extraction of features; (2) stereo match of images in one time instant; (3) time match of images from different time instants; (4) stereo match to compute final unambiguous points; and (5) computation of motion parameters. It is shown that the performance gains when these data decomposition and load balancing techniques are used are significant and the overhead of using these techniques is minimal.
ERIC Educational Resources Information Center
Nielsen, Richard A.
2016-01-01
This article shows how statistical matching methods can be used to select "most similar" cases for qualitative analysis. I first offer a methodological justification for research designs based on selecting most similar cases. I then discuss the applicability of existing matching methods to the task of selecting most similar cases and…
NASA Technical Reports Server (NTRS)
Dudgeon, J. E.
1972-01-01
A computerized simulation of a planar phased array of circular waveguide elements is reported using mutual coupling and wide angle impedance matching in phased arrays. Special emphasis is given to circular polarization. The aforementioned computer program has as variable inputs: frequency, polarization, grid geometry, element size, dielectric waveguide fill, dielectric plugs in the waveguide for impedance matching, and dielectric sheets covering the array surface for the purpose of wide angle impedance matching. Parameter combinations are found which produce reflection peaks interior to grating lobes, while dielectric cover sheets are successfully employed to extend the usable scan range of a phased array. The most exciting results came from the application of computer aided optimization techniques to the design of this type of array.
Cacace, Anthony T; McFarland, Dennis J
2013-01-01
Tests of auditory perception, such as those used in the assessment of central auditory processing disorders ([C]APDs), represent a domain in audiological assessment where measurement of this theoretical construct is often confounded by nonauditory abilities due to methodological shortcomings. These confounds include the effects of cognitive variables such as memory and attention and suboptimal testing paradigms, including the use of verbal reproduction as a form of response selection. We argue that these factors need to be controlled more carefully and/or modified so that their impact on tests of auditory and visual perception is only minimal. To advocate for a stronger theoretical framework than currently exists and to suggest better methodological strategies to improve assessment of auditory processing disorders (APDs). Emphasis is placed on adaptive forced-choice psychophysical methods and the use of matched tasks in multiple sensory modalities to achieve these goals. Together, this approach has potential to improve the construct validity of the diagnosis, enhance and develop theory, and evolve into a preferred method of testing. Examination of methods commonly used in studies of APDs. Where possible, currently used methodology is compared to contemporary psychophysical methods that emphasize computer-controlled forced-choice paradigms. In many cases, the procedures used in studies of APD introduce confounding factors that could be minimized if computer-controlled forced-choice psychophysical methods were utilized. Ambiguities of interpretation, indeterminate diagnoses, and unwanted confounds can be avoided by minimizing memory and attentional demands on the input end and precluding the use of response-selection strategies that use complex motor processes on the output end. Advocated are the use of computer-controlled forced-choice psychophysical paradigms in combination with matched tasks in multiple sensory modalities to enhance the prospect of obtaining a valid diagnosis. American Academy of Audiology.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rogers, L.A.; Randolph, P.L.
1979-01-01
A paper presented by the Institute of Gas Technology (IGT) at the Third Geopressured-Geothermal Energy Conference hypothesized that the high ratio of produced gas to produced water from the No. 1 sand in the Edna Delcambre No. 1 well was due to free gas trapped in pores by imbibition over geological time. This hypothesis was examined in relation to preliminary test data which reported only average gas to water ratios over the roughly 2-day steps in flow rate. Subsequent public release of detailed test data revealed substantial departures from the previously reported computer simulation results. Also, data now in the public domain reveal the existence of a gas cap on the aquifer tested. This paper describes IGT's efforts to match the observed gas/water production with computer simulation. Two models for the occurrence and production of gas in excess of that dissolved in the brine have been used. One model considers the gas to be dispersed in pores by imbibition, and the other model considers the gas as a nearby free gas cap above the aquifer. The studies revealed that the dispersed gas model characteristically gave the wrong shape to plots of gas production on the gas/water ratio plots such that no reasonable match to the flow data could be achieved. The free gas cap model gave a characteristically better shape to the production plots and could provide an approximate fit to the data if the edge of the free gas cap is only about 400 feet from the well. Because the geological structure maps indicate the free gas cap to be several thousand feet away and the computer simulation results match the distance to the nearby Delcambre Nos. 4 and 4A wells, it appears that the source of the excess free gas in the test of the No. 1 sand may be from these nearby wells. The gas source is probably a separate gas zone and is brought into contact with the No. 1 sand via a conduit around the No. 4 well.
Sex Bias Exists in Human Surgical Clinical Research
Mansukhani, Neel A.; Yoon, Dustin Y.; Teter, Katherine A.; Stubbs, Vanessa C.; Helenowski, Irene B.; Woodruff, Teresa K.; Kibbe, Melina R.
2016-01-01
Importance Sex is a variable that is poorly controlled for in clinical research. Objective Determine if sex bias exists in human surgical clinical research, determine if data are reported and analyzed using sex as an independent variable, and identify specialties where the greatest and least sex biases exist. Design Review and data abstraction from published peer-reviewed manuscripts. Setting All original peer-reviewed manuscripts published in 2011 and 2012 in Annals of Surgery, American Journal of Surgery, JAMA Surgery, Journal of Surgical Research, and Surgery. Main Outcome Measures Study type, location, number and sex of subjects, sex matching, and inclusion of sex-based reporting, statistical analysis, and discussion of data. Results Of 2,347 articles reviewed, 1,668 included human subjects. After excluding 365 articles, 1,303 manuscripts remained: 17 (1%) included only males, 41 (3%) included only females, 1,020 (78%) included males and females, and 225 (17%) did not document the sex of the subjects. While females represent over 50% of the total number of subjects included, considerable variability existed with the number of male, female, and unspecified subjects included among the journals, between US domestic and international studies, and between single versus multi-center studies. For manuscripts included in the study, only 38% reported these data by sex, 33% analyzed these data by sex, and 23% included a discussion of sex-based results. Sex matching of the subjects included in the research was poor, with only 18% of the studies matching the inclusion of both sexes by 80%. Upon analysis of the different surgical specialties, a wide variation in sex-based inclusion, matching, and data reporting existed, with colorectal surgery having the best matching of males and females and cardiac surgery having the worst. Conclusion Our data show that sex bias exists in human surgical clinical research. Few studies included men and women equally, less than one-third performed data analysis by sex, and there was wide variation in inclusion and matching of the sexes among the specialties and the journals reviewed. Because clinical research serves as the foundation for evidence-based medicine, it is imperative that this disparity be addressed so that therapies benefit both sexes. PMID:27551816
Predicting New Indications for Approved Drugs Using a Proteo-Chemometric Method
Dakshanamurthy, Sivanesan; Issa, Naiem T; Assefnia, Shahin; Seshasayee, Ashwini; Peters, Oakland J; Madhavan, Subha; Uren, Aykut; Brown, Milton L; Byers, Stephen W
2012-01-01
The most effective way to move from target identification to the clinic is to identify already approved drugs with the potential for activating or inhibiting unintended targets (repurposing or repositioning). This is usually achieved by high throughput chemical screening, transcriptome matching or simple in silico ligand docking. We now describe a novel rapid computational proteo-chemometric method called “Train, Match, Fit, Streamline” (TMFS) to map new drug-target interaction space and predict new uses. The TMFS method combines shape, topology and chemical signatures, including docking score and functional contact points of the ligand, to predict potential drug-target interactions with remarkable accuracy. Using the TMFS method, we performed extensive molecular fit computations on 3,671 FDA approved drugs across 2,335 human protein crystal structures. The TMFS method predicts drug-target associations with 91% accuracy for the majority of drugs. Over 58% of the known best ligands for each target were correctly predicted as top ranked, followed by 66%, 76%, 84% and 91% for agents ranked in the top 10, 20, 30 and 40, respectively, out of all 3,671 drugs. Drugs ranked in the top 1–40, that have not been experimentally validated for a particular target now become candidates for repositioning. Furthermore, we used the TMFS method to discover that mebendazole, an anti-parasitic with recently discovered and unexpected anti-cancer properties, has the structural potential to inhibit VEGFR2. We confirmed experimentally that mebendazole inhibits VEGFR2 kinase activity as well as angiogenesis at doses comparable with its known effects on hookworm. TMFS also predicted, and was confirmed with surface plasmon resonance, that dimethyl celecoxib and the anti-inflammatory agent celecoxib can bind cadherin-11, an adhesion molecule important in rheumatoid arthritis and poor prognosis malignancies for which no targeted therapies exist. We anticipate that expanding our TMFS method to the >27,000 clinically active agents available worldwide across all targets will be most useful in the repositioning of existing drugs for new therapeutic targets. PMID:22780961
Creating a standardized watersheds database for the lower Rio Grande/Rio Bravo, Texas
Brown, Julie R.; Ulery, Randy L.; Parcher, Jean W.
2000-01-01
This report describes the creation of a large-scale watershed database for the lower Rio Grande/Rio Bravo Basin in Texas. The watershed database includes watersheds delineated to all 1:24,000-scale mapped stream confluences and other hydrologically significant points, selected watershed characteristics, and hydrologic derivative datasets. Computer technology allows generation of preliminary watershed boundaries in a fraction of the time needed for manual methods. This automated process reduces development time and results in quality improvements in watershed boundaries and characteristics. These data can then be compiled in a permanent database, eliminating the time-consuming step of data creation at the beginning of a project and providing a stable base dataset that can give users greater confidence when further subdividing watersheds. A standardized dataset of watershed characteristics is a valuable contribution to the understanding and management of natural resources. Vertical integration of the input datasets used to automatically generate watershed boundaries is crucial to the success of such an effort. The optimum situation would be to use the digital orthophoto quadrangles as the source of all the input datasets. While the hydrographic data from the digital line graphs can be revised to match the digital orthophoto quadrangles, hypsography data cannot be revised to match the digital orthophoto quadrangles. Revised hydrography from the digital orthophoto quadrangle should be used to create an updated digital elevation model that incorporates the stream channels as revised from the digital orthophoto quadrangle. Computer-generated, standardized watersheds that are vertically integrated with existing digital line graph hydrographic data will continue to be difficult to create until revisions can be made to existing source datasets. Until such time, manual editing will be necessary to make adjustments for man-made features and changes in the natural landscape that are not reflected in the digital elevation model data.
NASA Astrophysics Data System (ADS)
Qiu, Zhaoyang; Wang, Pei; Zhu, Jun; Tang, Bin
2016-12-01
The Nyquist folding receiver (NYFR) is a novel ultra-wideband receiver architecture that can realize wideband reception with a small amount of equipment. The linear frequency modulated/binary phase shift keying (LFM/BPSK) hybrid modulated signal is a novel kind of low-probability-of-interception signal with wide bandwidth. The NYFR is an effective architecture for intercepting the LFM/BPSK signal, and the signal intercepted by the NYFR carries an additional local oscillator modulation. A parameter estimation algorithm for the NYFR output signal is proposed. Using the NYFR prior information, a chirp singular value ratio spectrum is proposed to estimate the chirp rate. Then, based on the output self-characteristic, a matching component function is designed to estimate the Nyquist zone (NZ) index. Finally, a matching code and a subspace method are employed to estimate the phase change points and the code length. Compared with existing methods, the proposed algorithm performs better. It also does not need a multi-channel structure, which means the computational complexity of the NZ index estimation is small. The simulation results demonstrate the efficacy of the proposed algorithm.
Visual navigation using edge curve matching for pinpoint planetary landing
NASA Astrophysics Data System (ADS)
Cui, Pingyuan; Gao, Xizhen; Zhu, Shengying; Shao, Wei
2018-05-01
Pinpoint landing is challenging for future Mars and asteroid exploration missions. Vision-based navigation scheme based on feature detection and matching is practical and can achieve the required precision. However, existing algorithms are computationally prohibitive and utilize poor-performance measurements, which pose great challenges for the application of visual navigation. This paper proposes an innovative visual navigation scheme using crater edge curves during the descent and landing phase. In the algorithm, the edge curves of the craters tracked from two sequential images are utilized to determine the relative attitude and position of the lander through a normalized method. Then, considering the error accumulation of relative navigation, a method is developed that integrates the crater-based relative navigation method with a crater-based absolute navigation method, which identifies craters in a georeferenced database for continuous estimation of absolute states. In addition, expressions of the relative state estimate bias are derived. Novel necessary and sufficient observability criteria based on error analysis are provided to improve the navigation performance, which hold true for similar navigation systems. Simulation results demonstrate the effectiveness and high accuracy of the proposed navigation method.
Image Segmentation, Registration, Compression, and Matching
NASA Technical Reports Server (NTRS)
Yadegar, Jacob; Wei, Hai; Yadegar, Joseph; Ray, Nilanjan; Zabuawala, Sakina
2011-01-01
A novel computational framework was developed for 2D affine-invariant matching that exploits a parameter space. Named the affine invariant parameter space (AIPS), the technique can be applied to many image-processing and computer-vision problems, including image registration, template matching, and object tracking from image sequences. The AIPS is formed by the parameters in an affine combination of a set of feature points in the image plane. In cases where the entire image can be assumed to have undergone a single affine transformation, the new AIPS match metric and matching framework become very effective (compared with the state-of-the-art methods at the time of this reporting). No knowledge of scaling or any other transformation parameters is needed a priori to apply the AIPS framework. An automated suite of software tools has been created to provide accurate image segmentation (for data cleaning) and high-quality 2D image and 3D surface registration (for fusing multi-resolution terrain, image, and map data). These tools are capable of supporting existing GIS toolkits already in the marketplace, and will also be usable in a stand-alone fashion. The toolkit applies novel algorithmic approaches for image segmentation, feature extraction, and registration of 2D imagery and 3D surface data, which supports first-pass, batched, fully automatic feature extraction (for segmentation), and registration. A hierarchical and adaptive approach is taken for achieving automatic feature extraction, segmentation, and registration. Surface registration is the process of aligning two (or more) data sets to a common coordinate system, during which the transformation between their different coordinate systems is determined. Also developed here is a novel, volumetric surface modeling and compression technique that provides both quality-guaranteed mesh surface approximations and compaction of the model sizes by efficiently coding the geometry and connectivity/topology components of the generated models. The highly efficient triangular mesh compression compacts the connectivity information at the rate of 1.5-4 bits per vertex (on average for triangle meshes), while reducing the 3D geometry by 40-50 percent. Finally, taking into consideration the characteristics of 3D terrain data, and using the innovative, regularized binary decomposition mesh modeling, a multistage, pattern-driven modeling and compression technique has been developed to provide an effective framework for compressing digital elevation model (DEM) surfaces, high-resolution aerial imagery, and other types of NASA data.
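A small numerical check of the idea behind an affine-invariant parameter space (my own illustration, not the authors' code): the coefficients of an affine combination of three non-collinear basis points are unchanged by any affine transform of the plane, which is the property an AIPS-style match metric exploits.

    # Illustration (not the authors' code): affine-combination coefficients are invariant
    # under affine maps of the plane.
    import numpy as np

    def affine_coords(p, basis):
        """Solve p = a*b0 + b*b1 + c*b2 with a + b + c = 1."""
        b0, b1, b2 = basis
        A = np.column_stack([b1 - b0, b2 - b0])   # 2x2 system in (b, c)
        bc = np.linalg.solve(A, p - b0)
        return np.array([1.0 - bc.sum(), *bc])

    rng = np.random.default_rng(1)
    basis = rng.normal(size=(3, 2))
    p = rng.normal(size=2)

    M = rng.normal(size=(2, 2)); t = rng.normal(size=2)   # random affine map x -> Mx + t
    before = affine_coords(p, basis)
    after = affine_coords(M @ p + t, [M @ b + t for b in basis])
    print(np.allclose(before, after))                      # True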
NASA Astrophysics Data System (ADS)
van Rooij, Michael P. C.
Current turbomachinery design systems increasingly rely on multistage Computational Fluid Dynamics (CFD) as a means to assess performance of designs. However, design weaknesses attributed to improper stage matching are addressed using often ineffective strategies involving a costly iterative loop between blading modification, revision of design intent, and evaluation of aerodynamic performance. A design methodology is presented which greatly improves the process of achieving design-point aerodynamic matching. It is based on a three-dimensional viscous inverse design method which generates the blade camber surface based on prescribed pressure loading, thickness distribution and stacking line. This inverse design method has been extended to allow blading analysis and design in a multi-blade row environment. Blade row coupling was achieved through a mixing plane approximation. Parallel computing capability in the form of MPI has been implemented to reduce the computational time for multistage calculations. Improvements have been made to the flow solver to reach the level of accuracy required for multistage calculations. These include inclusion of heat flux, temperature-dependent treatment of viscosity, and improved calculation of stress components and artificial dissipation near solid walls. A validation study confirmed that the obtained accuracy is satisfactory at design point conditions. Improvements have also been made to the inverse method to increase robustness and design fidelity. These include the possibility to exclude spanwise sections of the blade near the endwalls from the design process, and a scheme that adjusts the specified loading area for changes resulting from the leading and trailing edge treatment. Furthermore, a pressure loading manager has been developed. Its function is to automatically adjust the pressure loading area distribution during the design calculation in order to achieve a specified design objective. Possible objectives are overall mass flow and compression ratio, and radial distribution of exit flow angle. To supplement the loading manager, mass flow inlet and exit boundary conditions have been implemented. Through appropriate combination of pressure or mass flow inflow/outflow boundary conditions and loading manager objectives, increased control over the design intent can be obtained. The three-dimensional multistage inverse design method with pressure loading manager was demonstrated to offer greatly enhanced blade row matching capabilities. Multistage design allows for simultaneous design of blade rows in a mutually interacting environment, which permits the redesigned blading to adapt to changing aerodynamic conditions resulting from the redesign. This ensures that the obtained blading geometry and performance implied by the prescribed pressure loading distribution are consistent with operation in the multi-blade row environment. The developed methodology offers high aerodynamic design quality and productivity, and constitutes a significant improvement over existing approaches used to address design-point aerodynamic matching.
Technical Note: spektr 3.0—A computational tool for x-ray spectrum modeling and analysis
Punnoose, J.; Xu, J.; Sisniega, A.; Zbijewski, W.; Siewerdsen, J. H.
2016-01-01
Purpose: A computational toolkit (spektr 3.0) has been developed to calculate x-ray spectra based on the tungsten anode spectral model using interpolating cubic splines (TASMICS) algorithm, updating previous work based on the tungsten anode spectral model using interpolating polynomials (TASMIP) spectral model. The toolkit includes a matlab (The Mathworks, Natick, MA) function library and improved user interface (UI) along with an optimization algorithm to match calculated beam quality with measurements. Methods: The spektr code generates x-ray spectra (photons/mm2/mAs at 100 cm from the source) using TASMICS as default (with TASMIP as an option) in 1 keV energy bins over beam energies 20–150 kV, extensible to 640 kV using the TASMICS spectra. An optimization tool was implemented to compute the added filtration (Al and W) that provides a best match between calculated and measured x-ray tube output (mGy/mAs or mR/mAs) for individual x-ray tubes that may differ from that assumed in TASMICS or TASMIP and to account for factors such as anode angle. Results: The median percent difference in photon counts for a TASMICS and TASMIP spectrum was 4.15% for tube potentials in the range 30–140 kV with the largest percentage difference arising in the low and high energy bins due to measurement errors in the empirically based TASMIP model and inaccurate polynomial fitting. The optimization tool reported a close agreement between measured and calculated spectra with a Pearson coefficient of 0.98. Conclusions: The computational toolkit, spektr, has been updated to version 3.0, validated against measurements and existing models, and made available as open source code. Video tutorials for the spektr function library, UI, and optimization tool are available. PMID:27487888
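The added-filtration optimization described above is essentially a one-parameter fit of calculated to measured tube output. The Python sketch below illustrates only the general idea: spektr itself is a MATLAB toolkit, and the spectrum, attenuation, and response arrays here are placeholders with a synthetic target, not TASMICS data.

    # Rough illustration of fitting added Al filtration so calculated output matches a measurement.
    # spektr itself is MATLAB; the spectrum, attenuation, and response values below are placeholders.
    import numpy as np
    from scipy.optimize import minimize_scalar

    E = np.arange(20, 121, dtype=float)            # keV energy bins (placeholder range)
    spectrum = np.exp(-((E - 60.0) / 25.0) ** 2)   # placeholder unfiltered spectrum shape
    mu_al = 2.0 * (30.0 / E) ** 3                  # placeholder Al attenuation coefficients, 1/cm
    response = E                                   # crude stand-in for exposure-per-photon weighting

    def calc_output(t_al_cm):
        filtered = spectrum * np.exp(-mu_al * t_al_cm)   # Beer-Lambert attenuation
        return float(np.sum(filtered * response))

    measured_output = 0.85 * calc_output(0.0)      # synthetic "measured" target for the demo

    res = minimize_scalar(lambda t: (calc_output(t) - measured_output) ** 2,
                          bounds=(0.0, 2.0), method="bounded")
    print(f"fitted added Al filtration: {res.x:.3f} cm")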
Optimizing Approximate Weighted Matching on Nvidia Kepler K40
DOE Office of Scientific and Technical Information (OSTI.GOV)
Naim, Md; Manne, Fredrik; Halappanavar, Mahantesh
Matching is a fundamental graph problem with numerous applications in science and engineering. While algorithms for computing optimal matchings are difficult to parallelize, approximation algorithms on the other hand generally compute high quality solutions and are amenable to parallelization. In this paper, we present efficient implementations of the current best algorithm for half-approximate weighted matching, the Suitor algorithm, on the Nvidia Kepler K-40 platform. We develop four variants of the algorithm that exploit hardware features to address key challenges for a GPU implementation. We also experiment with different combinations of work assigned to a warp. Using an exhaustive set of 269 inputs, we demonstrate that the new implementation outperforms the previous best GPU algorithm by 10× to 100× for over 100 instances, and from 100× to 1000× for 15 instances. We also demonstrate up to 20× speedup relative to 2 threads, and up to 5× relative to 16 threads, on an Intel Xeon platform with 16 cores for the same algorithm. The new algorithms and implementations provided in this paper will have a direct impact on several applications that repeatedly use matching as a key compute kernel. Further, algorithm designs and insights provided in this paper will benefit other researchers implementing graph algorithms on modern GPU architectures.
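For readers unfamiliar with the Suitor idea, here is a simplified serial rendition of the proposal scheme (the paper's contribution is the GPU implementation; this sketch omits the tie-breaking and concurrency details of the published algorithm).

    # Simplified serial sketch of a Suitor-style proposal scheme for approximate weighted matching.
    # adj[u] is a list of (neighbor, weight) pairs; this is not the GPU code from the paper.
    def suitor_matching(adj):
        suitor = {u: None for u in adj}    # best proposer each vertex has accepted so far
        ws = {u: 0.0 for u in adj}         # weight of that accepted proposal
        proposed = {u: None for u in adj}  # vertex that u is currently the suitor of
        for start in adj:
            u = start
            while u is not None and proposed[u] is None:
                best, best_w = None, 0.0
                for v, w in adj[u]:
                    if w > ws[v] and w > best_w:   # v would accept, and it is u's best option
                        best, best_w = v, w
                if best is None:
                    break
                displaced = suitor[best]
                suitor[best], ws[best] = u, best_w
                proposed[u] = best
                if displaced is not None:
                    proposed[displaced] = None     # displaced vertex must propose again
                u = displaced
        return {(u, v) for u, v in proposed.items()
                if v is not None and proposed[v] == u and u < v}

    # Tiny usage example with integer vertex labels.
    adj = {0: [(1, 4.0), (2, 1.0)], 1: [(0, 4.0), (2, 3.0)], 2: [(0, 1.0), (1, 3.0)]}
    print(suitor_matching(adj))   # {(0, 1)}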
Numerical studies of the fluid and optical fields associated with complex cavity flows
NASA Technical Reports Server (NTRS)
Atwood, Christopher A.
1992-01-01
Numerical solutions for the flowfield about several cavity configurations have been computed using the Reynolds averaged Navier-Stokes equations. Comparisons between numerical and experimental results are made in two dimensions for free shear layers and a rectangular cavity, and in three dimensions for the transonic aero-window problem of the Stratospheric Observatory for Infrared Astronomy (SOFIA). Results show that dominant acoustic frequencies and magnitudes of the self excited resonant cavity flows compare well with the experiment. In addition, solution sensitivity to artificial dissipation and grid resolution levels are determined. Optical path distortion due to the flow field is modelled geometrically and is found to match the experiment. The fluid field was computed using a diagonalized scheme within an overset mesh framework. An existing code, OVERFLOW, was utilized with the additions of characteristic boundary condition and output routines required for reduction of the unsteady data. The newly developed code is directly applicable to a generalized three dimensional structured grid zone. Details are provided in a paper included in Appendix A.
Matching network for RF plasma source
Pickard, Daniel S.; Leung, Ka-Ngo
2007-11-20
A compact matching network couples an RF power supply to an RF antenna in a plasma generator. The simple and compact impedance matching network matches the plasma load to the impedance of a coaxial transmission line and the output impedance of an RF amplifier at radio frequencies. The matching network is formed of a resonantly tuned circuit formed of a variable capacitor and an inductor in a series resonance configuration, and a ferrite core transformer coupled to the resonantly tuned circuit. This matching network is compact enough to fit in existing compact focused ion beam systems.
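As a back-of-the-envelope illustration of the series-resonance tuning and transformer impedance step described above (component values are arbitrary, not those of the patented network):

    # Series L-C resonance and transformer impedance step; values are arbitrary illustrations.
    import math

    L = 2.2e-6                        # inductance, henries
    C_min, C_max = 50e-12, 500e-12    # variable capacitor range, farads

    f_lo = 1.0 / (2 * math.pi * math.sqrt(L * C_max))
    f_hi = 1.0 / (2 * math.pi * math.sqrt(L * C_min))
    print(f"tunable resonance range: {f_lo / 1e6:.2f} - {f_hi / 1e6:.2f} MHz")

    # A ferrite-core transformer with turns ratio n steps the load impedance by n**2.
    n = 4
    Z_load = 3.0                      # ohms, illustrative plasma-side impedance
    print(f"impedance seen at the primary: {Z_load * n**2:.0f} ohms")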
Template match using local feature with view invariance
NASA Astrophysics Data System (ADS)
Lu, Cen; Zhou, Gang
2013-10-01
Matching a template image within a target image is a fundamental task in computer vision. To address the deficiencies of traditional image matching methods and the inaccurate matching of scene images under rotation, illumination, and view changes, a novel matching algorithm using local features is proposed in this paper. Local histograms of the edge pixels (LHoE) are extracted as an invariant feature to resist view and brightness changes. The merit of the LHoE is that edge points are little affected by view changes, and the LHoE resists not only illumination variance but also noise contamination. Because matching is performed only on the edge points, the computational burden is greatly reduced. Additionally, our approach is conceptually simple, easy to implement, and does not need a training phase. A view change can be considered as a combination of rotation, illumination, and shear transformation. Experimental results on simulated and real data demonstrate that the proposed approach is superior to NCC (normalized cross-correlation) and histogram-based methods under view changes.
Component extraction on CT volumes of assembled products using geometric template matching
NASA Astrophysics Data System (ADS)
Muramatsu, Katsutoshi; Ohtake, Yutaka; Suzuki, Hiromasa; Nagai, Yukie
2017-03-01
As a method of non-destructive internal inspection, X-ray computed tomography (CT) is used not only in medical applications but also for product inspection. Some assembled products can be divided into separate components based on density, which is known to be approximately proportional to CT values. However, components whose densities are similar cannot be distinguished using the CT-value-driven approach. In this study, we proposed a new algorithm that extracts components from the CT volume (a set of voxels with assigned CT values) by using the surface mesh as the template rather than the density. The method has two main stages: rough matching and fine matching. At the rough matching stage, the position of candidate targets is identified roughly from the CT volume, using the template of the target component. At the fine matching stage, these candidates are precisely matched with the templates, allowing the correct position of the components to be detected from the CT volume. The results of two computational experiments showed that the proposed algorithm is able to extract components with similar density within the assembled products on CT volumes.
NASA Astrophysics Data System (ADS)
Sharma, Kajal; Moon, Inkyu; Kim, Sung Gaun
2012-10-01
Estimating depth has long been a major issue in the field of computer vision and robotics. The Kinect sensor's active sensing strategy provides high-frame-rate depth maps and can recognize user gestures and human pose. This paper presents a technique to estimate the depth of features extracted from video frames, along with an improved feature-matching method. In this paper, we used the Kinect camera developed by Microsoft, which captured color and depth images for further processing. Feature detection and selection is an important task for robot navigation. Many feature-matching techniques have been proposed earlier, and this paper proposes an improved feature matching between successive video frames with the use of neural network methodology in order to reduce the computation time of feature matching. The features extracted are invariant to image scale and rotation, and different experiments were conducted to evaluate the performance of feature matching between successive video frames. The extracted features are assigned distance based on the Kinect technology that can be used by the robot in order to determine the path of navigation, along with obstacle detection applications.
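The paper's contribution is the neural-network speedup, but the baseline operation it accelerates is ordinary descriptor matching between consecutive frames; a standard OpenCV sketch of that baseline (file names are placeholders, and this is not the paper's method):

    # Baseline descriptor matching between two consecutive frames (not the paper's neural-network method).
    import cv2

    frame1 = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)  # placeholder file names
    frame2 = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(frame1, None)
    kp2, des2 = orb.detectAndCompute(frame2, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)  # brute force with cross-checking
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    for m in matches[:10]:
        x1, y1 = kp1[m.queryIdx].pt
        x2, y2 = kp2[m.trainIdx].pt
        print(f"({x1:.1f}, {y1:.1f}) -> ({x2:.1f}, {y2:.1f})  dist={m.distance:.0f}")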
NASA Astrophysics Data System (ADS)
Viana, Ilisio; Orteu, Jean-José; Cornille, Nicolas; Bugarin, Florian
2015-11-01
We focus on quality control of mechanical parts in aeronautical context using a single pan-tilt-zoom (PTZ) camera and a computer-aided design (CAD) model of the mechanical part. We use the CAD model to create a theoretical image of the element to be checked, which is further matched with the sensed image of the element to be inspected, using a graph theory-based approach. The matching is carried out in two stages. First, the two images are used to create two attributed graphs representing the primitives (ellipses and line segments) in the images. In the second stage, the graphs are matched using a similarity function built from the primitive parameters. The similarity scores of the matching are injected in the edges of a bipartite graph. A best-match-search procedure in the bipartite graph guarantees the uniqueness of the match solution. The method achieves promising performance in tests with synthetic data including missing elements, displaced elements, size changes, and combinations of these cases. The results open good prospects for using the method with realistic data.
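The best-match search in the bipartite graph of similarity scores is, in essence, an assignment problem. A generic sketch with SciPy is shown below; the similarity values are invented, and the authors' search procedure may differ from the Hungarian-style solver used here.

    # Generic best-match search on a bipartite similarity matrix (scores here are invented).
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # rows: primitives from the CAD-rendered image; columns: primitives from the sensed image
    similarity = np.array([[0.9, 0.2, 0.1],
                           [0.3, 0.8, 0.4],
                           [0.1, 0.3, 0.7]])

    rows, cols = linear_sum_assignment(-similarity)   # maximize total similarity
    for r, c in zip(rows, cols):
        print(f"theoretical primitive {r} <-> sensed primitive {c} (score {similarity[r, c]:.2f})")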
Autonomous proximity operations using machine vision for trajectory control and pose estimation
NASA Technical Reports Server (NTRS)
Cleghorn, Timothy F.; Sternberg, Stanley R.
1991-01-01
A machine vision algorithm was developed which permits guidance control to be maintained during autonomous proximity operations. At present this algorithm exists as a simulation, running upon an 80386 based personal computer, using a ModelMATE CAD package to render the target vehicle. However, the algorithm is sufficiently simple, so that following off-line training on a known target vehicle, it should run in real time with existing vision hardware. The basis of the algorithm is a sequence of single camera images of the target vehicle, upon which radial transforms were performed. Selected points of the resulting radial signatures are fed through a decision tree, to determine whether the signature matches that of the known reference signatures for a particular view of the target. Based upon recognized scenes, the position of the maneuvering vehicle with respect to the target vehicles can be calculated, and adjustments made in the former's trajectory. In addition, the pose and spin rates of the target satellite can be estimated using this method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gilkey, Lindsay
This milestone presents a demonstration of the High-to-Low (Hi2Lo) process in the VVI focus area. Validation and additional calculations with the commercial computational fluid dynamics code, STAR-CCM+, were performed using a 5x5 fuel assembly with non-mixing geometry and spacer grids. This geometry was based on the benchmark experiment provided by Westinghouse. Results from the simulations were compared to existing experimental data and to the subchannel thermal-hydraulics code COBRA-TF (CTF). An uncertainty quantification (UQ) process was developed for the STAR-CCM+ model and results of the STAR UQ were communicated to CTF. Results from STAR-CCM+ simulations were used as experimental design points in CTF to calibrate the mixing parameter β and compared to results obtained using experimental data points. This demonstrated that CTF's β parameter can be calibrated to match existing experimental data more closely. The Hi2Lo process for the STAR-CCM+/CTF code coupling was documented in this milestone and is closely linked to the L3:VVI.H2LP15.01 milestone report.
Nesvizhskii, Alexey I.
2010-01-01
This manuscript provides a comprehensive review of the peptide and protein identification process using tandem mass spectrometry (MS/MS) data generated in shotgun proteomic experiments. The commonly used methods for assigning peptide sequences to MS/MS spectra are critically discussed and compared, from basic strategies to advanced multi-stage approaches. Particular attention is paid to the problem of false-positive identifications. Existing statistical approaches for assessing the significance of peptide to spectrum matches are surveyed, ranging from single-spectrum approaches such as expectation values to global error rate estimation procedures such as false discovery rates and posterior probabilities. The importance of using auxiliary discriminant information (mass accuracy, peptide separation coordinates, digestion properties, etc.) is discussed, and advanced computational approaches for joint modeling of multiple sources of information are presented. This review also includes a detailed analysis of the issues affecting the interpretation of data at the protein level, including the amplification of error rates when going from the peptide to the protein level, and the ambiguities in inferring the identities of sample proteins in the presence of shared peptides. Commonly used methods for computing protein-level confidence scores are discussed in detail. The review concludes with a discussion of several outstanding computational issues. PMID:20816881
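As one small, concrete example of the error-rate machinery surveyed in the review, a common target-decoy estimate of the false discovery rate at a score threshold can be computed as follows (the scores are synthetic, and this is only one of several conventions discussed in the literature):

    # Target-decoy FDR estimate at a score threshold (synthetic scores; one common convention).
    import numpy as np

    rng = np.random.default_rng(2)
    target_scores = np.concatenate([rng.normal(3.0, 1.0, 800), rng.normal(0.0, 1.0, 200)])
    decoy_scores = rng.normal(0.0, 1.0, 1000)   # decoy PSMs approximate the null distribution

    def fdr_at(threshold):
        n_target = np.sum(target_scores >= threshold)
        n_decoy = np.sum(decoy_scores >= threshold)
        return n_decoy / max(n_target, 1)        # estimated fraction of false target hits

    for t in (1.0, 2.0, 3.0):
        print(f"threshold {t}: estimated FDR = {fdr_at(t):.3f}")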
Gaia Data Release 1. Cross-match with external catalogues. Algorithm and results
NASA Astrophysics Data System (ADS)
Marrese, P. M.; Marinoni, S.; Fabrizio, M.; Giuffrida, G.
2017-11-01
Context. Although the Gaia catalogue on its own will be a very powerful tool, it is the combination of this highly accurate archive with other archives that will truly open up amazing possibilities for astronomical research. The advanced interoperation of archives is based on cross-matching, leaving the user with the feeling of working with one single data archive. The data retrieval should work not only across data archives, but also across wavelength domains. The first step for seamless data access is the computation of the cross-match between Gaia and external surveys. Aims: The matching of astronomical catalogues is a complex and challenging problem both scientifically and technologically (especially when matching large surveys like Gaia). We describe the cross-match algorithm used to pre-compute the match of Gaia Data Release 1 (DR1) with a selected list of large publicly available optical and IR surveys. Methods: The overall principles of the adopted cross-match algorithm are outlined. Details are given on the developed algorithm, including the methods used to account for position errors, proper motions, and environment; to define the neighbours; and to define the figure of merit used to select the most probable counterpart. Results: Statistics on the results are also given. The results of the cross-match are part of the official Gaia DR1 catalogue.
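For a sense of the basic operation the Gaia cross-match algorithm refines, a plain nearest-neighbour sky match with astropy is sketched below; the coordinates are made up, and the published algorithm additionally weighs positional errors, proper motions, and local source density.

    # Plain positional cross-match with astropy (made-up coordinates); the Gaia DR1 algorithm
    # additionally accounts for positional errors, proper motions, and crowding.
    import astropy.units as u
    from astropy.coordinates import SkyCoord

    gaia = SkyCoord(ra=[10.001, 45.300, 120.750] * u.deg,
                    dec=[-5.002, 20.100, 33.420] * u.deg)
    external = SkyCoord(ra=[10.0008, 45.3001, 200.0] * u.deg,
                        dec=[-5.0019, 20.1001, -10.0] * u.deg)

    idx, sep2d, _ = gaia.match_to_catalog_sky(external)
    good = sep2d < 1.0 * u.arcsec                 # simple radius cut
    for i, (j, s, ok) in enumerate(zip(idx, sep2d.arcsec, good)):
        print(f"gaia source {i} -> external {j}, sep = {s:.3f} arcsec, accepted = {ok}")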
Pagan, Marino
2014-01-01
Finding sought objects requires the brain to combine visual and target signals to determine when a target is in view. To investigate how the brain implements these computations, we recorded neural responses in inferotemporal cortex (IT) and perirhinal cortex (PRH) as macaque monkeys performed a delayed-match-to-sample target search task. Our data suggest that visual and target signals were combined within or before IT in the ventral visual pathway and then passed onto PRH, where they were reformatted into a more explicit target match signal over ∼10–15 ms. Accounting for these dynamics in PRH did not require proposing dynamic computations within PRH itself but, rather, could be attributed to instantaneous PRH computations performed upon an input representation from IT that changed with time. We found that the dynamics of the IT representation arose from two commonly observed features: individual IT neurons whose response preferences were not simply rescaled with time and variable response latencies across the population. Our results demonstrate that these types of time-varying responses have important consequences for downstream computation and suggest that dynamic representations can arise within a feedforward framework as a consequence of instantaneous computations performed upon time-varying inputs. PMID:25122904
Carbohydrates for Soccer: A Focus on Skilled Actions and Half-Time Practices
Hills, Samuel P.; Russell, Mark
2017-01-01
Carbohydrate consumption is synonymous with soccer performance due to the established effects on endogenous energy store preservation, and physical capacity maintenance. For performance-enhancement purposes, exogenous energy consumption (in the form of drinks, bars, gels and snacks) is recommended on match-day; specifically, before and during match-play. Akin to the demands of soccer, limited opportunities exist to consume carbohydrates outside of scheduled breaks in competition, such as at half-time. The link between cognitive function and blood glucose availability suggests that carbohydrates may influence decision-making and technical proficiency (e.g., soccer skills). However, relatively few reviews have focused on technical, as opposed to physical, performance while also addressing the practicalities associated with carbohydrate consumption when limited in-play feeding opportunities exist. Transient physiological responses associated with reductions in activity prevalent in scheduled intra-match breaks (e.g., half-time) likely have important consequences for practitioners aiming to optimize match-day performance. Accordingly, this review evaluated novel developments in soccer literature regarding (1) the ergogenic properties of carbohydrates for skill performance; and (2) novel considerations concerning exogenous energy provision during half-time. Recommendations are made to modify half-time practices in an aim to enhance subsequent performance. Viable future research opportunities exist regarding a deeper insight into carbohydrate provision on match-day. PMID:29295583
Wu, Chueh-Hung; Chen, Li-Sheng; Yen, Ming-Fang; Chiu, Yueh-Hsia; Fann, Ching-Yuan; Chen, Hsiu-Hsi; Pan, Shin-Liang
2014-01-01
Previous studies on the association between tuberculosis and the risk of developing ischemic stroke have generated inconsistent results. We therefore performed a population-based, propensity score-matched longitudinal follow-up study to investigate whether contracting non-central nervous system (CNS) tuberculosis leads to an increased risk of ischemic stroke. We used a logistic regression model that includes age, sex, pre-existing comorbidities and socioeconomic status as covariates to compute the propensity score. A total of 5804 persons with at least three ambulatory visits in 2001 with the principal diagnosis of non-CNS tuberculosis were enrolled in the tuberculosis group. The non-tuberculosis group consisted of 5804, propensity score-matched subjects without tuberculosis. The three-year ischemic stroke-free survival rates for these 2 groups were estimated using the Kaplan-Meier method. The stratified Cox proportional hazards regression was used to estimate the effect of tuberculosis on the occurrence of ischemic stroke. During three-year follow-up, 176 subjects in the tuberculosis group (3.0%) and 207 in the non-tuberculosis group (3.6%) had ischemic stroke. The hazard ratio for developing ischemic stroke in the tuberculosis group was 0.92 compared to the non-tuberculosis group (95% confidence interval: 0.73-1.14, P = 0.4299). Non-CNS tuberculosis does not increase the risk of subsequent ischemic stroke.
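A schematic version of the propensity-score matching step described above, on synthetic data (the study's covariates were age, sex, pre-existing comorbidities, and socioeconomic status; for brevity this sketch matches with replacement and no caliper):

    # Schematic 1:1 propensity-score matching on synthetic data (with replacement, no caliper).
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(3)
    X = rng.normal(size=(2000, 4))                          # covariates
    exposed = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))   # exposure depends on covariates

    ps = LogisticRegression().fit(X, exposed).predict_proba(X)[:, 1]  # propensity scores

    t_idx = np.where(exposed == 1)[0]
    c_idx = np.where(exposed == 0)[0]
    nn = NearestNeighbors(n_neighbors=1).fit(ps[c_idx].reshape(-1, 1))
    _, pos = nn.kneighbors(ps[t_idx].reshape(-1, 1))        # nearest control by propensity score
    matched_controls = c_idx[pos.ravel()]
    print(list(zip(t_idx[:5], matched_controls[:5])))       # first few matched pairs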
NASA Astrophysics Data System (ADS)
Gohatre, Umakant Bhaskar; Patil, Venkat P.
2018-04-01
In computer vision applications, real-time detection and tracking of multiple objects is an important research area that has gained considerable attention in recent years for finding non-stationary entities in image sequences. Object detection is the step toward following a moving object in video, and object representation is the step toward tracking it. Recognizing multiple objects is a challenging task when detecting them in a video sequence. Image registration has long been used as a basis for detecting moving objects: registration finds correspondences between consecutive frame pairs based on image appearance under rigid and affine transformations. However, image registration is not well suited to handling events that can result in missed objects. To address such problems, this paper proposes a novel approach. Video frames are segmented using a region adjacency graph of visual appearance and geometric properties; matching between graph sequences is then performed by multi-graph matching, and matched regions are labeled by a proposed graph coloring algorithm that assigns a foreground label to each respective region. The proposed design is robust to unknown transformations and shows significant improvement over existing work on real-time detection of multiple moving objects.
Convective Heat Transfer in the Reusable Solid Rocket Motor of the Space Transportation System
NASA Technical Reports Server (NTRS)
Ahmad, Rashid A.; Cash, Stephen F. (Technical Monitor)
2002-01-01
This simulation involved a two-dimensional axisymmetric model of a full motor initial grain of the Reusable Solid Rocket Motor (RSRM) of the Space Transportation System (STS). It was conducted with CFD (computational fluid dynamics) commercial code FLUENT. This analysis was performed to: a) maintain continuity with most related previous analyses, b) serve as a non-vectored baseline for any three-dimensional vectored nozzles, c) provide a relatively simple application and checkout for various CFD solution schemes, grid sensitivity studies, turbulence modeling and heat transfer, and d) calculate nozzle convective heat transfer coefficients. The accuracy of the present results and the selection of the numerical schemes and turbulence models were based on matching the rocket ballistic predictions of mass flow rate, head end pressure, vacuum thrust and specific impulse, and measured chamber pressure drop. Matching these ballistic predictions was found to be good. This study was limited to convective heat transfer and the results compared favorably with existing theory. On the other hand, qualitative comparison with backed-out data of the ratio of the convective heat transfer coefficient to the specific heat at constant pressure was made in a relative manner. This backed-out data was devised to match nozzle erosion that was a result of heat transfer (convective, radiative and conductive), chemical (transpirating), and mechanical (shear and particle impingement forces) effects combined.
Is an eclipse described in the Odyssey?
Baikouzis, Constantino; Magnasco, Marcelo O
2008-07-01
Plutarch and Heraclitus believed a certain passage in the 20th book of the Odyssey ("Theoclymenus's prophecy") to be a poetic description of a total solar eclipse. In the late 1920s, Schoch and Neugebauer computed that the solar eclipse of 16 April 1178 B.C.E. was total over the Ionian Islands and was the only suitable eclipse in more than a century to agree with classical estimates of the decade-earlier sack of Troy around 1192-1184 B.C.E. However, much skepticism remains about whether the verses refer to this, or any, eclipse. To contribute to the issue independently of the disputed eclipse reference, we analyze other astronomical references in the Epic, without assuming the existence of an eclipse, and search for dates matching the astronomical phenomena we believe they describe. We use three overt astronomical references in the epic: to Boötes and the Pleiades, Venus, and the New Moon; we supplement them with a conjectural identification of Hermes's trip to Ogygia as relating to the motion of planet Mercury. Performing an exhaustive search of all possible dates in the span 1250-1115 B.C., we looked to match these phenomena in the order and manner that the text describes. In that period, a single date closely matches our references: 16 April 1178 B.C.E. We speculate that these references, plus the disputed eclipse reference, may refer to that specific eclipse.
76 FR 48811 - Computer Matching and Privacy Protection Act of 1988
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-09
...)), concerning an individual's eligibility to receive a Segal AmeriCorps Education Award from the National... the applicable routine use. F. Inclusive Dates of the Matching Program This agreement will be in...
7 CFR 272.14 - Deceased matching system.
Code of Federal Regulations, 2014 CFR
2014-01-01
... Master File, obtained through the State Verification and Exchange System (SVES) and enter into a computer... comparison of matched data at the time of application and no less frequently than once a year. (2) The...
7 CFR 272.14 - Deceased matching system.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Master File, obtained through the State Verification and Exchange System (SVES) and enter into a computer... comparison of matched data at the time of application and no less frequently than once a year. (2) The...
Modeling of anomalous electron mobility in Hall thrusters
DOE Office of Scientific and Technical Information (OSTI.GOV)
Koo, Justin W.; Boyd, Iain D.
Accurate modeling of the anomalous electron mobility is absolutely critical for successful simulation of Hall thrusters. In this work, existing computational models for the anomalous electron mobility are used to simulate the UM/AFRL P5 Hall thruster (a 5 kW laboratory model) in a two-dimensional axisymmetric hybrid particle-in-cell Monte Carlo collision code. Comparison to experimental results indicates that, while these computational models can be tuned to reproduce the correct thrust or discharge current, it is very difficult to match all integrated performance parameters (thrust, power, discharge current, etc.) simultaneously. Furthermore, multiple configurations of these computational models can produce reasonable integrated performance parameters. A semiempirical electron mobility profile is constructed from a combination of internal experimental data and modeling assumptions. This semiempirical electron mobility profile is used in the code and results in more accurate simulation of both the integrated performance parameters and the mean potential profile of the thruster. Results indicate that the anomalous electron mobility, while absolutely necessary in the near-field region, provides a substantially smaller contribution to the total electron mobility in the high Hall current region near the thruster exit plane.
A computational approach to climate science education with CLIMLAB
NASA Astrophysics Data System (ADS)
Rose, B. E. J.
2017-12-01
CLIMLAB is a Python-based software toolkit for interactive, process-oriented climate modeling for use in education and research. It is motivated by the need for simpler tools and more reproducible workflows with which to "fill in the gaps" between blackboard-level theory and the results of comprehensive climate models. With CLIMLAB you can interactively mix and match physical model components, or combine simpler process models together into a more comprehensive model. I use CLIMLAB in the classroom to put models in the hands of students (undergraduate and graduate), and emphasize a hierarchical, process-oriented approach to understanding the key emergent properties of the climate system. CLIMLAB is equally a tool for climate research, where the same needs exist for more robust, process-based understanding and reproducible computational results. I will give an overview of CLIMLAB and an update on recent developments, including: a full-featured, well-documented, interactive implementation of a widely used radiation model (RRTM); packaging with conda-forge for compiler-free (and hassle-free!) installation on Mac, Windows, and Linux; interfacing with xarray for i/o and graphics with gridded model data; and a rich and growing collection of examples and self-computing lecture notes in Jupyter notebook format.
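To give a flavour of the mix-and-match workflow, a minimal sketch of an interactive session is shown below; the class and method names follow the CLIMLAB documentation as I recall it and may differ between versions.

    # Minimal sketch: build a zonal-mean energy balance model and integrate it.
    # Class and method names may vary by CLIMLAB version.
    import climlab

    ebm = climlab.EBM(num_lat=90)     # diffusive energy balance model on a latitude grid
    print(ebm)                        # lists the sub-processes that make up the model

    ebm.integrate_years(5)            # step the model forward to (near) equilibrium
    print(ebm.global_mean_temperature())   # diagnose the global mean surface temperature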
Sleep patterns and match performance in elite Australian basketball athletes.
Staunton, Craig; Gordon, Brett; Custovic, Edhem; Stanger, Jonathan; Kingsley, Michael
2017-08-01
To assess sleep patterns and associations between sleep and match performance in elite Australian female basketball players. Prospective cohort study. Seventeen elite female basketball players were monitored across two consecutive in-season competitions (30 weeks). Total sleep time and sleep efficiency were determined using triaxial accelerometers for Baseline, Pre-match, Match-day and Post-match timings. Match performance was determined using the basketball efficiency statistic (EFF). The effects of match schedule (Regular versus Double-Header; Home versus Away) and sleep on EFF were assessed. The Double-Header condition changed the pattern of sleep when compared with the Regular condition (F (3,48) =3.763, P=0.017), where total sleep time Post-match was 11% less for Double-Header (mean±SD; 7.2±1.4h) compared with Regular (8.0±1.3h; P=0.007). Total sleep time for Double-Header was greater Pre-match (8.2±1.7h) compared with Baseline (7.1±1.6h; P=0.022) and Match-day (7.3±1.5h; P=0.007). Small correlations existed between sleep metrics at Pre-match and EFF for pooled data (r=-0.39 to -0.22; P≥0.238). Relationships between total sleep time and EFF ranged from moderate negative to large positive correlations for individual players (r=-0.37 to 0.62) and reached significance for one player (r=0.60; P=0.025). Match schedule can affect the sleep patterns of elite female basketball players. A large degree of inter-individual variability existed in the relationship between sleep and match performance; nevertheless, sleep monitoring might assist in the optimisation of performance for some athletes. Copyright © 2017 Sports Medicine Australia. Published by Elsevier Ltd. All rights reserved.
Secure and Efficient k-NN Queries
Asif, Hafiz; Vaidya, Jaideep; Shafiq, Basit; Adam, Nabil
2017-01-01
Given the morass of available data, ranking and best match queries are often used to find records of interest. As such, k-NN queries, which give the k closest matches to a query point, are of particular interest, and have many applications. We study this problem in the context of the financial sector, wherein an investment portfolio database is queried for matching portfolios. Given the sensitivity of the information involved, our key contribution is to develop a secure k-NN computation protocol that can enable the computation of k-NN queries in a distributed multi-party environment while taking domain semantics into account. The experimental results show that the proposed protocols are extremely efficient. PMID:29218333
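Stripped of the cryptographic protections that are the paper's contribution, the underlying query is a standard k-NN search; a plain, non-secure sketch on synthetic portfolio vectors looks like this:

    # The underlying (non-secure) k-NN query on synthetic portfolio feature vectors.
    # The paper's contribution is running this computation securely across parties, not shown here.
    import numpy as np
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(4)
    portfolios = rng.random((1000, 8))        # 1000 portfolios, 8 numeric attributes each
    query = rng.random((1, 8))                # the investor's target portfolio

    nn = NearestNeighbors(n_neighbors=5, metric="euclidean").fit(portfolios)
    dist, idx = nn.kneighbors(query)
    print("5 closest portfolios:", idx.ravel(), "distances:", np.round(dist.ravel(), 3))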
Open solutions to distributed control in ground tracking stations
NASA Technical Reports Server (NTRS)
Heuser, William Randy
1994-01-01
The advent of high speed local area networks has made it possible to interconnect small, powerful computers to function together as a single large computer. Today, distributed computer systems are the new paradigm for large scale computing systems. However, the communications provided by the local area network are only one part of the solution. The services and protocols used by the application programs to communicate across the network are as indispensable as the local area network. The selection of services and protocols that do not match the system requirements will limit the capabilities, performance, and expansion of the system. Proprietary solutions are available but are usually limited to a select set of equipment. However, there are two solutions based on 'open' standards. The question that must be answered is 'which one is the best one for my job?' This paper examines a model for tracking stations and their requirements for interprocessor communications in the next century. The model and requirements are matched with the model and services provided by the five different software architectures and supporting protocol solutions. Several key services are examined in detail to determine which services and protocols most closely match the requirements for the tracking station environment. The study reveals that the protocols are tailored to the problem domains for which they were originally designed. Further, the study reveals that the process control model is the closest match to the tracking station model.
Constructed-Response Matching to Sample and Spelling Instruction.
ERIC Educational Resources Information Center
Dube, William V.; And Others
1991-01-01
This paper describes a computer-based spelling program grounded in programed instructional techniques and using constructed-response matching-to-sample procedures. Following use of the program, two mentally retarded men successfully spelled previously misspelled words. (JDD)
Shape-matching soft mechanical metamaterials.
Mirzaali, M J; Janbaz, S; Strano, M; Vergani, L; Zadpoor, A A
2018-01-17
Architectured materials with rationally designed geometries could be used to create mechanical metamaterials with unprecedented or rare properties and functionalities. Here, we introduce "shape-matching" metamaterials where the geometry of cellular structures comprising auxetic and conventional unit cells is designed so as to achieve a pre-defined shape upon deformation. We used computational models to forward-map the space of planar shapes to the space of geometrical designs. The validity of the underlying computational models was first demonstrated by comparing their predictions with experimental observations on specimens fabricated with indirect additive manufacturing. The forward-maps were then used to devise the geometry of cellular structures that approximate the arbitrary shapes described by random Fourier's series. Finally, we show that the presented metamaterials could match the contours of three real objects including a scapula model, a pumpkin, and a Delft Blue pottery piece. Shape-matching materials have potential applications in soft robotics and wearable (medical) devices.
Mathematics skills in good readers with hydrocephalus.
Barnes, Marcia A; Pengelly, Sarah; Dennis, Maureen; Wilkinson, Margaret; Rogers, Tracey; Faulkner, Heather
2002-01-01
Children with hydrocephalus have poor math skills. We investigated the nature of their arithmetic computation errors by comparing written subtraction errors in good readers with hydrocephalus, typically developing good readers of the same age, and younger children matched for math level to the children with hydrocephalus. Children with hydrocephalus made more procedural errors (although not more fact retrieval or visual-spatial errors) than age-matched controls; they made the same number of procedural errors as younger, math-level matched children. We also investigated a broad range of math abilities, and found that children with hydrocephalus performed more poorly than age-matched controls on tests of geometry and applied math skills such as estimation and problem solving. Computation deficits in children with hydrocephalus reflect delayed development of procedural knowledge. Problems in specific math domains such as geometry and applied math, were associated with deficits in constituent cognitive skills such as visual spatial competence, memory, and general knowledge.
Soft evolution of multi-jet final states
Gerwick, Erik; Schumann, Steffen; Höche, Stefan; ...
2015-02-16
We present a new framework for computing resummed and matched distributions in processes with many hard QCD jets. The intricate color structure of soft gluon emission at large angles renders resummed calculations highly non-trivial in this case. We automate all ingredients necessary for the color evolution of the soft function at next-to-leading-logarithmic accuracy, namely the selection of the color bases and the projections of color operators and Born amplitudes onto those bases. Explicit results for all QCD processes with up to 2 → 5 partons are given. We also devise a new tree-level matching scheme for resummed calculations which exploits a quasi-local subtraction based on the Catani-Seymour dipole formalism. We implement both resummation and matching in the Sherpa event generator. As a proof of concept, we compute the resummed and matched transverse-thrust distribution for hadronic collisions.
A robust fingerprint matching algorithm based on compatibility of star structures
NASA Astrophysics Data System (ADS)
Cao, Jia; Feng, Jufu
2009-10-01
In fingerprint verification or identification systems, most minutiae-based matching algorithms suffer from non-linear distortion and from missing or spurious minutiae. Local structures such as triangles or k-nearest-neighbor structures are widely used to reduce the impact of non-linear distortion, but remain sensitive to missing and spurious minutiae. In our proposed method, a star structure is used to represent the local structure. A star structure contains a variable number of minutiae and is therefore more robust to missing and spurious minutiae. Our method consists of four steps: 1) constructing star structures at the minutia level; 2) computing a similarity score for each structure pair and eliminating impostor matched pairs with low scores (as it is generally assumed that only linear distortion occurs within a local area, the similarity is defined by rotation and shifting); 3) voting for the remaining matched pairs according to the compatibility between them, and eliminating impostor matched pairs that gain few votes (the concept of compatibility was first introduced by Yansong Feng [4], with a definition based only on triangles; we define compatibility for star structures to fit our proposed algorithm); 4) computing the matching score based on the number of matched structures and their voting scores. The score also reflects the fact that minutiae matching in denser areas should receive a higher score. Experiments on FVC 2004 show both the effectiveness and the efficiency of our method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Choonsik; Kim, Kwang Pyo; Long, Daniel
2011-03-15
Purpose: To develop a computed tomography (CT) organ dose estimation method designed to readily provide organ doses in a reference adult male and female for different scan ranges, and to investigate the degree to which existing commercial programs can reasonably match organ doses defined in these more anatomically realistic adult hybrid phantoms. Methods: The x-ray fan beam in the SOMATOM Sensation 16 multidetector CT scanner was simulated within the Monte Carlo radiation transport code MCNPX2.6. The simulated CT scanner model was validated through comparison with experimentally measured lateral free-in-air dose profiles and computed tomography dose index (CTDI) values. The reference adult male and female hybrid phantoms were coupled with the established CT scanner model following arm removal to simulate clinical head and other body region scans. A set of organ dose matrices were calculated for a series of consecutive axial scans ranging from the top of the head to the bottom of the phantoms with a beam thickness of 10 mm and tube potentials of 80, 100, and 120 kVp. The organ doses for head, chest, and abdomen/pelvis examinations were calculated based on the organ dose matrices and compared to those obtained from two commercial programs, CT-EXPO and CTDOSIMETRY. Organ dose calculations were repeated for an adult stylized phantom by using the same simulation method used for the adult hybrid phantom. Results: Comparisons of both lateral free-in-air dose profiles and CTDI values from experimental measurement with the Monte Carlo simulations showed good agreement to within 9%. Organ doses for head, chest, and abdomen/pelvis scans reported in the commercial programs exceeded those from the Monte Carlo calculations in both the hybrid and stylized phantoms in this study, sometimes by orders of magnitude. Conclusions: The organ dose estimation method and dose matrices established in this study readily provide organ doses for a reference adult male and female for different CT scan ranges and technical parameters. Organ doses from existing commercial programs do not reasonably match organ doses calculated for the hybrid phantoms due to differences in phantom anatomy, as well as differences in organ dose scaling parameters. The organ dose matrices developed in this study will be extended to cover different technical parameters, CT scanner models, and various age groups.
A coarse to fine minutiae-based latent palmprint matching.
Liu, Eryun; Jain, Anil K; Tian, Jie
2013-10-01
With the availability of live-scan palmprint technology, high resolution palmprint recognition has started to receive significant attention in forensics and law enforcement. In forensic applications, latent palmprints provide critical evidence as it is estimated that about 30 percent of the latents recovered at crime scenes are those of palms. Most of the available high-resolution palmprint matching algorithms essentially follow the minutiae-based fingerprint matching strategy. Considering the large number of minutiae (about 1,000 minutiae in a full palmprint compared to about 100 minutiae in a rolled fingerprint) and the large area of foreground region in full palmprints, novel strategies need to be developed for efficient and robust latent palmprint matching. In this paper, a coarse to fine matching strategy based on minutiae clustering and minutiae match propagation is designed specifically for palmprint matching. To deal with the large number of minutiae, a local feature-based minutiae clustering algorithm is designed to cluster minutiae into several groups such that minutiae belonging to the same group have similar local characteristics. The coarse matching is then performed within each cluster to establish initial minutiae correspondences between two palmprints. Starting with each initial correspondence, a minutiae match propagation algorithm searches for mated minutiae in the full palmprint. The proposed palmprint matching algorithm has been evaluated on a latent-to-full palmprint database consisting of 446 latents and 12,489 background full prints. The matching results show a rank-1 identification accuracy of 79.4 percent, which is significantly higher than the 60.8 percent identification accuracy of a state-of-the-art latent palmprint matching algorithm on the same latent database. The average computation time of our algorithm for a single latent-to-full match is about 141 ms for a genuine match and 50 ms for an impostor match, on a Windows XP desktop system with 2.2-GHz CPU and 1.00-GB RAM. The computation time of our algorithm is an order of magnitude faster than a previously published state-of-the-art algorithm.
Euler Flow Computations on Non-Matching Unstructured Meshes
NASA Technical Reports Server (NTRS)
Gumaste, Udayan
1999-01-01
Advanced fluid solvers for predicting aerodynamic performance with coupled treatment of multiple fields are described. The interaction between the fluid and structural components in the bladed regions of the engine is investigated with respect to known blade failures caused by either flutter or forced vibrations. Methods are developed to describe aeroelastic phenomena for internal flows in turbomachinery by accounting for the increased geometric complexity, mutual interaction between adjacent structural components and presence of thermal and geometric loading. The computer code developed solves the full three-dimensional aeroelastic problem of a stage. The results obtained show that flow computations can be performed on non-matching finite-volume unstructured meshes with second order spatial accuracy.
Eurogrid: a new glideinWMS based portal for CDF data analysis
NASA Astrophysics Data System (ADS)
Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.
2012-12-01
The CDF experiment at Fermilab ended its Run-II phase in September 2011 after 11 years of operations and 10 fb-1 of collected data. The CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals. At the beginning of 2011 a new portal, Eurogrid, was developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany and France, are accessed through a common interface. The goal of this project is to develop a portal that is easy to integrate into the existing CDF computing model, completely transparent to the user and requiring a minimum amount of maintenance support by the CDF collaboration. In this paper we review the implementation of this new portal and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein-based Workload Management System (WMS) that works on top of Condor. As the CDF CAF is based on Condor, the choice of the glideinWMS software was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. Official since June 2011, Eurogrid effectively complements and supports CDF computing resources, offering an optimal solution for the future in terms of required manpower for administration, support and development.
Real-time non-rigid target tracking for ultrasound-guided clinical interventions
NASA Astrophysics Data System (ADS)
Zachiu, C.; Ries, M.; Ramaekers, P.; Guey, J.-L.; Moonen, C. T. W.; de Senneville, B. Denis
2017-10-01
Biological motion is a problem for non- or mini-invasive interventions when conducted in mobile/deformable organs due to the targeted pathology moving/deforming with the organ. This may lead to high miss rates and/or incomplete treatment of the pathology. Therefore, real-time tracking of the target anatomy during the intervention would be beneficial for such applications. Since the aforementioned interventions are often conducted under B-mode ultrasound (US) guidance, target tracking can be achieved via image registration, by comparing the acquired US images to a separate image established as positional reference. However, such US images are intrinsically altered by speckle noise, introducing incoherent gray-level intensity variations. This may prove problematic for existing intensity-based registration methods. In the current study we address US-based target tracking by employing the recently proposed EVolution registration algorithm. The method is, by construction, robust to transient gray-level intensities. Instead of directly matching image intensities, EVolution aligns similar contrast patterns in the images. Moreover, the displacement is computed by evaluating a matching criterion for image sub-regions rather than on a point-by-point basis, which typically provides more robust motion estimates. However, unlike similar previously published approaches, which assume rigid displacements in the image sub-regions, the EVolution algorithm integrates the matching criterion in a global functional, allowing the estimation of an elastic dense deformation. The approach was validated for soft tissue tracking under free-breathing conditions on the abdomen of seven healthy volunteers. Contact echography was performed on all volunteers, while three of the volunteers also underwent standoff echography. Each of the two modalities is predominantly specific to a particular type of non- or mini-invasive clinical intervention. The method demonstrated on average an accuracy of ˜1.5 mm and submillimeter precision. This, together with a computational performance of 20 images per second make the proposed method an attractive solution for real-time target tracking during US-guided clinical interventions.
Topics in the Detection of Gravitational Waves from Compact Binary Inspirals
NASA Astrophysics Data System (ADS)
Kapadia, Shasvath Jagat
Orbiting compact binaries - such as binary black holes, binary neutron stars and neutron star-black hole binaries - are among the most promising sources of gravitational waves observable by ground-based interferometric detectors. Despite numerous sophisticated engineering techniques, the gravitational wave signals will be buried deep within noise generated by various instrumental and environmental processes, and need to be extracted via a signal processing technique referred to as matched filtering. Matched filtering requires large banks of signal templates that are faithful representations of the true gravitational waveforms produced by astrophysical binaries. The accurate and efficient production of templates is thus crucial to the success of signal processing and data analysis. To that end, the dissertation presents a numerical technique that calibrates existing analytical (Post-Newtonian) waveforms, which are relatively inexpensive, to more accurate fiducial waveforms that are computationally expensive to generate. The resulting waveform family is significantly more accurate than the analytical waveforms, without incurring additional computational costs of production. Certain kinds of transient background noise artefacts, called "glitches", can masquerade as gravitational wave signals for short durations and throw off the matched-filter algorithm. Distinguishing glitches from true gravitational wave signals is a highly non-trivial exercise in data analysis which has been attempted with varying degrees of success. We present here a machine-learning based approach that exploits the various attributes of glitches and signals within detector data to provide a classification scheme that is a significant improvement over previous methods. The dissertation concludes by investigating the possibility of detecting a non-linear DC imprint, called the Christodoulou memory, produced in the arms of ground-based interferometers by the recently detected gravitational waves. The memory, which is even smaller in amplitude than the primary (detected) gravitational waves, will almost certainly not be seen in the current detection event. Nevertheless, future space-based detectors will likely be sensitive enough to observe the memory.
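To make the matched-filtering step concrete (this is a generic illustration, not the LIGO pipeline or the dissertation's calibration technique), a minimal time-domain sketch for white noise follows: the filter output is the correlation of the data with a unit-norm template, and its peak gives the signal-to-noise ratio. The chirp-like template and injection amplitude are arbitrary stand-ins.

```python
# Minimal matched-filter sketch for white noise: correlate the data with a
# unit-norm template and report the peak signal-to-noise ratio (SNR).
import numpy as np

def matched_filter_snr(data: np.ndarray, template: np.ndarray, sigma: float) -> float:
    h = template / np.sqrt(np.sum(template**2))   # unit-norm template
    corr = np.correlate(data, h, mode="valid")    # sliding correlation
    return float(np.max(np.abs(corr)) / sigma)    # peak SNR for noise std sigma

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 4096)
template = np.sin(2 * np.pi * 60.0 * t**2) * np.exp(-4.0 * (1.0 - t))  # toy chirp-like signal
data = rng.normal(0.0, 1.0, 8192)                 # white noise, sigma = 1
data[2000:2000 + template.size] += 0.5 * template # inject a weak signal
print(matched_filter_snr(data, template, sigma=1.0))
```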
Tempest: GPU-CPU computing for high-throughput database spectral matching.
Milloy, Jeffrey A; Faherty, Brendan K; Gerber, Scott A
2012-07-06
Modern mass spectrometers are now capable of producing hundreds of thousands of tandem (MS/MS) spectra per experiment, making the translation of these fragmentation spectra into peptide matches a common bottleneck in proteomics research. When coupled with experimental designs that enrich for post-translational modifications such as phosphorylation and/or include isotopically labeled amino acids for quantification, additional burdens are placed on this computational infrastructure by shotgun sequencing. To address this issue, we have developed a new database searching program that utilizes the massively parallel compute capabilities of a graphical processing unit (GPU) to produce peptide spectral matches in a very high throughput fashion. Our program, named Tempest, combines efficient database digestion and MS/MS spectral indexing on a CPU with fast similarity scoring on a GPU. In our implementation, the entire similarity score, including the generation of full theoretical peptide candidate fragmentation spectra and its comparison to experimental spectra, is conducted on the GPU. Although Tempest uses the classical SEQUEST XCorr score as a primary metric for evaluating similarity for spectra collected at unit resolution, we have developed a new "Accelerated Score" for MS/MS spectra collected at high resolution that is based on a computationally inexpensive dot product but exhibits scoring accuracy similar to that of the classical XCorr. In our experience, Tempest provides compute-cluster level performance in an affordable desktop computer.
NASA Astrophysics Data System (ADS)
Tibi, R.; Young, C. J.; Gonzales, A.; Ballard, S.; Encarnacao, A. V.
2016-12-01
The matched filtering technique involving the cross-correlation of a waveform of interest with archived signals from a template library has proven to be a powerful tool for detecting events in regions with repeating seismicity. However, waveform correlation is computationally expensive, and therefore impractical for large template sets unless dedicated distributed computing hardware and software are used. In this study, we introduce an Approximate Nearest Neighbor (ANN) approach that enables the use of very large template libraries for waveform correlation without requiring a complex distributed computing system. Our method begins with a projection into a reduced dimensionality space based on correlation with a randomized subset of the full template archive. Searching for a specified number of nearest neighbors is accomplished by using randomized K-dimensional trees. We used the approach to search for matches to each of 2700 analyst-reviewed signal detections reported for May 2010 for the IMS station MKAR. The template library in this case consists of a dataset of more than 200,000 analyst-reviewed signal detections for the same station from 2002-2014 (excluding May 2010). Of these signal detections, 60% are teleseismic first P, and 15% regional phases (Pn, Pg, Sn, and Lg). The analyses performed on a standard desktop computer show that the proposed approach performs the search of the large template libraries about 20 times faster than the standard full linear search, while achieving recall rates greater than 80%, with the recall rate increasing for higher correlation values. To decide whether to confirm a match, we use a hybrid method involving a cluster approach for queries with two or more matches, and a correlation score for single matches. Of the signal detections that passed our confirmation process, 52% were teleseismic first P, and 30% were regional phases.
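A toy sketch of the reduced-dimensionality search described above follows, under the assumption that each waveform is represented by its correlations with a small random subset of templates and that neighbors are then found with a k-d tree; scipy's cKDTree stands in here for the randomized k-d trees used in the study, and the synthetic data are illustrative only.

```python
# Sketch: approximate nearest-neighbor template search by (1) projecting each
# waveform onto its correlations with a small random subset of templates and
# (2) querying a k-d tree in that reduced space.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(2)
n_templates, n_samples, n_proj = 5000, 256, 16

templates = rng.normal(size=(n_templates, n_samples))
templates /= np.linalg.norm(templates, axis=1, keepdims=True)   # unit-norm waveforms

proj_idx = rng.choice(n_templates, size=n_proj, replace=False)  # random projection subset
reduced = templates @ templates[proj_idx].T                     # correlation coordinates
tree = cKDTree(reduced)

query = templates[1234] + 0.1 * rng.normal(size=n_samples)      # noisy copy of a template
query /= np.linalg.norm(query)
_, neighbors = tree.query(query @ templates[proj_idx].T, k=5)   # candidate matches
print(neighbors)                                                # should include index 1234
```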
On Computing Breakpoint Distances for Genomes with Duplicate Genes.
Shao, Mingfu; Moret, Bernard M E
2017-06-01
A fundamental problem in comparative genomics is to compute the distance between two genomes in terms of its higher level organization (given by genes or syntenic blocks). For two genomes without duplicate genes, we can easily define (and almost always efficiently compute) a variety of distance measures, but the problem is NP-hard under most models when genomes contain duplicate genes. To tackle duplicate genes, three formulations (exemplar, maximum matching, and any matching) have been proposed, all of which aim to build a matching between homologous genes so as to minimize some distance measure. Of the many distance measures, the breakpoint distance (the number of nonconserved adjacencies) was the first one to be studied and remains of significant interest because of its simplicity and model-free property. The three breakpoint distance problems corresponding to the three formulations have been widely studied. Although we provided last year a solution for the exemplar problem that runs very fast on full genomes, computing optimal solutions for the other two problems has remained challenging. In this article, we describe very fast, exact algorithms for these two problems. Our algorithms rely on a compact integer-linear program that we further simplify by developing an algorithm to remove variables, based on new results on the structure of adjacencies and matchings. Through extensive experiments using both simulations and biological data sets, we show that our algorithms run very fast (in seconds) on mammalian genomes and scale well beyond. We also apply these algorithms (as well as the classic orthology tool MSOAR) to create orthology assignment, then compare their quality in terms of both accuracy and coverage. We find that our algorithm for the "any matching" formulation significantly outperforms other methods in terms of accuracy while achieving nearly maximum coverage.
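For genomes without duplicate genes the breakpoint distance is easy to compute directly; a minimal sketch for unsigned gene orders is given below (gene signs, telomeres, and the duplicate genes that make the general problem NP-hard are deliberately not handled).

```python
# Breakpoint distance for two genomes without duplicate genes: count the
# adjacencies of genome A that are not conserved (in either orientation) in B.
def breakpoint_distance(a: list, b: list) -> int:
    adj_b = {frozenset(pair) for pair in zip(b, b[1:])}   # unordered adjacencies of B
    return sum(1 for pair in zip(a, a[1:]) if frozenset(pair) not in adj_b)

genome_a = [1, 2, 3, 4, 5, 6]
genome_b = [1, 2, 5, 4, 3, 6]
print(breakpoint_distance(genome_a, genome_b))  # 2 breakpoints: (2,3) and (5,6)
```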
Enhanced heterogeneous ice nucleation by special surface geometry
Bi, Yuanfei; Cao, Boxiao; Li, Tianshu
2017-01-01
The freezing of water typically proceeds through impurity-mediated heterogeneous nucleation. Although non-planar geometry generically exists on the surfaces of ice nucleation centres, its role in nucleation remains poorly understood. Here we show that an atomically sharp, concave wedge can further promote ice nucleation with special wedge geometries. Our molecular analysis shows that significant enhancements of ice nucleation can emerge both when the geometry of a wedge matches the ice lattice and when such lattice match does not exist. In particular, a 45° wedge is found to greatly enhance ice nucleation by facilitating the formation of special topological defects that consequently catalyse the growth of regular ice. Our study not only highlights the active role of defects in nucleation but also suggests that the traditional concept of lattice match between a nucleation centre and crystalline lattice should be extended to include a broader match with metastable, non-crystalline structural motifs. PMID:28513603
Stephenson, Jennifer
2009-03-01
Communication symbols for students with severe intellectual disabilities often take the form of computer-generated line drawings. This study investigated the effects of the match between color and shape of line drawings and the objects they represented on drawing recognition and use. The match or non-match between color and shape of the objects and drawings did not have an effect on participants' ability to match drawings to objects, or to use drawings to make choices.
Wafer-Fused Orientation-Patterned GaAs
2008-02-13
Subject terms: orientation-patterned gallium arsenide, hydride vapor phase epitaxy, quasi-phase-matching, nonlinear frequency conversion. The report addresses quasi-phase-matched (QPM) nonlinear frequency conversion aimed at utilizing existing industrial foundries, and cites second harmonic generation of a CO2 laser using a thick quasi-phase-matched GaAs layer grown by hydride vapour phase epitaxy (E. Lallier).
Federal Register 2010, 2011, 2012, 2013, 2014
2012-07-25
... pension payment data from its system of records (SOR) entitled the ``Compensation, Pension, Education, and... monthly. The actual match will take place approximately during the first week of every month. E. Inclusive...
NASA Astrophysics Data System (ADS)
Xia, Y.; Tian, J.; d'Angelo, P.; Reinartz, P.
2018-05-01
3D reconstruction of plants is hard to implement, as the complex leaf distribution greatly increases the difficulty of dense matching. Semi-Global Matching has been successfully applied to recover the depth information of a scene, but may perform variably when different matching cost algorithms are used. In this paper, two matching cost computation algorithms, the Census transform and an algorithm using a convolutional neural network, are tested for plant reconstruction based on Semi-Global Matching. High resolution close-range photogrammetric images from a handheld camera are used for the experiment. The disparity maps generated with the two selected matching cost methods are comparable and of acceptable quality, which shows the good performance of the Census transform and the potential of neural networks to improve dense matching.
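A small sketch of the Census-based matching cost referenced above: each pixel is encoded by comparing it with its neighborhood, and the cost between two pixels is the Hamming distance of their Census bit strings. Window size, border handling, and the synthetic wrap-around disparity below are simplifications, not the paper's setup.

```python
# Census transform matching cost: encode each pixel by whether its neighbours
# are darker than it, then compare encodings by Hamming distance.
import numpy as np

def census_transform(img: np.ndarray, radius: int = 2) -> np.ndarray:
    bits = []
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            if dy == 0 and dx == 0:
                continue
            shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
            bits.append((shifted < img).astype(np.uint8))
    return np.stack(bits, axis=-1)                    # (h, w, n_bits) binary descriptor

def census_cost(left: np.ndarray, right: np.ndarray, disparity: int) -> np.ndarray:
    cl = census_transform(left)
    cr = np.roll(census_transform(right), -disparity, axis=1)  # align right descriptors to left
    return np.count_nonzero(cl != cr, axis=-1)                 # per-pixel Hamming distance

left = np.random.default_rng(3).random((64, 64))
right = np.roll(left, 4, axis=1)                               # synthetic 4-pixel disparity
print(census_cost(left, right, disparity=4).mean())            # zero at the true disparity here
```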
A fuzzy structural matching scheme for space robotics vision
NASA Technical Reports Server (NTRS)
Naka, Masao; Yamamoto, Hiromichi; Homma, Khozo; Iwata, Yoshitaka
1994-01-01
In this paper, we propose a new fuzzy structural matching scheme for space stereo vision which is based on the fuzzy properties of regions of images and effectively reduces the computational burden in the following low level matching process. Three dimensional distance images of a space truss structural model are estimated using this scheme from stereo images sensed by Charge Coupled Device (CCD) TV cameras.
The Effects of Study Tasks in a Computer-Based Chemistry Learning Environment
NASA Astrophysics Data System (ADS)
Urhahne, Detlef; Nick, Sabine; Poepping, Anna Christin; Schulz, Sarah Jayne
2013-12-01
The present study examines the effects of different study tasks on the acquisition of knowledge about acids and bases in a computer-based learning environment. Three different task formats were selected to create three treatment conditions: learning with gap-fill and matching tasks, learning with multiple-choice tasks, and learning only from text and figures without any additional tasks. Participants were 196 ninth-grade students who learned with a self-developed multimedia program in a pretest-posttest control group design. Research results reveal that gap-fill and matching tasks were most effective in promoting knowledge acquisition, followed by multiple-choice tasks, and no tasks at all. The findings are in line with previous research on this topic. The effects can possibly be explained by the generation-recognition model, which predicts that gap-fill and matching tasks trigger more encompassing learning processes than multiple-choice tasks. It is concluded that instructional designers should incorporate more challenging study tasks for enhancing the effectiveness of computer-based learning environments.
NASA Astrophysics Data System (ADS)
Hummels, Cameron B.; Bryan, Greg L.; Smith, Britton D.; Turk, Matthew J.
2013-04-01
Cosmological hydrodynamical simulations of galaxy evolution are increasingly able to produce realistic galaxies, but the largest hurdle remaining is in constructing subgrid models that accurately describe the behaviour of stellar feedback. As an alternate way to test and calibrate such models, we propose to focus on the circumgalactic medium (CGM). To do so, we generate a suite of adaptive mesh refinement simulations for a Milky-Way-massed galaxy run to z = 0, systematically varying the feedback implementation. We then post-process the simulation data to compute the absorbing column density for a wide range of common atomic absorbers throughout the galactic halo, including H I, Mg II, Si II, Si III, Si IV, C IV, N V, O VI and O VII. The radial profiles of these atomic column densities are compared against several quasar absorption line studies to determine if one feedback prescription is favoured. We find that although our models match some of the observations (specifically those ions with lower ionization strengths), it is particularly difficult to match O VI observations. There is some indication that the models with increased feedback intensity are better matches. We demonstrate that sufficient metals exist in these haloes to reproduce the observed column density distribution in principle, but the simulated CGM lacks significant multiphase substructure and is generally too hot. Furthermore, we demonstrate the failings of inflow-only models (without energetic feedback) at populating the CGM with adequate metals to match observations even in the presence of multiphase structure. Additionally, we briefly investigate the evolution of the CGM from z = 3 to present. Overall, we find that quasar absorption line observations of the gas around galaxies provide a new and important constraint on feedback models.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zheng, Y; Yuan, J; Geis, P
2016-06-15
Purpose: To verify the similarity of the dosimetric characteristics between two Elekta linear accelerators (linacs) in order to treat patients interchangeably on these two machines without re-planning. Methods: To investigate the viability of matching the 6 MV flattened beam on an existing linac (Elekta Synergy with Agility head) with a recently installed new linac (Elekta Versa HD), percent depth doses (PDD), flatness, symmetry, and output factors were compared for both machines. To validate the beam matching between machines, we carried out two approaches to cross-check the dosimetric equivalence: 1) the prior treatment plans were re-computed based on the newly built Versa HD treatment planning system (TPS) model without changing the beam control points; 2) the same plans were delivered on both machines and the radiation dose measurements on a MapCheck2 were compared with TPS calculations. Three VMAT plans (head and neck, lung, and prostate) were used in the study. Results: The difference between the PDDs for the 10×10 cm² field at all depths was less than 0.8%. The difference of flatness and symmetry for the 30×30 cm² field was less than 0.8%, and the measured output factors varied by less than 1% for each field size ranging from 2×2 cm² to 40×40 cm². For the same plans, the maximum difference of the two calculated dose distributions was 2% of prescription. For the QA measurements, the gamma index passing rates were above 99% for 3%/3mm criteria with 10% threshold for all three clinical plans. Conclusion: A beam modality matching between two Elekta linacs is demonstrated with a cross-checking approach.
ERIC Educational Resources Information Center
Willing, Kathlene R.; Girard, Suzanne
Intended for young children just becoming familiar with computers, this naming book introduces and reinforces new computer vocabulary and concepts. The 20 words are presented alphabetically, along with illustrations, providing room for different activities in which children can match and name the pictures and words. The 20 vocabulary items are…
A Fast Approach to Automatic Detection of Brain Lesions
Koley, Subhranil; Chakraborty, Chandan; Mainero, Caterina; Fischl, Bruce; Aganj, Iman
2017-01-01
Template matching is a popular approach to computer-aided detection of brain lesions from magnetic resonance (MR) images. The outcomes are often sufficient for localizing lesions and assisting clinicians in diagnosis. However, processing large MR volumes with three-dimensional (3D) templates is demanding in terms of computational resources, hence the importance of the reduction of computational complexity of template matching, particularly in situations in which time is crucial (e.g. emergent stroke). In view of this, we make use of 3D Gaussian templates with varying radii and propose a new method to compute the normalized cross-correlation coefficient as a similarity metric between the MR volume and the template to detect brain lesions. Contrary to the conventional fast Fourier transform (FFT) based approach, whose runtime grows as O(N logN) with the number of voxels, the proposed method computes the cross-correlation in O(N). We show through our experiments that the proposed method outperforms the FFT approach in terms of computational time, and retains comparable accuracy. PMID:29082383
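The key O(N) ingredient of the approach described above is that correlating a volume with a Gaussian template is just separable Gaussian filtering, so the pieces of a normalized match score can be assembled from filtered copies of the image and its square. The sketch below is a simplification of that idea, not the paper's exact coefficient; the smoothing scales and synthetic "lesion" are assumptions for illustration.

```python
# Simplified Gaussian-template match score built from separable O(N) filtering.
import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_match_score(volume: np.ndarray, sigma: float) -> np.ndarray:
    """Rough per-voxel match score against a Gaussian blob of width sigma."""
    local_corr = gaussian_filter(volume, sigma)         # ~ Gaussian-weighted local sum
    local_mean = gaussian_filter(volume, 3 * sigma)     # broad local mean estimate
    local_sq = gaussian_filter(volume**2, 3 * sigma)    # broad local second moment
    local_std = np.sqrt(np.maximum(local_sq - local_mean**2, 1e-12))
    return (local_corr - local_mean) / local_std        # contrast-normalized response

rng = np.random.default_rng(4)
vol = rng.normal(0.0, 0.2, (48, 48, 48))
zz, yy, xx = np.mgrid[:48, :48, :48]
vol += np.exp(-((zz - 24)**2 + (yy - 24)**2 + (xx - 24)**2) / (2 * 3.0**2))  # synthetic lesion
score = gaussian_match_score(vol, sigma=3.0)
print(np.unravel_index(np.argmax(score), score.shape))  # peaks near (24, 24, 24)
```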
Chromotomography for a rotating-prism instrument using backprojection, then filtering.
Deming, Ross W
2006-08-01
A simple closed-form solution is derived for reconstructing a 3D spatial-chromatic image cube from a set of chromatically dispersed 2D image frames. The algorithm is tailored for a particular instrument in which the dispersion element is a matching set of mechanically rotated direct vision prisms positioned between a lens and a focal plane array. By using a linear operator formalism to derive the Tikhonov-regularized pseudoinverse operator, it is found that the unique minimum-norm solution is obtained by applying the adjoint operator, followed by 1D filtering with respect to the chromatic variable. Thus the filtering and backprojection (adjoint) steps are applied in reverse order relative to an existing method. Computational efficiency is provided by use of the fast Fourier transform in the filtering step.
Discrimination of malignant lymphomas and leukemia using Radon transform based-higher order spectra
NASA Astrophysics Data System (ADS)
Luo, Yi; Celenk, Mehmet; Bejai, Prashanth
2006-03-01
A new algorithm that can be used to automatically recognize and classify malignant lymphomas and leukemia is proposed in this paper. The algorithm utilizes morphological watersheds to obtain cell boundaries from cell images and isolate the cells from the surrounding background. The areas of cells are extracted from cell images after background subtraction. The Radon transform and higher-order spectra (HOS) analysis are utilized as an image processing tool to generate class feature vectors for different cell types and to extract the feature vectors of test cells. The test cells' feature vectors are then compared with the known class feature vectors for a possible match by computing Euclidean distances. The cell in question is classified as belonging to one of the existing cell classes in the least Euclidean distance sense.
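The final classification step described above is a nearest-class decision in feature space; a minimal sketch follows, with the Radon/HOS feature extraction not reproduced and the feature vectors below used purely as stand-ins.

```python
# Nearest-class assignment by Euclidean distance between a test feature vector
# and the known class feature vectors (stand-ins for Radon/HOS descriptors).
import numpy as np

def classify(test_vector: np.ndarray, class_vectors: dict) -> str:
    distances = {name: np.linalg.norm(test_vector - vec) for name, vec in class_vectors.items()}
    return min(distances, key=distances.get)     # class with the smallest distance

class_vectors = {
    "lymphoma": np.array([0.8, 0.1, 0.3]),
    "leukemia": np.array([0.2, 0.7, 0.5]),
    "normal":   np.array([0.1, 0.1, 0.1]),
}
print(classify(np.array([0.75, 0.15, 0.35]), class_vectors))   # -> "lymphoma"
```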
Mentorship and competencies for applied chronic disease epidemiology.
Lengerich, Eugene J; Siedlecki, Jennifer C; Brownson, Ross; Aldrich, Tim E; Hedberg, Katrina; Remington, Patrick; Siegel, Paul Z
2003-01-01
To understand the potential and establish a framework for mentoring as a method to develop professional competencies of state-level applied chronic disease epidemiologists, model mentorship programs were reviewed, specific competencies were identified, and competencies were then matched to essential public health services. Although few existing mentorship programs in public health were identified, common themes in other professional mentorship programs support the potential of mentoring as an effective means to develop capacity for applied chronic disease epidemiology. Proposed competencies for chronic disease epidemiologists in a mentorship program include planning, analysis, communication, basic public health, informatics and computer knowledge, and cultural diversity. Mentoring may constitute a viable strategy to build chronic disease epidemiology capacity, especially in public health agencies where resource and personnel system constraints limit opportunities to recruit and hire new staff.
Recognizing simple polyhedron from a perspective drawing
NASA Astrophysics Data System (ADS)
Zhang, Guimei; Chu, Jun; Miao, Jun
2009-10-01
Existing methods cannot be used to recognize simple polyhedra. In this paper, three problems are studied. First, a method for recognizing triangles and quadrilaterals is introduced based on geometry and angle constraints. Then an Attribute Relation Graph (ARG) is employed to describe simple polyhedra and line drawings. Last, a new method is presented to recognize a simple polyhedron from a line drawing. The method filters the candidate database before matching the line drawing against the models, so recognition efficiency is greatly improved. We introduce geometrical and topological characteristics to describe each node of the ARG, so the algorithm can not only recognize polyhedra with different shapes but also distinguish between polyhedra with the same shape but different sizes and proportions. Computer simulations preliminarily demonstrate the effectiveness of the method.
Stability of anisotropic self-gravitating fluids
NASA Astrophysics Data System (ADS)
Ahmad, S.; Jami, A. Rehman; Mughal, M. Z.
2018-06-01
The aim of this paper is to study the stability as well as the existence of self-gravitating anisotropic fluids in the Λ-dominated era. Taking a cylindrically symmetric and static spacetime, we computed the corresponding equations of motion in the background of anisotropic fluid distributions. A realistic formulation of the energy-momentum tensor as well as a theoretical model of the scale factors are considered in order to describe some physical properties of the anisotropic fluids. To assess the stability of the compact star, we used Herrera's technique, which is based on finding the radial and transverse components of the speed of sound. Moreover, the behaviors of other physical quantities, such as the anisotropy, the matching conditions between the interior and exterior metrics, and the compactness of the compact structures, are also discussed.
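For reference, Herrera's cracking criterion as usually stated in the literature (quoted here as background, not from this paper) reads, in units with c = 1 and with radial and transverse sound speeds v_r and v_t:

```latex
% Causality requires 0 <= v_r^2 <= 1 and 0 <= v_t^2 <= 1; a fluid region is
% potentially stable (free of cracking) when the difference of the squared
% sound speeds satisfies
\[
  -1 \;\le\; v_t^{2} - v_r^{2} \;\le\; 0 .
\]
```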
A distributed, dynamic, parallel computational model: the role of noise in velocity storage
Merfeld, Daniel M.
2012-01-01
Networks of neurons perform complex calculations using distributed, parallel computation, including dynamic “real-time” calculations required for motion control. The brain must combine sensory signals to estimate the motion of body parts using imperfect information from noisy neurons. Models and experiments suggest that the brain sometimes optimally minimizes the influence of noise, although it remains unclear when and precisely how neurons perform such optimal computations. To investigate, we created a model of velocity storage based on a relatively new technique–“particle filtering”–that is both distributed and parallel. It extends existing observer and Kalman filter models of vestibular processing by simulating the observer model many times in parallel with noise added. During simulation, the variance of the particles defining the estimator state is used to compute the particle filter gain. We applied our model to estimate one-dimensional angular velocity during yaw rotation, which yielded estimates for the velocity storage time constant, afferent noise, and perceptual noise that matched experimental data. We also found that the velocity storage time constant was Bayesian optimal by comparing the estimate of our particle filter with the estimate of the Kalman filter, which is optimal. The particle filter demonstrated a reduced velocity storage time constant when afferent noise increased, which mimics what is known about aminoglycoside ablation of semicircular canal hair cells. This model helps bridge the gap between parallel distributed neural computation and systems-level behavioral responses like the vestibuloocular response and perception. PMID:22514288
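A toy sketch of the particle-filtering idea described above, estimating a one-dimensional angular velocity from noisy measurements, is given below. The decay constant, noise levels, and measurement model are illustrative stand-ins, not the paper's vestibular observer model.

```python
# Toy particle filter: many noisy copies of a leaky-integrator velocity model
# are run in parallel; the estimate is the particle mean, loosely mirroring
# the parallel, distributed velocity-storage computation described above.
import numpy as np

rng = np.random.default_rng(5)
n_particles, n_steps, dt, tau = 2000, 300, 0.02, 15.0
true_velocity = 60.0 * np.exp(-np.arange(n_steps) * dt / 5.0)   # decaying head velocity (deg/s)
measurements = true_velocity + rng.normal(0.0, 5.0, n_steps)    # noisy afferent signal

particles = rng.normal(measurements[0], 10.0, n_particles)      # initialize around first measurement
estimates = np.zeros(n_steps)
for k in range(n_steps):
    # Propagate each particle through a leaky integrator plus process noise.
    particles += dt * (-particles / tau) + rng.normal(0.0, 0.5, n_particles)
    # Weight particles by how well they explain the current measurement.
    weights = np.exp(-0.5 * ((measurements[k] - particles) / 5.0) ** 2) + 1e-12
    weights /= weights.sum()
    # Resample in proportion to the weights.
    particles = rng.choice(particles, size=n_particles, p=weights)
    estimates[k] = particles.mean()

print(float(np.abs(estimates - true_velocity).mean()))          # mean tracking error (deg/s)
```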
Methodology for extracting local constants from petroleum cracking flows
Chang, Shen-Lin; Lottes, Steven A.; Zhou, Chenn Q.
2000-01-01
A methodology provides for the extraction of local chemical kinetic model constants for use in a reacting flow computational fluid dynamics (CFD) computer code with chemical kinetic computations to optimize the operating conditions or design of the system, including retrofit design improvements to existing systems. The coupled CFD and kinetics computer code is used in combination with data obtained from a matrix of experimental tests to extract the kinetic constants. Local fluid dynamic effects are implicitly included in the extracted local kinetic constants for each particular application system to which the methodology is applied. The extracted local kinetic model constants work well over a fairly broad range of operating conditions for specific and complex reaction sets in specific and complex reactor systems. While disclosed in terms of use in a Fluid Catalytic Cracking (FCC) riser, the inventive methodology has application in virtually any reaction set to extract constants for any particular application and reaction set formulation. The methodology includes the steps of: (1) selecting the test data sets for various conditions; (2) establishing the general trend of the parametric effect on the measured product yields; (3) calculating product yields for the selected test conditions using coupled computational fluid dynamics and chemical kinetics; (4) adjusting the local kinetic constants to match calculated product yields with experimental data; and (5) validating the determined set of local kinetic constants by comparing the calculated results with experimental data from additional test runs at different operating conditions.
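Step (4) above, adjusting local kinetic constants until calculated product yields match the measured ones, is essentially a least-squares fit. A generic sketch follows, with a toy first-order yield model standing in for the coupled CFD/kinetics calculation; the rate-constant names, residence times, and measured yields are all hypothetical.

```python
# Generic sketch of step (4): adjust kinetic constants so that model-predicted
# yields match measured yields, here with a toy yield model standing in for
# the coupled CFD/kinetics code.
import numpy as np
from scipy.optimize import least_squares

def predicted_yields(constants, residence_times):
    k1, k2 = constants                                    # hypothetical rate constants
    return 1.0 - np.exp(-k1 * residence_times) * np.exp(-k2 * residence_times**2)

residence_times = np.array([0.5, 1.0, 1.5, 2.0, 2.5])     # illustrative test conditions
measured = np.array([0.30, 0.52, 0.68, 0.80, 0.88])       # illustrative measured yields

def residuals(constants):
    return predicted_yields(constants, residence_times) - measured

fit = least_squares(residuals, x0=[0.5, 0.1])
print(fit.x)                                              # extracted local kinetic constants
```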
The planum temporale as a computational hub.
Griffiths, Timothy D; Warren, Jason D
2002-07-01
It is increasingly recognized that the human planum temporale is not a dedicated language processor, but is in fact engaged in the analysis of many types of complex sound. We propose a model of the human planum temporale as a computational engine for the segregation and matching of spectrotemporal patterns. The model is based on segregating the components of the acoustic world and matching these components with learned spectrotemporal representations. Spectrotemporal information derived from such a 'computational hub' would be gated to higher-order cortical areas for further processing, leading to object recognition and the perception of auditory space. We review the evidence for the model and specific predictions that follow from it.
Efficient and Scalable Cross-Matching of (Very) Large Catalogs
NASA Astrophysics Data System (ADS)
Pineau, F.-X.; Boch, T.; Derriere, S.
2011-07-01
Whether it be for building multi-wavelength datasets from independent surveys, studying changes in objects luminosities, or detecting moving objects (stellar proper motions, asteroids), cross-catalog matching is a technique widely used in astronomy. The need for efficient, reliable and scalable cross-catalog matching is becoming even more pressing with forthcoming projects which will produce huge catalogs in which astronomers will dig for rare objects, perform statistical analysis and classification, or real-time transients detection. We have developed a formalism and the corresponding technical framework to address the challenge of fast cross-catalog matching. Our formalism supports more than simple nearest-neighbor search, and handles elliptical positional errors. Scalability is improved by partitioning the sky using the HEALPix scheme, and processing independently each sky cell. The use of multi-threaded two-dimensional kd-trees adapted to managing equatorial coordinates enables efficient neighbor search. The whole process can run on a single computer, but could also use clusters of machines to cross-match future very large surveys such as GAIA or LSST in reasonable times. We already achieve performances where the 2MASS (˜470M sources) and SDSS DR7 (˜350M sources) can be matched on a single machine in less than 10 minutes. We aim at providing astronomers with a catalog cross-matching service, available on-line and leveraging on the catalogs present in the VizieR database. This service will allow users both to access pre-computed cross-matches across some very large catalogs, and to run customized cross-matching operations. It will also support VO protocols for synchronous or asynchronous queries.
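A compact sketch of the nearest-neighbour step at the heart of such a cross-match is shown below: equatorial coordinates are converted to unit vectors and a k-d tree is queried within an angular radius. HEALPix partitioning, elliptical positional errors, and distributed execution are omitted, and the catalogs are synthetic.

```python
# Positional cross-match of two catalogs: convert (RA, Dec) to unit vectors,
# build a k-d tree on one catalog, and find counterparts within a radius.
import numpy as np
from scipy.spatial import cKDTree

def radec_to_xyz(ra_deg, dec_deg):
    ra, dec = np.radians(ra_deg), np.radians(dec_deg)
    return np.column_stack((np.cos(dec) * np.cos(ra), np.cos(dec) * np.sin(ra), np.sin(dec)))

rng = np.random.default_rng(6)
ra_a, dec_a = rng.uniform(0, 360, 100000), rng.uniform(-30, 30, 100000)
ra_b = ra_a + rng.normal(0, 0.3 / 3600, 100000)            # same sources, ~0.3" scatter
dec_b = dec_a + rng.normal(0, 0.3 / 3600, 100000)

tree = cKDTree(radec_to_xyz(ra_a, dec_a))
radius = 2.0 * np.sin(np.radians(1.0 / 3600) / 2.0)        # 1 arcsec expressed as a chord length
dist, idx = tree.query(radec_to_xyz(ra_b, dec_b), k=1, distance_upper_bound=radius)
print(np.count_nonzero(np.isfinite(dist)), "matches within 1 arcsec")
```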
Fixed-interval matching-to-sample: intermatching time and intermatching error runs
Nelson, Thomas D.
1978-01-01
Four pigeons were trained on a matching-to-sample task in which reinforcers followed either the first matching response (fixed interval) or the fifth matching response (tandem fixed-interval fixed-ratio) that occurred 80 seconds or longer after the last reinforcement. Relative frequency distributions of the matching-to-sample responses that concluded intermatching times and runs of mismatches (intermatching error runs) were computed for the final matching responses directly followed by grain access and also for the three matching responses immediately preceding the final match. Comparison of these two distributions showed that the fixed-interval schedule arranged for the preferential reinforcement of matches concluding relatively extended intermatching times and runs of mismatches. Differences in matching accuracy and rate during the fixed interval, compared to the tandem fixed-interval fixed-ratio, suggested that reinforcers following matches concluding various intermatching times and runs of mismatches influenced the rate and accuracy of the last few matches before grain access, but did not control rate and accuracy throughout the entire fixed-interval period. PMID:16812032
Computing Maximum Cardinality Matchings in Parallel on Bipartite Graphs via Tree-Grafting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Azad, Ariful; Buluc, Aydn; Pothen, Alex
It is difficult to obtain high performance when computing matchings on parallel processors because matching algorithms explicitly or implicitly search for paths in the graph, and when these paths become long, there is little concurrency. In spite of this limitation, we present a new algorithm and its shared-memory parallelization that achieves good performance and scalability in computing maximum cardinality matchings in bipartite graphs. This algorithm searches for augmenting paths via specialized breadth-first searches (BFS) from multiple source vertices, hence creating more parallelism than single source algorithms. Algorithms that employ multiple-source searches cannot discard a search tree once no augmenting path is discovered from the tree, unlike algorithms that rely on single-source searches. We describe a novel tree-grafting method that eliminates most of the redundant edge traversals resulting from this property of multiple-source searches. We also employ the recent direction-optimizing BFS algorithm as a subroutine to discover augmenting paths faster. Our algorithm compares favorably with the current best algorithms in terms of the number of edges traversed, the average augmenting path length, and the number of iterations. Here, we provide a proof of correctness for our algorithm. Our NUMA-aware implementation is scalable to 80 threads of an Intel multiprocessor and to 240 threads on an Intel Knights Corner coprocessor. On average, our parallel algorithm runs an order of magnitude faster than the fastest algorithms available. The performance improvement is more significant on graphs with small matching number.
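For orientation, the serial baseline that such parallel algorithms accelerate is the classic augmenting-path search; a compact single-source sketch (Kuhn-style DFS) is given below. The multi-source BFS and tree-grafting machinery of the paper are not shown, and the small graph is illustrative only.

```python
# Serial baseline: maximum cardinality bipartite matching by repeated
# augmenting-path search (the parallel tree-grafting algorithm accelerates
# this same computation with multi-source BFS).
def max_bipartite_matching(adj, n_left, n_right):
    match_l = [-1] * n_left       # match_l[u] = right vertex matched to u, or -1
    match_r = [-1] * n_right      # match_r[v] = left vertex matched to v, or -1

    def augment(u, visited):
        for v in adj.get(u, []):
            if not visited[v]:
                visited[v] = True
                # v is free, or its partner can be re-matched elsewhere.
                if match_r[v] == -1 or augment(match_r[v], visited):
                    match_l[u], match_r[v] = v, u
                    return True
        return False

    return sum(augment(u, [False] * n_right) for u in range(n_left))

# Left vertices 0..3 with their right-side neighbours.
adj = {0: [0, 1], 1: [0], 2: [1, 2], 3: [2]}
print(max_bipartite_matching(adj, 4, 3))   # 3 (e.g. 1-0, 0-1, 2-2)
```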
Automated generation and ensemble-learned matching of X-ray absorption spectra
NASA Astrophysics Data System (ADS)
Zheng, Chen; Mathew, Kiran; Chen, Chi; Chen, Yiming; Tang, Hanmei; Dozier, Alan; Kas, Joshua J.; Vila, Fernando D.; Rehr, John J.; Piper, Louis F. J.; Persson, Kristin A.; Ong, Shyue Ping
2018-12-01
X-ray absorption spectroscopy (XAS) is a widely used materials characterization technique to determine oxidation states, coordination environment, and other local atomic structure information. Analysis of XAS relies on comparison of measured spectra to reliable reference spectra. However, existing databases of XAS spectra are highly limited both in terms of the number of reference spectra available as well as the breadth of chemistry coverage. In this work, we report the development of XASdb, a large database of computed reference XAS, and an Ensemble-Learned Spectra IdEntification (ELSIE) algorithm for the matching of spectra. XASdb currently hosts more than 800,000 K-edge X-ray absorption near-edge spectra (XANES) for over 40,000 materials from the open-science Materials Project database. We discuss a high-throughput automation framework for FEFF calculations, built on robust, rigorously benchmarked parameters. FEFF is a computer program that uses a real-space Green's function approach to calculate X-ray absorption spectra. We demonstrate that the ELSIE algorithm, which combines 33 weak "learners", each comprising a set of preprocessing steps and a similarity metric, can achieve up to 84.2% accuracy in identifying the correct oxidation state and coordination environment of a test set of 19 K-edge XANES spectra encompassing a diverse range of chemistries and crystal structures. The XASdb with the ELSIE algorithm has been integrated into a web application in the Materials Project, providing an important new public resource for the analysis of XAS to all materials researchers. Finally, the ELSIE algorithm itself has been made available as part of veidt, an open source machine-learning library for materials science.
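A toy sketch of the ensemble idea behind ELSIE follows: several weak "learners", each a preprocessing step paired with a similarity metric, vote on the best-matching reference spectrum. The 33 actual learners and their preprocessing choices are not reproduced; the preprocessors, metrics, and synthetic spectra below are assumptions for illustration.

```python
# Toy ensemble spectrum matcher: each weak learner = (preprocessing, similarity);
# the reference spectrum with the most votes across learners wins.
import numpy as np

def cosine(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def neg_rmse(a, b):
    return -float(np.sqrt(np.mean((a - b) ** 2)))

preprocessors = [
    lambda s: s,                                       # raw spectrum
    lambda s: (s - s.mean()) / (s.std() + 1e-12),      # standardized
    lambda s: np.gradient(s),                          # first derivative
]
metrics = [cosine, neg_rmse]

def best_match(query, references):
    votes = np.zeros(len(references))
    for prep in preprocessors:
        for metric in metrics:                         # each (prep, metric) pair is a weak learner
            scores = [metric(prep(query), prep(ref)) for ref in references]
            votes[int(np.argmax(scores))] += 1
    return int(np.argmax(votes))

rng = np.random.default_rng(7)
references = [rng.random(200) for _ in range(50)]
query = references[17] + 0.05 * rng.normal(size=200)   # noisy copy of reference 17
print(best_match(query, references))                   # -> 17
```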
Chronic Hypoxia Accentuates Dysanaptic Lung Growth.
Llapur, Conrado J; Martínez, Myriam R; Grassino, Pedro T; Stok, Ana; Altieri, Héctor H; Bonilla, Federico; Caram, María M; Krowchuk, Natasha M; Kirby, Miranda; Coxson, Harvey O; Tepper, Robert S
2016-08-01
Adults born and raised at high altitudes have larger lung volumes and greater pulmonary diffusion capacity compared with adults at low altitude; however, it remains unclear whether the air and tissue volumes have comparable increases and whether there is a difference in airway size. The objective was to assess the effect of chronic hypoxia on lung growth using in vivo high-resolution computed tomography measurements. Healthy adults born and raised at moderate altitude (2,000 m above sea level; n = 19) and at low altitude (400 m above sea level; n = 23) underwent high-resolution computed tomography. Differences in total lung, air, and tissue volume, mean lung density, as well as airway lumen and wall areas in anatomically matched airways were compared between groups. No significant differences for age, sex, weight, or height were found between the two groups (P > 0.05). In a multivariate regression model, altitude was a significant contributor for total lung volume (P = 0.02), air volume (P = 0.03), and tissue volume (P = 0.03), whereby the volumes were greater for the moderate- versus the low-altitude group. However, altitude was not a significant contributor for mean lung density (P = 0.35) or lumen and wall areas in anatomically matched segmental, subsegmental, and subsubsegmental airways. Our findings suggest that the adult lung did not increase lung volume later in life by expansion of an existing number of alveoli, but rather through increased alveolarization early in life. In addition, chronic hypoxia accentuates dysanaptic lung growth by increasing the lung parenchyma but not the airways.
The probability of lava inundation at the proposed and existing Kulani prison sites
Kauahikaua, J.P.; Trusdell, F.A.; Heliker, C.C.
1998-01-01
The State of Hawai`i has proposed building a 2,300-bed medium-security prison about 10 km downslope from the existing Kulani medium-security correctional facility. The proposed and existing facilities lie on the northeast rift zone of Mauna Loa, which last erupted in 1984 in this same general area. We use the best available geologic mapping and dating with GIS software to estimate the average recurrence interval between lava flows that inundate these sites. Three different methods are used to adjust the number of flows exposed at the surface for those flows that are buried, to allow a better representation of the recurrence interval. Probabilities are then computed, based on these recurrence intervals, assuming that the data match a Poisson distribution. The probability of lava inundation for the existing prison site is estimated to be 11-12% in the next 50 years. The probabilities of lava inundation for proposed sites B and C are 2-3% and 1-2%, respectively, in the same period. The probabilities are based on estimated recurrence intervals for lava flows, which are approximately proportional to the area considered. The probability of having to evacuate the prison is certainly higher than the probability of lava entering the site. Maximum warning times between eruption and lava inundation of a site are estimated to be 24 hours for the existing prison site and 72 hours for proposed sites B and C. Evacuation plans should take these times into consideration.
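Under the Poisson assumption stated above, the probability of at least one inundating flow in a horizon of t years, given a mean recurrence interval T, is 1 - exp(-t/T). The brief check below uses an illustrative recurrence interval chosen to reproduce the quoted 11-12% figure; it is not a value taken from the report.

```python
import math

def inundation_probability(recurrence_interval_yr, horizon_yr=50):
    """P(at least one inundating lava flow within horizon_yr) under a
    Poisson model with the given mean recurrence interval."""
    return 1.0 - math.exp(-horizon_yr / recurrence_interval_yr)

# A ~420-year recurrence interval gives roughly 11% over 50 years,
# consistent with the figure quoted for the existing prison site.
print(f"{inundation_probability(420):.1%}")   # ~11.2%
```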
A novel methodology for querying web images
NASA Astrophysics Data System (ADS)
Prabhakara, Rashmi; Lee, Ching Cheng
2005-01-01
Ever since the advent of the Internet, there has been an immense growth in the amount of image data available on the World Wide Web. With such a magnitude of image availability, an efficient and effective image retrieval system is required to make use of this information. This research presents an effective image matching and indexing technique that improves on existing integrated image retrieval methods. The proposed technique follows a two-phase approach, integrating query-by-topic and query-by-example specification methods. The first phase consists of topic-based image retrieval using an improved text information retrieval (IR) technique that makes use of the structured format of HTML documents. It uses a focused crawler that lets the user enter not only the keyword for the topic-based search but also the scope in which the user wants to find the images. The second phase uses the query-by-example specification to perform a low-level content-based image match and retrieve a smaller set of results more closely related to the example image. Information related to the image features is automatically extracted from the query image by the image processing system. A computationally inexpensive technique based on color features is used to perform content-based matching of images. The main goal is to develop a functional image search and indexing system and to demonstrate that better retrieval results can be achieved with this proposed hybrid search technique.
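The color feature used in the second phase is described only at a high level; a minimal sketch of an inexpensive color-based comparison of that general kind (a coarse, normalized RGB histogram scored by histogram intersection) is shown below. The function names, bin count, and use of Pillow are our own illustrative choices, not the paper's implementation.

```python
import numpy as np
from PIL import Image

def color_histogram(path, bins=8):
    """Coarse RGB histogram of an image, normalized to sum to 1."""
    rgb = np.asarray(Image.open(path).convert("RGB")).reshape(-1, 3)
    hist, _ = np.histogramdd(rgb, bins=(bins, bins, bins), range=[(0, 256)] * 3)
    return hist.ravel() / hist.sum()

def histogram_intersection(h1, h2):
    """Similarity in [0, 1]; 1 means identical color distributions."""
    return float(np.minimum(h1, h2).sum())

# Rank the candidates returned by the topic-based phase against the
# query-by-example image (file paths are hypothetical):
# query = color_histogram("query.jpg")
# ranked = sorted(candidates,
#                 key=lambda p: -histogram_intersection(query, color_histogram(p)))
```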
Is an eclipse described in the Odyssey?
Baikouzis, Constantino; Magnasco, Marcelo O.
2008-01-01
Plutarch and Heraclitus believed a certain passage in the 20th book of the Odyssey (“Theoclymenus's prophecy”) to be a poetic description of a total solar eclipse. In the late 1920s, Schoch and Neugebauer computed that the solar eclipse of 16 April 1178 B.C.E. was total over the Ionian Islands and was the only suitable eclipse in more than a century to agree with classical estimates of the decade-earlier sack of Troy around 1192–1184 B.C.E. However, much skepticism remains about whether the verses refer to this, or any, eclipse. To contribute to the issue independently of the disputed eclipse reference, we analyze other astronomical references in the Epic, without assuming the existence of an eclipse, and search for dates matching the astronomical phenomena we believe they describe. We use three overt astronomical references in the epic: to Boötes and the Pleiades, Venus, and the New Moon; we supplement them with a conjectural identification of Hermes's trip to Ogygia as relating to the motion of planet Mercury. Performing an exhaustive search of all possible dates in the span 1250–1115 B.C., we looked to match these phenomena in the order and manner that the text describes. In that period, a single date closely matches our references: 16 April 1178 B.C.E. We speculate that these references, plus the disputed eclipse reference, may refer to that specific eclipse. PMID:18577587
Using Computer Technology To Aid the Disabled Reader.
ERIC Educational Resources Information Center
Balajthy, Ernest
When matched for achievement level and educational objectives, computer technology can be particularly effective with at-risk students. Computer-assisted instructional software is the most widely available type of software. An exciting development pertinent to literacy education is the development of the "electronic book" (also called…
Impact of topographic mask models on scanner matching solutions
NASA Astrophysics Data System (ADS)
Tyminski, Jacek K.; Pomplun, Jan; Renwick, Stephen P.
2014-03-01
Of keen interest to the IC industry are advanced computational lithography applications such as Optical Proximity Correction of IC layouts (OPC), scanner matching by optical proximity effect matching (OPEM), and Source Optimization (SO) and Source-Mask Optimization (SMO) used as advanced reticle enhancement techniques. The success of these tasks is strongly dependent on the integrity of the lithographic simulators used in computational lithography (CL) optimizers. Lithographic mask models used by these simulators are key drivers of the accuracy of the image predictions and, as a consequence, determine the validity of these CL solutions. Much of the CL work involves Kirchhoff mask models, a.k.a. the thin-mask approximation, simplifying the treatment of the mask near-field images. On the other hand, imaging models for hyper-NA scanners require that the interactions of the illumination fields with the mask topography be rigorously accounted for by numerically solving Maxwell's Equations. The simulators used to predict image formation in hyper-NA scanners must rigorously treat the mask topography and its interaction with the scanner illuminators. Such imaging models come at a high computational cost and pose challenging accuracy vs. compute time tradeoffs. An additional complication comes from the fact that the performance metrics used in computational lithography tasks show highly non-linear responses to the optimization parameters. Finally, the number of patterns used for tasks such as OPC, OPEM, SO, or SMO ranges from tens to hundreds. These requirements determine the complexity and the workload of the lithography optimization tasks. The tools to build rigorous imaging optimizers based on the first principles governing imaging in scanners are available, but the quantifiable benefits they might provide are not very well understood. To quantify the performance of OPE matching solutions, we have compared the results of various imaging optimization trials obtained with Kirchhoff mask models to those obtained with rigorous models involving solutions of Maxwell's Equations. In both sets of trials, we used large numbers of patterns, with specifications representative of CL tasks commonly encountered in hyper-NA imaging. In this report we present OPEM solutions based on various mask models and discuss the models' impact on hyper-NA scanner matching accuracy. We draw conclusions on the accuracy of results obtained with thin-mask models vs. the topographic OPEM solutions. We present various examples of scanner image matching for patterns representative of the current generation of IC designs.
1981-09-30
to perform a variety of local arithmetic operations. Our initial task will be to use it for computing 5X5 convolutions common to many low level...report presents the results of applying our relaxation based scene matching system [1] to a new domain - automatic matching of pairs of images. The task...objects (corners of buildings) within the large image. But we did demonstrate the ability of our system to automatically segment, describe, and match
Evaluation of Deep Learning Based Stereo Matching Methods: from Ground to Aerial Images
NASA Astrophysics Data System (ADS)
Liu, J.; Ji, S.; Zhang, C.; Qin, Z.
2018-05-01
Dense stereo matching has been extensively studied in photogrammetry and computer vision. In this paper we evaluate the application of deep learning based stereo methods, which emerged around 2016 and spread rapidly, to aerial stereo pairs rather than the ground images commonly used in the computer vision community. Two popular methods are evaluated. One learns the matching cost with a convolutional neural network (known as MC-CNN); the other produces a disparity map in an end-to-end manner by utilizing both geometry and context (known as GC-Net). First, we evaluate the performance of the deep learning based methods on aerial stereo images by direct model reuse. The models pre-trained on the KITTI 2012, KITTI 2015, and Driving datasets separately are directly applied to three aerial datasets. We also give the results of direct training on the target aerial datasets. Second, the deep learning based methods are compared to the classic stereo matching method, Semi-Global Matching (SGM), and a photogrammetric software package, SURE, on the same aerial datasets. Third, a transfer learning strategy is introduced to aerial image matching based on the assumption that a few target samples are available for model fine-tuning. The experiments showed that the conventional methods and the deep learning based methods performed similarly, and that the latter have greater potential to be explored.
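Neither MC-CNN nor GC-Net is reproduced here; as a baseline for what "dense stereo matching" means in this evaluation, the classical local matcher that such learned costs improve upon can be sketched in a few lines: a fixed-window sum-of-absolute-differences cost evaluated at every candidate disparity, with a winner-take-all decision and no SGM-style path aggregation. Names and parameters are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def block_matching_disparity(left, right, max_disp=64, window=5):
    """Naive winner-take-all stereo matching on rectified grayscale
    images: per-pixel SAD cost over a square window for each candidate
    disparity, lowest cost wins."""
    h, w = left.shape
    cost = np.full((max_disp, h, w), np.inf)
    for d in range(max_disp):
        diff = np.abs(left[:, d:] - right[:, : w - d])       # shift right image by d pixels
        cost[d, :, d:] = uniform_filter(diff, size=window)   # window-averaged SAD
    return cost.argmin(axis=0)                               # integer disparity map
```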
Efficient Approximation Algorithms for Weighted $b$-Matching
DOE Office of Scientific and Technical Information (OSTI.GOV)
Khan, Arif; Pothen, Alex; Mostofa Ali Patwary, Md.
2016-01-01
We describe a half-approximation algorithm, b-Suitor, for computing a b-Matching of maximum weight in a graph with weights on the edges. b-Matching is a generalization of the well-known Matching problem in graphs, where the objective is to choose a subset M of edges in the graph such that at most a specified number b(v) of edges in M are incident on each vertex v. Subject to this restriction we maximize the sum of the weights of the edges in M. We prove that the b-Suitor algorithm computes the same b-Matching as the one obtained by the greedy algorithm for the problem. We implement the algorithm on serial and shared-memory parallel processors, and compare its performance against a collection of approximation algorithms that have been proposed for the Matching problem. Our results show that the b-Suitor algorithm outperforms the Greedy and Locally Dominant edge algorithms by one to two orders of magnitude on a serial processor. The b-Suitor algorithm has a high degree of concurrency, and it scales well up to 240 threads on a shared memory multiprocessor. The b-Suitor algorithm outperforms the Locally Dominant edge algorithm by a factor of fourteen on 16 cores of an Intel Xeon multiprocessor.
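b-Suitor itself is a proposal-based, highly concurrent algorithm; the serial sketch below implements only the simpler greedy 1/2-approximation that, per the result above, produces the same b-Matching: visit edges in order of decreasing weight and keep an edge whenever both endpoints still have residual capacity b(v).

```python
def greedy_b_matching(edges, b):
    """1/2-approximate maximum-weight b-matching by greedy edge selection.

    edges: iterable of (weight, u, v) tuples.
    b: dict mapping each vertex to its capacity b(v).
    Returns the chosen edges and their total weight.
    """
    remaining = dict(b)                       # residual capacity per vertex
    chosen, total = [], 0.0
    for weight, u, v in sorted(edges, key=lambda e: e[0], reverse=True):
        if remaining[u] > 0 and remaining[v] > 0:
            chosen.append((u, v, weight))
            total += weight
            remaining[u] -= 1
            remaining[v] -= 1
    return chosen, total

# Toy example with b(v) = 2 for every vertex.
edges = [(9, "a", "b"), (8, "a", "c"), (7, "b", "c"), (1, "c", "d")]
print(greedy_b_matching(edges, {v: 2 for v in "abcd"}))
```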
An efficient photogrammetric stereo matching method for high-resolution images
NASA Astrophysics Data System (ADS)
Li, Yingsong; Zheng, Shunyi; Wang, Xiaonan; Ma, Hao
2016-12-01
Stereo matching of high-resolution images is a great challenge in photogrammetry. The main difficulty is the enormous processing workload, which involves substantial computing time and memory consumption. In recent years, the semi-global matching (SGM) method has been a promising approach for solving stereo problems on different data sets. However, the time complexity and memory demand of SGM are proportional to the scale of the images involved, which leads to very high consumption when dealing with large images. To address this, this paper presents an efficient hierarchical matching strategy based on the SGM algorithm using single instruction multiple data instructions and structured parallelism in the central processing unit. The proposed method can significantly reduce the computational time and memory required for large-scale stereo matching. The three-dimensional (3D) surface is reconstructed by triangulating and fusing redundant reconstruction information from multi-view matching results. Finally, three high-resolution aerial data sets are used to evaluate our improvement. Furthermore, precise airborne laser scanner data of one data set is used to measure the accuracy of our reconstruction. Experimental results demonstrate that our method achieves remarkable time and memory savings while maintaining the density and precision of the derived 3D point cloud.
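The hierarchical, SIMD-parallel implementation is not reproduced here, but the core SGM recurrence it accelerates — aggregating a per-pixel, per-disparity cost volume along a scan direction with smoothness penalties P1 and P2 — can be sketched for a single left-to-right pass. This is a didactic sketch with arbitrary penalty values, not the paper's code.

```python
import numpy as np

def sgm_aggregate_left_to_right(cost, P1=10, P2=120):
    """One SGM aggregation pass along image rows (left to right).

    cost: array of shape (H, W, D) holding the raw matching cost C(p, d).
    Returns the aggregated path cost L(p, d); a full SGM sums such passes
    over several directions before the winner-take-all step.
    """
    H, W, D = cost.shape
    L = np.empty((H, W, D), dtype=np.float64)
    L[:, 0, :] = cost[:, 0, :]
    for x in range(1, W):
        prev = L[:, x - 1, :]                               # L(p - r, .)
        prev_min = prev.min(axis=1, keepdims=True)          # min_k L(p - r, k)
        minus = np.pad(prev[:, :-1], ((0, 0), (1, 0)), constant_values=np.inf) + P1
        plus = np.pad(prev[:, 1:], ((0, 0), (0, 1)), constant_values=np.inf) + P1
        far = prev_min + P2                                 # disparity jump larger than 1
        L[:, x, :] = (cost[:, x, :]
                      + np.minimum(np.minimum(prev, minus), np.minimum(plus, far))
                      - prev_min)
    return L
```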
The Prediction of Drug-Disease Correlation Based on Gene Expression Data.
Cui, Hui; Zhang, Menghuan; Yang, Qingmin; Li, Xiangyi; Liebman, Michael; Yu, Ying; Xie, Lu
2018-01-01
The explosive growth of high-throughput experimental methods and resulting data yields both opportunity and challenge for selecting the correct drug to treat both a specific patient and their individual disease. Ideally, it would be useful and efficient if computational approaches could be applied to help achieve optimal drug-patient-disease matching, but current efforts have met with limited success. Current approaches have primarily utilized the measurable effect of a specific drug on target tissue or cell lines to identify the potential biological effect of such treatment. While these efforts have met with some level of success, there exists much opportunity for improvement. This follows from the observation that, for many diseases, actual patient response increasingly calls for treatment with combinations of drugs rather than single-drug therapies. Only a few previous studies have yielded computational approaches for predicting the synergy of drug combinations by analyzing high-throughput molecular datasets. However, these computational approaches focused on the characteristics of the drug itself, without fully accounting for disease factors. Here, we propose an algorithm to specifically predict synergistic effects of drug combinations on various diseases, by integrating the data characteristics of disease-related gene expression profiles with drug-treated gene expression profiles. We have demonstrated its utility through application to transcriptome data, including microarray and RNASeq data, and the drug-disease prediction results were validated using existing publications and drug databases. It is also applicable to other quantitative profiling data such as proteomics data. We also provide an interactive web interface to allow our Prediction of Drug-Disease method to be readily applied to user data. While our studies represent a preliminary exploration of this critical problem, we believe that the algorithm can provide the basis for further refinement towards addressing a large clinical need.
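As a minimal illustration of the general signature-matching idea that such methods build on (not the authors' actual integration of disease and drug-treated profiles), a drug can be scored against a disease by how strongly its differential-expression signature anticorrelates with the disease signature over shared genes:

```python
import numpy as np

def reversal_score(disease_sig, drug_sig):
    """Rank-based score of how well a drug's expression changes reverse a
    disease signature; +1 indicates perfect reversal, -1 perfect mimicry.

    Both arguments are dicts {gene: log2 fold change}.  Ties are not
    handled specially; this is a generic illustration only.
    """
    genes = sorted(set(disease_sig) & set(drug_sig))
    d = np.array([disease_sig[g] for g in genes], dtype=float)
    t = np.array([drug_sig[g] for g in genes], dtype=float)
    rd, rt = d.argsort().argsort(), t.argsort().argsort()   # rank transform
    return float(-np.corrcoef(rd, rt)[0, 1])                # negate the rank correlation
```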
An adaptive clustering algorithm for image matching based on corner feature
NASA Astrophysics Data System (ADS)
Wang, Zhe; Dong, Min; Mu, Xiaomin; Wang, Song
2018-04-01
Traditional image matching algorithms often cannot balance real-time performance and accuracy well; to solve this problem, an adaptive clustering algorithm for image matching based on corner features is proposed in this paper. The method is based on the similarity of the vectors formed by matching point pairs, and adaptive clustering is performed on those pairs. Harris corner detection is carried out first, the feature points of the reference image and the perceived image are extracted, and the feature points of the two images are initially matched using the Normalized Cross Correlation (NCC) function. Then, using the improved algorithm proposed in this paper, the matching results are clustered to reduce ineffective operations and improve the matching speed and robustness. Finally, the Random Sample Consensus (RANSAC) algorithm is applied to the matching points after clustering. The experimental results show that the proposed algorithm can effectively eliminate most wrong matching points while the correct matching points are retained, improving the accuracy of RANSAC matching and reducing the computational load of the whole matching process at the same time.
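The final RANSAC step mentioned above is standard; a minimal sketch of RANSAC fitting of a 2-D affine transform to putative corner correspondences (a generic version, not the paper's code) looks like this:

```python
import numpy as np

def ransac_affine(src, dst, n_iter=500, inlier_tol=3.0, seed=None):
    """Fit dst ~ [src, 1] @ M with RANSAC.

    src, dst: (N, 2) arrays of putatively matched corner coordinates.
    Returns the (3, 2) affine matrix (applied to homogeneous points)
    and a boolean inlier mask.
    """
    rng = np.random.default_rng(seed)
    n = len(src)
    X = np.hstack([src, np.ones((n, 1))])                # homogeneous source points
    best_inliers = np.zeros(n, dtype=bool)
    for _ in range(n_iter):
        idx = rng.choice(n, size=3, replace=False)        # minimal sample
        M, *_ = np.linalg.lstsq(X[idx], dst[idx], rcond=None)
        residual = np.linalg.norm(X @ M - dst, axis=1)
        inliers = residual < inlier_tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    if best_inliers.sum() >= 3:                           # refit on all inliers
        M, *_ = np.linalg.lstsq(X[best_inliers], dst[best_inliers], rcond=None)
    return M, best_inliers
```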
Najafi-Yazdi, A.; Mongeau, L.
2012-01-01
The Lattice Boltzmann Method (LBM) is a well established computational tool for fluid flow simulations. This method has been recently utilized for low Mach number computational aeroacoustics. Robust and nonreflective boundary conditions, similar to those used in Navier-Stokes solvers, are needed for LBM-based aeroacoustics simulations. The goal of the present study was to develop an absorbing boundary condition based on the perfectly matched layer (PML) concept for LBM. The derivation of formulations for both two- and three-dimensional problems is presented. The macroscopic behavior of the new formulation is discussed. The new formulation was tested using benchmark acoustic problems. The perfectly matched layer concept appears to be very well suited for LBM, and yielded very low acoustic reflection factors. PMID:23526050
Rigid shape matching by segmentation averaging.
Wang, Hongzhi; Oliensis, John
2010-04-01
We use segmentations to match images by shape. The new matching technique does not require point-to-point edge correspondence and is robust to small shape variations and spatial shifts. To address the unreliability of segmentations computed bottom-up, we give a closed form approximation to an average over all segmentations. Our method has many extensions, yielding new algorithms for tracking, object detection, segmentation, and edge-preserving smoothing. For segmentation, instead of a maximum a posteriori approach, we compute the "central" segmentation minimizing the average distance to all segmentations of an image. For smoothing, instead of smoothing images based on local structures, we smooth based on the global optimal image structures. Our methods for segmentation, smoothing, and object detection perform competitively, and we also show promising results in shape-based tracking.
Research on rolling element bearing fault diagnosis based on genetic algorithm matching pursuit
NASA Astrophysics Data System (ADS)
Rong, R. W.; Ming, T. F.
2017-12-01
In order to solve the problem of slow computation speed, the matching pursuit algorithm is applied to rolling bearing fault diagnosis, with improvements in two aspects: the construction of the dictionary and the way atoms are searched. Specifically, the Gabor function, which captures time-frequency localization characteristics well, is used to construct the dictionary, and a genetic algorithm is used to improve the search speed. A time-frequency analysis method based on a genetic algorithm matching pursuit (GAMP) algorithm is proposed. The setting of the algorithm's parameters to improve the decomposition results is also studied. Simulation and experimental results illustrate that the weak fault features of rolling bearings can be extracted effectively by the proposed method, while the computation speed increases appreciably.
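The genetic-algorithm atom search is not reproduced here; the plain matching-pursuit loop over a Gabor dictionary that the GA is meant to accelerate can be sketched as follows. The dictionary parameters are arbitrary illustrative values.

```python
import numpy as np

def gabor_atom(n, center, width, freq):
    """Discrete Gabor atom: Gaussian envelope times a cosine, unit norm."""
    t = np.arange(n)
    g = np.exp(-0.5 * ((t - center) / width) ** 2) * np.cos(2 * np.pi * freq * t)
    return g / np.linalg.norm(g)

def matching_pursuit(signal, dictionary, n_atoms=10):
    """Greedy matching pursuit: repeatedly pick the atom with the largest
    inner product with the residual and subtract its contribution."""
    residual = np.asarray(signal, dtype=float).copy()
    decomposition = []
    for _ in range(n_atoms):
        corr = dictionary @ residual                  # inner products with every atom
        k = int(np.argmax(np.abs(corr)))
        decomposition.append((k, float(corr[k])))
        residual -= corr[k] * dictionary[k]
    return decomposition, residual

# Small exhaustive dictionary; scanning it atom-by-atom is exactly the
# cost a genetic search over (center, width, freq) is meant to avoid.
n = 512
params = [(c, w, f) for c in range(0, n, 64) for w in (8, 16, 32) for f in (0.05, 0.1, 0.2)]
D = np.array([gabor_atom(n, c, w, f) for c, w, f in params])
```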
Three-dimensional object surface identification
NASA Astrophysics Data System (ADS)
Celenk, Mehmet
1995-03-01
This paper describes a computationally efficient matching method for inspecting 3D objects using their serial cross sections. Object regions of interest in cross-sectional binary images of successive slices are aligned with those of the models. Cross-sectional differences between the object and the models are measured in the direction of the gradient of the cross-section boundary. This is repeated in all the cross-sectional images. The model with the minimum average cross-sectional difference is selected as the best match to the given object (i.e., no defect). The method is tested using various computer generated surfaces and matching results are presented. It is also demonstrated, using a Symult S-2010 16-node system, that the method is suitable for parallel implementation on message-passing processors, with a maximum attainable speedup close to 16 for the S-2010.
Bovino, S; Grassi, T; Gianturco, F A
2015-12-17
A detailed analysis of an ionic reaction that plays a crucial role in the carbon chemistry of the interstellar medium (ISM) is carried out by computing ab initio reactive cross sections with a quantum method and by further obtaining the corresponding CH(+) destruction rates over a range of temperatures that shows good overall agreement with existing experiments. The differences found between all existing calculations and the very-low-T experiments are discussed and explored via a simple numerical model that links these cross section reductions to collinear approaches where nonadiabatic crossing is expected to dominate. The new rates are further linked to a complex chemical network that models the evolution of the CH(+) abundance in the photodissociation region (PDR) and molecular cloud (MC) environments of the ISM. The abundances of CH(+) are given by numerical solutions of a large set of coupled, first-order kinetics equations that employs our new chemical package krome. The analysis that we carry out reveals that the important region for CH(+) destruction is that above 100 K, hence showing that, at least for this reaction, the differences with the existing laboratory low-T experiments are of essentially no importance within the astrochemical environments discussed here because, at those temperatures, other chemical processes involving the title molecule are taking over. A detailed analysis of the chemical network involving CH(+) also shows that a slight decrease in the initial oxygen abundance might lead to higher CH(+) abundances because the main chemical carbon ion destruction channel is reduced in efficiency. This might provide an alternative chemical route to understand the reason why general astrochemical models fail when the observed CH(+) abundances are matched with the outcomes of their calculations.
NASA Technical Reports Server (NTRS)
Scott, Robert C.; Pototzky, Anthony S.; Perry, Boyd, III
1991-01-01
Two matched filter theory based schemes are described and illustrated for obtaining maximized and time correlated gust loads for a nonlinear aircraft. The first scheme is computationally fast because it uses a simple 1-D search procedure to obtain its answers. The second scheme is computationally slow because it uses a more complex multi-dimensional search procedure to obtain its answers, but it consistently provides slightly higher maximum loads than the first scheme. Both schemes are illustrated with numerical examples involving a nonlinear control system.
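The matched-filter idea underlying both schemes can be illustrated, in a generic discrete-time setting rather than the authors' aeroelastic formulation, by correlating a known (normalized) waveform against a longer signal and locating the peak response:

```python
import numpy as np

def matched_filter_peak(template, signal):
    """Correlate a signal with a known template (the matched filter is the
    time-reversed, energy-normalized template) and return the lag and
    value of the peak response."""
    h = template[::-1] / np.linalg.norm(template)
    response = np.convolve(signal, h, mode="valid")
    lag = int(np.argmax(np.abs(response)))
    return lag, float(response[lag])

# Usage: bury a pulse in noise and recover its position.
rng = np.random.default_rng(0)
pulse = np.hanning(32)
signal = rng.normal(0, 0.2, 1024)
signal[300:332] += pulse
print(matched_filter_peak(pulse, signal))   # lag close to 300
```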
NASA Technical Reports Server (NTRS)
Scott, Robert C.; Perry, Boyd, III; Pototzky, Anthony S.
1991-01-01
This paper describes and illustrates two matched-filter-theory based schemes for obtaining maximized and time-correlated gust-loads for a nonlinear airplane. The first scheme is computationally fast because it uses a simple one-dimensional search procedure to obtain its answers. The second scheme is computationally slow because it uses a more complex multidimensional search procedure to obtain its answers, but it consistently provides slightly higher maximum loads than the first scheme. Both schemes are illustrated with numerical examples involving a nonlinear control system.
Concentrating on beauty: sexual selection and sociospatial memory.
Becker, D Vaughn; Kenrick, Douglas T; Guerin, Stephen; Maner, Jon K
2005-12-01
In three experiments, location memory for faces was examined using a computer version of the matching game Concentration. Findings suggested that physical attractiveness led to more efficient matching for female faces but not for male faces. Study 3 revealed this interaction despite allowing participants to initially see, attend to, and match the attractive male faces in the first few turns. Analysis of matching errors suggested that, compared to other targets, attractive women were less confusable with one another. Results are discussed in terms of the different functions that attractiveness serves for men and women.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelletier, C; Jung, J; Lee, C
2015-06-15
Purpose: To quantify the dosimetric uncertainty due to organ position errors when using height and weight as phantom selection criteria in the UF/NCI Hybrid Phantom Library for the purpose of out-of-field organ dose reconstruction. Methods: Four diagnostic patient CT images were used to create 7-field IMRT plans. For each patient, dose to the liver, right lung, and left lung were calculated using the XVMC Monte Carlo code. These doses were taken to be the ground truth. For each patient, the phantom with the most closely matching height and weight was selected from the body size dependent phantom library. The patient plans were then transferred to the computational phantoms and organ doses were recalculated. Each plan was also run on 4 additional phantoms with reference heights and/or weights. Maximum and mean doses for the three organs were computed, and the DVHs were extracted and compared. One sample t-tests were performed to compare the accuracy of the height and weight matched phantoms against the additional phantoms with regard to both maximum and mean dose. Results: For one of the patients, the height and weight matched phantom yielded the most accurate results across all three organs for both maximum and mean doses. For two additional patients, the matched phantom yielded the best match for one organ only. In 13 of the 24 cases, the matched phantom yielded better results than the average of the other four phantoms, though the results were only statistically significant at the .05 level for three cases. Conclusion: Using height and weight matched phantoms does yield better results with regard to out-of-field dosimetry than using average phantoms. Height and weight appear to be moderately good selection criteria, though these criteria failed to yield better results for one patient.
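The phantom selection step described above reduces to a nearest-neighbour lookup in (height, weight); a minimal sketch with a scaled Euclidean distance is shown below. The scaling constants and library entries are hypothetical, and the actual UF/NCI matching criterion may differ.

```python
def select_phantom(patient, library, h_scale=10.0, w_scale=10.0):
    """Return the name of the phantom whose (height_cm, weight_kg) is
    closest to the patient's, with each axis scaled so neither dominates."""
    ph, pw = patient

    def distance(entry):
        h, w = entry
        return ((h - ph) / h_scale) ** 2 + ((w - pw) / w_scale) ** 2

    return min(library, key=lambda name: distance(library[name]))

# Hypothetical library entries, not the actual UF/NCI phantoms.
library = {"adult_170cm_70kg": (170, 70), "adult_180cm_90kg": (180, 90)}
print(select_phantom((176, 82), library))   # -> "adult_180cm_90kg"
```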
Development of a laser-guided embedded-computer-controlled air-assisted precision sprayer
USDA-ARS?s Scientific Manuscript database
An embedded computer-controlled, laser-guided, air-assisted, variable-rate precision sprayer was developed to automatically adjust spray outputs on both sides of the sprayer to match presence, size, shape, and foliage density of tree crops. The sprayer was the integration of an embedded computer, a ...
78 FR 79564 - Discontinuance of Annual Financial Assessments-Delay in Implementation
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-30
... that due to delays in modifying computer software, VA is postponing implementation of this change. FOR... computer matching of income reported to the Internal Revenue Service (IRS) and Social Security... implemented by December 31, 2013. Due to delays in revising and updating supporting computer software, VA is...
ERIC Educational Resources Information Center
Tan, Xuan; Xiang, Bihua; Dorans, Neil J.; Qu, Yanxuan
2010-01-01
The nature of the matching criterion (usually the total score) in the study of differential item functioning (DIF) has been shown to impact the accuracy of different DIF detection procedures. One of the topics related to the nature of the matching criterion is whether the studied item should be included. Although many studies exist that suggest…
Optimal case-control matching in practice.
Cologne, J B; Shibata, Y
1995-05-01
We illustrate modern matching techniques and discuss practical issues in defining the closeness of matching for retrospective case-control designs (in which the pool of subjects already exists when the study commences). We empirically compare matching on a balancing score, analogous to the propensity score for treated/control matching, with matching on a weighted distance measure. Although both methods in principle produce balance between cases and controls in the marginal distributions of the matching covariates, the weighted distance measure provides better balance in practice because the balancing score can be poorly estimated. We emphasize the use of optimal matching based on efficient network algorithms. An illustration is based on the design of a case-control study of hepatitis B virus infection as a possible confounder and/or effect modifier of radiation-related primary liver cancer in atomic bomb survivors.
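Optimal (as opposed to greedy) 1:1 matching on a weighted distance can be posed as an assignment problem. The sketch below uses a Mahalanobis distance and SciPy's Hungarian-algorithm solver as a stand-in for the network-flow codes referred to above; it illustrates the idea rather than the study's software.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment
from scipy.spatial.distance import cdist

def optimal_pair_matching(cases, controls):
    """Optimal 1:1 case-control matching that minimizes the total
    Mahalanobis distance over the matching covariates.

    cases: (n, k) array; controls: (m, k) array with m >= n and k >= 2.
    Returns a list of (case_index, control_index) pairs.
    """
    pooled = np.vstack([cases, controls])
    VI = np.linalg.inv(np.cov(pooled, rowvar=False))          # inverse pooled covariance
    dist = cdist(cases, controls, metric="mahalanobis", VI=VI)
    case_idx, control_idx = linear_sum_assignment(dist)       # Hungarian algorithm
    return list(zip(case_idx.tolist(), control_idx.tolist()))
```

Matching on an estimated balancing score would simply replace the covariate columns with the estimated score before computing the distance matrix.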
DOE Office of Scientific and Technical Information (OSTI.GOV)
McDaniel, Dwayne; Dulikravich, George; Cizmas, Paul
2017-11-27
This report summarizes the objectives, tasks and accomplishments made during the three-year duration of this research project. The report presents the results obtained by applying advanced computational techniques to develop reduced-order models (ROMs) in the case of reacting multiphase flows based on high fidelity numerical simulation of gas-solids flow structures in risers and vertical columns obtained by the Multiphase Flow with Interphase eXchanges (MFIX) software. The research includes a numerical investigation of reacting and non-reacting gas-solids flow systems and computational analysis that will involve model development to accelerate the scale-up process for the design of fluidization systems by providing accurate solutions that match the full-scale models. The computational work contributes to the development of a methodology for obtaining ROMs that is applicable to the system of gas-solid flows. Finally, the validity of the developed ROMs is evaluated by comparing the results against those obtained using the MFIX code. Additionally, the robustness of existing POD-based ROMs for multiphase flows is improved by avoiding non-physical solutions of the gas void fraction and ensuring that the reduced kinetics models used for reactive flows in fluidized beds are thermodynamically consistent.
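The POD step at the heart of such reduced-order models — extracting a low-dimensional basis from simulation snapshots and projecting the state onto it — can be sketched with a plain SVD. This is the generic method, not the project's MFIX-coupled code, and the energy threshold is an arbitrary illustrative choice.

```python
import numpy as np

def pod_basis(snapshots, energy=0.999):
    """Compute a POD basis from a snapshot matrix.

    snapshots: (n_dof, n_snapshots) array, one flow-field snapshot per
    column.  Returns (basis, mean), keeping the fewest modes that capture
    the requested fraction of fluctuation energy.
    """
    mean = snapshots.mean(axis=1, keepdims=True)
    U, s, _ = np.linalg.svd(snapshots - mean, full_matrices=False)
    cumulative = np.cumsum(s ** 2) / np.sum(s ** 2)
    r = int(np.searchsorted(cumulative, energy)) + 1
    return U[:, :r], mean

def reduced_coordinates(field, basis, mean):
    """Project a full-order field onto the POD basis."""
    return basis.T @ (field - mean.ravel())
```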
TU-AB-BRA-02: An Efficient Atlas-Based Synthetic CT Generation Method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Han, X
2016-06-15
Purpose: A major obstacle for MR-only radiotherapy is the need to generate an accurate synthetic CT (sCT) from MR image(s) of a patient for the purposes of dose calculation and DRR generation. We propose here an accurate and efficient atlas-based sCT generation method, which has a computation speed largely independent of the number of atlases used. Methods: Atlas-based sCT generation requires a set of atlases with co-registered CT and MR images. Unlike existing methods that align each atlas to the new patient independently, we first create an average atlas and pre-align every atlas to the average atlas space. When a new patient arrives, we compute only one deformable image registration to align the patient MR image to the average atlas, which indirectly aligns the patient to all pre-aligned atlases. A patch-based non-local weighted fusion is performed in the average atlas space to generate the sCT for the patient, which is then warped back to the original patient space. We further adapt a PatchMatch algorithm that can quickly find top matches between patches of the patient image and all atlas images, which makes the patch fusion step also independent of the number of atlases used. Results: Nineteen brain tumour patients with both CT and T1-weighted MR images are used as testing data and a leave-one-out validation is performed. Each sCT generated is compared against the original CT image of the same patient on a voxel-by-voxel basis. The proposed method produces a mean absolute error (MAE) of 98.6±26.9 HU overall. The accuracy is comparable with a conventional implementation scheme, but the computation time is reduced from over an hour to four minutes. Conclusion: An average atlas space patch fusion approach can produce highly accurate sCT estimations very efficiently. Further validation on dose computation accuracy and using a larger patient cohort is warranted. The author is a full time employee of Elekta, Inc.
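The patch fusion step described in the Methods — weight each candidate atlas CT value by how similar its surrounding atlas MR patch is to the patient's MR patch, then take the weighted average — can be written per voxel as a short function. This is a schematic of non-local weighted fusion under assumed Gaussian weights, not the vendor's implementation.

```python
import numpy as np

def fuse_patch_value(patient_patch, atlas_mr_patches, atlas_ct_values, h=50.0):
    """Non-local weighted fusion of candidate CT values for one voxel.

    patient_patch:     flattened MR patch around the target voxel.
    atlas_mr_patches:  (n_candidates, patch_size) MR patches gathered from
                       the pre-aligned atlases (e.g. via PatchMatch).
    atlas_ct_values:   (n_candidates,) CT value at each candidate's centre.
    h:                 decay parameter controlling how fast weights fall off.
    """
    d2 = np.sum((atlas_mr_patches - patient_patch) ** 2, axis=1)   # patch dissimilarity
    w = np.exp(-d2 / (h ** 2))                                     # similarity weights
    return float(np.sum(w * atlas_ct_values) / np.sum(w))          # weighted-average sCT value
```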