Science.gov

Sample records for high-level template matching

  1. Fast Legendre moment computation for template matching

    NASA Astrophysics Data System (ADS)

    Li, Bing C.

    2017-05-01

    Normalized cross correlation (NCC) based template matching is insensitive to intensity changes and has many applications in image processing, object detection, video tracking, and pattern recognition. However, NCC implementation is computationally expensive since it involves both correlation computation and normalization. In this paper, we propose a Legendre moment approach for fast normalized cross correlation implementation and show that the computational cost of the proposed approach is independent of the template mask size, making it significantly faster than traditional mask-size-dependent approaches, especially for large mask templates. Legendre polynomials have been widely used in solving the Laplace equation in electrodynamics in spherical coordinate systems and in solving the Schrödinger equation in quantum mechanics. In this paper, we extend Legendre polynomials from physics to the computer vision and pattern recognition fields, and demonstrate that they can significantly reduce the computational cost of NCC based template matching.
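    For orientation, a minimal NumPy sketch of the plain, mask-size-dependent NCC baseline follows; it is a generic illustration of the cost that moment-based accelerations like the one proposed here aim to avoid, not the paper's Legendre-moment implementation, and all names are illustrative.

```python
import numpy as np

def ncc_match(image, template):
    """Exhaustive normalized cross correlation; returns the NCC score map.

    Baseline O(W*H*w*h) implementation: the per-location cost grows with
    the template size, which is exactly what moment-based methods avoid.
    """
    th, tw = template.shape
    t = template.astype(float) - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    H, W = image.shape
    out = np.full((H - th + 1, W - tw + 1), -1.0)
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            win = image[y:y + th, x:x + tw].astype(float)
            w = win - win.mean()
            denom = np.sqrt((w ** 2).sum()) * t_norm
            if denom > 0:
                out[y, x] = (w * t).sum() / denom
    return out

# Locate a small patch inside a synthetic image.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
tmpl = img[20:28, 30:38].copy()
score = ncc_match(img, tmpl)
y, x = np.unravel_index(score.argmax(), score.shape)
print(int(y), int(x))  # 20 30
```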

  2. Template Matching on Parallel Architectures,

    DTIC Science & Technology

    1985-07-01

    memory. The processors run asynchronously. Thus according to Flynn's categories the Butterfly is a MIMD machine. The processors of the Butterfly are...Generalized Butterfly Architecture This section describes timings for pattern matching on the generalized Butterfly. The implementations on the Butterfly...these algorithms. Thus the best implementation of the techniques on the generalized Butterfly are the same as the implementation on the real Butterfly

  3. Assessing particle kinematics via template matching algorithms.

    PubMed

    Weber, M; Fink, M; Fortov, V; Lipaev, A; Molotkov, V; Morfill, G; Petrov, O; Pustylnik, M; Thoma, M; Thomas, H; Usachev, A; Raeth, C

    2016-04-18

    Template matching algorithms represent a viable tool to locate particles in optical images. A crucial factor in the performance of these methods is the choice of the similarity measure. Recently, it was shown in [Gao and Helgeson, Opt. Express 22 (2014)] that the correlation coefficient (CC) leads to good results. Here, we introduce the mutual information (MI) as a nonlinear similarity measure and compare the performance of the MI and the CC for different noise scenarios. It turns out that the mutual information leads to superior results in the case of signal-dependent noise. We also propose a novel approach to estimate the velocity of particles which is applicable in imaging scenarios where the particles appear elongated due to their movement. By designing a bank of anisotropic templates intended to fit the elongation of the particles, we are able to reliably estimate their velocity and direction of motion from a single image.
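    As a concrete reference for the similarity measure discussed above, here is a minimal histogram-based mutual information estimator for two equally sized patches; it is a generic MI estimate (the bin count is an arbitrary assumption), not the authors' exact implementation.

```python
import numpy as np

def mutual_information(patch_a, patch_b, bins=32):
    """Estimate MI (in nats) between two equally sized patches via their joint histogram."""
    joint, _, _ = np.histogram2d(patch_a.ravel(), patch_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of patch_a
    py = pxy.sum(axis=0, keepdims=True)   # marginal of patch_b
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# A patch has high MI with itself and low MI with unrelated noise.
rng = np.random.default_rng(1)
a = rng.random((16, 16))
print(mutual_information(a, a), mutual_information(a, rng.random((16, 16))))
```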

  4. Scale estimation of objects using template matching

    NASA Astrophysics Data System (ADS)

    Gaxiola, Leopoldo N.; Diaz-Ramirez, Victor H.; Tapia, Juan J.

    2016-09-01

    Scale estimation of objects is a challenging problem in image processing. This work presents a novel method to detect and estimate the scaling factor of a target in an observed scene corrupted with additive noise and clutter. Given a set of available views of the target, the proposed method is able to detect the target and estimate its scaling factor using template matched filters and a scale pyramidal representation. The performance of the proposed method is evaluated in synthetic and real-life scenes in different pattern recognition applications. The obtained results are characterized in terms of objective metrics.

  5. Automatic target detection using binary template matching

    NASA Astrophysics Data System (ADS)

    Jun, Dong-San; Sun, Sun-Gu; Park, HyunWook

    2005-03-01

    This paper presents a new automatic target detection (ATD) algorithm to detect targets such as battle tanks and armored personnel carriers in ground-to-ground scenarios. Whereas most ATD algorithms were developed for forward-looking infrared (FLIR) images, we have developed an ATD algorithm for charge-coupled device (CCD) images, which have superior quality to FLIR images in daylight. The proposed algorithm uses fast binary template matching with an adaptive binarization, which is robust to various light conditions in CCD images and saves computation time. Experimental results show that the proposed method has good detection performance.
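    A toy sketch of the two ingredients named above, assuming a local-mean threshold for the adaptive binarization (the paper's actual rule may differ) and a simple pixel-agreement score for the binary matching; function names are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def adaptive_binarize(image, win=15, offset=0.0):
    """Binarize by comparing each pixel with its local mean (one common adaptive rule)."""
    local_mean = uniform_filter(image.astype(float), size=win)
    return (image > local_mean + offset).astype(np.uint8)

def binary_match(binary_image, binary_template):
    """Return the offset maximizing the fraction of agreeing pixels (Hamming similarity)."""
    th, tw = binary_template.shape
    H, W = binary_image.shape
    best, best_pos = -1.0, None
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            score = (binary_image[y:y + th, x:x + tw] == binary_template).mean()
            if score > best:
                best, best_pos = score, (y, x)
    return best_pos, best
```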

  6. A Finger Vein Identification Method Based on Template Matching

    NASA Astrophysics Data System (ADS)

    Zou, Hui; Zhang, Bing; Tao, Zhigang; Wang, Xiaoping

    2016-01-01

    New methods for extracting vein features from finger vein images and for generating matching templates are proposed. In the template generation algorithm, we introduce a parameter, the template quality factor (TQF), to measure the quality of the generated templates, so that fewer finger vein samples are needed to generate templates that meet the quality requirement for identification. The recognition accuracy obtained using the proposed feature extraction and template generation strategy for identification is 97.14%.

  7. Charge density matching in templated molybdates

    SciTech Connect

    Casalongue, Hernan Sanchez; Choyke, Sarah J.; Narducci Sarjeant, Amy; Schrier, Joshua; Norquist, Alexander J.

    2009-06-15

    The role of charge density matching in the formation of templated molybdates under mild hydrothermal conditions was investigated through the use of a series of structurally related amines: piperazine, 1,4-dimethylpiperazine, 2,5-dimethylpiperazine and 2,6-dimethylpiperazine. A series of reactions was conducted in which the relative mole fractions of each component were fixed at 2.5 MoO3 : 1 amine : 330 H2O : 2 H2SO4 in order to isolate the effects of the amine; the only variation between reactions was the structure of the amine. Four distinct polyoxomolybdate anions were observed, ranging from zero-dimensional beta-[Mo8O26]^4- molecular anions to [Mo3O10]n^2n- and [Mo8O26]n^4n- chains and [Mo5O16]n^2n- layers. The primary influence over the structure of the molybdate anion is charge density matching with the protonated amine, which was quantified through surface area approximations based upon both calculated molecular surfaces and polyhedral representations of each anion. Secondary influences include amine symmetry and hydrogen-bonding preferences. The synthesis and characterization of two new compounds are reported. Crystal data: [C6H16N2][Mo3O10]·H2O (1), triclinic, P-1 (no. 2), a=8.0973(7) Å, b=8.8819(9) Å, c=11.5969(11) Å, alpha=71.362(9)°, beta=82.586(8)°, gamma=74.213(8)°, Z=2, R/Rw=0.0262/0.0564, and [C6H16N2]2[Mo8O26] (2), monoclinic, P21/n (no. 14), a=7.9987(11) Å, b=12.5324(19) Å, c=16.003(3) Å, beta=97.393(14)°, Z=2, R/Rw=0.0189/0.0454. Graphical abstract: A geometric decomposition method for surface area determination is presented in the context of charge density matching in new organically templated polyoxomolybdates.

  8. Robust structural identification via polyhedral template matching

    NASA Astrophysics Data System (ADS)

    Mahler Larsen, Peter; Schmidt, Søren; Schiøtz, Jakob

    2016-06-01

    Successful scientific applications of large-scale molecular dynamics often rely on automated methods for identifying the local crystalline structure of condensed phases. Many existing methods for structural identification, such as common neighbour analysis, rely on interatomic distances (or thresholds thereof) to classify atomic structure. As a consequence they are sensitive to strain and thermal displacements, and preprocessing such as quenching or temporal averaging of the atomic positions is necessary to provide reliable identifications. We propose a new method, polyhedral template matching (PTM), which classifies structures according to the topology of the local atomic environment, without any ambiguity in the classification, and with greater reliability than e.g. common neighbour analysis in the presence of thermal fluctuations. We demonstrate that the method can reliably be used to identify structures even in simulations near the melting point, and that it can identify the most common ordered alloy structures as well. In addition, the method makes it easy to identify the local lattice orientation in polycrystalline samples, and to calculate the local strain tensor. An implementation is made available under a Free and Open Source Software license.

  9. Infrared point target detection with improved template matching

    NASA Astrophysics Data System (ADS)

    Liu, Ruiming; Lu, Yanhong; Gong, Chenglong; Liu, Yang

    2012-07-01

    Detecting point targets in infrared images is a difficult task. Template matching is simple and easy to implement for this task, but it has some shortcomings. We propose an improved template matching method for detecting targets. Unlike classic template matching, the projection coefficients obtained from principal component analysis are used as templates, and a nonlinear correlation is proposed to measure the similarity (the matching degree). Correlation in the original space cannot capture the higher-order statistical properties of images, so its detection performance is unsatisfactory. We introduce the nonlinear correlation, which computes correlation coefficients in a higher-dimensional, or even infinite-dimensional, feature space, to capture these higher-order statistics, and the detection performance is improved greatly. Results of experiments show that the improved method is competent to detect infrared point targets.
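    The nonlinear correlation idea can be sketched with a Gaussian kernel, which corresponds to an inner product in an infinite-dimensional feature space; this is a generic kernel similarity for illustration, not necessarily the exact formulation used in the paper.

```python
import numpy as np

def kernel_similarity(template, window, sigma=1.0):
    """Gaussian-kernel similarity between two equally sized patches.

    exp(-||t - w||^2 / (2*sigma^2)) equals an inner product in an
    infinite-dimensional feature space, so it reflects higher-order
    statistics that plain linear correlation misses.
    """
    t = (template - template.mean()) / (template.std() + 1e-12)
    w = (window - window.mean()) / (window.std() + 1e-12)
    return float(np.exp(-np.sum((t - w) ** 2) / (2.0 * sigma ** 2)))
```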

  10. Robust template matching using run-length encoding

    NASA Astrophysics Data System (ADS)

    Lee, Hunsue; Suh, Sungho; Cho, Hansang

    2015-09-01

    In this paper we propose a novel template matching algorithm for visual inspection of bare printed circuit boards (PCBs). In conventional template matching for PCB inspection, the matching score and its offsets are obtained by finding the maximum value among the convolutions of the template image and the camera image. While this method is fast, the robustness and accuracy of matching are not guaranteed because of the gap between the design and the implementation that results from defects and process variations. To resolve this problem, we suggest a new method that uses run-length encoding (RLE). For the template image to be matched, we accumulate foreground and background data, and RLE data for each row and column of the template image. Using these data, we can find the x and y offsets that minimize the optimization function. The efficiency and robustness of the proposed algorithm are verified through a series of experiments. Comparison with the conventional approach shows that the proposed algorithm is not only fast but also more robust and reliable in its matching results.
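    Run-length encoding itself is straightforward; the sketch below shows one way to turn a binary row into (value, run-length) pairs, which is the kind of per-row/per-column data the proposed matcher accumulates (the matching objective itself is not reproduced here).

```python
import numpy as np

def run_length_encode(row):
    """Encode a 1-D binary array as (value, run_length) pairs."""
    row = np.asarray(row)
    change = np.flatnonzero(np.diff(row)) + 1          # indices where the value changes
    starts = np.concatenate(([0], change))
    lengths = np.diff(np.concatenate((starts, [row.size])))
    return list(zip(row[starts].tolist(), lengths.tolist()))

print(run_length_encode([0, 0, 1, 1, 1, 0, 1]))  # [(0, 2), (1, 3), (0, 1), (1, 1)]
```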

  11. Correlation-coefficient-based fast template matching through partial elimination.

    PubMed

    Mahmood, Arif; Khan, Sohaib

    2012-04-01

    Partial computation elimination techniques are often used for fast template matching. At a particular search location, computations are prematurely terminated as soon as it is found that this location cannot compete with an already known best match location. Due to the nonmonotonic growth pattern of the correlation-based similarity measures, partial computation elimination techniques have been traditionally considered inapplicable to speed up these measures. In this paper, we show that partial elimination techniques may be applied to a correlation coefficient by using a monotonic formulation, and we propose basic-mode and extended-mode partial correlation elimination algorithms for fast template matching. The basic-mode algorithm is more efficient on small template sizes, whereas the extended mode is faster on medium and larger templates. We also propose a strategy to decide which algorithm to use for a given data set. To achieve a high speedup, elimination algorithms require an initial guess of the peak correlation value. We propose two initialization schemes including a coarse-to-fine scheme for larger templates and a two-stage technique for small- and medium-sized templates. Our proposed algorithms are exact, i.e., having exhaustive equivalent accuracy, and are compared with the existing fast techniques using real image data sets on a wide variety of template sizes. While the actual speedups are data dependent, in most cases, our proposed algorithms have been found to be significantly faster than the other algorithms.
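    The basic idea of partial computation elimination is easiest to see for a monotone measure such as the sum of squared differences, sketched below; the paper's contribution, a monotonic reformulation that makes the same pruning applicable to the correlation coefficient, is not reproduced here.

```python
import numpy as np

def ssd_with_elimination(image, template):
    """Sum-of-squared-differences matching with early termination.

    The running SSD can only grow as template rows are accumulated, so a
    candidate location is abandoned as soon as it exceeds the best score
    found so far, without changing the final result.
    """
    th, tw = template.shape
    H, W = image.shape
    best, best_pos = np.inf, None
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            acc = 0.0
            for r in range(th):  # accumulate one template row at a time
                diff = image[y + r, x:x + tw].astype(float) - template[r].astype(float)
                acc += float(np.dot(diff, diff))
                if acc >= best:  # cannot beat the current best -> prune
                    break
            else:
                best, best_pos = acc, (y, x)
    return best_pos, best
```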

  12. A new template matching method based on contour information

    NASA Astrophysics Data System (ADS)

    Cai, Huiying; Zhu, Feng; Wu, Qingxiao; Li, Sicong

    2014-11-01

    Template matching is a significant approach in machine vision due to its effectiveness and robustness. However, most template matching methods are so time consuming that they cannot be used in many real-time applications. Closed-contour matching is a popular kind of template matching method. This paper presents a new closed-contour template matching method that is suitable for two-dimensional objects. A coarse-to-fine search strategy is used to improve the matching efficiency, and a partial computation elimination scheme is proposed to further speed up the search process. The method consists of offline model construction and online matching. In the process of model construction, triples and a distance image are obtained from the template image. A certain number of triples, each composed of three points, are created from the contour information extracted from the template image; the rule for selecting the three points is that they divide the template contour into three equal parts. The distance image is obtained by a distance transform, so each point on the distance image represents the distance to the nearest point on the template contour. During matching, triples of the search image are created with the same rule as the triples of the model. Using the similarity between triangles, which is invariant to rotation, translation, and scaling, the triples corresponding to those of the model are found. We can then obtain the initial RST (rotation, translation, and scaling) parameters mapping the search contour to the template contour. To speed up the search process, the points on the search contour are sampled to reduce the number of triples. To verify the RST parameters, the search contour is projected onto the distance image, and the mean distance can be computed rapidly by simple additions and multiplications. In the fine searching process
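    The distance-image verification step can be sketched with SciPy's Euclidean distance transform, as below; the triple construction and the coarse-to-fine search are omitted, and the helper names are illustrative.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def contour_distance_image(contour_mask):
    """Distance image: each pixel holds the distance to the nearest template-contour pixel."""
    # distance_transform_edt measures distance to the nearest zero, so the
    # mask is inverted to make contour pixels the zeros.
    return distance_transform_edt(~contour_mask.astype(bool))

def verify_rst(distance_image, projected_points):
    """Mean distance of the RST-transformed search-contour points (N x 2 array of x, y).

    A small mean distance indicates the candidate RST parameters are good.
    """
    ys = np.clip(np.round(projected_points[:, 1]).astype(int), 0, distance_image.shape[0] - 1)
    xs = np.clip(np.round(projected_points[:, 0]).astype(int), 0, distance_image.shape[1] - 1)
    return float(distance_image[ys, xs].mean())
```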

  13. Template Matching Approach to Signal Prediction

    NASA Technical Reports Server (NTRS)

    Mackey, Ryan; Kulikov, Igor

    2010-01-01

    A new approach to signal prediction and prognostic assessment of spacecraft health resolves an inherent difficulty in fusing sensor data with simulated data. This technique builds upon previous work that demonstrated the importance of physics-based transient models to accurate prediction of signal dynamics and system performance. While models can greatly improve predictive accuracy, they are difficult to apply in general because of variations in model type, accuracy, or intended purpose. However, virtually any flight project will have at least some modeling capability at its disposal, whether a full-blown simulation, partial physics models, dynamic look-up tables, a brassboard analogue system, or simple hand-driven calculation by a team of experts. Many models can be used to develop a predict, or an estimate of the next day's or next cycle's behavior, which is typically used for planning purposes. The fidelity of a predict varies from one project to another, depending on the complexity of the simulation (i.e. linearized or full differential equations) and the level of detail in anticipated system operation, but typically a predict cannot be adapted to changing conditions or to adjusted spacecraft command execution. Applying a predict blindly, without adapting it to current conditions, produces mixed results at best, primarily due to mismatches between assumed execution of spacecraft activities and actual times of execution. This results in the predict becoming useless during periods of complicated behavior, exactly when the predict would be most valuable. Each spacecraft operation tends to show up as a transient in the data, and if the transients are misaligned, using the predict can actually harm forecasting performance. To address this problem, the approach here expresses the predict in terms of a baseline function superposed with one or more transient functions. These transients serve as signal templates, which can be relocated in time and space against

  14. Template Matching for Auditing Hospital Cost and Quality

    PubMed Central

    Silber, Jeffrey H; Rosenbaum, Paul R; Ross, Richard N; Ludwig, Justin M; Wang, Wei; Niknam, Bijan A; Mukherjee, Nabanita; Saynisch, Philip A; Even-Shoshan, Orit; Kelz, Rachel R; Fleisher, Lee A

    2014-01-01

    Objective Develop an improved method for auditing hospital cost and quality. Data Sources/Setting Medicare claims in general, gynecologic and urologic surgery, and orthopedics from Illinois, Texas, and New York between 2004 and 2006. Study Design A template of 300 representative patients was constructed and then used to match 300 patients at hospitals that had a minimum of 500 patients over a 3-year study period. Data Collection/Extraction Methods From each of 217 hospitals we chose 300 patients most resembling the template using multivariate matching. Principal Findings The matching algorithm found close matches on procedures and patient characteristics, far more balanced than measured covariates would be in a randomized clinical trial. These matched samples displayed little to no differences across hospitals in common patient characteristics yet found large and statistically significant hospital variation in mortality, complications, failure-to-rescue, readmissions, length of stay, ICU days, cost, and surgical procedure length. Similar patients at different hospitals had substantially different outcomes. Conclusion The template-matched sample can produce fair, directly standardized audits that evaluate hospitals on patients with similar characteristics, thereby making benchmarking more believable. Through examining matched samples of individual patients, administrators can better detect poor performance at their hospitals and better understand why these problems are occurring. PMID:24588413

  15. Template matching for auditing hospital cost and quality.

    PubMed

    Silber, Jeffrey H; Rosenbaum, Paul R; Ross, Richard N; Ludwig, Justin M; Wang, Wei; Niknam, Bijan A; Mukherjee, Nabanita; Saynisch, Philip A; Even-Shoshan, Orit; Kelz, Rachel R; Fleisher, Lee A

    2014-10-01

    Develop an improved method for auditing hospital cost and quality. Medicare claims in general, gynecologic and urologic surgery, and orthopedics from Illinois, Texas, and New York between 2004 and 2006. A template of 300 representative patients was constructed and then used to match 300 patients at hospitals that had a minimum of 500 patients over a 3-year study period. From each of 217 hospitals we chose 300 patients most resembling the template using multivariate matching. The matching algorithm found close matches on procedures and patient characteristics, far more balanced than measured covariates would be in a randomized clinical trial. These matched samples displayed little to no differences across hospitals in common patient characteristics yet found large and statistically significant hospital variation in mortality, complications, failure-to-rescue, readmissions, length of stay, ICU days, cost, and surgical procedure length. Similar patients at different hospitals had substantially different outcomes. The template-matched sample can produce fair, directly standardized audits that evaluate hospitals on patients with similar characteristics, thereby making benchmarking more believable. Through examining matched samples of individual patients, administrators can better detect poor performance at their hospitals and better understand why these problems are occurring. © Health Research and Educational Trust.

  16. Chinese license plate character segmentation using multiscale template matching

    NASA Astrophysics Data System (ADS)

    Tian, Jiangmin; Wang, Guoyou; Liu, Jianguo; Xia, Yuanchun

    2016-09-01

    Character segmentation (CS) plays an important role in automatic license plate recognition and has been studied for decades. A method using multiscale template matching is proposed to settle the problem of CS for Chinese license plates. It is carried out on a binary image integrated from maximally stable extremal region detection and Otsu thresholding. Afterward, a uniform harrow-shaped template with variable length is designed, by virtue of which a three-dimensional matching space is constructed for searching of candidate segmentations. These segmentations are detected at matches with local minimum responses. Finally, the vertical boundaries of each single character are located for subsequent recognition. Experiments on a data set including 2349 license plate images of different quality levels show that the proposed method can achieve a higher accuracy at comparable time cost and is robust to images in poor conditions.

  17. Monocular correspondence detection for symmetrical objects by template matching

    NASA Astrophysics Data System (ADS)

    Vilmar, G.; Besslich, Philipp W., Jr.

    1990-09-01

    We describe a way to reconstruct 3-D information from a single view of a 3-D bilaterally symmetric object. The symmetry assumption allows us to obtain a "second view" from a different viewpoint by a simple reflection of the monocular image. We therefore have to solve the correspondence problem in a special case where known feature-based or area-based binocular approaches fail. In principle, our approach is based on frequency-domain template matching of the features on the epipolar lines. During a training period our system "learns" the assignment of correspondence models to image features. The object shape is interpolated when no template matches the image features. This is an important advantage of the methodology, because no "real world" image holds the symmetry assumption perfectly. To simplify the training process we used single views of human faces (e.g., passport photos), but our system is trainable on other kinds of objects.

  18. Relationship between field tests and match running performance in high-level young Brazilian soccer players.

    PubMed

    Aquino, Rodrigo; Palucci Vieira, Luiz H; de Paula Oliveira, Lucas; Cruz Goncalves, Luiz G; Pereira Santiago, Paulo R

    2017-02-14

    The main aim of this study was to analyse the relationship between field tests and match running performance using computational tracking technology in high-level young Brazilian soccer players. Twenty-five young male Brazilian soccer players participated in this study (U-15, n = 13; U-17, n = 12). In the same week, the players completed the field tests and played actual matches. The field tests were: Maximum Speed (10 m-30 m), Zig-Zag, Running-based Anaerobic Sprint Test, and Yo-Yo Intermittent Recovery Test level 1. Match running performance was collected using a fixed video camera, and computerized video tracking analysis (30 Hz) was then used to identify each physical performance indicator. Pearson's correlation and linear regression were used for statistical analysis. The results showed that the majority of field tests were not related to match running performance. The Zig-Zag Test, Running-based Anaerobic Sprint Test, and Yo-Yo Intermittent Recovery Test level 1 appear to be the most specific tests (r = 0.41-0.47); however, the explanatory power of these field tests in relation to match running performance was low (R2 = 17-22%). Assessment of match running performance should be included in the evaluation periods of young soccer players, together with the most specific tests reported.

  19. Template match using local feature with view invariance

    NASA Astrophysics Data System (ADS)

    Lu, Cen; Zhou, Gang

    2013-10-01

    Matching a template image to a target image is a fundamental task in the field of computer vision. To address the deficiencies of traditional image matching methods and the inaccurate matching of scene images under rotation, illumination, and view changes, a novel matching algorithm using local features is proposed in this paper. Local histograms of the edge pixels (LHoE) are extracted as an invariant feature to resist view and brightness changes. The merit of the LHoE is that edge points are little affected by view changes, and the LHoE resists not only illumination variation but also noise pollution. Because matching is executed only on the edge points, the computational burden is greatly reduced. Additionally, our approach is conceptually simple, easy to implement, and does not require a training phase. A view change can be considered as a combination of rotation, illumination change, and shear transformation. Experimental results on simulated and real data demonstrate that the proposed approach is superior to NCC (normalized cross-correlation) and histogram-based methods under view changes.

  20. Tectonic tremor locations using template matching of low frequency earthquakes

    NASA Astrophysics Data System (ADS)

    Skoumal, R.; Colella, H. V.; Holtkamp, S. G.; Brudzinski, M. R.; Schlanser, K. M.; Shelly, D. R.; Cabral-Cano, E.; Arciniega-Ceballos, A.

    2012-12-01

    Tectonic (non-volcanic) tremor is difficult to locate due to its emergent nature, but critical to assess what impact it has on the plate interface slip budget. Recent studies have found that tectonic tremor is primarily composed of a swarm of low frequency earthquakes, such that identifying individual low frequency earthquakes can provide opportunities to improve source characterizations. This study seeks to refine the tremor source locations by stacking families of similar low frequency earthquakes to enhance the signal to noise ratio and clarify P- and S-wave arrivals, and to better characterize the time history of specific "families" of tremor events. Short, well-defined tremor bursts identified from previous source location analysis are used to define template waveforms that are cross-correlated over several years of recording. Since multi-station template matching algorithms are particularly sensitive to source location, accurate time histories of event families can be produced. These time histories provide an important additional constraint on episodic tremor and slip events (and an independent test of both procedures) since they do not depend on station amplitudes as more traditional techniques do, which may impart a detection bias. Stacking similar events clarifies arrival times that are then used to refine the source locations. This approach is being applied to the Oaxaca region of Mexico and southern Cascadia, where lower network density has limited detailed tremor source location analysis.
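    A single-channel sketch of the core template matching step is given below, assuming equal-length, time-aligned traces and templates per station; real tremor studies add network-level stacking of many channels, statistically derived detection thresholds, and data handling that are omitted here.

```python
import numpy as np

def sliding_ncc(trace, template):
    """Normalized cross-correlation of a short template against a longer trace."""
    n = template.size
    t = (template - template.mean()) / (template.std() * n)
    out = np.empty(trace.size - n + 1)
    for i in range(out.size):
        win = trace[i:i + n]
        s = win.std()
        out[i] = 0.0 if s == 0 else float(np.dot(win - win.mean(), t)) / s
    return out

def detect(traces, templates, threshold=0.7):
    """Average per-station correlation functions and report samples above the threshold."""
    cc = np.mean([sliding_ncc(tr, tp) for tr, tp in zip(traces, templates)], axis=0)
    return np.flatnonzero(cc >= threshold), cc
```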

  1. Automatic Identification of the Templates in Matched Filtering

    SciTech Connect

    Awwal, A S

    2004-09-29

    In laser beam position determination, various shapes of markers may be used to identify different beams. When matched filtering is used for identifying the markers, one is faced with the challenge of determining the appropriate filter to use in the presence of distortions and marker size variability. If the incorrect filter is used, it will result in significant position uncertainty. Thus in the very first step of position detection one has to come up with an automated process to select the right template to use. The automated template identification method proposed here is based on a two-step approach. In the first step an approximate type of the object is determined. Then the filter is chosen based on the best size of the specific type. After the appropriate filter is chosen, the correlation peak position is used to identify the beam position. Real world examples of the application of this technique from the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory are presented.

  2. Automatic identification of the templates in matched filtering

    NASA Astrophysics Data System (ADS)

    Awwal, Abdul A. S.

    2004-11-01

    In laser beam position determination, various shapes of markers may be used to identify different beams. When matched filtering is used for identifying the markers, one is faced with the challenge of determining the appropriate filter to use in the presence of distortions and marker size variability. If the incorrect filter is used, it will result in significant position uncertainty. Thus in the very first step of position detection one has to come up with an automated process to select the right template to use. The automated template identification method proposed here is based on a two-step approach. In the first step an approximate type of the object is determined. Then the filter is chosen based on the best size of the specific type. After the appropriate filter is chosen, the correlation peak position is used to identify the beam position. Real world examples of the application of this technique from the National Ignition Facility (NIF) at Lawrence Livermore National Laboratory are presented.

  3. Template matching method for the analysis of interstellar cloud structure

    NASA Astrophysics Data System (ADS)

    Juvela, M.

    2016-09-01

    Context. The structure of the interstellar medium can be characterised at large scales in terms of its global statistics (e.g. power spectra) and at small scales by the properties of individual cores. Interest has been increasing in structures at intermediate scales, resulting in a number of methods being developed for the analysis of filamentary structures. Aims: We describe the application of the generic template-matching (TM) method to the analysis of maps. Our aim is to show that it provides a fast and still relatively robust way to identify elongated structures or other image features. Methods: We present the implementation of a TM algorithm for map analysis. The results are compared against the rolling Hough transform (RHT), one of the methods previously used to identify filamentary structures. We illustrate the method by applying it to Herschel surface brightness data. Results: The performance of the TM method is found to be comparable to that of RHT, but TM appears to be more robust regarding the input parameters, for example those related to the selected spatial scales. Small modifications of TM enable one to target structures at different size and intensity levels. In addition to elongated features, we demonstrate the possibility of using TM to also identify other types of structures. Conclusions: The TM method is a viable tool for data quality control, exploratory data analysis, and even quantitative analysis of structures in image data.

  4. Finger Character Recognition Using 3-Dimensional Template Matching

    NASA Astrophysics Data System (ADS)

    Higashiyama, Kazuhiro; Ono, Satoshi; Wang, Yu; Nakayama, Shigeru

    This paper proposes a method for Japanese finger character recognition using a 3-dimensional (3D) scanner. The hand is a complex, dexterous manipulator, more highly evolved than that of any other animal. Being capable of making many different complex shapes, the hand is ideal for communicating using gestures. The recognition of a whole language, such as the Japanese finger characters, requires the differentiation of subtly similar positions of each digit. To capture the exact 3D position of the hand's digits and its overall shape, data gloves have been developed, but these are inconvenient to use, and 2D image recognition systems struggle to recover the 3D information. The proposed method therefore uses a 3D scanner to capture the 3D information and then matches it against 3D templates representing each unique character. Experimental results show that the proposed method recognizes a greater number of characters than existing 2D-based systems, with a recognition accuracy averaging 93% over 9 test subjects and exceeding 98% for 4 of them.

  5. The effect of high level tennis matches on urine steroid profiles in professional tennis players.

    PubMed

    Muñoz, D; Toribio, F; Timón, R; Olcina, G; Maynar, J I; Maynar, M

    2010-12-01

    Modern-day tennis matches are characterized by shorter and more intense efforts, with players enduring great physical and psychological stress. The aim of this study was to evaluate acute changes in the urinary steroid profile of elite tennis players following professional tournament matches. Eight professional male tennis players participated in this study. Urine samples were collected before and after tennis matches corresponding to the quarter finals of the Spanish Tennis Masters. After the match, there was a significant fall (P<0.05) in testosterone, androsterone, etiocholanolone, and dehydroepiandrosterone (DHEA). Cortisone increased whereas tetrahydrocortisone (THE) decreased. The anabolic/catabolic hormone ratios also decreased, although only the falls in the total suprarenal androgen (TSA)/total corticosteroid (TC) and DHEA/(THE+THF) ratios were significant (P<0.05). These results indicate that a professional tennis match modifies the urine steroid profile of players, increasing corticosteroid and decreasing androgen excretion in urine, suggesting an important adrenal activation.

  6. A TSR Visual Servoing System Based on a Novel Dynamic Template Matching Method †

    PubMed Central

    Cai, Jia; Huang, Panfeng; Zhang, Bin; Wang, Dongke

    2015-01-01

    The so-called Tethered Space Robot (TSR) is a novel active space debris removal system. To solve its problem of non-cooperative target recognition during short-distance rendezvous, this paper presents a framework for a real-time visual servoing system using a non-calibrated monocular CMOS (Complementary Metal Oxide Semiconductor) camera. When a small template is matched against a large scene, mismatches are common, so a novel template matching algorithm is presented to solve this problem. First, the matching algorithm uses a hollow annulus structure based on the FAST (Features from Accelerated Segment Test) detector, which makes the method rotation-invariant; the hollow structure also decreases the accumulative deviation. The matching function combines grey-level and gradient differences between the template and the object image, which helps reduce the effects of illumination changes and noise. Then, a dynamic template update strategy is designed to avoid tracking failures caused by wrong matches or occlusion. Finally, the system incorporates a least-squares integrated predictor, realizing online tracking in complex circumstances. Ground experiments show that the proposed algorithm reduces the need for sophisticated computation and improves matching accuracy. PMID:26703609

  7. A novel artificial bee colony algorithm based on internal-feedback strategy for image template matching.

    PubMed

    Li, Bai; Gong, Li-Gang; Li, Ya

    2014-01-01

    Image template matching refers to the technique of locating a given reference image within a source image such that they are most similar. It is a fundamental task in the field of visual target recognition. In general, there are two critical aspects of a template matching scheme: one is the similarity measure and the other is the search for the best-match location. In this work, we choose the well-known normalized cross correlation model as the similarity criterion. The search for the best-match location is carried out by an internal-feedback artificial bee colony (IF-ABC) algorithm. The IF-ABC algorithm is distinguished by its effort to combat premature convergence. This is achieved by discarding the conventional roulette selection procedure of the ABC algorithm, so that each employed bee has an equal chance of being followed by the onlooker bees in the local search phase. In addition, we suggest efficiently utilizing the internal convergence states as feedback guidance for the search intensity in subsequent iteration cycles. We have investigated four ideal template matching cases as well as four real cases using different search algorithms. Our simulation results show that the IF-ABC algorithm is more effective and robust for this template matching task than the conventional ABC and two state-of-the-art modified ABC algorithms.

  8. A Novel Artificial Bee Colony Algorithm Based on Internal-Feedback Strategy for Image Template Matching

    PubMed Central

    Gong, Li-Gang

    2014-01-01

    Image template matching refers to the technique of locating a given reference image within a source image such that they are most similar. It is a fundamental task in the field of visual target recognition. In general, there are two critical aspects of a template matching scheme: one is the similarity measure and the other is the search for the best-match location. In this work, we choose the well-known normalized cross correlation model as the similarity criterion. The search for the best-match location is carried out by an internal-feedback artificial bee colony (IF-ABC) algorithm. The IF-ABC algorithm is distinguished by its effort to combat premature convergence. This is achieved by discarding the conventional roulette selection procedure of the ABC algorithm, so that each employed bee has an equal chance of being followed by the onlooker bees in the local search phase. In addition, we suggest efficiently utilizing the internal convergence states as feedback guidance for the search intensity in subsequent iteration cycles. We have investigated four ideal template matching cases as well as four real cases using different search algorithms. Our simulation results show that the IF-ABC algorithm is more effective and robust for this template matching task than the conventional ABC and two state-of-the-art modified ABC algorithms. PMID:24892107

  9. A TSR Visual Servoing System Based on a Novel Dynamic Template Matching Method.

    PubMed

    Cai, Jia; Huang, Panfeng; Zhang, Bin; Wang, Dongke

    2015-12-21

    The so-called Tethered Space Robot (TSR) is a novel active space debris removal system. To solve its problem of non-cooperative target recognition during short-distance rendezvous, this paper presents a framework for a real-time visual servoing system using a non-calibrated monocular CMOS (Complementary Metal Oxide Semiconductor) camera. When a small template is matched against a large scene, mismatches are common, so a novel template matching algorithm is presented to solve this problem. First, the matching algorithm uses a hollow annulus structure based on the FAST (Features from Accelerated Segment Test) detector, which makes the method rotation-invariant; the hollow structure also decreases the accumulative deviation. The matching function combines grey-level and gradient differences between the template and the object image, which helps reduce the effects of illumination changes and noise. Then, a dynamic template update strategy is designed to avoid tracking failures caused by wrong matches or occlusion. Finally, the system incorporates a least-squares integrated predictor, realizing online tracking in complex circumstances. Ground experiments show that the proposed algorithm reduces the need for sophisticated computation and improves matching accuracy.

  10. Fast template matching based on grey prediction for real-time object tracking

    NASA Astrophysics Data System (ADS)

    Lv, Mingming; Hou, Yuanlong; Liu, Rongzhong; Hou, Runmin

    2017-02-01

    Template matching is a basic algorithm in image processing, and real-time operation is a crucial requirement of object tracking. For real-time tracking, a fast template matching algorithm based on grey prediction is presented, in which the computational cost is reduced dramatically by minimizing the search range. First, the location of the tracked object in the current image is estimated by a Grey Model (GM). GM(1,1), the basic model of grey prediction, uses a short history of known locations to predict the next one. Second, the precise position of the object in the frame is computed by template matching. Here, the Sequential Similarity Detection Algorithm (SSDA) with a self-adaptive threshold is employed to obtain the matching position in the neighborhood of the predicted location. The threshold in SSDA plays an important role, as a proper threshold makes template matching fast and accurate. Moreover, a practical weighted strategy is utilized to handle scale and rotation changes of the object, as well as illumination changes. The experimental results show the superior performance of the proposed algorithm over the conventional full-search method, especially in terms of execution time.
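    The GM(1,1) prediction step can be sketched using the standard grey-model equations, as below; this is a generic implementation for a short positive data series (e.g., recent object coordinates), not the authors' full tracker, and the example values are invented.

```python
import numpy as np

def gm11_predict(x0, steps=1):
    """GM(1,1) grey model: fit a short positive series and forecast `steps` values ahead."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                                # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])                     # background values
    B = np.column_stack((-z1, np.ones_like(z1)))
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]  # development/control coefficients
    k = np.arange(x0.size + steps)
    x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a
    x0_hat = np.concatenate(([x1_hat[0]], np.diff(x1_hat)))
    return x0_hat[-steps:]

# Forecast the next x-coordinate of a tracked object from its last few positions.
print(gm11_predict([10.0, 12.5, 15.2, 18.1, 21.3], steps=1))
```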

  11. Optimizing Multi-Station Template Matching to Identify and Characterize Induced Seismicity in Ohio

    NASA Astrophysics Data System (ADS)

    Brudzinski, M. R.; Skoumal, R.; Currie, B. S.

    2014-12-01

    As oil and gas well completions utilizing multi-stage hydraulic fracturing have become more commonplace, the potential for seismicity induced by the deep disposal of frac-related flowback waters and the hydraulic fracturing process itself has become increasingly important. While it is rare for these processes to induce felt seismicity, the recent increase in the number of deep injection wells and volumes injected has been suspected of contributing to a substantial increase of events ≥ M 3 in the continental U.S. over the past decade. Earthquake template matching using multi-station waveform cross-correlation is an adept tool for investigating potentially induced sequences due to its proficiency at identifying similar/repeating seismic events. We have sought to refine this approach by investigating a variety of seismic sequences and determining the optimal parameters (station combinations, template lengths and offsets, filter frequencies, data access method, etc.) for identifying induced seismicity. When applied to a sequence near a wastewater injection well in Youngstown, Ohio, our optimized template matching routine yielded 566 events while other template matching studies found ~100-200 events. We also identified 77 events on 4-12 March 2014 that are temporally and spatially correlated with active hydraulic fracturing in Poland Township, Ohio. We find similar improvement in characterizing sequences in Washington and Harrison Counties, which appear to be related to wastewater injection and hydraulic fracturing, respectively. In the Youngstown and Poland Township cases, focal mechanisms and double difference relocation using the cross-correlation matrix find left-lateral faults striking roughly east-west near the top of the basement. We have also used template matching to determine that isolated earthquakes near several other wastewater injection wells are unlikely to be induced, based on a lack of similar/repeating sequences. Optimized template matching utilizes

  12. Fast full-search equivalent template matching by enhanced bounded correlation.

    PubMed

    Mattoccia, S; Tombari, F; Di Stefano, L

    2008-04-01

    We propose a novel algorithm, referred to as enhanced bounded correlation (EBC), that significantly reduces the number of computations required to carry out template matching based on normalized cross correlation (NCC) and yields exactly the same result as the full search algorithm. The algorithm relies on the concept of bounding the matching function: finding an efficiently computable upper bound of the NCC rapidly prunes those candidates that cannot provide a better NCC score with respect to the current best match. In this framework, we apply a succession of increasingly tighter upper bounding functions based on Cauchy-Schwarz inequality. Moreover, by including an online parameter prediction step into EBC, we obtain a parameter free algorithm that, in most cases, affords computational advantages very similar to those attainable by optimal offline parameter tuning. Experimental results show that the proposed algorithm can significantly accelerate a full-search equivalent template matching process and outperforms state-of-the-art methods.
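    A simplified illustration of the bounding idea follows, applied here to plain (cosine-style) cross correlation rather than the zero-mean NCC used by EBC: after each partial block of the numerator, a Cauchy-Schwarz upper bound on the remaining contribution is added, and the candidate is dropped only if even that optimistic score cannot beat the current best, so the result equals the full search. The block split and all names are illustrative, not the paper's implementation.

```python
import numpy as np

def bounded_correlation_match(image, template, n_split=4):
    """Full-search-equivalent correlation matching with Cauchy-Schwarz pruning."""
    img, tpl = image.astype(float), template.astype(float)
    th, tw = tpl.shape
    bounds = np.linspace(0, th, n_split + 1, dtype=int)          # horizontal block edges
    blocks = [tpl[bounds[i]:bounds[i + 1]] for i in range(n_split)]
    tail_norm = [np.sqrt(sum((b ** 2).sum() for b in blocks[i + 1:])) for i in range(n_split)]
    t_norm = np.sqrt((tpl ** 2).sum())

    best, best_pos = -np.inf, None
    H, W = img.shape
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            win = img[y:y + th, x:x + tw]
            w_norm = np.sqrt((win ** 2).sum())
            if w_norm == 0:
                continue
            num, pruned = 0.0, False
            for i, block in enumerate(blocks):
                num += (block * win[bounds[i]:bounds[i + 1]]).sum()
                # Remaining numerator <= ||t_rest|| * ||w_rest|| <= ||t_rest|| * ||w||.
                upper = (num + tail_norm[i] * w_norm) / (t_norm * w_norm)
                if upper <= best:
                    pruned = True
                    break
            if not pruned:
                score = num / (t_norm * w_norm)
                if score > best:
                    best, best_pos = score, (y, x)
    return best_pos, best
```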

  13. Performance of peaky template matching under additive white Gaussian noise and uniform quantization

    NASA Astrophysics Data System (ADS)

    Horvath, Matthew S.; Rigling, Brian D.

    2015-05-01

    Peaky template matching (PTM) is a special case of a general algorithm known as multinomial pattern matching, originally developed for automatic target recognition of synthetic aperture radar data. The algorithm is a model-based approach that first quantizes pixel values into Nq = 2 discrete values, yielding generative Beta-Bernoulli models as class-conditional templates. Here, we consider the classification of target chips in AWGN and develop approximations to image-to-template classification performance as a function of the noise power. We focus specifically on the case of a "uniform quantization" scheme, where a fixed number of the largest pixels are quantized high, as opposed to using a fixed threshold. This quantization method reduces sensitivity to the scaling of pixel intensities, and quantization in general reduces sensitivity to various nuisance parameters that are difficult to account for a priori. Our performance expressions are verified using forward-looking infrared imagery from the Army Research Laboratory Comanche dataset.
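    The "uniform quantization" scheme described above amounts to setting a fixed number of the largest pixels high; a minimal sketch follows (the choice of k and the Beta-Bernoulli template modeling are not reproduced here).

```python
import numpy as np

def quantize_top_k(chip, k):
    """Set the k largest pixels of a target chip to 1 and all others to 0.

    Because only the rank order of the pixels matters, the result is
    insensitive to any positive rescaling of the intensities.
    """
    flat = chip.ravel()
    out = np.zeros(flat.size, dtype=np.uint8)
    out[np.argpartition(flat, -k)[-k:]] = 1
    return out.reshape(chip.shape)
```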

  14. Hash-Based Line-by-Line Template Matching for Lossless Screen Image Coding.

    PubMed

    Xiulian Peng; Jizheng Xu

    2016-12-01

    Template matching (TM) was proposed in the literature a decade ago to efficiently remove non-local redundancies within an image without transmitting any overhead of displacement vectors. However, the large computational complexity introduced at both the encoder and the decoder, especially for a large search range, limits its widespread use. This paper proposes a hash-based line-by-line template matching (hLTM) for lossless screen image coding, where the non-local redundancy commonly exists in text and graphics parts. By hash-based search, it can largely reduce the search complexity of template matching without an accuracy degradation. Besides, the line-by-line template matching increases prediction accuracy by using a fine granularity. Experimental results show that the hLTM can significantly reduce both the encoding and decoding complexities by 68 and 23 times, respectively, compared with the traditional TM with a search radius of 128. Moreover, when compared with High Efficiency Video Coding screen content coding test model SCM-1.0, it can largely improve coding efficiency by up to 12.68% bits saving on screen contents with rich texts/graphics.
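    The hash-indexing idea can be sketched as below: already-coded rows are indexed by a hash of short pixel runs so that matching candidates are found by table lookup rather than exhaustive search over the whole window; the run length and hashing choice here are assumptions for illustration, not the codec's actual design.

```python
import numpy as np
from collections import defaultdict

def build_line_hash_table(coded_rows, run=8):
    """Index every horizontal run of `run` pixels in already-coded rows by a hash key."""
    table = defaultdict(list)
    for y in range(coded_rows.shape[0]):
        row = coded_rows[y]
        for x in range(row.size - run + 1):
            table[hash(row[x:x + run].tobytes())].append((y, x))
    return table

def candidate_matches(table, current_row, x, run=8):
    """Return previously coded positions whose hashed run matches the current one."""
    return table.get(hash(current_row[x:x + run].tobytes()), [])
```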

  15. Sparse representation based on local time-frequency template matching for bearing transient fault feature extraction

    NASA Astrophysics Data System (ADS)

    He, Qingbo; Ding, Xiaoxi

    2016-05-01

    The transients caused by the localized fault are important measurement information for bearing fault diagnosis. Thus it is crucial to extract the transients from the bearing vibration or acoustic signals that are always corrupted by a large amount of background noise. In this paper, an iterative transient feature extraction approach is proposed based on time-frequency (TF) domain sparse representation. The approach is realized by presenting a new method, called local TF template matching. In this method, the TF atoms are constructed based on the TF distribution (TFD) of the Morlet wavelet bases and local TF templates are formulated from the TF atoms for the matching process. The instantaneous frequency (IF) ridge calculated from the TFD of an analyzed signal provides the frequency parameter values for the TF atoms as well as an effective template matching path on the TF plane. In each iteration, local TF templates are employed to do correlation with the TFD of the analyzed signal along the IF ridge tube for identifying the optimum parameters of transient wavelet model. With this iterative procedure, transients can be extracted in the TF domain from measured signals one by one. The final signal can be synthesized by combining the extracted TF atoms and the phase of the raw signal. The local TF template matching builds an effective TF matching-based sparse representation approach with the merit of satisfying the native pulse waveform structure of transients. The effectiveness of the proposed method is verified by practical defective bearing signals. Comparison results also show that the proposed method is superior to traditional methods in transient feature extraction.

  16. Component extraction on CT volumes of assembled products using geometric template matching

    NASA Astrophysics Data System (ADS)

    Muramatsu, Katsutoshi; Ohtake, Yutaka; Suzuki, Hiromasa; Nagai, Yukie

    2017-03-01

    As a method of non-destructive internal inspection, X-ray computed tomography (CT) is used not only in medical applications but also for product inspection. Some assembled products can be divided into separate components based on density, which is known to be approximately proportional to CT values. However, components whose densities are similar cannot be distinguished by this CT-value-driven approach. In this study, we propose a new algorithm for extracting a component from the CT volume that uses the component's surface mesh as the template, matching it against the set of voxels with the corresponding CT values, rather than relying on density alone. The method has two main stages: rough matching and fine matching. At the rough matching stage, the positions of candidate targets are identified roughly in the CT volume using the template of the target component. At the fine matching stage, these candidates are precisely matched with the templates, allowing the correct positions of the components to be detected in the CT volume. The results of two computational experiments show that the proposed algorithm is able to extract components of similar density within assembled products from CT volumes.

  17. Vehicle extraction from high-resolution satellite image using template matching

    NASA Astrophysics Data System (ADS)

    Natt, Dehchaiwong; Cao, Xiaoguang

    2015-12-01

    Examining vehicles in satellite images is a complicated and cumbersome process. Although high-definition satellite images are now available, a vehicle appears as only a small point that is difficult to separate from the background, because the image detail is insufficient to identify small objects. In this research, vehicle detection techniques were applied to image data from Pléiades, a satellite with a high resolution of 0.40 m. The objective of this research is to develop a tool for extracting vehicle data from satellite images and organizing the extracted data as geospatial information. The approach is based on template matching, implemented in Matlab using the sum of absolute differences (SAD) combined with a neural network technique, to evaluate the match between car template images and candidate vehicle images taken from the satellite imagery. Comparison with the template data shows that the data extraction accuracy is greater than 90%, and the extracted data can be imported into a geospatial information database. Moreover, the data can be displayed in geospatial information software and searched by quantity and satellite image position.
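    The sum of absolute differences scoring used above is simple to sketch; the neural-network verification stage and the geospatial export are omitted, and the function name is illustrative.

```python
import numpy as np

def sad_match(image, template):
    """Return the offset minimizing the sum of absolute differences (SAD)."""
    th, tw = template.shape
    tpl = template.astype(float)
    H, W = image.shape
    best, best_pos = np.inf, None
    for y in range(H - th + 1):
        for x in range(W - tw + 1):
            sad = np.abs(image[y:y + th, x:x + tw].astype(float) - tpl).sum()
            if sad < best:
                best, best_pos = sad, (y, x)
    return best_pos, best
```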

  18. Stable and Transportable Seismic Yield Estimation from Full Envelope Template Matching

    NASA Astrophysics Data System (ADS)

    Yoo, S. H.; Mayeda, K. M.

    2015-12-01

    We have developed a transportable, empirically based, multi-frequency local template approach to estimate surface explosion yield from seismic stations at distances of ~5 to 50 km, with yield estimates within 50% of the design yield at 95% confidence using as few as 4 stations for events ranging between ~50 kg and 50 tons. Unlike the regional coda methodology developed by Mayeda et al. (2003) and the more recent synthetic template approach of Pasyanos et al. (2012), near-local coda is short in duration and not well modeled by the simple Aki-style (e.g., Aki 1969) exponential single-scattering formulation in a full-space. With only short durations and high-frequency waveforms at our disposal, our empirical full envelope template matching approach provides a nearly exact match to these complicated waveforms in the near field. Moreover, because these templates capture the nuances of the initial P wave, the S wave, and their scattered counterparts, our method is very robust: it provides roughly a factor of 3 less yield measurement scatter than peak P-wave and integrated first P-wave estimates (e.g., Ford et al., 2014) and also utilizes a much larger amplitude signal. Time permitting, we will also consider near-regional recordings of surface explosions, including Sayarim in Israel.

  19. How many templates for GW chirp detection? The minimal-match issue revisited

    NASA Astrophysics Data System (ADS)

    Croce, R. P.; Demma, Th; Longo, M.; Marano, S.; Matta, V.; Pierro, V.; Pinto, I. M.

    2004-11-01

    Refined estimates for the template density (or minimal match) yielding the best trade-off between a detector's performance (in terms of false dismissals) and computational burden are obtained in the context of the maximum likelihood of detection of gravitational wave chirps from coalescing binaries with unknown parameters, using a recently derived accurate representation of the no-signal cumulative distribution of the detection statistic.

  20. Distinguishing induced seismicity from natural seismicity in Ohio: Demonstrating the utility of waveform template matching

    NASA Astrophysics Data System (ADS)

    Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.

    2015-09-01

    This study investigated the utility of multistation waveform cross correlation to help discern induced seismicity. Template matching was applied to all Ohio earthquakes cataloged since the arrival of nearby EarthScope TA stations in late 2010. Earthquakes that were within 5 km of fluid injection activities in regions that lacked previously documented seismicity were found to be swarmy. Moreover, the larger number of events produced by template matching for these swarmy sequences made it easier to establish more detailed temporal and spatial relationships between the seismicity and fluid injection activities, which is typically required for an earthquake to be considered induced. Study results detected three previously documented induced sequences (Youngstown, Poland Township, and Harrison County) and provided evidence that suggests two additional cases of induced seismicity (Belmont/Guernsey County and Washington County). Evidence for these cases suggested that unusual swarm-like behaviors in regions that lack previously documented seismicity can be used to help distinguish induced seismicity, complementing the traditional identification of an anthropogenic source spatially and temporally correlated with the seismicity. In support of this finding, we identified 17 additional cataloged earthquakes in regions of previously documented seismicity and away from disposal wells or hydraulic fracturing that returned very few template matches. The lack of swarminess helps to indicate that these events are most likely naturally occurring.

  1. Matched filtering of gravitational waves from inspiraling compact binaries: Computational cost and template placement

    NASA Astrophysics Data System (ADS)

    Owen, Benjamin J.; Sathyaprakash, B. S.

    1999-07-01

    We estimate the number of templates, computational power, and storage required for a one-step matched filtering search for gravitational waves from inspiraling compact binaries. Our estimates for the one-step search strategy should serve as benchmarks for the evaluation of more sophisticated strategies such as hierarchical searches. We use a discrete family of two-parameter wave form templates based on the second post-Newtonian approximation for binaries composed of nonspinning compact bodies in circular orbits. We present estimates for all of the large- and mid-scale interferometers now under construction: LIGO (three configurations), VIRGO, GEO600, and TAMA. To search for binaries with components more massive than m_min = 0.2 M_solar while losing no more than 10% of events due to coarseness of template spacing, the initial LIGO interferometers will require about 1.0×10^11 flops (floating point operations per second) for data analysis to keep up with data acquisition. This is several times higher than estimated in previous work by Owen, in part because of the improved family of templates and in part because we use more realistic (higher) sampling rates. Enhanced LIGO, GEO600, and TAMA will require computational power similar to initial LIGO. Advanced LIGO will require 7.8×10^11 flops, and VIRGO will require 4.8×10^12 flops to take full advantage of its broad target noise spectrum. If the templates are stored rather than generated as needed, storage requirements range from 1.5×10^11 real numbers for TAMA to 6.2×10^14 for VIRGO. The computational power required scales roughly as m_min^(-8/3) and the storage as m_min^(-13/3). Since these scalings are perturbed by the curvature of the parameter space at second post-Newtonian order, we also provide estimates for a search with m_min = 1 M_solar. Finally, we sketch and discuss an algorithm for placing the templates in the parameter space.

  2. [Using neural networks based template matching method to obtain redshifts of normal galaxies].

    PubMed

    Xu, Xin; Luo, A-li; Wu, Fu-chao; Zhao, Yong-heng

    2005-06-01

    Galaxies can be divided into two classes: normal galaxies (NG) and active galaxies (AG). In order to determine NG redshifts, an automatic and effective method is proposed in this paper, which consists of the following three main steps: (1) From the normal galaxy template, two sets of samples are simulated, one with redshifts of 0.0-0.3 and the other of 0.3-0.5; PCA is then used to extract the main components, and the training samples are projected onto the main-component subspace to obtain characteristic spectra. (2) The characteristic spectra are used to train a Probabilistic Neural Network to obtain a Bayes classifier. (3) An unknown real NG spectrum is first input to this Bayes classifier to determine the possible range of redshift, and template matching is then invoked to locate the redshift value within the estimated range. Compared with the traditional template matching technique over an unconstrained range, our proposed method not only halves the computational load but also increases the estimation accuracy. As a result, the proposed method is particularly useful for the automatic processing of spectra produced by large-scale sky survey projects.
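
    A minimal sketch of the restricted-range matching step (the PCA/PNN classifier that supplies the redshift range is omitted): the template is redshifted over a grid confined to the classifier's output interval and the correlation with the observed spectrum is maximized. Function and variable names are illustrative, and both spectra are assumed to be continuum-normalized and sampled on increasing wavelength grids.

    ```python
    import numpy as np

    def estimate_redshift(obs_wave, obs_flux, tpl_wave, tpl_flux, z_range, dz=1e-4):
        """Scan a redshift grid restricted to z_range (from the classifier) and
        return the z maximizing the correlation between the observed spectrum
        and the redshifted template."""
        z_grid = np.arange(z_range[0], z_range[1], dz)
        best_z, best_score = None, -np.inf
        for z in z_grid:
            # shift the template wavelengths by (1 + z) and resample onto obs_wave
            shifted = np.interp(obs_wave, tpl_wave * (1.0 + z), tpl_flux,
                                left=np.nan, right=np.nan)
            ok = ~np.isnan(shifted)
            if ok.sum() < 10 or shifted[ok].std() == 0:
                continue
            score = np.corrcoef(obs_flux[ok], shifted[ok])[0, 1]
            if score > best_score:
                best_z, best_score = z, score
        return best_z, best_score
    ```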

  3. Optimizing multi-station earthquake template matching through re-examination of the Youngstown, Ohio, sequence

    NASA Astrophysics Data System (ADS)

    Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.; Levy, Jonathan

    2014-11-01

    A series of earthquakes in 2011 near Youngstown, OH, has been a focal point for discussions of seismicity induced by a nearby wastewater disposal well. Utilizing an efficient waveform template matching procedure, the optimal correlation template to study the Youngstown sequence was identified by varying parameters such as the stations utilized, frequency passband, and seismogram length. A catalog composed of 566 events was identified between January 2011 and February 2014. Double-difference relocation refines seismicity to a ˜800 m linear streak from the Northstar 1 injection well to the WSW along the same strike as the fault plane of the largest event. Calculated Gutenberg-Richter b-values are consistent with trends observed in other regions with seismicity induced by fluid injection.
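
    The Gutenberg-Richter b-values mentioned here are conventionally obtained with the Aki (1965) maximum-likelihood estimator; a minimal sketch follows, assuming a completeness magnitude mc and a bin width dm, neither of which is specified in the abstract.

    ```python
    import numpy as np

    def b_value_max_likelihood(magnitudes, mc, dm=0.1):
        """Aki (1965) maximum-likelihood b-value with the standard bin-width
        correction; mc is the completeness magnitude, dm the magnitude bin width."""
        m = np.asarray(magnitudes, dtype=float)
        m = m[m >= mc]
        b = np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
        b_err = b / np.sqrt(len(m))  # crude 1-sigma uncertainty (Aki 1965)
        return b, b_err
    ```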

  4. Dendritic cell recognition using template matching based on one-dimensional (1D) Fourier descriptors (FD)

    NASA Astrophysics Data System (ADS)

    Muhd Suberi, Anis Azwani; Wan Zakaria, Wan Nurshazwani; Tomari, Razali; Lau, Mei Xia

    2016-07-01

    Identification of dendritic cells (DCs), particularly in the cancer microenvironment, is of considerable interest, since harnessing the immune system to fight tumors is a novel treatment approach under investigation. Currently, the staining procedure used to sort DCs can affect their viability. In this paper, a computer-aided system is proposed for automatic classification of DCs in peripheral blood mononuclear cell (PBMC) images. Initially, the images undergo several preprocessing steps to remove uneven illumination and artifacts around the cells. In the segmentation stage, morphological operators and Canny edge detection are implemented to isolate the cell shapes and extract their contours. Following that, information from the contours is extracted as Fourier descriptors derived from one-dimensional (1D) shape signatures. Eventually, cells are classified as DCs by template matching (TM) between established templates and target images. The results show that the proposed scheme is reliable and effective in recognizing DCs.
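
    A minimal sketch of the 1D-signature-to-Fourier-descriptor pipeline described above, assuming a closed contour given as (x, y) points. The centroid-distance signature, the uniform resampling, and the number of descriptors retained are illustrative choices, not necessarily those of the paper.

    ```python
    import numpy as np

    def fourier_descriptors(contour_xy, n_desc=16, n_samples=128):
        """Centroid-distance shape signature -> FFT -> scale-normalized
        Fourier descriptors (magnitudes divided by the first harmonic)."""
        pts = np.asarray(contour_xy, dtype=float)
        # resample the contour to a fixed number of points so signatures are comparable
        idx = np.linspace(0, len(pts) - 1, n_samples)
        x = np.interp(idx, np.arange(len(pts)), pts[:, 0])
        y = np.interp(idx, np.arange(len(pts)), pts[:, 1])
        r = np.hypot(x - x.mean(), y - y.mean())        # 1D shape signature
        spec = np.abs(np.fft.fft(r))
        return spec[1:n_desc + 1] / (spec[1] + 1e-12)   # drop DC, normalize

    def match_score(fd_template, fd_target):
        """Smaller Euclidean distance = better template match."""
        return float(np.linalg.norm(fd_template - fd_target))
    ```

    Keeping only normalized magnitudes (and discarding phase) makes the descriptors insensitive to scale, rotation, and starting point, which is what allows a plain Euclidean distance to serve as the template-matching score.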

  5. Perceptual judgements and chronic imaging of altered odour maps indicate comprehensive stimulus template matching in olfaction

    PubMed Central

    Bracey, Edward F.; Pichler, Bruno; Schaefer, Andreas T.; Wallace, Damian J.; Margrie, Troy W.

    2013-01-01

    Lesion experiments suggest that odour input to the olfactory bulb contains significant redundant signal such that rodents can discern odours using minimal stimulus-related information. Here we investigate the dependence of odour-quality perception on the integrity of glomerular activity by comparing odour-evoked activity maps before and after epithelial lesions. Lesions prevent mice from recognizing previously experienced odours and differentially delay discrimination learning of unrecognized and novel odour pairs. Poor recognition results not from mice experiencing an altered concentration of an odour but from perception of apparent novel qualities. Consistent with this, relative intensity of glomerular activity following lesions is altered compared with maps recorded in shams and by varying odour concentration. Together, these data show that odour recognition relies on comprehensively matching input patterns to a previously generated stimulus template. When encountering novel odours, access to all glomerular activity ensures rapid generation of new templates to perform accurate perceptual judgements. PMID:23820818

  6. Object Detection Based on Template Matching through Use of Best-So-Far ABC

    PubMed Central

    2014-01-01

    Best-so-far ABC is a modified version of the artificial bee colony (ABC) algorithm used for optimization tasks. This algorithm is one of the swarm intelligence (SI) algorithms proposed in recent literature, in which the results demonstrated that the best-so-far ABC can produce higher quality solutions with faster convergence than either the ordinary ABC or the current state-of-the-art ABC-based algorithm. In this work, we aim to apply the best-so-far ABC-based approach for object detection based on template matching by using the difference between the RGB level histograms corresponding to the target object and the template object as the objective function. Results confirm that the proposed method was successful in both detecting objects and optimizing the time used to reach the solution. PMID:24812556

  7. Template-matching based detection of hyperbolas in ground-penetrating radargrams for buried utilities

    NASA Astrophysics Data System (ADS)

    Sagnard, Florence; Tarel, Jean-Philippe

    2016-08-01

    Ground-penetrating radar (GPR) is a mature geophysical technique that is used to map utility pipelines buried within 1.5 m of the ground surface in the urban landscape. In this work, a template-matching algorithm is applied in an original way to the detection and localization of pipe signatures in two perpendicular antenna polarizations. The processing of a GPR radargram is based on four main steps. The first step consists of defining a template, usually from finite-difference time-domain simulations, made of the area near the hyperbola apex associated with a mean-size object to be detected in a soil whose mean permittivity has been estimated experimentally beforehand. In the second step, the raw radargram is pre-processed to correct variations due to antenna coupling, and the template matching algorithm is then used to detect and localize individual hyperbola signatures in an environment containing unwanted reflections, noise, and overlapping signatures. The distance between the shifted template and a local zone in the radargram, based on the L1 norm, yields a map of distances; a user-defined threshold then selects a reduced number of zones having a high similarity measure. In the third step, minimum or maximum discrete amplitudes belonging to a selected hyperbola curve are semi-automatically extracted in each zone. In the fourth step, the discrete hyperbola data (i, j) are fitted by a parametric hyperbola model using a non-linear least-squares criterion. The algorithm was implemented and evaluated on numerical radargrams, and afterwards on experimental radargrams.
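
    A minimal sketch of the L1 distance-map step, assuming the radargram and the apex template are pre-processed 2D float arrays; the brute-force loop and the fixed threshold are illustrative simplifications.

    ```python
    import numpy as np

    def l1_distance_map(radargram, template):
        """Slide the template over the radargram and record the mean L1
        distance at every position (low values = candidate hyperbola apexes)."""
        H, W = radargram.shape
        h, w = template.shape
        dist = np.full((H - h + 1, W - w + 1), np.inf)
        for i in range(dist.shape[0]):
            for j in range(dist.shape[1]):
                patch = radargram[i:i + h, j:j + w]
                dist[i, j] = np.mean(np.abs(patch - template))
        return dist

    def candidate_zones(dist_map, threshold):
        """A user-defined threshold keeps only zones of high similarity (low distance)."""
        return np.argwhere(dist_map < threshold)
    ```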

  8. Hybrid segmentation of mass in mammograms using template matching and dynamic programming.

    PubMed

    Song, Enmin; Xu, Shengzhou; Xu, Xiangyang; Zeng, Jianye; Lan, Yihua; Zhang, Shenyi; Hung, Chih-Cheng

    2010-11-01

    Accurate image segmentation for breast lesions is a critical step in computer-aided diagnosis systems. The objective of this study was to develop a robust method for the automatic segmentation of breast masses on mammograms to extract feasible features for computer-aided diagnosis systems. The data set used in this study consisted of 483 regions of interest extracted from 328 patients. A hybrid method for segmenting breast masses was proposed on the basis of the template-matching and dynamic programming techniques. First, a template-matching technique was used to locate and obtain the rough region of masses. Then, on the basis of this rough region, a local cost function for dynamic programming was defined. Finally, the optimal contour was derived by applying dynamic programming as an optimization technique. The performance of this proposed segmentation method was evaluated using area-based and boundary distance-based similarity measures based on radiologists' manually marked annotations. A comparison with three different segmentation algorithms on the data set was provided. The mean overlap percentage for our proposed hybrid method was 0.727 ± 0.127, whereas those for Timp and Karssemeijer's dynamic programming method, Song et al's plane-fitting and dynamic programming method, and the normalized cut segmentation method were 0.657 ± 0.216, 0.636 ± 0.190, and 0.562 ± 0.199, respectively. All P values for the measure distribution of our proposed method and the other three algorithms were <.001. A hybrid method based on the template-matching and dynamic programming techniques was proposed to segment breast masses on mammograms. Evaluation results indicate that the proposed segmentation method can improve the accuracy of mass segmentation compared to three other algorithms. The proposed segmentation method shows better performance and has great potential in improving the accuracy of computer-aided diagnosis systems in interpreting mammograms. Copyright © 2010 AUR

  9. SU-E-J-108: Template Matching Based On Multiple Templates Can Improve the Tumor Tracking Performance When There Is Large Tumor Deformation

    SciTech Connect

    Shi, X; Lin, J; Diwanji, T; Mooney, K; D'Souza, W; Mistry, N

    2014-06-01

    Purpose: Recently, template matching has been shown to be able to track tumor motion on cine-MRI images. However, artifacts such as deformation, rotation, and/or out-of-plane movement can seriously degrade the performance of this technique. In this work, we demonstrate the utility of multiple templates derived from different phases of tumor motion in reducing the negative effects of artifacts and improving the accuracy of template matching methods. Methods: Data from 2 patients with large tumors and significant tumor deformation were analyzed from a group of 12 patients from an earlier study. Cine-MRI (200 frames) imaging was performed while the patients were instructed to breathe normally. Ground truth tumor position was established on each frame manually by a radiation oncologist. Tumor positions were also automatically determined using template matching with either a single template or multiple (5) templates. The tracking errors, defined as the absolute differences in tumor positions determined by the manual and automated methods, were compared between the single- and multiple-template approaches in the AP and SI directions, respectively. Results: Using multiple templates reduced the tracking error of template matching. In the SI direction, where the tumor movement and deformation were significant, the mean tracking error decreased from 1.94 mm to 0.91 mm (Patient 1) and from 6.61 mm to 2.06 mm (Patient 2). In the AP direction, where the tumor movement was small, the reduction of the mean tracking error was significant in Patient 1 (from 3.36 mm to 1.04 mm) but not in Patient 2 (from 3.86 mm to 3.80 mm). Conclusion: This study shows the effectiveness of using multiple templates in improving the performance of template matching when artifacts such as large tumor deformation or out-of-plane motion exist. Accurate tumor tracking capabilities can be integrated with MRI-guided radiation therapy systems. This work was supported in part by grants from NIH/NCI CA 124766 and Varian
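
    A minimal sketch of the multiple-template idea: score every breathing-phase template against the current frame with normalized cross-correlation and keep the best match, so that frames with deformation or out-of-plane motion fall back to a better-fitting template. The brute-force NCC and the function names are illustrative, not the study's implementation.

    ```python
    import numpy as np

    def ncc2d(image, template):
        """Normalized cross-correlation of one template over a 2D search region."""
        h, w = template.shape
        t = (template - template.mean()) / (template.std() + 1e-12)
        out = np.full((image.shape[0] - h + 1, image.shape[1] - w + 1), -np.inf)
        for i in range(out.shape[0]):
            for j in range(out.shape[1]):
                win = image[i:i + h, j:j + w]
                s = win.std()
                if s > 0:
                    out[i, j] = np.mean(t * (win - win.mean()) / s)
        return out

    def track_with_templates(frame, templates):
        """Score every template and keep the best match across the template bank."""
        best = max((ncc2d(frame, t) for t in templates), key=np.max)
        return np.unravel_index(np.argmax(best), best.shape), float(np.max(best))
    ```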

  10. Lattice and strain analysis of atomic resolution Z-contrast images based on template matching.

    PubMed

    Zuo, Jian-Min; Shah, Amish B; Kim, Honggyu; Meng, Yifei; Gao, Wenpei; Rouviére, Jean-Luc

    2014-01-01

    A real-space approach is developed based on template matching for quantitative lattice analysis using atomic resolution Z-contrast images. The method, called TeMA, uses the template of an atomic column, or a group of atomic columns, to transform the image into a lattice of correlation peaks. This is helped by using a local intensity-adjusted correlation and by the design of the templates. Lattice analysis is performed on the correlation peaks. A reference lattice is used to correct for scan noise and scan distortions in the recorded images. Using these methods, we demonstrate that a precision of a few picometers is achievable in lattice measurement using aberration-corrected Z-contrast images. As an application, we apply the methods to strain analysis of a molecular beam epitaxy (MBE) grown LaMnO₃ and SrMnO₃ superlattice. The results show alternating epitaxial strain inside the superlattice and its variations across interfaces at the spatial resolution of a single perovskite unit cell. Our methods are general, model-free, and provide high spatial resolution for lattice analysis.

  11. Appearance and incomplete label matching for diffeomorphic template based hippocampus segmentation.

    PubMed

    Pluta, John; Avants, Brian B; Glynn, Simon; Awate, Suyash; Gee, James C; Detre, John A

    2009-06-01

    We present a robust, high-throughput, semiautomated template-based protocol for segmenting the hippocampus in temporal lobe epilepsy. The semiautomated component of this approach, which minimizes user effort while maximizing the benefit of human input to the algorithm, relies on "incomplete labeling." Incomplete labeling requires the user to quickly and approximately segment a few key regions of the hippocampus through a user-interface. Subsequently, this partial labeling of the hippocampus is combined with image similarity terms to guide volumetric diffeomorphic normalization between an individual brain and an unbiased disease-specific template, with fully labeled hippocampi. We solve this many-to-few and few-to-many matching problem, and gain robustness to inter and intrarater variability and small errors in user labeling, by embedding the template-based normalization within a probabilistic framework that examines both label geometry and appearance data at each label. We evaluate the reliability of this framework with respect to manual labeling and show that it increases minimum performance levels relative to fully automated approaches and provides high inter-rater reliability. Thus, this approach does not require expert neuroanatomical training and is viable for high-throughput studies of both the normal and the highly atrophic hippocampus. 2009 Wiley-Liss, Inc.

  12. Continuous detection of cerebral vasodilatation and vasoconstriction using intracranial pulse morphological template matching.

    PubMed

    Asgari, Shadnaz; Gonzalez, Nestor; Subudhi, Andrew W; Hamilton, Robert; Vespa, Paul; Bergsneider, Marvin; Roach, Robert C; Hu, Xiao

    2012-01-01

    Although accurate and continuous assessment of cerebral vasculature status is highly desirable for managing cerebral vascular diseases, no such method exists in current clinical practice. The present work introduces a novel method for real-time detection of cerebral vasodilatation and vasoconstriction using pulse morphological template matching. Templates consisting of morphological metrics of the cerebral blood flow velocity (CBFV) pulse, measured at the middle cerebral artery using transcranial Doppler, are obtained by applying a morphological clustering and analysis of intracranial pulse algorithm to data collected during induced vasodilatation and vasoconstriction in a controlled setting. These templates are then employed to define a vasodilatation index (VDI) and a vasoconstriction index (VCI) for any query data segment as the percentage of the metrics demonstrating a trend consistent with those obtained from the training dataset. Validation of the proposed method on a dataset of CBFV signals from 27 healthy subjects, collected with a protocol similar to that of the training dataset, during hyperventilation (and CO₂ rebreathing tests) shows a sensitivity of 92% (and 82%) for the detection of vasodilatation (and vasoconstriction) and a specificity of 90% (and 92%), respectively. Moreover, the proposed method of detecting vasodilatation (vasoconstriction) is capable of rejecting all the cases associated with vasoconstriction (vasodilatation) and outperforms two other conventional techniques by at least 7% for vasodilatation and 19% for vasoconstriction.
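
    A minimal sketch of the index definition as stated (the percentage of pulse-morphology metrics whose change is consistent with the template trend); extraction of the morphological metrics themselves is not reproduced, and the +1/-1 trend encoding is an illustrative assumption.

    ```python
    import numpy as np

    def vasodilatation_index(segment_metrics, baseline_metrics, template_trend):
        """VDI = percentage of morphological metrics whose change from baseline
        has the same sign as the trend observed during induced vasodilatation.
        template_trend holds +1/-1 per metric, learned from the training data."""
        delta = np.sign(np.asarray(segment_metrics) - np.asarray(baseline_metrics))
        consistent = delta == np.asarray(template_trend)
        return 100.0 * consistent.mean()

    # The VCI would be computed the same way against the vasoconstriction template.
    ```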

  14. Contour detection of atherosclerotic plaques in IVUS images using ellipse template matching and particle swarm optimization.

    PubMed

    Zhang, Qi; Wang, Yuanyuan; Ma, Jianying; Shi, Jun

    2011-01-01

    It is valuable for diagnosis of atherosclerosis to detect lumen and media-adventitia contours in intravascular ultrasound (IVUS) images of atherosclerotic plaques. In this paper, a method for contour detection of plaques is proposed utilizing the prior knowledge of elliptic geometry of plaques. Contours are initialized as ellipses by using ellipse template matching, where a matching function is maximized by particle swarm optimization. Then the contours are refined by boundary vector field snakes. The method was evaluated via 88 in vivo images from 21 patients. It outperformed a state-of-the-art method by 3.8 pixels and 4.8% in terms of the mean distance error and relative mean distance error, respectively.

  15. Semi-automatic template matching based extraction of hyperbolic signatures in ground-penetrating radar images

    NASA Astrophysics Data System (ADS)

    Sagnard, Florence; Tarel, Jean-Philippe

    2015-04-01

    In civil engineering applications, ground-penetrating radar (GPR) is one of the main non-destructive techniques, based on the refraction and reflection of electromagnetic waves, used to probe the underground and in particular to detect damage (cracks, delaminations, texture changes…) and buried objects (utilities, rebars…). A UWB ground-coupled radar operating in the frequency band [0.46;4] GHz and made of bowtie slot antennas has been used because, compared to an air-launched radar, it increases the energy transfer of electromagnetic radiation into the sub-surface and the penetration depth. This paper proposes an original adaptation of the generic template matching algorithm to GPR images to recognize, localize, and parametrically characterize a specific pattern associated with a hyperbola signature in the two main polarizations. The processing of a radargram (B-scan) is based on four main steps. The first step consists of pre-processing and scaling. The second step uses template matching to isolate and localize individual hyperbola signatures in an environment containing unwanted reflections, noise, and overlapping signatures. The algorithm requires generating and collecting a set of reference hyperbola templates made of a small reflection pattern in the vicinity of the apex in order to further analyze multiple time signals of embedded targets in an image. The standard Euclidean distance between the shifted template and a local zone in the radargram yields a map of distances. A user-defined threshold then selects a reduced number of zones having a high similarity measure. In a third step, each zone is analyzed to detect minimum or maximum discrete amplitudes belonging to the first arrival times of a hyperbola signature. In the fourth step, the extracted discrete data (i,j) are fitted by a parametric hyperbola model based on the straight ray path hypothesis and using a constrained least-squares criterion associated with parameter ranges, that are the position, the
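
    A minimal sketch of the final fitting step, assuming the straight-ray two-way travel-time model t(x) = (2/v)·sqrt((x - x0)² + d²) and picked apex points (x_i, t_i); a coarse grid search over bounded parameter ranges stands in for the constrained least-squares solver mentioned in the abstract.

    ```python
    import numpy as np

    def fit_hyperbola(x, t, v_range, x0_range, d_range, n=40):
        """Grid-search least-squares fit of the straight-ray hyperbola model
        t(x) = (2 / v) * sqrt((x - x0)**2 + d**2) to picked (x, t) points.
        The parameter ranges play the role of the constraints in the paper."""
        best, best_err = None, np.inf
        for v in np.linspace(*v_range, n):
            for x0 in np.linspace(*x0_range, n):
                for d in np.linspace(*d_range, n):
                    pred = (2.0 / v) * np.hypot(x - x0, d)
                    err = np.sum((pred - t) ** 2)
                    if err < best_err:
                        best, best_err = (v, x0, d), err
        return best, best_err
    ```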

  16. Classification of electron sub-tomograms with neural networks and its application to template-matching.

    PubMed

    Yu, Zhou; Frangakis, Achilleas S

    2011-06-01

    Classification of electron sub-tomograms is a challenging task, due to the missing wedge and the low signal-to-noise ratio of the data. Classification algorithms tend to classify data according to their orientation relative to the missing wedge, rather than to the underlying signal. Here we use a neural network approach, called the Kernel Density Estimator Self-Organizing Map (KerDenSOM3D), which we have implemented in three dimensions (3D) with compensation for the missing wedge, and we comprehensively compare it to other classification methods. For this purpose, we use various simulated macromolecules, as well as tomographically reconstructed in vitro GroEL and GroEL/GroES molecules. We show that the performance of this classification method is superior to previously used algorithms. Furthermore, we show how this algorithm can be used to provide an initial cross-validation of template-matching approaches. For the example of sub-tomograms extracted from cellular tomograms of Mycoplasma pneumoniae and Spiroplasma melliferum cells, we show the bias of template matching, and, by using differing search and classification areas, we demonstrate how this bias can be significantly reduced.

  17. Classification of normal and arrhythmic ECG using wavelet transform based template-matching technique.

    PubMed

    Hassan, Wajahat; Saleem, Saqib; Habib, Aamir

    2017-06-01

    To propose a wavelet-based template matching technique to extract features for automatic classification of electrocardiogram signals of normal and arrhythmic individuals. The study was conducted from December 2014 to December 2015 at the Department of Electrical Engineering, Institute of Space Technology, Islamabad, Pakistan. Electrocardiogram signals analysed in this study were taken from the freely available database www.physionet.org. The data for normal subjects was taken from the Massachusetts Institute of Technology-Beth Israel Hospital's normal sinus rhythm database and data for diseased subjects was taken from the arrhythmia database. Of the 30 subjects, there were 15(50%) normal and 15(50%) diseased subjects. The group-averaged phase difference indices of arrhythmic subjects were significantly larger than that of normal individuals (p<0.05) within the frequency range of 0.9-1.1 Hz. Moreover, the scatter plot between the phase difference index and magnitude of wavelet cross-spectrum for frequency range of 0.9-1.1 Hz demonstrated a satisfactory delineation between normal and arrhythmic individuals. Wavelet decomposition-based template matching technique achieved satisfactory delineation of normal and arrhythmic electrocardiogram dynamics.

  18. Comparison of template-matching and singular-spectrum-analysis methods for imaging implanted brachytherapy seeds.

    PubMed

    Alam, S Kaisar; Mamou, Jonathan; Feleppa, Ernest J; Kalisz, Andrew; Ramachandran, Sarayu

    2011-11-01

    Brachytherapy using small implanted radioactive seeds is becoming an increasingly popular method for treating prostate cancer, in which a radiation oncologist implants seeds in the prostate transperineally under ultrasound guidance. Dosimetry software determines the optimal placement of seeds for achieving the prescribed dose based on ultrasonic determination of the gland boundaries. However, because of prostate movement and distortion during the implantation procedure, some seeds may not be placed in the desired locations; this causes the delivered dose to differ from the prescribed dose. Current ultrasonic imaging methods generally cannot depict the implanted seeds accurately. We are investigating new ultrasonic imaging methods that show promise for enhancing the visibility of seeds and thereby enabling real-time detection and correction of seed-placement errors during the implantation procedure. Real-time correction of seed-placement errors will improve the therapeutic radiation dose delivered to target tissues. In this work, we compare the potential performance of a template-matching method and a previously published method based on singular spectrum analysis for imaging seeds. In particular, we evaluated how changes in seed angle and position relative to the ultrasound beam affect seed detection. The conclusion of the present study is that singular spectrum analysis has better sensitivity but template matching is more resistant to false positives; both perform well enough to make seed detection clinically feasible over a relevant range of angles and positions. Combining the information provided by the two methods may further reduce ambiguities in determining where seeds are located.

  19. Application of Template Matching for Improving Classification of Urban Railroad Point Clouds

    PubMed Central

    Arastounia, Mostafa; Oude Elberink, Sander

    2016-01-01

    This study develops an integrated data-driven and model-driven approach (template matching) that clusters the urban railroad point clouds into three classes of rail track, contact cable, and catenary cable. The employed dataset covers 630 m of the Dutch urban railroad corridors in which there are four rail tracks, two contact cables, and two catenary cables. The dataset includes only geometrical information (three dimensional (3D) coordinates of the points) with no intensity data and no RGB data. The obtained results indicate that all objects of interest are successfully classified at the object level with no false positives and no false negatives. The results also show that an average 97.3% precision and an average 97.7% accuracy at the point cloud level are achieved. The high precision and high accuracy of the rail track classification (both greater than 96%) at the point cloud level stems from the great impact of the employed template matching method on excluding the false positives. The cables also achieve quite high average precision (96.8%) and accuracy (98.4%) due to their high sampling and isolated position in the railroad corridor. PMID:27973452

  1. Automatic identification of resting state networks: an extended version of multiple template-matching

    NASA Astrophysics Data System (ADS)

    Guaje, Javier; Molina, Juan; Rudas, Jorge; Demertzi, Athena; Heine, Lizette; Tshibanda, Luaba; Soddu, Andrea; Laureys, Steven; Gómez, Francisco

    2015-12-01

    Functional magnetic resonance imaging in the resting state (fMRI-RS) constitutes an informative protocol to investigate several pathological and pharmacological conditions. A common approach to studying this data source is through the analysis of changes in the so-called resting state networks (RSNs). These networks correspond to well-defined functional entities that have been associated with different low- and high-order brain functions. RSNs may be characterized by using Independent Component Analysis (ICA). ICA provides a decomposition of the fMRI-RS signal into sources of brain activity, but it lacks information about the nature of the signal, i.e., whether the source is artifactual or not. Recently, a multiple template-matching (MTM) approach was proposed to automatically recognize RSNs in a set of Independent Components (ICs). This method provides valuable information for assessing subjects at the individual level. Nevertheless, it lacks a mechanism to quantify how much certainty there is about the existence or absence of each network. This information may be important for the assessment of patients with severely damaged brains, in whom RSNs may be greatly affected as a result of the pathological condition. In this work we propose a set of changes to the original MTM that improves the RSN recognition task and also extends the functionality of the method. The key points of this improvement are a standardization strategy and a modification of the method's constraints that adds flexibility to the approach. Additionally, we introduce an analysis of the trustworthiness of each RSN assignment obtained with the template-matching approach. This analysis consists of a thresholding strategy applied to the computed Goodness-of-Fit (GOF) between the set of templates and the ICs. The proposed method was validated on two independent studies (Baltimore, 23 healthy subjects, and Liege, 27 healthy subjects) with different configurations of MTM. Results suggest that the method will provide
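
    For context, the goodness-of-fit commonly used in this template-matching literature is the mean |z| of an IC inside the binary RSN template minus the mean |z| outside it; the sketch below combines that score with the thresholding idea the authors add. The exact GOF definition and threshold used in the paper may differ, so treat everything here as illustrative.

    ```python
    import numpy as np

    def goodness_of_fit(ic_zmap, template_mask):
        """GOF = mean |z| of the IC inside the template minus mean |z| outside."""
        z = np.abs(ic_zmap.ravel())
        m = template_mask.ravel().astype(bool)
        return z[m].mean() - z[~m].mean()

    def assign_rsns(ic_zmaps, templates, gof_threshold=0.0):
        """For each RSN template, keep the best-matching IC only if its GOF
        clears the trustworthiness threshold (illustrative value)."""
        labels = {}
        for name, mask in templates.items():
            gofs = [goodness_of_fit(ic, mask) for ic in ic_zmaps]
            best = int(np.argmax(gofs))
            labels[name] = best if gofs[best] > gof_threshold else None
        return labels
    ```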

  2. An improved finger-vein recognition algorithm based on template matching

    NASA Astrophysics Data System (ADS)

    Liu, Yueyue; Di, Si; Jin, Jian; Huang, Daoping

    2016-10-01

    Finger-vein recognition has become one of the most popular biometric identification methods, and research on recognition algorithms remains the key focus in this field. Many applicable algorithms have been developed so far. However, some problems remain in practice: variation in finger position may lead to image distortion and shifting, and matching parameters determined from experience during the identification process may reduce the adaptability of an algorithm. To address these problems, this paper proposes an improved finger-vein recognition algorithm based on template matching. In order to enhance the robustness of the algorithm to image distortion, the least-squares error method is adopted to correct for oblique finger placement. During feature extraction, a local adaptive threshold method is adopted. For the matching scores, we optimize the translation offsets as well as the matching distance between the input images and the registered images on the basis of the Naoto Miura algorithm. Experimental results indicate that the proposed method effectively improves robustness under finger shifting and rotation.
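
    A minimal sketch of template matching over translation offsets on binarized vein patterns, in the spirit of the Miura-style matching the abstract builds on; the overlap-ratio score, the wrap-around shifting, and the shift range are simplifying assumptions rather than the paper's exact formulation.

    ```python
    import numpy as np

    def match_score(registered, probe, max_shift=20):
        """Shift the binary probe pattern over the registered pattern and return
        the best overlap ratio across translations (larger = better match).
        np.roll wrap-around is used for brevity; real implementations crop instead."""
        best = 0.0
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                shifted = np.roll(np.roll(probe, dy, axis=0), dx, axis=1)
                overlap = np.logical_and(registered, shifted).sum()
                denom = 0.5 * (registered.sum() + shifted.sum())
                best = max(best, overlap / denom if denom else 0.0)
        return best
    ```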

  3. TMaCS: a hybrid template matching and classification system for partially-automated particle selection.

    PubMed

    Zhao, Jianhua; Brubaker, Marcus A; Rubinstein, John L

    2013-03-01

    Selection of particle images from electron micrographs presents a bottleneck in determining the structures of macromolecular assemblies by single particle electron cryomicroscopy (cryo-EM). The problem is particularly important when an experimentalist wants to improve the resolution of a 3D map by increasing the size of the dataset used for calculating the map by tens or hundreds of thousands of images. Although several existing methods for automatic particle image selection work well for large protein complexes that produce high-contrast images, it is well known in the cryo-EM community that small complexes that give low-contrast images are often refractory to existing automated particle image selection schemes. Here we develop a method for partially automated particle image selection when an initial 3D map of the protein under investigation is already available. Candidate particle images are selected from micrographs by template matching with template images derived from projections of the existing 3D map. The candidate particle images are then used to train a support vector machine, which classifies the candidates as particle images or non-particle images. In a final step in the analysis, the selected particle images are subjected to projection matching against the initial 3D map, with the correlation coefficient between the particle image and the best-matching map projection used to assess the reliability of the particle image. We show that this approach is able to rapidly select particle images from micrographs of a rotary ATPase, a type of membrane protein complex involved in many aspects of biology.

  4. Real-time tracking by double templates matching based on timed motion history image with HSV feature.

    PubMed

    Li, Zhiyong; Li, Pengfei; Yu, Xiaoping; Hashem, Mervat

    2014-01-01

    It is a challenge to represent the target appearance model for moving object tracking in complex environments. This study presents a novel method in which the appearance model is described by double templates based on the timed motion history image with an HSV color histogram feature (tMHI-HSV). The main components include offline and online template initialization, tMHI-HSV-based calculation of candidate patch feature histograms, double templates matching (DTM) for object location, and template updating. Firstly, we initialize the target object region and calculate its HSV color histogram feature as the offline template and the online template. Secondly, the tMHI-HSV is used to segment the motion region and calculate the color histograms of the candidate object patches to represent their appearance models. Finally, we utilize the DTM method to trace the target and update the offline and online templates in real time. The experimental results show that the proposed method can efficiently handle scale variation and pose change of rigid and nonrigid objects, even under illumination change and occlusion.

  5. Characterization of Cerebral Vascular Response to EEG Bursts Using ICP Pulse Waveform Template Matching.

    PubMed

    Connolly, Mark; Vespa, Paul; Hu, Xiao

    2016-01-01

    Neurovascular coupling is the relationship between the activity of the brain and the subsequent change in blood flow to the active region. The most common methods of detecting neurovascular coupling are cumbersome and noncontinuous. However, the integration of intracranial pressure (ICP) and electroencephalography (EEG) may serve as an indirect measure of neurovascular coupling. This study used data collected from burst-suppressed patients who received both ICP and depth EEG monitoring. An adaptive thresholding algorithm was used to detect the start and end of each EEG burst. The morphological clustering and analysis of ICP and pulse morphological template-matching algorithms were then applied to derive several metrics describing the shape of the ICP pulse waveform and to track how it changed following an EEG burst. These changes were compared using a template obtained from patients undergoing CO2-induced vasodilation. All segments exhibited a significant period of vasodilation within 1-2 s after the burst, and 4 of 5 had a significant period of vasoconstriction within 4-11 s of the EEG burst, suggesting that there might be a characteristic response of vasodilation and subsequent vasoconstriction after a spontaneous EEG burst. Furthermore, these findings demonstrate the potential of integrated EEG and ICP as an indirect measure of neurovascular coupling.

  6. Two dimensional template matching method for buried object discrimination in GPR data

    NASA Astrophysics Data System (ADS)

    Sezgin, Mehmet

    2009-05-01

    In this study, discrimination between two different metallic object classes was investigated utilizing ground-penetrating radar (GPR). The feature sets of both classes carry almost the same information in both metal detector (MD) and GPR data, and there were no evident features that easily discriminate the classes. Background removal was applied to the original B-scan data and then a normalization process was performed. Image thresholding was applied to segment the B-scan GPR images, so that the main hyperbolic shape of the buried object reflection was extracted; a morphological process was then optionally performed. Templates representative of each class were obtained and tested for whether they matched the true class. Two data sets were examined experimentally; they were obtained at different times and with different burials of the same objects. Considerably high discrimination performance was obtained, which was not possible using metal detector data alone.
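
    A minimal sketch of the pre-processing chain described (background removal, normalization, thresholding), assuming the B-scan is stored as a (time samples × traces) array; the percentile threshold is an illustrative stand-in for whatever rule the study used.

    ```python
    import numpy as np

    def preprocess_bscan(bscan, percentile=95):
        """Background removal (subtract the mean trace across A-scans), amplitude
        normalization, then thresholding to keep only strong hyperbolic reflections."""
        data = bscan - bscan.mean(axis=1, keepdims=True)   # remove horizontal banding
        data = np.abs(data)
        data /= data.max() + 1e-12                          # normalize to [0, 1]
        mask = data > np.percentile(data, percentile)       # illustrative threshold
        return data, mask
    ```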

  7. Accurate three-dimensional pose recognition from monocular images using template matched filtering

    NASA Astrophysics Data System (ADS)

    Picos, Kenia; Diaz-Ramirez, Victor H.; Kober, Vitaly; Montemayor, Antonio S.; Pantrigo, Juan J.

    2016-06-01

    An accurate algorithm for three-dimensional (3-D) pose recognition of a rigid object is presented. The algorithm is based on adaptive template matched filtering and local search optimization. When a scene image is captured, a bank of correlation filters is constructed to find the best correspondence between the current view of the target in the scene and a target image synthesized by means of computer graphics. The synthetic image is created using a known 3-D model of the target and an iterative procedure based on local search. Computer simulation results obtained with the proposed algorithm in synthetic and real-life scenes are presented and discussed in terms of accuracy of pose recognition in the presence of noise, cluttered background, and occlusion. Experimental results show that our proposal achieves high accuracy for 3-D pose estimation using monocular images.

  8. Pose detection of a 3D object using template matched filtering

    NASA Astrophysics Data System (ADS)

    Picos, Kenia; Díaz-Ramírez, Víctor H.

    2016-09-01

    The problem of 3D pose recognition of a rigid object is difficult to solve because the pose in a 3D space can vary with multiple degrees of freedom. In this work, we propose an accurate method for 3D pose estimation based on template matched filtering. The proposed method utilizes a bank of space-variant filters which take into account different pose states of the target and local statistical properties of the input scene. The state parameters of location coordinates, orientation angles, and scaling parameters of the target are estimated with high accuracy in the input scene. Experimental tests are performed for real and synthetic scenes. The proposed system yields good performance for 3D pose recognition in terms of detection efficiency, location and orientation errors.

  9. [Development of automatic navigation measuring system using template-matching software in image guided neurosurgery].

    PubMed

    Watanabe, Yohei; Hayashi, Yuichiro; Fujii, Masazumi; Kimura, Miyuki; Sugiura, Akihiro; Tsuzaka, Masatoshi; Wakabayashi, Toshihiko

    2010-02-20

    An image-guided neurosurgery and neuronavigation system based on magnetic resonance imaging has been used as an indispensable tool for resection of brain tumors. Therefore, the accuracy of the neuronavigation system, ensured by periodic quality assurance (QA), is essential for image-guided neurosurgery. Two types of accuracy index, fiducial registration error (FRE) and target registration error (TRE), have been used to evaluate navigation accuracy. FRE measures navigation accuracy at points that have been registered, whereas TRE measures navigation accuracy at points of interest such as the tumor, the skin, and fiducial markers. This study shows that TRE is more reliable than FRE. However, calculation of TRE is a time-consuming, subjective task. Software for QA was therefore developed to compute TRE. This software calculates TRE automatically by an image processing technique, namely automatic template matching. TRE was calculated by the software and compared with the results obtained by manual calculation. Using the software made it possible to achieve a reliable QA system.

  10. The Elementary Operations of Human Vision Are Not Reducible to Template-Matching

    PubMed Central

    Neri, Peter

    2015-01-01

    It is generally acknowledged that biological vision presents nonlinear characteristics, yet linear filtering accounts of visual processing are ubiquitous. The template-matching operation implemented by the linear-nonlinear cascade (linear filter followed by static nonlinearity) is the most widely adopted computational tool in systems neuroscience. This simple model achieves remarkable explanatory power while retaining analytical tractability, potentially extending its reach to a wide range of systems and levels in sensory processing. The extent of its applicability to human behaviour, however, remains unclear. Because sensory stimuli possess multiple attributes (e.g. position, orientation, size), the issue of applicability may be asked by considering each attribute one at a time in relation to a family of linear-nonlinear models, or by considering all attributes collectively in relation to a specified implementation of the linear-nonlinear cascade. We demonstrate that human visual processing can operate under conditions that are indistinguishable from linear-nonlinear transduction with respect to substantially different stimulus attributes of a uniquely specified target signal with associated behavioural task. However, no specific implementation of a linear-nonlinear cascade is able to account for the entire collection of results across attributes; a satisfactory account at this level requires the introduction of a small gain-control circuit, resulting in a model that no longer belongs to the linear-nonlinear family. Our results inform and constrain efforts at obtaining and interpreting comprehensive characterizations of the human sensory process by demonstrating its inescapably nonlinear nature, even under conditions that have been painstakingly fine-tuned to facilitate template-matching behaviour and to produce results that, at some level of inspection, do conform to linear filtering predictions. They also suggest that compliance with linear transduction may be

  11. Accuracy of Three-Dimensional Printed Templates for Guided Implant Placement Based on Matching a Surface Scan with CBCT.

    PubMed

    Kernen, Florian; Benic, Goran I; Payer, Michael; Schär, Alex; Müller-Gerbl, Magdalena; Filippi, Andreas; Kühl, Sebastian

    2016-08-01

    Reference elements are necessary to transfer a virtual planning into reality for guided implant placement. New systems allow matching optical scans with three-dimensional radiographic images. To test whether digitally designed three-dimensional printed templates (D-temp) fabricated by matching surface scans and cone beam computed tomography (CBCT) images differ from templates fabricated in-lab (L-temp) using a physical transfer device for the positioning of the guiding sleeves. L-temp were fabricated for eight human lower cadaver jaws applying a digital planning software program (smop, Swissmeda AG, Zürich, Switzerland) using a Lego® (Lego Group, KIRKBI A/S, Billund, Denmark) brick as the reference element and the respective transfer device (X1-table). Additionally, digital templates (D-temp) using the identical planning data sets and software were virtually designed and three-dimensionally printed after matching a surface scan with CBCT data. The accuracy of both templates for each planning was evaluated by determining the estimated coronal, apical, and angular deviation if the templates were used for implant placement. Mean coronal deviations for L-temp were 0.31 mm (mesial/distal) and 0.32 mm (lingual/buccal), versus 0.16 mm and 0.23 mm for D-temp, respectively. Mean apical deviations for L-temp were 0.50 mm (mesial/distal) and 0.50 mm (lingual/buccal), versus 0.25 mm and 0.34 mm for D-temp, respectively. Differences between both devices were statistically significant (p < .05). A higher accuracy of implant placement can be achieved by using three-dimensional printed templates produced by matching a surface scan and CBCT as compared with templates which use physical elements to transfer the virtual planning into reality. © 2015 Wiley Periodicals, Inc.

  12. Evaluation of template matching for tumor motion management with cine-MR images in lung cancer patients

    PubMed Central

    Shi, Xiutao; Diwanji, Tejan; Mooney, Karen E.; Lin, Jolinta; Feigenberg, Steven; D’Souza, Warren D.; Mistry, Nilesh N.

    2014-01-01

    Purpose: Accurate determination of tumor position is crucial for successful application of motion-compensated radiotherapy in lung cancer patients. This study tested the performance of an automated template matching algorithm in tracking the tumor position on cine-MR images by examining the tracking error and comparing it to the interoperator variability of three human reviewers. Methods: Cine-MR images of 12 lung cancer patients were analyzed. Tumor positions were determined both automatically with template matching and manually by a radiation oncologist and two additional reviewers trained by the radiation oncologist. Performance of the automated template matching was compared against the ground truth established by the radiation oncologist. Additionally, the tracking error of template matching, defined as the difference in the tumor positions determined with template matching and the ground truth, was investigated and compared to the interoperator variability for all patients in the anterior-posterior (AP) and superior-inferior (SI) directions, respectively. Results: The median tracking error for ten of the 12 patients studied in both the AP and SI directions was less than 1 pixel (= 1.95 mm). Furthermore, the median tracking error for seven patients in the AP direction and nine patients in the SI direction was less than half a pixel (= 0.975 mm). The median tracking error was positively correlated with the tumor motion magnitude in both the AP (R = 0.55, p = 0.06) and SI (R = 0.67, p = 0.02) directions. A strong correlation was also observed between the tracking error and the interoperator variability (y = 0.26 + 1.25x, R = 0.84, p < 0.001), with the latter being larger. Conclusions: Results from this study indicate that the performance of template matching is comparable with or better than that of manual tumor localization. This study serves as a preliminary investigation toward developing online motion tracking techniques for hybrid MRI

  13. SpikeGUI: Software for Rapid Interictal Discharge Annotation via Template Matching and Online Machine Learning

    PubMed Central

    Jin, Jing; Dauwels, Justin; Cash, Sydney; Westover, M. Brandon

    2015-01-01

    Detection of interictal discharges is a key element of interpreting EEGs during the diagnosis and management of epilepsy. Because interpretation of clinical EEG data is time-intensive and reliant on experts who are in short supply, there is a great need for automated spike detectors. However, attempts to develop general-purpose spike detectors have so far been severely limited by a lack of expert-annotated data. Huge databases of interictal discharges are therefore in great demand for the development of general-purpose detectors. Detailed manual annotation of interictal discharges is time consuming, which severely limits the willingness of experts to participate. To address such problems, a graphical user interface “SpikeGUI” was developed in our work for the purposes of EEG viewing and rapid interictal discharge annotation. “SpikeGUI” substantially speeds up the task of annotating interictal discharges using a custom-built algorithm based on a combination of template matching and online machine learning techniques. While the algorithm is currently tailored to annotation of interictal epileptiform discharges, it can easily be generalized to other waveforms and signal types. PMID:25570976

  14. CAD System for Pulmonary Nodule Detection Using Gabor Filtering and Template Matching

    NASA Astrophysics Data System (ADS)

    Bastawrous, Hany Ayad; Nitta, Norihisa; Tsudagawa, Masaru

    This paper aims at developing a computer-aided diagnosis (CAD) system for the detection of pulmonary nodules in chest computed tomography (CT) images. These lung nodules include both solid nodules and ground glass opacity (GGO) nodules. In our scheme, we apply a Gabor filter to the CT image in order to enhance the detection process. After this, we perform morphological operations, including thresholding and labeling, to extract all the objects inside the lung area. Then, feature analysis is used to examine these objects and decide which of them are likely to be potential cancer candidates. Following the feature examination, template matching between the potential cancer candidates and Gaussian reference models is performed to determine the similarity between them. The algorithm was applied to 715 slices containing 25 GGO nodules and 82 solid nodules and achieved a detection sensitivity of 92% for GGO nodules and 95% for solid nodules, with a false positive (FP) rate of 0.75 FP/slice for GGO nodules and 2.32 FP/slice for solid nodules. Finally, we used an artificial neural network (ANN) to reduce the number of FP findings. After using the ANN, we were able to reduce the FP rate to 0.25 FP/slice for GGO nodules and 1.62 FP/slice for solid nodules, but at the expense of detection sensitivity, which became 84% for GGO nodules and 91% for solid nodules.
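
    A minimal sketch of the two key ingredients named above: a Gabor kernel for enhancement and correlation of candidate patches against a Gaussian reference model. All parameter values are illustrative, not those of the paper, and the candidate-extraction and ANN stages are omitted.

    ```python
    import numpy as np

    def gabor_kernel(size=21, sigma=4.0, theta=0.0, lam=10.0, psi=0.0):
        """Real-valued 2D Gabor kernel used to enhance nodule-like structures."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        xr = x * np.cos(theta) + y * np.sin(theta)
        yr = -x * np.sin(theta) + y * np.cos(theta)
        return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam + psi)

    def gaussian_template(size=15, sigma=3.0):
        """Zero-mean, unit-variance Gaussian reference model approximating a rounded nodule."""
        half = size // 2
        y, x = np.mgrid[-half:half + 1, -half:half + 1]
        g = np.exp(-(x**2 + y**2) / (2 * sigma**2))
        return (g - g.mean()) / g.std()

    def similarity(candidate_patch, template):
        """Correlation coefficient between a candidate patch and the Gaussian template."""
        c = (candidate_patch - candidate_patch.mean()) / (candidate_patch.std() + 1e-12)
        return float(np.mean(c * template))
    ```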

  15. Rapid Annotation of Interictal Epileptiform Discharges via Template Matching under Dynamic Time Warping

    PubMed Central

    Dauwels, J.; Rakthanmanon, T.; Keogh, E.; Cash, S.S.; Westover, M.B.

    2017-01-01

    Background: EEG interpretation relies on experts who are in short supply. There is a great need for automated pattern recognition systems to assist with interpretation. However, attempts to develop such systems have been limited by insufficient expert-annotated data. To address these issues, we developed a system named NeuroBrowser for EEG review and rapid waveform annotation. New Methods: At the core of NeuroBrowser lies ultrafast template matching under Dynamic Time Warping, which substantially accelerates the task of annotation. Results: Our results demonstrate that NeuroBrowser can reduce the time required for annotation of interictal epileptiform discharges by EEG experts by 20–90%, with an average of approximately 70%. Comparison with Existing Method(s): In comparison with conventional manual EEG annotation, NeuroBrowser saves EEG experts approximately 70% of the time spent annotating interictal epileptiform discharges, on average. We have already extracted 19,000+ interictal epileptiform discharges from 100 patient EEG recordings. To our knowledge this represents the largest annotated database of interictal epileptiform discharges in existence. Conclusion: NeuroBrowser is an integrated system for rapid waveform annotation. While the algorithm is currently tailored to annotation of interictal epileptiform discharges in scalp EEG recordings, the concepts can easily be generalized to other waveforms and signal types. PMID:26944098
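
    For reference, the sketch below is the textbook O(nm) dynamic-time-warping distance that underlies the template matching described; the paper's ultrafast, pruned implementation is far more sophisticated and is not reproduced here.

    ```python
    import numpy as np

    def dtw_distance(template, candidate):
        """Classic dynamic-time-warping distance between two 1D waveforms."""
        n, m = len(template), len(candidate)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = (template[i - 1] - candidate[j - 1]) ** 2
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return np.sqrt(D[n, m])
    ```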

  16. Involuntary attentional capture by task-irrelevant objects that match the search template for category detection in natural scenes.

    PubMed

    Reeder, Reshanne R; van Zoest, Wieske; Peelen, Marius V

    2015-05-01

    Theories of visual search postulate that the selection of targets amongst distractors involves matching visual input to a top-down attentional template. Previous work has provided evidence that feature-based attentional templates affect visual processing globally across the visual field. In the present study, we asked whether more naturalistic, category-level attentional templates also modulate visual processing in a spatially global and obligatory way. Subjects were cued to detect people or cars in a diverse set of photographs of real-world scenes. On a subset of trials, silhouettes of people and cars appeared in search-irrelevant locations that subjects were instructed to ignore, and subjects were required to respond to the location of a subsequent dot probe. In three experiments, results showed a consistency effect on dot-probe trials: dot probes were detected faster when they appeared in the location of the cued category compared with the non-cued category, indicating attentional capture by template-matching stimuli. Experiments 1 and 2 showed that this capture was involuntary: consistency effects persisted under conditions in which attending to silhouettes of the cued category was detrimental to performance. Experiment 3 tested whether these effects could be attributed to non-attentional effects related to the processing of the category cues. Results showed a consistency effect when subjects searched for category exemplars but not when they searched for objects semantically related to the cued category. Together, these results indicate that attentional templates for familiar object categories affect visual processing across the visual field, leading to involuntary attentional capture by template-matching stimuli.

  17. An improved earthquake catalogue in the Marmara Sea region, Turkey, using massive template matching

    NASA Astrophysics Data System (ADS)

    Matrullo, Emanuela; Lengliné, Olivier; Schmittbuhl, Jean; Karabulut, Hayrullah; Bouchon, Michel

    2016-04-01

    After the 1999 Izmit earthquake, the Main Marmara Fault (MMF) represents a 150 km unruptured segment of the North Anatolian Fault located below the Marmara Sea. One of the principal issues for seismic hazard assessment in the region is whether the MMF is totally or partially locked and where the nucleation of the next major event will take place. The area is currently one of the best-instrumented fault systems in Europe. Since 2007, various seismic networks comprising broadband, short-period and OBS stations have been deployed in order to continuously monitor the seismicity along the MMF and the related fault systems. A recent analysis of the seismicity recorded during the 2007-2012 period has provided new insights into the recent evolution of this important regional seismic gap. That analysis was based on events detected with an STA/LTA procedure and on manually picked P and S wave arrival times (Schmittbuhl et al., 2015). In order to extend the level of detail and to take full advantage of the dense seismic network, we improved the seismic catalog using an automatic earthquake detection technique based on a template matching approach. This approach uses known earthquake signals as templates to detect new, similar events through waveform cross-correlation. To set up the methodology and verify the accuracy and robustness of the results, we initially focused on the eastern part of the Marmara Sea (Cinarcik basin) and compared the new detections with those identified manually. Through massive cross-correlation analysis based on scanning the continuous recordings with the templates, we construct a refined catalog of earthquakes for the Marmara Sea for the 2007-2014 period. Our improved earthquake catalog will improve catalog completeness and provide an effective tool to monitor and study the fine details of the space-time distribution of events, to characterize repeating earthquake source processes and to understand the mechanical state of the fault.
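
    The core of such matched-filter catalog building is sliding a normalized template along the continuous trace and declaring a detection when the normalized cross-correlation exceeds some multiple of its median absolute deviation. The sketch below is a minimal single-channel illustration, not the authors' multi-station code; the 8-MAD threshold and the names are assumptions.

```python
import numpy as np

def matched_filter_detect(template, trace, thresh_mads=8.0):
    """Slide a normalized template along a continuous trace and flag windows
    whose normalized cross-correlation exceeds thresh_mads * MAD."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        w = trace[i:i + n]
        s = w.std()
        cc[i] = np.dot(w - w.mean(), t) / (n * s) if s > 0 else 0.0
    mad = np.median(np.abs(cc - np.median(cc)))
    return np.flatnonzero(cc >= thresh_mads * mad), cc
```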

  18. Automatic vertebral bodies detection of x-ray images using invariant multiscale template matching

    NASA Astrophysics Data System (ADS)

    Sharifi Sarabi, Mona; Villaroman, Diane; Beckett, Joel; Attiah, Mark; Marcus, Logan; Ahn, Christine; Babayan, Diana; Gaonkar, Bilwaj; Macyszyn, Luke; Raghavendra, Cauligi

    2017-03-01

    Lower back pain and related pathologies are among the most common reasons for referral to a neurosurgical clinic in both the developed and the developing world. Quantitative evaluation of these pathologies is a challenge. Image-based measurements of angles, vertebral heights and disks could provide a potential quantitative biomarker for tracking and measuring these pathologies. Detection of vertebral bodies is a key element and is the focus of the current work. Among the variety of medical imaging techniques, MRI and CT scans have typically been used for developing image segmentation methods. However, CT scans are known to deliver a large dose of x-rays, increasing cancer risk [8]. MRI can be substituted for CT when this risk is high [8], but it is difficult to obtain in smaller facilities due to cost and lack of expertise in the field [2]. X-rays provide another option, given the ability to control the dosage, especially for young people, and their accessibility for smaller facilities. Hence, the ability to create quantitative biomarkers from x-ray data is especially valuable. Here, we develop a multiscale template matching method, inspired by [9], to detect the centers of vertebral bodies from x-ray data. The immediate application of such detection lies in developing quantitative biomarkers and in querying similar images in a database. Previously, shape similarity classification methods have been used to address this problem, but these are challenging to use in the presence of variation due to gross pathology and even subtle effects [1].
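
    A minimal sketch of scale-swept template matching of the kind described above, using OpenCV's normalized correlation; the scale set, interpolation choice and names are assumptions, and this is not the authors' pipeline.

```python
import cv2
import numpy as np

def multiscale_match(image, template, scales=(0.6, 0.8, 1.0, 1.2, 1.4)):
    """Run normalized template matching at several template scales and return
    the best score, location and scale. Expects single-channel uint8 or
    float32 arrays."""
    best = (-1.0, None, None)   # (score, (x, y), scale)
    for s in scales:
        t = cv2.resize(template, None, fx=s, fy=s, interpolation=cv2.INTER_LINEAR)
        if t.shape[0] > image.shape[0] or t.shape[1] > image.shape[1]:
            continue
        res = cv2.matchTemplate(image, t, cv2.TM_CCOEFF_NORMED)
        _, max_val, _, max_loc = cv2.minMaxLoc(res)
        if max_val > best[0]:
            best = (max_val, max_loc, s)
    return best
```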

  19. Vertebra identification using template matching model and K-means clustering.

    PubMed

    Larhmam, Mohamed Amine; Benjelloun, Mohammed; Mahmoudi, Saïd

    2014-03-01

    Accurate vertebra detection and segmentation are essential steps for automating the diagnosis of spinal disorders. This study is dedicated to vertebra alignment measurement, the first step in a computer-aided diagnosis tool for cervical spine trauma. Automated vertebral segment alignment determination is a challenging task due to low contrast imaging and noise. A software tool for segmenting vertebrae and detecting subluxations has clinical significance. A robust method was developed and tested for cervical vertebra identification and segmentation that extracts parameters used for vertebra alignment measurement. Our contribution involves a novel combination of a template matching method and an unsupervised clustering algorithm. In this method, we build a geometric vertebra mean model. To achieve vertebra detection, manual selection of the region of interest is performed initially on the input image. Subsequent preprocessing is done to enhance image contrast and detect edges. Candidate vertebra localization is then carried out by using a modified generalized Hough transform (GHT). Next, an adapted cost function is used to compute local voted centers and filter boundary data. Thereafter, a K-means clustering algorithm is applied to obtain a cluster distribution corresponding to the targeted vertebrae. These clusters are combined with the vote parameters to detect vertebra centers. Rigid segmentation is then carried out by using GHT parameters. Finally, cervical spine curves are extracted to measure vertebra alignment. The proposed approach was successfully applied to a set of 66 high-resolution X-ray images. Robust detection was achieved in 97.5% of the 330 tested cervical vertebrae. An automated vertebral identification method was developed and demonstrated to be robust to noise and occlusion. This work presents a first step toward an automated computer-aided diagnosis system for cervical spine trauma detection.
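
    For the K-means step described above, a minimal scikit-learn sketch is shown below; vote_points (an N x 2 array of Hough vote centers) and n_vertebrae are hypothetical names, and this is not the authors' implementation.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_vote_centers(vote_points, n_vertebrae):
    """Group GHT vote centers into one cluster per expected vertebra and
    return the cluster centroids as candidate vertebra centers."""
    vote_points = np.asarray(vote_points, dtype=float)   # shape (N, 2)
    km = KMeans(n_clusters=n_vertebrae, n_init=10, random_state=0).fit(vote_points)
    return km.cluster_centers_, km.labels_
```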

  1. Real-time automatic target location by template matching on blurred images

    NASA Astrophysics Data System (ADS)

    Kumar, Ritesh; Downs, Justin, III; Peterson, Michael L.

    1996-11-01

    High precision manufacturing is dependent on the accurate positioning of the part prior to each processing step. The part is typically held in a transfer fixture which may be used either simply as a part holder or as an alignment reference. A typical application is explored in this work, grinding of slots in a thick ceramic wafer mounted on a part fixture. A 1 micron slot is located with a position tolerance of 10 microns in the nickel-zinc-ferrite wafer. Part location is determined by means of a 50.8 by 1 micron target deposited onto the ceramic during an earlier process. In the current process, the target is magnified by a 250X vision system and the machine operator manually aligns the part using a set of cross hairs overlaid on the image. A high precision x-y stage is used to adjust the part location. However, the use of a fixed focal distance camera combined with part mounting variation results in an unfocused image for the operator. Manual alignment of the unfocused image leads to target location error during centering of the target in the cross hairs. The misalignment, in turn, results in process variation. The location of the slots varies based on alignment and reduces the process capability. To reduce process variation, an automated target location algorithm has been applied. The algorithm uses template matching to detect the target. This simple algorithm has been shown to locate the target within the required tolerance in spite of image blur. Using low cost processing, the system is able to determine the target location in real-time. For real-time control, the algorithm must determine the x-y coordinate of the target in 10 seconds or less. This effort shows the potential for a simple location algorithm to be implemented in a manner which can significantly decrease process variation in a precision fabrication process.

  2. Gene Time Eχpression Warper: a tool for alignment, template matching and visualization of gene expression time series.

    PubMed

    Criel, Jo; Tsiporkova, Elena

    2006-01-15

    An application tool for alignment, template matching and visualization of gene expression time series is presented. The core algorithm is based on dynamic time warping techniques used in the speech recognition field. These techniques allow for non-linear (elastic) alignment of temporal sequences of feature vectors and consequently enable detection of similar shapes with different phases. The Java program, examples and a tutorial are available at http://www.psb.ugent.be/cbd/papers/gentxwarper/

  3. Efficient optimal decomposition of a sequence into disjoint regions, each matched to some template in an inventory.

    PubMed

    Sankoff, D

    1992-10-01

    Given an amino acid sequence, we discuss how to efficiently find an optimal set of disjoint regions (substrings, domains, modules, etc.), each of which can be matched to some element of a predefined inventory containing, for example, consensus sequences, protosequences, or protein family profiles. A two-stage approach to sequence decomposition, consisting of the detection of all acceptable matches followed by the construction of an optimal subset of compatible matches, leads to computational difficulties. When the problem is reformulated in terms of network comparisons, it can be solved in time quadratic in the length of the sequence and linear in the number of templates in the inventory, by a single pass of a dynamic programming algorithm. This method has the advantage that the criterion for acceptable matches can be relaxed without materially affecting computing time. Except under special conditions it is more efficient than previous segmentation methods based on dynamic programming.
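
    As a simplified illustration of selecting an optimal set of disjoint, template-matched regions (the naive second stage, not the paper's network-comparison reformulation), the sketch below applies the standard weighted-interval-scheduling dynamic program to candidate matches given as (start, end, score) tuples; all names are hypothetical.

```python
from bisect import bisect_right

def best_disjoint_matches(matches):
    """Weighted-interval-scheduling DP: pick a maximum-score set of
    non-overlapping matches, each given as (start, end, score) with end
    exclusive. Returns (total score, chosen matches)."""
    matches = sorted(matches, key=lambda m: m[1])
    ends = [m[1] for m in matches]
    best = [0.0] * (len(matches) + 1)
    choice = [None] * (len(matches) + 1)
    for i, (s, e, w) in enumerate(matches, 1):
        j = bisect_right(ends, s, 0, i - 1)   # matches ending at or before s
        if best[j] + w > best[i - 1]:
            best[i], choice[i] = best[j] + w, (i - 1, j)
        else:
            best[i], choice[i] = best[i - 1], None
    picked, i = [], len(matches)
    while i > 0:
        if choice[i] is None:
            i -= 1
        else:
            idx, j = choice[i]
            picked.append(matches[idx])
            i = j
    return best[-1], picked[::-1]
```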

  4. Precise geometric correction for NOAA and GMS images considering elevation effects using GCP template matching and affine transform

    NASA Astrophysics Data System (ADS)

    Takagi, Mikio

    2004-02-01

    This paper describes a precise geometric correction method that accounts for elevation effects for NOAA/AVHRR and GMS images, which is mandatory for long-term global environmental monitoring studies. First, using the so-called systematic geometric correction, the correspondences of sub-sampled image pixels to their map coordinates are calculated. The correspondences of sub-sampled map locations, which are the corner points of blocks, to image pixels are also calculated in order to speed up the inverse transform, i.e., finding, for a pixel in map coordinates, the corresponding pixel in image coordinates by bilinear interpolation of the four corner points of a block. For precise geometric correction, the residual errors of the systematic correction are measured using many GCP templates. GCP templates in map coordinates are provided using DCW, and templates in image coordinates are generated using bilinear interpolation. The templates of high-elevation areas are also modified to include elevation effects, using heights from GTOPO30 and the satellite sensor geometry. The residual errors are then obtained by template matching, and affine transform coefficients are calculated to remove them. If the difference between the average error and that of any GCP is more than one pixel, those GCPs are removed and new affine transform coefficients are recalculated iteratively until all errors fall within one pixel. Mapping of each pixel is then done using the correspondence of the four corner points of a block and the image coordinates modified by the affine transform; for high-elevation areas, blocks are divided into pixels according to their elevation. An accuracy within one pixel, i.e., 0.01 degrees for NOAA/AVHRR and GMS/VIS and 0.04 degrees for GMS/IR, is obtained for NOAA images received at Tokyo, for the stitched images received at Tokyo and Bangkok, and for GMS full-disk images.
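
    The affine-correction step above amounts to a least-squares fit of a 2-D affine transform to GCP correspondences with iterative rejection of poorly matched GCPs. The sketch below is illustrative only and uses an absolute residual tolerance rather than the paper's deviation-from-average criterion; names are hypothetical.

```python
import numpy as np

def fit_affine(src_xy, dst_xy):
    """Least-squares 2-D affine transform mapping src control points to dst
    (requires at least three non-collinear points)."""
    src, dst = np.asarray(src_xy, float), np.asarray(dst_xy, float)
    A = np.hstack([src, np.ones((len(src), 1))])      # (N, 3)
    coef, *_ = np.linalg.lstsq(A, dst, rcond=None)    # (3, 2)
    return coef

def apply_affine(coef, xy):
    xy = np.asarray(xy, float)
    return np.hstack([xy, np.ones((len(xy), 1))]) @ coef

def fit_affine_with_rejection(src, dst, tol_px=1.0, max_iter=10):
    """Refit after dropping GCPs whose residual exceeds tol_px, mirroring the
    iterative rejection loop described above."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    keep = np.ones(len(src), bool)
    coef = None
    for _ in range(max_iter):
        coef = fit_affine(src[keep], dst[keep])
        resid = np.linalg.norm(apply_affine(coef, src) - dst, axis=1)
        if not (keep & (resid > tol_px)).any():
            break
        keep &= resid <= tol_px
    return coef, keep
```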

  5. High Levels of Soluble Lectin-Like Oxidized Low-Density Lipoprotein Receptor-1 in Acute Stroke: An Age- and Sex-Matched Cross-Sectional Study

    PubMed Central

    Sawamura, Tatsuya; Watanabe, Makoto; Kokubo, Yoshihiro; Fujita, Yoshiko; Kakino, Akemi; Nakai, Michikazu; Toyoda, Kazunori; Miyamoto, Yoshihiro; Minematsu, Kazuo

    2016-01-01

    Aim: Lectin-like oxidized low-density lipoprotein receptor-1 (LOX-1) is known to be a key molecule in the pathogenesis of atherosclerosis. Although high levels of serum soluble LOX-1 (sLOX-1) were demonstrated in patients with acute coronary syndrome, there are no reports about acute stroke patients. The aim of the present study was to evaluate the levels of sLOX-1 in acute stroke patients according to different stroke subtypes. Methods: We enrolled a total of 377 patients with a stroke (men/women: 251/126; age: 40–79 years), 250 with ischemic stroke and 127 with intracerebral hemorrhage (ICH). Patients were admitted to our hospital within 3 days after the onset of stroke. As controls, we randomly selected age- and sex-matched subjects without a past history of cardiovascular disease according to stroke subtype from the community-based cohort of the Suita study. Serum LOX-1 levels were compared between stroke patients and healthy controls according to stroke subtype. Results: Median values of serum sLOX-1 in stroke patients were significantly higher than those in controls (526 vs. 486 ng/L in ischemic stroke and 720 vs. 513 ng/L in ICH, respectively). Among the subtypes of ischemic stroke, only median sLOX-1 levels in atherothrombotic brain infarction (641 ng/L) were significantly higher than those in controls (496 ng/L). Ischemic stroke [odds ratio (OR), 3.80; 95% confidence interval (CI), 1.86–7.74] and ICH (OR, 5.97; 95% CI, 2.13–16.77) were independently associated with high levels of sLOX-1 by multivariate logistic regression analysis. Conclusions: Higher levels of sLOX-1 were observed in patients with acute stroke than in controls. High levels of sLOX-1 can be useful as a biomarker for acute stroke. PMID:27025681

  6. High Levels of Soluble Lectin-Like Oxidized Low-Density Lipoprotein Receptor-1 in Acute Stroke: An Age- and Sex-Matched Cross-Sectional Study.

    PubMed

    Yokota, Chiaki; Sawamura, Tatsuya; Watanabe, Makoto; Kokubo, Yoshihiro; Fujita, Yoshiko; Kakino, Akemi; Nakai, Michikazu; Toyoda, Kazunori; Miyamoto, Yoshihiro; Minematsu, Kazuo

    2016-10-01

    Lectin-like oxidized low-density lipoprotein receptor-1 (LOX-1) is known to be a key molecule in the pathogenesis of atherosclerosis. Although high levels of serum soluble LOX-1 (sLOX-1) were demonstrated in patients with acute coronary syndrome, there are no reports about acute stroke patients. The aim of the present study was to evaluate the levels of sLOX-1 in acute stroke patients according to different stroke subtypes. We enrolled a total of 377 patients with a stroke (men/women: 251/126; age: 40-79 years), 250 with ischemic stroke and 127 with intracerebral hemorrhage (ICH). Patients were admitted to our hospital within 3 days after the onset of stroke. As controls, we randomly selected age- and sex-matched subjects without a past history of cardiovascular disease according to stroke subtype from the community-based cohort of the Suita study. Serum LOX-1 levels were compared between stroke patients and healthy controls according to stroke subtype. Median values of serum sLOX-1 in stroke patients were significantly higher than those in controls (526 vs. 486 ng/L in ischemic stroke and 720 vs. 513 ng/L in ICH, respectively). Among the subtypes of ischemic stroke, only median sLOX-1 levels in atherothrombotic brain infarction (641 ng/L) were significantly higher than those in controls (496 ng/L). Ischemic stroke [odds ratio (OR), 3.80; 95% confidence interval (CI), 1.86-7.74] and ICH (OR, 5.97; 95% CI, 2.13-16.77) were independently associated with high levels of sLOX-1 by multivariate logistic regression analysis. Higher levels of sLOX-1 were observed in patients with acute stroke than in controls. High levels of sLOX-1 can be useful as a biomarker for acute stroke.

  7. A model-based 3D template matching technique for pose acquisition of an uncooperative space object.

    PubMed

    Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele

    2015-03-16

    This paper presents a customized three-dimensional template matching technique for autonomous pose determination of uncooperative targets. This topic is relevant to advanced space applications, like active debris removal and on-orbit servicing. The proposed technique is model-based and produces estimates of the target pose without any prior pose information, by processing three-dimensional point clouds provided by a LIDAR. These estimates are then used to initialize a pose tracking algorithm. Peculiar features of the proposed approach are the use of a reduced number of templates and the idea of building the database of templates on-line, thus significantly reducing the amount of on-board stored data with respect to traditional techniques. An algorithm variant is also introduced aimed at further accelerating the pose acquisition time and reducing the computational cost. Technique performance is investigated within a realistic numerical simulation environment comprising a target model, LIDAR operation and various target-chaser relative dynamics scenarios, relevant to close-proximity flight operations. Specifically, the capability of the proposed techniques to provide a pose solution suitable to initialize the tracking algorithm is demonstrated, as well as their robustness against highly variable pose conditions determined by the relative dynamics. Finally, a criterion for autonomous failure detection of the presented techniques is proposed.

  8. A Model-Based 3D Template Matching Technique for Pose Acquisition of an Uncooperative Space Object

    PubMed Central

    Opromolla, Roberto; Fasano, Giancarmine; Rufino, Giancarlo; Grassi, Michele

    2015-01-01

    This paper presents a customized three-dimensional template matching technique for autonomous pose determination of uncooperative targets. This topic is relevant to advanced space applications, like active debris removal and on-orbit servicing. The proposed technique is model-based and produces estimates of the target pose without any prior pose information, by processing three-dimensional point clouds provided by a LIDAR. These estimates are then used to initialize a pose tracking algorithm. Peculiar features of the proposed approach are the use of a reduced number of templates and the idea of building the database of templates on-line, thus significantly reducing the amount of on-board stored data with respect to traditional techniques. An algorithm variant is also introduced aimed at further accelerating the pose acquisition time and reducing the computational cost. Technique performance is investigated within a realistic numerical simulation environment comprising a target model, LIDAR operation and various target-chaser relative dynamics scenarios, relevant to close-proximity flight operations. Specifically, the capability of the proposed techniques to provide a pose solution suitable to initialize the tracking algorithm is demonstrated, as well as their robustness against highly variable pose conditions determined by the relative dynamics. Finally, a criterion for autonomous failure detection of the presented techniques is proposed. PMID:25785309

  9. Multistation template matching to characterize frequency-magnitude distributions of induced seismicity in the Central and Eastern US

    NASA Astrophysics Data System (ADS)

    Brudzinski, M. R.; Skoumal, R.; Currie, B.

    2015-12-01

    We analyze the frequency-magnitude distribution (FMD) of recent seismic sequences thought to be induced by wastewater injection and hydraulic fracturing in the Central and Eastern U.S. to investigate their physical origin and improve hazard estimates. Multistation template matching is utilized to increase the number of events analyzed by lowering the magnitude of detection. In cases where local deployments are available, we demonstrate that the FMD obtained through template matching using regional data are comparable to those obtained from traditional detection using the local deployment. Since deployments usually occur after seismicity has already been identified, catalogs constructed with regional data offer the advantage of providing a more complete history of the seismicity. We find two primary groups of FMDs for induced sequences: those that generally follow the Gutenberg-Richter power-law and those that generally do not. All of the induced sequences are typically characterized by swarm-like behavior, but the non-power-law FMDs are also characterized by a clustering of events at low magnitudes and particularly low aftershock productivity for a continental interior. Each of the observations in the non-power law FMD cases is predicted by numerical simulations of a seismogenic zone governed by a viscoelastic damage rheology with low effective viscosity in the fault zone. Such a reduction in effective viscosity is expected if fluid injection increases fluid pressures in the fault zone to the point that the fault zone begins to dilate.
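
    Studies of frequency-magnitude distributions like this one usually summarize a catalog with a Gutenberg-Richter b-value; a standard Aki/Utsu maximum-likelihood estimate is sketched below (illustrative only, not the authors' code; the magnitude bin width is an assumption).

```python
import numpy as np

def b_value_mle(magnitudes, mc, dm=0.1):
    """Aki/Utsu maximum-likelihood b-value for events at or above the
    completeness magnitude mc, with magnitude bin width dm."""
    m = np.asarray(magnitudes, float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))
```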

  10. Study on a digital pulse processing algorithm based on template-matching for high-throughput spectroscopy

    NASA Astrophysics Data System (ADS)

    Wen, Xianfei; Yang, Haori

    2015-06-01

    A major challenge in utilizing spectroscopy techniques for nuclear safeguards is to perform high-resolution measurements at an ultra-high throughput rate. Traditionally, piled-up pulses are rejected to ensure good energy resolution. To improve the throughput rate, high-pass filters are normally implemented to shorten pulses. However, this reduces the signal-to-noise ratio and causes degradation in energy resolution. In this work, a pulse pile-up recovery algorithm based on template matching proved to be an effective approach to achieve high-throughput gamma-ray spectroscopy. First, the algorithm is discussed in detail. Second, it was successfully utilized to process simulated piled-up pulses from a scintillator detector. Third, it was implemented to analyze high-rate data from a NaI detector, a silicon drift detector and an HPGe detector. The promising results demonstrate the capability of this algorithm to achieve a high throughput rate without significant sacrifice in energy resolution. The performance of the template-matching algorithm was also compared with traditional shaping methods.
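
    One simple way to recover piled-up pulses with a known template is a linear least-squares fit of shifted template copies. The sketch below is an illustration under the assumption that the pulse arrival offsets are already known; it is not the paper's algorithm, and the names are hypothetical.

```python
import numpy as np

def fit_pileup_amplitudes(waveform, template, offsets):
    """Least-squares fit of shifted copies of a unit-amplitude pulse template
    to a piled-up waveform; returns one fitted amplitude per pulse."""
    n = len(waveform)
    cols = []
    for off in offsets:
        col = np.zeros(n)
        seg = template[:max(0, n - off)]
        col[off:off + len(seg)] = seg
        cols.append(col)
    design = np.column_stack(cols)                      # (n, n_pulses)
    amps, *_ = np.linalg.lstsq(design, waveform, rcond=None)
    return amps
```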

  11. Computer-aided detection of pulmonary nodules using dynamic self-adaptive template matching and a FLDA classifier.

    PubMed

    Gong, Jing; Liu, Ji-Yu; Wang, Li-Jia; Zheng, Bin; Nie, Sheng-Dong

    2016-12-01

    Improving the performance of computer-aided detection (CAD) systems for pulmonary nodules is still an important issue for their future clinical applications. This study aims to develop a new CAD scheme for pulmonary nodule detection based on dynamic self-adaptive template matching and a Fisher linear discriminant analysis (FLDA) classifier. We first segment and repair the lung volume by using the Otsu algorithm and three-dimensional (3D) region growing. Next, suspicious regions of interest (ROIs) are extracted and filtered by applying 3D dot filtering and a thresholding method. Then, pulmonary nodule candidates are roughly detected with 3D dynamic self-adaptive template matching. Finally, we optimally select 11 image features and apply the FLDA classifier to reduce false-positive detections. The performance of the new method is validated by comparison with other methods through experiments using two groups of public datasets from the Lung Image Database Consortium (LIDC) and ANODE09. In a 10-fold cross-validation experiment, the new CAD scheme achieved a sensitivity of 90.24% with an average of 4.54 false positives (FPs) per scan for the former dataset, and a sensitivity of 84.1% with 5.59 FPs/scan for the latter. In comparison with other previously reported CAD schemes tested on the same datasets, the new scheme yields higher and more robust results in detecting pulmonary nodules. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
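
    For the false-positive reduction stage, a Fisher linear discriminant can be trained on the selected features; a minimal scikit-learn sketch follows (feature extraction omitted, names hypothetical).

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

def train_fp_reducer(candidate_features, candidate_labels):
    """Fisher/linear discriminant separating true nodules (label 1) from
    false-positive candidates (label 0) using the selected image features."""
    clf = LinearDiscriminantAnalysis()
    clf.fit(candidate_features, candidate_labels)
    return clf

# Usage sketch: keep only the candidates the classifier accepts.
# kept = [c for c, f in zip(candidates, features) if clf.predict([f])[0] == 1]
```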

  12. In-induced reconstructions of Si(1 1 1) as superlattice-matched epitaxial templates for InN growth

    SciTech Connect

    Kuyyalil, Jithesh; Tangi, Malleswararao; Shivaprasad, S.M.

    2013-02-15

    Highlights: ► A novel growth method to form InN at low growth temperatures. ► Use of Si reconstructions as growth templates for group III nitrides. ► Band gap variation of InN – Moss–Burstein shift – non-parabolic conduction band for InN. ► Superlattice-matching epitaxy of metal-induced reconstructions with the III–V unit cell. -- Abstract: Indium-induced surface reconstructions of Si(1 1 1)-7 × 7 are used as templates to grow high-quality InN. We grow InN on Si(1 1 1)-7 × 7, Si(1 1 1)-4 × 1-In and Si(1 1 1)-1 × 1-In reconstructed surfaces and study the quality of the films formed using complementary characterization tools. InN grown on the Si(1 1 1)-1 × 1-In reconstruction shows superior film quality, with the lowest band-edge emission having a narrow full width at half maximum, intense and narrow 0 0 0 2 X-ray diffraction, low surface roughness and a carrier concentration an order of magnitude lower than the other samples. We attribute the high quality of the film formed at 300 °C to the integral matching of the InN and superlattice dimensions, and we also examine the reasons for the band gap variation of InN reported in the literature. The present study demonstrates that the proposed superlattice-matched epitaxy can be a general approach to growing good-quality InN at much lower growth temperatures on compatible In-induced reconstructions of the Si surface.

  13. Mutual information-based template matching scheme for detection of breast masses: from mammography to digital breast tomosynthesis.

    PubMed

    Mazurowski, Maciej A; Lo, Joseph Y; Harrawood, Brian P; Tourassi, Georgia D

    2011-10-01

    Development of a computational decision aid for a new medical imaging modality is typically a long and complicated process. It consists of collecting data in the form of images and annotations, developing image processing and pattern recognition algorithms for analysis of the new images, and finally testing the resulting system. Since new imaging modalities are developed more rapidly than ever before, any effort to decrease the time and cost of this development process could maximize the benefit of the new imaging modality to patients by making computer aids quickly available to the radiologists who interpret the images. In this paper, we make a step in this direction and investigate the possibility of translating knowledge about the detection problem from one imaging modality to another. Specifically, we present a computer-aided detection (CAD) system for mammographic masses that uses a mutual information-based template matching scheme with intelligently selected templates. We have previously presented the principles of template matching with mutual information for mammography; in this paper, we present an implementation of those principles in a complete computer-aided detection system. The proposed system, through an automatic optimization process, chooses the most useful templates (mammographic regions of interest) using a large database of previously collected and annotated mammograms. Through this process, knowledge about the task of detecting masses in mammograms is incorporated into the system. We then evaluate whether our system, developed for screen-film mammograms, can be successfully applied not only to other mammograms but also to digital breast tomosynthesis (DBT) reconstructed slices without adding any DBT cases for training. Our rationale is that since mutual information is known to be a robust inter-modality image similarity measure, it has high potential for transferring knowledge between modalities in the context of mass detection.
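
    The similarity measure at the heart of this scheme is mutual information between a template region and a candidate region. A minimal histogram-based sketch is given below; the bin count and names are assumptions, and this is not the authors' implementation.

```python
import numpy as np

def mutual_information(patch_a, patch_b, bins=32):
    """Histogram-based mutual information (in nats) between two equally
    sized image patches."""
    hist, _, _ = np.histogram2d(patch_a.ravel(), patch_b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))
```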

  14. New, low magnitude earthquake detections in Ireland and neighbouring offshore basins by waveform template matching

    NASA Astrophysics Data System (ADS)

    Arroucau, Pierre; Grannell, James; Lebedev, Sergei; Bean, Chris J.; Möllhoff, Martin; Blake, Tom; Horan, Clare

    2017-04-01

    Earthquake monitoring in intraplate continental interiors requires the detection of low magnitude events in regions that are sometimes poorly instrumented due to low estimated hazard and risk. According to existing catalogues, the seismic activity of Ireland is characterized by low magnitude, infrequent earthquakes. This is expected, as Ireland is located several hundred kilometers away from the closest plate boundaries. However, the lack of seismic activity is still surprising in comparison with that of Great Britain, its closest neighbour. Since Ireland's historical seismic station coverage was significantly sparser than that of Great Britain, a possible instrumental bias has been invoked, but recent results obtained from the analysis of waveforms recorded at dense temporary arrays and new permanent stations tend to confirm the relatively quiet seismogenic behaviour of Ireland's crust. However, classical detection methods are known to fail if site conditions are too noisy, and hence very low magnitude events can still be missed. Such events are of primary importance for seismotectonic studies, so in this work we investigate the possibility of producing new detections by cross-correlating the available continuous waveform data with waveform templates from catalogue earthquakes. Preliminary results show that more than 200 new events can be identified over the past 5 years, which is particularly significant considering the 120 events present in the catalogue for the period 1980-2016. Despite the limitation of the technique to events whose location and source characteristics are close to previously known ones, these results demonstrate that waveform template cross-correlation can successfully be used to lower detection thresholds in a seismically quiet region such as Ireland.

  15. Automated Detection of 50-kHz Ultrasonic Vocalizations Using Template Matching in XBAT

    PubMed Central

    Barker, David J.; Herrera, Christopher; West, Mark O.

    2014-01-01

    Background Ultrasonic vocalizations (USVs) have been utilized to infer animals' affective states in multiple research paradigms including animal models of drug abuse, depression, fear or anxiety disorders, Parkinson's disease, and in studying neural substrates of reward processing. Currently, the analysis of USV data is performed manually and is thus time consuming. New Method The goal of the present study was to develop a method for automated USV recognition using a ‘template detection’ procedure for vocalizations in the 50-kHz range (35-80 kHz). The detector is designed to run within XBAT, a MATLAB graphical user interface and extensible bioacoustics tool developed at Cornell University. Results Results show that this method is capable of detecting >90% of emitted USVs and that time spent collecting data by experimenters is greatly reduced. Comparison with Existing Methods Currently, no viable and publicly available methods exist for the automated detection of USVs. The present method, in combination with the XBAT environment, is ideal for the USV community as it allows others to 1) detect USVs within a user-friendly environment, 2) make improvements to the detector and disseminate them, and 3) develop new tools for analysis within the MATLAB environment. Conclusions The present detector provides an open-source, accurate method for the detection of 50-kHz USVs. Ongoing research will extend the current method for use in the 22-kHz frequency range of ultrasonic vocalizations. Moreover, collaborative efforts among USV researchers might enhance the capabilities of the current detector via changes to the templates and the development of new programs for analysis. PMID:25128724

  16. Detection and Counting of Orchard Trees from VHR Images Using a Geometrical-Optical Model and Marked Template Matching

    NASA Astrophysics Data System (ADS)

    Maillard, Philippe; Gomes, Marília F.

    2016-06-01

    This article presents an original algorithm created to detect and count trees in orchards using very high resolution images. The algorithm is based on an adaptation of the "template matching" image processing approach, in which the template is derived from a "geometrical-optical" model created from a series of parameters, such as illumination angles, maximum and ambient radiance, and tree size specifications. The algorithm is tested on four images from different regions of the world and different crop types. These images all have < 1 meter spatial resolution and were downloaded from the GoogleEarth application. Results show that the algorithm is very efficient at detecting and counting trees as long as their spectral and spatial characteristics are relatively constant. For walnut, mango and orange trees, the overall accuracy was clearly above 90%. However, the overall success rate for apple trees fell under 75%. It appears that the openness of the apple tree crown is most probably responsible for this poorer result. The algorithm is fully explained with a step-by-step description. At this stage, the algorithm still requires quite a bit of user interaction. The automatic determination of most of the required parameters is under development.
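
    As a toy illustration of building a synthetic, model-derived crown template (the paper's geometrical-optical model also uses sun elevation, maximum and ambient radiance, and tree-size specifications), the sketch below shades a disk toward the anti-solar side; every parameter and name is an assumption.

```python
import numpy as np

def crown_template(radius_px, sun_azimuth_deg=135.0, ambient=0.3):
    """Toy crown template: a bright disk shaded toward the anti-solar side,
    zero outside the crown."""
    r = int(radius_px)
    y, x = np.mgrid[-r:r + 1, -r:r + 1]
    inside = x**2 + y**2 <= r**2
    az = np.deg2rad(sun_azimuth_deg)
    shade = np.clip((x * np.cos(az) - y * np.sin(az)) / (2.0 * r) + 0.5, 0.0, 1.0)
    return np.where(inside, ambient + (1.0 - ambient) * shade, 0.0)
```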

  17. Relevance-Based Template Matching for Tracking Targets in FLIR Imagery

    PubMed Central

    Paravati, Gianluca; Esposito, Stefano

    2014-01-01

    One of the main challenges in automatic target tracking applications is represented by the need to maintain a low computational footprint, especially when dealing with real-time scenarios and the limited resources of embedded environments. In this context, significant results can be obtained by using forward-looking infrared sensors capable of providing distinctive features for targets of interest. In fact, due to their nature, forward-looking infrared (FLIR) images lend themselves to being used with extremely small footprint techniques based on the extraction of target intensity profiles. This work proposes a method for increasing the computational efficiency of template-based target tracking algorithms. In particular, the speed of the algorithm is improved by using a dynamic threshold that narrows the number of computations, thus reducing both execution time and resource usage. The proposed approach has been tested on several datasets, and it has been compared to several target tracking techniques. Results gathered from both theoretical analysis and experimental data showed that the proposed approach is able to achieve the same robustness as the reference algorithms while reducing the number of operations needed and the processing time. PMID:25093344
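
    The paper's speed-up comes from a dynamic threshold applied to intensity-profile matching. As a generic illustration of the same idea, abandoning a candidate as soon as its running cost exceeds the best score found so far, here is a sum-of-absolute-differences search with early exit (not the authors' method; names are hypothetical).

```python
import numpy as np

def sad_search_with_early_exit(image, template):
    """Exhaustive sum-of-absolute-differences search in which a candidate
    position is abandoned as soon as its partial SAD exceeds the best score
    found so far (a simple 'dynamic threshold'). Pure-Python loops: slow,
    for illustration only."""
    th, tw = template.shape
    best_score, best_pos = np.inf, None
    for y in range(image.shape[0] - th + 1):
        for x in range(image.shape[1] - tw + 1):
            s = 0.0
            for r in range(th):
                s += np.abs(image[y + r, x:x + tw] - template[r]).sum()
                if s >= best_score:      # give up early on this candidate
                    break
            else:
                best_score, best_pos = s, (x, y)
    return best_pos, best_score
```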

  18. Pulmonary nodule registration in serial CT scans based on rib anatomy and nodule template matching

    PubMed Central

    Shi, Jiazheng; Sahiner, Berkman; Chan, Heang-Ping; Hadjiiski, Lubomir; Zhou, Chuan; Cascade, Philip N.; Bogot, Naama; Kazerooni, Ella A.; Wu, Yi-Ta; Wei, Jun

    2009-01-01

    An automated method is being developed in order to identify corresponding nodules in serial thoracic CT scans for interval change analysis. The method uses the rib centerlines as the reference for initial nodule registration. A spatially adaptive rib segmentation method first locates the regions where the ribs join the spine, which define the starting locations for rib tracking. Each rib is tracked and locally segmented by expectation-maximization. The ribs are automatically labeled, and the centerlines are estimated using skeletonization. For a given nodule in the source scan, the closest three ribs are identified. A three-dimensional (3D) rigid affine transformation guided by simplex optimization aligns the centerlines of each of the three rib pairs in the source and target CT volumes. Automatically defined control points along the centerlines of the three ribs in the source scan and the registered ribs in the target scan are used to guide an initial registration using a second 3D rigid affine transformation. A search volume of interest (VOI) is then located in the target scan. Nodule candidate locations within the search VOI are identified as regions with high Hessian responses. The initial registration is refined by searching for the maximum cross-correlation between the nodule template from the source scan and the candidate locations. The method was evaluated on 48 CT scans from 20 patients. Experienced radiologists identified 101 pairs of corresponding nodules. Three metrics were used for performance evaluation. The first metric was the Euclidean distance between the nodule centers identified by the radiologist and the computer registration, the second metric was a volume overlap measure between the nodule VOIs identified by the radiologist and the computer registration, and the third metric was the hit rate, which measures the fraction of nodules whose centroid computed by the computer registration in the target scan falls within the VOI identified by the radiologist.

  19. Mix and match: templating chiral Schiff base ligands to suit the needs of the metal ion.

    PubMed

    Constable, Edwin C; Zhang, Guoqi; Housecroft, Catherine E; Zampese, Jennifer A

    2010-06-14

    One-pot reactions of 2,2'-bipyridine-6-carbaldehyde, (1S,2S)-(-)-1,2-diphenyl-1,2-diaminoethane and FeCl(2).4H(2)O or Zn(OAc)(2).2H(2)O (2 : 1 : 1) at room temperature in MeOH lead to [Fe{(S,S)-5}(2)][PF(6)]Cl or [Zn{(S,S)-5}(2)][PF(6)](2) in which (S,S)-5 contains an imidazolidine ring, produced by intramolecular cyclization. This has been confirmed with the single-crystal structure of 2{P-[Fe{(S,S)-5}(2)][PF(6)]Cl}.H(2)O. The diastereoselectivity observed in the solid state has been confirmed by NMR spectroscopy for solutions of [Fe{(S,S)-5}(2)][PF(6)]Cl and [Zn{(S,S)-5}(2)][PF(6)](2). At room temperature, a minor product competes with the formation of [Fe{(S,S)-5}(2)][PF(6)]Cl, and the preference for these complexes is switched by carrying out the reaction in MeOH at reflux. In this case the major product is M-[Fe(2){(S,S)-4}(2)][PF(6)](4) in which (S,S)-4 is the hexadentate Schiff base ligand formed by condensation of two equivalents of 2,2'-bipyridine-6-carbaldehyde with (1S,2S)-(-)-1,2-diphenyl-1,2-diaminoethane; the single-crystal structure of 4{M-[Fe(2){(S,S)-4}(2)][PF(6)](4)}.8Me(2)CO.5MeCN.3H(2)O confirms the assembly of a double helicate. When pyridine-6-carbaldehyde replaces 2,2'-bipyridine-6-carbaldehyde in the iron(II)-templated reaction with (1S,2S)-(-)-1,2-diphenyl-1,2-diaminoethane, the product is [Fe{(S,S)-7}(2)][PF(6)](2) (3 : 2 mixture of diastereoisomers in solution) in which (S,S)-7 is an asymmetrical Schiff base, formed by reaction of only one of the amine groups in (1S,2S)-(-)-1,2-diphenyl-1,2-diaminoethane. The solid state structure of P-[Fe{(S,S)-7}(2)][PF(6)](2).MeCN is presented.

  20. Prediction of Signal Peptide Cleavage Sites with Subsite-Coupled and Template Matching Fusion Algorithm.

    PubMed

    Zhang, Shao-Wu; Zhang, Ting-He; Zhang, Jun-Nan; Huang, Yufei

    2014-03-01

    Fast and effective prediction of signal peptides (SP) and their cleavage sites is of great importance in computational biology. The approaches developed to predict signal peptides can be roughly divided into machine-learning-based and sliding-window-based methods. In order to further increase the prediction accuracy and the organism coverage for SP cleavage sites, we propose a novel method for predicting SP cleavage sites, called Signal-CTF, that utilizes machine learning and sliding windows and is designed for N-terminal secretory proteins in a large variety of organisms including human, animal, plant, virus, bacteria, fungi and archaea. Signal-CTF consists of three distinct elements: (1) a subsite-coupled and regularization function with a scaled window of fixed width that selects a set of candidates of possible secretion-cleavable segments for a query secretory protein; (2) a sum fusion system that integrates the outcomes of aligning the cleavage site template sequence with each of the aforementioned candidates in a scaled window of fixed width, to determine the best candidate cleavage sites for the query secretory protein; (3) a voting system that identifies the ultimate signal peptide cleavage site among all possible results derived from using scaled windows of different widths. Compared with the Signal-3L and SignalP 4.0 predictors, the prediction accuracy of Signal-CTF is 4-12% higher than that of Signal-3L for human, animal and eukaryote data, and 10-25% higher than that of SignalP 4.0 for eukaryota, Gram-positive bacteria and Gram-negative bacteria, respectively. Compared with the PRED-SIGNAL and SignalP 4.0 predictors on the 32 archaeal secretory proteins used in Bagos's paper, the prediction accuracy of Signal-CTF is 12.5% and 25% higher than that of PRED-SIGNAL and SignalP 4.0, respectively. The prediction results for several long signal peptides show that Signal-CTF can better predict cleavage sites for long signal peptides than SignalP, Phobius, Philius, SPOCTOPUS, Signal

  1. Matched Template Signal Processing for Continuous Wave Laser Tracking of Space Debris

    NASA Astrophysics Data System (ADS)

    Raj, S.; Ward, R.; Roberts, L.; Fleddermann, R.; Francis, S.; McClellend, D.; Shaddock, D.; Smith, C.

    2016-09-01

    The build-up of space junk in Earth orbit is a growing concern, as debris shares orbits with many currently active satellites. As the number of objects in these orbits increases, the likelihood of collisions between satellites and debris will increase [1]. The eventual goal is to be able to maneuver space debris to avoid such collisions. We at SERC aim to accomplish this by using ground-based laser facilities that are already being used to track space debris orbits. One potential method to maneuver space debris is to use continuous wave lasers, applying photon pressure to the debris in an attempt to change its orbit. However, most current laser ranging facilities operate using pulsed lasers, where a pulse of light is sent out and the time taken for the pulse to return to the telescope after being reflected by the target is measured. If space debris maneuvering is carried out with a continuous wave laser, then two laser sources need to be used, one for ranging and one for maneuvering. The aim of this research is to develop a laser ranging system that is compatible with the continuous wave laser, using the same laser source to simultaneously track and maneuver space debris. We aim to accomplish this by modulating the outgoing laser light with pseudo-random noise (PRN) codes, time tagging the outgoing light, and utilising a matched filter at the receiver end to extract the orbital information of the debris.
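
    The matched-template processing described above reduces to correlating the received signal against the known PRN modulation and converting the best-matching lag into a round-trip delay. A minimal sketch follows (single code period, no Doppler handling; names are hypothetical).

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def prn_range(received, prn_code, fs):
    """Correlate the received signal with the known PRN code and convert the
    peak lag into a round-trip delay and a one-way range estimate."""
    code = prn_code - prn_code.mean()
    cc = np.correlate(received - received.mean(), code, mode="valid")
    lag = int(np.argmax(np.abs(cc)))
    delay_s = lag / fs
    return delay_s, delay_s * C / 2.0, cc
```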

  2. Selecting electrode configurations for image-guided cochlear implant programming using template matching

    NASA Astrophysics Data System (ADS)

    Zhang, Dongqing; Zhao, Yiyuan; Noble, Jack H.; Dawant, Benoit M.

    2017-03-01

    Cochlear implants (CIs) are used to treat patients with severe-to-profound hearing loss. In surgery, an electrode array is implanted in the cochlea. After implantation, the CI processor is programmed by an audiologist. One factor that negatively impacts outcomes and can be addressed by programming is cross-electrode neural stimulation overlap (NSO). In the recent past, we have proposed a system to assist the audiologist in programming the CI that we call Image-Guided CI Programming (IGCIP). IGCIP permits using CT images to detect NSO and recommend which subset of electrodes should be active to avoid NSO. In an ongoing clinical study, we have shown that IGCIP leads to significant improvement in hearing outcomes. Most of the IGCIP steps are robustly automated but electrode configuration selection still sometimes requires expert intervention. With expertise, Distance-Vs-Frequency (DVF) curves, which are a way to visualize the spatial relationship learned from CT between the electrodes and the nerves they stimulate, can be used to select the electrode configuration. In this work, we propose an automated technique for electrode configuration selection. It relies on matching new patients' DVF curves to a library of DVF curves for which electrode configurations are known. We compare this approach to one we have previously proposed. We show that, generally, our new method produces results that are as good as those obtained with our previous one while being generic and requiring fewer parameters.

  3. Direct template matching reveals a host subcellular membrane gyroid cubic structure that is associated with SARS virus.

    PubMed

    Almsherqi, Zakaria A; McLachlan, Craig S; Mossop, Peter; Knoops, Kèvin; Deng, Yuru

    2005-01-01

    Viral infection can result in alterations to the host subcellular membrane. This is often reported when using transmission electron microscopy (TEM), resulting in a description of tubuloreticular membrane subcellular ultrastructure rather than a definition based on 3-D morphology. 2-D TEM micrographs depicting subcellular membrane changes are associated with subcellular SARS virion particles [Goldsmith CS, Tatti KM, Ksiazek TG et al. Ultra-structural characterization of SARS coronavirus. Emerg Infect Dis 2004; 10: 320-326]. In the present study, we have defined the 2-D membrane pattern and shape associated with SARS virus infection, using a direct template matching method to determine the 3-D structure of the SARS-virus-associated host membrane change. For our purposes the TEM image is defined by its 2-D information, such as evidence that the membrane has undergone proliferation, with pattern recognition suggesting that the described membrane pattern is possibly a gyroid type of membrane. Features of the membrane were used to compute and match the gyroid structure with an existing 2-D TEM micrograph, revealing that the membrane structure was indeed a gyroid-based cubic membrane. The 2-D gyroid computer-simulated image that was used to match the electron micrograph of interest was derived from a mathematically well-defined 3-D structure, and it is this 3-D derivative that allows us to make inferences about the 3-D structure of the membrane. In conclusion, we demonstrate that a 3-D structure can be defined from a 2-D membrane-patterned image and that a SARS-virus-associated membrane change has been identified as cubic membrane morphology. Possible mechanisms for this cubic membrane change are discussed with respect to viral severity, persistence and free radical production.

  4. Spinal Cord Segmentation by One Dimensional Normalized Template Matching: A Novel, Quantitative Technique to Analyze Advanced Magnetic Resonance Imaging Data.

    PubMed

    Cadotte, Adam; Cadotte, David W; Livne, Micha; Cohen-Adad, Julien; Fleet, David; Mikulis, David; Fehlings, Michael G

    2015-01-01

    Spinal cord segmentation is a developing area of research intended to aid the processing and interpretation of advanced magnetic resonance imaging (MRI). For example, high resolution three-dimensional volumes can be segmented to provide a measurement of spinal cord atrophy. Spinal cord segmentation is difficult due to the variety of MRI contrasts and the variation in human anatomy. In this study we propose a new method of spinal cord segmentation based on one-dimensional template matching and provide several metrics that can be used to compare with other segmentation methods. A set of ground-truth data from 10 subjects was manually-segmented by two different raters. These ground truth data formed the basis of the segmentation algorithm. A user was required to manually initialize the spinal cord center-line on new images, taking less than one minute. Template matching was used to segment the new cord and a refined center line was calculated based on multiple centroids within the segmentation. Arc distances down the spinal cord and cross-sectional areas were calculated. Inter-rater validation was performed by comparing two manual raters (n = 10). Semi-automatic validation was performed by comparing the two manual raters to the semi-automatic method (n = 10). Comparing the semi-automatic method to one of the raters yielded a Dice coefficient of 0.91 +/- 0.02 for ten subjects, a mean distance between spinal cord center lines of 0.32 +/- 0.08 mm, and a Hausdorff distance of 1.82 +/- 0.33 mm. The absolute variation in cross-sectional area was comparable for the semi-automatic method versus manual segmentation when compared to inter-rater manual segmentation. The results demonstrate that this novel segmentation method performs as well as a manual rater for most segmentation metrics. It offers a new approach to study spinal cord disease and to quantitatively track changes within the spinal cord in an individual case and across cohorts of subjects.
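
    One of the metrics reported above, the Dice coefficient, can be computed from two binary segmentation masks as in the following illustrative sketch.

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice overlap between two binary masks (1.0 means identical)."""
    a, b = np.asarray(mask_a, bool), np.asarray(mask_b, bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0
```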

  5. Syntheses, structures, characterizations and charge-density matching of novel amino-templated uranyl selenates

    SciTech Connect

    Ling Jie; Sigmon, Ginger E.; Burns, Peter C.

    2009-02-15

    Five hybrid organic-inorganic uranyl selenates have been synthesized, characterized and their structures have been determined. The structure of (C2H8N)2[(UO2)2(SeO4)3(H2O)] (EthylAUSe) is monoclinic, P21, a = 8.290(1), b = 12.349(2), c = 11.038(2) Å, β = 104.439(4)°, V = 1094.3(3) Å³, Z = 2, R1 = 0.0425. The structure of (C7H10N)2[(UO2)(SeO4)2(H2O)]·H2O (BenzylAUSe) is orthorhombic, Pna21, a = 24.221(2), b = 11.917(1), c = 7.4528(7) Å, V = 2151.1(3) Å³, Z = 4, R1 = 0.0307. The structure of (C2H10N2)[(UO2)(SeO4)2(H2O)](H2O)2 (EDAUSe) is monoclinic, P21/c, a = 11.677(2), b = 7.908(1), c = 15.698(2) Å, β = 98.813(3)°, V = 1432.4(3) Å³, Z = 4, R1 = 0.0371. The structure of (C6H22N4)[(UO2)(SeO4)2(H2O)](H2O) (TETAUSe) is monoclinic, P21/n, a = 13.002(2), b = 7.962(1), c = 14.754(2) Å, β = 114.077(2)°, V = 1394.5(3) Å³, Z = 4, R1 = 0.0323. The structure of (C6H21N4)[(UO2)(SeO4)2(HSeO4)] (TAEAUSe) is monoclinic, P21/m, a = 9.2218(6), b = 12.2768(9), c = 9.4464(7) Å, β = 116.1650(10)°, V = 959.88(12) Å³, Z = 2, R1 = 0.0322. The inorganic structural units in these compounds are composed of uranyl pentagonal bipyramids and selenate tetrahedra. In each case, tetrahedra link bipyramids through vertex-sharing, resulting in chain or sheet topologies. The charge-density matching principle is discussed relative to the orientations of the organic molecules between the inorganic structural units. Graphical abstract: The structures of five new inorganic-organic hybrid uranyl selenates present new structural topologies based upon chains and sheets of uranyl pentagonal bipyramids and selenate tetrahedra.

  6. Waveform Template Matching and Analysis of Hydroacoustic Events from the April-May 2015 Eruption of Axial Volcano

    NASA Astrophysics Data System (ADS)

    Mann, M. E.; Bohnenstiehl, D. R.; Weis, J.

    2016-12-01

    The submarine emplacement of new lava flows during the 2015 eruption of Axial Volcano generated a series of impulsive acoustic signals that were captured by seismic and hydrophone sensors deployed as part of the Ocean Observatories Initiative cabled array network. A catalog of >37,000 explosions was created using a four-channel waveform matching routine with 800 template arrivals. Most of the explosions are sourced from a set of lava mounds erupted along the volcano's northern rift; however, a subset of 400 explosions are located within the caldera and track the flow of lava from a vent near its eastern rim. The earliest explosion occurs at 08:00 UTC on April 24, approximately four hours after the seismicity rate began to increase and two hours after bottom pressure recorders indicate the caldera floor began to subside. Between April 24 and 28, event rates are sustained at 1000/day. The rate then decreases gradually, with explosive activity ending on 21 May, coincident with the initial re-inflation of the caldera. The windowed coefficient of variation of the inter-event time is approximately 1 throughout the eruption, consistent with a random process. The size-frequency distribution shows a bimodal pattern, with the loudest explosions, having received levels up to 157 dB re 1 micro-Pa, being produced during the first few hours of the eruption.

  7. Registration of FA and T1-weighted MRI data of healthy human brain based on template matching and normalized cross-correlation.

    PubMed

    Malinsky, Milos; Peter, Roman; Hodneland, Erlend; Lundervold, Astri J; Lundervold, Arvid; Jan, Jiri

    2013-08-01

    In this work, we propose a new approach for three-dimensional registration of MR fractional anisotropy images with T1-weighted anatomy images of the human brain. From the clinical point of view, this accurate coregistration allows precise detection of nerve fibers, which is essential in neuroscience. A template matching algorithm combined with normalized cross-correlation was used for this registration task. To show the suitability of the proposed method, it was compared with the normalized mutual information-based B-spline registration provided by the Elastix software library, considered a reference method. We also propose a general framework for the evaluation of the robustness and reliability of both registration methods. Both registration methods were tested by four evaluation criteria on a dataset consisting of 74 healthy subjects. The template matching algorithm has shown more reliable results than the reference method in registration of the MR fractional anisotropy and T1 anatomy image data. Significant differences were observed in the splenium and genu of the corpus callosum, considered very important areas of brain connectivity. We demonstrate that, in this registration task, the currently used mutual information-based parametric registration can be replaced by more accurate local template matching utilizing the normalized cross-correlation similarity measure.
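
    The similarity measure used here is normalized cross-correlation between a template patch and a candidate patch; a minimal definition for equally sized 2-D or 3-D patches is sketched below (illustrative only, not the authors' implementation).

```python
import numpy as np

def ncc(patch_a, patch_b):
    """Normalized cross-correlation between two equally sized patches,
    in the range [-1, 1]."""
    a = patch_a - patch_a.mean()
    b = patch_b - patch_b.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b)
    return float((a * b).sum() / denom) if denom else 0.0
```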

  8. A Combined One-Class SVM and Template-Matching Approach for User-Aided Human Fall Detection by Means of Floor Acoustic Features

    PubMed Central

    Ferretti, Daniele; Piazza, Francesco

    2017-01-01

    Falls are the primary cause of injury-related death among the elderly. The scientific community has devoted particular attention to them, since injuries can be limited by early detection of the event. The solution proposed in this paper is based on a combined One-Class SVM (OCSVM) and template-matching classifier that discriminates human falls from nonfalls in a semisupervised framework. Acoustic signals are captured by means of a Floor Acoustic Sensor; then Mel-Frequency Cepstral Coefficients and Gaussian Mean Supervectors (GMSs) are extracted for the fall/nonfall discrimination. Here we propose a single-sensor, two-stage, user-aided approach: in the first stage, the OCSVM detects abnormal acoustic events. In the second, the template-matching classifier produces the final decision by exploiting a set of template GMSs related to the events marked as false positives by the user. The performance of the algorithm has been evaluated on a corpus containing human falls and nonfall sounds. Compared to the OCSVM-only approach, the proposed algorithm improves the performance by 10.14% in clean conditions and 4.84% in noisy conditions. Compared to Popescu and Mahnot (2009), the performance improvement is 19.96% in clean conditions and 8.08% in noisy conditions. PMID:28638405
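
    A minimal sketch of such a two-stage decision rule is given below. The feature extractor, the OCSVM hyperparameters, the use of cosine similarity and the similarity threshold are placeholder assumptions for illustration; they are not the parameters used in the paper.

        import numpy as np
        from sklearn.svm import OneClassSVM

        def extract_supervector(clip):
            """Hypothetical front end: returns one GMS-like feature vector per audio clip."""
            raise NotImplementedError

        def train_stage1(normal_clips):
            """Stage 1: one-class model of ordinary (non-fall) floor sounds."""
            X = np.vstack([extract_supervector(c) for c in normal_clips])
            return OneClassSVM(kernel="rbf", nu=0.05, gamma="scale").fit(X)

        def classify(clip, ocsvm, fp_templates, sim_thresh=0.9):
            """Stage 2: abnormal events matching a user-confirmed false-positive
            template are reclassified as nonfalls; the rest are reported as falls."""
            x = extract_supervector(clip)
            if ocsvm.predict(x[None, :])[0] == 1:          # inlier -> ordinary sound
                return "nonfall"
            for t in fp_templates:                         # similarity to user templates
                sim = x @ t / (np.linalg.norm(x) * np.linalg.norm(t) + 1e-12)
                if sim >= sim_thresh:
                    return "nonfall"
            return "fall"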

  9. Megathrust Earthquake Swarms Contemporaneous to Slow Slip and Non-Volcanic Tremor in Southern Mexico, Detected and Analyzed through a Template Matching Approach

    NASA Astrophysics Data System (ADS)

    Holtkamp, S.; Brudzinski, M. R.; Cabral-Cano, E.; Arciniega-Ceballos, A.

    2012-12-01

    An outstanding question in geophysics is the degree to which the newly discovered types of slow fault slip are related to their destructive cousin - the earthquake. Here, we utilize a local network along the Oaxacan segment of the Middle American subduction zone to investigate the potential relationship between slow slip, non-volcanic tremor (NVT), and earthquakes along the subduction megathrust. We have developed a multi-station "template matching" waveform cross correlation technique which is able to detect and locate events several orders of magnitude smaller than would be possible using more traditional techniques. Our template matching procedure is also capable of consistently locating events which occur during periods of increased background activity (e.g., during productive NVT, loud cultural noise, or after larger earthquakes) because the multi-station detector is finely tuned to events with similar hypocentral location and focal mechanism. The local network in the Oaxaca region allows us to focus on documented megathrust earthquake swarms, of interest because slow slip is hypothesized to be the cause of earthquake swarms in some tectonic environments. We identify a productive earthquake swarm in July 2006 (~600 similar earthquakes detected), which occurred during a week-long episode of productive tremor and slow slip. Families of events in this sequence were also active during larger and longer slow slip events, which provides a potential link between slow slip in the transition zone and earthquakes at the downdip end of the seismogenic portion of the megathrust. Because template matching techniques only detect similar signals, detected waveforms can be stacked together to produce higher signal to noise ratios or cross correlated against each other to produce precise relative phase arrival times. We are using the refined signals to look for evidence of expansion or propagation of hypocenters during these earthquake swarms, which could be used as a
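
    The core of a multi-station detector of this kind is a per-channel sliding normalized cross-correlation averaged over the network and thresholded. The sketch below is illustrative only; the plain averaging and the fixed threshold are assumptions, not the detection criterion used in this study.

        import numpy as np

        def sliding_ncc(trace, template):
            """Normalized cross-correlation of a 1-D template slid along a longer trace."""
            n = template.size
            t = (template - template.mean()) / (template.std() * n)
            out = np.empty(trace.size - n + 1)
            for i in range(out.size):
                w = trace[i:i + n]
                sd = w.std()
                out[i] = ((w - w.mean()) * t).sum() / sd if sd > 0 else 0.0
            return out

        def network_detect(traces, templates, thresh=0.6):
            """Average the per-channel correlation functions over all stations/components
            and flag samples where the stacked coefficient exceeds `thresh`."""
            ccs = [sliding_ncc(tr, tp) for tr, tp in zip(traces, templates)]
            stack = np.vstack(ccs).mean(axis=0)
            return np.flatnonzero(stack >= thresh), stack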

  10. An automatic electroencephalography blinking artefact detection and removal method based on template matching and ensemble empirical mode decomposition.

    PubMed

    Bizopoulos, Paschalis A; Al-Ani, Tarik; Tsalikakis, Dimitrios G; Tzallas, Alexandros T; Koutsouris, Dimitrios D; Fotiadis, Dimitrios I

    2013-01-01

    Electrooculographic (EOG) artefacts are one of the most common causes of Electroencephalogram (EEG) distortion. In this paper, we propose a method for detecting and removing EOG Blinking Artefacts (BAs) from EEG. The Normalized Correlation Coefficient (NCC), based on a predetermined BA template library, was used for detecting the BAs. Ensemble Empirical Mode Decomposition (EEMD) was applied to the contaminated region, and a statistical algorithm determined which Intrinsic Mode Functions (IMFs) correspond to the BA. The proposed method was applied to simulated EEG signals contaminated with artificially created EOG BAs, increasing the Signal-to-Error Ratio (SER) of the EEG Contaminated Region (CR) by 35 dB on average.
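
    The abstract does not give the exact SER definition; a common form for the contaminated region, consistent with the dB figure quoted above, is

        SER = 10 \log_{10} \frac{\sum_{n \in CR} x_{clean}[n]^{2}}{\sum_{n \in CR} \bigl( x_{clean}[n] - \hat{x}[n] \bigr)^{2}} \ dB,

    where x_clean is the artefact-free EEG within the contaminated region and \hat{x} is the reconstructed signal after blink-artefact removal.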

  11. Source mechanism of small long-period events at Mount St. Helens in July 2005 using template matching, phase-weighted stacking, and full-waveform inversion

    USGS Publications Warehouse

    Matoza, Robin S.; Chouet, Bernard A.; Dawson, Phillip B.; Shearer, Peter M.; Haney, Matthew M.; Waite, Gregory P.; Moran, Seth C.; Mikesell, T. Dylan

    2015-01-01

    Long-period (LP, 0.5-5 Hz) seismicity, observed at volcanoes worldwide, is a recognized signature of unrest and eruption. Cyclic LP “drumbeating” was the characteristic seismicity accompanying the sustained dome-building phase of the 2004–2008 eruption of Mount St. Helens (MSH), WA. However, together with the LP drumbeating was a near-continuous, randomly occurring series of tiny LP seismic events (LP “subevents”), which may hold important additional information on the mechanism of seismogenesis at restless volcanoes. We employ template matching, phase-weighted stacking, and full-waveform inversion to image the source mechanism of one multiplet of these LP subevents at MSH in July 2005. The signal-to-noise ratios of the individual events are too low to produce reliable waveform-inversion results, but the events are repetitive and can be stacked. We apply network-based template matching to 8 days of continuous velocity waveform data from 29 June to 7 July 2005 using a master event to detect 822 network triggers. We stack waveforms for 359 high-quality triggers at each station and component, using a combination of linear and phase-weighted stacking to produce clean stacks for use in waveform inversion. The derived source mechanism points to the volumetric oscillation (~10 m3) of a subhorizontal crack located at shallow depth (~30 m) in an area to the south of Crater Glacier in the southern portion of the breached MSH crater. A possible excitation mechanism is the sudden condensation of metastable steam from a shallow pressurized hydrothermal system as it encounters cool meteoric water in the outer parts of the edifice, perhaps supplied from snow melt.

  12. Source mechanism of small long-period events at Mount St. Helens in July 2005 using template matching, phase-weighted stacking, and full-waveform inversion

    NASA Astrophysics Data System (ADS)

    Matoza, Robin S.; Chouet, Bernard A.; Dawson, Phillip B.; Shearer, Peter M.; Haney, Matthew M.; Waite, Gregory P.; Moran, Seth C.; Mikesell, T. Dylan

    2015-09-01

    Long-period (LP, 0.5-5 Hz) seismicity, observed at volcanoes worldwide, is a recognized signature of unrest and eruption. Cyclic LP "drumbeating" was the characteristic seismicity accompanying the sustained dome-building phase of the 2004-2008 eruption of Mount St. Helens (MSH), WA. However, together with the LP drumbeating was a near-continuous, randomly occurring series of tiny LP seismic events (LP "subevents"), which may hold important additional information on the mechanism of seismogenesis at restless volcanoes. We employ template matching, phase-weighted stacking, and full-waveform inversion to image the source mechanism of one multiplet of these LP subevents at MSH in July 2005. The signal-to-noise ratios of the individual events are too low to produce reliable waveform inversion results, but the events are repetitive and can be stacked. We apply network-based template matching to 8 days of continuous velocity waveform data from 29 June to 7 July 2005 using a master event to detect 822 network triggers. We stack waveforms for 359 high-quality triggers at each station and component, using a combination of linear and phase-weighted stacking to produce clean stacks for use in waveform inversion. The derived source mechanism points to the volumetric oscillation (˜10 m3) of a subhorizontal crack located at shallow depth (˜30 m) in an area to the south of Crater Glacier in the southern portion of the breached MSH crater. A possible excitation mechanism is the sudden condensation of metastable steam from a shallow pressurized hydrothermal system as it encounters cool meteoric water in the outer parts of the edifice, perhaps supplied from snow melt.
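
    Phase-weighted stacking down-weights samples where the instantaneous phase is incoherent across the repeating events, which is what makes stacks of such low-SNR multiplets usable for waveform inversion. A minimal sketch of the standard scheme (Schimmel and Paulssen, 1997) follows; the traces are assumed to be already aligned on the template-matching picks, and the exponent is a common choice rather than the value used in this study.

        import numpy as np
        from scipy.signal import hilbert

        def phase_weighted_stack(traces, nu=2.0):
            """Phase-weighted stack of aligned traces with shape (n_events, n_samples).

            The linear stack is multiplied by the coherence of the instantaneous
            phase across events, raised to the power `nu`."""
            traces = np.asarray(traces, dtype=float)
            linear = traces.mean(axis=0)
            phasors = np.exp(1j * np.angle(hilbert(traces, axis=1)))   # unit phasors
            coherence = np.abs(phasors.mean(axis=0))                   # 1 = fully coherent
            return linear * coherence ** nu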

  13. Computerized detection of breast lesions in multi-centre and multi-instrument DCE-MR data using 3D principal component maps and template matching

    NASA Astrophysics Data System (ADS)

    Ertas, Gokhan; Doran, Simon; Leach, Martin O.

    2011-12-01

    In this study, we introduce a novel, robust and accurate computerized algorithm based on volumetric principal component maps and template matching that facilitates lesion detection on dynamic contrast-enhanced MR. The study dataset comprises 24 204 contrast-enhanced breast MR images corresponding to 4034 axial slices from 47 women categorized as high risk in the UK multi-centre study of MRI screening for breast cancer. The scans analysed here were performed on six different models of scanner from three commercial vendors, sited in 13 clinics around the UK. 1952 slices from this dataset, containing 15 benign and 13 malignant lesions, were used for training. The remaining 2082 slices, with 14 benign and 12 malignant lesions, were used for test purposes. To prevent false positives being detected from other tissues and regions of the body, breast volumes are segmented from pre-contrast images using a fast semi-automated algorithm. Principal component analysis is applied to the centred intensity vectors formed from the dynamic contrast-enhanced T1-weighted images of the segmented breasts, followed by automatic thresholding to eliminate fatty tissues and slowly enhancing normal parenchyma and a convolution and filtering process to minimize artefacts from moderately enhanced normal parenchyma and blood vessels. Finally, suspicious lesions are identified through a volumetric sixfold neighbourhood connectivity search and calculation of two morphological features: volume and volumetric eccentricity, to exclude highly enhanced blood vessels, nipples and normal parenchyma and to localize lesions. This provides satisfactory lesion localization. For a detection sensitivity of 100%, the overall false-positive detection rate of the system is 1.02/lesion, 1.17/case and 0.08/slice, comparing favourably with previous studies. This approach may facilitate detection of lesions in multi-centre and multi-instrument dynamic contrast-enhanced breast MR data.
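
    The principal-component-map step can be sketched as follows (NumPy/scikit-learn). The array layout, masking and number of components are illustrative assumptions, and the subsequent thresholding, convolution and morphological stages of the pipeline are omitted.

        import numpy as np
        from sklearn.decomposition import PCA

        def principal_component_maps(dce, mask, n_components=3):
            """Volumetric principal-component maps from a DCE-MR series.

            dce  : array (n_timepoints, nz, ny, nx) of dynamic T1-weighted volumes.
            mask : boolean array (nz, ny, nx) of the segmented breast volume.
            Strongly enhancing lesions tend to project heavily onto the leading components."""
            t, nz, ny, nx = dce.shape
            X = dce.reshape(t, -1)[:, mask.ravel()].T        # voxels x timepoints
            X = X - X.mean(axis=1, keepdims=True)            # centre each intensity vector
            scores = PCA(n_components=n_components).fit_transform(X)
            maps = np.zeros((n_components, nz * ny * nx))
            maps[:, mask.ravel()] = scores.T
            return maps.reshape(n_components, nz, ny, nx)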

  14. High level nuclear waste

    SciTech Connect

    Crandall, J L

    1980-01-01

    The DOE Division of Waste Products through a lead office at Savannah River is developing a program to immobilize all US high-level nuclear waste for terminal disposal. DOE high-level wastes include those at the Hanford Plant, the Idaho Chemical Processing Plant, and the Savannah River Plant. Commercial high-level wastes, for which DOE is also developing immobilization technology, include those at the Nuclear Fuel Services Plant and any future commercial fuels reprocessing plants. The first immobilization plant is to be the Defense Waste Processing Facility at Savannah River, scheduled for 1983 project submission to Congress and 1989 operation. Waste forms are still being selected for this plant. Borosilicate glass is currently the reference form, but alternate candidates include concretes, calcines, other glasses, ceramics, and matrix forms.

  15. Correlator bank detection of gravitational wave chirps—False-alarm probability, template density, and thresholds: Behind and beyond the minimal-match issue

    NASA Astrophysics Data System (ADS)

    Croce, R. P.; Demma, Th.; Longo, M.; Marano, S.; Matta, V.; Pierro, V.; Pinto, I. M.

    2004-12-01

    The general problem of computing the false-alarm probability vs the detection-threshold relationship for a bank of correlators is addressed, in the context of maximum-likelihood detection of gravitational waves in additive stationary Gaussian noise. Specific reference is made to chirps from coalescing binary systems. Accurate (lower-bound) approximants for the cumulative distribution of the whole-bank supremum are deduced from a class of Bonferroni-type inequalities. The asymptotic properties of the cumulative distribution are obtained, in the limit where the number of correlators goes to infinity. The validity of numerical simulations made on small-size banks is extended to banks of any size, via a Gaussian-correlation inequality. The result is used to readdress the problem of relating the template density to the fraction of potentially observable sources which could be dismissed as an effect of template space discreteness.
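
    For orientation, the lowest-order members of the Bonferroni family bound the whole-bank false-alarm probability by the single-correlator ones (the paper's approximants refine the corresponding bound on the cumulative distribution of the supremum):

        \max_{i} P(X_i > \gamma) \;\le\; P\Bigl(\sup_{1 \le i \le N} X_i > \gamma\Bigr) \;\le\; \sum_{i=1}^{N} P(X_i > \gamma),

    where X_i is the output of the i-th correlator, \gamma is the detection threshold, and N is the number of templates in the bank.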

  16. Low-resolution gamma-ray spectrometry for an information barrier based on a multi-criteria template-matching approach

    NASA Astrophysics Data System (ADS)

    Göttsche, Malte; Schirm, Janet; Glaser, Alexander

    2016-12-01

    Gamma-ray spectrometry has been successfully employed to identify unique items containing special nuclear materials. Template information barriers have been developed in the past to confirm items as warheads by comparing their gamma signature to the signature of true warheads. Their development has, however, not been fully transparent, and they may not be sensitive to some relevant evasion scenarios. We develop a fully open template information barrier concept, based on low-resolution measurements, which, by design, reduces the extent of revealed sensitive information. The concept is based on three signatures of an item to be compared to a recorded template. The similarity of the spectrum is assessed by a modification of the Kolmogorov-Smirnov test to confirm the isotopic composition. The total gamma count rate must agree with the template as a measure of the projected surface of the object. In order to detect the diversion of fissile material from the interior of an item, a polyethylene mask is placed in front of the detector. Neutrons from spontaneous and induced fission events in the item produce 2.223 MeV gamma rays from neutron capture by hydrogen-1 in the mask. This peak is detected and its intensity scales with the item's fissile mass. The analysis based on MCNP Monte Carlo simulations of various plutonium configurations suggests that this concept can distinguish a valid item from a variety of invalid ones. The concept intentionally avoids any assumptions about specific spectral features, such as looking for specific gamma peaks of specific isotopes, thereby facilitating a fully unclassified discussion. By making all aspects public and allowing interested participants to contribute to the development and benchmarking, we enable a more open and inclusive discourse on this matter.
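
    A plain (unmodified) Kolmogorov-Smirnov distance between the recorded spectrum and the stored template illustrates the spectral-shape criterion; the pass/fail bound and the paper's specific modification of the test are not reproduced here.

        import numpy as np

        def ks_distance(item_counts, template_counts):
            """KS-style distance between two low-resolution gamma spectra
            (counts per channel), compared as unit-area cumulative curves."""
            p = np.asarray(item_counts, dtype=float)
            q = np.asarray(template_counts, dtype=float)
            return float(np.max(np.abs(np.cumsum(p) / p.sum() - np.cumsum(q) / q.sum())))

        # An item passes the spectral test if ks_distance(...) stays below an agreed bound;
        # additionally its total count rate and the 2.223 MeV capture-peak intensity
        # must agree with the template within set tolerances.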

  17. A fast template periodogram

    NASA Astrophysics Data System (ADS)

    Hoffman, John; VanderPlas, Jake; Hartman, Joel; Bakos, Gáspár

    2017-09-01

    This proceedings contribution presents a novel, non-linear extension to the Lomb-Scargle periodogram that allows periodograms to be generated for arbitrary signal shapes. Such periodograms are already known as "template periodograms" or "periodic matched filters," but current implementations are computationally inefficient. The "fast template periodogram" presented here improves existing techniques by a factor of ~a few for small test cases (O(10) observations), and over three orders of magnitude for lightcurves containing O(10^4) observations. The fast template periodogram scales asymptotically as O(H N_f log(H N_f) + H^4 N_f), where H denotes the number of harmonics required to adequately approximate the template and N_f is the number of trial frequencies. Existing implementations scale as O(N_obs N_f), where N_obs is the number of observations in the lightcurve. An open source Python implementation is available on GitHub.

  18. Nodule detection in a lung region that's segmented with using genetic cellular neural networks and 3D template matching with fuzzy rule based thresholding.

    PubMed

    Ozekes, Serhat; Osman, Onur; Ucan, Osman N

    2008-01-01

    The purpose of this study was to develop a new method for automated lung nodule detection in serial section CT images, using the characteristics of the 3D appearance of the nodules that distinguish them from the vessels. Lung nodules were detected in four steps. First, to reduce the number of regions of interest (ROIs) and the computation time, the lung regions of the CTs were segmented using Genetic Cellular Neural Networks (G-CNN). Then, for each lung region, ROIs were specified using an 8-directional search; +1 or -1 values were assigned to each voxel. The 3D ROI image was obtained by combining all the 2-Dimensional (2D) ROI images. A 3D template was created to find the nodule-like structures on the 3D ROI image. Convolution of the 3D ROI image with the proposed template strengthens the shapes that are similar to those of the template and weakens the other ones. Finally, fuzzy rule based thresholding was applied and the ROIs were found. To test the system's efficiency, we used 16 cases with a total of 425 slices, which were taken from the Lung Image Database Consortium (LIDC) dataset. The computer aided diagnosis (CAD) system achieved 100% sensitivity with 13.375 FPs per case when the nodule thickness was greater than or equal to 5.625 mm. Our results indicate that the detection performance of our algorithm is satisfactory, and this may well improve the performance of computer-aided detection of lung nodules.

  19. Nodule Detection in a Lung Region that's Segmented with Using Genetic Cellular Neural Networks and 3D Template Matching with Fuzzy Rule Based Thresholding

    PubMed Central

    Osman, Onur; Ucan, Osman N.

    2008-01-01

    Objective The purpose of this study was to develop a new method for automated lung nodule detection in serial section CT images, using the characteristics of the 3D appearance of the nodules that distinguish them from the vessels. Materials and Methods Lung nodules were detected in four steps. First, to reduce the number of regions of interest (ROIs) and the computation time, the lung regions of the CTs were segmented using Genetic Cellular Neural Networks (G-CNN). Then, for each lung region, ROIs were specified using an 8-directional search; +1 or -1 values were assigned to each voxel. The 3D ROI image was obtained by combining all the 2-Dimensional (2D) ROI images. A 3D template was created to find the nodule-like structures on the 3D ROI image. Convolution of the 3D ROI image with the proposed template strengthens the shapes that are similar to those of the template and weakens the other ones. Finally, fuzzy rule based thresholding was applied and the ROIs were found. To test the system's efficiency, we used 16 cases with a total of 425 slices, which were taken from the Lung Image Database Consortium (LIDC) dataset. Results The computer aided diagnosis (CAD) system achieved 100% sensitivity with 13.375 FPs per case when the nodule thickness was greater than or equal to 5.625 mm. Conclusion Our results indicate that the detection performance of our algorithm is satisfactory, and this may well improve the performance of computer-aided detection of lung nodules. PMID:18253070
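
    The template-convolution step can be illustrated with a generic spherical kernel (SciPy); the actual 3D template and the fuzzy rules that set the final threshold are specific to the paper and are replaced here by simple assumptions.

        import numpy as np
        from scipy import ndimage

        def spherical_template(radius):
            """Zero-mean spherical 3D kernel; uniform regions then give zero response."""
            r = int(radius)
            z, y, x = np.mgrid[-r:r + 1, -r:r + 1, -r:r + 1]
            sphere = (z ** 2 + y ** 2 + x ** 2 <= r ** 2).astype(float)
            return sphere - sphere.mean()

        def nodule_response(roi_volume, radius=3):
            """Convolve the +1/-1 3D ROI image with the template: compact, nodule-like
            structures produce large responses, elongated vessel-like ones smaller responses."""
            return ndimage.convolve(roi_volume.astype(float),
                                    spherical_template(radius), mode="constant")

        def candidates(response, thresh):
            """A fixed threshold stands in for the paper's fuzzy rule based thresholding."""
            return np.argwhere(response >= thresh)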

  20. Optimizing High Level Waste Disposal

    SciTech Connect

    Dirk Gombert

    2005-09-01

    If society is ever to reap the potential benefits of nuclear energy, technologists must close the fuel cycle completely. A closed cycle equates to a continued supply of fuel and safe reactors, but also reliable and comprehensive closure of waste issues. High level waste (HLW) disposal in borosilicate glass (BSG) is based on 1970s era evaluations. This host matrix is very adaptable to sequestering a wide variety of radionuclides found in raffinates from spent fuel reprocessing. However, it is now known that the current system is far from optimal for disposal of the diverse HLW streams, and proven alternatives are available to reduce costs by billions of dollars. The basis for HLW disposal should be reassessed to consider extensive waste form and process technology research and development efforts, which have been conducted by the United States Department of Energy (USDOE), international agencies and the private sector. Matching the waste form to the waste chemistry and using currently available technology could increase the waste content in waste forms to 50% or more and double processing rates. Optimization of the HLW disposal system would accelerate HLW disposition and increase repository capacity. This does not necessarily require developing new waste forms; the emphasis should be on qualifying existing matrices to demonstrate protection equal to or better than the baseline glass performance. Also, this proposed effort does not necessarily require developing new technology concepts. The emphasis is on demonstrating existing technology that is clearly better (reliability, productivity, cost) than current technology, and justifying its use in future facilities or retrofitted facilities. Higher waste processing and disposal efficiency can be realized by performing the engineering analyses and trade-studies necessary to select the most efficient methods for processing the full spectrum of wastes across the nuclear complex. This paper will describe technologies being

  1. Genotyping and interpretation of STR-DNA: Low-template, mixtures and database matches-Twenty years of research and development.

    PubMed

    Gill, Peter; Haned, Hinda; Bleka, Oyvind; Hansson, Oskar; Dørum, Guro; Egeland, Thore

    2015-09-01

    The introduction of Short Tandem Repeat (STR) DNA was a revolution within a revolution that transformed forensic DNA profiling into a tool that could be used, for the first time, to create National DNA databases. This transformation would not have been possible without the concurrent development of fluorescent automated sequencers, combined with the ability to multiplex several loci together. Use of the polymerase chain reaction (PCR) increased the sensitivity of the method to enable the analysis of a handful of cells. The first multiplexes were simple: 'the quad', introduced by the defunct UK Forensic Science Service (FSS) in 1994, rapidly followed by a more discriminating 'six-plex' (Second Generation Multiplex) in 1995 that was used to create the world's first national DNA database. The success of the database rapidly outgrew the functionality of the original system - by the year 2000 a new multiplex of ten loci was introduced to reduce the chance of adventitious matches. The technology was adopted world-wide, albeit with different loci. The political requirement to introduce pan-European databases encouraged standardisation - the development of the European Standard Set (ESS) of markers comprising twelve loci is the latest iteration. Although development has been impressive, the methods used to interpret evidence have lagged behind. For example, the theory to interpret complex DNA profiles (low-level mixtures) was developed fifteen years ago, but only in the past year or so have the concepts started to be widely adopted. A plethora of different models (some commercial and others non-commercial) have appeared. This has led to a confusing 'debate' about which is 'best' to use. The different models available are described along with their advantages and disadvantages. A section discusses the development of national DNA databases, along with details of an associated controversy about estimating the strength of evidence of matches. Current methodology is limited to

  2. Biometric template transformation: a security analysis

    NASA Astrophysics Data System (ADS)

    Nagar, Abhishek; Nandakumar, Karthik; Jain, Anil K.

    2010-01-01

    One of the critical steps in designing a secure biometric system is protecting the templates of the users that are stored either in a central database or on smart cards. If a biometric template is compromised, it leads to serious security and privacy threats because, unlike passwords, it is not possible for a legitimate user to revoke his biometric identifiers and switch to another set of uncompromised identifiers. One methodology for biometric template protection is the template transformation approach, where the template, consisting of the features extracted from the biometric trait, is transformed using parameters derived from a user-specific password or key. Only the transformed template is stored and matching is performed directly in the transformed domain. In this paper, we formally investigate the security strength of template transformation techniques and define six metrics that facilitate a holistic security evaluation. Furthermore, we analyze the security of two well-known template transformation techniques, namely, Biohashing and cancelable fingerprint templates, based on the proposed metrics. Our analysis indicates that both these schemes are vulnerable to intrusion and linkage attacks because it is relatively easy to obtain either a close approximation of the original template (Biohashing) or a pre-image of the transformed template (cancelable fingerprints). We argue that the security strength of template transformation techniques must also consider the computational complexity of obtaining a complete pre-image of the transformed template in addition to the complexity of recovering the original biometric template.
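
    For context, a minimal BioHashing-style transform projects the feature vector onto a user-key-derived random orthonormal basis and binarizes the result; matching is then a Hamming-distance comparison in the transformed domain. The dimensions, thresholds and key handling below are illustrative assumptions, not the schemes analyzed in the paper.

        import numpy as np

        def biohash(feature_vec, user_key, n_bits=64):
            """Transformed template: sign pattern of projections onto a random
            orthonormal basis seeded by a user-specific integer key (revocable by re-keying).
            Assumes feature_vec.size >= n_bits."""
            rng = np.random.default_rng(user_key)
            q, _ = np.linalg.qr(rng.standard_normal((feature_vec.size, n_bits)))
            return (feature_vec @ q > 0).astype(np.uint8)

        def match(bits_a, bits_b, max_hamming=10):
            """Matching is performed directly in the transformed (binary) domain."""
            return int(np.count_nonzero(bits_a != bits_b)) <= max_hamming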

  3. Biometric template revocation

    NASA Astrophysics Data System (ADS)

    Arndt, Craig M.

    2004-08-01

    Biometrics are a powerful technology for identifying humans both locally and at a distance. In order to perform identification or verification, biometric systems capture an image of some biometric of a user or subject. The image is then converted mathematically to a representation of the person called a template. Since every human in the world is different, each human will have different biometric images (different fingerprints, faces, etc.). This is what makes biometrics useful for identification. However, unlike a credit card number or a password, which can be given to a person and later revoked if it is compromised, a biometric is with the person for life. The problem then is to develop biometric templates which can be easily revoked and reissued, which are unique to the user, and which can be easily used for identification and verification. In this paper we develop and present a method to generate a set of templates which are fully unique to the individual and also revocable. By using basis set compression algorithms in an n-dimensional orthogonal space, we can represent a given biometric image in an infinite number of equally valued and unique ways. The verification and biometric matching system would be presented with a given template and revocation code. The code then represents where in the sequence of n-dimensional vectors to start the recognition.

  4. High-Level Radioactive Waste.

    ERIC Educational Resources Information Center

    Hayden, Howard C.

    1995-01-01

    Presents a method to calculate the amount of high-level radioactive waste by taking into consideration the following factors: the fission process that yields the waste, identification of the waste, the energy required to run a 1-GWe plant for one year, and the uranium mass required to produce that energy. Briefly discusses waste disposal and…

  5. High-Level Radioactive Waste.

    ERIC Educational Resources Information Center

    Hayden, Howard C.

    1995-01-01

    Presents a method to calculate the amount of high-level radioactive waste by taking into consideration the following factors: the fission process that yields the waste, identification of the waste, the energy required to run a 1-GWe plant for one year, and the uranium mass required to produce that energy. Briefly discusses waste disposal and…

  6. The ALICE high level trigger

    NASA Astrophysics Data System (ADS)

    Alt, T.; Grastveit, G.; Helstrup, H.; Lindenstruth, V.; Loizides, C.; Röhrich, D.; Skaali, B.; Steinbeck, T.; Stock, R.; Tilsner, H.; Ullaland, K.; Vestbø, A.; Vik, T.; Wiebalck, A.; the ALICE Collaboration

    2004-08-01

    The ALICE experiment at LHC will implement a high-level trigger system for online event selection and/or data compression. The largest computing challenge is posed by the TPC detector, which requires real-time pattern recognition. The system entails a very large processing farm that is designed for an anticipated input data stream of 25 GB/s. In this paper, we present the architecture of the system and the current state of the tracking methods and data compression applications.

  7. High-Level Connectionist Models

    DTIC Science & Technology

    1993-10-01

    Peter J. Angeline, Gregory M. Saunders and Jordan B. Pollack, Laboratory for Artificial Intelligence Research, Department of Computer and Information Science, The Ohio State University, Columbus, Ohio 43210 (pja@ci.ohio-state.edu; report AD-A273 638).

  8. RPython high-level synthesis

    NASA Astrophysics Data System (ADS)

    Cieszewski, Radoslaw; Linczuk, Maciej

    2016-09-01

    The development of FPGA technology and the increasing complexity of applications in recent decades have forced compilers to move to higher abstraction levels. A compiler interprets an algorithmic description of a desired behavior written in a High-Level Language (HLL) and translates it into a Hardware Description Language (HDL). This paper presents an RPython-based High-Level Synthesis (HLS) compiler. The compiler takes the configuration parameters and maps the RPython program to VHDL. The VHDL code can then be used to program FPGA chips. Compared with other technologies, FPGAs have the potential to achieve far greater performance than software as a result of omitting the fetch-decode-execute operations of general-purpose processors and introducing more parallel computation. This can be exploited by utilizing many resources at the same time. Creating parallel algorithms computed with FPGAs in pure HDL is difficult and time consuming. Implementation time can be greatly reduced with a High-Level Synthesis compiler. This article describes the design methodologies and tools, the implementation and the first results of the VHDL backend created for the RPython compiler.

  9. Templated biomimetic multifunctional coatings

    NASA Astrophysics Data System (ADS)

    Sun, Chih-Hung; Gonzalez, Adriel; Linn, Nicholas C.; Jiang, Peng; Jiang, Bin

    2008-02-01

    We report a bioinspired templating technique for fabricating multifunctional optical coatings that mimic both unique functionalities of antireflective moth eyes and superhydrophobic cicada wings. Subwavelength-structured fluoropolymer nipple arrays are created by a soft-lithography-like process. The utilization of fluoropolymers simultaneously enhances the antireflective performance and the hydrophobicity of the replicated films. The specular reflectivity matches the optical simulation using a thin-film multilayer model. The dependence of the resulting antireflective properties on the size and the crystalline ordering of the replicated nipples has also been investigated by experiment and modeling. These biomimetic materials may find important technological application in self-cleaning antireflection coatings.

  10. The CMS High Level Trigger

    NASA Astrophysics Data System (ADS)

    Trocino, Daniele

    2014-06-01

    The CMS experiment has been designed with a two-level trigger system: the Level-1 Trigger, implemented in custom-designed electronics, and the High-Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. A software trigger system requires a tradeoff between the complexity of the algorithms that can run within the available computing power, the sustainable output rate, and the selection efficiency. We present the performance of the main triggers used during the 2012 data taking, ranging from simple single-object selections to more complex algorithms combining different objects, and applying analysis-level reconstruction and selection. We discuss the optimisation of the trigger and the specific techniques to cope with the increasing LHC pile-up, reducing its impact on the physics performance.

  11. Learning templates for artistic portrait lighting analysis.

    PubMed

    Chen, Xiaowu; Jin, Xin; Wu, Hongyu; Zhao, Qinping

    2015-02-01

    Lighting is a key factor in creating impressive artistic portraits. In this paper, we propose to analyze portrait lighting by learning templates of lighting styles. Inspired by the experience of artists, we first define several novel features that describe the local contrasts in various face regions. The most informative features are then selected with a stepwise feature pursuit algorithm to derive the templates of various lighting styles. After that, the matching scores that measure the similarity between a testing portrait and those templates are calculated for lighting style classification. Furthermore, we train a regression model on the subjective scores and the feature responses of a template to predict the lighting-quality score of a portrait. Based on the templates, a novel face illumination descriptor is defined to measure the difference between two portrait lightings. Experimental results show that the learned templates can describe the lighting styles well, whereas the proposed approach can assess the lighting quality of artistic portraits as a human being does.

  12. Template overlap method for massive jets

    NASA Astrophysics Data System (ADS)

    Almeida, Leandro G.; Lee, Seung J.; Perez, Gilad; Sterman, George; Sung, Ilmo

    2010-09-01

    We introduce a new class of infrared safe jet observables, which we refer to as template overlaps, designed to filter targeted highly-boosted particle decays from QCD jets and other background. Template overlaps are functional measures that quantify how well the energy flow of a physical jet matches the flow of a boosted partonic decay. Any region of the partonic phase space for the boosted decays defines a template. We will refer to the maximum functional overlap found this way as the template overlap. To illustrate the method, we test lowest-order templates designed to distinguish highly-boosted top and Higgs decays from backgrounds produced by event generators. For the functional overlap, we find good results with a simple construction based on a Gaussian in energy differences within angular regions surrounding the template partons. Although different event generators give different averages for our template overlaps, we find in each case excellent rejection power, especially when combined with cuts based on jet shapes. The template overlaps are capable of systematic improvement by including higher-order corrections in the template phase space.
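
    Schematically, a lowest-order peak template overlap of the kind described above can be written as (the precise normalization and the choice of the widths \sigma_a vary between implementations; only the general form is shown):

        Ov(j, f) = \max_{\tau \in \{f\}} \exp\!\Bigl[ -\sum_{a=1}^{n} \frac{1}{2\sigma_a^{2}} \Bigl( \sum_{i \in j,\ \Delta R(i,a) < r} E_i - E_a^{(\tau)} \Bigr)^{2} \Bigr],

    where j is the physical jet, \tau runs over the partonic templates of the decay channel f, E_a^{(\tau)} are the template parton energies, and the inner sum collects the jet energy deposited within an angular region of radius r around template parton a.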

  13. High-Level Event Recognition in Unconstrained Videos

    DTIC Science & Technology

    2013-01-01

    the motion of the camera. The second is the acoustic channel, which may contain music, environmental sounds and/or conversations. Both channels...and simple template matching to detect a few pre-defined keywords such as “touchdown”. Minami et al. [91] utilized music and speech detection to assist...special cases where a video’s audio channel is dubbed by an entirely different audio content, e.g., a motorbike stunt video dubbed with a music track

  14. Template changes with perceptual learning are driven by feature informativeness

    PubMed Central

    Kurki, Ilmari; Eckstein, Miguel P.

    2014-01-01

    Perceptual learning changes the way the human visual system processes stimulus information. Previous studies have shown that the human brain's weightings of visual information (the perceptual template) become better matched to the optimal weightings. However, the dynamics of the template changes are not well understood. We used the classification image method to investigate whether visual field or stimulus properties govern the dynamics of the changes in the perceptual template. A line orientation discrimination task where highly informative parts were placed in the peripheral visual field was used to test three hypotheses: (1) The template changes are determined by the visual field structure, initially covering stimulus parts closer to the fovea and expanding toward the periphery with learning; (2) the template changes are object centered, starting from the center and expanding toward edges; and (3) the template changes are determined by stimulus information, starting from the most informative parts and expanding to less informative parts. Results show that, initially, the perceptual template contained only the more peripheral, highly informative parts. Learning expanded the template to include less informative parts, resulting in an increase in sampling efficiency. A second experiment interleaved parts with high and low signal-to-noise ratios and showed that template reweighting through learning was restricted to stimulus elements that are spatially contiguous to parts with initial high template weights. The results suggest that the informativeness of features determines how the perceptual template changes with learning. Further, the template expansion is constrained by spatial proximity. PMID:25194018

  15. Multibiometric Systems: Fusion Strategies and Template Security

    DTIC Science & Technology

    2008-01-01

    Weighted sum rule over red, green and blue channels for face [109]; sum and min rules [166]; feature selection and concatenation...On the FVC2002-DB2 database, the mosaiced template leads to a GAR of 94%...The core point was detected using the commercial Neurotechnologija Verifinger software

  16. Programmable imprint lithography template

    DOEpatents

    Cardinale, Gregory F.; Talin, Albert A.

    2006-10-31

    A template for imprint lithography (IL) that reduces template production costs significantly by allowing the same template to be re-used for several technology generations. The template is composed of an array of spaced-apart, moveable and individually addressable rods or plungers. Thus, the template can be configured to provide a desired pattern by programming the array of plungers such that certain of the plungers are in an "up" or actuated configuration. This arrangement of "up" and "down" plungers forms a pattern composed of protruding and recessed features, which can then be impressed onto a polymer-film-coated substrate by applying pressure to the template, transferring the programmed configuration into the polymer film. The pattern impressed into the polymer film will be reproduced on the substrate by subsequent processing.

  17. Sonographic stroke templates.

    PubMed

    Govaert, Paul

    2009-10-01

    This chapter provides arterial and venous stroke templates, designed with neonatal brain ultrasound as the viewpoint and adult stroke templates as the basis. Images change with maturation of the stages of infarction: swelling, necrosis, organisation and tissue loss. Adult templates permit recognition of well-delineated stroke types observed in the newborn brain. All circle of Willis arteries can be involved, as can their perforator branches. Middle cerebral artery (MCA) truncal stroke (anterior or posterior) is an important entity, with different prognosis than complete MCA stroke. Knowledge of these templates also aids in the definition of combinations of infarction (e.g. internal carotid artery stroke or pial plus perforator stroke) and of interarterial watershed injury. Venous templates, even if still under development around the time of birth, permit us to understand brain injury associated with sinus or deep vein thrombosis, especially several types of intracranial haemorrhage. Hindbrain stroke templates are scarcely applied to perinatal lesions.

  18. FUZZY SUPERNOVA TEMPLATES. I. CLASSIFICATION

    SciTech Connect

    Rodney, Steven A.; Tonry, John L. E-mail: jt@ifa.hawaii.ed

    2009-12-20

    Modern supernova (SN) surveys are now uncovering stellar explosions at rates that far surpass what the world's spectroscopic resources can handle. In order to make full use of these SN data sets, it is necessary to use analysis methods that depend only on the survey photometry. This paper presents two methods for utilizing a set of SN light-curve templates to classify SN objects. In the first case, we present an updated version of the Bayesian Adaptive Template Matching program (BATM). To address some shortcomings of that strictly Bayesian approach, we introduce a method for Supernova Ontology with Fuzzy Templates (SOFT), which utilizes fuzzy set theory for the definition and combination of SN light-curve models. For well-sampled light curves with a modest signal-to-noise ratio (S/N >10), the SOFT method can correctly separate thermonuclear (Type Ia) SNe from core collapse SNe with >=98% accuracy. In addition, the SOFT method has the potential to classify SNe into sub-types, providing photometric identification of very rare or peculiar explosions. The accuracy and precision of the SOFT method are verified using Monte Carlo simulations as well as real SN light curves from the Sloan Digital Sky Survey and the SuperNova Legacy Survey. In a subsequent paper, the SOFT method is extended to address the problem of parameter estimation, providing estimates of redshift, distance, and host galaxy extinction without any spectroscopy.

  19. Converting Basic D3 Charts into Reusable Style Templates.

    PubMed

    Harper, Jonathan; Agrawala, Maneesh

    2017-02-07

    We present a technique for converting a basic D3 chart into a reusable style template. Then, given a new data source we can apply the style template to generate a chart that depicts the new data, but in the style of the template. To construct the style template we first deconstruct the input D3 chart to recover its underlying structure: the data, the marks and the mappings that describe how the marks encode the data. We then rank the perceptual effectiveness of the deconstructed mappings. To apply the resulting style template to a new data source we first obtain importance ranks for each new data field. We then adjust the template mappings to depict the source data by matching the most important data fields to the most perceptually effective mappings. We show how the style templates can be applied to source data in the form of either a data table or another D3 chart. While our implementation focuses on generating templates for basic chart types (e.g. variants of bar charts, line charts, dot plots, scatterplots, etc.), these are the most commonly used chart types today. Users can easily find such basic D3 charts on the Web, turn them into templates, and immediately see how their own data would look in the visual style (e.g. colors, shapes, fonts, etc.) of the templates. We demonstrate the effectiveness of our approach by applying a diverse set of style templates to a variety of source datasets.

  20. A Software Architecture for High Level Applications

    SciTech Connect

    Shen,G.

    2009-05-04

    A modular software platform for high level applications is under development at the National Synchrotron Light Source II project. This platform is based on a client-server architecture, and the components of high level applications on this platform will be modular and distributed, and therefore reusable. An online model server is indispensable for model based control. Different accelerator facilities have different requirements for online simulation. To support various accelerator simulators, a set of narrow and general application programming interfaces has been developed based on Tracy-3 and Elegant. This paper describes the system architecture for the modular high level applications, the design of the narrow and general application programming interfaces for an online model server, and a prototype of the online model server.

  1. CLIPS template system for program understanding

    NASA Technical Reports Server (NTRS)

    Finkbine, Ronald B.

    1994-01-01

    Program understanding is a subfield of software reengineering and attempts to recognize the run-time behavior of source code. To this point, success in this area has been limited to very small code segments. An expert system, HLAR (High-Level Algorithm Recognizer), has been written in CLIPS and recognizes three sorting algorithms, selection sort, quicksort, and heapsort. This paper describes the HLAR system in general and, in depth, the CLIPS templates used for program representation and understanding.

  2. High-Level Application Framework for LCLS

    SciTech Connect

    Chu, P; Chevtsov, S.; Fairley, D.; Larrieu, C.; Rock, J.; Rogind, D.; White, G.; Zalazny, M.; /SLAC

    2008-04-22

    A framework for high level accelerator application software is being developed for the Linac Coherent Light Source (LCLS). The framework is based on plug-in technology developed by an open source project, Eclipse. Many existing functionalities provided by Eclipse are available to high-level applications written within this framework. The framework also contains static data storage configuration and dynamic data connectivity. Because the framework is Eclipse-based, it is highly compatible with any other Eclipse plug-ins. The entire infrastructure of the software framework will be presented. Planned applications and plug-ins based on the framework are also presented.

  3. Templates, Numbers & Watercolors.

    ERIC Educational Resources Information Center

    Clemesha, David J.

    1990-01-01

    Describes how a second-grade class used large templates to draw and paint five-digit numbers. The lesson integrated artistic knowledge and vocabulary with their mathematics lesson in place value. Students learned how draftspeople use templates, and they studied number paintings by Charles Demuth and Jasper Johns. (KM)

  4. Templates, Numbers & Watercolors.

    ERIC Educational Resources Information Center

    Clemesha, David J.

    1990-01-01

    Describes how a second-grade class used large templates to draw and paint five-digit numbers. The lesson integrated artistic knowledge and vocabulary with their mathematics lesson in place value. Students learned how draftspeople use templates, and they studied number paintings by Charles Demuth and Jasper Johns. (KM)

  5. High-level radioactive wastes. Supplement 1

    SciTech Connect

    McLaren, L.H.

    1984-09-01

    This bibliography contains information on high-level radioactive wastes included in the Department of Energy's Energy Data Base from August 1982 through December 1983. These citations are to research reports, journal articles, books, patents, theses, and conference papers from worldwide sources. Five indexes, each preceded by a brief description, are provided: Corporate Author, Personal Author, Subject, Contract Number, and Report Number. 1452 citations.

  6. PAIRWISE BLENDING OF HIGH LEVEL WASTE (HLW)

    SciTech Connect

    CERTA, P.J.

    2006-02-22

    The primary objective of this study is to demonstrate a mission scenario that uses pairwise and incidental blending of high level waste (HLW) to reduce the total mass of HLW glass. Secondary objectives include understanding how recent refinements to the tank waste inventory and solubility assumptions affect the mass of HLW glass and how logistical constraints may affect the efficacy of HLW blending.

  7. Do we understand high-level vision?

    PubMed

    Cox, David Daniel

    2014-04-01

    'High-level' vision lacks a single, agreed-upon definition, but it might usefully be defined as those stages of visual processing that transition from analyzing local image structure to analyzing the structure of the external world that produced those images. Much work in the last several decades has focused on object recognition as a framing problem for the study of high-level visual cortex, and much progress has been made in this direction. This approach presumes that the operational goal of the visual system is to read out the identity of an object (or objects) in a scene, in spite of variation in position, size, lighting and the presence of other nearby objects. However, while object recognition is an intuitively appealing operational framing of high-level vision, it is by no means the only task that visual cortex might do, and the study of object recognition is beset by challenges in building stimulus sets that adequately sample the infinite space of possible stimuli. Here I review the successes and limitations of this work, and ask whether we should reframe our approaches to understanding high-level vision. Copyright © 2014. Published by Elsevier Ltd.

  8. High-Level Binocular Rivalry Effects

    PubMed Central

    Wolf, Michal; Hochstein, Shaul

    2011-01-01

    Binocular rivalry (BR) occurs when the brain cannot fuse percepts from the two eyes because they are different. We review results relating to an ongoing controversy regarding the cortical site of the BR mechanism. Some BR qualities suggest it is low-level: (1) BR, as its name implies, is usually between eyes and only low-levels have access to utrocular information. (2) All input to one eye is suppressed: blurring doesn’t stimulate accommodation; pupilary constrictions are reduced; probe detection is reduced. (3) Rivalry is affected by low-level attributes, contrast, spatial frequency, brightness, motion. (4) There is limited priming due to suppressed words or pictures. On the other hand, recent studies favor a high-level mechanism: (1) Rivalry occurs between patterns, not eyes, as in patchwork rivalry or a swapping paradigm. (2) Attention affects alternations. (3) Context affects dominance. There is conflicting evidence from physiological studies (single cell and fMRI) regarding cortical level(s) of conscious perception. We discuss the possibility of multiple BR sites and theoretical considerations that rule out this solution. We present new data regarding the locus of the BR switch by manipulating stimulus semantic content or high-level characteristics. Since these variations are represented at higher cortical levels, their affecting rivalry supports high-level BR intervention. In Experiment I, we measure rivalry when one eye views words and the other non-words and find significantly longer dominance durations for non-words. In Experiment II, we find longer dominance times for line drawings of simple, structurally impossible figures than for similar, possible objects. In Experiment III, we test the influence of idiomatic context on rivalry between words. Results show that generally words within their idiomatic context have longer mean dominance durations. We conclude that BR has high-level cortical influences, and may be controlled by a high-level mechanism

  9. High-level binocular rivalry effects.

    PubMed

    Wolf, Michal; Hochstein, Shaul

    2011-01-01

    Binocular rivalry (BR) occurs when the brain cannot fuse percepts from the two eyes because they are different. We review results relating to an ongoing controversy regarding the cortical site of the BR mechanism. Some BR qualities suggest it is low-level: (1) BR, as its name implies, is usually between eyes and only low-levels have access to utrocular information. (2) All input to one eye is suppressed: blurring doesn't stimulate accommodation; pupilary constrictions are reduced; probe detection is reduced. (3) Rivalry is affected by low-level attributes, contrast, spatial frequency, brightness, motion. (4) There is limited priming due to suppressed words or pictures. On the other hand, recent studies favor a high-level mechanism: (1) Rivalry occurs between patterns, not eyes, as in patchwork rivalry or a swapping paradigm. (2) Attention affects alternations. (3) Context affects dominance. There is conflicting evidence from physiological studies (single cell and fMRI) regarding cortical level(s) of conscious perception. We discuss the possibility of multiple BR sites and theoretical considerations that rule out this solution. We present new data regarding the locus of the BR switch by manipulating stimulus semantic content or high-level characteristics. Since these variations are represented at higher cortical levels, their affecting rivalry supports high-level BR intervention. In Experiment I, we measure rivalry when one eye views words and the other non-words and find significantly longer dominance durations for non-words. In Experiment II, we find longer dominance times for line drawings of simple, structurally impossible figures than for similar, possible objects. In Experiment III, we test the influence of idiomatic context on rivalry between words. Results show that generally words within their idiomatic context have longer mean dominance durations. We conclude that BR has high-level cortical influences, and may be controlled by a high-level mechanism.

  10. High-Level Waste Melter Study Report

    SciTech Connect

    Perez, Joseph M.; Bickford, Dennis F.; Day, Delbert E.; Kim, Dong-Sang; Lambert, Steven L.; Marra, Sharon L.; Peeler, David K.; Strachan, Denis M.; Triplett, Mark B.; Vienna, John D.; Wittman, Richard S.

    2001-07-13

    At the Hanford Site in Richland, Washington, the path to site cleanup involves vitrification of the majority of the wastes that currently reside in large underground tanks. A Joule-heated glass melter is the equipment of choice for vitrifying the high-level fraction of these wastes. Even though this technology has general national and international acceptance, opportunities may exist to improve or change the technology to reduce the enormous cost of accomplishing the mission of site cleanup. Consequently, the U.S. Department of Energy requested the staff of the Tanks Focus Area to review immobilization technologies, waste forms, and modifications to requirements for solidification of the high-level waste fraction at Hanford to determine what aspects could affect cost reductions with reasonable long-term risk. The results of this study are summarized in this report.

  11. High-Level Waste Melter Study Report

    SciTech Connect

    Perez Jr, Joseph M; Bickford, Dennis F; Day, Delbert E; Kim, Dong-Sang; Lambert, Steven L; Marra, Sharon L; Peeler, David K; Strachan, Denis M; Triplett, Mark B; Vienna, John D; Wittman, Richard S

    2001-07-13

    At the Hanford Site in Richland, Washington, the path to site cleanup involves vitrification of the majority of the wastes that currently reside in large underground tanks. A Joule-heated glass melter is the equipment of choice for vitrifying the high-level fraction of these wastes. Even though this technology has general national and international acceptance, opportunities may exist to improve or change the technology to reduce the enormous cost of accomplishing the mission of site cleanup. Consequently, the U.S. Department of Energy requested the staff of the Tanks Focus Area to review immobilization technologies, waste forms, and modifications to requirements for solidification of the high-level waste fraction at Hanford to determine what aspects could affect cost reductions with reasonable long-term risk. The results of this study are summarized in this report.

  12. Virus templated metallic nanoparticles.

    PubMed

    Aljabali, Alaa A A; Barclay, J Elaine; Lomonossoff, George P; Evans, David J

    2010-12-01

    Plant viruses are considered as nanobuilding blocks that can be used as synthons or templates for novel materials. Cowpea mosaic virus (CPMV) particles have been shown to template the fabrication of metallic nanoparticles by an electroless deposition metallization process. Palladium ions were electrostatically bound to the virus capsid and, when reduced, acted as nucleation sites for the subsequent metal deposition from solution. The method, although simple, produced highly monodisperse metallic nanoparticles with a diameter of ca. ≤35 nm. CPMV-templated particles were prepared with cobalt, nickel, iron, platinum, cobalt-platinum and nickel-iron.

  13. E3 Charter Template

    EPA Pesticide Factsheets

    This is a charter template which includes decisions made during the project planning phase, as well as local project goals, a communication strategy, an outreach strategy, distribution of responsibilities and a schedule.

  14. Enhanced ICBM Diffusion Tensor Template of the Human Brain

    PubMed Central

    Zhang, Shengwei; Peng, Huiling; Dawe, Robert J.; Arfanakis, Konstantinos

    2010-01-01

    Development of a diffusion tensor (DT) template that is representative of the micro-architecture of the human brain is crucial for comparisons of neuronal structural integrity and brain connectivity across populations, as well as for the generation of a detailed white matter atlas. Furthermore, a DT template in ICBM space may simplify consolidation of information from DT, anatomical and functional MRI studies. The previously developed “IIT DT brain template” was produced in ICBM-152 space, based on a large number of subjects from a limited age-range, using data with minimal image artifacts, and non-linear registration. That template was characterized by higher image sharpness, provided the ability to distinguish smaller white matter fiber structures, and contained fewer image artifacts, than several previously published DT templates. However, low-dimensional registration was used in the development of that template, which led to a mismatch of DT information across subjects, eventually manifested as loss of local diffusion information and errors in the final tensors. Also, low-dimensional registration led to a mismatch of the anatomy in the IIT and ICBM-152 templates. In this work, a significantly improved DT brain template in ICBM-152 space was developed, using high-dimensional non-linear registration and the raw data collected for the purposes of the IIT template. The accuracy of inter-subject DT matching was significantly increased compared to that achieved for the development of the IIT template. Consequently, the new template contained DT information that was more representative of single-subject human brain data, and was characterized by higher image sharpness than the IIT template. Furthermore, a bootstrap approach demonstrated that the variance of tensor characteristics was lower in the new template. Additionally, compared to the IIT template, brain anatomy in the new template more accurately matched ICBM-152 space. Finally, spatial normalization of a

  15. Virus templated metallic nanoparticles

    NASA Astrophysics Data System (ADS)

    Aljabali, Alaa A. A.; Barclay, J. Elaine; Lomonossoff, George P.; Evans, David J.

    2010-12-01

    Plant viruses are considered as nanobuilding blocks that can be used as synthons or templates for novel materials. Cowpea mosaic virus (CPMV) particles have been shown to template the fabrication of metallic nanoparticles by an electroless deposition metallization process. Palladium ions were electrostatically bound to the virus capsid and, when reduced, acted as nucleation sites for the subsequent metal deposition from solution. The method, although simple, produced highly monodisperse metallic nanoparticles with a diameter of ca. ≤35 nm. CPMV-templated particles were prepared with cobalt, nickel, iron, platinum, cobalt-platinum and nickel-iron. Electronic supplementary information (ESI) available: Additional experimental detail, agarose gel electrophoresis results, energy dispersive X-ray spectra, ζ-potential measurements, dynamic light scattering data, nanoparticle tracking analysis and an atomic force microscopy image of Ni-CPMV. See DOI: 10.1039/c0nr00525h

  16. EAP high-level product architecture

    NASA Astrophysics Data System (ADS)

    Gudlaugsson, T. V.; Mortensen, N. H.; Sarban, R.

    2013-04-01

    EAP technology has the potential to be used in a wide range of applications. This poses the challenge to the EAP component manufacturers to develop components for a wide variety of products. Danfoss Polypower A/S is developing an EAP technology platform, which can form the basis for a variety of EAP technology products while keeping complexity under control. High level product architecture has been developed for the mechanical part of EAP transducers, as the foundation for platform development. A generic description of an EAP transducer forms the core of the high level product architecture. This description breaks down the EAP transducer into organs that perform the functions that may be present in an EAP transducer. A physical instance of an EAP transducer contains a combination of the organs needed to fulfill the task of actuator, sensor, and generation. Alternative principles for each organ allow the function of the EAP transducers to be changed, by basing the EAP transducers on a different combination of organ alternatives. A model providing an overview of the high level product architecture has been developed to support daily development and cooperation across development teams. The platform approach has resulted in the first version of an EAP technology platform, on which multiple EAP products can be based. The contents of the platform have been the result of multi-disciplinary development work at Danfoss PolyPower, as well as collaboration with potential customers and research institutions. Initial results from applying the platform on demonstrator design for potential applications are promising. The scope of the article does not include technical details.

  17. High-level waste qualification: Managing uncertainty

    SciTech Connect

    Pulsipher, B.A.

    1993-09-01

    A vitrification facility is being developed by the U.S. Department of Energy (DOE) at the West Valley Demonstration Plant (WVDP) near Buffalo, New York, where approximately 300 canisters of high-level nuclear waste glass will be produced. To assure that the produced waste form is acceptable, uncertainty must be managed. Statistical issues arise due to sampling, waste variations, processing uncertainties, and analytical variations. This paper presents elements of a strategy to characterize and manage the uncertainties associated with demonstrating that an acceptable waste form product is achieved. Specific examples are provided within the context of statistical work performed by Pacific Northwest Laboratory (PNL).

  18. Python based high-level synthesis compiler

    NASA Astrophysics Data System (ADS)

    Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard

    2014-11-01

    This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. An FPGA combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs also have the potential to achieve far greater performance than software by bypassing the fetch-decode-execute overhead of traditional processors and by exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is, however, not trivial. This article describes the design, implementation, and first results of the Python-based compiler.
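
    As a rough illustration of the kind of source-to-HDL translation such a compiler performs, the sketch below uses Python's standard ast module to derive a VHDL entity skeleton from a trivial Python function. The function, port widths, and emitted VHDL are hypothetical and only hint at the front end of a real HLS flow; the compiler described above performs full behavioral synthesis.

```python
# Minimal sketch: derive a VHDL entity declaration from a Python function
# signature. Purely illustrative; port widths are fixed at 8 bits by assumption.
import ast

SOURCE = """
def adder(a, b):
    return a + b
"""

def function_to_vhdl_entity(source: str) -> str:
    tree = ast.parse(source)
    func = next(node for node in ast.walk(tree) if isinstance(node, ast.FunctionDef))
    ports = [f"    {arg.arg} : in  std_logic_vector(7 downto 0);" for arg in func.args.args]
    ports.append("    result : out std_logic_vector(7 downto 0)")
    return "\n".join([
        "library ieee;",
        "use ieee.std_logic_1164.all;",
        f"entity {func.name} is",
        "  port (",
        *ports,
        "  );",
        f"end entity {func.name};",
    ])

if __name__ == "__main__":
    print(function_to_vhdl_entity(SOURCE))
```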

  19. The effects of high level infrasound

    SciTech Connect

    Johnson, D.L.

    1980-02-01

    This paper surveys the current knowledge on the effects of relatively high levels of infrasound on humans. While this conference is concerned mainly with hearing, some discussion of other physiological effects is appropriate. Such discussion also serves to highlight a basic question: 'Is hearing the main concern of infrasound and low-frequency exposure, or is there a more sensitive mechanism?' It would be comforting to know that the focal point of this conference is indeed the most important concern. Therefore, besides hearing loss and auditory thresholds for infrasonic and low-frequency exposure, four other effects are covered: performance, respiration, annoyance, and vibration.

  20. Service Oriented Architecture for High Level Applications

    SciTech Connect

    Chu, Chungming; Chevtsov, Sergei; Wu, Juhao; Shen, Guobao; /Brookhaven

    2012-06-28

    Standalone high level applications often suffer from poor performance and reliability due to lengthy initialization, heavy computation and rapid graphical update. Service-oriented architecture (SOA) is trying to separate the initialization and computation from applications and to distribute such work to various service providers. Heavy computation such as beam tracking will be done periodically on a dedicated server and data will be available to client applications at all time. Industrial standard service architecture can help to improve the performance, reliability and maintainability of the service. Robustness will also be improved by reducing the complexity of individual client applications.

  1. High Level Waste Disposal System Optimization

    SciTech Connect

    Dirk Gombert; M. Connolly; J. Roach; W. Holtzscheiter

    2005-02-01

    The high level waste (HLW) disposal system consists of the Yucca Mountain Facility (YMF) and waste product (e.g. glass) generation facilities. Responsibility for management is shared between the U. S. Department of Energy (DOE) Offices of Civilian Radioactive Waste Management (DOE-RW) and Environmental Management (DOE-EM). The DOE-RW license application and the Waste Acceptance System Requirements Document (WASRD), as well as the DOE-EM Waste Acceptance Product Specification for Vitrified High Level Waste Forms (WAPS) govern the overall performance of the system. This basis for HLW disposal should be reassessed to consider waste form and process technology research and development (R&D), which have been conducted by DOE-EM, international agencies (i.e. ANSTO, CEA), and the private sector; as well as the technical bases for including additional waste forms in the final license application. This will yield a more optimized HLW disposal system to accelerate HLW disposition, more efficient utilization of the YMF, and overall system cost reduction.

  2. Commissioning of the CMS High Level Trigger

    SciTech Connect

    Agostino, Lorenzo; et al.

    2009-08-01

    The CMS experiment will collect data from the proton-proton collisions delivered by the Large Hadron Collider (LHC) at a centre-of-mass energy up to 14 TeV. The CMS trigger system is designed to cope with unprecedented luminosities and LHC bunch-crossing rates up to 40 MHz. The unique CMS trigger architecture only employs two trigger levels. The Level-1 trigger is implemented using custom electronics, while the High Level Trigger (HLT) is based on software algorithms running on a large cluster of commercial processors, the Event Filter Farm. We present the major functionalities of the CMS High Level Trigger system as of the start of LHC beam operations in September 2008. The validation of the HLT system in the online environment with Monte Carlo simulated data and its commissioning during cosmic-ray data-taking campaigns are discussed in detail. We conclude with a description of HLT operations with the first circulating LHC beams, up to the incident that occurred on 19 September 2008.

  3. Templated blue phases.

    PubMed

    Ravnik, Miha; Fukuda, Jun-ichi

    2015-11-21

    Cholesteric blue phases of a chiral liquid crystal are interesting examples of self-organised three-dimensional nanostructures formed by soft matter. Recently it was demonstrated that a polymer matrix introduced by photopolymerization inside a bulk blue phase not only stabilises the host blue phase significantly, but also serves as a template for blue phase ordering. We show with numerical modelling that the transfer of the orientational order of the blue phase to the surfaces of the polymer matrix, together with the resulting surface anchoring, can account for the templating behaviour of the polymer matrix inducing the blue phase ordering of an achiral nematic liquid crystal. Furthermore, tailoring the anchoring conditions of the polymer matrix surfaces can bring about orientational ordering different from those of bulk blue phases, including an intertwined complex of the polymer matrix and topological line defects of orientational order. Optical Kerr response of templated blue phases is explored, finding large Kerr constants in the range of K = 2-10 × 10(-9) m V(-2) and notable dependence on the surface anchoring strength. More generally, the presented numerical approach is aimed to clarify the role and actions of templating polymer matrices in complex chiral nematic fluids, and further to help design novel template-based materials from chiral liquid crystals.

  4. Umbra's High Level Architecture (HLA) Interface

    SciTech Connect

    GOTTLIEB, ERIC JOSEPH; MCDONALD, MICHAEL J.; OPPEL III, FRED J.

    2002-04-01

    This report describes Umbra's High Level Architecture HLA library. This library serves as an interface to the Defense Simulation and Modeling Office's (DMSO) Run Time Infrastructure Next Generation Version 1.3 (RTI NG1.3) software library and enables Umbra-based models to be federated into HLA environments. The Umbra library was built to enable the modeling of robots for military and security system concept evaluation. A first application provides component technologies that ideally fit the US Army JPSD's Joint Virtual Battlespace (JVB) simulation framework for Objective Force concept analysis. In addition to describing the Umbra HLA library, the report describes general issues of integrating Umbra with RTI code and outlines ways of building models to support particular HLA simulation frameworks like the JVB.

  5. The High Level Data Reduction Library

    NASA Astrophysics Data System (ADS)

    Ballester, P.; Gabasch, A.; Jung, Y.; Modigliani, A.; Taylor, J.; Coccato, L.; Freudling, W.; Neeser, M.; Marchetti, E.

    2015-09-01

    The European Southern Observatory (ESO) provides pipelines to reduce data for most of the instruments at its Very Large Telescope (VLT). These pipelines are written as part of the development of VLT instruments, and are used both in ESO's operational environment and by science users who receive VLT data. All the pipelines are highly instrument-specific. However, experience showed that the independently developed pipelines include significant overlap, duplication and slight variations of similar algorithms. In order to reduce the cost of development, verification and maintenance of ESO pipelines, and at the same time improve the scientific quality of pipeline data products, ESO decided to develop a limited set of versatile high-level scientific functions that are to be used in all future pipelines. The routines are provided by the High-level Data Reduction Library (HDRL). To reach this goal, we first compare several candidate algorithms and verify them during a prototype phase using data sets from several instruments. Once the best algorithm and error model have been chosen, we start a design and implementation phase. The coding of HDRL is done in plain C and using the Common Pipeline Library (CPL) functionality. HDRL adopts consistent function naming conventions and a well-defined API to minimise future maintenance costs, implements error propagation, uses pixel quality information, employs OpenMP to take advantage of multi-core processors, and is verified with extensive unit and regression tests. This poster describes the status of the project and the lessons learned during the development of reusable code implementing algorithms of high scientific quality.

  6. The sidebar template and extraction of invariant feature of calligraphy and painting seal

    NASA Astrophysics Data System (ADS)

    Hu, Zheng-kun; Bao, Hong; Lou, Hai-tao

    2009-07-01

    The paper proposes a novel seal extraction method based on template matching, using the characteristics of the external contour of the seal image in Chinese Painting and Calligraphy. By analyzing the characteristics of the seal edge, we obtain prior knowledge of the seal edge and set up an outline template of the seals, then design a template matching method that computes the distance difference between the outline template and the seal image edge, which can effectively extract the seal image from Chinese Painting and Calligraphy. Experimental results show that this method achieves a higher extraction rate than traditional image extraction methods.
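
    The exact distance measure used in the paper is not given in this record; the sketch below illustrates one common way to score an outline template against an image edge map, a chamfer-style mean edge distance with a brute-force search over offsets. The function names and the synthetic demo are hypothetical.

```python
# Sketch of outline-template matching by a chamfer-style edge-distance score
# (an assumption here; the paper's exact distance measure may differ).
import numpy as np
from scipy.ndimage import distance_transform_edt

def best_match(edge_map: np.ndarray, template_pts: np.ndarray, stride: int = 2):
    """Brute-force search for the template offset with the smallest mean
    distance between shifted template points and the nearest image edge."""
    dist = distance_transform_edt(edge_map == 0)       # distance of each pixel to nearest edge
    h, w = edge_map.shape
    th, tw = template_pts.max(axis=0) + 1
    best_score, best_offset = np.inf, (0, 0)
    for r in range(0, h - th, stride):
        for c in range(0, w - tw, stride):
            pts = template_pts + (r, c)
            score = float(dist[pts[:, 0], pts[:, 1]].mean())   # lower = better outline fit
            if score < best_score:
                best_score, best_offset = score, (r, c)
    return best_offset, best_score

# Tiny demo: a 10 x 10 square outline hidden in a 64 x 64 edge map.
edge_map = np.zeros((64, 64), dtype=np.uint8)
rr, cc = np.meshgrid(np.arange(10), np.arange(10), indexing="ij")
outline = (rr == 0) | (rr == 9) | (cc == 0) | (cc == 9)
template_pts = np.argwhere(outline)
edge_map[20:30, 30:40] = outline
print(best_match(edge_map, template_pts, stride=1))    # -> ((20, 30), 0.0)
```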

  7. Prototypes and personal templates: collective wisdom and individual differences.

    PubMed

    Horowitz, Leonard M; Turan, Bulent

    2008-10-01

    This article concerns individual differences in the associative meaning of psychological concepts. Associative meaning may be assessed with prototype methodology, which yields a list of features of the concept ordered according to their rated importance. Our theory concerns individual differences in a concept's associative meaning: A personal template reveals a person's idiosyncratic associative meaning. It is possible to assess the degree to which a personal template matches the corresponding prototype. The theory distinguishes among three types of concepts. One type, for example, specifies a particular behavior to be predicted, such as whether a person is likely to commit suicide, and features of the prototype would include predictors of suicidal behavior. According to the theory, the most prototypical features are (under specifiable conditions) valid predictors, and people with a strong template-to-prototype match possess more valid knowledge about the concept than do people with a weak template-to-prototype match. Other types of concepts cannot be validated (e.g., those describing subjective experiences). In that case, a strong template-to-prototype match does not reflect a person's degree of valid knowledge. The authors provide three applications of the theory.
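
    The record does not state how the template-to-prototype match is scored; one natural, purely illustrative choice is a rank correlation between a person's feature-importance ordering and the prototype ordering, sketched below with hypothetical ratings.

```python
# Sketch: quantify a template-to-prototype match as the Spearman rank
# correlation between one person's importance ratings of a concept's features
# and the group-level (prototype) ratings. The specific statistic is an
# assumption; the authors' scoring may differ.
from scipy.stats import spearmanr

prototype_ratings = [6.8, 6.1, 5.5, 4.9, 3.2]   # group-averaged feature importance (hypothetical)
personal_ratings  = [6.5, 5.0, 6.0, 4.0, 2.5]   # one participant's ratings of the same features

match, _ = spearmanr(prototype_ratings, personal_ratings)
print(f"template-to-prototype match: {match:.2f}")   # closer to 1.0 = stronger match
```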

  8. Engineering neural systems for high-level problem solving.

    PubMed

    Sylvester, Jared; Reggia, James

    2016-07-01

    There is a long-standing, sometimes contentious debate in AI concerning the relative merits of a symbolic, top-down approach vs. a neural, bottom-up approach to engineering intelligent machine behaviors. While neurocomputational methods excel at lower-level cognitive tasks (incremental learning for pattern classification, low-level sensorimotor control, fault tolerance and processing of noisy data, etc.), they are largely non-competitive with top-down symbolic methods for tasks involving high-level cognitive problem solving (goal-directed reasoning, metacognition, planning, etc.). Here we take a step towards addressing this limitation by developing a purely neural framework named galis. Our goal in this work is to integrate top-down (non-symbolic) control of a neural network system with more traditional bottom-up neural computations. galis is based on attractor networks that can be "programmed" with temporal sequences of hand-crafted instructions that control problem solving by gating the activity retention of, communication between, and learning done by other neural networks. We demonstrate the effectiveness of this approach by showing that it can be applied successfully to solve sequential card matching problems, using both human performance and a top-down symbolic algorithm as experimental controls. Solving this kind of problem makes use of top-down attention control and the binding together of visual features in ways that are easy for symbolic AI systems but not for neural networks to achieve. Our model can not only be instructed on how to solve card matching problems successfully, but its performance also qualitatively (and sometimes quantitatively) matches the performance of both human subjects that we had perform the same task and the top-down symbolic algorithm that we used as an experimental control. We conclude that the core principles underlying the galis framework provide a promising approach to engineering purely neurocomputational systems for problem

  9. Proton Affinity Calculations with High Level Methods.

    PubMed

    Kolboe, Stein

    2014-08-12

    Proton affinities, stretching from small reference compounds, up to the methylbenzenes and naphthalene and anthracene, have been calculated with high accuracy computational methods, viz. W1BD, G4, G3B3, CBS-QB3, and M06-2X. Computed and the currently accepted reference proton affinities are generally in excellent accord, but there are deviations. The literature value for propene appears to be 6-7 kJ/mol too high. Reported proton affinities for the methylbenzenes seem 4-5 kJ/mol too high. G4 and G3 computations generally give results in good accord with the high level W1BD. Proton affinity values computed with the CBS-QB3 scheme are too low, and the error increases with increasing molecule size, reaching nearly 10 kJ/mol for the xylenes. The functional M06-2X fails markedly for some of the small reference compounds, in particular, for CO and ketene, but calculates methylbenzene proton affinities with high accuracy.

  10. Decontamination of high-level waste canisters

    SciTech Connect

    Nesbitt, J.F.; Slate, S.C.; Fetrow, L.K.

    1980-12-01

    This report presents evaluations of several methods for the in-process decontamination of metallic canisters containing any one of a number of solidified high-level waste (HLW) forms. The use of steam-water, steam, abrasive blasting, electropolishing, liquid honing, vibratory finishing and soaking have been tested or evaluated as potential techniques to decontaminate the outer surfaces of HLW canisters. Either these techniques have been tested or available literature has been examined to assess their applicability to the decontamination of HLW canisters. Electropolishing has been found to be the most thorough method to remove radionuclides and other foreign material that may be deposited on or in the outer surface of a canister during any of the HLW processes. Steam or steam-water spraying techniques may be adequate for some applications but fail to remove all contaminated forms that could be present in some of the HLW processes. Liquid honing and abrasive blasting remove contamination and foreign material very quickly and effectively from small areas and components although these blasting techniques tend to disperse the material removed from the cleaned surfaces. Vibratory finishing is very capable of removing the bulk of contamination and foreign matter from a variety of materials. However, special vibratory finishing equipment would have to be designed and adapted for a remote process. Soaking techniques take long periods of time and may not remove all of the smearable contamination. If soaking involves pickling baths that use corrosive agents, these agents may cause erosion of grain boundaries that results in rough surfaces.

  11. The ALICE electromagnetic calorimeter high level triggers

    NASA Astrophysics Data System (ADS)

    Ronchetti, F.; Blanco, F.; Figueredo, M.; Knospe, A. G.; Xaplanteris, L.

    2012-12-01

    The ALICE (A Large Ion Collider Experiment) detector yields a huge sample of data from different sub-detectors. On-line data processing is applied to select and reduce the volume of the stored data. ALICE applies a multi-level hardware trigger scheme where fast detectors are used to feed a three-level (L0, L1, and L2) deep chain. The High-Level Trigger (HLT) is a fourth filtering stage sitting logically between the L2 trigger and the data acquisition event building. The EMCal detector comprises a large area electromagnetic calorimeter that extends the momentum measurement of photons and neutral mesons up to pT = 250 GeV/c, which improves the ALICE capability to perform jet reconstruction with measurement of the neutral energy component of jets. An online reconstruction and trigger chain has been developed within the HLT framework to sharpen the EMCal hardware triggers, by combining the central barrel tracking information with the shower reconstruction (clusters) in the calorimeter. In the present report the status and the functionality of the software components developed for the EMCal HLT online reconstruction and trigger chain will be discussed, as well as preliminary results from their commissioning performed during the 2011 LHC running period.

  12. HIGH LEVEL RF FOR THE SNS RING.

    SciTech Connect

    ZALTSMAN,A.; BLASKIEWICZ,M.; BRENNAN,J.; BRODOWSKI,J.; METH,M.; SPITZ,R.; SEVERINO,F.

    2002-06-03

    A high level RF system (HLRF) consisting of power amplifiers (PA's) and ferrite loaded cavities is being designed and built by Brookhaven National Laboratory (BNL) for the Spallation Neutron Source (SNS) project. It is a fixed frequency, two harmonic system whose main function is to maintain a gap for the kicker rise time. Three cavities running at the fundamental harmonic (h=1) will provide 40 kV and one cavity at the second harmonic (h=2) will provide 20 kV. Each cavity has two gaps with a design voltage of 10 kV per gap and will be driven by a power amplifier (PA) directly adjacent to it. The PA uses a 600 kW tetrode to provide the necessary drive current. The anode of the tetrode is magnetically coupled to the downstream cell of the cavity. Drive to the PA will be provided by a wide band, solid state amplifier located remotely. A dynamic tuning scheme will be implemented to help compensate for the effect of beam loading.

  13. Performance of the CMS High Level Trigger

    NASA Astrophysics Data System (ADS)

    Perrotta, Andrea

    2015-12-01

    The CMS experiment has been designed with a 2-level trigger system. The first level is implemented using custom-designed electronics. The second level is the so-called High Level Trigger (HLT), a streamlined version of the CMS offline reconstruction software running on a computer farm. For Run II of the Large Hadron Collider, the increases in center-of-mass energy and luminosity will raise the event rate to a level challenging for the HLT algorithms. The increase in the number of interactions per bunch crossing, on average 25 in 2012, and expected to be around 40 in Run II, will be an additional complication. We present here the expected performance of the main triggers that will be used during the 2015 data taking campaign, paying particular attention to the new approaches that have been developed to cope with the challenges of the new run. This includes improvements in HLT electron and photon reconstruction as well as better performing muon triggers. We will also present the performance of the improved tracking and vertexing algorithms, discussing their impact on the b-tagging performance as well as on the jet and missing energy reconstruction.

  14. DEFENSE HIGH LEVEL WASTE GLASS DEGRADATION

    SciTech Connect

    W. Ebert

    2001-09-20

    The purpose of this Analysis/Model Report (AMR) is to document the analyses that were done to develop models for radionuclide release from high-level waste (HLW) glass dissolution that can be integrated into performance assessment (PA) calculations conducted to support site recommendation and license application for the Yucca Mountain site. This report was developed in accordance with the ''Technical Work Plan for Waste Form Degradation Process Model Report for SR'' (CRWMS M&O 2000a). It specifically addresses the item, ''Defense High Level Waste Glass Degradation'', of the product technical work plan. The AP-3.15Q Attachment 1 screening criteria determines the importance for its intended use of the HLW glass model derived herein to be in the category ''Other Factors for the Postclosure Safety Case-Waste Form Performance'', and thus indicates that this factor does not contribute significantly to the postclosure safety strategy. Because the release of radionuclides from the glass will depend on the prior dissolution of the glass, the dissolution rate of the glass imposes an upper bound on the radionuclide release rate. The approach taken to provide a bound for the radionuclide release is to develop models that can be used to calculate the dissolution rate of waste glass when contacted by water in the disposal site. The release rate of a particular radionuclide can then be calculated by multiplying the glass dissolution rate by the mass fraction of that radionuclide in the glass and by the surface area of glass contacted by water. The scope includes consideration of the three modes by which water may contact waste glass in the disposal system: contact by humid air, dripping water, and immersion. The models for glass dissolution under these contact modes are all based on the rate expression for aqueous dissolution of borosilicate glasses. The mechanism and rate expression for aqueous dissolution are adequately understood; the analyses in this AMR were conducted to
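
    The bounding relation stated above (release rate = glass dissolution rate × radionuclide mass fraction × wetted glass surface area) is simple enough to show as a one-line calculation; the numbers below are purely illustrative and not taken from the report.

```python
# The bounding relation described above, with purely illustrative numbers:
# release rate of a radionuclide = glass dissolution rate
#                                  x mass fraction of the radionuclide in the glass
#                                  x glass surface area contacted by water.
dissolution_rate_g_per_m2_day = 1.0e-3   # hypothetical glass dissolution rate
mass_fraction = 5.0e-4                   # hypothetical radionuclide mass fraction
wetted_area_m2 = 10.0                    # hypothetical contacted surface area

release_rate_g_per_day = dissolution_rate_g_per_m2_day * mass_fraction * wetted_area_m2
print(f"bounding release rate: {release_rate_g_per_day:.2e} g/day")  # 5.00e-06 g/day
```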

  15. The high-level trigger of ALICE

    NASA Astrophysics Data System (ADS)

    Tilsner, H.; Alt, T.; Aurbakken, K.; Grastveit, G.; Helstrup, H.; Lindenstruth, V.; Loizides, C.; Nystrand, J.; Roehrich, D.; Skaali, B.; Steinbeck, T.; Ullaland, K.; Vestbo, A.; Vik, T.

    One of the main tracking detectors of the forthcoming ALICE Experiment at the LHC is a cylindrical Time Projection Chamber (TPC) with an expected data volume of about 75 MByte per event. This data volume, in combination with the presumed maximum bandwidth of 1.2 GByte/s to the mass storage system, would limit the maximum event rate to 20 Hz. In order to achieve higher event rates, online data processing has to be applied. This implies either the detection and read-out of only those events which contain interesting physical signatures or an efficient compression of the data by modeling techniques. In order to cope with the anticipated data rate, massive parallel computing power is required. It will be provided in the form of a clustered farm of SMP-nodes, based on off-the-shelf PCs, which are connected with a high-bandwidth, low-overhead network. This High-Level Trigger (HLT) will be able to process a data rate of 25 GByte/s online. The front-end electronics of the individual sub-detectors are connected to the HLT via an optical link and a custom PCI card which is mounted in the clustered PCs. The PCI card is equipped with an FPGA necessary for the implementation of the PCI-bus protocol. Therefore, this FPGA can also be used to assist the host processor with first-level processing. The first-level processing done on the FPGA includes conventional cluster-finding for low multiplicity events and local track finding based on the Hough Transformation of the raw data for high multiplicity events. PACS: 07.05.-t Computers in experimental physics - 07.05.Hd Data acquisition: hardware and software - 29.85.+c Computer data analysis

  16. Environmental Learning Centers: A Template.

    ERIC Educational Resources Information Center

    Vozick, Eric

    1999-01-01

    Provides a working model, or template, for community-based environmental learning centers (ELCs). The template presents a philosophy as well as a plan for staff and administration operations, educational programming, and financial support. The template also addresses "green" construction and maintenance of buildings and grounds and…

  17. Multinomial pattern matching revisited

    NASA Astrophysics Data System (ADS)

    Horvath, Matthew S.; Rigling, Brian D.

    2015-05-01

    Multinomial pattern matching (MPM) is an automatic target recognition algorithm developed specifically for radar data at Sandia National Laboratories. The algorithm belongs to a family of algorithms that first quantize pixel values into Nq bins based on pixel amplitude before training and classification. This quantization step reduces the sensitivity of algorithm performance to absolute intensity variation in the data, typical of radar data where signatures exhibit high variation for even small changes in aspect angle. Our previous work has focused on performance analysis of peaky template matching, a special case of MPM in which binary quantization is used (Nq = 2). Unfortunately, references on these algorithms are generally difficult to locate, so here we revisit the MPM algorithm and illustrate the underlying statistical model and decision rules for two algorithm interpretations: the 1-of-K vector form and the scalar form. MPM can also be used as a detector, and specific attention is given to algorithm tuning, where "peak pixels" are chosen based on their underlying empirical probabilities according to a reward minimization strategy aimed at reducing false alarms in the detection scenario and false positives in a classification capacity. The algorithms are demonstrated using Monte Carlo simulations on the AFRL civilian vehicle dataset for a variety of choices of Nq.
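
    For the binary-quantization (Nq = 2) special case mentioned above, the core of the statistical model can be sketched as follows: threshold each pixel, estimate per-class "on" probabilities from training chips, and score a test chip by its Bernoulli log-likelihood under each class. The peak-pixel selection and reward-minimization tuning of the actual MPM algorithm are omitted, and the synthetic data are hypothetical.

```python
# Sketch of a binary-quantization (Nq = 2) pattern-matching classifier;
# a simplified stand-in for the full MPM model described above.
import numpy as np

def quantize(chip: np.ndarray, threshold: float) -> np.ndarray:
    return (chip > threshold).astype(np.float64)

def train(chips_by_class: dict, threshold: float, eps: float = 1e-3) -> dict:
    """Estimate, per class, the probability that each pixel exceeds the threshold."""
    return {label: np.clip(np.mean([quantize(c, threshold) for c in chips], axis=0), eps, 1 - eps)
            for label, chips in chips_by_class.items()}

def classify(chip: np.ndarray, models: dict, threshold: float) -> str:
    q = quantize(chip, threshold)
    scores = {label: float(np.sum(q * np.log(p) + (1 - q) * np.log(1 - p)))
              for label, p in models.items()}
    return max(scores, key=scores.get)

# Tiny synthetic demo with two 8x8 "target" classes.
rng = np.random.default_rng(0)
a = [np.pad(np.ones((4, 4)), 2) + 0.2 * rng.standard_normal((8, 8)) for _ in range(20)]
b = [np.eye(8) + 0.2 * rng.standard_normal((8, 8)) for _ in range(20)]
models = train({"block": a, "diagonal": b}, threshold=0.5)
print(classify(np.pad(np.ones((4, 4)), 2), models, threshold=0.5))  # -> "block"
```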

  18. Protein function prediction using local 3D templates.

    PubMed

    Laskowski, Roman A; Watson, James D; Thornton, Janet M

    2005-08-19

    The prediction of a protein's function from its 3D structure is becoming more and more important as the worldwide structural genomics initiatives gather pace and continue to solve 3D structures, many of which are of proteins of unknown function. Here, we present a methodology for predicting function from structure that shows great promise. It is based on 3D templates that are defined as specific 3D conformations of small numbers of residues. We use four types of template, covering enzyme active sites, ligand-binding residues, DNA-binding residues and reverse templates. The latter are templates generated from the target structure itself and scanned against a representative subset of all known protein structures. Together, the templates provide a fairly thorough coverage of the known structures and ensure that if there is a match to a known structure it is unlikely to be missed. A new scoring scheme provides a highly sensitive means of discriminating between true positive and false positive template matches. In all, the methodology provides a powerful new tool for function prediction to complement those already in use.
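
    As a minimal sketch of the geometric step common to such 3D-template methods, the code below superposes a three-residue template onto a candidate site with the Kabsch algorithm and declares a match when the RMSD falls below a cutoff. The coordinates and the 1 Å cutoff are hypothetical, and the scoring scheme described in the record is not reproduced.

```python
# Geometric matching of a small 3D template to a candidate site via
# optimal superposition (Kabsch) and an RMSD cutoff. Illustrative only.
import numpy as np

def kabsch_rmsd(template_xyz: np.ndarray, candidate_xyz: np.ndarray) -> float:
    """Optimal-superposition RMSD between two equally sized coordinate sets (N x 3)."""
    P = template_xyz - template_xyz.mean(axis=0)
    Q = candidate_xyz - candidate_xyz.mean(axis=0)
    U, _, Vt = np.linalg.svd(P.T @ Q)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T            # optimal rotation (Kabsch)
    return float(np.sqrt(np.mean(np.sum((P @ R.T - Q) ** 2, axis=1))))

# Hypothetical template: C-alpha coordinates (angstroms) of three catalytic residues.
template = np.array([[0.0, 0.0, 0.0], [3.8, 0.0, 0.0], [1.9, 3.3, 0.0]])
# Candidate = template rotated 30 degrees about z, translated, plus small noise.
theta = np.radians(30.0)
Rz = np.array([[np.cos(theta), -np.sin(theta), 0], [np.sin(theta), np.cos(theta), 0], [0, 0, 1]])
candidate = template @ Rz.T + np.array([5.0, -2.0, 1.0]) \
    + 0.05 * np.random.default_rng(1).standard_normal((3, 3))

rmsd = kabsch_rmsd(template, candidate)
print(f"RMSD = {rmsd:.2f} A -> {'match' if rmsd < 1.0 else 'no match'}")
```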

  19. Fan fault diagnosis based on symmetrized dot pattern analysis and image matching

    NASA Astrophysics Data System (ADS)

    Xu, Xiaogang; Liu, Haixiao; Zhu, Hao; Wang, Songling

    2016-07-01

    To detect the mechanical failure of fans, a new diagnostic method based on the symmetrized dot pattern (SDP) analysis and image matching is proposed. Vibration signals of 13 kinds of running states are acquired on a centrifugal fan test bed and reconstructed by the SDP technique. The SDP pattern templates of each running state are established. An image matching method is performed to diagnose the fault. In order to improve the diagnostic accuracy, the single template, multiple templates and clustering fault templates are used to perform the image matching.

  20. Fast image matching algorithm based on projection characteristics

    NASA Astrophysics Data System (ADS)

    Zhou, Lijuan; Yue, Xiaobo; Zhou, Lijun

    2011-06-01

    Based on an analysis of the traditional template matching algorithm, this paper identifies the key factors restricting matching speed and puts forward a new fast matching algorithm based on projection. By projecting the grayscale image, the algorithm converts the two-dimensional information of the image into one-dimensional information and then matches and identifies through one-dimensional correlation; moreover, because normalization is applied, correct matching is still achieved when the image brightness or signal amplitude increases proportionally. Experimental results show that the proposed projection-based image registration method greatly improves matching speed while preserving matching accuracy.
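
    A minimal sketch of the idea, assuming row/column intensity projections and 1-D normalized correlation; it is not the paper's exact algorithm, and the synthetic demo is hypothetical.

```python
# Projection-based matching: replace the 2-D search by 1-D normalized
# correlation of row and column intensity projections.
import numpy as np

def projections(img: np.ndarray):
    return img.sum(axis=1).astype(float), img.sum(axis=0).astype(float)  # row, column profiles

def ncc_1d(signal: np.ndarray, template: np.ndarray) -> np.ndarray:
    """Normalized cross-correlation of a 1-D template slid along a 1-D signal."""
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    out = np.empty(len(signal) - n + 1)
    for i in range(len(out)):
        w = signal[i:i + n]
        out[i] = float(np.dot((w - w.mean()) / (w.std() + 1e-12), t)) / n
    return out

def locate(image: np.ndarray, template: np.ndarray):
    img_rows, img_cols = projections(image)
    tpl_rows, tpl_cols = projections(template)
    r = int(np.argmax(ncc_1d(img_rows, tpl_rows)))   # best vertical offset
    c = int(np.argmax(ncc_1d(img_cols, tpl_cols)))   # best horizontal offset
    return r, c

# Tiny demo: a 5 x 5 bump embedded at (12, 7) in a low-noise 32 x 32 scene.
rng = np.random.default_rng(0)
scene = 0.05 * rng.random((32, 32))
bump = np.outer(np.hanning(5), np.hanning(5))
scene[12:17, 7:12] += bump
print(locate(scene, bump))   # expected (12, 7)
```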

  1. Human action recognition using motion energy template

    NASA Astrophysics Data System (ADS)

    Shao, Yanhua; Guo, Yongcai; Gao, Chao

    2015-06-01

    Human action recognition is an active and interesting research topic in the computer vision and pattern recognition field that is widely used in the real world. We propose an approach for human activity analysis based on the motion energy template (MET), a new high-level representation of video. The main idea of the MET model is that human actions can be expressed as the composition of motion energy acquired in a three-dimensional (3-D) space-time volume by using a filter bank. The motion energies are computed directly from raw video sequences, so problems such as object localization and segmentation are avoided. Another important competitive merit of the MET method is its insensitivity to gender, hair, and clothing. We extract MET features by using the Bhattacharyya coefficient to measure the motion energy similarity between the action template video and the tested video, followed by 3-D max-pooling. Using these features as input to a support vector machine, extensive experiments on two benchmark datasets, Weizmann and KTH, were carried out. Compared with other state-of-the-art approaches, such as the variation energy image, dynamic templates, and local motion pattern descriptors, the experimental results demonstrate that our MET model is competitive and promising.
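
    The similarity measure named above, the Bhattacharyya coefficient, is easy to show in isolation; the sketch below compares two flattened, normalized motion-energy volumes (hypothetical data, without the filter bank or 3-D max-pooling stages).

```python
# Bhattacharyya coefficient between two normalized motion-energy distributions.
import numpy as np

def bhattacharyya_coefficient(p: np.ndarray, q: np.ndarray) -> float:
    """BC(p, q) = sum_i sqrt(p_i * q_i); 1.0 for identical distributions, 0.0 for disjoint ones."""
    p = p.ravel() / p.sum()
    q = q.ravel() / q.sum()
    return float(np.sum(np.sqrt(p * q)))

# Hypothetical 3-D motion-energy volumes (time x height x width) for a
# template action and two test clips.
rng = np.random.default_rng(0)
template_energy = rng.random((8, 16, 16))
test_energy = template_energy + 0.1 * rng.random((8, 16, 16))   # similar motion
other_energy = rng.random((8, 16, 16))                          # unrelated motion

print(round(bhattacharyya_coefficient(template_energy, test_energy), 3))   # close to 1.0
print(round(bhattacharyya_coefficient(template_energy, other_energy), 3))  # noticeably lower
```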

  2. Rapid Catalytic Template Searching as an Enzyme Function Prediction Procedure

    PubMed Central

    Nilmeier, Jerome P.; Kirshner, Daniel A.; Wong, Sergio E.; Lightstone, Felice C.

    2013-01-01

    We present an enzyme protein function identification algorithm, Catalytic Site Identification (CatSId), based on identification of catalytic residues. The method is optimized for highly accurate template identification across a diverse template library and is also very efficient in regards to time and scalability of comparisons. The algorithm matches three-dimensional residue arrangements in a query protein to a library of manually annotated, catalytic residues – The Catalytic Site Atlas (CSA). Two main processes are involved. The first process is a rapid protein-to-template matching algorithm that scales quadratically with target protein size and linearly with template size. The second process incorporates a number of physical descriptors, including binding site predictions, in a logistic scoring procedure to re-score matches found in Process 1. This approach shows very good performance overall, with a Receiver-Operator-Characteristic Area Under Curve (AUC) of 0.971 for the training set evaluated. The procedure is able to process cofactors, ions, nonstandard residues, and point substitutions for residues and ions in a robust and integrated fashion. Sites with only two critical (catalytic) residues are challenging cases, resulting in AUCs of 0.9411 and 0.5413 for the training and test sets, respectively. The remaining sites show excellent performance with AUCs greater than 0.90 for both the training and test data on templates of size greater than two critical (catalytic) residues. The procedure has considerable promise for larger scale searches. PMID:23675414

  3. Seasonal changes in stress indicators in high level football.

    PubMed

    Faude, O; Kellmann, M; Ammann, T; Schnittker, R; Meyer, T

    2011-04-01

    This study aimed at describing changes in stress and performance indicators throughout a competitive season in high level football. 15 players (19.5±3.0 years, 181±5 cm, 75.7±9.0 kg) competing under professional circumstances were tested at baseline and 3 times during the season 2008/09 (in-season 1, 2, 3). Testing consisted of the Recovery-Stress Questionnaire for Athletes (Total Stress and Recovery score), vertical jump tests (counter movement and drop jump (DJ)), and a maximal ramp-like running test. Average match exposure was higher during a 3-weeks period prior to in-season 3 compared to in-season 1 and 2 (1.5 vs. 1 h/week, p=0.05). Total Stress score was elevated at in-season 1 and 2 compared to baseline (p<0.01) with a further increase at in-season 3 (p<0.03; generalized eta squared (η(2)(g))=0.37). Total Recovery score was decreased at in-season 1 and 3 compared to baseline (p<0.05; η(2)(g)=0.21). Maximal running velocity (V(max)) and jumping heights were not significantly affected (η(2)(g)≤0.04). Changes in DJ height and V (max) between baseline and in-season 3 were correlated with the corresponding changes in Total Stress score (r=-0.55 and r=-0.61, p<0.03). Usual match exposure during a professional football season does not induce relevant changes in performance indicators. Accumulated stress and a lack of recovery towards the end of a season might be indicated by psychometric deteriorations. © Georg Thieme Verlag KG Stuttgart · New York.

  4. Activity profile of high-level Australian lacrosse players.

    PubMed

    Polley, Chris S; Cormack, Stuart J; Gabbett, Tim J; Polglaze, Ted

    2015-01-01

    Despite lacrosse being one of the fastest growing team sports in the world, there is a paucity of information detailing the activity profile of high-level players. Microtechnology systems (global positioning systems and accelerometers) provide the opportunity to obtain detailed information on the activity profile in lacrosse. Therefore, this study aimed to analyze the activity profile of lacrosse match-play using microtechnology. Activity profile variables assessed relative to minutes of playing time included relative distance (meters per minute), distance spent standing (0-0.1 m·min), walking (0.2-1.7 m·min), jogging (1.8-3.2 m·min), running (3.3-5.6 m·min), sprinting (≥5.7 m·min), number of high, moderate, low accelerations and decelerations, and player load (PL per minute), calculated as the square root of the sum of the squared instantaneous rate of change in acceleration in 3 vectors (medio-lateral, anterior-posterior, and vertical). Activity was recorded from 14 lacrosse players over 4 matches during a national tournament. Players were separated into positions of attack, midfield, or defense. Differences (effect size [ES] ± 90% confidence interval) between positions and periods of play were considered likely positive when there was ≥75% likelihood of the difference exceeding an ES threshold of 0.2. Midfielders likely covered more meters per minute (mean ± SD: 100 ± 11) than attackers (87 ± 14; ES = 0.89 ± 1.04) and defenders (79 ± 14; ES = 1.54 ± 0.94), and performed more moderate and high accelerations and decelerations. Almost all variables across positions were reduced in quarter 4 compared with quarter 1. Coaches should account for positional differences when preparing lacrosse players for competition.
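
    The player-load definition quoted above translates directly into code: accumulate the square root of the summed squared instantaneous changes in tri-axial acceleration and normalize by playing minutes. The sampling rate, the synthetic trace, and the absence of any vendor scaling factor below are assumptions.

```python
# Player load as described above: accumulated sqrt of summed squared
# instantaneous changes in tri-axial acceleration, reported per minute of
# playing time. Scaling conventions vary between systems; none is applied here.
import numpy as np

def player_load_per_minute(acc_xyz: np.ndarray, sample_rate_hz: float) -> float:
    """acc_xyz: (n_samples, 3) accelerometer trace in g."""
    delta = np.diff(acc_xyz, axis=0)                       # instantaneous change per axis
    load = np.sum(np.sqrt(np.sum(delta ** 2, axis=1)))     # accumulate sqrt of summed squares
    minutes = acc_xyz.shape[0] / sample_rate_hz / 60.0
    return float(load / minutes)

# Hypothetical 100 Hz trace for one minute of activity.
rng = np.random.default_rng(0)
trace = 1.0 + 0.2 * rng.standard_normal((6000, 3))
print(f"PL/min = {player_load_per_minute(trace, 100.0):.1f}")
```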

  5. Dental rehabilitation of amelogenesis imperfecta using thermoformed templates.

    PubMed

    Sockalingam, Snmp

    2011-01-01

    Amelogenesis imperfecta represents a group of dental developmental conditions that are genomic in origin. Hypoplastic AI, hypomineralised AI or both in combination were the most common types seen clinically. This paper describes oral rehabilitation of a 9-year-old Malay girl with inherited hypoplastic AI using transparent thermoforming templates. The defective surface areas were reconstructed to their original dimensions on stone cast models of the upper and lower arches using composite, and transparent thermoform templates were fabricated on the models. The templates were used as crown formers to reconstruct the defective teeth clinically using esthetically matching composite. The usage of the templates allowed direct light curing of the composite, accurate reproducibility of the anatomic contours of the defective teeth, reduced chair-side time and easy contouring and placement of homogenous thickness of composite in otherwise inaccessible sites of the affected teeth.

  6. Metrics associated with NIH funding: a high-level view

    PubMed Central

    Jordan, Paul

    2011-01-01

    Objective To introduce the availability of grant-to-article linkage data associated with National Institutes of Health (NIH) grants and to perform a high-level analysis of the publication outputs and impacts associated with those grants. Design Articles were linked to the grants they acknowledge by applying a parsing and matching process, as embodied in the NIH Scientific Publication Information Retrieval & Evaluation System, to the grant acknowledgment strings in PubMed. Additional data from PubMed and citation counts from Scopus were added to the linkage data. The data comprise 2 572 576 records from 1980 to 2009. Results The data show that synergies between NIH institutes are increasing over time; 29% of current articles acknowledge grants from multiple institutes. The median time lag to publication for a new grant is 3 years. Each grant contributes to approximately 1.7 articles per year, averaged over all grant types. Articles acknowledging US Public Health Service (PHS, which includes NIH) funding are cited twice as much as US-authored articles acknowledging no funding source. Articles acknowledging both PHS funding and a non-US government funding source receive on average 40% more citations than those acknowledging PHS funding sources alone. Conclusion The US PHS is effective at funding research with a higher-than-average impact. The data are amenable to further and much more detailed analysis.
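
    The actual NIH linkage system is far more involved than a regular expression, but the sketch below conveys the flavor of the parsing step: pulling grant-like identifiers out of an acknowledgment string. The pattern and example string are hypothetical and much simpler than real NIH grant-number formats.

```python
# Illustrative only: extract grant-like identifiers from an acknowledgment
# string with a hypothetical, simplified pattern.
import re

GRANT_PATTERN = re.compile(r"\b([A-Z]\d{2})\s*([A-Z]{2})\s*(\d{6})\b")

acknowledgment = "Supported by grants R01 CA123456 and P30 DK098765 from the NIH."
grants = ["".join(m) for m in GRANT_PATTERN.findall(acknowledgment)]
print(grants)   # ['R01CA123456', 'P30DK098765']
```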

  7. Discovery of high-level behavior from observation of human performance in a strategic game.

    PubMed

    Stensrud, Brian S; Gonzalez, Avelino J

    2008-06-01

    This paper explores the issues faced in creating a system that can learn tactical human behavior merely by observing a human perform the behavior in a simulation. More specifically, this paper describes a technique based on fuzzy ARTMAP (FAM) neural networks to discover the criteria that cause a transition between contexts during a strategic game simulation. The approach depends on existing context templates that can identify the high-level action of the human, given a description of the situation along with his action. The learning task then becomes the identification and representation of the context sequence executed by the human. In this paper, we present the FAM/Template-based Interpretation Learning Engine (FAMTILE). This system seeks to achieve this learning task by constructing rules that govern the context transitions made by the human. To evaluate FAMTILE, six test scenarios were developed to achieve three distinct evaluation goals: 1) to assess the learning capabilities of FAM; 2) to evaluate the ability of FAMTILE to correctly predict human and context selections, given an observation; and 3) more fundamentally, to create a model of the human's behavior that can perform the high-level task at a comparable level of proficiency.

  8. Cubic nitride templates

    DOEpatents

    Burrell, Anthony K; McCleskey, Thomas Mark; Jia, Quanxi; Mueller, Alexander H; Luo, Hongmei

    2013-04-30

    A polymer-assisted deposition process for deposition of epitaxial cubic metal nitride films and the like is presented. The process includes solutions of one or more metal precursor and soluble polymers having binding properties for the one or more metal precursor. After a coating operation, the resultant coating is heated at high temperatures under a suitable atmosphere to yield metal nitride films and the like. Such films can be used as templates for the development of high quality cubic GaN based electronic devices.

  9. Templating Water Stains for Nanolithography

    PubMed Central

    Liao, Wei-Ssu; Chen, Xin; Chen, Jixin; Cremer, Paul S.

    2008-01-01

    Herein, a nanoscale patterning technique is demonstrated for creating twin features in polymers and metals. The process works by combining evaporative staining with a templating process. Well-ordered hexagonally arrayed double rings were fabricated using hydrophobic spherical templates. The diameter of the rings, the width of individual rings, and the spacing between concentric and adjacent rings could be tuned by varying the solution conditions. Arrays could be made without the outer ring by employing hydrophilic templates. PMID:17637019

  10. Matched Filter Detection of Microseismicity at Ngatamariki and Rotokawa Geothermal Fields, Central North Island, New Zealand

    NASA Astrophysics Data System (ADS)

    Hopp, C. J.; Savage, M. K.; Townend, J.; Sherburn, S.

    2016-12-01

    Monitoring patterns in local microseismicity gives clues to the existence and location of subsurface structures. In the context of a geothermal reservoir, subsurface structures often indicate areas of high permeability and are vitally important in understanding fluid flow within the geothermal resource. Detecting and locating microseismic events within an area of power generation, however, is often challenging due to high levels of noise associated with nearby power plant infrastructure. In this situation, matched filter detection improves drastically upon standard earthquake detection techniques, specifically when events are likely induced by fluid injection and are therefore near-repeating. Using an earthquake catalog of 637 events which occurred between 1 January and 18 November 2015 as our initial dataset, we implemented a matched filtering routine for the Mighty River Power (MRP) geothermal fields at Rotokawa and Ngatamariki, central North Island, New Zealand. We detected nearly 21,000 additional events across both geothermal fields, a roughly 30-fold increase from the original catalog. On average, each of the 637 template events detected 45 additional events throughout the study period, with a maximum number of additional detections for a single template of 359. Cumulative detection rates for all template events, in general, do not mimic large scale changes in injection rates within the fields, however we do see indications of an increase in detection rate associated with power plant shutdown at Ngatamariki. Locations of detected events follow established patterns of historic seismicity at both Ngatamariki and Rotokawa. One large cluster of events persists in the southeastern portion of Rotokawa and is likely bounded to the northwest by a known fault dividing the injection and production sections of the field. Two distinct clusters of microseismicity occur in the North and South of Ngatamariki, the latter appearing to coincide with a structure dividing the
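
    The matched-filter idea used in the study can be sketched in a few lines: slide a template waveform along continuous data, compute the normalized cross-correlation at each lag, and declare a detection where the correlation exceeds a threshold. Real workflows correlate many templates across multiple stations and channels and use MAD-based thresholds; this single-channel, fixed-threshold version with synthetic data is only illustrative.

```python
# Single-channel matched-filter detection by normalized cross-correlation.
import numpy as np

def matched_filter(data: np.ndarray, template: np.ndarray, threshold: float = 0.8):
    n = len(template)
    t = (template - template.mean()) / (template.std() + 1e-12)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        cc[i] = np.dot((w - w.mean()) / (w.std() + 1e-12), t) / n
    return np.flatnonzero(cc > threshold), cc

# Synthetic demo: bury two copies of a damped-sine "event" in noise.
rng = np.random.default_rng(0)
event = np.sin(np.linspace(0, 20, 200)) * np.exp(-np.linspace(0, 4, 200))
data = 0.1 * rng.standard_normal(5000)
data[1000:1200] += event
data[3500:3700] += event
detections, cc = matched_filter(data, event)
print(detections[:5], detections[-5:])   # clusters of indices near 1000 and 3500
```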

  11. Impact of Surgical Template on the Accuracy of Implant Placement.

    PubMed

    Xu, Liang-Wei; You, Jia; Zhang, Jian-Xing; Liu, Yun-Feng; Peng, Wei

    2016-12-01

    To achieve functional and esthetic results, implants must be placed accurately; however, little information is available regarding the effect of surgical templates on the accuracy of implant placement. Thus, the aim of this study was to measure the deviation between actual and planned implant positions, and determine the deviation caused by the surgical template. Jaws from 16 patients were scanned using cone beam computed tomography (CBCT). For our study, 53 implants were planned in a virtual 3D environment, of which 35 were inserted in the mandible and 18 in the maxilla. A stereolithographic (SLA) surgical template was created. A CBCT scan of the surgical template fitted on a plaster model was performed, and the images obtained were matched to virtual implant plan images that contained the planned implant position. The actual implant position was acquired from the registration position of the surgical template. Deviation between actual and planned implant positions was analyzed. Mean central deviation at the hex and apex was 0.456 mm and 0.515 mm, respectively. Mean value of horizontal deviation at the hex was 0.193 mm, horizontal deviation at the apex was 0.277 mm, vertical deviation at the hex was 0.388 mm, vertical deviation at the apex was 0.390 mm, and angular deviation was 0.621°. Our study results revealed a significant deviation between actual and planned implant positions caused by the surgical template. © 2015 by the American College of Prosthodontists.
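
    The deviation measures reported above can be computed from the planned and actual hex and apex coordinates as sketched below; the coordinates are hypothetical (in millimetres, vertical along z), and the clinical registration step that produces them is not shown.

```python
# Central, horizontal, vertical, and angular deviations for one implant,
# given planned and actual hex (platform) and apex points.
import numpy as np

def deviations(planned_hex, planned_apex, actual_hex, actual_apex):
    planned_hex, planned_apex = np.asarray(planned_hex), np.asarray(planned_apex)
    actual_hex, actual_apex = np.asarray(actual_hex), np.asarray(actual_apex)
    out = {}
    for name, p, a in [("hex", planned_hex, actual_hex), ("apex", planned_apex, actual_apex)]:
        d = a - p
        out[f"central_{name}"] = float(np.linalg.norm(d))
        out[f"horizontal_{name}"] = float(np.linalg.norm(d[:2]))
        out[f"vertical_{name}"] = float(abs(d[2]))
    axis_p = planned_apex - planned_hex
    axis_a = actual_apex - actual_hex
    cosang = np.dot(axis_p, axis_a) / (np.linalg.norm(axis_p) * np.linalg.norm(axis_a))
    out["angular_deg"] = float(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
    return out

print(deviations(planned_hex=[0.0, 0.0, 0.0], planned_apex=[0.0, 0.0, -10.0],
                 actual_hex=[0.15, 0.10, 0.35], actual_apex=[0.20, 0.18, -9.70]))
```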

  12. A HIGH-LEVEL PYTHON INTERFACE TO THE FERMILAB ACNET CONTROL SYSTEM

    SciTech Connect

    Piot, P.; Halavanau, A.

    2016-10-19

    This paper discusses the implementation of a Python-based high-level interface to the Fermilab ACNET control system. The interface has been successfully employed during the commissioning of the Fermilab Accelerator Science & Technology (FAST) facility. Specifically, we present examples of applications at FAST, which include interfacing the elegant program to assist lattice matching, an automated emittance measurement via the quadrupole-scan method, and transverse transport matrix measurement of a superconducting RF cavity.

  13. DNA-templated nanofabrication.

    PubMed

    Becerril, Héctor A; Woolley, Adam T

    2009-02-01

    Nanofabrication, or the organizational control over matter at the nanometre scale, is an intriguing scientific challenge requiring multidisciplinary tools for its solution. DNA is a biomolecule that can be combined with other nanometre-scale entities through chemical self-assembly to form a broad variety of nanomaterials. In this tutorial review we present the principles that allow DNA to interact with other chemical species, and describe the challenges and potential applications of DNA as a template for making both biological and inorganic features with nanometre resolution. As such, this report should be of interest to chemists, surface and materials scientists, biologists, and nanotechnologists, as well as others who seek to use DNA in nanofabrication.

  14. Templated quasicrystalline molecular layers

    NASA Astrophysics Data System (ADS)

    Smerdon, Joe; Young, Kirsty; Lowe, Michael; Hars, Sanger; Yadav, Thakur; Hesp, David; Dhanak, Vinod; Tsai, An-Pang; Sharma, Hem Raj; McGrath, Ronan

    2014-03-01

    Quasicrystals are materials with long range ordering but no periodicity. We report scanning tunneling microscopy (STM) observations of quasicrystalline molecular layers on five-fold quasicrystal surfaces. The molecules adopt positions and orientations on the surface consistent with the quasicrystalline ordering of the substrate. Carbon-60 adsorbs atop sufficiently-separated Fe atoms on icosahedral Al-Cu-Fe to form a unique quasicrystalline lattice whereas further C60 molecules decorate remaining surface Fe atoms in a quasi-degenerate fashion. Pentacene (Pn) adsorbs at tenfold-symmetric points around surface-bisected rhombic triacontahedral clusters in icosahedral Ag-In-Yb. These systems constitute the first demonstrations of quasicrystalline molecular ordering on a template. EPSRC EP/D05253X/1, EP/D071828/1, UK BIS.

  15. A Template for Insect Cryopreservation

    USDA-ARS?s Scientific Manuscript database

    This article is intended to update the reader on the progress made on insect embryo cryopreservation in the past 20 years and gives information for developing a protocol for cryopreserving insects by using a 2001 study as a template. The study used for the template is the cryopreservation of the Old...

  16. Prediction of enzyme function based on 3D templates of evolutionarily important amino acids

    PubMed Central

    Kristensen, David M; Ward, R Matthew; Lisewski, Andreas Martin; Erdin, Serkan; Chen, Brian Y; Fofanov, Viacheslav Y; Kimmel, Marek; Kavraki, Lydia E; Lichtarge, Olivier

    2008-01-01

    Background Structural genomics projects such as the Protein Structure Initiative (PSI) yield many new structures, but often these have no known molecular functions. One approach to recover this information is to use 3D templates – structure-function motifs that consist of a few functionally critical amino acids and may suggest functional similarity when geometrically matched to other structures. Since experimentally determined functional sites are not common enough to define 3D templates on a large scale, this work tests a computational strategy to select relevant residues for 3D templates. Results Based on evolutionary information and heuristics, an Evolutionary Trace Annotation (ETA) pipeline built templates for 98 enzymes, half taken from the PSI, and sought matches in a non-redundant structure database. On average each template matched 2.7 distinct proteins, of which 2.0 share the first three Enzyme Commission digits as the template's enzyme of origin. In many cases (61%) a single most likely function could be predicted as the annotation with the most matches, and in these cases such a plurality vote identified the correct function with 87% accuracy. ETA was also found to be complementary to sequence homology-based annotations. When matches are required to both geometrically match the 3D template and to be sequence homologs found by BLAST or PSI-BLAST, the annotation accuracy is greater than either method alone, especially in the region of lower sequence identity where homology-based annotations are least reliable. Conclusion These data suggest that knowledge of evolutionarily important residues improves functional annotation among distant enzyme homologs. Since, unlike other 3D template approaches, the ETA method bypasses the need for experimental knowledge of the catalytic mechanism, it should prove a useful, large scale, and general adjunct to combine with other methods to decipher protein function in the structural proteome. PMID:18190718

  17. A Framework for Translating a High Level Security Policy into Low Level Security Mechanisms

    NASA Astrophysics Data System (ADS)

    Hassan, Ahmed A.; Bahgat, Waleed M.

    2010-01-01

    Security policies have different components; firewall, active directory, and IDS are some examples of these components. Enforcing network security policies via low-level security mechanisms faces some essential difficulties, of which consistency, verification, and maintenance are the major ones. One approach to overcome these difficulties is to automate the process of translating a high-level security policy into low-level security mechanisms. This paper introduces a framework for an automated process that translates a high-level security policy into low-level security mechanisms. The framework is described in terms of three phases. In the first phase, all network assets are categorized according to their roles in network security, and the relations between them are identified to constitute the network security model. This proposed model is based on organization-based access control (OrBAC). However, the proposed model extends the OrBAC model to include not only access control policy but also some other administrative security policies, such as auditing policy. Besides, the proposed model enables matching of each rule of the high-level security policy with the corresponding ones of the low-level security policy. Through the second phase of the proposed framework, the high-level security policy is mapped onto the network security model; this phase can be considered a translation of the high-level security policy into an intermediate model level. Finally, the intermediate model level is translated automatically into low-level security mechanisms. The paper illustrates the applicability of the proposed approach through an application example.
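
    As a rough illustration of the translation idea (not the paper's actual framework), the Python sketch below maps a single abstract, OrBAC-style rule onto a toy network security model and expands it into iptables-like firewall entries; every role, asset, and rule shown is hypothetical.

      # A toy sketch (not the paper's framework): one abstract, OrBAC-style rule
      # is mapped onto a minimal network security model and then expanded into
      # iptables-like firewall entries. All names and addresses are hypothetical.
      security_model = {
          "roles": {"web_server": ["10.0.1.10"], "internet": ["0.0.0.0/0"]},
      }

      high_level_rule = {"permission": "allow", "source": "internet",
                         "target": "web_server", "activity": "http"}

      def translate(rule, model):
          """Map role names to concrete assets, then emit low-level rules."""
          ports = {"http": 80, "https": 443}
          action = "ACCEPT" if rule["permission"] == "allow" else "DROP"
          lines = []
          for src in model["roles"][rule["source"]]:
              for dst in model["roles"][rule["target"]]:
                  lines.append(f"-A FORWARD -s {src} -d {dst} -p tcp "
                               f"--dport {ports[rule['activity']]} -j {action}")
          return lines

      print("\n".join(translate(high_level_rule, security_model)))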

  18. Brain templates and atlases.

    PubMed

    Evans, Alan C; Janke, Andrew L; Collins, D Louis; Baillet, Sylvain

    2012-08-15

    The core concept within the field of brain mapping is the use of a standardized, or "stereotaxic", 3D coordinate frame for data analysis and reporting of findings from neuroimaging experiments. This simple construct allows brain researchers to combine data from many subjects such that group-averaged signals, be they structural or functional, can be detected above the background noise that would swamp subtle signals from any single subject. Where the signal is robust enough to be detected in individuals, it allows for the exploration of inter-individual variance in the location of that signal. From a larger perspective, it provides a powerful medium for comparison and/or combination of brain mapping findings from different imaging modalities and laboratories around the world. Finally, it provides a framework for the creation of large-scale neuroimaging databases or "atlases" that capture the population mean and variance in anatomical or physiological metrics as a function of age or disease. However, while the above benefits are not in question at first order, there are a number of conceptual and practical challenges that introduce second-order incompatibilities among experimental data. Stereotaxic mapping requires two basic components: (i) the specification of the 3D stereotaxic coordinate space, and (ii) a mapping function that transforms a 3D brain image from "native" space, i.e. the coordinate frame of the scanner at data acquisition, to that stereotaxic space. The first component is usually expressed by the choice of a representative 3D MR image that serves as target "template" or atlas. The native image is re-sampled from native to stereotaxic space under the mapping function that may have few or many degrees of freedom, depending upon the experimental design. The optimal choice of atlas template and mapping function depend upon considerations of age, gender, hemispheric asymmetry, anatomical correspondence, spatial normalization methodology and disease
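
    The second component, the mapping from native to stereotaxic space, can be sketched in a few lines. The example below is an assumed setup (not tied to any particular atlas pipeline): it resamples a native-space volume onto a template grid given a 4x4 native-to-template affine, using scipy purely for illustration.

      # A minimal sketch of the second component: resampling a native-space
      # volume onto a stereotaxic template grid under an affine mapping. The
      # 4x4 matrix is assumed to come from a prior registration step; the toy
      # shift below is only for illustration.
      import numpy as np
      from scipy.ndimage import affine_transform

      def to_template_space(native_img, native_to_template, template_shape):
          # affine_transform maps output (template) coordinates back to input
          # (native) coordinates, so the inverse transform is passed in.
          inv = np.linalg.inv(native_to_template)
          return affine_transform(native_img, inv[:3, :3], offset=inv[:3, 3],
                                  output_shape=template_shape, order=1)

      native = np.random.rand(64, 64, 64)
      A = np.eye(4)
      A[:3, 3] = [2.0, -3.0, 1.5]          # toy rigid shift, native -> template
      resampled = to_template_space(native, A, (64, 64, 64))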

  19. Templated nanocarbons for energy storage.

    PubMed

    Nishihara, Hirotomo; Kyotani, Takashi

    2012-08-28

    The template carbonization method is a powerful tool for producing carbon materials with precisely controlled structures at the nanometer level. The resulting templated nanocarbons exhibit extraordinarily unique (often ordered) structures that could never be produced by any of the conventional methods for carbon production. This review summarizes recent publications about templated nanocarbons and their composites used for energy storage applications, including hydrogen storage, electrochemical capacitors, and lithium-ion batteries. The main objective of this review is to clarify the true significance of the templated nanocarbons for each application. For this purpose, the performance characteristics of almost all templated nanocarbons reported thus far are listed and compared with those of conventional materials, so that the advantages/disadvantages of the templated nanocarbons are elucidated. From the practical point of view, the high production cost and poor mass-producibility of the templated nanocarbons make them rather difficult to utilize; however, the study of their unique, specific, and ordered structures enables a deeper insight into energy storage mechanisms and the guidelines for developing energy storage materials. Thus, another important purpose of this work is to establish such general guidelines and to propose future strategies for the production of carbon materials with improved performance for energy storage applications. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. Cloning nanocrystal morphology with soft templates

    NASA Astrophysics Data System (ADS)

    Thapa, Dev Kumar; Pandey, Anshu

    2016-08-01

    In most template-directed preparative methods, while the template decides the nanostructure morphology, the structure of the template itself is a non-general outcome of its peculiar chemistry. Here we demonstrate a template-mediated synthesis that overcomes this deficiency. This synthesis involves overgrowth of a silica template onto a sacrificial nanocrystal. Such templates are used to copy the morphologies of gold nanorods. After template overgrowth, the gold is removed and silver is regrown in the template cavity to produce a single-crystal silver nanorod. This technique allows for duplicating existing nanocrystals, while also providing a quantifiable breakdown of the structure-shape interdependence.

  1. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 25 2011-07-01 2011-07-01 false High-level radioactive waste. 227.30 Section 227.30 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) OCEAN DUMPING...-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the...

  2. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 24 2010-07-01 2010-07-01 false High-level radioactive waste. 227.30 Section 227.30 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) OCEAN DUMPING...-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the...

  3. 46 CFR 119.530 - Bilge high level alarms.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Bilge high level alarms. 119.530 Section 119.530... Bilge and Ballast Systems § 119.530 Bilge high level alarms. (a) Each vessel must be provided with a visual and audible alarm at the operating station to indicate a high water level in each of the...

  4. 46 CFR 182.530 - Bilge high level alarms.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 7 2010-10-01 2010-10-01 false Bilge high level alarms. 182.530 Section 182.530... TONS) MACHINERY INSTALLATION Bilge and Ballast Systems § 182.530 Bilge high level alarms. (a) On a... operating station to indicate a high water level in each of the following normally unmanned spaces: (1)...

  5. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 26 2013-07-01 2013-07-01 false High-level radioactive waste. 227.30...-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste...

  6. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 26 2012-07-01 2011-07-01 true High-level radioactive waste. 227.30...-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste...

  7. 40 CFR 227.30 - High-level radioactive waste.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 25 2014-07-01 2014-07-01 false High-level radioactive waste. 227.30...-level radioactive waste. High-level radioactive waste means the aqueous waste resulting from the operation of the first cycle solvent extraction system, or equivalent, and the concentrated waste...

  8. 46 CFR 182.530 - Bilge high level alarms.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 7 2012-10-01 2012-10-01 false Bilge high level alarms. 182.530 Section 182.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS (UNDER 100 GROSS TONS) MACHINERY INSTALLATION Bilge and Ballast Systems § 182.530 Bilge high level alarms. (a) On a...

  9. 46 CFR 119.530 - Bilge high level alarms.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Bilge high level alarms. 119.530 Section 119.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS CARRYING MORE... Bilge and Ballast Systems § 119.530 Bilge high level alarms. (a) Each vessel must be provided with a...

  10. 46 CFR 119.530 - Bilge high level alarms.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Bilge high level alarms. 119.530 Section 119.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS CARRYING MORE... Bilge and Ballast Systems § 119.530 Bilge high level alarms. (a) Each vessel must be provided with a...

  11. 46 CFR 119.530 - Bilge high level alarms.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Bilge high level alarms. 119.530 Section 119.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS CARRYING MORE... Bilge and Ballast Systems § 119.530 Bilge high level alarms. (a) Each vessel must be provided with a...

  12. 46 CFR 182.530 - Bilge high level alarms.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 7 2014-10-01 2014-10-01 false Bilge high level alarms. 182.530 Section 182.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS (UNDER 100 GROSS TONS) MACHINERY INSTALLATION Bilge and Ballast Systems § 182.530 Bilge high level alarms. (a) On a...

  13. 46 CFR 182.530 - Bilge high level alarms.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 7 2011-10-01 2011-10-01 false Bilge high level alarms. 182.530 Section 182.530 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) SMALL PASSENGER VESSELS (UNDER 100 GROSS TONS) MACHINERY INSTALLATION Bilge and Ballast Systems § 182.530 Bilge high level alarms. (a) On a...

  14. Process for solidifying high-level nuclear waste

    DOEpatents

    Ross, Wayne A.

    1978-01-01

    The addition of a small amount of reducing agent to a mixture of a high-level radioactive waste calcine and glass frit before the mixture is melted will produce a more homogeneous glass which is leach-resistant and suitable for long-term storage of high-level radioactive waste products.

  15. Automatic script identification from images using cluster-based templates

    SciTech Connect

    Hochberg, J.; Kerns, L.; Kelly, P.; Thomas, T.

    1995-02-01

    We have developed a technique for automatically identifying the script used to generate a document that is stored electronically in bit image form. Our approach differs from previous work in that the distinctions among scripts are discovered by an automatic learning procedure, without any hands-on analysis. We first develop a set of representative symbols (templates) for each script in our database (Cyrillic, Roman, etc.). We do this by identifying all textual symbols in a set of training documents, scaling each symbol to a fixed size, clustering similar symbols, pruning minor clusters, and finding each cluster's centroid. To identify a new document's script, we identify and scale a subset of symbols from the document and compare them to the templates for each script. We choose the script whose templates provide the best match. Our current system distinguishes among the Armenian, Burmese, Chinese, Cyrillic, Ethiopic, Greek, Hebrew, Japanese, Korean, Roman, and Thai scripts with over 90% accuracy.
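
    The matching stage lends itself to a compact sketch. The Python fragment below is a toy stand-in with random data (not the authors' system): it scores a document's scaled symbols against each script's cluster-centroid templates and picks the script with the best average nearest-template distance.

      # A toy stand-in for the matching stage (random data, not the authors'
      # system): score a document's scaled symbols against each script's
      # cluster-centroid templates and pick the best-matching script.
      import numpy as np

      def best_script(doc_symbols, script_templates):
          # doc_symbols: (n, h*w) scaled, flattened symbols from one document;
          # script_templates: dict mapping script name -> (k, h*w) centroids.
          scores = {}
          for script, templates in script_templates.items():
              d = np.linalg.norm(doc_symbols[:, None, :] - templates[None, :, :], axis=2)
              scores[script] = d.min(axis=1).mean()   # mean nearest-template distance
          return min(scores, key=scores.get)

      rng = np.random.default_rng(0)
      templates = {"Roman": rng.random((40, 256)), "Cyrillic": rng.random((40, 256))}
      symbols = rng.random((120, 256))
      print(best_script(symbols, templates))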

  16. On the construction of a new stellar classification template library for the LAMOST spectral analysis pipeline

    SciTech Connect

    Wei, Peng; Luo, Ali; Li, Yinbi; Tu, Liangping; Wang, Fengfei; Zhang, Jiannan; Chen, Xiaoyan; Hou, Wen; Kong, Xiao; Wu, Yue; Zuo, Fang; Yi, Zhenping; Zhao, Yongheng; Chen, Jianjun; Du, Bing; Guo, Yanxin; Ren, Juanjuan; Pan, Jingchang; Jiang, Bin; Liu, Jie E-mail: weipeng@nao.cas.cn; and others

    2014-05-01

    The LAMOST spectral analysis pipeline, called the 1D pipeline, aims to classify and measure the spectra observed in the LAMOST survey. Through this pipeline, the observed stellar spectra are classified into different subclasses by matching with template spectra. Consequently, the performance of the stellar classification greatly depends on the quality of the template spectra. In this paper, we construct a new LAMOST stellar spectral classification template library, which is supposed to improve the precision and credibility of the present LAMOST stellar classification. About one million spectra are selected from LAMOST Data Release One to construct the new stellar templates, and they are gathered in 233 groups by two criteria: (1) pseudo g – r colors obtained by convolving the LAMOST spectra with the Sloan Digital Sky Survey ugriz filter response curve, and (2) the stellar subclass given by the LAMOST pipeline. In each group, the template spectra are constructed using three steps. (1) Outliers are excluded using the Local Outlier Probabilities algorithm, and then the principal component analysis method is applied to the remaining spectra of each group. About 5% of the one million spectra are ruled out as outliers. (2) All remaining spectra are reconstructed using the first principal components of each group. (3) The weighted average spectrum is used as the template spectrum in each group. Using the previous 3 steps, we initially obtain 216 stellar template spectra. We visually inspect all template spectra, and 29 spectra are abandoned due to low spectral quality. Furthermore, the MK classification for the remaining 187 template spectra is manually determined by comparing with 3 template libraries. Meanwhile, 10 template spectra whose subclass is difficult to determine are abandoned. Finally, we obtain a new template library containing 183 LAMOST template spectra with 61 different MK classes by combining it with the current library.
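
    The three per-group steps can be schematized as below; this is a simplified stand-in in which sigma clipping replaces the Local Outlier Probabilities algorithm, the final average is unweighted, and all spectra in a group are assumed to be resampled onto a common wavelength grid.

      # A schematic of the three per-group steps. Sigma clipping stands in for
      # the Local Outlier Probabilities algorithm, and the final average is
      # unweighted; spectra are assumed to share a common wavelength grid.
      import numpy as np

      def build_template(spectra, n_components=5):
          # Step 1 (stand-in for LoOP): drop spectra far from the group median.
          med = np.median(spectra, axis=0)
          dev = np.linalg.norm(spectra - med, axis=1)
          keep = spectra[dev < dev.mean() + 3 * dev.std()]
          # Step 2: reconstruct each spectrum from the leading principal components.
          mean = keep.mean(axis=0)
          centered = keep - mean
          _, _, vt = np.linalg.svd(centered, full_matrices=False)
          basis = vt[:n_components]
          reconstructed = mean + centered @ basis.T @ basis
          # Step 3: average the reconstructions to obtain the group template.
          return reconstructed.mean(axis=0)

      template = build_template(np.random.rand(200, 3000))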

  17. Integration of professional judgement and decision-making in high-level adventure sports coaching practice.

    PubMed

    Collins, Loel; Collins, Dave

    2015-01-01

    This study examined the integration of professional judgement and decision-making processes in adventure sports coaching. The study utilised a thematic analysis approach to investigate the decision-making practices of a sample of high-level adventure sports coaches over a series of sessions. Results revealed that, in order to make judgements and decisions in practice, expert coaches employ a range of practical and pedagogic management strategies to create and opportunistically use time for decision-making. These approaches include span of control and time management strategies to facilitate the decision-making process regarding risk management, venue selection, aims, objectives, session content, and differentiation of the coaching process. The implication for coaches, coach education, and accreditation is the recognition and training of the approaches that "create time" for the judgements in practice, namely "creating space to think". The paper concludes by offering a template for a more expertise-focused progression in adventure sports coaching.

  18. Asymmetric Image-Template Registration

    PubMed Central

    Sabuncu, Mert R.; Yeo, B.T. Thomas; Van Leemput, Koen; Vercauteren, Tom; Golland, Polina

    2010-01-01

    A natural requirement in pairwise image registration is that the resulting deformation is independent of the order of the images. This constraint is typically achieved via a symmetric cost function and has been shown to reduce the effects of local optima. Consequently, symmetric registration has been successfully applied to pairwise image registration as well as the spatial alignment of individual images with a template. However, recent work has shown that the relationship between an image and a template is fundamentally asymmetric. In this paper, we develop a method that reconciles the practical advantages of symmetric registration with the asymmetric nature of image-template registration by adding a simple correction factor to the symmetric cost function. We instantiate our model within a log-domain diffeomorphic registration framework. Our experiments show that exploiting the asymmetry in image-template registration improves alignment in the image coordinates. PMID:20426033

  19. Template Synthesis of Carbon Nanotubules

    NASA Astrophysics Data System (ADS)

    Tee, J. C.; Sanip, S. M.; Aziz, M.; Ismail, A. F.

    2010-03-01

    The template synthesis of carbon nanostructures formed in a porous anodic aluminium oxide (AAO) template with a pore size of 200 nm, by liquid-phase impregnation of the template with a polymer (polyfurfuryl alcohol) followed by carbonization, is studied. The temperatures of exposure to furfuryl alcohol vapour were varied between 50 and 70 °C. The resultant carbon nanotubules were hollow with open ends and had diameters ranging from 220 to 300 nm, in agreement with the pore size of the template used. The BET surface area was found to increase from 11.64 m²/g before pyrolysis to 90.19 m²/g after pyrolysis as a result of the formation of carbon nanotubules.

  20. Automatic capture of attention by conceptually generated working memory templates.

    PubMed

    Sun, Sol Z; Shen, Jenny; Shaw, Mark; Cant, Jonathan S; Ferber, Susanne

    2015-08-01

    Many theories of attention propose that the contents of working memory (WM) can act as an attentional template, which biases processing in favor of perceptually similar inputs. While support has been found for this claim, it is unclear how attentional templates are generated when searching real-world environments. We hypothesized that in naturalistic settings, attentional templates are commonly generated from conceptual knowledge, an idea consistent with sensorimotor models of knowledge representation. Participants performed a visual search task in the delay period of a WM task, where the item in memory was either a colored disk or a word associated with a color concept (e.g., "Rose," associated with red). During search, we manipulated whether a singleton distractor in the array matched the contents of WM. Overall, we found that search times were impaired in the presence of a memory-matching distractor. Furthermore, the degree of impairment did not differ based on the contents of WM. Put differently, regardless of whether participants were maintaining a perceptually colored disk identical to the singleton distractor, or whether they were simply maintaining a word associated with the color of the distractor, the magnitude of attentional capture was the same. Our results suggest that attentional templates can be generated from conceptual knowledge, in the physical absence of the visual feature.

  1. Templated Growth of Carbon Nanotubes

    NASA Technical Reports Server (NTRS)

    Siochi, Emilie J. (Inventor)

    2007-01-01

    A method of growing carbon nanotubes uses a synthesized mesoporous silica template with approximately cylindrical pores being formed therein. The surfaces of the pores are coated with a carbon nanotube precursor, and the template with the surfaces of the pores so-coated is then heated until the carbon nanotube precursor in each pore is converted to a carbon nanotube.

  2. Reference commercial high-level waste glass and canister definition.

    SciTech Connect

    Slate, S.C.; Ross, W.A.; Partain, W.L.

    1981-09-01

    This report presents technical data and performance characteristics of a high-level waste glass and canister intended for use in the design of a complete waste encapsulation package suitable for disposal in a geologic repository. The borosilicate glass contained in the stainless steel canister represents the probable type of high-level waste product that will be produced in a commercial nuclear-fuel reprocessing plant. Development history is summarized for high-level liquid waste compositions, waste glass composition and characteristics, and canister design. The decay histories of the fission products and actinides (plus daughters) calculated by the ORIGEN-II code are presented.

  3. Reference commercial high-level waste glass and canister definition

    NASA Astrophysics Data System (ADS)

    Slate, S. C.; Ross, W. A.; Partain, W. L.

    1981-09-01

    Technical data and performance characteristics of a high level waste glass and canister intended for use in the design of a complete waste encapsulation package suitable for disposal in a geologic repository are presented. The borosilicate glass contained in the stainless steel canister represents the probable type of high level waste product that is produced in a commercial nuclear-fuel reprocessing plant. Development history is summarized for high level liquid waste compositions, waste glass composition and characteristics, and canister design. The decay histories of the fission products and actinides (plus daughters) calculated by the ORIGEN-II code are presented.

  4. Generating cancelable fingerprint templates.

    PubMed

    Ratha, Nalini K; Chikkerur, Sharat; Connell, Jonathan H; Bolle, Ruud M

    2007-04-01

    Biometrics-based authentication systems offer obvious usability advantages over traditional password and token-based authentication schemes. However, biometrics raises several privacy concerns. A biometric is permanently associated with a user and cannot be changed. Hence, if a biometric identifier is compromised, it is lost forever and possibly for every application where the biometric is used. Moreover, if the same biometric is used in multiple applications, a user can potentially be tracked from one application to the next by cross-matching biometric databases. In this paper, we demonstrate several methods to generate multiple cancelable identifiers from fingerprint images to overcome these problems. In essence, a user can be given as many biometric identifiers as needed by issuing a new transformation "key." The identifiers can be cancelled and replaced when compromised. We empirically compare the performance of several algorithms such as Cartesian, polar, and surface folding transformations of the minutiae positions. It is demonstrated through multiple experiments that we can achieve revocability and prevent cross-matching of biometric databases. It is also shown that the transforms are noninvertible by demonstrating that it is computationally as hard to recover the original biometric identifier from a transformed version as by randomly guessing. Based on these empirical results and a theoretical analysis we conclude that feature-level cancelable biometric construction is practicable in large biometric deployments.
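
    One of the transform families mentioned, the Cartesian transformation, can be illustrated with a toy version (below): the image plane is divided into cells, the cells are shuffled by a key-dependent permutation, and each minutia moves with its cell, so a new key yields a new, replaceable template. The cell size, permutation scheme, and key handling here are invented for the example and are not the paper's exact construction.

      # A toy, key-driven Cartesian transformation of minutiae positions (the
      # cell size, permutation scheme, and key handling are invented for this
      # illustration and are not the paper's exact construction).
      import numpy as np

      def cartesian_transform(minutiae, key, image_size=512, cells=8):
          # Shuffle image cells with a key-dependent permutation; each minutia
          # moves with its cell. A new key yields a new, replaceable template.
          rng = np.random.default_rng(key)
          perm = rng.permutation(cells * cells)
          cell = image_size // cells
          out = minutiae.astype(float).copy()
          idx = (minutiae[:, 1] // cell) * cells + (minutiae[:, 0] // cell)
          for i, c in enumerate(idx.astype(int)):
              new_c = perm[c]
              dx = (new_c % cells - c % cells) * cell
              dy = (new_c // cells - c // cells) * cell
              out[i] += (dx, dy)
          return out

      points = np.array([[100, 40], [300, 420], [250, 250]])
      print(cartesian_transform(points, key=42))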

  5. Adaptive, template moderated, spatially varying statistical classification.

    PubMed

    Warfield, S K; Kaus, M; Jolesz, F A; Kikinis, R

    2000-03-01

    A novel image segmentation algorithm was developed to allow the automatic segmentation of both normal and abnormal anatomy from medical images. The new algorithm is a form of spatially varying statistical classification, in which an explicit anatomical template is used to moderate the segmentation obtained by statistical classification. The algorithm consists of an iterated sequence of spatially varying classification and nonlinear registration, which forms an adaptive, template moderated (ATM), spatially varying statistical classification (SVC). Classification methods and nonlinear registration methods are often complementary, both in the tasks where they succeed and in the tasks where they fail. By integrating these approaches the new algorithm avoids many of the disadvantages of each approach alone while exploiting the combination. The ATM SVC algorithm was applied to several segmentation problems, involving different image contrast mechanisms and different locations in the body. Segmentation and validation experiments were carried out for problems involving the quantification of normal anatomy (MRI of brains of neonates) and pathology of various types (MRI of patients with multiple sclerosis, MRI of patients with brain tumors, MRI of patients with damaged knee cartilage). In each case, the ATM SVC algorithm provided a better segmentation than statistical classification or elastic matching alone.
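
    At a very high level, the iteration can be caricatured as in the sketch below; this is a heavily simplified stand-in in which per-class intensity Gaussians replace the statistical classifier and Gaussian smoothing of the current segmentation replaces the nonlinear registration that would re-align the anatomical template.

      # A heavily simplified caricature of the ATM SVC loop: per-class intensity
      # Gaussians stand in for the statistical classifier, and Gaussian smoothing
      # of the current segmentation stands in for the nonlinear registration that
      # would re-align the anatomical template prior.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def atm_svc(image, template_prior, means, sigmas, iterations=5):
          prior = template_prior.copy()
          for _ in range(iterations):
              # Spatially varying classification: likelihood x template-derived prior.
              lik = np.stack([np.exp(-0.5 * ((image - m) / s) ** 2) / s
                              for m, s in zip(means, sigmas)])
              post = lik * prior
              post /= post.sum(axis=0, keepdims=True) + 1e-12
              labels = post.argmax(axis=0)
              # Stand-in for the registration step: re-derive a smooth prior from
              # the current segmentation so template and classification co-adapt.
              prior = np.stack([gaussian_filter((labels == k).astype(float), 3)
                                for k in range(len(means))]) + 1e-3
          return labels

      image = np.concatenate([np.random.normal(0, 1, (32, 64)),
                              np.random.normal(3, 1, (32, 64))])
      prior0 = np.full((2, 64, 64), 0.5)
      segmentation = atm_svc(image, prior0, means=[0.0, 3.0], sigmas=[1.0, 1.0])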

  6. Holism and High Level Wellness in the Treatment of Alcoholism.

    ERIC Educational Resources Information Center

    Bartha, Robert; Davis, Tom

    1982-01-01

    Discusses how a holistic and wellness philosophy is a viable alternative in the treatment of alcoholism. Describes five major dimensions of high-level wellness: nutritional awareness, physical fitness, stress management, environmental sensitivity, and self-responsibility. (RC)

  7. Neptunium estimation in dissolver and high-level-waste solutions

    SciTech Connect

    Pathak, P.N.; Prabhu, D.R.; Kanekar, A.S.; Manchanda, V.K.

    2008-07-01

    This paper deals with the optimization of the experimental conditions for the estimation of 237Np in spent-fuel dissolver/high-level waste solutions using thenoyltrifluoroacetone as the extractant. (authors)

  8. High-Level Waste System Process Interface Description

    SciTech Connect

    d'Entremont, P.D.

    1999-01-14

    The High-Level Waste System is a set of six different processes interconnected by pipelines. These processes function as one large treatment plant that receives, stores, and treats high-level wastes from various generators at SRS and converts them into forms suitable for final disposal. The three major forms are borosilicate glass, which will be eventually disposed of in a Federal Repository, Saltstone to be buried on site, and treated water effluent that is released to the environment.

  9. Templated Self-Assembly of Dynamic Peptide Nucleic Acids.

    PubMed

    Beierle, John M; Ura, Yasuyuki; Ghadiri, M Reza; Leman, Luke J

    2017-09-06

    Template-directed macromolecule synthesis is a hallmark of living systems. Inspired by this natural process, several fundamentally novel mechanisms for template-directed assembly of nucleic acid analogues have been developed. Although these approaches have broad significance, including potential applications in biotechnology and implications for the origins of life, there are unresolved challenges in how to characterize in detail the complex assembly equilibria associated with dynamic templated reactions. Here we describe mechanistic studies of template-directed dynamic assembly for thioester peptide nucleic acid (tPNA), an informational polymer that responds to selection pressures under enzyme-free conditions. To overcome some of the inherent challenges of mechanistic studies of dynamic oligomers, we designed, synthesized, and implemented tPNA-DNA conjugates. The DNA primer region affords a high level of control over the location and register of the tPNA backbone in relation to the template strand. We characterized the degree and kinetics of dynamic nucleobase mismatch correction at defined backbone positions. Furthermore, we report the fidelity of dynamic assembly in tPNA as a function of position along the peptide backbone. Finally, we present theoretical studies that explore the level of fidelity that can be expected for an oligomer having a given hybridization affinity in dynamic templated reactions and provide guidance for the future development of sequence self-editing polymers and materials. As our results demonstrate, the use of molecular conjugates of constitutionally static and dynamic polymers establishes a new methodology for expediting the characterization of the complex chemical equilibria that underlie the assembly of dynamic informational polymers.

  10. e-Stars Template Builder

    NASA Technical Reports Server (NTRS)

    Cox, Brian

    2003-01-01

    e-Stars Template Builder is a computer program that implements a concept of enabling users to rapidly gain access to information on projects of NASA's Jet Propulsion Laboratory. The information about a given project is not stored in a database but rather in a network that follows the project as it develops. e-Stars Template Builder resides on a server computer, using Practical Extraction and Reporting Language (PERL) scripts to create what are called "e-STARS node templates," which are software constructs that allow for project-specific configurations. The software resides on the server and does not require any special software on the user's computer other than an Internet browser. e-Stars Template Builder is compatible with Windows, Macintosh, and UNIX operating systems. A user invokes e-Stars Template Builder from a browser window. Operations that can be performed by the user include the creation of child processes and the addition of links and descriptions of documentation to existing pages or nodes. Through this addition of "child" nodes, a network that reflects the development of a project is generated.

  11. Development of Crystal-Tolerant High-Level Waste Glasses

    SciTech Connect

    Matyas, Josef; Vienna, John D.; Schaible, Micah J.; Rodriguez, Carmen P.; Crum, Jarrod V.; Arrigoni, Alyssa L.; Tate, Rachel M.

    2010-12-17

    Twenty-five glasses were formulated. They were batched from HLW AZ-101 simulant or raw chemicals, melted, and subjected to a series of tests to elucidate the effect of spinel-forming components (Ni, Fe, Cr, Mn, and Zn), Al, and noble metals (Rh2O3 and RuO2) on the accumulation rate of spinel crystals in the glass discharge riser of the high-level waste (HLW) melter. In addition, the processing properties of the glasses, such as the viscosity and the liquidus temperature (TL), were measured as a function of temperature and composition. Furthermore, the settling of spinel crystals in transparent low-viscosity fluids was studied at room temperature to assess the shape factor and hindered settling coefficient of spinel crystals in the Stokes equation. The experimental results suggest that Ni is the most troublesome of all the studied spinel-forming components, producing settling layers of up to 10.5 mm in just 20 days in Ni-rich glasses if noble metals or a higher concentration of Fe were not introduced into the glass. A layer of this thickness can potentially plug the bottom of the riser, preventing glass from being discharged from the melter. The noble metals, Fe, and Al were the components that significantly slowed down or stopped the accumulation of spinel at the bottom. Particles of Rh2O3 and RuO2, hematite, and nepheline acted as nucleation sites, significantly increasing the number of crystals and therefore decreasing the average crystal size. The settling velocity of crystals ≤10 μm in size was too low to produce thick layers. The experimental data for the thickness of settled layers in the glasses prepared from AZ-101 simulant were used to build a linear empirical model that can predict crystal accumulation in the riser of the melter as a function of the concentration of spinel-forming components in glass. The developed model predicts the thicknesses of accumulated layers quite well (R^2 = 0.985) and can become an efficient tool for the formulation

  12. Alignment-free cancelable fingerprint templates based on local minutiae information.

    PubMed

    Lee, Chulhan; Choi, Jeung-Yoon; Toh, Kar-Ann; Lee, Sangyoun; Kim, Jaihie

    2007-08-01

    To replace compromised biometric templates, cancelable biometrics has recently been introduced. The concept is to transform a biometric signal or feature into a new one for enrollment and matching. For making cancelable fingerprint templates, previous approaches used either the relative position of a minutia to a core point or the absolute position of a minutia in a given fingerprint image. Thus, a query fingerprint is required to be accurately aligned to the enrolled fingerprint in order to obtain identically transformed minutiae. In this paper, we propose a new method for making cancelable fingerprint templates that do not require alignment. For each minutia, a rotation and translation invariant value is computed from the orientation information of neighboring local regions around the minutia. The invariant value is used as the input to two changing functions that output two values for the translational and rotational movements of the original minutia, respectively, in the cancelable template. When a template is compromised, it is replaced by a new one generated by different changing functions. Our approach preserves the original geometric relationships (translation and rotation) between the enrolled and query templates after they are transformed. Therefore, the transformed templates can be used to verify a person without requiring alignment of the input fingerprint images. In our experiments, we evaluated the proposed method in terms of two criteria: performance and changeability. When evaluating the performance, we examined how verification accuracy varied as the transformed templates were used for matching. When evaluating the changeability, we measured the dissimilarities between the original and transformed templates, and between two differently transformed templates, which were obtained from the same original fingerprint. The experimental results show that the two criteria mutually affect each other and can be controlled by varying the control parameters of
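
    The idea of driving the transform with a local invariant can be mocked up as below; this is an illustrative construction only, in which the invariant, the two "changing functions", and their parameters are placeholders rather than those in the paper. Each minutia is shifted along its own direction and rotated by key-dependent amounts that depend only on the relative orientations of its neighbours, so the same transform is reproduced without globally aligning the print.

      # An illustrative construction (not the paper's actual functions): a
      # rotation/translation-invariant value is computed from the orientations
      # of each minutia's neighbours, and two key-dependent "changing functions"
      # convert it into a positional shift and an angular offset.
      import numpy as np

      def neighbour_invariant(i, minutiae, k=4):
          # Toy invariant: mean relative orientation of the k nearest neighbours.
          x, y, theta = minutiae[i]
          others = np.delete(minutiae, i, axis=0)
          order = np.argsort(np.hypot(others[:, 0] - x, others[:, 1] - y))
          return float(np.mean((others[order[:k], 2] - theta) % (2 * np.pi)))

      def cancelable(minutiae, key):
          rng = np.random.default_rng(key)
          a, b = rng.uniform(5.0, 40.0, size=2)     # parameters of the changing functions
          out = []
          for i, (x, y, theta) in enumerate(minutiae):
              v = neighbour_invariant(i, minutiae)
              shift, spin = a * np.sin(v), b * v    # the two changing functions
              out.append((x + shift * np.cos(theta), y + shift * np.sin(theta),
                          (theta + spin) % (2 * np.pi)))
          return np.array(out)

      minutiae = np.array([[100., 50., 0.3], [180., 90., 1.2], [60., 200., 2.0],
                           [220., 150., 0.7], [140., 170., 2.6]])
      print(cancelable(minutiae, key=7))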

  13. High level resistance to aminoglycosides in enterococci from Riyadh.

    PubMed

    Al-Ballaa, S R; Qadri, S M; Al-Ballaa, S R; Kambal, A M; Saldin, H; Al-Qatary, K

    1994-07-01

    Enterococci with high-level aminoglycoside resistance are being reported from different parts of the world with increasing frequency. Treatment of infections caused by such isolates is associated with a high incidence of failure or relapse. This is attributed to the loss of the synergistic effect of aminoglycosides and cell-wall-active agents against isolates exhibiting this type of resistance. To determine the prevalence of enterococci with high-level resistance to aminoglycosides in Riyadh, Saudi Arabia, 241 distinct clinical isolates were examined by the disk diffusion method using high-content aminoglycoside disks. Seventy-four isolates (30%) were resistant to one or more of the aminoglycosides tested. The most common pattern of resistance was that to streptomycin and kanamycin. Of the 241 isolates tested, 29 (12%) were resistant to high levels of gentamicin, 35 (15%) to tobramycin, 65 (27%) to kanamycin and 53 (22%) to streptomycin. The highest rate of high-level gentamicin resistance was found among enterococcal blood isolates (30%). Eighteen of the isolates were identified as Enterococcus faecium; 13 (72%) of these showed high-level resistance to two or more of the aminoglycosides tested.

  14. Cooperation of catalysts and templates

    NASA Technical Reports Server (NTRS)

    White, D. H.; Kanavarioti, A.; Nibley, C. W.; Macklin, J. W.

    1986-01-01

    In order to understand how self-reproducing molecules could have originated on the primitive Earth or extraterrestrial bodies, it would be useful to find laboratory models of simple molecules which are able to carry out processes of catalysis and templating. Furthermore, it may be anticipated that systems in which several components are acting cooperatively to catalyze each other's synthesis will have different behavior with respect to natural selection than those of purely replicating systems. As the major focus of this work, laboratory models are devised to study the influence of short peptide catalysts on template reactions which produce oligonucleotides or additional peptides. Such catalysts could have been the earliest protoenzymes of selective advantage produced by replicating oligonucleotides. Since this is a complex problem, simpler systems are also studied which embody only one aspect at a time, such as peptide formation with and without a template, peptide catalysis of nontemplated peptide synthesis, and model reactions for replication of the type pioneered by Orgel.

  16. High-level aminoglycoside resistant enterococci isolated from swine.

    PubMed Central

    Jackson, C. R.; Fedorka-Cray, P. J.; Barrett, J. B.; Ladely, S. R.

    2005-01-01

    Approximately 42% (187/444) of swine enterococci collected between the years 1999 and 2000 exhibited high-level resistance to gentamicin (MIC ≥ 500 µg/ml), kanamycin (MIC ≥ 500 µg/ml), or streptomycin (MIC ≥ 1000 µg/ml). Eight aminoglycoside resistance genes were detected using PCR, most frequently ant(6)-Ia and aac(6')-Ii from Enterococcus faecium. Twenty-four per cent (45/187) of total high-level aminoglycoside-resistant isolates and 26% (4/15) of isolates resistant to high levels of all three antimicrobials were negative for all genes tested. These data suggest that enterococci isolated from swine contain diverse and possibly unidentified aminoglycoside resistance genes. PMID:15816164

  17. An overview of very high level software design methods

    NASA Technical Reports Server (NTRS)

    Asdjodi, Maryam; Hooper, James W.

    1988-01-01

    Very high level design methods emphasize automatic transfer of requirements to formal design specifications, and/or may concentrate on automatic transformation of formal design specifications that include some semantic information of the system into machine-executable form. Very high level design methods range from general, domain-independent methods to approaches implementable for specific applications or domains. Different approaches for higher-level software design are being developed by applying AI techniques, abstract programming methods, domain heuristics, software engineering tools, library-based programming, and other methods. Although a given approach does not always fall exactly into one specific class, this paper provides a classification for very high level design methods, including examples for each class. These methods are analyzed and compared based on their basic approaches, strengths, and feasibility for future expansion toward automatic development of software systems.

  18. High-level aminoglycoside resistant enterococci isolated from swine.

    PubMed

    Jackson, C R; Fedorka-Cray, P J; Barrett, J B; Ladely, S R

    2005-04-01

    Approximately 42% (187/444) of swine enterococci collected between the years 1999 and 2000 exhibited high-level resistance to gentamicin (MIC ≥ 500 µg/ml), kanamycin (MIC ≥ 500 µg/ml), or streptomycin (MIC ≥ 1000 µg/ml). Eight aminoglycoside resistance genes were detected using PCR, most frequently ant(6)-Ia and aac(6')-Ii from Enterococcus faecium. Twenty-four per cent (45/187) of total high-level aminoglycoside-resistant isolates and 26% (4/15) of isolates resistant to high levels of all three antimicrobials were negative for all genes tested. These data suggest that enterococci isolated from swine contain diverse and possibly unidentified aminoglycoside resistance genes.

  19. Template polymerization of nucleotide analogues

    NASA Technical Reports Server (NTRS)

    Orgel, L. E.

    1991-01-01

    Recent work on the template-directed reactions of the natural D-nucleotides has made it clear that L-nucleotides and nucleotide-like derivatives of other sugars would strongly inhibit the formation of long oligonucleotides. Consequently, attention is focusing on molecules simpler than nucleotides that might have acted as monomers of an information transfer system. We have begun a general exploration of the template-directed reactions of diverse peptide analogues. I will present work by Dr. Taifeng Wu on oxidative oligomerization of phosphorothioates and of Dr. Mary Tohidi on the cyclic polymerization of nucleoside and related cyclic pyrophosphates.

  20. Wide band gap semiconductor templates

    DOEpatents

    Arendt, Paul N.; Stan, Liliana; Jia, Quanxi; DePaula, Raymond F.; Usov, Igor O.

    2010-12-14

    The present invention relates to a thin film structure based on an epitaxial (111)-oriented rare earth-Group IVB oxide on the cubic (001) MgO terminated surface and the ion-beam-assisted deposition ("IBAD") techniques that are amenable to being overcoated by semiconductors with hexagonal crystal structures. The IBAD magnesium oxide ("MgO") technology, in conjunction with certain template materials, is used to fabricate the desired thin film array. Similarly, IBAD MgO with appropriate template layers can be used for semiconductors with cubic type crystal structures.

  1. High-level trigger system for the LHC ALICE experiment

    NASA Astrophysics Data System (ADS)

    Bramm, R.; Helstrup, H.; Lien, J.; Lindenstruth, V.; Loizides, C.; Röhrich, D.; Skaali, B.; Steinbeck, T.; Stock, R.; Ullaland, K.; Vestbø, A.; Wiebalck, A.; ALICE Collaboration

    2003-04-01

    The central detectors of the ALICE experiment at LHC will produce a data size of up to 75 MB/event at an event rate of ≤200 Hz, resulting in a data rate of ~15 GB/s. Online processing of the data is necessary in order to select interesting (sub)events ("High Level Trigger"), or to compress data efficiently by modeling techniques. Processing this data requires a massive parallel computing system (High Level Trigger System). The system will consist of a farm of clustered SMP nodes based on off-the-shelf PCs connected with a high-bandwidth, low-latency network.
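
    The quoted aggregate rate follows directly from the per-event figures; a one-line check:

      # A one-line check of the quoted aggregate rate (75 MB/event at 200 Hz).
      event_size_mb, event_rate_hz = 75, 200
      print(event_size_mb * event_rate_hz / 1000, "GB/s")   # -> 15.0 GB/s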

  2. High level radioactive waste management facility design criteria

    SciTech Connect

    Sheikh, N.A.; Salaymeh, S.R.

    1993-10-01

    This paper discusses the engineering systems for the structural design of the Defense Waste Processing Facility (DWPF) at the Savannah River Site (SRS). At the DWPF, high-level radioactive liquids will be mixed with glass particles and heated in a melter. This molten glass will then be poured into stainless steel canisters where it will harden. This process will transform the high-level waste into a more stable, manageable substance. This paper discusses the structural design requirements for this unique, one-of-a-kind facility, with special emphasis on the design criteria pertaining to earthquake, wind and tornado, and flooding.

  3. Final report on cermet high-level waste forms

    SciTech Connect

    Kobisk, E.H.; Quinby, T.C.; Aaron, W.S.

    1981-08-01

    Cermets are being developed as an alternate method for the fixation of defense and commercial high level radioactive waste in a terminal disposal form. Following initial feasibility assessments of this waste form, consisting of ceramic particles dispersed in an iron-nickel base alloy, significantly improved processing methods were developed. The characterization of cermets has continued through property determinations on samples prepared by various methods from a variety of simulated and actual high-level wastes. This report describes the status of development of the cermet waste form as it has evolved since 1977. 6 tables, 18 figures.

  4. High Level Waste (HLW) Feed Process Control Strategy

    SciTech Connect

    STAEHR, T.W.

    2000-06-14

    The primary purpose of this document is to describe the overall process control strategy for monitoring and controlling the functions associated with the Phase 1B high-level waste feed delivery. This document provides the basis for the process monitoring and control functions and requirements needed throughout the double-shell tank system during Phase 1 high-level waste feed delivery. This document is intended to be used by (1) the developers of the future Process Control Plan and (2) the developers of the monitoring and control system.

  5. Attributes and templates from active measurements with 252Cf

    SciTech Connect

    Mihalczo, J.T.; Mattingly, J.K.

    2000-02-01

    Active neutron interrogation is useful for the detection of shielded HEU and could also be used for Pu. In an active technique, fissile material is stimulated by an external neutron source to produce fission with the emanation of neutrons and gamma rays. The time distribution of particles leaving the fissile material is measured with respect to the source emission in a variety of ways. A variety of accelerator and radioactive sources can be used. Active interrogation of nuclear weapons/components can be used in two ways: template matching or attribute estimation. Template matching compares radiation signatures with known reference signatures and, for treaty applications, has the problem of authentication of the reference signatures along with storage and retrieval of templates. Attribute estimation determines, for example, the fissile mass from various features of the radiation signatures and does not require storage of radiation signatures but does require calibration, which can be repeated as necessary. A nuclear materials identification system (NMIS) has been in use at the Oak Ridge Y-12 Plant for verification of weapons components being received and in storage by template matching and has been used with calibrations for attribute (fissile mass) estimation for HEU metal. NMIS employs a 252Cf source of low intensity (<2 x 10^6 n/sec) such that the dose at 1 m is approximately twice that on a commercial airline at altitude. The use of such a source presents no significant safety concerns either for personnel or nuclear explosive safety, and has been approved for use at the Pantex Plant on fully assembled weapons systems.

  6. A narrow band pattern-matching model of vowel perception.

    PubMed

    Hillenbrand, James M; Houde, Robert A

    2003-02-01

    The purpose of this paper is to propose and evaluate a new model of vowel perception which assumes that vowel identity is recognized by a template-matching process involving the comparison of narrow band input spectra with a set of smoothed spectral-shape templates that are learned through ordinary exposure to speech. In the present simulation of this process, the input spectra are computed over a sufficiently long window to resolve individual harmonics of voiced speech. Prior to template creation and pattern matching, the narrow band spectra are amplitude equalized by a spectrum-level normalization process, and the information-bearing spectral peaks are enhanced by a "flooring" procedure that zeroes out spectral values below a threshold function consisting of a center-weighted running average of spectral amplitudes. Templates for each vowel category are created simply by averaging the narrow band spectra of like vowels spoken by a panel of talkers. In the present implementation, separate templates are used for men, women, and children. The pattern matching is implemented with a simple city-block distance measure given by the sum of the channel-by-channel differences between the narrow band input spectrum (level-equalized and floored) and each vowel template. Spectral movement is taken into account by computing the distance measure at several points throughout the course of the vowel. The input spectrum is assigned to the vowel template that results in the smallest difference accumulated over the sequence of spectral slices. The model was evaluated using a large database consisting of 12 vowels in /hVd/ context spoken by 45 men, 48 women, and 46 children. The narrow band model classified vowels in this database with a degree of accuracy (91.4%) approaching that of human listeners.
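
    The matching procedure described above can be outlined in a few lines of Python. The sketch below is schematic only: the window length, center weighting, and level-equalization step are placeholder choices, not the published model's parameters.

      # A schematic of the matching procedure (window length, weighting, and the
      # level-equalization step are placeholder choices, not the published
      # model's parameters).
      import numpy as np

      def floor_spectrum(spec, win=11, center_weight=1.0):
          # Zero out channels that fall below a center-weighted running average.
          kernel = np.ones(win)
          kernel[win // 2] += center_weight
          kernel /= kernel.sum()
          threshold = np.convolve(spec, kernel, mode="same")
          return np.where(spec > threshold, spec, 0.0)

      def classify(slices, templates):
          # slices: (T, C) narrow band spectra sampled across the vowel;
          # templates: dict mapping vowel label -> (T, C) template spectra.
          prepared = np.stack([floor_spectrum(s - s.mean()) for s in slices])
          totals = {v: np.abs(prepared - t).sum() for v, t in templates.items()}
          return min(totals, key=totals.get)   # smallest accumulated city-block distance

      rng = np.random.default_rng(1)
      templates = {v: rng.random((5, 64)) for v in ("iy", "ae", "uw")}
      print(classify(rng.random((5, 64)), templates))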

  7. A narrow band pattern-matching model of vowel perception

    NASA Astrophysics Data System (ADS)

    Hillenbrand, James M.; Houde, Robert A.

    2003-02-01

    The purpose of this paper is to propose and evaluate a new model of vowel perception which assumes that vowel identity is recognized by a template-matching process involving the comparison of narrow band input spectra with a set of smoothed spectral-shape templates that are learned through ordinary exposure to speech. In the present simulation of this process, the input spectra are computed over a sufficiently long window to resolve individual harmonics of voiced speech. Prior to template creation and pattern matching, the narrow band spectra are amplitude equalized by a spectrum-level normalization process, and the information-bearing spectral peaks are enhanced by a ``flooring'' procedure that zeroes out spectral values below a threshold function consisting of a center-weighted running average of spectral amplitudes. Templates for each vowel category are created simply by averaging the narrow band spectra of like vowels spoken by a panel of talkers. In the present implementation, separate templates are used for men, women, and children. The pattern matching is implemented with a simple city-block distance measure given by the sum of the channel-by-channel differences between the narrow band input spectrum (level-equalized and floored) and each vowel template. Spectral movement is taken into account by computing the distance measure at several points throughout the course of the vowel. The input spectrum is assigned to the vowel template that results in the smallest difference accumulated over the sequence of spectral slices. The model was evaluated using a large database consisting of 12 vowels in /hVd/ context spoken by 45 men, 48 women, and 46 children. The narrow band model classified vowels in this database with a degree of accuracy (91.4%) approaching that of human listeners.

  8. Precision feature point tracking method using a drift-correcting template update strategy

    NASA Astrophysics Data System (ADS)

    Peng, Xiaoming; Ma, Qian; Zhang, Qiheng; Chen, Wufan; Xu, Zhiyong

    2009-02-01

    We present a drift-correcting template update strategy for precisely tracking a feature point in 2D image sequences in this paper. The proposed strategy greatly extends Matthews et al.'s template tracking strategy [I. Matthews, T. Ishikawa and S. Baker, The template update problem, IEEE Trans. PAMI 26 (2004) 810-815.] by incorporating a robust non-rigid image registration step used in medical imaging. Matthews et al.'s strategy uses the first template to correct drifts in the current template; however, the drift would still build up if the first template becomes quite different from the current one as the tracking continues. In our strategy the first template is updated promptly when it is quite different from the current one, and henceforth the updated first template can be used to correct template drifts in subsequent frames. The method based on the proposed strategy yields sub-pixel accuracy tracking results measured by the commercial software REALVIZ® MatchMover® Pro 4.0. Our method runs fast on a desktop PC (3.0 GHz Pentium® IV CPU, 1 GB RAM, Windows® XP Professional operating system, Microsoft Visual C++® 6.0 programming), using about 0.03 seconds on average to track the feature point in a frame (under the assumption of a general affine transformation model, 61×61 pixels in template size) and, when required, less than 0.1 seconds to update the first template. We also propose the architecture for implementing our strategy in parallel.
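
    A stripped-down version of the update strategy is sketched below. A brute-force SSD search stands in for the affine/non-rigid registration steps, drift is corrected by re-matching the first template in a small window around the rough estimate, and the refresh threshold is an arbitrary illustrative value.

      # A stripped-down sketch of the update strategy: brute-force SSD search
      # stands in for the affine/non-rigid registration, drift is corrected by
      # re-matching the first template near the rough estimate, and the first
      # template is refreshed once it has drifted too far from the current one.
      import numpy as np

      def ssd_register(frame, template):
          th, tw = template.shape
          best, best_pos = np.inf, (0, 0)
          for r in range(frame.shape[0] - th + 1):
              for c in range(frame.shape[1] - tw + 1):
                  d = np.sum((frame[r:r + th, c:c + tw] - template) ** 2)
                  if d < best:
                      best, best_pos = d, (r, c)
          return best_pos

      def ncc(a, b):
          a, b = a - a.mean(), b - b.mean()
          return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12))

      def track(frames, first_template, search=5, refresh_thresh=0.6):
          current, positions = first_template.copy(), []
          th, tw = first_template.shape
          for frame in frames:
              r, c = ssd_register(frame, current)        # rough estimate, current template
              r0, c0 = max(r - search, 0), max(c - search, 0)
              window = frame[r0:r + th + search, c0:c + tw + search]
              dr, dc = ssd_register(window, first_template)   # drift correction
              r, c = r0 + dr, c0 + dc
              positions.append((r, c))
              current = frame[r:r + th, c:c + tw].copy()
              if ncc(current, first_template) < refresh_thresh:
                  first_template = current.copy()        # refresh the aging first template
          return positions

      rng = np.random.default_rng(0)
      base = rng.random((60, 80))
      frames = [np.roll(base, (i, 2 * i), axis=(0, 1)) for i in range(5)]
      print(track(frames, base[10:25, 20:35].copy()))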

  9. Computational resources to filter gravitational wave data with P-approximant templates

    NASA Astrophysics Data System (ADS)

    Porter, Edward K.

    2002-08-01

    The prior knowledge of the gravitational waveform from compact binary systems makes matched filtering an attractive detection strategy. This detection method involves the filtering of the detector output with a set of theoretical waveforms or templates. One of the most important factors in this strategy is knowing how many templates are needed in order to reduce the loss of possible signals. In this study, we calculate the number of templates and computational power needed for a one-step search for gravitational waves from inspiralling binary systems. We build on previous works by first expanding the post-Newtonian waveforms to 2.5-PN order and second, for the first time, calculating the number of templates needed when using P-approximant waveforms. The analysis is carried out for the four main first-generation interferometers, LIGO, GEO600, VIRGO and TAMA. In addition to the number of templates, we also calculate the computational cost of generating banks of templates for filtering GW data. We carry out the calculations for two initial conditions. In the first case we assume a minimum individual mass of 1 solar mass, and in the second we assume a minimum individual mass of 5 solar masses. We find that, in general, we need more P-approximant templates to carry out a search than if we use standard PN templates. This increase varies according to the order of PN approximation, but can be as high as a factor of 3 and is explained by the smaller span of the P-approximant templates as we go to higher masses. The promising outcome is that for 2-PN templates the increase is small and is outweighed by the known robustness of the 2-PN P-approximant templates.

  10. High-Level Overview of Data Needs for RE Analysis

    SciTech Connect

    Lopez, Anthony

    2016-12-22

    This presentation provides a high-level overview of analysis topics and associated data needs. Types of renewable energy analysis are grouped into two buckets: first, analysis for renewable energy potential; and second, analysis for other goals. Data requirements are similar and build upon one another.

  11. High-level manpower movement and Japan's foreign aid.

    PubMed

    Furuya, K

    1992-01-01

    "Japan's technical assistance programs to Asian countries are summarized. Movements of high-level manpower accompanying direct foreign investments by private enterprise are also reviewed. Proposals for increased human resources development include education and training of foreigners in Japan as well as the training of Japanese aid experts and the development of networks for information exchange."

  12. Typewriter Modifications for Persons Who Are High-Level Quadriplegics.

    ERIC Educational Resources Information Center

    O'Reagan, James R.; And Others

    Standard, common electric typewriters are not completely suited to the needs of a high-level quadriplegic typing with a mouthstick. Experiences show that for complete control of a typewriter a mouthstick user needs the combined features of one-button correction, electric forward and reverse indexing, and easy character viewing. To modify a…

  13. Structuring Peer Interaction To Promote High-Level Cognitive Processing.

    ERIC Educational Resources Information Center

    King, Alison

    2002-01-01

    Examines the kind of peer learning that demands high-level cognitive processing, discussing how peer interaction influences cognitive processes (structuring peer interaction and using guided reciprocal peer questioning); how to promote cognitive processing (knowledge construction and integration and socio- cognitive conflict); metacognition; and…

  15. THE XAL INFRASTRUCTURE FOR HIGH LEVEL CONTROL ROOM APPLICATIONS

    SciTech Connect

    Shishlo, Andrei P; Allen, Christopher K; Chu, Paul; Galambos, John D; Pelaia II, Tom

    2009-01-01

    XAL is a Java programming framework for building high-level control applications related to accelerator physics. The structure, details of implementation, and interaction between components, auxiliary XAL packages, and the latest modifications are discussed. A general overview of XAL applications created for the SNS project is presented.

  16. A comparison of high-level waste form characteristics

    SciTech Connect

    Salmon, R.; Notz, K.J.

    1991-01-01

    There are currently about 1055 million curies of high-level waste with a thermal output of about 2950 kilowatts (kW) at four sites in the United States: West Valley Demonstration Project (WVDP), Savannah River Site (SRS), Hanford Site (HANF), and Idaho National Engineering Laboratory (INEL). These quantities are expected to increase to about 1200 million curies and 3570 kW by the end of year 2020. Under the Nuclear Waste Policy Act, this high-level waste must ultimately be disposed of in a geologic repository. Accordingly, canisters of high-level waste immobilized in borosilicate glass or glass-ceramic mixtures are to be produced at the four sites and stored there until a repository becomes available. Data on the estimated production schedules and on the physical, chemical, and radiological characteristics of the canisters of immobilized high-level waste have been collected in OCRWM's Waste Characteristics Data Base, including recent updates and revisions. Comparisons of some of these data for the four sites are presented in this report. 14 refs., 3 tabs.

  17. Supply-Chain Optimization Template

    NASA Technical Reports Server (NTRS)

    Quiett, William F.; Sealing, Scott L.

    2009-01-01

    The Supply-Chain Optimization Template (SCOT) is an instructional guide for identifying, evaluating, and optimizing (including re-engineering) aerospace-oriented supply chains. The SCOT was derived from the Supply Chain Council's Supply-Chain Operations Reference (SCC SCOR) Model, which is more generic and more oriented toward achieving a competitive advantage in business.

  18. Progress of NIL template making

    NASA Astrophysics Data System (ADS)

    Yusa, Satoshi; Hiraka, Takaaki; Kobiki, Ayumi; Sasaki, Shiho; Itoh, Kimio; Toyama, Nobuhito; Kurihara, Masaaki; Mohri, Hiroshi; Hayashi, Naoya

    2007-05-01

    Nano-imprint lithography (NIL) has been counted as one of the lithography solutions for the hp32nm node and beyond. Recently, the small line edge roughness (LER), as well as the potentially high resolution that will ensure no-OPC mask features, is attracting many researchers. Template making is one of the most critical issues for the realization of NIL. Especially when we consider a practical template fabrication process on a 65mm square format, which is going to be the industry standard, the resolution of the template making process has shown a limitation. We have achieved, for the first time, hp22nm resolution on the 65mm template format. Both line-and-space patterns and hole patterns were well resolved. Regarding dot patterns, we still need improvement, but we have achieved resolution down to hp28nm. Although we cannot yet achieve these resolution limits for all pattern categories at the same time on one substrate, an intermediate process condition showed sufficient uniformity both in lateral CD and in vertical depth. Global pattern image placement also showed sufficient numbers at this stage of lithography development. A 20nm feature (with a pitch of 80nm) showed a sufficient imprint result.

  19. High-Level Waste Vitrification Facility Feasibility Study

    SciTech Connect

    D. A. Lopez

    1999-08-01

    A ''Settlement Agreement'' between the Department of Energy and the State of Idaho mandates that all radioactive high-level waste now stored at the Idaho Nuclear Technology and Engineering Center will be treated so that it is ready to be moved out of Idaho for disposal by a compliance date of 2035. This report investigates vitrification treatment of the high-level waste in a High-Level Waste Vitrification Facility based on the assumption that no more New Waste Calcining Facility campaigns will be conducted after June 2000. Under this option, the sodium-bearing waste remaining in the Idaho Nuclear Technology and Engineering Center Tank Farm, and newly generated liquid waste produced between now and the start of 2013, will be processed using a different option, such as a Cesium Ion Exchange Facility. The cesium-saturated waste from this other option will be sent to the Calcine Solids Storage Facilities to be mixed with existing calcine. The calcine and cesium-saturated waste will be processed in the High-Level Waste Vitrification Facility by the end of calendar year 2035. In addition, the High-Level Waste Vitrification Facility will process all newly-generated liquid waste produced between 2013 and the end of 2035. Vitrification of this waste is an acceptable treatment method for complying with the Settlement Agreement. This method involves vitrifying the waste and pouring it into stainless-steel canisters that will be ready for shipment out of Idaho to a disposal facility by 2035. These canisters will be stored at the Idaho National Engineering and Environmental Laboratory until they are sent to a national geologic repository. The operating period for vitrification treatment will be from the end of 2015 through 2035.

  20. Learning AND-OR templates for object recognition and detection.

    PubMed

    Si, Zhangzhang; Zhu, Song-Chun

    2013-09-01

    This paper presents a framework for unsupervised learning of a hierarchical reconfigurable image template--the AND-OR Template (AOT) for visual objects. The AOT includes: 1) hierarchical composition as "AND" nodes, 2) deformation and articulation of parts as geometric "OR" nodes, and 3) multiple ways of composition as structural "OR" nodes. The terminal nodes are hybrid image templates (HIT) [17] that are fully generative to the pixels. We show that both the structures and parameters of the AOT model can be learned in an unsupervised way from images using an information projection principle. The learning algorithm consists of two steps: 1) a recursive block pursuit procedure to learn the hierarchical dictionary of primitives, parts, and objects, and 2) a graph compression procedure to minimize model structure for better generalizability. We investigate the factors that influence how well the learning algorithm can identify the underlying AOT. And we propose a number of ways to evaluate the performance of the learned AOTs through both synthesized examples and real-world images. Our model advances the state of the art for object detection by improving the accuracy of template matching.

  1. Higher Accuracy Template for Corner Cube Reflected Image

    SciTech Connect

    Awwal, A S; Rice, K L; Leach, R R; Taha, T M

    2008-07-11

    Video images of laser beams are analyzed to determine the position of the laser beams for alignment purposes in the National Ignition Facility (NIF). Algorithms process beam images to facilitate automated laser alignment. One such beam image, known as the corner cube reflected pinhole image, exhibits wide beam-quality variations that are processed by a matched-filter-based algorithm. The challenge is to design a representative template that captures these variations while at the same time assuring accurate position determination. This paper describes the development of a new analytical template to accurately estimate the center of a beam with good image quality. The templates are constructed to exploit several key recurring features observed in the beam images. When the beam image quality is low, the algorithm chooses a template that contains fewer features. The algorithm was implemented on a Xilinx Virtex II Pro FPGA, providing a speedup of about 6.4 times over a baseline 3 GHz Pentium 4 processor.

  2. Evaluation of a template-based algorithm for markerless lung tumour localization on single- and dual-energy kilovoltage images.

    PubMed

    Block, Alec M; Patel, Rakesh; Surucu, Murat; Harkenrider, Matthew M; Roeske, John C

    2016-12-01

    To evaluate a template-based matching algorithm on single-energy (SE) and dual-energy (DE) radiographs for markerless localization of lung tumours. A total of 74 images from 17 patients with Stages IA-IV lung cancer were considered. At the time of radiotherapy treatment, gated end-expiration SE radiographs were obtained at 60 and 120 kVp at different gantry angles (33° anterior and 41° oblique), from which soft-tissue-enhanced DE images were created. A template-based matching algorithm was used to localize individual tumours on both SE and DE radiographs. Tumour centroid co-ordinates obtained from the template-matching software on both SE and DE images were compared with co-ordinates defined by physicians. The template-based matching algorithm was able to successfully localize the gross tumor volume within 5 mm on 70% (52/74) of the SE images vs 91% (66/74) of the DE images (p < 0.01). The mean vector differences between the co-ordinates of the template matched by the algorithm and the co-ordinates of the physician-defined ground truth were 3.2 ± 2.8 mm for SE images vs 2.3 ± 1.7 mm for DE images (p = 0.03). Template-based matching on DE images was more accurate and precise than using SE images. Advances in knowledge: This represents, to the authors' knowledge, the largest study evaluating template matching on clinical SE and DE images, considering not only anterior gantry angles but also oblique angles, suggesting a novel lung tumour matching technique using DE subtraction that is reliable, accurate and precise.
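
    A small sketch of the evaluation metrics reported above (success rate within 5 mm and mean vector difference), given matched and physician-defined centroids. The coordinate arrays are placeholders, not the study's data.

```python
# Illustrative evaluation of template-matching localizations against
# physician-defined ground truth (coordinates in mm; values are placeholders).
import numpy as np

algo = np.array([[10.2, 33.1], [54.0, 12.7], [81.5, 40.3]])   # matched centroids
truth = np.array([[11.0, 31.9], [53.2, 13.5], [85.1, 44.0]])  # ground truth

errors = np.linalg.norm(algo - truth, axis=1)   # vector difference per image
within_5mm = np.mean(errors <= 5.0)
print(f"success rate (<= 5 mm): {within_5mm:.0%}")
print(f"mean vector difference: {errors.mean():.1f} +/- {errors.std():.1f} mm")
```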

  3. Viral-templated Palladium Nanocatalysts

    NASA Astrophysics Data System (ADS)

    Yang, Cuixian

    Despite recent progress on nanocatalysis, there exist several critical challenges in simple and readily controllable nanocatalyst synthesis including the unpredictable particle growth, deactivation of catalytic activity, cumbersome catalyst recovery and lack of in-situ reaction monitoring. In this dissertation, two novel approaches are presented for the fabrication of viral-templated palladium (Pd) nanocatalysts, and their catalytic activities for dichromate reduction reaction and Suzuki Coupling reaction were thoroughly studied. In the first approach, viral template based bottom-up assembly is employed for the Pd nanocatalyst synthesis in a chip-based format. Specifically, genetically displayed cysteine residues on each coat protein of Tobacco Mosaic Virus (TMV) templates provide precisely spaced thiol functionalities for readily controllable surface assembly and enhanced formation of catalytically active Pd nanoparticles. Catalysts with the chip-based format allow for simple separation and in-situ monitoring of the reaction extent. Thorough examination of synthesis-structure-activity relationship of Pd nanoparticles formed on surface-assembled viral templates shows that Pd nanoparticle size, catalyst loading density and catalytic activity of viral-templated Pd nanocatalysts can be readily controlled simply by tuning the synthesis conditions. The viral-templated Pd nanocatalysts with optimized synthesis conditions are shown to have higher catalytic activity per unit Pd mass than the commercial Pd/C catalysts. Furthermore, tunable and selective surface assembly of TMV biotemplates is exploited to control the loading density and location of Pd nanocatalysts on solid substrates via preferential electroless deposition. In addition, the catalytic activities of surface-assembled TMV-templated Pd nanocatalysts were also investigated for the ligand-free Suzuki Coupling reaction under mild reaction conditions. The chip-based format enables simple catalyst separation and

  4. Content analysis of physical examination templates in electronic health records using SNOMED CT.

    PubMed

    Gøeg, Kirstine Rosenbeck; Chen, Rong; Højen, Anne Randorff; Elberg, Pia

    2014-10-01

    Most electronic health record (EHR) systems are built on proprietary information models and terminology, which makes achieving semantic interoperability a challenge. Solving interoperability problems requires well-defined standards. In contrast, the need to support clinical work practice requires a local customization of EHR systems. Consequently, contrasting goals may be evident in EHR template design because customization means that local EHR organizations can define their own templates, whereas standardization implies consensus at some level. To explore the complexity of balancing these two goals, this study analyzes the differences and similarities between templates in use today. A similarity analysis was developed on the basis of SNOMED CT. The analysis was performed on four physical examination templates from Denmark and Sweden. The semantic relationships in SNOMED CT were used to quantify similarities and differences. Moreover, the analysis used these identified similarities to investigate the common content of a physical examination template. The analysis showed that there were both similarities and differences in physical examination templates, and the size of the templates varied from 18 to 49 fields. In the SNOMED CT analysis, exact matches and terminology similarities were represented in all template pairs. The number of exact matches ranged from 7 to 24. Moreover, the number of unrelated fields varied widely, from 1/18 to 22/35. Cross-country comparisons tended to have more unrelated content than within-country comparisons. On the basis of identified similarities, it was possible to define the common content of a physical examination. Nevertheless, a complete view on the physical examination required the inclusion of both exact matches and terminology
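
    The pairwise comparison logic can be sketched as follows, with a toy parent map standing in for the SNOMED CT hierarchy. The concept names and `is-a` relations shown are illustrative, not real SNOMED CT content, and this is not the study's implementation.

```python
# Toy similarity classification of template fields mapped to concept codes.
# The PARENT map is a stand-in for SNOMED CT 'is-a' relationships.
PARENT = {
    "heart_auscultation": "cardiac_exam",
    "heart_rate":         "cardiac_exam",
    "lung_auscultation":  "respiratory_exam",
    "cardiac_exam":       "physical_exam",
    "respiratory_exam":   "physical_exam",
}

def ancestors(code):
    seen = set()
    while code in PARENT:
        code = PARENT[code]
        seen.add(code)
    return seen

def classify(a, b):
    """Exact match, terminology similarity (shared lineage), or unrelated."""
    if a == b:
        return "exact"
    shared = (ancestors(a) & ancestors(b)) - {"physical_exam"}
    if a in ancestors(b) or b in ancestors(a) or shared:
        return "terminology similarity"
    return "unrelated"

pairs = [("heart_rate", "heart_rate"),
         ("heart_auscultation", "heart_rate"),
         ("heart_rate", "lung_auscultation")]
for a, b in pairs:
    print(a, "vs", b, "->", classify(a, b))
```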

  5. Spreadsheet Templates for Chemical Equilibrium Calculations.

    ERIC Educational Resources Information Center

    Joshi, Bhairav D.

    1993-01-01

    Describes two general spreadsheet templates to carry out all types of one-equation chemical equilibrium calculations encountered by students in undergraduate chemistry courses. Algorithms, templates, macros, and representative examples are presented to illustrate the approach. (PR)

  6. Fumigant Management Plan - Phase 1 Templates

    EPA Pesticide Factsheets

    FMPs are required by pesticide labels, so EPA provides chemical-specific soil fumigant templates and samples in PDF and Word formats. Choose the appropriate template for products containing chloropicrin, dazomet, metam sodium/potassium, or methyl bromide.

  7. Transforming surgery through biomaterial template technology.

    PubMed

    Hodde, Jason; Hiles, Michael

    2016-03-01

    Templates inserted into surgical wounds strongly influence the healing responses in humans. The science of these templates, in the form of extracellular matrix biomaterials, is rapidly evolving and improving as the natural interactions with the body become better understood.

  8. AN EXPRESSION TEMPLATE AWARE LAMBDA FUNCTION

    SciTech Connect

    S. A. SMITH; J. STRIEGNITZ

    2000-09-19

    The authors show how the paradigms of lambda functions and expression templates fit together in order to provide a means to increase the expressiveness of existing STL algorithms. They demonstrate how the expression templates approach could be extended in order to work with built-in types. To be portable, their solution is based on the Portable Expression Template Engine (PETE), which is a framework that enables the development of expression template aware classes.

  9. Feature matching evaluation for multimodal correspondence

    NASA Astrophysics Data System (ADS)

    Gesto-Diaz, M.; Tombari, F.; Gonzalez-Aguilera, D.; Lopez-Fernandez, L.; Rodriguez-Gonzalvez, P.

    2017-07-01

    This paper proposes a study and evaluation of approaches aimed at image matching under different modalities, together with a survey of methodologies used for performance comparison in this specific context, and, finally, a novel algorithm for image matching. First, a new dataset is introduced to overcome the limitations of existing datasets, which includes modalities such as visible, thermal, intensity and depth images. This dataset is used to compare the state of the art of feature detectors and descriptors. Template matching techniques commonly used to carry out multimodal correspondence are also adapted and compared therein. In total, 28 different combinations of detectors and descriptors are evaluated. In addition, the detectors' repeatability and the assessment of matching results based on the Receiver Operating Characteristic (ROC) curves associated with all tested detector-descriptor combinations are presented, highlighting the best-performing pairs. Finally, a novel Adaptive Pairwise Matching (APM) algorithm created to improve the robustness of matching towards outliers is also proposed and tested within our evaluation framework.
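
    A minimal sketch of one detector-descriptor combination of the kind compared above: ORB keypoints with brute-force Hamming matching and a ratio test. The image file names and the 0.8 threshold are placeholders, and the proposed APM algorithm itself is not reproduced here.

```python
# Sketch of one detector/descriptor combination (ORB + brute-force Hamming),
# with a ratio test to reject ambiguous matches. Image paths are placeholders.
import cv2

img_a = cv2.imread("visible.png", cv2.IMREAD_GRAYSCALE)
img_b = cv2.imread("thermal.png", cv2.IMREAD_GRAYSCALE)

orb = cv2.ORB_create(nfeatures=2000)
kp_a, des_a = orb.detectAndCompute(img_a, None)
kp_b, des_b = orb.detectAndCompute(img_b, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
knn = matcher.knnMatch(des_a, des_b, k=2)

good = [pair[0] for pair in knn
        if len(pair) == 2 and pair[0].distance < 0.8 * pair[1].distance]
print(f"{len(good)} putative matches out of {len(knn)} candidates")
```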

  10. A multibiometric face recognition fusion framework with template protection

    NASA Astrophysics Data System (ADS)

    Chindaro, S.; Deravi, F.; Zhou, Z.; Ng, M. W. R.; Castro Neves, M.; Zhou, X.; Kelkboom, E.

    2010-04-01

    In this work we present a multibiometric face recognition framework based on combining information from 2D with 3D facial features. The 3D biometrics channel is protected by a privacy-enhancing technology, which uses error-correcting codes and cryptographic primitives to safeguard the privacy of the users of the biometric system while at the same time enabling accurate matching through fusion with 2D. Experiments are conducted to compare the matching performance of such multibiometric systems with the individual biometric channels working alone and with unprotected multibiometric systems. The results show that the proposed hybrid system incorporating template protection matches, and in some cases exceeds, the performance of the corresponding unprotected equivalents, in addition to offering additional privacy protection.

  11. Software Template for Instruction in Mathematics

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Moebes, Travis A.; Beall, Anna

    2005-01-01

    Intelligent Math Tutor (IMT) is a software system that serves as a template for creating software for teaching mathematics. IMT can be easily connected to artificial-intelligence software and other analysis software through input and output of files. IMT provides an easy-to-use interface for generating courses that include tests that contain both multiple-choice and fill-in-the-blank questions, and enables tracking of test scores. IMT makes it easy to generate software for Web-based courses or to manufacture compact disks containing executable course software. IMT also can function as a Web-based application program, with features that run quickly on the Web, while retaining the intelligence of a high-level language application program with many graphics. IMT can be used to write application programs in text, graphics, and/or sound, so that the programs can be tailored to the needs of most handicapped persons. The course software generated by IMT follows a "back to basics" approach of teaching mathematics by inducing the student to apply creative mathematical techniques in the process of learning. Students are thereby made to discover mathematical fundamentals and thereby come to understand mathematics more deeply than they could through simple memorization.

  12. RETENTION OF SULFATE IN HIGH LEVEL RADIOACTIVE WASTE GLASS

    SciTech Connect

    Fox, K.

    2010-09-07

    High level radioactive wastes are being vitrified at the Savannah River Site for long term disposal. Many of the wastes contain sulfate at concentrations that can be difficult to retain in borosilicate glass. This study involves efforts to optimize the composition of a glass frit for combination with the waste to improve sulfate retention while meeting other process and product performance constraints. The fabrication and characterization of several series of simulated waste glasses are described. The experiments are detailed chronologically, to provide insight into part of the engineering studies used in developing frit compositions for an operating high level waste vitrification facility. The results lead to the recommendation of a specific frit composition and a concentration limit for sulfate in the glass for the next batch of sludge to be processed at Savannah River.

  13. Nondestructive examination of DOE high-level waste storage tanks

    SciTech Connect

    Bush, S.; Bandyopadhyay, K.; Kassir, M.; Mather, B.; Shewmon, P.; Streicher, M.; Thompson, B.; van Rooyen, D.; Weeks, J.

    1995-05-01

    A number of DOE sites have buried tanks containing high-level waste. Tanks of particular interest are double-shell tanks inside concrete cylinders. A program has been developed for the inservice inspection of the primary tank containing high-level waste (HLW), for testing of transfer lines, and for the inspection of the concrete containment where possible. Emphasis is placed on the ultrasonic examination of selected areas of the primary tank, coupled with a leak-detection system capable of detecting small leaks through the wall of the primary tank. The NDE program is modelled after ASME Section XI in many respects, particularly with respect to the sampling protocol. Selected testing of concrete is planned to determine if there has been any significant degradation. The most probable failure mechanisms are corrosion-related, so the examination program gives major emphasis to possible locations for corrosion attack.

  14. Life Extension of Aging High-Level Waste Tanks

    SciTech Connect

    Bryson, D.; Callahan, V.; Ostrom, M.; Bryan, W.; Berman, H.

    2002-02-26

    The Double Shell Tanks (DSTs) play a critical role in the Hanford High-Level Waste Treatment Complex, and therefore activities are underway to protect and better understand these tanks. The DST Life Extension Program is focused on both tank life extension and on evaluation of tank integrity. Tank life extension activities focus on understanding tank failure modes and have produced key chemistry and operations controls to minimize tank corrosion and extend useful tank life. Tank integrity program activities have developed and applied key technologies to evaluate the condition of the tank structure and predict useful tank life. Program results to date indicate that DST useful life can be extended well beyond the original design life and allow the existing tanks to fill a critical function within the Hanford High-Level Waste Treatment Complex. In addition the tank life may now be more reliably predicted, facilitating improved planning for the use and possible future replacement of these tanks.

  15. Evaluation and selection of candidate high-level waste forms

    SciTech Connect

    Bernadzikowski, T. A.; Allender, J. S.; Butler, J. L.; Gordon, D. E.; Gould, Jr., T. H.; Stone, J. A.

    1982-03-01

    Seven candidate waste forms being developed under the direction of the Department of Energy's National High-Level Waste (HLW) Technology Program were evaluated as potential media for the immobilization and geologic disposal of high-level nuclear wastes. The evaluation combined preliminary waste form evaluations conducted at DOE defense waste sites and independent laboratories, peer review assessments, a product performance evaluation, and a processability analysis. Based on the combined results of these four inputs, two of the seven forms, borosilicate glass and a titanate-based ceramic, SYNROC, were selected as the reference and alternative forms for continued development and evaluation in the National HLW Program. Both the glass and ceramic forms are viable candidates for use at each of the DOE defense waste sites; they are also potential candidates for immobilization of commercial reprocessing wastes. This report describes the waste form screening process and discusses each of the four major inputs considered in the selection of the two forms.

  16. Long-term high-level waste technology

    NASA Astrophysics Data System (ADS)

    Corman, W. R.

    1981-08-01

    Work performed at sites to immobilize high-level radioactive wastes is described. Program management and support with subtasks of management and budget, environmental and safety assessments, waste preparation, storage or disposal; waste retrieval, separation and concentration are discussed. Waste fixation and characterization, process and equipment development, final handling, canister development and characterization and onsite storage or disposal are also reported. Event trees defining possible accidents were completed in a safety assessment of continued in-tank storage of high-level waste. Two low-cost waste forms (tailored concrete and bitumen) were investigated as candidate immobilization forms. Comparative impact tests and leaching tests were also conducted on glasses, ceramics, and concretes. A process design description was written for the tailored ceramic process.

  17. Multipurpose optimization models for high level waste vitrification

    SciTech Connect

    Hoza, M.

    1994-08-01

    Optimal Waste Loading (OWL) models have been developed as multipurpose tools for high-level waste studies for the Tank Waste Remediation Program at Hanford. Using nonlinear programming techniques, these models maximize the waste loading of the vitrified waste and optimize the glass-former composition such that the glass produced has the appropriate properties within the melter and the resultant vitrified waste form meets the requirements for disposal. The OWL model can be used for a single waste stream or for blended streams. The models can determine optimal continuous blends or optimal discrete blends of a number of different wastes. The OWL models have been used to identify the most restrictive constraints, to evaluate prospective waste pretreatment methods, to formulate and evaluate blending strategies, and to determine the impacts of variability in the wastes. The OWL models will be used to aid in the design of frits and to maximize the waste loading in the glass for High-Level Waste (HLW) vitrification.
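
    A hedged sketch of the kind of nonlinear program such models solve: maximize the waste fraction of the glass subject to property constraints, here via SciPy's SLSQP solver. The component list, property models, and limits below are invented placeholders, not the Hanford glass-property models.

```python
# Toy waste-loading optimization (placeholder property models, not the OWL models).
import numpy as np
from scipy.optimize import minimize

# x = [waste_fraction, frit_B2O3_fraction, frit_SiO2_fraction]; fractions sum to 1.
def viscosity(x):      # stand-in property model
    return 4.0 + 6.0 * x[2] - 3.0 * x[1]

def durability(x):     # stand-in property model (higher is better)
    return 10.0 - 8.0 * x[0] + 2.0 * x[2]

objective = lambda x: -x[0]                      # maximize waste loading
constraints = [
    {"type": "eq",   "fun": lambda x: np.sum(x) - 1.0},
    {"type": "ineq", "fun": lambda x: viscosity(x) - 2.0},   # viscosity >= 2
    {"type": "ineq", "fun": lambda x: 8.0 - viscosity(x)},   # viscosity <= 8
    {"type": "ineq", "fun": lambda x: durability(x) - 6.0},  # durability >= 6
]
bounds = [(0.0, 1.0)] * 3
res = minimize(objective, x0=[0.3, 0.3, 0.4], bounds=bounds,
               constraints=constraints, method="SLSQP")
print("optimal waste loading:", res.x[0])
```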

  18. FPGA based compute nodes for high level triggering in PANDA

    NASA Astrophysics Data System (ADS)

    Kühn, W.; Gilardi, C.; Kirschner, D.; Lang, J.; Lange, S.; Liu, M.; Perez, T.; Yang, S.; Schmitt, L.; Jin, D.; Li, L.; Liu, Z.; Lu, Y.; Wang, Q.; Wei, S.; Xu, H.; Zhao, D.; Korcyl, K.; Otwinowski, J. T.; Salabura, P.; Konorov, I.; Mann, A.

    2008-07-01

    PANDA is a new universal detector for antiproton physics at the HESR facility at FAIR/GSI. The PANDA data acquisition system has to handle interaction rates of the order of 10^7/s and data rates of several hundred Gb/s. FPGA-based compute nodes with multi-Gb/s bandwidth capability using the ATCA architecture are designed to handle tasks such as event building, feature extraction and high-level trigger processing. Data connectivity is provided via optical links as well as multiple Gb Ethernet ports. The boards will support trigger algorithms such as pattern recognition for RICH detectors, EM shower analysis, fast tracking algorithms and global event characterization. Besides VHDL, high-level C-like hardware description languages will be considered to implement the firmware.

  19. Management of data quality of high level waste characterization

    SciTech Connect

    Winters, W.I., Westinghouse Hanford

    1996-06-12

    Over the past 10 years, the Hanford Site has been transitioning from nuclear materials production to Site cleanup operations. High-level waste characterization at the Hanford Site provides data to support present waste processing operations, tank safety programs, and future waste disposal programs. Quality elements in the high-level waste characterization program will be presented by following a sample through the data quality objective, sampling, laboratory analysis and data review process. Transition from production to cleanup has resulted in changes in quality systems and program; the changes, as well as other issues in these quality programs, will be described. Laboratory assessment through quality control and performance evaluation programs will be described, and data assessments in the laboratory and final reporting in the tank characterization reports will be discussed.

  20. Life Extension of Aging High Level Waste (HLW) Tanks

    SciTech Connect

    BRYSON, D.

    2002-02-04

    The Double Shell Tanks (DSTs) play a critical role in the Hanford High-Level Waste Treatment Complex, and therefore activities are underway to protect and better understand these tanks. The DST Life Extension Program is focused on both tank life extension and on evaluation of tank integrity. Tank life extension activities focus on understanding tank failure modes and have produced key chemistry and operations controls to minimize tank corrosion and extend useful tank life. Tank integrity program activities have developed and applied key technologies to evaluate the condition of the tank structure and predict useful tank life. Program results to date indicate that DST useful life can be extended well beyond the original design life and allow the existing tanks to fill a critical function within the Hanford High-Level Waste Treatment Complex. In addition the tank life may now be more reliably predicted, facilitating improved planning for the use and possible future replacement of these tanks.

  1. Long-term high-level waste technology

    NASA Astrophysics Data System (ADS)

    Cornman, W. R.

    1980-07-01

    This series of reports summarizes research and development studies on the immobilization of high level wastes from the chemical reprocessing of nuclear reactor fuels. Immobilization of the wastes (defense and commercial) consists of placing them in a high integrity form with a very low potential for radionuclide release. Immobilization of commercial wastes is being considered on a contingency basis in the event that reprocessing is resumed. The basic plan for meeting the goal of immobilization of the DOE high level wastes is: (1) to develop technology to support a realistic choice of waste form alternatives for each of the three DOE sites; (2) to develop product and processing technology with sufficient scaleup to provide design data for full scale facilities; and (3) to construct and operate the facilities.

  2. Online pattern recognition for the ALICE high level trigger

    NASA Astrophysics Data System (ADS)

    Bramm, R.; Helstrup, H.; Lien, J.; Lindenstruth, V.; Loizides, C.; Rohrich, D.; Skaali, B.; Steinbeck, T.; Stock, R.; Ullaland, K.; Vestbø, A.; Wiebalck, A.; Alice Collaboration

    2003-04-01

    The ALICE High Level Trigger system needs to reconstruct events online at high data rates. Focusing on the Time Projection Chamber we present two pattern recognition methods under investigation: the sequential approach (cluster finding, track follower) and the iterative approach (Hough Transform, cluster assignment, re-fitting). The implementation of the former in hardware indicates that we can reach the designed inspection rate for p-p collisions of 1 kHz with 98% efficiency.
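
    A much-simplified sketch of the Hough-transform step of the iterative approach, applied to 2D hits and straight-line tracks. The actual TPC reconstruction works with curved-track parametrizations and detector-specific binning; the binning and toy data here are purely illustrative.

```python
# Minimal straight-line Hough transform over 2D hit points (illustrative only).
import numpy as np

def hough_lines(points, n_theta=180, n_rho=200):
    """Accumulate votes in (theta, rho) space and return the best line."""
    pts = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    rho_max = np.hypot(pts[:, 0], pts[:, 1]).max()
    rho_edges = np.linspace(-rho_max, rho_max, n_rho + 1)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for i, th in enumerate(thetas):
        rho = pts[:, 0] * np.cos(th) + pts[:, 1] * np.sin(th)
        hist, _ = np.histogram(rho, bins=rho_edges)
        acc[i] += hist
    i, j = np.unravel_index(np.argmax(acc), acc.shape)
    return thetas[i], 0.5 * (rho_edges[j] + rho_edges[j + 1]), acc

# Toy usage: noisy hits along a line plus uniform background.
rng = np.random.default_rng(0)
x = np.linspace(0, 10, 40)
track = np.column_stack([x, 0.5 * x + 1 + rng.normal(0, 0.05, x.size)])
noise = rng.uniform(0, 10, size=(60, 2))
theta, rho, _ = hough_lines(np.vstack([track, noise]))
print(f"best line: theta={theta:.2f} rad, rho={rho:.2f}")
```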

  3. Mixing Processes in High-Level Waste Tanks - Final Report

    SciTech Connect

    Peterson, P.F.

    1999-05-24

    The goal of this work is to model mixing processes in large, complex enclosures using one-dimensional differential equations, with transport in free and wall jets described by standard integral techniques. With this goal in mind, we have constructed a simple, computationally efficient numerical tool, the Berkeley Mechanistic Mixing Model, which can be used to predict the transient evolution of fuel and oxygen concentrations in DOE high-level waste tanks following loss of ventilation, and we have validated the model against a series of experiments.

  4. Case for retrievable high-level nuclear waste disposal

    USGS Publications Warehouse

    Roseboom, Eugene H.

    1994-01-01

    Plans for the nation's first high-level nuclear waste repository have called for permanently closing and sealing the repository soon after it is filled. However, the hydrologic environment of the proposed site at Yucca Mountain, Nevada, should allow the repository to be kept open and the waste retrievable indefinitely. This would allow direct monitoring of the repository and maintain the options for future generations to improve upon the disposal methods or use the uranium in the spent fuel as an energy resource.

  5. Cake: Enabling High-level SLOs on Shared Storage Systems

    DTIC Science & Technology

    2012-11-07

    Cake: Enabling High-level SLOs on Shared Storage Systems. Andrew Wang, Shivaram Venkataraman, Sara Alspaugh, Randy H. Katz, Ion Stoica, Electrical... Cake is a coordinated, multi-resource scheduler for shared distributed storage environments with the goal of...

  6. Automatic rule generation for high-level vision

    NASA Technical Reports Server (NTRS)

    Rhee, Frank Chung-Hoon; Krishnapuram, Raghu

    1992-01-01

    Many high-level vision systems use rule-based approaches to solving problems such as autonomous navigation and image understanding. The rules are usually elaborated by experts. However, this procedure may be rather tedious. In this paper, we propose a method to generate such rules automatically from training data. The proposed method is also capable of filtering out irrelevant features and criteria from the rules.

  7. The tracking of high level waste shipments-TRANSCOM system

    SciTech Connect

    Johnson, P.E.; Joy, D.S.; Pope, R.B.

    1995-12-31

    The TRANSCOM (transportation tracking and communication) system is the U.S. Department of Energy's (DOE's) real-time system for tracking shipments of spent fuel, high-level wastes, and other high-visibility shipments of radioactive material. The TRANSCOM system has been operational since 1988. The system was used during FY1993 to track almost 100 shipments within the U.S. DOE complex, and it is accessed weekly by 10 to 20 users.

  8. High level cognitive information processing in neural networks

    NASA Technical Reports Server (NTRS)

    Barnden, John A.; Fields, Christopher A.

    1992-01-01

    Two related research efforts were addressed: (1) high-level connectionist cognitive modeling; and (2) local neural circuit modeling. The goals of the first effort were to develop connectionist models of high-level cognitive processes such as problem solving or natural language understanding, and to understand the computational requirements of such models. The goals of the second effort were to develop biologically realistic models of local neural circuits, and to understand the computational behavior of such models. In keeping with the nature of NASA's Innovative Research Program, all the work conducted under the grant was highly innovative. For instance, the following ideas, all summarized, are contributions to the study of connectionist/neural networks: (1) the temporal-winner-take-all, relative-position encoding, and pattern-similarity association techniques; (2) the importation of logical combinators into connectionism; (3) the use of analogy-based reasoning as a bridge across the gap between the traditional symbolic paradigm and the connectionist paradigm; and (4) the application of connectionism to the domain of belief representation/reasoning. The work on local neural circuit modeling also departs significantly from the work of related researchers. In particular, its concentration on low-level neural phenomena that could support high-level cognitive processing is unusual within the area of biological local circuit modeling, and also serves to expand the horizons of the artificial neural net field.

  9. High-Level Visual Object Representations Are Constrained by Position

    PubMed Central

    Kriegeskorte, Nikolaus; Baker, Chris I.

    2010-01-01

    It is widely assumed that high-level visual object representations are position-independent (or invariant). While there is sensitivity to position in high-level object-selective cortex, position and object identity are thought to be encoded independently in the population response such that position information is available across objects and object information is available across positions. Contrary to this view, we show, with both behavior and neuroimaging, that visual object representations are position-dependent (tied to limited portions of the visual field). Behaviorally, we show that the effect of priming an object was greatly reduced with any change in position (within- or between-hemifields), indicating nonoverlapping representations of the same object across different positions. Furthermore, using neuroimaging, we show that object-selective cortex is not only highly sensitive to object position but also the ability to differentiate objects based on its response is greatly reduced across different positions, consistent with the observed behavior and the receptive field properties observed in macaque object-selective neurons. Thus, even at the population level, the object information available in response of object-selective cortex is constrained by position. We conclude that even high-level visual object representations are position-dependent. PMID:20351021

  10. High-level visual object representations are constrained by position.

    PubMed

    Kravitz, Dwight J; Kriegeskorte, Nikolaus; Baker, Chris I

    2010-12-01

    It is widely assumed that high-level visual object representations are position-independent (or invariant). While there is sensitivity to position in high-level object-selective cortex, position and object identity are thought to be encoded independently in the population response such that position information is available across objects and object information is available across positions. Contrary to this view, we show, with both behavior and neuroimaging, that visual object representations are position-dependent (tied to limited portions of the visual field). Behaviorally, we show that the effect of priming an object was greatly reduced with any change in position (within- or between-hemifields), indicating nonoverlapping representations of the same object across different positions. Furthermore, using neuroimaging, we show that object-selective cortex is not only highly sensitive to object position but also the ability to differentiate objects based on its response is greatly reduced across different positions, consistent with the observed behavior and the receptive field properties observed in macaque object-selective neurons. Thus, even at the population level, the object information available in response of object-selective cortex is constrained by position. We conclude that even high-level visual object representations are position-dependent.

  11. Handbook of high-level radioactive waste transportation

    SciTech Connect

    Sattler, L.R.

    1992-10-01

    The High-Level Radioactive Waste Transportation Handbook serves as a reference to which state officials and members of the general public may turn for information on radioactive waste transportation and on the federal government's system for transporting this waste under the Civilian Radioactive Waste Management Program. The Handbook condenses and updates information contained in the Midwestern High-Level Radioactive Waste Transportation Primer. It is intended primarily to assist legislators who, in the future, may be called upon to enact legislation pertaining to the transportation of radioactive waste through their jurisdictions. The Handbook is divided into two sections. The first section places the federal government's program for transporting radioactive waste in context. It provides background information on nuclear waste production in the United States and traces the emergence of federal policy for disposing of radioactive waste. The second section covers the history of radioactive waste transportation; summarizes major pieces of legislation pertaining to the transportation of radioactive waste; and provides an overview of the radioactive waste transportation program developed by the US Department of Energy (DOE). To supplement this information, a summary of pertinent federal and state legislation and a glossary of terms are included as appendices, as is a list of publications produced by the Midwestern Office of The Council of State Governments (CSG-MW) as part of the Midwestern High-Level Radioactive Waste Transportation Project.

  12. Materials Science of High-Level Nuclear Waste Immobilization

    SciTech Connect

    Weber, William J.; Navrotsky, Alexandra; Stefanovsky, S. V.; Vance, E. R.; Vernaz, Etienne Y.

    2009-01-09

    With the increasing demand for the development of more nuclear power comes the responsibility to address the technical challenges of immobilizing high-level nuclear wastes in stable solid forms for interim storage or disposition in geologic repositories. The immobilization of high-level nuclear wastes has been an active area of research and development for over 50 years. Borosilicate glasses and complex ceramic composites have been developed to meet many technical challenges and current needs, although regulatory issues, which vary widely from country to country, have yet to be resolved. Cooperative international programs to develop advanced proliferation-resistant nuclear technologies to close the nuclear fuel cycle and increase the efficiency of nuclear energy production might create new separation waste streams that could demand new concepts and materials for nuclear waste immobilization. This article reviews the current state-of-the-art understanding regarding the materials science of glasses and ceramics for the immobilization of high-level nuclear waste and excess nuclear materials and discusses approaches to address new waste streams.

  13. Bacteriologic testing of endoscopes after high-level disinfection.

    PubMed

    Rejchrt, Stanislav; Cermák, Pavel; Pavlatová, Ludmila; McKová, Eva; Bures, Jan

    2004-07-01

    There are no definitive data available concerning microbiologic safety of prolonged endoscope storage after reprocessing and disinfection. This study evaluated the durability of high-level disinfection of endoscopes stored in a dust-proof cabinet for 5 days. Three different types of endoscopes (upper endoscopes, duodenoscopes, colonoscopes) were tested. After completion of the endoscopic procedure, endoscopes were subjected to an initial decontamination, followed by manual cleaning with the endoscope immersed in detergent. The endoscopes then were placed in an automatic reprocessor that provides high-level disinfection. They then were stored by hanging in a dust-proof cabinet. Bacteriologic samples were obtained from the surface of the endoscopes, the openings for the piston valves, and the accessory channel daily for 5 days, and by flush-through (combined with brushing) from the accessory channels after 5 days of storage. Samples were cultured for all types of aerobic and anaerobic bacteria, including bacterial spores, and for Candida species. For all assays, all endoscopes were bacteria-free immediately after high-level disinfection. Only 4 assays (of 135) were positive during the subsequent 5-day assessment (skin bacteria cultured from endoscope surfaces). All flush-through samples were sterile. When endoscope reprocessing guidelines are strictly observed and endoscopes are stored in appropriate cabinets for up to 5 days, reprocessing before use may not be necessary.

  14. Accurate Position Sensing of Defocused Beams Using Simulated Beam Templates

    SciTech Connect

    Awwal, A; Candy, J; Haynam, C; Widmayer, C; Bliss, E; Burkhart, S

    2004-09-29

    In position detection using matched filtering one is faced with the challenge of determining the best position in the presence of distortions such as defocus and diffraction noise. This work evaluates the performance of simulated defocused images used as templates against the real defocused beam. It was found that an amplitude-modulated phase-only filter is better equipped to deal with real defocused images that suffer from diffraction noise effects, which result in a textured spot intensity pattern. It is shown that there is a performance tradeoff dependent on the type and size of the defocused image. A novel automated system was developed that selects the appropriate template type and size. Results of this automation for real defocused images are presented.
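
    An illustrative sketch of the automated selection idea: correlate the measured beam image against a bank of candidate templates of different sizes and keep the one with the strongest normalized correlation peak, whose location gives the position estimate. Gaussian spots stand in for the simulated defocused-beam templates, and the filter here is plain normalized correlation rather than the amplitude-modulated phase-only filter discussed in the paper.

```python
# Illustrative template-bank position sensing (Gaussian spots stand in for the
# simulated defocused-beam templates).
import numpy as np

def gaussian_spot(shape, radius):
    y, x = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    return np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2.0 * radius ** 2))

def correlate(image, template):
    """FFT-based circular cross-correlation of zero-mean, unit-norm inputs."""
    a = image - image.mean()
    b = template - template.mean()
    a /= np.linalg.norm(a) + 1e-12
    b /= np.linalg.norm(b) + 1e-12
    corr = np.real(np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    return corr.max(), peak

def locate(image, radii=(4, 8, 12, 16)):
    """Pick the best-matching template size, then report its correlation peak."""
    best = max((correlate(image, gaussian_spot(image.shape, r)) + (r,)
                for r in radii), key=lambda t: t[0])
    score, peak, radius = best
    # peak index encodes the circular shift between image and template centre
    return score, peak, radius
```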

  15. Verifying likelihoods for low template DNA profiles using multiple replicates

    PubMed Central

    Steele, Christopher D.; Greenhalgh, Matthew; Balding, David J.

    2014-01-01

    To date there is no generally accepted method to test the validity of algorithms used to compute likelihood ratios (LR) evaluating forensic DNA profiles from low-template and/or degraded samples. An upper bound on the LR is provided by the inverse of the match probability, which is the usual measure of weight of evidence for standard DNA profiles not subject to the stochastic effects that are the hallmark of low-template profiles. However, even for low-template profiles the LR in favour of a true prosecution hypothesis should approach this bound as the number of profiling replicates increases, provided that the queried contributor is the major contributor. Moreover, for sufficiently many replicates the standard LR for mixtures is often surpassed by the low-template LR. It follows that multiple LTDNA replicates can provide stronger evidence for a contributor to a mixture than a standard analysis of a good-quality profile. Here, we examine the performance of the likeLTD software for up to eight replicate profiling runs. We consider simulated and laboratory-generated replicates as well as resampling replicates from a real crime case. We show that LRs generated by likeLTD usually do exceed the mixture LR given sufficient replicates, are bounded above by the inverse match probability and do approach this bound closely when this is expected. We also show good performance of likeLTD even when a large majority of alleles are designated as uncertain, and suggest that there can be advantages to using different profiling sensitivities for different replicates. Overall, our results support both the validity of the underlying mathematical model and its correct implementation in the likeLTD software. PMID:25082140

  16. How to protect biometric templates

    NASA Astrophysics Data System (ADS)

    Sutcu, Yagiz; Li, Qiming; Memon, Nasir

    2007-02-01

    In addition to the inherent qualities that biometrics possess, powerful signal processing tools have enabled widespread deployment of biometric-based identification/verification systems. However, due to the nature of biometric data, well-established cryptographic tools (such as hashing, encryption, etc.) are not sufficient for solving one of the most important problems related to biometric systems, namely, template security. In this paper, we examine and show how to apply a recently proposed secure sketch scheme in order to protect the biometric templates. We consider face biometrics and study how the performance of the authentication scheme would be affected after the application of the secure sketch. We further study the trade-off between the performance of the scheme and the bound of the entropy loss from the secure sketch.

  17. Porous Networks Through Colloidal Templates

    NASA Astrophysics Data System (ADS)

    Li, Qin; Retsch, Markus; Wang, Jianjun; Knoll, Wolfgang; Jonas, Ulrich

    Porous networks represent a class of materials with interconnected voids with specific properties concerning adsorption, mass and heat transport, and spatial confinement, which lead to applications ranging from oil recovery and water purification to tissue engineering. Porous networks with well-defined, highly ordered structure and periodicities around the wavelength of light can furthermore show very sophisticated optical properties. Such networks can be fabricated from a very large range of materials by infiltration of a sacrificial colloidal crystal template and subsequent removal of the template. The preparation procedures reported in the literature are discussed in this review, and the resulting porous networks are presented with respect to the underlying material class. Furthermore, methods for hierarchical superstructure formation and functionalization of the network walls are discussed.

  18. Template boundary definition in mammalian telomerase.

    PubMed

    Chen, Jiunn-Liang; Greider, Carol W

    2003-11-15

    Telomerase uses a short template sequence in its intrinsic RNA component to synthesize telomere repeats. Disruption of the helix P1b in human telomerase RNA or alteration of its distance from the template resulted in telomerase copying residues past the normal template boundary both in vivo and in vitro. Therefore, helix P1b is important for template boundary definition in human telomerase. Mouse telomerase RNA lacks helix P1b, and the boundary is established at 2 nt downstream of the 5'-end. The divergent structure of boundary definition elements in mammals, yeast, and ciliates suggests diverse mechanisms for template boundary definition in telomerase.

  19. RNA-templated DNA repair

    PubMed Central

    Storici, Francesca; Bebenek, Katarzyna; Kunkel, Thomas A.; Gordenin, Dmitry A.; Resnick, Michael A.

    2007-01-01

    RNA can act as a template for DNA synthesis in the reverse transcription of retroviruses and retrotransposons [1] and in the elongation of telomeres [2]. Despite its abundance in the nucleus, there has been no evidence for a direct role of RNA as a template in the repair of any chromosomal DNA lesions, including DNA double-strand breaks (DSBs), which are repaired in most organisms by homologous recombination or by non-homologous end joining [3]. An indirect role for RNA in DNA repair, following reverse transcription and formation of a complementary DNA, has been observed in the non-homologous joining of DSB ends [4,5]. In the yeast Saccharomyces cerevisiae, in which homologous recombination is efficient [3], RNA was shown to mediate recombination, but only indirectly through a cDNA intermediate [6,7] generated by the reverse transcriptase function of Ty retrotransposons in Ty particles in the cytoplasm [8]. Although pairing between duplex DNA and single-strand (ss)RNA can occur in vitro [9,10] and in vivo [11], direct homologous exchange of genetic information between RNA and DNA molecules has not been observed. We show here that RNA can serve as a template for DNA synthesis during repair of a chromosomal DSB in yeast. The repair was accomplished with RNA oligonucleotides complementary to the broken ends. This and the observation that even yeast replicative DNA polymerases such as α and δ can copy short RNA template tracts in vitro demonstrate that RNA can transfer genetic information in vivo through direct homologous interaction with chromosomal DNA. PMID:17429354

  20. Metal nanodisks using bicellar templates

    SciTech Connect

    Song, Yujiang; Shelnutt, John A

    2013-12-03

    Metallic nanodisks and a method of making them. The metallic nanodisks are wheel-shaped structures that provide large surface areas for catalytic applications. The metallic nanodisks are grown within bicelles (disk-like micelles) that template the growth of the metal in the form of approximately circular dendritic sheets. The zero-valent metal forming the nanodisks is produced by reduction of a metal ion using a suitable electron-donor species.

  1. The Distributed D-Clean Model Revisited by Templates

    NASA Astrophysics Data System (ADS)

    Zsók, Viktória; Porkoláb, Zoltán

    2011-09-01

    D-Clean is a functional coordination language for distributed computation. The language was designed for the need of high-level process description and communication coordination of functional programs distributed over a cluster. The pure functional computational nodes required language primitives to control the dataflow in a distributed process-network. Therefore, in order to achieve parallel features, we created an extension for the lazy functional programming language Clean using new language elements. D-Clean is compiled to an intermediate-level language called D-Box, which is designed for the description of the computational nodes. Every D-Clean construct generates a D-Box expression. The D-Box expressions hide the low-level implementation details and enable direct control over the process-network. The asynchronous communication is based on language-independent middleware services. We have previously presented the syntax and the semantics of both coordination languages. Practical experience has shown the difficulties of distributed program development, especially in testing and debugging. This paper aims to provide a software comprehension application for a better understanding and usage of the D-Clean distributed system. We model the elements and the behaviour of the D-Clean system using C++ templates. The strong type system of C++ templates guarantees the correctness of the model. Using templates we can avoid run-time overhead, achieving impressive efficiency.

  2. Titanium template for scaphoid reconstruction.

    PubMed

    Haefeli, M; Schaefer, D J; Schumacher, R; Müller-Gerbl, M; Honigmann, P

    2015-06-01

    Reconstruction of a non-united scaphoid with a humpback deformity involves resection of the non-union followed by bone grafting and fixation of the fragments. Intraoperative control of the reconstruction is difficult owing to the complex three-dimensional shape of the scaphoid and the other carpal bones overlying the scaphoid on lateral radiographs. We developed a titanium template that fits exactly to the surfaces of the proximal and distal scaphoid poles to define their position relative to each other after resection of the non-union. The templates were designed on three-dimensional computed tomography reconstructions and manufactured using selective laser melting technology. Ten conserved human wrists were used to simulate the reconstruction. The achieved precision measured as the deviation of the surface of the reconstructed scaphoid from its virtual counterpart was good in five cases (maximal difference 1.5 mm), moderate in one case (maximal difference 3 mm) and inadequate in four cases (difference more than 3 mm). The main problems were attributed to the template design and can be avoided by improved pre-operative planning, as shown in a clinical case. © The Author(s) 2014.

  3. Template synthesis of nanophase mesocarbon.

    PubMed

    Yang, Nancy Y; Jian, Kengqing; Külaots, Indrek; Crawford, Gregory P; Hurt, Robert H

    2003-10-01

    Templating techniques are used increasingly to create carbon materials with precisely engineered pore systems. This article presents a new templating technique that achieves simultaneous control of pore structure and molecular (crystal) structure in a single synthesis step. With the use of discotic liquid crystalline precursors, unique carbon structures can be engineered by selecting the size and geometry of the confining spaces and selecting the template material to induce edge-on or face-on orientation of the discotic precursor. Here mesophase pitch is infiltrated by capillary forces into a nanoporous glass followed by slow carbonization and NaOH etching. The resulting porous carbon material exhibits interconnected solid grains about 100 nm in size, a monodisperse pore size of 60 nm, 42% total porosity, and an abundance of edge-plane inner surfaces that reflect the favored edge-on anchoring of the mesophase precursor on glass. This new carbon form is potentially interesting for a number of important applications in which uniform large pores, active-site-rich surfaces, and easy access to interlayer spaces in nanometric grains are advantageous.

  4. LTL - The Little Template Library

    NASA Astrophysics Data System (ADS)

    Gössl, C. A.; Drory, N.; Snigula, J.

    2004-07-01

    The Little Template Library is an expression templates based C++ library for array processing, image processing, FITS and ASCII I/O, and linear algebra. It is released under the GNU General Public License (GPL). Although the library is developed with application to astronomical image and data processing in mind, it is by no means restricted to these fields of application. In fact, it qualifies as a fully general array processing package. Focus is laid on a high abstraction level regarding the handling of expressions involving arrays or parts thereof and linear algebra related operations without the usually involved negative impact on performance. The price to pay is dependence on a compiler implementing enough of the current ANSI C++ specification, as well as significantly higher demand on resources at compile time. The LTL provides dynamic arrays of up to 5 dimensions, sub-arrays and slicing, support for fixed size vectors and matrices including basic linear algebra operations, expression templates based evaluation, and I/O facilities for columnar ASCII and FITS format files. In addition it supplies utility classes for statistics, linear and non-linear least squares fitting, and command line and configuration file parsing. YODA (Drory 2002) and all elements of the WeCAPP reduction pipeline (Riffeser et al. 2001, Gössl & Riffeser 2002, 2003) were implemented using the LTL.
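    The expression-templates technique that the LTL is built on can be illustrated with a toy example; the Vec class below is not part of the LTL API, it merely shows how an expression such as b + c can be captured as a lightweight object and evaluated in a single loop during assignment, avoiding intermediate temporaries.

    // Minimal illustration of the expression-templates idea that libraries such
    // as the LTL build on; this toy Vec class is not part of the LTL API.
    #include <cstddef>
    #include <iostream>
    #include <vector>

    template <typename L, typename R>
    struct Add {                      // node of the expression tree, holds references
        const L& l; const R& r;
        double operator[](std::size_t i) const { return l[i] + r[i]; }
        std::size_t size() const { return l.size(); }
    };

    struct Vec {
        std::vector<double> data;
        explicit Vec(std::size_t n, double v = 0.0) : data(n, v) {}
        double operator[](std::size_t i) const { return data[i]; }
        std::size_t size() const { return data.size(); }

        // Assignment from any expression: one loop, no temporary vectors.
        template <typename Expr>
        Vec& operator=(const Expr& e) {
            for (std::size_t i = 0; i < data.size(); ++i) data[i] = e[i];
            return *this;
        }
    };

    template <typename L, typename R>
    Add<L, R> operator+(const L& l, const R& r) { return {l, r}; }

    int main() {
        Vec a(3), b(3, 1.0), c(3, 2.0);
        a = b + c;                    // expands to a single element-wise loop
        std::cout << a[0] << ' ' << a[1] << ' ' << a[2] << '\n';  // prints 3 3 3
        return 0;
    }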

  5. Random template banks and relaxed lattice coverings

    NASA Astrophysics Data System (ADS)

    Messenger, C.; Prix, R.; Papa, M. A.

    2009-05-01

    Template-based searches for gravitational waves are often limited by the computational cost associated with searching large parameter spaces. The study of efficient template banks, in the sense of using the smallest number of templates, is therefore of great practical interest. The traditional approach to template-bank construction requires every point in parameter space to be covered by at least one template, which rapidly becomes inefficient at higher dimensions. Here we study an alternative approach, where any point in parameter space is covered only with a given probability η<1. We find that by giving up complete coverage in this way, large reductions in the number of templates are possible, especially at higher dimensions. The prime examples studied here are random template banks in which templates are placed randomly with uniform probability over the parameter space. In addition to its obvious simplicity, this method turns out to be surprisingly efficient. We analyze the statistical properties of such random template banks, and compare their efficiency to traditional lattice coverings. We further study relaxed lattice coverings (using Zn and An* lattices), which similarly cover any signal location only with probability η. The relaxed An* lattice is found to yield the most efficient template banks at low dimensions (n≲10), while random template banks increasingly outperform any other method at higher dimensions.
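    As a rough numerical illustration of the coverage probability η (not taken from the paper; the dimension, covering radius and bank size below are arbitrary assumptions), the following sketch throws templates uniformly at random into a unit hypercube, treated as a torus to sidestep edge effects, and estimates the fraction of random signal locations that fall within the covering radius of at least one template.

    // Monte Carlo sketch of the "random template bank" idea: templates are thrown
    // uniformly at random into a unit d-dimensional parameter space (treated as a
    // torus to avoid edge effects), and we estimate the coverage probability eta,
    // i.e. the chance that a random signal location lies within the covering
    // radius of at least one template. Parameters below are illustrative only.
    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <random>
    #include <vector>

    int main() {
        const int d = 4;                 // parameter-space dimension (assumed)
        const double radius = 0.12;      // covering radius of one template (assumed)
        const int n_templates = 300;
        const int n_trials = 20000;      // random signal locations to test

        std::mt19937 rng(42);
        std::uniform_real_distribution<double> uni(0.0, 1.0);

        // Draw the random template bank.
        std::vector<std::vector<double>> bank(n_templates, std::vector<double>(d));
        for (auto& t : bank)
            for (double& x : t) x = uni(rng);

        // Toroidal squared distance between two points in the unit cube.
        auto dist2 = [d](const std::vector<double>& a, const std::vector<double>& b) {
            double s = 0.0;
            for (int k = 0; k < d; ++k) {
                double diff = std::fabs(a[k] - b[k]);
                diff = std::min(diff, 1.0 - diff);
                s += diff * diff;
            }
            return s;
        };

        int covered = 0;
        std::vector<double> signal(d);
        for (int i = 0; i < n_trials; ++i) {
            for (double& x : signal) x = uni(rng);
            for (const auto& t : bank)
                if (dist2(signal, t) <= radius * radius) { ++covered; break; }
        }

        // Each template covers the test point independently with probability equal
        // to the d-ball volume V = pi^(d/2) r^d / Gamma(d/2 + 1), so on the torus
        // one expects eta = 1 - (1 - V)^N.
        const double pi = std::acos(-1.0);
        double vball = std::pow(pi, d / 2.0) * std::pow(radius, d)
                     / std::tgamma(d / 2.0 + 1.0);
        double eta_mc = double(covered) / n_trials;
        double eta_closed_form = 1.0 - std::pow(1.0 - vball, n_templates);
        std::printf("Monte Carlo eta = %.3f, closed-form eta = %.3f\n",
                    eta_mc, eta_closed_form);
        return 0;
    }

    Because each template covers the test point independently with probability equal to the ball volume, the Monte Carlo estimate should track the closed-form value printed alongside it.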

  6. Random template banks and relaxed lattice coverings

    SciTech Connect

    Messenger, C.; Prix, R.; Papa, M. A.

    2009-05-15

    Template-based searches for gravitational waves are often limited by the computational cost associated with searching large parameter spaces. The study of efficient template banks, in the sense of using the smallest number of templates, is therefore of great practical interest. The traditional approach to template-bank construction requires every point in parameter space to be covered by at least one template, which rapidly becomes inefficient at higher dimensions. Here we study an alternative approach, where any point in parameter space is covered only with a given probability η<1. We find that by giving up complete coverage in this way, large reductions in the number of templates are possible, especially at higher dimensions. The prime examples studied here are random template banks in which templates are placed randomly with uniform probability over the parameter space. In addition to its obvious simplicity, this method turns out to be surprisingly efficient. We analyze the statistical properties of such random template banks, and compare their efficiency to traditional lattice coverings. We further study relaxed lattice coverings (using Zn and An* lattices), which similarly cover any signal location only with probability η. The relaxed An* lattice is found to yield the most efficient template banks at low dimensions (n≲10), while random template banks increasingly outperform any other method at higher dimensions.

  7. Distorted colloidal arrays as designed template

    NASA Astrophysics Data System (ADS)

    Yu, Ye; Zhou, Ziwei; Möhwald, Helmuth; Ai, Bin; Zhao, Zhiyuan; Ye, Shunsheng; Zhang, Gang

    2015-01-01

    In this paper, a novel type of colloidal template with broken symmetry was generated using commercial, inductively coupled plasma reactive ion etching (ICP-RIE). With proper but simple treatment, the traditional symmetric non-close-packed colloidal template evolves into an elliptical profile with high uniformity. This unique feature can add flexibility to colloidal lithography and/or other lithography techniques using colloidal particles as building blocks to fabricate nano-/micro-structures with broken symmetry. Beyond that, the novel colloidal template we developed possesses on-site tunability, i.e. the transformability from a symmetric into an asymmetric template. Sandwich-type particles with eccentric features were fabricated utilizing this tunable template. This distinguishing feature will provide the possibility to fabricate structures with unique asymmetric features using one set of colloidal templates, providing flexibility and broad tunability to enable nano-/micro-structure fabrication with colloidal templates.

  8. Distorted colloidal arrays as designed template.

    PubMed

    Yu, Ye; Zhou, Ziwei; Möhwald, Helmuth; Ai, Bin; Zhao, Zhiyuan; Ye, Shunsheng; Zhang, Gang

    2015-01-21

    In this paper, a novel type of colloidal template with broken symmetry was generated using commercial, inductively coupled plasma reactive ion etching (ICP-RIE). With proper but simple treatment, the traditional symmetric non-close-packed colloidal template evolves into an elliptical profile with high uniformity. This unique feature can add flexibility to colloidal lithography and/or other lithography techniques using colloidal particles as building blocks to fabricate nano-/micro-structures with broken symmetry. Beyond that, the novel colloidal template we developed possesses on-site tunability, i.e. the transformability from a symmetric into an asymmetric template. Sandwich-type particles with eccentric features were fabricated utilizing this tunable template. This distinguishing feature will provide the possibility to fabricate structures with unique asymmetric features using one set of colloidal templates, providing flexibility and broad tunability to enable nano-/micro-structure fabrication with colloidal templates.

  9. Matching a Distribution by Matching Quantiles Estimation

    PubMed Central

    Sgouropoulos, Nikolaos; Yao, Qiwei; Yastremiz, Claudia

    2015-01-01

    Motivated by the problem of selecting representative portfolios for backtesting counterparty credit risks, we propose a matching quantiles estimation (MQE) method for matching a target distribution by that of a linear combination of a set of random variables. An iterative procedure based on the ordinary least-squares estimation (OLS) is proposed to compute MQE. MQE can be easily modified by adding a LASSO penalty term if a sparse representation is desired, or by restricting the matching within a certain range of quantiles to match a part of the target distribution. The convergence of the algorithm and the asymptotic properties of the estimation, both with and without LASSO, are established. A measure and an associated statistical test are proposed to assess the goodness-of-match. The finite sample properties are illustrated by simulation. An application in selecting a counterparty representative portfolio with a real dataset is reported. The proposed MQE also finds applications in portfolio tracking, which demonstrates the usefulness of combining MQE with LASSO. PMID:26692592
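    A minimal sketch of the iterative OLS recursion described in the abstract is given below (an illustration of the idea, not the authors' implementation): at each step the rows of X are re-ordered by the rank of the current fit Xβ and the sorted target sample is regressed on that re-ordered X. Two predictors and a hand-rolled 2x2 normal-equation solve keep the sketch short; the LASSO variant and the goodness-of-match test are omitted.

    // Sketch of the matching-quantiles iteration: order the rows of X by the rank
    // of the current fit X*beta, then regress the sorted target Y on that
    // re-ordered X, and repeat. The data and the number of iterations are
    // illustrative assumptions.
    #include <algorithm>
    #include <array>
    #include <cstdio>
    #include <numeric>
    #include <random>
    #include <vector>

    int main() {
        const int n = 2000;
        std::mt19937 rng(1);
        std::normal_distribution<double> gauss(0.0, 1.0);

        // Two candidate variables and a target whose distribution we try to match
        // (zero-mean for simplicity; no intercept term in this sketch).
        std::vector<std::array<double, 2>> X(n);
        std::vector<double> y(n);
        for (int i = 0; i < n; ++i) {
            X[i] = {gauss(rng), gauss(rng)};
            y[i] = 1.2 * gauss(rng);
        }
        std::vector<double> ys = y;
        std::sort(ys.begin(), ys.end());              // target quantiles

        std::array<double, 2> beta = {1.0, 0.0};      // starting value
        for (int iter = 0; iter < 20; ++iter) {
            // Order observations by the rank of the current fit.
            std::vector<int> idx(n);
            std::iota(idx.begin(), idx.end(), 0);
            std::sort(idx.begin(), idx.end(), [&](int a, int b) {
                return X[a][0] * beta[0] + X[a][1] * beta[1]
                     < X[b][0] * beta[0] + X[b][1] * beta[1];
            });
            // OLS of the sorted target on the re-ordered X (2x2 normal equations).
            double a11 = 0, a12 = 0, a22 = 0, b1 = 0, b2 = 0;
            for (int r = 0; r < n; ++r) {
                const auto& x = X[idx[r]];
                a11 += x[0] * x[0]; a12 += x[0] * x[1]; a22 += x[1] * x[1];
                b1  += x[0] * ys[r]; b2 += x[1] * ys[r];
            }
            double det = a11 * a22 - a12 * a12;
            beta = {(a22 * b1 - a12 * b2) / det, (a11 * b2 - a12 * b1) / det};
        }
        std::printf("fitted coefficients: %.3f %.3f\n", beta[0], beta[1]);
        return 0;
    }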

  10. Matching a Distribution by Matching Quantiles Estimation.

    PubMed

    Sgouropoulos, Nikolaos; Yao, Qiwei; Yastremiz, Claudia

    2015-04-03

    Motivated by the problem of selecting representative portfolios for backtesting counterparty credit risks, we propose a matching quantiles estimation (MQE) method for matching a target distribution by that of a linear combination of a set of random variables. An iterative procedure based on the ordinary least-squares estimation (OLS) is proposed to compute MQE. MQE can be easily modified by adding a LASSO penalty term if a sparse representation is desired, or by restricting the matching within a certain range of quantiles to match a part of the target distribution. The convergence of the algorithm and the asymptotic properties of the estimation, both with and without LASSO, are established. A measure and an associated statistical test are proposed to assess the goodness-of-match. The finite sample properties are illustrated by simulation. An application in selecting a counterparty representative portfolio with a real dataset is reported. The proposed MQE also finds applications in portfolio tracking, which demonstrates the usefulness of combining MQE with LASSO.

  11. Defense High Level Waste Disposal Container System Description Document

    SciTech Connect

    N. E. Pettit

    2001-07-13

    The Defense High Level Waste Disposal Container System supports the confinement and isolation of waste within the Engineered Barrier System of the Monitored Geologic Repository (MGR). Disposal containers are loaded and sealed in the surface waste handling facilities, transferred to the underground through the accesses using a rail mounted transporter, and emplaced in emplacement drifts. The defense high level waste (HLW) disposal container provides long-term confinement of the commercial HLW and defense HLW (including immobilized plutonium waste forms [IPWF]) placed within disposable canisters, and withstands the loading, transfer, emplacement, and retrieval loads and environments. US Department of Energy (DOE)-owned spent nuclear fuel (SNF) in disposable canisters may also be placed in a defense HLW disposal container along with commercial HLW waste forms, which is known as co-disposal. The Defense High Level Waste Disposal Container System provides containment of waste for a designated period of time, and limits radionuclide release. The disposal container/waste package maintains the waste in a designated configuration, withstands maximum handling and rockfall loads, limits the individual canister temperatures after emplacement, resists corrosion in the expected handling and repository environments, and provides containment of waste in the event of an accident. Defense HLW disposal containers for HLW disposal will hold up to five HLW canisters. Defense HLW disposal containers for co-disposal will hold up to five HLW canisters arranged in a ring and one DOE SNF canister inserted in the center and/or one or more DOE SNF canisters displacing a HLW canister in the ring. Defense HLW disposal containers also will hold two Multi-Canister Overpacks (MCOs) and two HLW canisters in one disposal container. The disposal container will include outer and inner cylinders, outer and inner cylinder lids, and may include a canister guide. An exterior label will provide a means by

  12. Development of a High Level Waste Tank Inspection System

    SciTech Connect

    Appel, D.K.; Loibl, M.W.; Meese, D.C.

    1995-03-21

    The Westinghouse Savannah River Technology Center was requested by its sister site, West Valley Nuclear Service (WVNS), to develop a remote inspection system to gather wall thickness readings of their High Level Waste Tanks. WVNS management chose to take a proactive approach to gain current information on two tanks that had been in service since the early 70s. The tanks contain high level waste, are buried underground, and have only two access ports to an annular space between the tank and the secondary concrete vault. A specialized remote system was proposed to provide both a visual surveillance and ultrasonic thickness measurements of the tank walls. A magnetic wheeled crawler was the basis for the remote delivery system integrated with an off-the-shelf Ultrasonic Data Acquisition System. A development program was initiated for Savannah River Technology Center (SRTC) to design, fabricate, and test a remote system based on the Crawler. The system was completed and involved three crawlers to perform the needed tasks, an Ultrasonic Crawler, a Camera Crawler, and a Surface Prep Crawler. The crawlers were computer controlled so that their operation could be done remotely and their position on the wall could be tracked. The Ultrasonic Crawler controls were interfaced with ABB Amdata's I-PC Ultrasonic Data Acquisition System so that thickness mapping of the wall could be obtained. A second system was requested by Westinghouse Savannah River Company (WSRC), to perform just ultrasonic mapping on their similar Waste Storage Tanks; however, the system needed to be interfaced with the P-scan Ultrasonic Data Acquisition System. Both remote inspection systems were completed 9/94. Qualification tests were conducted by WVNS prior to implementation on the actual tank, and tank deployment was achieved 10/94. The second inspection system was deployed at WSRC 11/94 with success, and the system is now in continuous service inspecting the remaining high level waste tanks at WSRC.

  13. High-level waste management technology program plan

    SciTech Connect

    Harmon, H.D.

    1995-01-01

    The purpose of this plan is to document the integrated technology program plan for the Savannah River Site (SRS) High-Level Waste (HLW) Management System. The mission of the SRS HLW System is to receive and store SRS high-level wastes in a safe and environmentally sound manner, and to convert these wastes into forms suitable for final disposal. These final disposal forms are borosilicate glass to be sent to the Federal Repository, Saltstone grout to be disposed of on site, and treated waste water to be released to the environment via a permitted outfall. Thus, the technology development activities described herein are those activities required to enable successful accomplishment of this mission. The technology program is based on specific needs of the SRS HLW System and organized following the systems engineering level 3 functions. Technology needs for each level 3 function are listed as reference, enhancements, and alternatives. Finally, FY-95 funding, deliverables, and schedules are summarized in Chapter IV with details on the specific tasks that are funded in FY-95 provided in Appendix A. The information in this report represents the vision of activities as defined at the beginning of the fiscal year. Depending on emergent issues, funding changes, and other factors, programs and milestones may be adjusted during the fiscal year. The FY-95 SRS HLW technology program strongly emphasizes startup support for the Defense Waste Processing Facility and In-Tank Precipitation. Closure of technical issues associated with these operations has been given highest priority. Consequently, efforts on longer term enhancements and alternatives are receiving minimal funding. However, High-Level Waste Management is committed to participation in the national Radioactive Waste Tank Remediation Technology Focus Area. 4 refs., 5 figs., 9 tabs.

  14. Modern Alchemy: Solidifying high-level nuclear waste

    SciTech Connect

    Newton, C.C.

    1997-07-01

    The U.S. Department of Energy is putting a modern version of alchemy to work to produce an answer to a decades-old problem. It is taking place at the Savannah River Site (SRS) in Aiken, South Carolina and at the West Valley Demonstration Project (WVDP) near Buffalo, New York. At both locations, contractor Westinghouse Electric Corporation is applying technology that is turning liquid high-level radioactive waste (HLW) into a stabilized, durable glass for safer and easier management. The process is called vitrification. SRS and WVDP are now operating the nation's first full-scale HLW vitrification plants.

  15. Corrosion and failure processes in high-level waste tanks

    SciTech Connect

    Mahidhara, R.K.; Elleman, T.S.; Murty, K.L.

    1992-11-01

    A large amount of radioactive waste has been stored safely at the Savannah River and Hanford sites over the past 46 years. The aim of this report is to review the experimental corrosion studies at Savannah River and Hanford with the intention of identifying the types and rates of corrosion encountered and indicate how these data contribute to tank failure predictions. The compositions of the High-Level Wastes, the mild steels used in the construction of the waste tanks, and the degradation modes, particularly stress corrosion cracking and pitting, are discussed. Current concerns at the Hanford Site are highlighted.

  16. Very-high-level neutral-beam control system

    SciTech Connect

    Elischer, V.; Jacobson, V.; Theil, E.

    1981-10-01

    As increasing numbers of neutral beams are added to fusion machines, their operation can consume a significant fraction of a facility's total resources. LBL has developed a very high level control system that allows a neutral beam injector to be treated as a black box with just 2 controls: one to set the beam power and one to set the pulse duration. This 2 knob view allows simple operation and provides a natural base for implementing even higher level controls such as automatic source conditioning.

  17. High level trigger online calibration framework in ALICE

    NASA Astrophysics Data System (ADS)

    Bablok, S. R.; Djuvsland, Ø.; Kanaki, K.; Nystrand, J.; Richter, M.; Röhrich, D.; Skjerdal, K.; Ullaland, K.; Øvrebekk, G.; Larsen, D.; Alme, J.; Alt, T.; Lindenstruth, V.; Steinbeck, T. M.; Thäder, J.; Kebschull, U.; Böttger, S.; Kalcher, S.; Lara, C.; Panse, R.; Appelshäuser, H.; Ploskon, M.; Helstrup, H.; Hetland, K. F.; Haaland, Ø.; Roed, K.; Thingnæs, T.; Aamodt, K.; Hille, P. T.; Lovhoiden, G.; Skaali, B.; Tveter, T.; Das, I.; Chattopadhyay, S.; Becker, B.; Cicalo, C.; Marras, D.; Siddhanta, S.; Cleymans, J.; Szostak, A.; Fearick, R.; Vaux, G. d.; Vilakazi, Z.

    2008-07-01

    The ALICE High Level Trigger (HLT) is designed to perform event analysis of heavy ion and proton-proton collisions as well as calibration calculations online. A large PC farm, currently under installation, enables analysis algorithms to process these computationally intensive tasks. The HLT receives event data from all major detectors in ALICE. Interfaces to the various other systems provide the analysis software with required additional information. Processed results are sent back to the corresponding systems. To allow online performance monitoring of the detectors an interface for visualizing these results has been developed.

  18. Ionization chamber for measurements of high-level tritium gas

    SciTech Connect

    Carstens, D.H.W.; David, W.R.

    1980-01-01

    The construction and calibration of a simple ionization-chamber apparatus for measurement of high level tritium gas are described. The apparatus uses an easily constructed but rugged chamber containing the unknown gas and an inexpensive digital multimeter for measuring the ion current. The equipment after calibration is suitable for measuring 0.01 to 100% tritium gas in hydrogen-helium mixes with an accuracy of a few percent. At both the high and low limits of measurement, deviations from the predicted theoretical current are observed. These are briefly discussed.

  19. High-level neutron coincidence counter maintenance manual

    SciTech Connect

    Swansen, J.; Collinsworth, P.

    1983-05-01

    High-level neutron coincidence counter operational (field) calibration and usage is well known. This manual makes explicit basic (shop) check-out, calibration, and testing of new units and is a guide for repair of failed in-service units. Operational criteria for the major electronic functions are detailed, as are adjustments and calibration procedures, and recurrent mechanical/electromechanical problems are addressed. Some system tests are included for quality assurance. Data on nonstandard large-scale integrated (circuit) components and a schematic set are also included.

  20. Market Designs for High Levels of Variable Generation: Preprint

    SciTech Connect

    Milligan, M.; Holttinen, H.; Kiviluoma, J.; Orths, A.; Lynch, M.; Soder, L.

    2014-10-01

    Variable renewable generation is increasing in penetration in modern power systems, leading to higher variability in the supply and price of electricity as well as lower average spot prices. This raises new challenges, particularly in ensuring sufficient capacity and flexibility from conventional technologies. Because the fixed costs and lifetimes of electricity generation investments are significant, designing markets and regulations that ensure the efficient integration of renewable generation is a significant challenge. This paper reviews the state of play of market designs for high levels of variable generation in the United States and Europe and considers new developments in both regions.

  1. High-level wastes: DOE names three sites for characterization

    SciTech Connect

    1986-07-01

    DOE announced in May 1986 that three site characterization studies would be made to determine suitability for a high-level radioactive waste repository. The studies will include several test drillings to the proposed disposal depths. Yucca Mountain, Nevada; Deaf Smith County, Texas; and Hanford, Washington were identified as the study sites, and further studies for a second repository site in the East were postponed. The affected states all filed suits in federal circuit courts because they were given no advance warning of the announcement of their selection or the decision to suspend work on a second repository. Criticisms of the selection process include the narrowing of DOE options.

  2. Iris Matching Based on Personalized Weight Map.

    PubMed

    Dong, Wenbo; Sun, Zhenan; Tan, Tieniu

    2011-09-01

    Iris recognition typically involves three steps, namely, iris image preprocessing, feature extraction, and feature matching. The first two steps of iris recognition have been well studied, but the last step is less addressed. Each human iris has its unique visual pattern and local image features also vary from region to region, which leads to significant differences in robustness and distinctiveness among the feature codes derived from different iris regions. However, most state-of-the-art iris recognition methods use a uniform matching strategy, where features extracted from different regions of the same person or the same region for different individuals are considered to be equally important. This paper proposes a personalized iris matching strategy using a class-specific weight map learned from the training images of the same iris class. The weight map can be updated online during the iris recognition procedure when the successfully recognized iris images are regarded as the new training data. The weight map reflects the robustness of an encoding algorithm on different iris regions by assigning an appropriate weight to each feature code for iris matching. Such a weight map, trained with sufficient iris templates, is convergent and robust against various types of noise. Extensive and comprehensive experiments demonstrate that the proposed personalized iris matching strategy achieves much better iris recognition performance than uniform strategies, especially for poor quality iris images.
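    The matching step implied by such a strategy can be sketched as a weighted Hamming distance between binary iris codes, with per-bit weights estimated from a person's own training codes. The weight-learning rule below (agreement frequency with the enrolled reference) is a simple illustrative choice, not the scheme of the paper, and the iris encoding itself is outside the sketch.

    // Sketch of a personalised matching step: a weighted (rather than uniform)
    // Hamming distance between binary iris codes, where the per-bit weights come
    // from a class-specific map estimated from that person's training codes.
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    // Weighted Hamming distance: disagreeing bits are penalised in proportion to
    // how reliable that bit has proven to be for this particular iris class.
    double weighted_hamming(const std::vector<int>& probe,
                            const std::vector<int>& reference,
                            const std::vector<double>& weight) {
        double num = 0.0, den = 0.0;
        for (std::size_t i = 0; i < probe.size(); ++i) {
            num += weight[i] * (probe[i] != reference[i] ? 1.0 : 0.0);
            den += weight[i];
        }
        return den > 0.0 ? num / den : 1.0;
    }

    // One possible weight map: the fraction of training codes of this class that
    // agree with the enrolled reference at each bit (stable bits get high weight).
    std::vector<double> learn_weight_map(const std::vector<std::vector<int>>& training,
                                         const std::vector<int>& reference) {
        std::vector<double> w(reference.size(), 0.0);
        for (const auto& code : training)
            for (std::size_t i = 0; i < w.size(); ++i)
                if (code[i] == reference[i]) w[i] += 1.0;
        for (double& x : w) x /= training.size();
        return w;
    }

    int main() {
        std::vector<int> reference = {1, 0, 1, 1, 0, 0, 1, 0};
        std::vector<std::vector<int>> training = {
            {1, 0, 1, 1, 0, 1, 1, 0},     // bit 5 is unstable for this class
            {1, 0, 1, 1, 0, 0, 1, 0},
            {1, 0, 1, 1, 0, 1, 1, 0}};
        std::vector<double> w = learn_weight_map(training, reference);

        std::vector<int> probe = {1, 0, 1, 1, 0, 1, 1, 0};
        std::printf("weighted distance = %.3f\n",
                    weighted_hamming(probe, reference, w));
        return 0;
    }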

  3. A Matched Field Processing Framework for Coherent Detection Over Local and Regional Networks (Postprint)

    DTIC Science & Technology

    2011-12-30

    the term " superresolution "). The single-phase matched field statistic for a given template was also demonstrated to be a viable detection statistic... Superresolution with seismic arrays using empirical matched field processing, Geophys. J. Int. 182: 1455–1477. Kim, K.-H. and Park, Y. (2010): The 20

  4. Space augmentation of military high-level waste disposal

    NASA Technical Reports Server (NTRS)

    English, T.; Lees, L.; Divita, E.

    1979-01-01

    Space disposal of selected components of military high-level waste (HLW) is considered. This disposal option offers the promise of eliminating the long-lived radionuclides in military HLW from the earth. A space mission which meets the dual requirements of long-term orbital stability and a maximum of one space shuttle launch per week over a period of 20-40 years, is a heliocentric orbit about halfway between the orbits of earth and Venus. Space disposal of high-level radioactive waste is characterized by long-term predictability and short-term uncertainties which must be reduced to acceptably low levels. For example, failure of either the Orbit Transfer Vehicle after leaving low earth orbit or of the storable propellant stage at perihelion would leave the nuclear waste package in an unplanned and potentially unstable orbit. Since potential earth reencounter and subsequent burn-up in the earth's atmosphere is unacceptable, a deep space rendezvous, docking, and retrieval capability must be developed.

  5. High-level expressing YAC vector for transgenic animal bioreactors.

    PubMed

    Fujiwara, Y; Miwa, M; Takahashi, R; Kodaira, K; Hirabayashi, M; Suzuki, T; Ueda, M

    1999-04-01

    The position effect is one major problem in the production of transgenic animals as mammary gland bioreactors. In the present study, we introduced the human growth hormone (hGH) gene into 210-kb human alpha-lactalbumin position-independent YAC vectors using homologous recombination and produced transgenic rats via microinjection of YAC DNA into rat embryos. The efficiency of producing transgenic rats with the YAC vector DNA was the same as that using plasmid constructs. All analyzed transgenic rats had one copy of the transgene and produced milk containing a high level of hGH (0.25-8.9 mg/ml). In transgenic rats with the YAC vector in which the human alpha-lactalbumin gene was replaced with the hGH gene, tissue specificity of hGH mRNA was the same as that of the endogenous rat alpha-lactalbumin gene. Thus, the 210-kb human alpha-lactalbumin YAC is a useful vector for high-level expression of foreign genes in the milk of transgenic animals.

  6. Learning high-level features for chord recognition using Autoencoder

    NASA Astrophysics Data System (ADS)

    Phongthongloa, Vilailukkana; Kamonsantiroj, Suwatchai; Pipanmaekaporn, Luepol

    2016-07-01

    Chord transcription is valuable in its own right, but manual transcription of chords is tiresome, time-consuming, and requires musical knowledge. Automatic chord recognition has therefore attracted considerable research in the Music Information Retrieval field. The pitch class profile (PCP) is the signal representation commonly used for musical harmonic analysis. However, the PCP may contain non-harmonic content such as harmonic overtones and transient noise, which adds spectral energy at frequencies beyond the actual notes of the chord. An autoencoder neural network can be trained to learn a mapping from low-level features to one or more higher-level representations. These high-level representations capture dependencies among the inputs and reduce the effect of non-harmonic noise. The improved features are then fed into a neural network classifier. The proposed high-level musical features achieve 80.90% accuracy, and the experimental results show that the proposed approach performs better than comparable methods.

  7. High level language-based robotic control system

    NASA Technical Reports Server (NTRS)

    Rodriguez, Guillermo (Inventor); Kreutz, Kenneth K. (Inventor); Jain, Abhinandan (Inventor)

    1996-01-01

    This invention is a robot control system based on a high level language implementing a spatial operator algebra. There are two high level languages included within the system. At the highest level, applications programs can be written in a robot-oriented applications language including broad operators such as MOVE and GRASP. The robot-oriented applications language statements are translated into statements in the spatial operator algebra language. Programming can also take place using the spatial operator algebra language. The statements in the spatial operator algebra language from either source are then translated into machine language statements for execution by a digital control computer. The system also includes the capability of executing the control code sequences in a simulation mode before actual execution to assure proper action at execution time. The robot's environment is checked as part of the process and dynamic reconfiguration is also possible. The languages and system allow the programming and control of multiple arms and the use of inward/outward spatial recursions in which every computational step can be related to a transformation from one point in the mechanical robot to another point, to name two major advantages.

  8. High level language-based robotic control system

    NASA Technical Reports Server (NTRS)

    Rodriguez, Guillermo (Inventor); Kruetz, Kenneth K. (Inventor); Jain, Abhinandan (Inventor)

    1994-01-01

    This invention is a robot control system based on a high level language implementing a spatial operator algebra. There are two high level languages included within the system. At the highest level, applications programs can be written in a robot-oriented applications language including broad operators such as MOVE and GRASP. The robot-oriented applications language statements are translated into statements in the spatial operator algebra language. Programming can also take place using the spatial operator algebra language. The statements in the spatial operator algebra language from either source are then translated into machine language statements for execution by a digital control computer. The system also includes the capability of executing the control code sequences in a simulation mode before actual execution to assure proper action at execution time. The robot's environment is checked as part of the process and dynamic reconfiguration is also possible. The languages and system allow the programming and control of multiple arms and the use of inward/outward spatial recursions in which every computational step can be related to a transformation from one point in the mechanical robot to another point, to name two major advantages.

  9. High level radioactive waste glass production and product description

    SciTech Connect

    Sproull, J.F.; Marra, S.L.; Jantzen, C.M.

    1993-12-01

    This report examines borosilicate glass as a means of immobilizing high-level radioactive wastes. Borosilicate glass will encapsulate most of the defense and some of the commercial HLW in the US. The resulting waste forms must meet the requirements of the WA-SRD and the WAPS, which include a short term PCT durability test. The waste form producer must report the composition(s) of the borosilicate waste glass(es) produced but can choose the composition(s) to meet site-specific requirements. Although the waste form composition is the primary determinant of durability, the redox state of the glass; the existence, content, and composition of crystals; and the presence of glass-in-glass phase separation can affect durability. The waste glass should be formulated to avoid phase separation regions. The ultimate result of this effort will be a waste form which is much more stable and potentially less mobile than the liquid high level radioactive waste is currently.

  10. Burning high-level TRU waste in fusion fission reactors

    NASA Astrophysics Data System (ADS)

    Shen, Yaosong

    2016-09-01

    Recently, the concept of actinide burning, rather than a once-through fuel cycle, for disposing of spent nuclear fuel has attracted much more attention. A new method of burning high-level transuranic (TRU) waste combined with Thorium-Uranium (Th-U) fuel in subcritical reactors driven by external fusion neutron sources is proposed in this paper. The thorium-based TRU fuel burns all of the long-lived actinides via a hard neutron spectrum while outputting power. A one-dimensional model of the reactor concept was built by means of the ONESN_BURN code with new data libraries. The numerical results included actinide radioactivity, biological hazard potential, and a much higher burnup rate of high-level transuranic waste. The comparison of the fusion-fission reactor with the thermal reactor shows that the harder neutron spectrum is more efficient than the soft one. The Th-U cycle produces less TRU, less radiotoxicity and fewer long-lived actinides. The Th-U cycle provides breeding of 233U with a long operation time (>20 years), hence significantly reducing the reactivity swing while improving safety and burnup.

  11. Spent fuel and high-level radioactive waste transportation report

    SciTech Connect

    Not Available

    1989-11-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by the Southern States Energy Board (SSEB) in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a "comprehensive overview of the issues." This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list.

  12. How to achieve high-level expression of microbial enzymes

    PubMed Central

    Liu, Long; Yang, Haiquan; Shin, Hyun-dong; Chen, Rachel R.; Li, Jianghua; Du, Guocheng; Chen, Jian

    2013-01-01

    Microbial enzymes have been used in a large number of fields, such as chemical, agricultural and biopharmaceutical industries. The enzyme production rate and yield are the main factors to consider when choosing the appropriate expression system for the production of recombinant proteins. Recombinant enzymes have been expressed in bacteria (e.g., Escherichia coli, Bacillus and lactic acid bacteria), filamentous fungi (e.g., Aspergillus) and yeasts (e.g., Pichia pastoris). The favorable and very advantageous characteristics of these species have resulted in an increasing number of biotechnological applications. Bacterial hosts (e.g., E. coli) can be used to quickly and easily overexpress recombinant enzymes; however, bacterial systems cannot express very large proteins and proteins that require post-translational modifications. The main bacterial expression hosts, with the exception of lactic acid bacteria and filamentous fungi, can produce several toxins which are not compatible with the expression of recombinant enzymes in food and drugs. However, due to the multiplicity of the physiological impacts arising from high-level expression of genes encoding the enzymes and expression hosts, the goal of overproduction can hardly be achieved, and therefore, the yield of recombinant enzymes is limited. In this review, the recent strategies used for the high-level expression of microbial enzymes in the hosts mentioned above are summarized and the prospects are also discussed. We hope this review will contribute to the development of the enzyme-related research field. PMID:23686280

  13. Spent fuel and high-level radioactive waste transportation report

    SciTech Connect

    Not Available

    1990-11-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by the Southern States Energy Board (SSEB) in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a "comprehensive overview of the issues." This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list.

  14. FLUIDIZED BED STEAM REFORMING ENABLING ORGANIC HIGH LEVEL WASTE DISPOSAL

    SciTech Connect

    Williams, M

    2008-05-09

    Waste streams planned for generation by the Global Nuclear Energy Partnership (GNEP) and existing radioactive High Level Waste (HLW) streams containing organic compounds such as the Tank 48H waste stream at Savannah River Site have completed simulant and radioactive testing, respectively, by Savannah River National Laboratory (SRNL). GNEP waste streams will include up to 53 wt% organic compounds and nitrates up to 56 wt%. Decomposition of high nitrate streams requires reducing conditions, e.g. provided by organic additives such as sugar or coal, to reduce NOX in the off-gas to N2 to meet Clean Air Act (CAA) standards during processing. Thus, organics will be present during the waste form stabilization process regardless of the GNEP processes utilized and exist in some of the high level radioactive waste tanks at Savannah River Site and Hanford Tank Farms, e.g. organics in the feed or organics used for nitrate destruction. Waste streams containing high organic concentrations cannot be stabilized with the existing HLW Best Developed Available Technology (BDAT) which is HLW vitrification (HLVIT) unless the organics are removed by pretreatment. The alternative waste stabilization pretreatment process of Fluidized Bed Steam Reforming (FBSR) operates at moderate temperatures (650-750 C) compared to vitrification (1150-1300 C). The FBSR process has been demonstrated on GNEP simulated waste and radioactive waste containing high organics from Tank 48H to convert organics to CAA compliant gases, create no secondary liquid waste streams and create a stable mineral waste form.

  15. Local acceptance of a high-level nuclear waste repository.

    PubMed

    Sjöberg, Lennart

    2004-06-01

    The siting of nuclear waste facilities has been very difficult in all countries. Recent experience in Sweden indicates, however, that it may be possible, under certain circumstances, to gain local support for the siting of a high-level nuclear waste (HLNW) repository. The article reports on a study of attitudes and risk perceptions of people living in four municipalities in Sweden where HLNW siting was being intensely discussed at the political level, in media, and among the public. Data showed a relatively high level of consensus on acceptability of at least further investigation of the issue; in two cases local councils have since voted in favor of a go-ahead, and in one case only a very small majority defeated the issue. Models of policy attitudes showed that these were related to attitude to nuclear power, attributes of the perceived HLNW risk, and trust. Factors responsible for acceptance are discussed at several levels. One is the attitude to nuclear power, which is becoming more positive, probably because no viable alternatives are in sight. Other factors have to do with the extensive information programs conducted in these municipalities, and with the logical nature of the conclusion that they would be good candidates for hosting the national HLNW repository.

  16. Spent Fuel and High-Level Radioactive Waste Transportation Report

    SciTech Connect

    Not Available

    1992-03-01

    This publication is intended to provide its readers with an introduction to the issues surrounding the subject of transportation of spent nuclear fuel and high-level radioactive waste, especially as those issues impact the southern region of the United States. It was originally issued by SSEB in July 1987 as the Spent Nuclear Fuel and High-Level Radioactive Waste Transportation Primer, a document patterned on work performed by the Western Interstate Energy Board and designed as a "comprehensive overview of the issues." This work differs from that earlier effort in that it is designed for the educated layman with little or no background in nuclear waste issues. In addition, this document is not a comprehensive examination of nuclear waste issues but should instead serve as a general introduction to the subject. Owing to changes in the nuclear waste management system, program activities by the US Department of Energy and other federal agencies and developing technologies, much of this information is dated quickly. While this report uses the most recent data available, readers should keep in mind that some of the material is subject to rapid change. SSEB plans periodic updates in the future to account for changes in the program. Replacement pages will be supplied to all parties in receipt of this publication provided they remain on the SSEB mailing list.

  17. VITRIFICATION OF HIGH LEVEL WASTE AT THE SAVANNAH RIVER SITE

    SciTech Connect

    Fox, K.; Peeler, D.

    2009-06-17

    The objective of this study was to experimentally measure the properties and performance of a series of glasses with compositions that could represent high level waste Sludge Batch 5 (SB5) as vitrified at the Savannah River Site Defense Waste Processing Facility. These data were used to guide frit optimization efforts as the SB5 composition was finalized. Glass compositions for this study were developed by combining a series of SB5 composition projections with a group of candidate frits. The study glasses were fabricated using depleted uranium and their chemical compositions, crystalline contents and chemical durabilities were characterized. Trevorite was the only crystalline phase that was identified in a few of the study glasses after slow cooling, and is not of concern as spinels have been shown to have little impact on the durability of high level waste glasses. Chemical durability was quantified using the Product Consistency Test (PCT). All of the glasses had very acceptable durability performance. The results of this study indicate that a frit composition can be identified that will provide a processable and durable glass when combined with SB5.

  18. Permitting plan for the high-level waste interim storage

    SciTech Connect

    Deffenbaugh, M.L.

    1997-04-23

    This document addresses the environmental permitting requirements for the transportation and interim storage of solidified high-level waste (HLW) produced during Phase 1 of the Hanford Site privatization effort. Solidified HLW consists of canisters containing vitrified HLW (glass) and containers that hold cesium separated during low-level waste pretreatment. The glass canisters and cesium containers will be transported to the Canister Storage Building (CSB) in a U.S. Department of Energy (DOE)-provided transportation cask via diesel-powered tractor trailer. Tri-Party Agreement (TPA) Milestone M-90 establishes a new major milestone, and associated interim milestones and target dates, governing acquisition and/or modification of facilities necessary for: (1) interim storage of Tank Waste Remediation Systems (TWRS) immobilized HLW (IHLW) and other canistered high-level waste forms; and (2) interim storage and disposal of TWRS immobilized low-activity tank waste (ILAW). An environmental requirements checklist and narrative was developed to identify the permitting path forward for the HLW interim storage (HLWIS) project (See Appendix B). This permitting plan will follow the permitting logic developed in that checklist.

  19. Status of high-level waste processing at West Valley

    SciTech Connect

    Howell, A.J.; Baker, M.N. )

    1991-11-01

    The US Department of Energy is charged with the solidification of high-level liquid waste remaining from nuclear fuel reprocessing activities that were conducted at West Valley, New York, between 1966 and 1972. The 2.27 million liters (600,000 gal) of waste in an underground storage tank has separated into a sludge layer, approximately 10% of the original volume, and a liquid layer. Prior to the high-level waste (HLW) vitrification, volume reduction of the waste is necessary. Since May 1988, West Valley has successfully processed >1.59 million liters (420,000 gal) of HLW. Processing to date has involved the removal of ¹³⁷Cs from the HLW effluent by ion exchange, evaporation to concentrate the effluent to a predetermined salt concentration, and finally cementation. This process has removed approximately 80% of the ¹³⁷Cs from the HLW liquid phase. Modifications are currently being made to begin the second phase of the HLW processing at West Valley. The second phase of HLW processing will include the removal of plutonium as well as cesium from the HLW sludge. This paper describes the progress made to date and the modifications being made to the process and to the feed stream to begin the second phase of HLW processing.

  20. High-level power analysis and optimization techniques

    NASA Astrophysics Data System (ADS)

    Raghunathan, Anand

    1997-12-01

    This thesis combines two ubiquitous trends in the VLSI design world--the move towards designing at higher levels of design abstraction, and the increasing importance of power consumption as a design metric. Power estimation and optimization tools are becoming an increasingly important part of design flows, driven by a variety of requirements such as prolonging battery life in portable computing and communication devices, thermal considerations and system cooling and packaging costs, reliability issues (e.g. electromigration, ground bounce, and I-R drops in the power network), and environmental concerns. This thesis presents a suite of techniques to automatically perform power analysis and optimization for designs at the architecture or register-transfer, and behavior or algorithm levels of the design hierarchy. High-level synthesis refers to the process of synthesizing, from an abstract behavioral description, a register-transfer implementation that satisfies the desired constraints. High-level synthesis tools typically perform one or more of the following tasks: transformations, module selection, clock selection, scheduling, and resource allocation and assignment (also called resource sharing or hardware sharing). High-level synthesis techniques for minimizing the area, maximizing the performance, and enhancing the testability of the synthesized designs have been investigated. This thesis presents high-level synthesis techniques that minimize power consumption in the synthesized data paths. This thesis investigates the effects of resource sharing on the power consumption in the data path, provides techniques to efficiently estimate power consumption during resource sharing, and resource sharing algorithms to minimize power consumption. The RTL circuit that is obtained from the high-level synthesis process can be further optimized for power by applying power-reducing RTL transformations. This thesis presents macro-modeling and estimation techniques for switching

  1. Macrostructuring of emulsion-templated porous polymers by 3D laser patterning.

    PubMed

    Johnson, David W; Sherborne, Colin; Didsbury, Matthew P; Pateman, Christopher; Cameron, Neil R; Claeyssens, Frederik

    2013-06-18

    Micro-stereolithography (μSL) is used to produce 3D porous polymer structures by templating high internal phase emulsions. A variety of structures are produced, including lines, squares, grids, and tubes. The porosity matches that of materials produced by conventional photopolymerization. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Functional Programming with C++ Template Metaprograms

    NASA Astrophysics Data System (ADS)

    Porkoláb, Zoltán

    Template metaprogramming is an emerging new direction of generative programming. With the clever definitions of templates we can force the C++ compiler to execute algorithms at compilation time. Among the application areas of template metaprograms are the expression templates, static interface checking, code optimization with adaption, language embedding and active libraries. However, as template metaprogramming was not an original design goal, the C++ language is not capable of elegant expression of metaprograms. The complicated syntax leads to the creation of code that is hard to write, understand and maintain. Although template metaprogramming has a strong relationship with functional programming, this is not reflected in the language syntax and existing libraries. In this paper we give a short and incomplete introduction to C++ templates and the basics of template metaprogramming. We will highlight the role of template metaprograms, and some important and widely used idioms. We give an overview of the possible application areas as well as debugging and profiling techniques. We suggest a pure functional style programming interface for C++ template metaprograms in the form of embedded Haskell code which is transformed to standard compliant C++ source.
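    As a self-contained illustration of the kind of compile-time execution discussed here (a standard textbook example rather than code from the paper), the compiler evaluates the recursion below while instantiating templates, with a full specialisation terminating it much like a pattern-matching base case in a functional language.

    // The compiler evaluates the recursion while instantiating templates, so
    // Factorial<5>::value is a constant long before run time.
    #include <iostream>

    template <unsigned N>
    struct Factorial {
        static const unsigned long value = N * Factorial<N - 1>::value;
    };

    template <>
    struct Factorial<0> {                      // recursion terminated by a full
        static const unsigned long value = 1;  // specialisation (the "base case")
    };

    int main() {
        // The array size must be a compile-time constant, which shows that the
        // metaprogram really ran inside the compiler.
        int scratch[Factorial<5>::value];
        std::cout << "5! = " << Factorial<5>::value
                  << ", array length = " << sizeof(scratch) / sizeof(scratch[0]) << '\n';
        return 0;
    }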

  3. Moving object detection from a mobile robot using basis image matching

    NASA Astrophysics Data System (ADS)

    Tsai, Du-Ming; Chiu, Wei-Yao; Tseng, Tzu-Hsun

    2015-01-01

    In this paper, we propose an image processing scheme for moving object detection from a mobile robot with a single camera. It especially aims at intruder detection for the security robot on either smooth or uneven ground surfaces. The proposed scheme uses template matching with basis image reconstruction for the alignment between two consecutive images in the video sequence. The most representative template patches in one image are first automatically selected based on the gradient energies in the patches. The chosen templates then form a basis image matrix. A windowed subimage is constructed by the linear combination of the basis images, and the instances of the templates in the subsequent image are matched by evaluating their reconstruction error from the basis image matrix. For two well aligned images, a simple and fast temporal difference can thus be applied to identify moving objects from the background. The proposed template matching can tolerate +/-10° in rotation and +/-10% in scaling. By adding templates with larger rotational angles in the basis image matrices, the proposed method can match images captured under severe camera vibrations. The proposed scheme achieves a fast processing rate of 32 frames per second for images of size 160×120 pixels.
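    The matching criterion can be sketched as follows (an illustration of the reconstruction-error idea only, not the authors' full pipeline): candidate windows in the next frame are scored by how well they can be reconstructed as a linear combination of the selected template patches, with a small least-squares residual indicating a good match. Patches are flattened to vectors and only two basis patches are used so that the normal equations stay a 2x2 system.

    // Residual of the least-squares reconstruction of a candidate window from two
    // basis patches; a small residual marks a likely match. Patch values are toy
    // numbers for illustration.
    #include <cmath>
    #include <cstddef>
    #include <cstdio>
    #include <vector>

    static double dot(const std::vector<double>& a, const std::vector<double>& b) {
        double s = 0.0;
        for (std::size_t i = 0; i < a.size(); ++i) s += a[i] * b[i];
        return s;
    }

    // Residual norm of the least-squares reconstruction of `window` from the two
    // basis patches b1 and b2.
    double reconstruction_error(const std::vector<double>& window,
                                const std::vector<double>& b1,
                                const std::vector<double>& b2) {
        double a11 = dot(b1, b1), a12 = dot(b1, b2), a22 = dot(b2, b2);
        double r1 = dot(b1, window), r2 = dot(b2, window);
        double det = a11 * a22 - a12 * a12;
        double c1 = (a22 * r1 - a12 * r2) / det;     // least-squares coefficients
        double c2 = (a11 * r2 - a12 * r1) / det;
        double err = 0.0;
        for (std::size_t i = 0; i < window.size(); ++i) {
            double d = window[i] - c1 * b1[i] - c2 * b2[i];
            err += d * d;
        }
        return std::sqrt(err);
    }

    int main() {
        // Toy flattened patches (e.g. 2x3 pixels).
        std::vector<double> basis1 = {10, 20, 30, 40, 50, 60};
        std::vector<double> basis2 = { 5,  5,  5,  5,  5,  5};
        std::vector<double> good   = {21, 41, 61, 81, 101, 121};  // lies in the span
        std::vector<double> bad    = {90,  3, 70,  1, 55,  8};

        std::printf("good window error = %.2f\n", reconstruction_error(good, basis1, basis2));
        std::printf("bad  window error = %.2f\n", reconstruction_error(bad, basis1, basis2));
        return 0;
    }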

  4. Template protection and its implementation in 3D face recognition systems

    NASA Astrophysics Data System (ADS)

    Zhou, Xuebing

    2007-04-01

    As biometric recognition systems are widely applied in various application areas, security and privacy risks have recently attracted the attention of the biometric community. Template protection techniques prevent stored reference data from revealing private biometric information and enhance the security of biometrics systems against attacks such as identity theft and cross matching. This paper concentrates on a template protection algorithm that merges methods from cryptography, error correction coding and biometrics. The key component of the algorithm is to convert biometric templates into binary vectors. It is shown that the binary vectors should be robust, uniformly distributed, statistically independent and collision-free so that authentication performance can be optimized and information leakage can be avoided. Depending on the statistical character of the biometric template, different approaches for transforming biometric templates into compact binary vectors are presented. The proposed methods are integrated into a 3D face recognition system and tested on the 3D facial images of the FRGC database. It is shown that the resulting binary vectors provide an authentication performance that is similar to the original 3D face templates. A high security level is achieved with reasonable false acceptance and false rejection rates of the system, based on an efficient statistical analysis. The algorithm estimates the statistical character of biometric templates from a number of biometric samples in the enrollment database. For the FRGC 3D face database, the small distinction of robustness and discriminative power between the classification results under the assumption of uniformly distributed templates and the ones under the assumption of Gaussian distributed templates is shown in our tests.
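    One simple way to binarise real-valued templates so that the bits come out roughly uniformly distributed, as required above, is to threshold each feature at its population median. The sketch below shows only this step; the subsequent error-correction coding and cryptographic protection are not shown, and the feature values are invented for illustration.

    // Turn each real-valued feature into one bit by comparing it with the
    // population median of that feature, which tends to give uniformly
    // distributed, class-independent bits.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    // Per-dimension median over a background population of templates.
    std::vector<double> population_medians(const std::vector<std::vector<double>>& population) {
        std::vector<double> med(population.front().size());
        for (std::size_t j = 0; j < med.size(); ++j) {
            std::vector<double> column;
            for (const auto& t : population) column.push_back(t[j]);
            std::nth_element(column.begin(), column.begin() + column.size() / 2, column.end());
            med[j] = column[column.size() / 2];
        }
        return med;
    }

    std::vector<int> binarise(const std::vector<double>& tmpl,
                              const std::vector<double>& median) {
        std::vector<int> bits(tmpl.size());
        for (std::size_t j = 0; j < tmpl.size(); ++j)
            bits[j] = tmpl[j] > median[j] ? 1 : 0;
        return bits;
    }

    int main() {
        std::vector<std::vector<double>> population = {
            {0.1, 2.0, -1.0}, {0.4, 1.2, -0.2}, {0.3, 2.5, 0.1},
            {0.9, 0.7, -0.5}, {0.2, 1.9, 0.4}};
        std::vector<double> med = population_medians(population);

        std::vector<double> probe = {0.35, 1.0, 0.05};
        std::vector<int> bits = binarise(probe, med);
        std::printf("binary template: %d %d %d\n", bits[0], bits[1], bits[2]);
        return 0;
    }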

  5. Gravitational waves from inspiralling compact binaries: Hexagonal template placement and its efficiency in detecting physical signals

    NASA Astrophysics Data System (ADS)

    Cokelaer, T.

    2007-11-01

    Matched filtering is used to search for gravitational waves emitted by inspiralling compact binaries in data from the ground-based interferometers. One of the key aspects of the detection process is the design of a template bank that covers the astrophysically pertinent parameter space. In an earlier paper, we described a template bank that is based on a square lattice. Although robust, we showed that the square placement is overefficient, with the implication that it is computationally more demanding than required. In this paper, we present a template bank based on a hexagonal lattice, whose size is reduced by 40% with respect to the proposed square placement. We describe the practical aspects of the hexagonal template bank implementation, its size, and computational cost. We have also performed exhaustive simulations to characterize its efficiency and safeness. We show that the bank is adequate to search for a wide variety of binary systems (primordial black holes, neutron stars, and stellar-mass black holes) and in data from both current detectors (initial LIGO, Virgo and GEO600) as well as future detectors (advanced LIGO and EGO). Remarkably, although our template bank placement uses a metric arising from a particular template family, namely, stationary phase approximation, we show that it can be used successfully with other template families (e.g., Padé resummation and effective one-body approximation). This quality of being effective for different template families makes the proposed bank suitable for a search that would use several of them in parallel (e.g., in a binary black hole search). The hexagonal template bank described in this paper is currently used to search for nonspinning inspiralling compact binaries in data from the Laser Interferometer Gravitational-Wave Observatory (LIGO).
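    The geometry of a hexagonal placement can be sketched in flat 2D coordinates as follows (an illustration of the lattice spacing only; a real bank works in the metric induced by the template family, and the parameter ranges and covering radius here are arbitrary assumptions): rows are separated by 1.5R and templates within a row by √3·R, with alternate rows staggered by half a spacing, so every point of the space lies within the covering radius R of some template.

    // Lay out a 2D hexagonal template bank over a rectangular parameter space so
    // that every point lies within the covering radius R of some template.
    #include <cmath>
    #include <cstdio>
    #include <vector>

    struct Point { double x, y; };

    std::vector<Point> hexagonal_bank(double xmax, double ymax, double R) {
        const double dx = std::sqrt(3.0) * R;   // spacing along a row
        const double dy = 1.5 * R;              // spacing between rows
        std::vector<Point> bank;
        int row = 0;
        for (double y = 0.0; y <= ymax + dy; y += dy, ++row) {
            double offset = (row % 2 == 0) ? 0.0 : dx / 2.0;  // stagger alternate rows
            for (double x = offset; x <= xmax + dx; x += dx)
                bank.push_back({x, y});
        }
        return bank;
    }

    int main() {
        std::vector<Point> bank = hexagonal_bank(10.0, 10.0, 0.5);
        std::printf("number of templates: %zu\n", bank.size());
        std::printf("first few: (%.2f,%.2f) (%.2f,%.2f) (%.2f,%.2f)\n",
                    bank[0].x, bank[0].y, bank[1].x, bank[1].y, bank[2].x, bank[2].y);
        return 0;
    }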

  6. Adaptive user interfaces for relating high-level concepts to low-level photographic parameters

    NASA Astrophysics Data System (ADS)

    Scott, Edward; Madhawa Silva, Pubudu; Pardo, Bryan; Pappas, Thrasyvoulos N.

    2011-03-01

    Common controls for photographic editing can be difficult to use and have a significant learning curve. Often, a user does not know a direct mapping from a high-level concept (such as "soft") to the available parameters or controls. In addition, many concepts are subjective in nature, and the appropriate mapping may vary from user to user. To overcome these problems, we propose a system that can quickly learn a mapping from a high-level subjective concept onto low-level image controls using machine learning techniques. To learn such a concept, the system shows the user a series of training images that are generated by modifying a seed image along different dimensions (e.g., color, sharpness), and collects the user ratings of how well each training image matches the concept. Since it is known precisely how each modified example is different from the original, the system can determine the correlation between the user ratings and the image parameters to generate a controller tailored to the concept for the given user. The end result - a personalized image controller - is applicable to a variety of concepts. We have demonstrated the utility of this approach to relate low-level parameters, such as color balance and sharpness, to simple concepts, such as "lightness" and "crispness," as well as more complex and subjective concepts, such as "pleasantness." We have also applied the proposed approach to relate subband statistics (variance) to perceived roughness of visual textures (from the CUReT database).
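
    A minimal sketch of the learning step (assumed linear model fit by least squares; the offsets, ratings and control names are hypothetical, and the authors' actual learner may differ):

      import numpy as np

      def learn_concept_direction(param_offsets, ratings):
          # Fit ratings ~ offsets @ w + b by least squares and return the unit
          # direction in control space that increases the concept the most.
          A = np.hstack([param_offsets, np.ones((len(ratings), 1))])
          coef, *_ = np.linalg.lstsq(A, ratings, rcond=None)
          w = coef[:-1]
          return w / np.linalg.norm(w)

      # Hypothetical session with two controls (colour balance, sharpness); the
      # user's "crispness" ratings happen to track sharpness more than colour.
      rng = np.random.default_rng(1)
      offsets = rng.uniform(-1, 1, size=(30, 2))           # known image modifications
      ratings = 0.2 * offsets[:, 0] + 0.9 * offsets[:, 1] + rng.normal(0, 0.05, 30)
      direction = learn_concept_direction(offsets, ratings)
      print(direction)    # a single "crispness" slider then moves along this direction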

  7. ALICE: Project Overview and High Level Science Products

    NASA Astrophysics Data System (ADS)

    Soummer, Remi; Choquet, Elodie; Pueyo, Laurent; Brendan Hagan, J.; Gofas-Salas, Elena; Rajan, Abhijith; Perrin, Marshall D.; Chen, Christine; Debes, John H.; Golimowski, David A.; Hines, Dean C.; Schneider, Glenn; N'Diaye, Mamadou; Mawet, Dimitri; Marois, Christian; Barman, Travis

    2015-01-01

    We report on the status of the ALICE project (Archival Legacy Investigation of Circumstellar Environments), which consists of a consistent reanalysis of the entire HST-NICMOS coronagraphic archive. Over the last two years, we have developed a sophisticated pipeline able to handle the data of the 400 stars of the archive. This pipeline builds on the Karhunen-Loeve Image Projection (KLIP) algorithm, and was completed in the fall of 2014. We discuss the first processing and analysis results of the overall reduction campaign. As we will deliver high-level science products to the STScI MAST archive, we are defining a new standard format for high-contrast science products, which will be compatible with every new high-contrast imaging instrument (GPI, SPHERE, P1640, CHARIS, etc.) and used by the JWST coronagraphs. We present here the specifications of this standard.

  8. Review of High Level Waste Tanks Ultrasonic Inspection Data

    SciTech Connect

    Wiersma, B

    2006-03-09

    A review of the data collected during ultrasonic inspection of the Type I high level waste tanks has been completed. The data was analyzed for relevance to the possibility of vapor space corrosion and liquid/air interface corrosion. The review of the Type I tank UT inspection data has confirmed that the vapor space general corrosion is not an unusually aggressive phenomenon and correlates well with predicted corrosion rates for steel exposed to bulk solution. The corrosion rates are seen to decrease with time as expected. The review of the temperature data did not reveal any obvious correlations between high temperatures and the occurrences of leaks. The complex nature of the temperature-humidity interaction, particularly with respect to vapor corrosion, requires further understanding to infer any correlation. The review of the waste level data also did not reveal any obvious correlations.

  9. High Level Waste System Impacts from Acid Dissolution of Sludge

    SciTech Connect

    KETUSKY, EDWARD

    2006-04-20

    This research evaluates the ability of OLI© equilibrium-based software to forecast Savannah River Site High Level Waste system impacts from oxalic acid dissolution of Tanks 1-15 sludge heels. Without further laboratory and field testing, only the use of oxalic acid can be considered plausible to support sludge heel dissolution on multiple tanks. Using OLI© and available test results, a dissolution model is constructed and validated. Material and energy balances, coupled with the model, identify potential safety concerns. Overpressurization and overheating are shown to be unlikely. Corrosion-induced hydrogen could, however, overwhelm the tank ventilation. While pH adjustment can restore the minimal hydrogen generation, the resultant precipitates will notably increase the sludge volume. OLI© is used to develop a flowsheet such that additional sludge vitrification canisters and other negative system impacts are minimized. Sensitivity analyses are used to assess the processability impacts from variations in the sludge and in the quantities of acids.

  10. Mammut: High-level management of system knobs and sensors

    NASA Astrophysics Data System (ADS)

    De Sensi, Daniele; Torquati, Massimo; Danelutto, Marco

    Managing low-level architectural features for controlling performance and power consumption is a growing demand in the parallel computing community. Such features include, but are not limited to: energy profiling, platform topology analysis, CPU core disabling and frequency scaling. However, these low-level mechanisms are usually managed by specific tools, without any interaction between each other, thus hampering their usability. More importantly, most existing tools can only be used through a command line interface and do not provide any API. Moreover, in most cases, they only allow monitoring and managing the same machine on which the tools are used. MAMMUT provides and integrates architectural management utilities through a high-level and easy-to-use object-oriented interface. By using MAMMUT, it is possible to link together different pieces of collected information and to exploit them on both local and remote systems, to build architecture-aware applications.

  11. Online Pattern Recognition for the ALICE High Level Trigger

    NASA Astrophysics Data System (ADS)

    Lindenstruth, V.; Loizides, C.; Rohrich, D.; Skaali, B.; Steinbeck, T.; Stock, R.; Tilsner, H.; Ullaland, K.; Vestbo, A.; Vik, T.

    2004-06-01

    The ALICE high level trigger has to process data online, in order to select interesting (sub)events, or to compress data efficiently by modeling techniques. Focusing on the main data source, the time projection chamber (TPC), we present two pattern recognition methods under investigation: a sequential approach (cluster finder and track follower) and an iterative approach (track candidate finder and cluster deconvoluter). We show that the former is suited for pp and low multiplicity PbPb collisions, whereas the latter might be applicable for high multiplicity PbPb collisions of dN/dy>3000. Based on the developed tracking schemes, we show that a compression factor of around 10 might be achievable using modeling techniques.
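
    A toy one-dimensional version of the sequential cluster-finding step (Python for illustration only; the production HLT code operates on pad/time planes and is far more involved):

      import numpy as np

      def find_clusters(adc, threshold=5):
          # Threshold, group contiguous samples, return charge-weighted centroids.
          clusters, i = [], 0
          while i < len(adc):
              if adc[i] > threshold:
                  j = i
                  while j < len(adc) and adc[j] > threshold:
                      j += 1
                  charge = adc[i:j].astype(float)
                  centroid = np.average(np.arange(i, j), weights=charge)
                  clusters.append((centroid, charge.sum()))
                  i = j
              else:
                  i += 1
          return clusters

      # Hypothetical ADC samples along one pad row: two clusters are found.
      print(find_clusters(np.array([0, 1, 8, 20, 9, 2, 0, 0, 6, 15, 7, 1])))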

  12. Socioeconomic studies of high-level nuclear waste disposal.

    PubMed Central

    White, G F; Bronzini, M S; Colglazier, E W; Dohrenwend, B; Erikson, K; Hansen, R; Kneese, A V; Moore, R; Page, E B; Rappaport, R A

    1994-01-01

    The socioeconomic investigations of possible impacts of the proposed repository for high-level nuclear waste at Yucca Mountain, Nevada, have been unprecedented in several respects. They bear on the public decision that sooner or later will be made as to where and how to dispose permanently of the waste presently at military weapons installations and that continues to accumulate at nuclear power stations. No final decision has yet been made. There is no clear precedent from other countries. The organization of state and federal studies is unique. The state studies involve more disciplines than any previous efforts. They have been carried out in parallel to federal studies and have pioneered in defining some problems and appropriate research methods. A recent annotated bibliography provides interested scientists with a compact guide to the 178 published reports, as well as to relevant journal articles and related documents. PMID:7971963

  13. National high-level waste systems analysis plan

    SciTech Connect

    Kristofferson, K.; Oholleran, T.P.; Powell, R.H.; Thiel, E.C.

    1995-05-01

    This document details the development of modeling capabilities that can provide a system-wide view of all US Department of Energy (DOE) high-level waste (HLW) treatment and storage systems. This model can assess the impact of budget constraints on storage and treatment system schedules and throughput. These impacts can then be assessed against existing and pending milestones to determine the impact to the overall HLW system. A nation-wide view of waste treatment availability will help project the time required to prepare HLW for disposal. The impacts of the availability of various treatment systems and throughput can be compared to repository readiness to determine the prudent application of resources or the need to renegotiate milestones.

  14. Exceptionally high levels of multiple mating in an army ant

    NASA Astrophysics Data System (ADS)

    Denny, A. Jay; Franks, Nigel R.; Powell, Scott; Edwards, Keith J.

    Most species of social insects have singly mated queens, although there are notable exceptions. Competing hypotheses have been proposed to explain the evolution of high levels of multiple mating, but this issue is far from resolved. Here we use microsatellites to investigate mating frequency in the army ant Eciton burchellii and show that queens mate with an exceptionally large number of males, eclipsing all but one other social insect species for which data are available. In addition we present evidence that suggests that mating is serial, continuing throughout the lifetime of the queen. This is the first demonstration of serial mating among social hymenoptera. We propose that high paternity within colonies is most likely to have evolved to increase genetic diversity and to counter high pathogen and parasite loads.

  15. High-level theoretical rovibrational spectroscopy of HCS+ isotopologues

    NASA Astrophysics Data System (ADS)

    Schröder, B.; Sebald, P.

    2016-12-01

    In this work the rovibrational spectrum of the HCS+ molecular cation is revisited through high-level electronic structure and variational rovibrational calculations. A local potential energy function is built from explicitly correlated coupled-cluster results, incorporating corrections for core-valence, scalar relativistic and higher-order excitation effects. The computed spectroscopic parameters, based on variational calculations with Watson's isomorphic Hamiltonian for linear molecules, lead to nearly perfect agreement with experimentally reported values (Rosenbaum et al., 1989). Furthermore, the documented Fermi resonance within the (0, 0^0, 1) / (0, 2^0, 0) and (1, 0^0, 1) / (1, 2^0, 0) pairs of states is clarified. Based on a newly developed electric dipole moment function, transition dipole moments of the fundamental transitions are predicted for the most important isotopologues.

  16. High-level waste tank farm set point document

    SciTech Connect

    Anthony, J.A. III

    1995-01-15

    Setpoints for nuclear safety-related instrumentation are required for actions determined by the design authorization basis. Minimum requirements need to be established for assuring that setpoints are established and held within specified limits. This document establishes the controlling methodology for changing setpoints of all classifications. The instrumentation under consideration involves the transfer, storage, and volume reduction of radioactive liquid waste in the F- and H-Area High-Level Radioactive Waste Tank Farms. The setpoint document will encompass the PROCESS AREA listed in the Safety Analysis Report (SAR) (DPSTSA-200-10 Sup 18), which includes the diversion box HDB-8 facility. In addition to the PROCESS AREAS listed in the SAR, Building 299-H and the Effluent Transfer Facility (ETF) are also included in the scope.

  17. High-level tetracycline resistant Neisseria gonorrhoeae isolated in Portugal.

    PubMed

    Ferreira, E; Louro, D; Gomes, J P; Catry, M A; Pato, M V

    1997-05-01

    The first high-level tetracycline resistance (MIC > or = 16 mg/l) isolates of Neisseria gonorrhoeae (TRNG) were reported in 1990 from patients attending a Sexually Transmitted Disease (STD) Center in Lisbon. The TRNG prevalence was 4% in 1991, 5.3% in 1992 and 10.8% in 1994, exploding to 52.2% in 1995. The tetM determinant was evaluated by PCR. HpaII digests of the PCR products produced restriction pattern 2 for all the strains except one (pattern 3). Of the TRNG strains, 78.3% were beta-lactamase producers, and the 4.5 MDa penicillinase plasmid was the dominant one (83%); 90% and 93.3% of the TRNG strains belonged to auxotype NR and to serogroup IA, respectively. The IA-8/NR class represented 58.3% of the TRNG isolates, suggesting clonal spreading.

  18. Remote ignitability analysis of high-level radioactive waste

    SciTech Connect

    Lundholm, C.W.; Morgan, J.M.; Shurtliff, R.M.; Trejo, L.E.

    1992-09-01

    The Idaho Chemical Processing Plant (ICPP) was used to reprocess nuclear fuel from government-owned reactors to recover the unused uranium-235. These processes generated highly radioactive liquid wastes which are stored in large underground tanks prior to being calcined into a granular solid. The Resource Conservation and Recovery Act (RCRA) and state/federal clean air statutes require waste characterization of these high level radioactive wastes for regulatory permitting and waste treatment purposes. The determination of the characteristic of ignitability is part of the required analyses prior to calcination and waste treatment. To perform this analysis in a radiologically safe manner, a remotely operated instrument was needed. The remote ignitability method and instrument will meet the 60 deg C requirement prescribed for ignitability in Method 1020 of SW-846. The method for remote use will be equivalent to Method 1020 of SW-846.

  19. SIMULANT DEVELOPMENT FOR SAVANNAH RIVER SITE HIGH LEVEL WASTE

    SciTech Connect

    Stone, M.; Eibling, R.; Koopman, D.; Lambert, D.; Burket, P.

    2007-09-04

    The Defense Waste Processing Facility (DWPF) at the Savannah River Site vitrifies High Level Waste (HLW) for repository internment. The process consists of three major steps: waste pretreatment, vitrification, and canister decontamination/sealing. The HLW consists of insoluble metal hydroxides (primarily iron, aluminum, magnesium, manganese, and uranium) and soluble sodium salts (carbonate, hydroxide, nitrite, nitrate, and sulfate). The HLW is processed in large batches through DWPF; DWPF has recently completed processing Sludge Batch 3 (SB3) and is currently processing Sludge Batch 4 (SB4). The composition of metal species in SB4 is shown in Table 1 as a function of the ratio of a metal to iron. Simulants remove radioactive species and renormalize the remaining species. Supernate composition is shown in Table 2.

  20. 4.5 Meter high level waste canister study

    SciTech Connect

    Calmus, R. B.

    1997-10-01

    The Tank Waste Remediation System (TWRS) Storage and Disposal Project has established the Immobilized High-Level Waste (IHLW) Storage Sub-Project to provide the capability to store Phase I and II HLW products generated by private vendors. A design/construction project, Project W-464, was established under the Sub-Project to provide the Phase I capability. Project W-464 will retrofit the Hanford Site Canister Storage Building (CSB) to accommodate the Phase I IHLW products. Project W-464 conceptual design is currently being performed to interim store 3.0 m-long IHLW stainless steel canisters with a 0.61 m diameter. DOE is considering using a 4.5 m canister of the same diameter to reduce permanent disposal costs. This study was performed to assess the impact of replacing the 3.0 m canister with the 4.5 m canister. The summary cost and schedule impacts are described.

  1. The SEISMED High Level Security Policy for Health Care.

    PubMed

    Katsikas, S K

    1996-01-01

    The proliferation of the use of automated Health Information Systems in the everyday practice of health professionals has brought a number of issues related to the security of health information to a critical point. The preservation of security of health-related information can only be achieved through a concerted approach, comprising legal, organisational, technical and educational actions. These classes of actions constitute a complete "security framework", a key aspect of which is the set of rules, laws and regulations that govern the usage of information within a Health Care Establishment. This set is commonly referred to as "Security Policy". In this paper, the SEISMED High Level Security Policy for Health Care Establishments is presented.

  2. Midwestern High-Level Radioactive Waste Transportation Project

    SciTech Connect

    Dantoin, T.S.

    1990-12-01

    For more than half a century, the Council of State Governments has served as a common ground for the states of the nation. The Council is a nonprofit, state-supported and -directed service organization that provides research and resources, identifies trends, supplies answers and creates a network for legislative, executive and judicial branch representatives. This List of Available Resources was prepared with the support of the US Department of Energy, Cooperative Agreement No. DE-FC02-89CH10402. However, any opinions, findings, conclusions, or recommendations expressed herein are those of the author(s) and do not necessarily reflect the views of DOE. The purpose of the agreement, and reports issued pursuant to it, is to identify and analyze regional issues pertaining to the transportation of high-level radioactive waste and to inform Midwestern state officials with respect to technical issues and regulatory concerns related to waste transportation.

  3. A High-Level Language for Rule-Based Modelling

    PubMed Central

    Pedersen, Michael; Phillips, Andrew; Plotkin, Gordon D.

    2015-01-01

    Rule-based languages such as Kappa excel in their support for handling the combinatorial complexities prevalent in many biological systems, including signalling pathways. But Kappa provides little structure for organising rules, and large models can therefore be hard to read and maintain. This paper introduces a high-level, modular extension of Kappa called LBS-κ. We demonstrate the constructs of the language through examples and three case studies: a chemotaxis switch ring, a MAPK cascade, and an insulin signalling pathway. We then provide a formal definition of LBS-κ through an abstract syntax and a translation to plain Kappa. The translation is implemented in a compiler tool which is available as a web application. We finally demonstrate how to increase the expressivity of LBS-κ through embedded scripts in a general-purpose programming language, a technique which we view as generally applicable to other domain specific languages. PMID:26043208

  4. High-Level Language Production in Parkinson's Disease: A Review

    PubMed Central

    Altmann, Lori J. P.; Troche, Michelle S.

    2011-01-01

    This paper discusses impairments of high-level, complex language production in Parkinson's disease (PD), defined as sentence and discourse production, and situates these impairments within the framework of current psycholinguistic theories of language production. The paper comprises three major sections, an overview of the effects of PD on the brain and cognition, a review of the literature on language production in PD, and a discussion of the stages of the language production process that are impaired in PD. Overall, the literature converges on a few common characteristics of language production in PD: reduced information content, impaired grammaticality, disrupted fluency, and reduced syntactic complexity. Many studies also document the strong impact of differences in cognitive ability on language production. Based on the data, PD affects all stages of language production including conceptualization and functional and positional processing. Furthermore, impairments at all stages appear to be exacerbated by impairments in cognitive abilities. PMID:21860777

  5. High level radioactive waste vitrification process equipment component testing

    SciTech Connect

    Siemens, D.H.; Heath, W.O.; Larson, D.E.; Craig, S.N.; Berger, D.N.; Goles, R.W.

    1985-04-01

    Remote operability and maintainability of vitrification equipment were assessed under shielded-cell conditions. The equipment tested will be applied to immobilize high-level and transuranic liquid waste slurries that resulted from plutonium production for defense weapons. Equipment tested included: a turntable for handling waste canisters under the melter; a removable discharge cone in the melter overflow section; a thermocouple jumper that extends into a shielded cell; remote instrument and electrical connectors; remote, mechanical, and heat transfer aspects of the melter glass overflow section; a reamer to clean out plugged nozzles in the melter top; a closed circuit camera to view the melter interior; and a device to retrieve samples of the glass product. A test was also conducted to evaluate liquid metals for use in a liquid metal sealing system.

  7. Electrophysiological correlates of high-level perception during spatial navigation

    PubMed Central

    Weidemann, Christoph T.; Mollison, Matthew V.; Kahana, Michael J.

    2009-01-01

    We studied the electrophysiological basis of object recognition by recording scalp EEG while participants played a virtual reality taxi driver game. Participants searched for passengers and stores during virtual navigation in simulated towns. We compared oscillatory brain activity for store views that were targets or non-targets (during store search) or neutral (during passenger search). Even though store category was solely defined by task context (rather than sensory cues), frontal electrophysiological activity in low frequency bands (primarily in the theta [4–8 Hz] band) reliably distinguished between target, non-target, and neutral store views. These results implicate low frequency oscillatory brain activity in frontal regions as an important variable in the study of cognitive processes involved in object recognition, categorization, and other forms of high-level perception. PMID:19293100
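
    A generic sketch of the kind of band-limited power measure compared across conditions in such studies (assumed Welch estimate over a hypothetical frontal-channel epoch; not the study's actual pipeline):

      import numpy as np
      from scipy.signal import welch

      def theta_power(epoch, fs, f_lo=4.0, f_hi=8.0):
          # Welch power spectral density, then integrate over the theta band.
          freqs, psd = welch(epoch, fs=fs, nperseg=min(len(epoch), 2 * int(fs)))
          band = (freqs >= f_lo) & (freqs <= f_hi)
          return float(np.sum(psd[band]) * (freqs[1] - freqs[0]))

      # Hypothetical frontal-channel epoch: 2 s at 250 Hz with a 6 Hz component.
      fs = 250
      t = np.arange(0, 2, 1 / fs)
      epoch = np.sin(2 * np.pi * 6 * t) + 0.5 * np.random.default_rng(2).normal(size=t.size)
      print(theta_power(epoch, fs))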

  8. Reinforcement learning for high-level fuzzy Petri nets.

    PubMed

    Shen, V L

    2003-01-01

    The author has developed a reinforcement learning algorithm for the high-level fuzzy Petri net (HLFPN) models in order to perform structure and parameter learning simultaneously. In addition to the HLFPN itself, the difference and similarity among a variety of subclasses concerning Petri nets are also discussed. As compared with the fuzzy adaptive learning control network (FALCON), the HLFPN model preserves the advantages that: 1) it offers more flexible learning capability because it is able to model both IF-THEN and IF-THEN-ELSE rules; 2) it allows multiple heterogeneous outputs to be drawn if they exist; 3) it offers a more compact data structure for fuzzy production rules so as to save information storage; and 4) it is able to learn faster due to its structural reduction. Finally, main results are presented in the form of seven propositions and supported by some experiments.

  9. High-level waste melter alternatives assessment report

    SciTech Connect

    Calmus, R.B.

    1995-02-01

    This document describes the Tank Waste Remediation System (TWRS) High-Level Waste (HLW) Program's (hereafter referred to as HLW Program) Melter Candidate Assessment Activity performed in fiscal year (FY) 1994. The mission of the TWRS Program is to store, treat, and immobilize highly radioactive Hanford Site waste (current and future tank waste and encapsulated strontium and cesium isotopic sources) in an environmentally sound, safe, and cost-effective manner. The goal of the HLW Program is to immobilize the HLW fraction of pretreated tank waste into a vitrified product suitable for interim onsite storage and eventual offsite disposal at a geologic repository. Preparation of the encapsulated strontium and cesium isotopic sources for final disposal is also included in the HLW Program. As a result of trade studies performed in 1992 and 1993, processes planned for pretreatment of tank wastes were modified substantially because of increasing estimates of the quantity of high-level and transuranic tank waste remaining after pretreatment. This resulted in substantial increases in needed vitrification plant capacity compared to the capacity of the original Hanford Waste Vitrification Plant (HWVP). The required capacity has not been finalized, but is expected to be four to eight times that of the HWVP design. The increased capacity requirements for the HLW vitrification plant's melter prompted the assessment of candidate high-capacity HLW melter technologies to determine the most viable candidates and the required development and testing (D and T) focus required to select the Hanford Site HLW vitrification plant melter system. An assessment process was developed in early 1994. This document describes the assessment team, roles of team members, the phased assessment process and results, resulting recommendations, and the implementation strategy.

  10. Hip Arthroscopy in High-Level Baseball Players.

    PubMed

    Byrd, J W Thomas; Jones, Kay S

    2015-08-01

    To report the results of hip arthroscopy among high-level baseball players as recorded by outcome scores and return to baseball. All patients undergoing hip arthroscopy were prospectively assessed with the modified Harris Hip Score. On review of all procedures performed over a 12-year period, 44 hips were identified among 41 intercollegiate or professional baseball players who had achieved 2-year follow-up. Among the 41 players, follow-up averaged 45 months (range, 24 to 120 months), with a mean age of 23 years (range, 18 to 34 years). There were 23 collegiate (1 bilateral) and 18 professional (2 bilateral) baseball players, including 10 Major League Baseball players. Of the 8 Major League Baseball pitchers, 6 (75%) also underwent ulnar collateral ligament elbow surgery. Improvement in the modified Harris Hip Score averaged 13 points (from 81 points preoperatively to 94 points postoperatively); a paired-samples t test determined that this mean improvement of 13 points was statistically significant (P < .001). Players returned to baseball after 42 of 44 procedures (95%) at a mean of 4.3 months (range, 3 to 8 months), with 90% regaining the ability to participate at their previous level of competition. There were no complications. Three players (1 bilateral) underwent repeat arthroscopy. This study supports the idea that arthroscopic treatment for a variety of hip pathologies in high-level baseball players provides a successful return to sport and improvement in functional outcome scores. Level IV, therapeutic case series. Copyright © 2015 Arthroscopy Association of North America. Published by Elsevier Inc. All rights reserved.

  11. High Level Information Fusion (HLIF) with nested fusion loops

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Gosnell, Michael; Fischer, Amber

    2013-05-01

    Situation modeling and threat prediction require higher levels of data fusion in order to provide actionable information. Beyond the sensor data and sources the analyst has access to, the use of out-sourced and re-sourced data is becoming common. Through the years, some common frameworks have emerged for dealing with information fusion—perhaps the most ubiquitous being the JDL Data Fusion Group and their initial 4-level data fusion model. Since these initial developments, numerous models of information fusion have emerged, hoping to better capture the human-centric process of data analyses within a machine-centric framework. 21st Century Systems, Inc. has developed Fusion with Uncertainty Reasoning using Nested Assessment Characterizer Elements (FURNACE) to address challenges of high level information fusion and handle bias, ambiguity, and uncertainty (BAU) for Situation Modeling, Threat Modeling, and Threat Prediction. It combines JDL fusion levels with nested fusion loops and state-of-the-art data reasoning. Initial research has shown that FURNACE is able to reduce BAU and improve the fusion process by allowing high level information fusion (HLIF) to affect lower levels without the double counting of information or other biasing issues. The initial FURNACE project was focused on the underlying algorithms to produce a fusion system able to handle BAU and repurposed data in a cohesive manner. FURNACE supports analyst's efforts to develop situation models, threat models, and threat predictions to increase situational awareness of the battlespace. FURNACE will not only revolutionize the military intelligence realm, but also benefit the larger homeland defense, law enforcement, and business intelligence markets.

  12. High-level waste program integration within the DOE complex

    SciTech Connect

    Valentine, J.H.; Davis, N.R.; Malone, K.; Schaus, P.S.

    1998-03-01

    Eleven major Department of Energy (DOE) site contractors were chartered by the Assistant Secretary to use a systems engineering approach to develop and evaluate technically defensible cost savings opportunities across the complex. Known as the complex-wide Environmental Management Integration (EMI), this process evaluated all the major DOE waste streams including high level waste (HLW). Across the DOE complex, this waste stream has the highest life cycle cost and is scheduled to take until at least 2035 before all HLW is processed for disposal. Technical contract experts from the four DOE sites that manage high level waste participated in the integration analysis: Hanford, Savannah River Site (SRS), Idaho National Engineering and Environmental Laboratory (INEEL), and West Valley Demonstration Project (WVDP). In addition, subject matter experts from the Yucca Mountain Project and the Tanks Focus Area participated in the analysis. Also, departmental representatives from the US Department of Energy Headquarters (DOE-HQ) monitored the analysis and results. Workouts were held throughout the year to develop recommendations to achieve a complex-wide integrated program. From this effort, the HLW Environmental Management (EM) Team identified a set of programmatic and technical opportunities that could result in potential cost savings and avoidance in excess of $18 billion and an accelerated completion of the HLW mission by seven years. The cost savings, schedule improvements, and volume reduction are attributed to a multifaceted HLW treatment disposal strategy which involves waste pretreatment, standardized waste matrices, risk-based retrieval, early development and deployment of a shipping system for glass canisters, and reasonable, low cost tank closure.

  13. Hard template synthesis of metal nanowires

    PubMed Central

    Kawamura, Go; Muto, Hiroyuki; Matsuda, Atsunori

    2014-01-01

    Metal nanowires (NWs) have attracted much attention because of their high electron conductivity, optical transmittance, and tunable magnetic properties. Metal NWs have been synthesized using soft templates such as surface stabilizing molecules and polymers, and hard templates such as anodic aluminum oxide, mesoporous oxide, and carbon nanotubes. NWs prepared from hard templates are composites of metals and the oxide/carbon matrix. Thus, selecting appropriate elements can simplify the production of composite devices. The resulting NWs are immobilized and spatially arranged, as dictated by the ordered porous structure of the template. This prevents the NWs from aggregating, which is common for NWs prepared with soft templates in solution. Herein, the hard template synthesis of metal NWs is reviewed, and the resulting structures, properties and potential applications are discussed. PMID:25453031

  14. Vertical Carbon Nanotube Device in Nanoporous Templates

    NASA Technical Reports Server (NTRS)

    Maschmann, Matthew Ralph (Inventor); Fisher, Timothy Scott (Inventor); Sands, Timothy (Inventor); Bashir, Rashid (Inventor)

    2014-01-01

    A modified porous anodic alumina template (PAA) containing a thin CNT catalyst layer directly embedded into the pore walls. CNT synthesis using the template selectively catalyzes SWNTs and DWNTs from the embedded catalyst layer to the top PAA surface, creating a vertical CNT channel within the pores. Subsequent processing allows for easy contact metallization and adaptable functionalization of the CNTs and template for a myriad of applications.

  15. DNA-templated gold nanowires

    NASA Astrophysics Data System (ADS)

    Mohammadzadegan, Reza; Mohabatkar, Hassan; Sheikhi, Mohammad Hossein; Safavi, Afsaneh; Khajouee, Mahmood Barati

    2008-10-01

    We have developed simple methods of reproducibly creating deoxyribonucleic acid (DNA)-templated gold nanowires on silicon. First, DNA nanowires were aligned on silicon surfaces. Briefly, a modified silicon wafer was soaked in the DNA solution, and then the solution was removed using micropipettes; the surface tension at the moving air-solution interface is sufficient to align the DNA nanowires on the silicon wafer. In another attempt, an aqueous dispersion of sodium azide-stabilized gold nanoparticles was prepared. The nanoparticles aligned along double-stranded λ-DNA to form a linear nanoparticle array. Continuous gold nanowires were obtained. The above nanowires were structurally characterized using scanning electron microscopy. The results of the characterizations show the wires to be 57-323 nm wide and continuous, with lengths of 2.8-9.5 μm. The use of DNA as a template for the self-assembly of conducting nanowires represents a potentially important approach in the fabrication of nanoscale interconnects.

  16. Method of installing sub-sea templates

    SciTech Connect

    Hampton, J.E.

    1984-03-06

    A subsea template is installed by a method which includes the steps of securing the template in a position beneath the deck of a semi-submersible drilling vessel, moving the semi-submersible drilling vessel to an appropriate offshore site and subsequently lowering the template from the semi-submersible to the sea bed. In addition, at least three anchorage templates may be loaded onto one or both of the pontoons of the semi-submersible drilling vessel at its original position and are subsequently lowered from the pontoons to their respective locations on the sea bed after the semi-submersible has moved to the offshore site.

  17. Templated Dry Printing of Conductive Metal Nanoparticles

    NASA Astrophysics Data System (ADS)

    Rolfe, David Alexander

    Printed electronics can lower the cost and increase the ubiquity of electrical components such as batteries, sensors, and telemetry systems. Unfortunately, the advance of printed electronics has been held back by the limited minimum resolution, aspect ratio, and feature fidelity of present printing techniques such as gravure, screen printing and inkjet printing. Templated dry printing offers a solution to these problems by patterning nanoparticle inks into templates before drying. This dissertation shows advancements in two varieties of templated dry nanoprinting. The first, advective micromolding in vapor-permeable templates (AMPT), is a microfluidic approach that uses evaporation-driven mold filling to create submicron features with a 1:1 aspect ratio. We will discuss submicron surface acoustic wave (SAW) resonators made through this process, and the refinements to the template manufacturing process necessary to make these devices. We also present modeling techniques that can be applied to future AMPT templates. We conclude with a modified templated dry printing process that improves throughput and isolated feature patterning by transferring dry-templated features with laser ablation. This method utilizes surface energy-defined templates to pattern features via doctor blade coating. Patterned and dried features can be transferred to a polymer substrate with an Nd:YAG MOPA fiber laser, and printed features can be smaller than the laser beam width.

  18. Optimized periocular template selection for human recognition.

    PubMed

    Bakshi, Sambit; Sa, Pankaj K; Majhi, Banshidhar

    2013-01-01

    A novel approach for selecting a rectangular template around the periocular region that is optimally potent for human recognition is proposed. A larger periocular template than the optimal one can be slightly more potent for recognition, but it heavily slows down the biometric system by making feature extraction computationally intensive and by increasing the database size. A smaller template, on the contrary, performs faster due to the lower computation for feature extraction but cannot yield the desired recognition performance. The proposed research aims to optimize these two contradictory objectives: (a) minimizing the size of the periocular template and (b) maximizing the recognition achieved through the template. This paper proposes four different approaches for dynamic optimal template selection from the periocular region. The proposed methods are tested on the publicly available unconstrained UBIRISv2 and FERET databases and satisfactory results have been achieved. The template thus obtained can be used for recognition of individuals in an organization and can be generalized to recognize every citizen of a nation.
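
    The trade-off can be phrased as a simple selection rule: keep the smallest template whose accuracy stays within a tolerance of the best achievable accuracy. The sketch below is a toy formulation (the candidate sizes, the accuracy callback and the saturation model are hypothetical, not the paper's optimization procedure):

      def smallest_adequate_template(sizes, accuracy_for_size, tolerance=0.01):
          # Keep every size whose accuracy is within `tolerance` of the best,
          # then return the one with the smallest area.
          scored = [(w * h, (w, h), accuracy_for_size((w, h))) for (w, h) in sizes]
          best = max(acc for _, _, acc in scored)
          adequate = [(area, size) for area, size, acc in scored if acc >= best - tolerance]
          return min(adequate)[1]

      def fake_accuracy(size):
          # Hypothetical validation accuracy that saturates for large templates.
          w, h = size
          return min(0.95, 0.5 + 0.00005 * w * h)

      candidates = [(80, 40), (120, 60), (160, 80), (200, 100), (240, 120)]
      print(smallest_adequate_template(candidates, fake_accuracy))   # (160, 80)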

  19. A hybrid approach for face template protection

    NASA Astrophysics Data System (ADS)

    Feng, Y. C.; Yuen, Pong C.; Jain, Anil K.

    2008-03-01

    Biometric template protection is one of the important issues in deploying a practical biometric system. To tackle this problem, many algorithms have been reported in recent years, most of them being applicable to fingerprint biometrics. Since the content and representation of a fingerprint template are different from those of templates of other modalities such as face, fingerprint template protection algorithms cannot be directly applied to face templates. Moreover, we believe that no single template protection method is capable of satisfying the diversity, revocability, security and performance requirements. We propose a three-step cancelable framework which is a hybrid approach for face template protection. This hybrid algorithm is based on random projection, a class distribution preserving transform and a hash function. Two publicly available face databases, namely FERET and CMU-PIE, are used for evaluating the template protection scheme. Experimental results show that the proposed method maintains good template discriminability, resulting in good recognition performance. A comparison with the recently developed random multispace quantization (RMQ) biohashing algorithm shows that our method outperforms the RMQ algorithm.
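
    A minimal sketch of a key-seeded random projection followed by quantization and hashing (inspired by, but not identical to, the paper's three-step hybrid scheme; all names and parameters are hypothetical, and a deployable scheme would add an error-tolerant encoding before hashing so that genuine intra-class variations still match):

      import hashlib
      import numpy as np

      def cancelable_template(face_features, user_key, n_proj=128):
          # Key-seeded random projection: reissuing the key revokes the template.
          rng = np.random.default_rng(user_key)
          P = rng.normal(size=(n_proj, face_features.size))
          bits = (P @ face_features > 0).astype(np.uint8)   # sign quantisation
          # Hashing prevents direct inversion of the stored value.
          return hashlib.sha256(bits.tobytes()).hexdigest()

      features = np.random.default_rng(3).normal(size=512)  # stand-in face features
      print(cancelable_template(features, user_key=42))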

  20. Influence of template fill in graphoepitaxy DSA

    NASA Astrophysics Data System (ADS)

    Doise, Jan; Bekaert, Joost; Chan, Boon Teik; Hong, SungEun; Lin, Guanyang; Gronheid, Roel

    2016-03-01

    Directed self-assembly (DSA) of block copolymers (BCP) is considered a promising patterning approach for the 7 nm node and beyond. Specifically, a grapho-epitaxy process using a cylindrical phase BCP may offer an efficient solution for patterning randomly distributed contact holes with sub-resolution pitches, such as found in via and cut mask levels. In any grapho-epitaxy process, the pattern density impacts the template fill (the local BCP thickness inside the template) and may cause defects due to over- or underfilling of the template. In order to tackle this issue thoroughly, the parameters that determine template fill and the influence of template fill on the resulting pattern should be investigated. In this work, using three process flow variations (with different template surface energy), template fill is experimentally characterized as a function of pattern density and film thickness. The impact of these parameters on template fill is highly dependent on the process flow, and thus on pre-pattern surface energy. Template fill has a considerable effect on the pattern transfer of the DSA contact holes into the underlying layer. Higher fill levels give rise to smaller contact holes and worse critical dimension uniformity. These results are important towards DSA-aware design and show that fill is a crucial parameter in grapho-epitaxy DSA.

  1. Ontology Matching Across Domains

    DTIC Science & Technology

    2010-05-01

    matching include GMO [1], Anchor-Prompt [2], and Similarity Flooding [3]. GMO is an iterative structural matcher, which uses RDF bipartite graphs to ... AFRL under contract # FA8750-09-C-0058. References: [1] Hu, W., Jian, N., Qu, Y., Wang, Y., "GMO: a graph matching for ontologies", in: Proceedings of

  2. DOE Matching Grant Program

    SciTech Connect

    Dr Marvin Adams

    2002-03-01

    OAK 270 - The DOE Matching Grant Program provided $50,000.00 to the Dept of N.E. at TAMU, matching a gift of $50,000.00 from TXU Electric. The $100,000.00 total was spent on scholarships, departmental labs, and computing network.

  3. Development of Simulants to Support Mixing Tests for High Level Waste and Low Activity Waste

    SciTech Connect

    EIBLING, RUSSELL E.

    2004-06-01

    The objectives of this study were to develop two different types of simulants to support vendor agitator design studies and mixing studies. The initial simulant development task was to develop rheologically bounding physical simulants, and the final portion was to develop a nominal chemical simulant designed to match, as closely as possible, the actual sludge from a tank. The physical simulants to be developed included lower and upper rheologically bounded versions of a pretreated low activity waste (LAW) physical simulant, a LAW melter feed physical simulant, a pretreated high level waste (HLW) physical simulant, and an HLW melter feed physical simulant. The nominal chemical simulant, hereafter referred to as the HLW Precipitated Hydroxide simulant, is designed to represent the chemical/physical composition of the actual washed and leached sludge sample. The objective was to produce a simulant which matches not only the chemical composition but also the physical properties of the actual waste sample. The HLW Precipitated Hydroxide simulant could then be used for mixing tests to validate mixing, homogeneity, and representative sampling and transferring issues. The HLW Precipitated Hydroxide simulant may also be used for integrated nonradioactive testing of the WTP prior to radioactive operation.

  4. Matched-pair classification

    SciTech Connect

    Theiler, James P

    2009-01-01

    Following an analogous distinction in statistical hypothesis testing, we investigate variants of machine learning where the training set comes in matched pairs. We demonstrate that even conventional classifiers can exhibit improved performance when the input data has a matched-pair structure. Online algorithms, in particular, converge quicker when the data is presented in pairs. In some scenarios (such as the weak signal detection problem), matched pairs can be generated from independent samples, with the effect of not only doubling the nominal size of the training set but also of providing the structure that leads to better learning. A family of 'dipole' algorithms is introduced that explicitly takes advantage of matched-pair structure in the input data and leads to further performance gains. Finally, we illustrate the application of matched-pair learning to chemical plume detection in hyperspectral imagery.
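
    A minimal sketch of exploiting matched-pair structure (differencing removes nuisance variation shared within a pair; this generic whitened mean-difference classifier is not the paper's dipole algorithms, and all data below are synthetic):

      import numpy as np

      def train_on_pairs(x_pos, x_neg):
          # Work on within-pair differences to remove shared nuisance variation,
          # then take a whitened mean-difference direction as the classifier.
          diffs = x_pos - x_neg
          cov = np.cov(diffs, rowvar=False) + 1e-6 * np.eye(diffs.shape[1])
          return np.linalg.solve(cov, diffs.mean(axis=0))

      def score(w, x):
          return x @ w                                   # higher = more signal-like

      rng = np.random.default_rng(4)
      background = rng.normal(size=(500, 10))            # nuisance shared within a pair
      signal = 0.3 * np.ones(10)
      x_pos = background + signal + 0.2 * rng.normal(size=(500, 10))
      x_neg = background + 0.2 * rng.normal(size=(500, 10))
      w = train_on_pairs(x_pos, x_neg)

      test_sig = signal + rng.normal(size=(200, 10))
      test_bkg = rng.normal(size=(200, 10))
      print(score(w, test_sig).mean(), score(w, test_bkg).mean())  # signal scores higher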

  5. Hybrid geometric-random template-placement algorithm for gravitational wave searches from compact binary coalescences

    NASA Astrophysics Data System (ADS)

    Roy, Soumen; Sengupta, Anand S.; Thakor, Nilay

    2017-05-01

    Astrophysical compact binary systems consisting of neutron stars and black holes are an important class of gravitational wave (GW) sources for advanced LIGO detectors. Accurate theoretical waveform models from the inspiral, merger, and ringdown phases of such systems are used to filter detector data under the template-based matched-filtering paradigm. An efficient grid over the parameter space at a fixed minimal match has a direct impact on the overall time taken by these searches. We present a new hybrid geometric-random template placement algorithm for signals described by parameters of two masses and one spin magnitude. Such template banks could potentially be used in GW searches from binary neutron stars and neutron star-black hole systems. The template placement is robust and is able to automatically accommodate curvature and boundary effects with no fine-tuning. We also compare these banks against vanilla stochastic template banks and show that while both are equally efficient in the fitting-factor sense, the bank sizes are ~25% larger in the stochastic method. Further, we show that the generation of the proposed hybrid banks can be sped up by nearly an order of magnitude over the stochastic bank. Generic issues related to optimal implementation are discussed in detail. These improvements are expected to directly reduce the computational cost of gravitational wave searches.
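
    For contrast, a toy version of the stochastic (random rejection) placement that the hybrid bank is compared against (a flat two-dimensional parameter space and Euclidean distance are assumptions standing in for the real mismatch metric):

      import numpy as np

      def stochastic_bank(bounds, min_dist, n_proposals=5000, seed=0):
          # Accept a random proposal only if it is farther than `min_dist` from
          # every template already in the bank.
          rng = np.random.default_rng(seed)
          lo = np.array([b[0] for b in bounds])
          hi = np.array([b[1] for b in bounds])
          bank = np.empty((0, len(bounds)))
          for _ in range(n_proposals):
              p = rng.uniform(lo, hi)
              if len(bank) == 0 or np.min(np.linalg.norm(bank - p, axis=1)) >= min_dist:
                  bank = np.vstack([bank, p])
          return bank

      bank = stochastic_bank(((0.0, 10.0), (0.0, 10.0)), min_dist=0.3)
      # Random placement leaves small coverage holes, which is why stochastic banks
      # end up larger than geometric ones once equal coverage is enforced.
      print(len(bank))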

  6. Defense High Level Waste Disposal Container System Description

    SciTech Connect

    2000-10-12

    The Defense High Level Waste Disposal Container System supports the confinement and isolation of waste within the Engineered Barrier System of the Monitored Geologic Repository (MGR). Disposal containers are loaded and sealed in the surface waste handling facilities, transferred to the underground through the accesses using a rail mounted transporter, and emplaced in emplacement drifts. The defense high level waste (HLW) disposal container provides long-term confinement of the commercial HLW and defense HLW (including immobilized plutonium waste forms (IPWF)) placed within disposable canisters, and withstands the loading, transfer, emplacement, and retrieval loads and environments. U.S. Department of Energy (DOE)-owned spent nuclear fuel (SNF) in disposable canisters may also be placed in a defense HLW disposal container along with commercial HLW waste forms, which is known as 'co-disposal'. The Defense High Level Waste Disposal Container System provides containment of waste for a designated period of time, and limits radionuclide release. The disposal container/waste package maintains the waste in a designated configuration, withstands maximum handling and rockfall loads, limits the individual canister temperatures after emplacement, resists corrosion in the expected handling and repository environments, and provides containment of waste in the event of an accident. Defense HLW disposal containers for HLW disposal will hold up to five HLW canisters. Defense HLW disposal containers for co-disposal will hold up to five HLW canisters arranged in a ring and one DOE SNF canister in the ring. Defense HLW disposal containers also will hold two Multi-Canister Overpacks (MCOs) and two HLW canisters in one disposal container. The disposal container will include outer and inner cylinders, outer and inner cylinder lids, and may include a canister guide. An exterior label will provide a means by which to identify the disposal container and its contents. Different materials

  7. Interventions for Individuals With High Levels of Needle Fear

    PubMed Central

    Noel, Melanie; Taddio, Anna; Antony, Martin M.; Asmundson, Gordon J.G.; Riddell, Rebecca Pillai; Chambers, Christine T.; Shah, Vibhuti

    2015-01-01

    Background: This systematic review evaluated the effectiveness of exposure-based psychological and physical interventions for the management of high levels of needle fear and/or phobia and fainting in children and adults. Design/Methods: A systematic review identified relevant randomized and quasi-randomized controlled trials of children, adults, or both with high levels of needle fear, including phobia (if not available, then populations with other specific phobias were included). Critically important outcomes were self-reported fear specific to the feared situation and stimulus (psychological interventions) or fainting (applied muscle tension). Data were pooled using standardized mean difference (SMD) or relative risk with 95% confidence intervals. Results: The systematic review included 11 trials. In vivo exposure-based therapy for children 7 years and above showed benefit on specific fear (n=234; SMD: −1.71 [95% CI: −2.72, −0.7]). In vivo exposure-based therapy with adults reduced fear of needles posttreatment (n=20; SMD: −1.09 [−2.04, −0.14]) but not at 1-year follow-up (n=20; SMD: −0.28 [−1.16, 0.6]). Compared with a single session, a benefit was observed for multiple sessions of exposure-based therapy posttreatment (n=93; SMD: −0.66 [−1.08, −0.24]) but not after 1 year (n=83; SMD: −0.37 [−0.87, 0.13]). Non in vivo (e.g., imaginal) exposure-based therapy in children reduced specific fear posttreatment (n=41; SMD: −0.88 [−1.7, −0.05]) and at 3 months (n=24; SMD: −0.89 [−1.73, −0.04]). Non in vivo exposure-based therapy for adults showed benefit on specific fear (n=68; SMD: −0.62 [−1.11, −0.14]) but not procedural fear (n=17; SMD: 0.18 [−0.87, 1.23]). Applied tension showed benefit on fainting posttreatment (n=20; SMD: −1.16 [−2.12, −0.19]) and after 1 year (n=20; SMD: −0.97 [−1.91, −0.03]) compared with exposure alone. Conclusions: Exposure-based psychological interventions and applied muscle tension show
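
    For reference, the standardized mean differences quoted above follow the usual pooled-standard-deviation definition (generic formula; the review's exact estimator, e.g. any small-sample correction, is not stated in the abstract):

      \[
      \mathrm{SMD} = \frac{\bar{x}_1 - \bar{x}_2}{s_{\mathrm{pooled}}},
      \qquad
      s_{\mathrm{pooled}} = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
      \]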

  8. Spent nuclear fuel project high-level information management plan

    SciTech Connect

    Main, G.C.

    1996-09-13

    This document presents the results of the Spent Nuclear Fuel Project (SNFP) Information Management Planning Project (IMPP), a short-term project that identified information management (IM) issues and opportunities within the SNFP and outlined a high-level plan to address them. This high-level plan for the SNFP IM focuses on specific examples from within the SNFP. The plan's recommendations can be characterized in several ways. Some recommendations address specific challenges that the SNFP faces. Others form the basis for making smooth transitions in several important IM areas. Still others identify areas where further study and planning are indicated. The team's knowledge of developments in the IM industry and at the Hanford Site was crucial in deciding where to recommend that the SNFP act and where they should wait for Site plans to be made. Because of the fast pace of the SNFP and demands on SNFP staff, input and interaction were primarily between the IMPP team and members of the SNFP Information Management Steering Committee (IMSC). Key input to the IMPP came from a workshop where IMSC members and their delegates developed a set of draft IM principles. These principles, described in Section 2, became the foundation for the recommendations found in the transition plan outlined in Section 5. Availability of SNFP staff was limited, so project documents were used as a basis for much of the work. The team, realizing that the status of the project and the environment are continually changing, tried to keep abreast of major developments since those documents were generated. To the extent possible, the information contained in this document is current as of the end of fiscal year (FY) 1995. Programs and organizations on the Hanford Site as a whole are trying to maximize their return on IM investments. They are coordinating IM activities and trying to leverage existing capabilities. However, the SNFP cannot just rely on Sitewide activities to meet its IM requirements.

  9. CEMENTITIOUS GROUT FOR CLOSING SRS HIGH LEVEL WASTE TANKS - #12315

    SciTech Connect

    Langton, C.; Burns, H.; Stefanko, D.

    2012-01-10

    In 1997, the first two United States Department of Energy (US DOE) high level waste tanks (Tanks 17-F and 20-F: Type IV, single shell tanks) were taken out of service (permanently closed) at the Savannah River Site (SRS). In 2012, the DOE plans to remove from service two additional Savannah River Site (SRS) Type IV high-level waste tanks, Tanks 18-F and 19-F. These tanks were constructed in the late 1950's and received low-heat waste and do not contain cooling coils. Operational closure of Tanks 18-F and 19-F is intended to be consistent with the applicable requirements of the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and will be performed in accordance with South Carolina Department of Health and Environmental Control (SCDHEC). The closure will physically stabilize two 4.92E+03 cubic meter (1.3E+06 gallon) carbon steel tanks and isolate and stabilize any residual contaminants left in the tanks. The closure will also fill, physically stabilize and isolate ancillary equipment abandoned in the tanks. A Performance Assessment (PA) has been developed to assess the long-term fate and transport of residual contamination in the environment resulting from the operational closure of the F-Area Tank Farm (FTF) waste tanks. Next-generation flowable, zero-bleed cementitious grouts were designed, tested, and specified for closing Tanks 18-F and 19-F and for filling the abandoned equipment. Fill requirements were developed for both the tank and equipment grouts. All grout formulations were required to be alkaline with a pH of 12.4 and a chemical reduction potential (Eh) of -200 to -400 to stabilize selected potential contaminants of concern. This was achieved by including Portland cement and Grade 100 slag in the mixes, respectively. Ingredients and proportions of cementitious reagents were selected and adjusted, respectively, to support the mass placement strategy developed by closure

  10. ATW system impact on high-level waste

    SciTech Connect

    Arthur, E.D.

    1992-12-01

    This report discusses the Accelerator Transmutation of Waste (ATW) concept which aims at destruction of key long-lived radionuclides in high-level nuclear waste (HLW), both fission products and actinides. This focus makes it different from most other transmutation concepts which concentrate primarily on actinide burning. The ATW system uses an accelerator-driven, sub-critical assembly to create an intense thermal neutron environment for radionuclide transmutation. This feature allows rapid transmutation under low-inventory system conditions, which in turn, has a direct impact on the size of chemical separations and materials handling components of the system. Inventories in ATW are factors of eight to thirty times smaller than reactor systems of equivalent thermal power. Chemical separations systems are relatively small in scale and can be optimized to achieve high decontamination factors and minimized waste streams. The low-inventory feature also directly impacts material amounts remaining in the system at its end of life. In addition to its low-inventory operation, the accelerator-driven neutron source features of ATW are key to providing a sufficient level of neutrons to allow transmutation of long-lived fission products.

  11. Why consider subseabed disposal of high-level nuclear waste

    SciTech Connect

    Heath, G. R.; Hollister, C. D.; Anderson, D. R.; Leinen, M.

    1980-01-01

    Large areas of the deep seabed warrant assessment as potential disposal sites for high-level radioactive waste because: (1) they are far from seismically and tectonically active lithospheric plate boundaries; (2) they are far from active or young volcanoes; (3) they contain thick layers of very uniform fine-grained clays; (4) they are devoid of natural resources likely to be exploited in the foreseeable future; (5) the geologic and oceanographic processes governing the deposition of sediments in such areas are well understood, and are remarkably insensitive to past oceanographic and climatic changes; and (6) sedimentary records of tens of millions of years of slow, uninterrupted deposition of fine-grained clay support predictions of the future stability of such sites. Data accumulated to date on the permeability, ion-retardation properties, and mechanical strength of pelagic clay sediments indicate that they can act as a primary barrier to the escape of buried nuclides. Work in progress should determine within the current decade whether subseabed disposal is environmentally acceptable and technically feasible, as well as address the legal, political and social issues raised by this new concept.

  12. The ALICE High Level Trigger: status and plans

    NASA Astrophysics Data System (ADS)

    Krzewicki, Mikolaj; Rohr, David; Gorbunov, Sergey; Breitner, Timo; Lehrbach, Johannes; Lindenstruth, Volker; Berzano, Dario

    2015-12-01

    The ALICE High Level Trigger (HLT) is an online reconstruction, triggering and data compression system used in the ALICE experiment at CERN. Unique among the LHC experiments, it extensively uses modern coprocessor technologies such as general-purpose graphics processing units (GPGPUs) and field-programmable gate arrays (FPGAs) in the data flow. Real-time data compression is performed using a cluster finder algorithm implemented on FPGA boards. These data, instead of raw clusters, are used in the subsequent processing and storage, resulting in a compression factor of around 4. Track finding is performed using a cellular automaton and a Kalman filter algorithm on GPGPU hardware, where both CUDA and OpenCL technologies can be used interchangeably. The ALICE upgrade requires further development of online concepts to include detector calibration and stronger data compression. The current HLT farm will be used as a test bed for online calibration and for both synchronous and asynchronous processing frameworks already during Run 2, before the upgrade. For opportunistic use as a Grid computing site during periods of inactivity of the experiment, a virtualisation-based setup is deployed.
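
    To illustrate the kind of fitting step a Kalman-filter-based tracker iterates over detector layers, here is a minimal one-dimensional predict/update cycle in Python. It is a generic textbook sketch under simplified assumptions, not the ALICE GPGPU implementation, and the measurement values are invented.

      # Minimal 1-D Kalman filter step: predict the state, then correct it with
      # a new measurement. Generic sketch only; not ALICE's CUDA/OpenCL code.
      def kalman_step(x, P, z, F=1.0, Q=0.01, H=1.0, R=0.1):
          """x, P: prior state estimate and its variance; z: new measurement."""
          x_pred = F * x                          # predict state to the next layer
          P_pred = F * P * F + Q                  # propagate the uncertainty
          K = P_pred * H / (H * P_pred * H + R)   # Kalman gain
          x_new = x_pred + K * (z - H * x_pred)   # correct with the measurement
          P_new = (1.0 - K * H) * P_pred
          return x_new, P_new

      x, P = 0.0, 1.0
      for z in [0.9, 1.1, 1.0, 0.95]:             # hypothetical cluster positions
          x, P = kalman_step(x, P, z)
      print(x, P)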

  13. High level bacterial contamination of secondary school students’ mobile phones

    PubMed Central

    Kõljalg, Siiri; Mändar, Rando; Sõber, Tiina; Rööp, Tiiu; Mändar, Reet

    2017-01-01

    Introduction: While contamination of mobile phones in the hospital has been found to be common in several studies, little information is available about bacterial abundance on phones used in the community. Our aim was to quantitatively determine the bacterial contamination of secondary school students’ mobile phones. Methods: Altogether 27 mobile phones were studied. The contact plate method and microbial identification using a MALDI-TOF mass spectrometer were used for culture studies. Quantitative PCR reactions for detection of universal 16S rRNA, Enterococcus faecalis 16S rRNA and Escherichia coli allantoin permease were performed, and the presence of tetracycline (tetA, tetB, tetM), erythromycin (ermB) and sulphonamide (sul1) resistance genes was assessed. Results: We found a high median bacterial count on secondary school students’ mobile phones (10.5 CFU/cm2) and a median of 17,032 bacterial 16S rRNA gene copies per phone. Potentially pathogenic microbes (Staphylococcus aureus, Acinetobacter spp., Pseudomonas spp., Bacillus cereus and Neisseria flavescens) were found among the dominant microbes more often on phones with a higher proportion of E. faecalis in total bacterial 16S rRNA. No differences in contamination level or dominant bacterial species were found between phone owners’ genders or between phone types (touch screen/keypad). No antibiotic resistance genes were detected on mobile phone surfaces. Conclusion: Quantitative study methods revealed high-level bacterial contamination of secondary school students’ mobile phones. PMID:28626737

  14. High-level fluorescence labeling of gram-positive pathogens.

    PubMed

    Aymanns, Simone; Mauerer, Stefanie; van Zandbergen, Ger; Wolz, Christiane; Spellerberg, Barbara

    2011-01-01

    Fluorescence labeling of bacterial pathogens has a broad range of interesting applications including the observation of living bacteria within host cells. We constructed a novel vector based on the E. coli streptococcal shuttle plasmid pAT28 that can propagate in numerous bacterial species from different genera. The plasmid harbors a promoterless copy of the green fluorescent protein variant gene egfp under the control of the CAMP-factor gene (cfb) promoter of Streptococcus agalactiae and was designated pBSU101. Upon transfer of the plasmid into streptococci, the bacteria show a distinct and easily detectable fluorescence using a standard fluorescence microscope, and quantification by FACS analysis demonstrated values 10-50 times higher than the respective controls. To assess the suitability of the construct for high-efficiency fluorescence labeling in different gram-positive pathogens, numerous species were transformed. We successfully labeled Streptococcus pyogenes, Streptococcus agalactiae, Streptococcus dysgalactiae subsp. equisimilis, Enterococcus faecalis, Enterococcus faecium, Streptococcus mutans, Streptococcus anginosus and Staphylococcus aureus strains utilizing the EGFP reporter plasmid pBSU101. In all of these species the presence of the cfb promoter construct resulted in high-level EGFP expression that could be further increased by growing the streptococcal and enterococcal cultures under high-oxygen conditions through continuous aeration.

  15. Wind resource quality affected by high levels of renewables

    DOE PAGES

    Diakov, Victor

    2015-06-17

    For solar photovoltaic (PV) and wind resources, the capacity factor is an important parameter describing the quality of the resource. As the share of variable renewable resources (such as PV and wind) on the electric system is increasing, so does curtailment (and the fraction of time when it cannot be avoided). At high levels of renewable generation, curtailments effectively change the practical measure of resource quality from capacity factor to the incremental capacity factor. The latter accounts only for generation during hours of no curtailment and is directly connected with the marginal capital cost of renewable generators for a given level of renewable generation during the year. The Western U.S. wind generation is analyzed hourly for a system with 75% of annual generation from wind, and it is found that the value for the system of resources with equal capacity factors can vary by a factor of 2, which highlights the importance of using the incremental capacity factor instead. Finally, the effect is expected to be more pronounced in smaller geographic areas (or when transmission limitations are imposed) and less pronounced at lower levels of renewable energy in the system with less curtailment.
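
    As a numerical illustration of the distinction drawn here, the sketch below contrasts the ordinary capacity factor with an incremental capacity factor that credits only the hours without curtailment. The 8760-hour series, the 25% curtailed-hour share and the 100 MW nameplate value are invented assumptions for the example, not data from the study.

      import numpy as np

      def capacity_factors(potential_mw, curtailed, capacity_mw):
          """potential_mw: hourly generation the plant could deliver (MW);
          curtailed: boolean array, True where the system curtails;
          capacity_mw: nameplate capacity (MW)."""
          hours = len(potential_mw)
          cf = potential_mw.sum() / (capacity_mw * hours)
          # incremental CF: only generation in hours with no curtailment counts
          icf = potential_mw[~curtailed].sum() / (capacity_mw * hours)
          return cf, icf

      rng = np.random.default_rng(0)
      gen = rng.uniform(0, 100, 8760)        # hypothetical hourly output, MW
      curt = rng.random(8760) < 0.25         # assume 25% of hours are curtailed
      print(capacity_factors(gen, curt, 100.0))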

  16. High-level simulation of JWST event-driven operations

    NASA Astrophysics Data System (ADS)

    Henry, R.; Kinzel, W.

    2012-09-01

    The James Webb Space Telescope (JWST) has an event-driven architecture: an onboard Observation Plan Executive (OPE) executes an Observation Plan (OP) consisting of a sequence of observing units (visits). During normal operations, ground action to update the OP is only expected to be necessary about once a week. This architecture is designed to tolerate uncertainty in visit duration, and occasional visit failures due to inability to acquire guide stars, without creating gaps in the observing timeline. The operations concept is complicated by the need for occasional scheduling of time-critical science and engineering visits that cannot tolerate much slippage without inducing gaps, and also by onboard momentum management. A prototype Python tool called the JWST Observation Plan Execution Simulator (JOPES) has recently been developed to simulate OP execution at a high level and analyze the response of the Observatory and OPE to both nominal and contingency scenarios. Incorporating both deterministic and stochastic behavior, JOPES has the potential to be a powerful tool for several purposes: requirements analysis, system verification, systems engineering studies, and test data generation. It has already been successfully applied to a study of overhead estimation bias: whether to use conservative or average-case estimates for timing components that are inherently uncertain, such as those involving guide-star acquisition. JOPES is being enhanced to support interfaces to the operational Proposal Planning Subsystem (PPS) now being developed, with the objective of "closing the loop" between testing and simulation by feeding simulated event logs back into the PPS.
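
    To make the event-driven idea concrete, here is a deliberately simplified Python sketch of executing a visit list with stochastic durations and occasional guide-star failures. The visit list, the failure probability and the +/-10% duration spread are invented assumptions for illustration; this is not the JOPES tool itself.

      import random

      def simulate_plan(visits, seed=1):
          """visits: list of dicts with 'name', 'expected_s' (nominal duration in
          seconds) and 'p_guide_fail' (chance that guide-star acquisition fails)."""
          random.seed(seed)
          t, log = 0.0, []
          for v in visits:
              if random.random() < v["p_guide_fail"]:
                  # event-driven behaviour: skip the visit and move on, no gap
                  log.append((t, v["name"], "skipped: guide-star failure"))
                  continue
              dur = v["expected_s"] * random.uniform(0.9, 1.1)  # stochastic duration
              log.append((t, v["name"], "executed for %.0f s" % dur))
              t += dur
          return t, log

      plan = [{"name": "visit-%d" % i, "expected_s": 3000, "p_guide_fail": 0.03}
              for i in range(5)]
      total, events = simulate_plan(plan)
      for e in events:
          print(e)
      print("plan completed at t = %.0f s" % total)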

  17. Commissioning and first experiences of the ALICE High Level Trigger

    NASA Astrophysics Data System (ADS)

    Steinbeck, Timm M.; Alice Hlt Collaboration

    2010-04-01

    For the ALICE heavy-ion experiment a large computing cluster will be used to perform the last triggering stages in the High Level Trigger (HLT). For the first year of operation the cluster consisted of about 100 multi-processing nodes with 4 or 8 CPU cores each, to be increased to more than 1000 nodes for the coming years of operation. During the commissioning phases of the detector, the preparations for first LHC beam, as well as during the periods of first LHC beam, the HLT has already been used extensively to reconstruct, compress, and display data from the different detectors. For example the HLT has been used to compress Silicon Drift Detector (SDD) data by a factor of 15, losslessly, on the fly at a rate of more than 800 Hz. For ALICE's Time Projection Chamber (TPC) detector the HLT has been used to reconstruct tracks online and show the reconstructed tracks in an online event display. The event display can also show online reconstructed data from the Dimuon and Photon Spectrometer (PHOS) detectors. For the latter detector a first selection mechanism has also been put in place to forward to the online display only those events in which data passed through the PHOS detector. In this contribution we will present experiences and results from these commissioning phases.

  18. High Level Trigger Applications for the ALICE Experiment

    NASA Astrophysics Data System (ADS)

    Richter, M.; Aamodt, K.; Alt, T.; Bablok, S.; Cheshkov, C.; Hille, P. T.; Lindenstruth, V.; Ovrebekk, G.; Ploskon, M.; Popescu, S.; Rohrich, D.; Steinbeck, T. M.; Thader, J.

    2008-02-01

    For the ALICE experiment at the Large Hadron Collider (LHC) at CERN/Geneva, a high level trigger system (HLT) for online event selection and data compression has been developed and a computing cluster of several hundred dual-processor nodes is being installed. A major system integration test was carried out during the commissioning of the time projection chamber (TPC), where the HLT also provides a monitoring system. All major parts, including a small computing cluster, hardware input devices, the online data transportation framework, and the HLT analysis, could be tested successfully. A common interface for HLT processing components has been designed to run the components from either the online or offline analysis framework without changes. The interface adapts the component to the needs of online processing and allows the developer to use the offline framework for easy development, debugging, and benchmarking. Following this approach, results can be compared directly. For the upcoming commissioning of the whole detector, the HLT is currently being prepared to run online data analysis for the main detectors, e.g., the inner tracking system (ITS), the TPC, and the transition radiation detector (TRD). The HLT processing capability is indispensable for the photon spectrometer (PHOS), where the online pulse shape analysis reduces the data volume by a factor of 20. A common monitoring framework is in place and detector calibration algorithms have been ported to the HLT. The paper briefly describes the architecture of the HLT system. It focuses on typical applications and component development.

  19. Hemipelvectomy: high-level amputation surgery and prosthetic rehabilitation.

    PubMed

    Houdek, Matthew T; Kralovec, Michael E; Andrews, Karen L

    2014-07-01

    The hemipelvectomy, most commonly performed for pelvic tumor resection, is one of the most technically demanding and invasive surgical procedures performed today. Adequate soft tissue coverage and wound complications after hemipelvectomy are important considerations. Rehabilitation after hemipelvectomy is optimally managed by a multidisciplinary integrated team. Understanding the functional outcomes for this population assists the rehabilitation team to counsel patients, plan goals, and determine discharge needs. The most important rehabilitation goal is the optimal restoration of the patient's functional independence. Factors such as age, sex, etiology, level of amputation, and general health play important roles in determining prosthetic use. The three main criteria for successful prosthetic rehabilitation of patients with high-level amputation are comfort, function, and cosmesis. Recent advances in hip and knee joints have contributed to increased function. Prosthetic use after hemipelvectomy improves balance and decreases the need for a gait aid. Using a prosthesis helps maintain muscle strength and tone, cardiovascular health, and functional mobility. With new advances in prosthetic components, patients are choosing to use their prostheses for primary mobility.

  20. High Level Waste Feed Certification in Hanford Double Shell Tanks

    SciTech Connect

    Thien, Micheal G.; Wells, Beric E.; Adamson, Duane J.

    2010-03-01

    The ability to effectively mix, sample, certify, and deliver consistent batches of High Level Waste (HLW) feed from the Hanford Double Shell Tanks (DST) to the Waste Treatment and Immobilization Plant (WTP) presents a significant mission risk with potential to impact mission length and the quantity of HLW glass produced. DOE’s River Protection Project (RPP) mission modeling and WTP facility modeling assume that individual 3785 cubic meter (1 million gallon) HLW feed tanks are homogeneously mixed, representatively sampled, and consistently delivered to the WTP. It has been demonstrated that homogeneous mixing of HLW sludge in Hanford DSTs is not likely achievable with the baseline design, thereby making representative sampling and consistent feed delivery more difficult. Inconsistent feed to the WTP could cause additional batch-to-batch operational adjustments that reduce operating efficiency and have the potential to increase the overall mission length. The Hanford mixing and sampling demonstration program will identify DST mixing performance capability, will evaluate representative sampling techniques, and will estimate feed batch consistency. An evaluation of demonstration program results will identify potential mission improvement considerations that will help ensure successful mission completion. This paper will discuss the history, progress, and future activities that will define and mitigate the mission risk.

  2. Ultrafilter Conditions for High Level Waste Sludge Processing

    SciTech Connect

    Geeting, John GH; Hallen, Richard T.; Peterson, Reid A.

    2006-08-28

    An evaluation of the optimal filtration conditions was performed based on test data obtained from filtration of a High Level Waste Sludge sample from the Hanford tank farms. This evaluation was performed using the anticipated configuration for the Waste Treatment Plant at the Hanford site. Testing was performed to identify the optimal pressure drop and cross flow velocity for filtration at both high and low solids loading. However, this analysis indicates that the actual filtration rate achieved is relatively insensitive to these conditions under anticipated operating conditions. The maximum filter flux was obtained by adjusting the system control valve pressure from 400 to 650 kPa while the filter feed concentration increased from 5 to 20 wt%. However, operating the system with a constant control valve pressure drop of 500 kPa resulted in a less than 1% reduction in the average filter flux. Also note that allowing the control valve pressure to swing as much as +/- 20% resulted in less than a 5% decrease in filter flux.

  3. Vestibular contributions to high-level sensorimotor functions.

    PubMed

    Medendorp, W Pieter; Selen, Luc J P

    2017-02-02

    The vestibular system, which detects motion and orientation of the head in space, is known to be important in controlling gaze to stabilize vision, to ensure postural stability and to provide our sense of self-motion. While the brain's computations underlying these functions are extensively studied, the role of the vestibular system in higher level sensorimotor functions is less clear. This review covers new research on the vestibular influence on perceptual judgments, motor decisions, and the ability to learn multiple motor actions. Guided by concepts such as optimization, inference, estimation and control, we focus on how the brain determines causal relationships between memorized and visual representations in the updating of visual space, and how vestibular, visual and efferent motor information are integrated in the estimation of body motion. We also discuss evidence that these computations involve multiple coordinate representations, some of which can be probed in parietal cortex using neuronal oscillations derived from EEG. In addition, we describe work on decision making during self-motion, showing a clear modulation of bottom-up acceleration signals on decisions in the saccadic system. Finally, we consider the importance of vestibular signals as contextual cues in motor learning and recall. Taken together, these results emphasize the impact of vestibular information on high-level sensorimotor functions, and identify future directions for theoretical, behavioral, and neurophysiological investigations.

  4. Pupil responses to high-level image content.

    PubMed

    Naber, Marnix; Nakayama, Ken

    2013-05-17

    The link between arousal and pupil dilation is well studied, but it is less known that other cognitive processes can trigger pupil responses. Here we present evidence that pupil responses can be induced by high-level scene processing, independent of changes in low-level features or arousal. In Experiment 1, we recorded changes in pupil diameter of observers while they viewed a variety of natural scenes with or without a sun that were presented either upright or inverted. Image inversion had the strongest effect on the pupil responses. The pupil constricted more to the onset of upright images as compared to inverted images. Furthermore, the amplitudes of pupil constrictions to viewing images containing a sun were larger relative to control images. In Experiment 2, we presented cartoon versions of upright and inverted pictures that included either a sun or a moon. The image backgrounds were kept identical across conditions. Similar to Experiment 1, upright images triggered pupil constrictions with larger amplitudes than inverted images and images of the sun evoked greater pupil contraction than images of the moon. We suggest that the modulations of pupil responses were due to higher-level interpretations of image content.

  5. Potential for erosion corrosion of SRS high level waste tanks

    SciTech Connect

    Zapp, P.E.

    1994-01-01

    SRS high-level radioactive waste tanks will not experience erosion corrosion to any significant degree during slurry pump operations. Erosion corrosion in carbon steel structures at reported pump discharge velocities is dominated by electrochemical (corrosion) processes. Interruption of those processes, as by the addition of corrosion inhibitors, sharply reduces the rate of metal loss from erosion corrosion. The well-inhibited SRS waste tanks have a near-zero general corrosion rate, and therefore will be essentially immune to erosion corrosion. The experimental data on carbon steel erosion corrosion most relevant to SRS operations was obtained at the Hanford Site on simulated Purex waste. A metal loss rate of 2.4 mils per year was measured at a temperature of 102 C and a slurry velocity comparable to calculated SRS slurry velocities on ground specimens of the same carbon steel used in SRS waste tanks. Based on these data and the much lower expected temperatures, the metal loss rate of SRS tanks under waste removal and processing conditions should be insignificant, i.e. less than 1 mil per year.

  6. Control of high level radioactive waste-glass melters

    SciTech Connect

    Bickford, D.F.; Choi, A.S.

    1991-01-01

    Slurry Fed Melters (SFM) are being developed in the United States, Europe and Japan for the conversion of high-level radioactive waste to borosilicate glass for permanent disposal. The high transition metal, noble metal, nitrate, organic, and sulfate contents of these wastes lead to unique melter redox control requirements. Pilot waste-glass melter operations have indicated the possibility of nickel sulfide or noble-metal fission-product accumulation on melter floors, which can lead to distortion of electric heating patterns and decrease melter life. Sulfide formation is prevented by control of the redox chemistry of the melter feed. The redox state of waste-glass melters is determined by the balance between the reducing potential of organic compounds in the feed and the oxidizing potential of gases above the melt, and of nitrates and polyvalent elements in the waste. Semiquantitative models predicting limits on organic content have been developed based on crucible testing. Computerized thermodynamic computations are being developed to predict the sequence and products of redox reactions and to assess process variations. Continuous melter test results have been compared to improved computer staged-thermodynamic models of redox behavior. Feed chemistry controls to prevent sulfide formation and moderate noble metal accumulation are discussed. 17 refs., 3 figs.

  7. Identification of areas with high levels of untreated dental caries.

    PubMed

    Ellwood, R P; O'Mullane, D M

    1996-02-01

    In order to examine the geographical variation of dental health within 10 county districts in North Wales, 3538 children were examined. The associations between three demographic indicators, based on the 1981 OPCS census, and dental health outcomes were assessed for electoral wards within the county districts. The Townsend and Jarman indices were the first two indicators employed and the third was based on a mathematical model representing the variation in the mean number of untreated decayed surfaces per person for the wards. This model was developed using the children examined in the five most westerly county districts. Using the data derived from the five most easterly county districts, the three indicators were assessed. All three showed strong correlations (r > or = 0.88) with dental health. These results indicate that measures of dental health based on large administrative units may obscure variation within them. It is concluded that geographical methods of this type may be useful for targeting dental resources at small areas with high levels of need.

  8. High-Level Performance Modeling of SAR Systems

    NASA Technical Reports Server (NTRS)

    Chen, Curtis

    2006-01-01

    SAUSAGE (Still Another Utility for SAR Analysis that's General and Extensible) is a computer program for modeling the performance of synthetic-aperture radar (SAR) or interferometric synthetic-aperture radar (InSAR or IFSAR) systems. The user is assumed to be familiar with the basic principles of SAR imaging and interferometry. Given design parameters (e.g., altitude, power, and bandwidth) that characterize a radar system, the software predicts various performance metrics (e.g., signal-to-noise ratio and resolution). SAUSAGE is intended to be a general software tool for quick, high-level evaluation of radar designs; it is not meant to capture all the subtleties, nuances, and particulars of specific systems. SAUSAGE was written to facilitate the exploration of engineering tradeoffs within the multidimensional space of design parameters. Typically, this space is examined through an iterative process of adjusting the values of the design parameters and examining the effects of the adjustments on the overall performance of the system at each iteration. The software is designed to be modular and extensible to enable consideration of a variety of operating modes and antenna beam patterns, including, for example, strip-map and spotlight SAR acquisitions, polarimetry, burst modes, and squinted geometries.
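
    As a sketch of the kind of high-level prediction such a tool makes from design parameters, the snippet below evaluates a generic point-target radar-equation signal-to-noise ratio. It is not SAUSAGE's internal model, and every parameter value shown is an illustrative assumption.

      import math

      def radar_snr_db(p_tx_w, gain_db, wavelength_m, rcs_m2, range_m,
                       bandwidth_hz, noise_temp_k=290.0, losses_db=3.0):
          """Single-pulse point-target SNR from the monostatic radar equation."""
          k_boltzmann = 1.380649e-23
          g = 10 ** (gain_db / 10)
          loss = 10 ** (losses_db / 10)
          snr = (p_tx_w * g**2 * wavelength_m**2 * rcs_m2) / (
              (4 * math.pi) ** 3 * range_m**4
              * k_boltzmann * noise_temp_k * bandwidth_hz * loss)
          return 10 * math.log10(snr)

      # hypothetical L-band spaceborne case; coherent SAR processing gain would
      # be added on top of this single-pulse figure
      print(radar_snr_db(p_tx_w=3000, gain_db=40, wavelength_m=0.24,
                         rcs_m2=10.0, range_m=700e3, bandwidth_hz=80e6))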

  9. Defense High-Level Waste Leaching Mechanisms Program. Final report

    SciTech Connect

    Mendel, J.E.

    1984-08-01

    The Defense High-Level Waste Leaching Mechanisms Program brought six major US laboratories together for three years of cooperative research. The participants reached a consensus that solubility of the leached glass species, particularly solubility in the altered surface layer, is the dominant factor controlling the leaching behavior of defense waste glass in a system in which the flow of leachant is constrained, as it will be in a deep geologic repository. Also, once the surface of waste glass is contacted by ground water, the kinetics of establishing solubility control are relatively rapid. The concentrations of leached species reach saturation, or steady-state concentrations, within a few months to a year at 70 to 90°C. Thus, reaction kinetics, which were the main subject of earlier leaching mechanisms studies, are now shown to assume much less importance. The dominance of solubility means that the leach rate is, in fact, directly proportional to ground water flow rate. Doubling the flow rate doubles the effective leach rate. This relationship is expected to obtain in most, if not all, repository situations.
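
    The proportionality stated above follows directly from solubility control: if the water leaving the waste is saturated, the mass release rate is the saturation concentration times the flow rate. The tiny sketch below simply restates that relation with invented numbers; it is not the program's release model.

      # Solubility-controlled release: rate = saturation concentration * flow rate.
      def release_rate_g_per_yr(c_sat_g_per_m3, flow_m3_per_yr):
          return c_sat_g_per_m3 * flow_m3_per_yr

      base = release_rate_g_per_yr(c_sat_g_per_m3=25.0, flow_m3_per_yr=1.0)
      doubled = release_rate_g_per_yr(25.0, 2.0)
      print(doubled / base)   # doubling the flow rate doubles the leach rate -> 2.0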

  10. The GRAVITY instrument software/high-level software

    NASA Astrophysics Data System (ADS)

    Burtscher, Leonard; Wieprecht, Ekkehard; Ott, Thomas; Kok, Yitping; Yazici, Senol; Anugu, Narsireddy; Dembet, Roderick; Fedou, Pierre; Lacour, Sylvestre; Ott, Jürgen; Paumard, Thibaut; Lapeyrere, Vincent; Kervella, Pierre; Abuter, Roberto; Pozna, Eszter; Eisenhauer, Frank; Blind, Nicolas; Genzel, Reinhard; Gillessen, Stefan; Hans, Oliver; Haug, Marcus; Haussmann, Frank; Kellner, Stefan; Lippa, Magdalena; Pfuhl, Oliver; Sturm, Eckhard; Weber, Johannes; Amorim, Antonio; Brandner, Wolfgang; Rousselet-Perraut, Karine; Perrin, Guy S.; Straubmeier, Christian; Schöller, Markus

    2014-07-01

    GRAVITY is the four-beam, near-infrared, AO-assisted, fringe-tracking, astrometric and imaging instrument for the Very Large Telescope Interferometer (VLTI). It requires the development of one of the most complex instrument software systems ever built for an ESO instrument. Apart from its many interfaces and interdependencies, one of the most challenging aspects is the overall performance and stability of this complex system. The three infrared detectors and the fast reflective memory network (RMN) recorder contribute a total data rate of up to 20 MiB/s, accumulating to a maximum of 250 GiB of data per night. The detectors, the two instrument Local Control Units (LCUs) as well as the five LCUs running applications under the TAC (Tools for Advanced Control) architecture are interconnected with fast Ethernet, RMN fibers and dedicated fiber connections as well as signals for time synchronization. Here we give a simplified overview of all subsystems of GRAVITY and their interfaces and discuss two examples of high-level applications during observations: the acquisition procedure and the gathering and merging of data into the final FITS file.

  11. The LHCb Data Acquisition and High Level Trigger Processing Architecture

    NASA Astrophysics Data System (ADS)

    Frank, M.; Gaspar, C.; Jost, B.; Neufeld, N.

    2015-12-01

    The LHCb experiment at the LHC accelerator at CERN collects collisions of particle bunches at 40 MHz. After a first level of hardware trigger with an output rate of 1 MHz, the physically interesting collisions are selected by running dedicated trigger algorithms in the High Level Trigger (HLT) computing farm. This farm consists of up to roughly 25000 CPU cores in roughly 1750 physical nodes, each equipped with up to 4 TB of local storage space. This work describes the LHCb online system with an emphasis on the developments implemented during the current long shutdown (LS1). We will elaborate on the architecture changes that treble the available CPU power of the HLT farm and on the technicalities of determining and verifying the precise calibration and alignment constants which are fed to the HLT event selection procedure. We will describe how the constants are fed into a two-stage HLT event selection facility that makes extensive use of the local disk buffering capabilities on the worker nodes. With the installed disk buffers, the CPU resources can be used during periods of up to ten days without beams. Such periods in the past accounted for more than 70% of the total time.
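
    A rough back-of-envelope check of the quoted buffering capability, using the node count and per-node disk figures from the text together with an assumed first-stage output rate and event size (both invented for the example, not LHCb numbers):

      nodes = 1750
      disk_per_node_tb = 4.0            # per-node local storage, from the text
      event_size_kb = 70.0              # assumed average event size
      hlt1_output_khz = 100.0           # assumed first-stage output rate

      buffer_bytes = nodes * disk_per_node_tb * 1e12
      rate_bytes_per_s = hlt1_output_khz * 1e3 * event_size_kb * 1e3
      days = buffer_bytes / rate_bytes_per_s / 86400
      print("buffer absorbs roughly %.1f days of continuous filling" % days)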

  12. Cytotoxicity assessment of residual high-level disinfectants.

    PubMed

    Ryu, Mizuyuki; Kobayashi, Toshihiro; Kawamukai, Emiko; Quan, Glenlelyn; Furuta, Taro

    2013-01-01

    Some studies show the uptake of disinfectants on medical devices but no studies on their cytotoxicity have been reported. This study aimed to assess that cytotoxicity in a 3-dimensional culture system using HeLa cells grown in matrices composed of collagen. Plastic materials were soaked in the use solutions of the widely used high-level disinfectants, glutaraldehyde (GA), ortho-phthalaldehyde (OPA) and peracetic acid (PAA). After being rinsed, they were allowed to dry and were embedded into the cell medium to investigate the cytotoxicity of the residual disinfectants. Cytotoxicity was observed with the polyvinyl chloride, polyurethane and silicon tubes soaked in GA and OPA, indicating that both disinfectants were absorbed in the test pieces, whereas for PAA, none was observed. As for the polytetrafluoroethylene (PTFE) tubes, no disinfectant displayed cytotoxicity. GA and OPA are primary irritants, having a potential to cause anaphylaxis and other forms of allergic reactions. There should be consideration not only about the toxicity of the residual disinfectant from poor rinsing, but also about the toxicity that would result from the disinfectants that were absorbed and consequently released from the medical devices or materials.

  13. The IFR pyroprocessing for high-level waste minimization

    SciTech Connect

    Laidler, J.J. )

    1993-01-01

    The process developed for the recycle of integral fast reactor (IFR) spent fuel utilizes a combination of pyrometallurgical and electrochemical methods and has been termed pyroprocessing. The process has been operated at full scale with simulated spent fuel using nonradioactive fission product elements. A comprehensive demonstration of the pyroprocessing of irradiated IFR fuel will begin later this year. Pyroprocessing involves the anodic dissolution of all the constituent elements of the IFR spent fuel and controlled electrotransport (electrorefining) to separate the actinide elements from the fission products present in the spent fuel. The process can be applied to the processing of spent light water reactor (LWR) fuel as well, requiring only the addition of a reduction step to convert the LWR oxide fuel to metallic form and a separation step to separate uranium from the transuranic (TRU) elements. The TRU elements are then recovered by electrorefining in the same manner as the actinides from IFR fuel. The high-level wastes arising from pyroprocessing are virtually free of actinides, and the volume of the wastes is minimized by the intrinsic characteristics of the processing method.

  14. Template-guided vs. non-guided drilling in site preparation of dental implants.

    PubMed

    Scherer, Uta; Stoetzer, Marcus; Ruecker, Martin; Gellrich, Nils-Claudius; von See, Constantin

    2015-07-01

    Clinical success of oral implants is related to primary stability and osseointegration. These parameters are associated with delicate surgical techniques. We herein studied whether template-guided drilling has a significant influence on drillhole diameter and accuracy in an in vitro model. Fresh cadaveric porcine mandibles were used for drilling experiments in four experimental groups. Each group consisted of three operators, comparing guide templates for drilling with a free-handed procedure. Operators without surgical knowledge were grouped together, contrasting with highly experienced oral surgeons in the other groups. A total of 180 drilling actions were performed, and diameters were recorded at multiple depth levels with a precision measuring instrument. The template-guided drilling procedure improved accuracy at a highly significant level compared with free-handed drilling (p ≤ 0.001). The inaccuracy of free-handed drilling became more pronounced with increasing measurement depth. The uniformity of template-guided drillholes was significantly higher than that of unguided drilling performed by highly experienced oral surgeons (p ≤ 0.001). Template-guided drilling thus leads to significantly enhanced accuracy compared with free-handed drilling, irrespective of the clinical experience level of the operator, and to a more predictable clinical diameter. Any set of instruments has to be carefully chosen to match the specific implant system. The current in vitro study indicates an improvement in implant bed preparation but needs to be confirmed in clinical studies.

  15. Computer-assisted virtual planning and surgical template fabrication for frontoorbital advancement.

    PubMed

    Soleman, Jehuda; Thieringer, Florian; Beinemann, Joerg; Kunz, Christoph; Guzman, Raphael

    2015-05-01

    OBJECT: The authors describe a novel technique using computer-assisted design (CAD) and computer-assisted manufacturing (CAM) for the fabrication of individualized 3D printed surgical templates for frontoorbital advancement surgery. METHODS: Two patients underwent frontoorbital advancement surgery for unilateral coronal synostosis. Virtual surgical planning (SurgiCase-CMF, version 5.0, Materialise) was done by virtual mirroring techniques and superposition of an age-matched normative 3D pediatric skull model. Based on these measurements, surgical templates were fabricated using a 3D printer. Bifrontal craniotomy and the osteotomies for the orbital bandeau were performed based on the sterilized 3D templates. The remodeling was then done by placing the bone plates within the negative 3D templates and fixing them using absorbable poly-dl-lactic acid plates and screws. RESULTS: Both patients exhibited a satisfying head shape postoperatively and at follow-up. No surgery-related complications occurred. The cutting and positioning of the 3D surgical templates proved to be very accurate, easy to use, reproducible, and efficient. CONCLUSIONS: Computer-assisted virtual planning and 3D template fabrication for frontoorbital advancement surgery leads to reconstructions based on standardized measurements, precludes subjective remodeling, and appears to be overall safe and feasible. A larger series of patients with long-term follow-up is needed for further evaluation of this novel technique.

  16. Latent fingerprint matching.

    PubMed

    Jain, Anil K; Feng, Jianjiang

    2011-01-01

    Latent fingerprint identification is of critical importance to law enforcement agencies in identifying suspects: Latent fingerprints are inadvertent impressions left by fingers on surfaces of objects. While tremendous progress has been made in plain and rolled fingerprint matching, latent fingerprint matching continues to be a difficult problem. Poor quality of ridge impressions, small finger area, and large nonlinear distortion are the main difficulties in latent fingerprint matching compared to plain or rolled fingerprint matching. We propose a system for matching latent fingerprints found at crime scenes to rolled fingerprints enrolled in law enforcement databases. In addition to minutiae, we also use extended features, including singularity, ridge quality map, ridge flow map, ridge wavelength map, and skeleton. We tested our system by matching 258 latents in the NIST SD27 database against a background database of 29,257 rolled fingerprints obtained by combining the NIST SD4, SD14, and SD27 databases. The minutiae-based baseline rank-1 identification rate of 34.9 percent was improved to 74 percent when extended features were used. In order to evaluate the relative importance of each extended feature, these features were incrementally used in the order of their cost in marking by latent experts. The experimental results indicate that singularity, ridge quality map, and ridge flow map are the most effective features in improving the matching accuracy.

  17. Learning graph matching.

    PubMed

    Caetano, Tibério S; McAuley, Julian J; Cheng, Li; Le, Quoc V; Smola, Alex J

    2009-06-01

    As a fundamental problem in pattern recognition, graph matching has applications in a variety of fields, from computer vision to computational biology. In graph matching, patterns are modeled as graphs and pattern recognition amounts to finding a correspondence between the nodes of different graphs. Many formulations of this problem can be cast in general as a quadratic assignment problem, where a linear term in the objective function encodes node compatibility and a quadratic term encodes edge compatibility. The main research focus in this area has been the design of efficient algorithms for approximately solving the quadratic assignment problem, since it is NP-hard. In this paper we turn our attention to a different question: how to estimate compatibility functions such that the solution of the resulting graph matching problem best matches the expected solution that a human would manually provide. We present a method for learning graph matching: the training examples are pairs of graphs and the 'labels' are matches between them. Our experimental results reveal that learning can substantially improve the performance of standard graph matching algorithms. In particular, we find that simple linear assignment with such a learning scheme outperforms Graduated Assignment with bistochastic normalisation, a state-of-the-art quadratic assignment relaxation algorithm.
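
    For readers who want the objective written out, a common way to state the quadratic assignment formulation referred to here is the following (generic notation, not necessarily the paper's own symbols):

      \max_{X \in \{0,1\}^{n \times n'}} \;
          \sum_{i,i'} c_{ii'}\, X_{ii'}
        + \sum_{i,i',j,j'} d_{ii'jj'}\, X_{ii'} X_{jj'}
      \quad \text{subject to} \quad X\mathbf{1} \le \mathbf{1}, \;\; X^{\top}\mathbf{1} \le \mathbf{1},

    where X_{ii'} = 1 if node i of the first graph is matched to node i' of the second, the linear coefficients c_{ii'} encode node compatibility, and the quadratic coefficients d_{ii'jj'} encode the compatibility of edge (i,j) with edge (i',j'); learning then amounts to estimating c and d from example matchings.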

  18. Template-space metric for searches for gravitational waves from the inspiral, merger, and ringdown of binary black holes

    NASA Astrophysics Data System (ADS)

    Kalaghatgi, Chinmay; Ajith, Parameswaran; Arun, K. G.

    2015-06-01

    Searches for gravitational waves (GWs) from binary black holes using interferometric GW detectors require the construction of template banks for performing matched filtering while analyzing the data. Placement of templates over the parameter space of binaries, as well as coincidence tests of GW triggers from multiple detectors, make use of the definition of a metric over the space of gravitational waveforms. Although recent searches have employed waveform templates coherently describing the inspiral, merger and ringdown (IMR) of the coalescence, the metric used in the template banks and coincidence tests was derived from post-Newtonian inspiral waveforms. In this paper, we compute (semianalytically) the template-space metric of the IMR waveform family IMRPhenomB over the parameter space of masses and the effective spin parameter. We also propose a coordinate system, a modified version of the post-Newtonian chirp time coordinates, in which the metric is slowly varying over the parameter space. The match function semianalytically computed using the metric has excellent agreement with the "exact" match function computed numerically. We show that the metric is able to provide a reasonable approximation to the match function of other IMR waveform families, such as the effective-one-body model calibrated to numerical relativity (EOBNRv2). The availability of this metric can contribute to improving the sensitivity of searches for GWs from binary black holes in the advanced detector era.
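
    For reference, the match and the template-space metric are conventionally defined as below (standard template-bank expressions in generic notation; the coordinates and conventions adopted in the paper may differ):

      \mathcal{M}(\theta, \Delta\theta)
        = \max_{t_0,\phi_0}
          \frac{\langle h(\theta) \mid h(\theta+\Delta\theta)\rangle}
               {\sqrt{\langle h(\theta)\mid h(\theta)\rangle\,
                      \langle h(\theta+\Delta\theta)\mid h(\theta+\Delta\theta)\rangle}},
      \qquad
      1-\mathcal{M} \simeq g_{ij}\,\Delta\theta^{i}\Delta\theta^{j},
      \qquad
      g_{ij} = -\tfrac{1}{2}\,\partial_i \partial_j \mathcal{M}\big|_{\Delta\theta=0},

    where \langle a \mid b\rangle = 4\,\mathrm{Re}\int_0^{\infty} \tilde{a}(f)\,\tilde{b}^{*}(f)/S_n(f)\,\mathrm{d}f is the noise-weighted inner product for a detector with one-sided noise power spectral density S_n(f); templates are placed so that the mismatch 1-\mathcal{M} between neighbouring templates stays below a chosen maximum.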

  19. Air Sampling System Evaluation Template

    SciTech Connect

    Blunt, Brent

    2000-05-09

    The ASSET1.0 software provides a template with which a user can evaluate an Air Sampling System against the latest version of ANSI N13.1 "Sampling and Monitoring Releases of Airborne Radioactive Substances from the Stacks and Ducts of Nuclear Facilities". The software uses the ANSI N13.1 PIC levels to establish basic design criteria for the existing or proposed sampling system. The software looks at such criteria as PIC level, type of radionuclide emissions, physical state of the radionuclide, nozzle entrance effects, particulate transmission effects, system and component accuracy and precision evaluations, and basic system operations to provide a detailed look at the subsystems of a monitoring and sampling system/program. A GAP evaluation can then be completed which leads to identification of design and operational flaws in the proposed systems. Corrective measures can then be limited to the GAPs.

  20. Mesoporous silica templated zirconia nanoparticles

    NASA Astrophysics Data System (ADS)

    Ballem, Mohamed A.; Córdoba, José M.; Odén, Magnus

    2011-07-01

    Nanoparticles of zirconium oxide (ZrO2) were synthesized by infiltration of a zirconia precursor (ZrOCl2·8H2O) into a SBA-15 mesoporous silica mold using a wet-impregnation technique. X-ray diffractometry and high-resolution transmission electron microscopy show the formation of stable ZrO2 nanoparticles inside the silica pores after a thermal treatment at 550 °C. Subsequent leaching out of the silica template by NaOH resulted in well-dispersed ZrO2 nanoparticles with an average diameter of 4 nm. The formed single-crystal nanoparticles are faceted with {110} surface terminations, suggesting this to be the preferred growth orientation. A growth model for these nanoparticles is also suggested.

  1. [Templates for curietherapy of the oral cavity and their dosimetric use].

    PubMed

    Pizzi, G; Fongione, S; Mandoliti, G; Beorchia, A; Contento, G; Malisan, M R

    1989-12-01

    Flexible 192Ir wire implants are commonly used for the treatment of some types of cancer in the oral cavity. A modified plastic tube technique is presented here, which aims at correctly positioning the active wires with thin plastic templates. Possible sources of error are examined and their consequences for the dose distribution around the implant are analyzed. In most cases the control dosimetry matches the predictions satisfactorily. It may thus be concluded that the use of templates allows good and reproducible results to be obtained in brachytherapy of the oral cavity.

  2. Visual Templates in Pattern Generalization Activity

    ERIC Educational Resources Information Center

    Rivera, F. D.

    2010-01-01

    In this research article, I present evidence of the existence of visual templates in pattern generalization activity. Such templates initially emerged from a 3-week design-driven classroom teaching experiment on pattern generalization involving linear figural patterns and were assessed for existence in a clinical interview that was conducted four…

  4. Template synthesis of ordered macroporous hydroxyapatite bioceramics.

    PubMed

    Ji, Lijun; Jell, Gavin; Dong, Yixiang; Jones, Julian R; Stevens, Molly M

    2011-08-28

    Hydroxyapatite has found wide application in bone tissue engineering. Here we use a macroporous carbon template to generate highly ordered macroporous hydroxyapatite bioceramics composed of close-packed hollow spherical pores with interconnected channels. The template has advantages for the preparation of ordered materials.

  5. Subject Matter Expert's Training Module Template.

    ERIC Educational Resources Information Center

    Beaudin, Bart P.; Quick, Don

    This template was designed to assist subject matter experts in developing presentations. The template is for a training session plan (lesson plan, presentation plan, evaluation suggestions) that can be used to determine the sequence of what the presenter will say and do. Subject matter experts can easily copy the pages for use during the design…

  6. Qualification of Innovative High Level Waste Pipeline Unplugging Technologies

    SciTech Connect

    McDaniel, D.; Gokaltun, S.; Varona, J.; Awwad, A.; Roelant, D.; Srivastava, R.

    2008-07-01

    In the past, some of the pipelines have plugged during high level waste (HLW) transfers, resulting in schedule delays and increased costs. Furthermore, pipeline plugging has been cited by the 'best and brightest' technical review as one of the major issues that can result in unplanned outages at the Waste Treatment Plant, causing inconsistent operation. As the DOE moves toward more active high level waste retrieval, site engineers will be faced with increasing cross-site pipeline waste slurry transfers that will result in an increased probability of a pipeline getting plugged. Hence, availability of a pipeline unplugging tool/technology is crucial to ensure smooth operation of the waste transfers and to ensure that tank farm cleanup milestones are met. FIU had earlier tested and evaluated various unplugging technologies through an industry call. Based on mockup testing, two technologies were identified that could withstand the rigors of operation in a radioactive environment and that could handle sharp 90° elbows. We present results of the second phase of detailed testing and evaluation of pipeline unplugging technologies; the objective is to qualify these pipeline unplugging technologies for subsequent deployment at a DOE facility. The current phase of testing and qualification comprises a heavily instrumented 3-inch-diameter (full-scale) pipeline facilitating extensive data acquisition for design optimization and performance evaluation, as it applies to three types of plugs typical of DOE HLW waste. Furthermore, the data from testing at three different lengths of pipe, in conjunction with the physics of the process, will assist in modeling the unplugging phenomenon; the model will then be used to scale up process parameters and system variables for longer and site-typical pipe lengths, which can extend up to 19,000 ft. Detailed information resulting from the testing will provide the DOE end-user with sufficient data and understanding of the

  7. Stability of High-Level Radioactive Waste Forms

    SciTech Connect

    Besmann, T.M.

    2001-06-22

    High-level waste (HLW) glass compositions, processing schemes, limits on waste content, and corrosion/dissolution release models depend on an accurate knowledge of melting temperatures and thermochemical values. Unfortunately, existing models for predicting these temperatures are empirically based, depending on extrapolations of experimental information. In addition, present models of the leaching behavior of glass waste forms use simplistic assumptions or experimentally measured values obtained under non-realistic conditions. There is thus a critical need for both more accurate and more widely applicable models of HLW glass behavior, which this project addressed. Significant progress was made in this project on modeling HLW glass. Borosilicate glass was accurately represented along with the additional important components containing iron, lithium, potassium, magnesium, and calcium. The formation of crystalline inclusions in the glass, an issue in Hanford HLW formulations, was modeled and shown to be predictive. Thus the results of this work have already demonstrated practical benefits, with the ability to map compositional regions where crystalline material forms and therefore avoid that detrimental effect. With regard to fundamental understanding, added insights into the behavior of the components of glass have been obtained, including the potential formation of molecular clusters. The EMSP project had very significant effects beyond the confines of Environmental Management. The models developed for glass have been used to solve a very costly problem in the corrosion of refractories for glass production. The effort resulted in another laboratory, Sandia National Laboratories-Livermore, becoming conversant in the techniques and applying them through a DOE Office of Industrial Technologies project joint with PPG Industries. The glass industry as a whole is now cognizant of these capabilities, and there is a Glass Manufacturer's Research Institute proposal

  8. High-level disinfection of gastrointestinal endoscope reprocessing

    PubMed Central

    Chiu, King-Wah; Lu, Lung-Sheng; Chiou, Shue-Shian

    2015-01-01

    High level disinfection (HLD) of the gastrointestinal (GI) endoscope is not simply a slogan, but rather is a form of experimental monitoring-based medicine. By definition, the GI endoscope is a semicritical medical device. Hence, such medical devices require major quality assurance for disinfection. And because many of these items are temperature sensitive, low-temperature chemical methods, such as liquid chemical germicide, must be used rather than steam sterilization. In summarizing guidelines for infection prevention and control for GI endoscopy, there are three important steps that must be highlighted: manual washing, HLD with an automated endoscope reprocessor, and drying. Strict adherence to current guidelines is required because, compared to any other medical device, the GI endoscope is associated with more outbreaks linked to inadequate cleaning or disinfecting during HLD. Both experimental evaluation of the surveillance bacterial cultures and in-use clinical results have shown that the monitoring of the stringent processes to prevent and control infection is an essential component of the broader strategy to ensure the delivery of safe endoscopy services, because endoscope reprocessing is a multistep procedure involving numerous factors that can interfere with its efficacy. Based on our years of experience in the surveillance culture monitoring of endoscopic reprocessing, we aim in this study to carefully describe what details require attention in GI endoscopy disinfection and to share our experience so that patients can be provided with high quality and safe medical practices. Quality management encompasses all aspects of pre- and post-procedural care including the efficiency of the endoscopy unit and reprocessing area, as well as the endoscopic procedure itself. PMID:25699232

  9. High-Level Waste Systems Plan. Revision 7

    SciTech Connect

    Brooke, J.N.; Gregory, M.V.; Paul, P.; Taylor, G.; Wise, F.E.; Davis, N.R.; Wells, M.N.

    1996-10-01

    This revision of the High-Level Waste (HLW) System Plan aligns SRS HLW program planning with the DOE Savannah River (DOE-SR) Ten Year Plan (QC-96-0005, Draft 8/6), which was issued in July 1996. The objective of the Ten Year Plan is to complete cleanup at most nuclear sites within the next ten years. The two key principles of the Ten Year Plan are to accelerate the reduction of the most urgent risks to human health and the environment and to reduce mortgage costs. Accordingly, this System Plan describes the HLW program that will remove HLW from all 24 old-style tanks, and close 20 of those tanks, by 2006, with vitrification of all HLW by 2018. To achieve these goals, the DWPF canister production rate is projected to climb to 300 canisters per year starting in FY06 and remain at that rate through the end of the program in FY18. (Compare that to past System Plans, in which DWPF production peaked at 200 canisters per year and the program did not complete until 2026.) An additional $247M (FY98 dollars) must be made available as requested over the ten-year planning period, including a one-time $10M to enhance Late Wash attainment. If appropriate resources are made available, facility attainment issues are resolved, and regulatory support is sufficient, then completion of the HLW program in 2018 would achieve a $3.3 billion cost savings to DOE versus the cost of completing the program in 2026. Facility status information is current as of October 31, 1996.

  10. High-level waste issues and resolutions document

    SciTech Connect

    Not Available

    1994-05-01

    The High-Level Waste (HLW) Issues and Resolutions Document recognizes US Department of Energy (DOE) complex-wide HLW issues and offers potential corrective actions for resolving these issues. Westinghouse Management and Operations (M&O) Contractors are effectively managing HLW for the Department of Energy at four sites: Idaho National Engineering Laboratory (INEL), Savannah River Site (SRS), West Valley Demonstration Project (WVDP), and the Hanford Reservation. The sites are at varying stages of processing HLW into a more manageable form. This HLW Issues and Resolutions Document identifies five primary issues that must be resolved in order to reach the long-term objective of HLW repository disposal. As the current M&O contractor at DOE's most difficult waste problem sites, Westinghouse recognizes that it has the responsibility to help solve some of the complex's HLW problems in a cost-effective manner by encouraging the M&Os to work together, sharing expertise, eliminating duplicate efforts, and sharing best practices. Pending an action plan, Westinghouse M&Os will take the initiative on those corrective actions identified as the responsibility of an M&O. This document captures issues important to the management of HLW. The proposed resolutions contained within this document set the framework for the M&Os and DOE to work cooperatively to develop an action plan to solve some of the major complex-wide problems. Dialogue will continue between the M&Os, DOE, and other regulatory agencies to work jointly toward the goal of storing, treating, and immobilizing HLW for disposal in an environmentally sound, safe, and cost-effective manner.

  11. High Levels of Molecular Chlorine found in the Arctic Atmosphere

    NASA Astrophysics Data System (ADS)

    Liao, J.; Huey, L. G.; Liu, Z.; Tanner, D.; Cantrell, C. A.; Orlando, J. J.; Flocke, F. M.; Shepson, P. B.; Weinheimer, A. J.; Hall, S. R.; Beine, H.; Wang, Y.; Ingall, E. D.; Thompson, C. R.; Hornbrook, R. S.; Apel, E. C.; Fried, A.; Mauldin, L.; Smith, J. N.; Staebler, R. M.; Neuman, J. A.; Nowak, J. B.

    2014-12-01

    Chlorine radicals are a strong atmospheric oxidant, particularly in polar regions where levels of hydroxyl radicals can be quite low. In the atmosphere, chlorine radicals expedite the degradation of methane and tropospheric ozone and the oxidation of mercury to more toxic forms. Here, we present direct measurements of molecular chlorine levels in the Arctic marine boundary layer in Barrow, Alaska, collected in the spring of 2009 over a six-week period using chemical ionization mass spectrometry. We detected high levels of molecular chlorine of up to 400 pptv. Concentrations peaked in the early morning and late afternoon and fell to near-zero levels at night. Average daytime molecular chlorine levels were correlated with ozone concentrations, suggesting that sunlight and ozone are required for molecular chlorine formation. Using a time-dependent box model, we estimated that the chlorine radicals produced from the photolysis of molecular chlorine on average oxidized more methane than hydroxyl radicals and enhanced the abundance of short-lived peroxy radicals. Elevated hydroperoxyl radical levels, in turn, promoted the formation of hypobromous acid, which catalyzed mercury oxidation and the breakdown of tropospheric ozone. Therefore, we propose that molecular chlorine exerts a significant effect on the atmospheric chemistry in the Arctic. While the formation mechanisms of molecular chlorine are not yet understood, the main potential sources of chlorine include snowpack, sea salt, and sea ice. There is recent evidence of molecular halogen (Br2 and Cl2) formation in the Arctic snowpack. The coverage and composition of the snow may control halogen chemistry in the Arctic. Changes of sea ice and snow cover in the changing climate may affect air-snow-ice interaction and have a significant impact on the levels of radicals, ozone, mercury and methane in the Arctic troposphere.

  12. PLUTONIUM/HIGH-LEVEL VITRIFIED WASTE BDBE DOSE CALCULATION

    SciTech Connect

    D.C. Richardson

    2003-03-19

    In accordance with the Nuclear Waste Policy Amendments Act of 1987, Yucca Mountain was designated as the site to be investigated as a potential repository for the disposal of high-level radioactive waste. The Yucca Mountain site is an undeveloped area located on the southwestern edge of the Nevada Test Site (NTS), about 100 miles northwest of Las Vegas. The site currently lacks rail service or an existing right-of-way. If the Yucca Mountain site is found suitable for the repository, rail service is desirable to the Office of Civilian Radioactive Waste Management (OCRWM) Program because of the potential of rail transportation to reduce costs and to reduce the number of shipments relative to highway transportation. A Preliminary Rail Access Study evaluated 13 potential rail spur options. Alternative routes within the major options were also developed. Each of these options was then evaluated for potential land use conflicts and access to regional rail carriers. Three potential routes having few land use conflicts and having access to regional carriers were recommended for further investigation. Figure 1-1 shows these three routes. The Jean route is estimated to be about 120 miles long, the Carlin route to be about 365 miles long, and the Caliente route to be about 365 miles long. The remaining ten routes continue to be monitored, and should any of the present conflicts change, a re-evaluation of that route will be made. Complete details of the evaluation of the 13 routes can be found in the previous study. The DOE has not identified any preferred route and recognizes that the transportation issues need a full and open treatment under the National Environmental Policy Act. The issue of transportation will be included in public hearings to support development of the Environmental Impact Statement (EIS) proceedings for either the Monitored Retrievable Storage Facility or the Yucca Mountain Project or both.

  13. Review of high-level waste form properties. [146 bibliographies]

    SciTech Connect

    Rusin, J.M.

    1980-12-01

    This report is a review of waste form options for the immobilization of high-level liquid wastes from the nuclear fuel cycle. This review covers the status of international research and development on waste forms as of May 1979. Although the emphasis in this report is on waste form properties, process parameters are discussed where they may affect final waste form properties. A summary table is provided listing properties of various nuclear waste form options. It is concluded that proposed waste forms have properties falling within a relatively narrow range. In regard to crystalline versus glass waste forms, the conclusion is that either glass or crystalline materials can be shown to have some advantage when a single property is considered; however, at this date no single waste form offers optimum properties over the entire range of characteristics investigated. A long-term effort has been applied to the development of glass and calcine waste forms. Several additional waste forms have enough promise to warrant continued research and development to bring their state of development up to that of glass and calcine. Synthetic minerals, the multibarrier approach with coated particles in a metal matrix, and high pressure-high temperature ceramics offer potential advantages and need further study. Although this report discusses waste form properties, the total waste management system should be considered in the final selection of a waste form option. Canister design, canister materials, overpacks, engineered barriers, and repository characteristics, as well as the waste form, affect the overall performance of a waste management system. These parameters were not considered in this comparison.

  14. High-level disinfection of gastrointestinal endoscope reprocessing.

    PubMed

    Chiu, King-Wah; Lu, Lung-Sheng; Chiou, Shue-Shian

    2015-02-20

    High-level disinfection (HLD) of the gastrointestinal (GI) endoscope is not simply a slogan, but rather a form of experimental, monitoring-based medicine. By definition, the GI endoscope is a semicritical medical device, and such devices require rigorous quality assurance for disinfection. Because many of these items are temperature sensitive, low-temperature chemical methods, such as liquid chemical germicides, must be used rather than steam sterilization. In summarizing guidelines for infection prevention and control in GI endoscopy, three important steps must be highlighted: manual washing, HLD with an automated endoscope reprocessor, and drying. Strict adherence to current guidelines is required because the GI endoscope is associated with more outbreaks linked to inadequate cleaning or disinfection during HLD than any other medical device. Both experimental evaluation of surveillance bacterial cultures and in-use clinical results have shown that monitoring these stringent infection prevention and control processes is an essential component of the broader strategy to ensure the delivery of safe endoscopy services, because endoscope reprocessing is a multistep procedure involving numerous factors that can interfere with its efficacy. Based on our years of experience with surveillance culture monitoring of endoscope reprocessing, we aim in this study to describe which details require attention in GI endoscope disinfection and to share our experience so that patients can be provided with high-quality, safe medical care. Quality management encompasses all aspects of pre- and post-procedural care, including the efficiency of the endoscopy unit and reprocessing area, as well as the endoscopic procedure itself.

  15. Radiative Lifetimes for High Levels of Neutral Fe

    NASA Astrophysics Data System (ADS)

    Lawler, James E.; Den Hartog, E.; Guzman, A.

    2013-01-01

    New radiative lifetime measurements for ~ 50 high lying levels of Fe I are reported. Laboratory astrophysics faces a challenge to provide basic spectroscopic data, especially reliable atomic transition probabilities, in the IR region for abundance studies. The availability of HgCdTe (HAWAII) detector arrays has opened IR spectral regions for extensive new spectroscopic studies. The SDSS III APOGEE project in the H-Band is an important example which will penetrate the dust obscuring the Galactic bulge. APOGEE will survey elemental abundances of 100,000 red giant stars in the bulge, bar, disk, and halo of the Milky Way. Many stellar spectra in the H-Band are, as expected, dominated by transitions of Fe I. Most of these IR transitions connect high levels of Fe. Our program has started an effort to meet this challenge with new radiative lifetime measurements on high lying levels of Fe I using time resolved laser induced fluorescence (TRLIF). The TRLIF method is typically accurate to 5% and is efficient. Our goal is to combine these accurate, absolute radiative lifetimes with emission branching fractions [1] to determine log(gf) values of the highest quality for Fe I lines in the UV, visible, and IR. This method was used very successfully by O’Brian et al. [2] on lower levels of Fe I. This method is still the best available for all but very simple spectra for which ab-initio theory is more accurate. Supported by NSF grant AST-0907732. [1] Branching fractions are being measured by M. Ruffoni and J. C. Pickering at Imperial College London. [2] O'Brian, T. R., Wickliffe, M. E., Lawler, J. E., Whaling, W., & Brault, J. W. 1991, J. Opt. Soc. Am. B 8, 1185

  16. Deep borehole disposal of high-level radioactive waste.

    SciTech Connect

    Stein, Joshua S.; Freeze, Geoffrey A.; Brady, Patrick Vane; Swift, Peter N.; Rechard, Robert Paul; Arnold, Bill Walter; Kanney, Joseph F.; Bauer, Stephen J.

    2009-07-01

    Preliminary evaluation of deep borehole disposal of high-level radioactive waste and spent nuclear fuel indicates the potential for excellent long-term safety performance at costs competitive with mined repositories. Significant fluid flow through basement rock is prevented, in part, by low permeabilities, poorly connected transport pathways, and overburden self-sealing. Deep fluids also resist vertical movement because they are density stratified. Thermal hydrologic calculations estimate the thermal pulse from emplaced waste to be small (less than 20 °C at 10 meters from the borehole, for less than a few hundred years), and to result in maximum total vertical fluid movement of ~100 m. Reducing conditions will sharply limit solubilities of most dose-critical radionuclides at depth, and high ionic strengths of deep fluids will prevent colloidal transport. For the bounding analysis of this report, waste is envisioned to be emplaced as fuel assemblies stacked inside drill casing that are lowered, and emplaced using off-the-shelf oilfield and geothermal drilling techniques, into the lower 1-2 km portion of a vertical borehole ~45 cm in diameter and 3-5 km deep, followed by borehole sealing. Deep borehole disposal of radioactive waste in the United States would require modifications to the Nuclear Waste Policy Act and to applicable regulatory standards for long-term performance set by the US Environmental Protection Agency (40 CFR part 191) and US Nuclear Regulatory Commission (10 CFR part 60). The performance analysis described here is based on the assumption that long-term standards for deep borehole disposal would be identical in the key regards to those prescribed for existing repositories (40 CFR part 197 and 10 CFR part 63).

  17. Advancements in automatic marking with range pattern matching

    NASA Astrophysics Data System (ADS)

    Salazar, D.; Valadez, J.

    2013-06-01

    In previous work, an approach was detailed using CATS-MRCC-RPM, where new pattern matching functionality is used to find locations on a jobdeck that are suitable for mark placements and, ultimately, metrology tool measurement locations. These locations are found by first creating pattern definitions. The defined patterns are passed to the CATS MRCC-RPM match algorithm, which in turn outputs all found locations that match the description. In that previous work, the pattern definitions, also known as mark templates, had several limitations. For example, each template could hold only one mark placed at its center, and had to be symmetrical. This was considered to be severely limiting and not production worthy for advanced mask manufacturing. This paper builds on top of the previous one in various ways, extends the possibilities, and provides mask makers with unlimited options for extending metrology automation. Mark templates are expanded to hold multiple marks at different offsets from their centers, and even marks of different types. Each template can be symmetrical or asymmetrical, and yet all the marks on it can still be correctly placed by taking advantage of match orientation information during the Classification step. Placement of other mark types beyond basic ones is also explored, such as Arbitrary Area. Lastly, the classification step is an enhancement process that thoroughly manages the use of chip/mark information. The result makes use of the output of JD MRC (jobdeck MRC), which executes RPM on jobdecks by chip in order to reduce redundant chip processing, rather than searching all chip placements by extension.

  18. Nanoimprint lithography using disposable biomass template

    NASA Astrophysics Data System (ADS)

    Hanabata, Makoto; Takei, Satoshi; Sugahara, Kigen; Nakajima, Shinya; Sugino, Naoto; Kameda, Takao; Fukushima, Jiro; Matsumoto, Yoko; Sekiguchi, Atsushi

    2016-04-01

    A novel nanoimprint lithography process using a disposable, gas-permeable biomass template was investigated. It was found that a disposable biomass template derived from cellulose materials shows excellent gas permeability and reduces the transcriptional defects seen with conventional templates such as quartz, PDMS, and DLC, which have no gas permeability. We believe that outgassing from the imprinted materials is easily removed through the template. The approach of using cellulose as the template material is suitable as a next-generation clean separation technology, and it is expected to be one of the low-defect thermal nanoimprint lithography technologies. It is also expected that volatile and solvent-containing materials, which often create defects and peeling with conventional gas-impermeable templates, will become usable.

  19. Pediatric MATCH Infographic

    Cancer.gov

    Infographic explaining NCI-COG Pediatric MATCH, a cancer treatment clinical trial for children and adolescents, from 1 to 21 years of age, that is testing the use of precision medicine for pediatric cancers.

  20. Project Matching Initiative

    EPA Pesticide Factsheets

    The Green Power Partnership's Project Matching initiative works to connect green power users with new, not-yet-built renewable energy projects that may align with their energy, environmental, and financial objectives.

  1. Cognitive Levels Matching.

    ERIC Educational Resources Information Center

    Brooks, Martin; And Others

    1983-01-01

    The Cognitive Levels Matching Project trains teachers to guide students' skill acquisition and problem-solving processes by assessing students' cognitive levels and adapting their teaching materials accordingly. (MLF)

  2. The molecular matching problem

    NASA Technical Reports Server (NTRS)

    Kincaid, Rex K.

    1993-01-01

    Molecular chemistry contains many difficult optimization problems that have begun to attract the attention of optimizers in the Operations Research community. Problems including protein folding, molecular conformation, molecular similarity, and molecular matching have been addressed. Minimum energy conformations for simple molecular structures such as water clusters, Lennard-Jones microclusters, and short polypeptides have dominated the literature to date. However, a variety of interesting problems exist and we focus here on a molecular structure matching (MSM) problem.

  3. Magnetic safety matches

    NASA Astrophysics Data System (ADS)

    Lindén, J.; Lindberg, M.; Greggas, A.; Jylhävuori, N.; Norrgrann, H.; Lill, J. O.

    2017-07-01

    In addition to the main ingredients (sulfur, potassium chlorate, and carbon), ordinary safety matches contain various dyes, glues, etc., giving the head of the match an even texture and appealing color. Among the common reddish-brown matches there are several types which, after ignition, can be attracted by a strong magnet. Before ignition the match head is generally not attracted by the magnet. An elemental analysis based on proton-induced x-ray emission was performed to single out iron as the element responsible for the observed magnetism. 57Fe Mössbauer spectroscopy was used to identify the various types of iron compounds, present before and after ignition, that are responsible for the macroscopic magnetism: Fe2O3 before and Fe3O4 after. The reaction was verified by mixing the main chemicals in the match head with Fe2O3 in glue and mounting the mixture on a match stick. The ash residue after igniting the mixture was magnetic.

  4. JET MIXING ANALYSIS FOR SRS HIGH-LEVEL WASTE RECOVERY

    SciTech Connect

    Lee, S.

    2011-07-05

    The process of recovering the waste in storage tanks at the Savannah River Site (SRS) typically requires mixing the contents of the tank to ensure uniformity of the discharge stream. Mixing is accomplished with one to four slurry pumps located within the tank liquid. The slurry pumps may be fixed in position or may rotate, depending on the specific mixing requirements. The high-level waste in Tank 48 contains insoluble solids in the form of potassium tetraphenyl borate compounds (KTPB), monosodium titanate (MST), and sludge. Tank 48 is equipped with 4 slurry pumps, which are intended to suspend the insoluble solids prior to transfer of the waste to the Fluidized Bed Steam Reformer (FBSR) process. The FBSR process is being designed for a normal feed of 3.05 wt% insoluble solids. A chemical characterization study has shown that the insoluble solids concentration is approximately 3.05 wt% when well mixed. The project is requesting a Computational Fluid Dynamics (CFD) mixing study from SRNL to determine the solids behavior with 2, 3, and 4 slurry pumps in operation and an estimate of the insoluble solids concentration at the suction of the transfer pump to the FBSR process. The impact of cooling coils is not considered in the current work. The work addresses two principal objectives using a CFD approach: (1) To estimate insoluble solids concentration transferred from Tank 48 to the Waste Feed Tank in the FBSR process and (2) To assess the impact of different combinations of four slurry pumps on insoluble solids suspension and mixing in Tank 48. For this work, several different combinations of a maximum of four pumps are considered to determine the resulting flow patterns and local flow velocities which are thought to be associated with sludge particle mixing. Two different elevations of pump nozzles are used for an assessment of the flow patterns on the tank mixing. Pump design and operating parameters used for the analysis are summarized in Table 1. The baseline

  5. Scale and Rotation Invariant Matching Using Linearly Augmented Trees.

    PubMed

    Jiang, Hao; Tian, Tai-Peng; Sclaroff, Stan

    2015-12-01

    We propose a novel linearly augmented tree method for efficient scale and rotation invariant object matching. The proposed method enforces pairwise matching consistency defined on trees, and high-order constraints on all the sites of a template. The pairwise constraints admit arbitrary metrics, while the high-order constraints use L1 norms and therefore can be linearized. Such a linearly augmented tree formulation introduces hyperedges and loops into the basic tree structure. However, unlike a general loopy graph, its special structure allows us to relax and decompose the optimization into a sequence of tree-matching problems that are efficiently solvable by dynamic programming. The proposed method also works with continuous scale and rotation parameters; we can match at arbitrarily large scales with the same efficiency. Our experiments on ground truth data and a variety of real images and videos show that the proposed method is efficient, accurate, and reliable.
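
    The abstract above reduces matching to a sequence of tree problems solved by dynamic programming. The sketch below shows only that underlying tree DP on a toy cost instance; it omits the paper's linear augmentation, high-order L1 terms, and continuous scale/rotation handling, and all cost values are illustrative assumptions.

```python
# Minimal sketch of tree-structured template matching by dynamic programming.
# Only the basic tree DP that methods like the one above build on; costs are toy values.
import numpy as np

def tree_match(unary, edges, pairwise):
    """unary: (n_nodes, n_sites) cost of placing template node i at image site s.
    edges: list of (parent, child) pairs forming a tree rooted at node 0.
    pairwise: dict {(parent, child): (n_sites, n_sites) cost matrix}.
    Returns the minimum total cost and one optimal assignment."""
    n_nodes, n_sites = unary.shape
    children = {i: [] for i in range(n_nodes)}
    for p, c in edges:
        children[p].append(c)

    best = unary.astype(float)          # best[i, s]: cost of subtree rooted at i if i -> s
    argbest = {}                        # argbest[(p, c)][s]: best site for c given p -> s
    order = []                          # DFS pre-order from the root
    stack = [0]
    while stack:
        node = stack.pop()
        order.append(node)
        stack.extend(children[node])
    for node in reversed(order):        # children are processed before their parents
        for c in children[node]:
            # cost of the child's subtree for every (parent site, child site) pair
            total = pairwise[(node, c)] + best[c][None, :]
            argbest[(node, c)] = total.argmin(axis=1)
            best[node] += total.min(axis=1)

    assignment = {0: int(best[0].argmin())}     # backtrack from the root
    for node in order:                           # parents before children
        for c in children[node]:
            assignment[c] = int(argbest[(node, c)][assignment[node]])
    return float(best[0].min()), assignment

# Toy usage: 3-node template, 4 candidate sites, random costs.
rng = np.random.default_rng(0)
unary = rng.random((3, 4))
edges = [(0, 1), (0, 2)]
pairwise = {e: rng.random((4, 4)) for e in edges}
print(tree_match(unary, edges, pairwise))
```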

  6. Latent palmprint matching.

    PubMed

    Jain, Anil K; Feng, Jianjiang

    2009-06-01

    The evidential value of palmprints in forensic applications is clear as about 30 percent of the latents recovered from crime scenes are from palms. While biometric systems for palmprint-based personal authentication in access control type of applications have been developed, they mostly deal with low-resolution (about 100 ppi) palmprints and only perform full-to-full palmprint matching. We propose a latent-to-full palmprint matching system that is needed in forensic applications. Our system deals with palmprints captured at 500 ppi (the current standard in forensic applications) or higher resolution and uses minutiae as features to be compatible with the methodology used by latent experts. Latent palmprint matching is a challenging problem because latent prints lifted at crime scenes are of poor image quality, cover only a small area of the palm, and have a complex background. Other difficulties include a large number of minutiae in full prints (about 10 times as many as fingerprints), and the presence of many creases in latents and full prints. A robust algorithm to reliably estimate the local ridge direction and frequency in palmprints is developed. This facilitates the extraction of ridge and minutiae features even in poor quality palmprints. A fixed-length minutia descriptor, MinutiaCode, is utilized to capture distinctive information around each minutia and an alignment-based minutiae matching algorithm is used to match two palmprints. Two sets of partial palmprints (150 live-scan partial palmprints and 100 latent palmprints) are matched to a background database of 10,200 full palmprints to test the proposed system. Despite the inherent difficulty of latent-to-full palmprint matching, rank-1 recognition rates of 78.7 and 69 percent, respectively, were achieved in searching live-scan partial palmprints and latent palmprints against the background database.

  7. Cementitious Grout for Closing SRS High Level Waste Tanks - 12315

    SciTech Connect

    Langton, C.A.; Stefanko, D.B.; Burns, H.H.; Waymer, J.; Mhyre, W.B.; Herbert, J.E.; Jolly, J.C. Jr.

    2012-07-01

    In 1997, the first two United States Department of Energy (US DOE) high level waste tanks (Tanks 17-F and 20-F: Type IV, single shell tanks) were taken out of service (permanently closed) at the Savannah River Site (SRS). In 2012, the DOE plans to remove from service two additional Savannah River Site (SRS) Type IV high-level waste tanks, Tanks 18-F and 19-F. These tanks were constructed in the late 1950s and received low-heat waste and do not contain cooling coils. Operational closure of Tanks 18-F and 19-F is intended to be consistent with the applicable requirements of the Resource Conservation and Recovery Act (RCRA) and the Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) and will be performed in accordance with South Carolina Department of Health and Environmental Control (SCDHEC) requirements. The closure will physically stabilize two 4.92E+03 cubic meter (1.3E+06 gallon) carbon steel tanks and isolate and stabilize any residual contaminants left in the tanks. Ancillary equipment abandoned in the tanks will also be filled to the extent practical. A Performance Assessment (PA) has been developed to assess the long-term fate and transport of residual contamination in the environment resulting from the operational closure of the F-Area Tank Farm (FTF) waste tanks. Next generation flowable, zero-bleed cementitious grouts were designed, tested, and specified for closing Tanks 18-F and 19-F and for filling the abandoned equipment. Fill requirements were developed for both the tank and equipment grouts. All grout formulations were required to be alkaline with a pH of 12.4 and to be chemically reducing with a reduction potential (Eh) of -200 to -400 mV. Grouts with this chemistry stabilize potential contaminants of concern. This was achieved by including Portland cement and Grade 100 slag in the mixes, respectively. Ingredients and proportions of cementitious reagents were selected and adjusted to support the mass placement strategy developed by

  8. Diffeomorphic spectral matching of cortical surfaces.

    PubMed

    Lombaert, Herve; Sporring, Jon; Siddiqi, Kaleem

    2013-01-01

    Accurate matching of cortical surfaces is necessary in many neuroscience applications. In this context diffeomorphisms are often sought, because they facilitate further statistical analysis and atlas building. Present methods for computing diffeomorphisms are based on optimizing flows or on inflating surfaces to a common template, but they are often computationally expensive. It typically takes several hours on a conventional desktop computer to match a single pair of cortical surfaces having a few hundred thousand vertices. We propose a very fast alternative based on an application of spectral graph theory on a novel association graph. Our symmetric approach can generate a diffeomorphic correspondence map within a few minutes on high-resolution meshes while avoiding the sign and multiplicity ambiguities of conventional spectral matching methods. The eigenfunctions are shared between surfaces and provide a smooth parameterization of surfaces. These properties are exploited to compute differentials on highly folded cortical surfaces. Diffeomorphisms can thus be verified and invalid surface folding detected. Our method is demonstrated to attain a vertex accuracy that is at least as good as that of FreeSurfer and Spherical Demons but in only a fraction of their processing time. As a practical experiment, we construct an unbiased atlas of cortical surfaces with a speed several orders of magnitude faster than current methods.
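
    A bare-bones illustration of the spectral idea referenced above: embed each graph with the low-frequency eigenvectors of its Laplacian and match vertices by nearest neighbour in that embedding. This naive baseline does not resolve the eigenvector sign and multiplicity ambiguities that the association-graph method in the abstract is designed to handle, so it can return mirrored correspondences on symmetric shapes; the toy graphs below are assumptions for demonstration only.

```python
# Naive spectral correspondence between two graphs/meshes via Laplacian embeddings.
import numpy as np
from scipy.spatial import cKDTree

def spectral_embedding(adjacency, k=4):
    """adjacency: dense symmetric (n, n) 0/1 matrix. Returns an (n, k) embedding."""
    degree = adjacency.sum(axis=1)
    laplacian = np.diag(degree) - adjacency
    _, vectors = np.linalg.eigh(laplacian)   # eigenvalues in ascending order
    return vectors[:, 1:k + 1]               # skip the constant eigenvector

def spectral_match(adj_a, adj_b, k=4):
    """For every vertex of graph A, return the index of its nearest vertex of B
    in spectral space (no sign/ordering disambiguation is attempted)."""
    emb_a = spectral_embedding(adj_a, k)
    emb_b = spectral_embedding(adj_b, k)
    _, idx = cKDTree(emb_b).query(emb_a)
    return idx

# Toy usage: a path graph matched against a relabelled copy of itself.
n = 20
path = np.zeros((n, n))
for i in range(n - 1):
    path[i, i + 1] = path[i + 1, i] = 1.0
perm = np.random.default_rng(1).permutation(n)
path_b = path[np.ix_(perm, perm)]
# Up to the eigenvector sign ambiguity, this recovers the inverse of perm
# (or the correspondence of the mirrored path).
print(spectral_match(path, path_b))
```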

  9. Template optimization and transfer in perceptual learning.

    PubMed

    Kurki, Ilmari; Hyvärinen, Aapo; Saarinen, Jussi

    2016-08-01

    We studied how learning changes the processing of a low-level Gabor stimulus, using a classification-image method (psychophysical reverse correlation) and a task where observers discriminated between slight differences in the phase (relative alignment) of a target Gabor in visual noise. The method estimates the internal "template" that describes how the visual system weights the input information for decisions. One popular idea has been that learning makes the template more like an ideal Bayesian weighting; however, the evidence has been indirect. We used a new regression technique to directly estimate the template weight change and to test whether the direction of reweighting is significantly different from an optimal learning strategy. The subjects trained the task for six daily sessions, and we tested the transfer of training to a target in an orthogonal orientation. Strong learning and partial transfer were observed. We tested whether task precision (difficulty) had an effect on template change and transfer: Observers trained in either a high-precision (small, 60° phase difference) or a low-precision task (180°). Task precision did not have an effect on the amount of template change or transfer, suggesting that task precision per se does not determine whether learning generalizes. Classification images show that training made observers use more task-relevant features and unlearn some irrelevant features. The transfer templates resembled partially optimized versions of templates in training sessions. The template change direction resembles ideal learning significantly but not completely. The amount of template change was highly correlated with the amount of learning.
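
    For readers unfamiliar with the classification-image method mentioned above, the following sketch shows the textbook estimator on a simulated observer: regress (or average) trial-by-trial responses against the external noise fields to recover the internal template. It is not the authors' specific regression technique, and the simulated template and noise levels are arbitrary assumptions.

```python
# Classification-image (reverse correlation) sketch with a simulated linear observer.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_pixels = 5000, 64
true_template = np.sin(np.linspace(0, np.pi, n_pixels))   # assumed internal template
noise = rng.normal(size=(n_trials, n_pixels))              # external noise on each trial
# Simulated decision variable: template-weighted noise plus internal noise.
responses = np.sign(noise @ true_template + rng.normal(scale=2.0, size=n_trials))

# Classic classification image: mean noise on "yes" trials minus "no" trials.
class_image = noise[responses > 0].mean(axis=0) - noise[responses < 0].mean(axis=0)

# Equivalent least-squares view: regress the responses on the noise pixels.
weights, *_ = np.linalg.lstsq(noise, responses, rcond=None)

# Both estimates should correlate strongly with the true template.
print(np.corrcoef(class_image, true_template)[0, 1],
      np.corrcoef(weights, true_template)[0, 1])
```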

  10. Templated Growth of Magnetic Recording Media

    NASA Astrophysics Data System (ADS)

    Sundar, Vignesh

    Current and potential next-generation magnetic recording technologies are based on the writing and reading of bits on a magnetic thin film with a granular microstructure, with grains of the magnetic material surrounded by an amorphous segregant. In order to realize the highest achievable data storage capabilities, there is a need for better control of the magnetic media microstructure, particularly in terms of minimizing grain size and grain boundary thickness distributions. In this work, a guided magnetic media growth is attempted by creating a pre-fabricated template with a specific material and morphology. The template is designed in such a way that, when magnetic media consisting of the magnetic alloy and segregant are sputtered, the sites on the template result in a controlled two-phase growth of magnetic media. The template is fabricated using self-assembling block copolymers, which can be used to fabricate nanostructures with a regular hexagonal lattice of spheres of one block in the other's matrix. These are then used as etch-masks to fabricate the template. In this thesis, we describe the approach used to fabricate these templates and demonstrate the two-phase growth of magnetic recording media. In such an approach, the magnetic grain size is defined by the uniform pitch of the block copolymer pattern, resulting in a uniform microstructure with much better grain size distribution than can be obtained with conventional un-templated media growth. The templated growth technique is also a suitable additive technique for the fabrication of Bit Patterned Media, another potential next-generation technology wherein the magnetic bits are isolated patterned islands. Combining nanoimprint lithography with templated growth, we can generate a long range spatially ordered array of magnetic islands with no etching of the magnetic material.

  11. Nonlinear matching measure for the analysis of on-off type DNA microarray images

    NASA Astrophysics Data System (ADS)

    Kim, Jong D.; Park, Misun; Kim, Jongwon

    2003-07-01

    In this paper, we propose a new nonlinear matching measure for automatic analysis of the on-off type DNA microarray images in which the hybridized spots are detected by the template matching method. The targeting spots of HPV DNA chips are designed for genotyping the human papillomavirus (HPV). The proposed measure is obtained by binary thresholding over the whole template region and counting the number of white pixels inside the spotted area. This measure is evaluated in terms of the accuracy of the estimated marker location and shows better performance than the normalized covariance.
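
    An illustrative version of the proposed measure as described: binarize the candidate window and count white pixels inside the spot mask. The threshold rule and circular mask geometry below are assumptions, not details taken from the paper.

```python
# White-pixel count inside a spot mask after binary thresholding (toy version).
import numpy as np

def white_pixel_measure(window, spot_radius, threshold=None):
    """window: 2-D intensity array centred on a candidate spot location."""
    if threshold is None:
        threshold = window.mean()            # simple global threshold (assumed)
    binary = window > threshold
    h, w = window.shape
    yy, xx = np.mgrid[:h, :w]
    mask = (yy - h // 2) ** 2 + (xx - w // 2) ** 2 <= spot_radius ** 2
    return int(np.count_nonzero(binary & mask))

# Toy usage: a bright synthetic spot on a noisy background.
rng = np.random.default_rng(0)
win = rng.normal(0.2, 0.05, size=(21, 21))
yy, xx = np.mgrid[:21, :21]
win[(yy - 10) ** 2 + (xx - 10) ** 2 <= 25] += 0.8   # the "hybridized" spot
print(white_pixel_measure(win, spot_radius=6))
```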

  12. Tunable compression of template banks for fast gravitational-wave detection and localization

    NASA Astrophysics Data System (ADS)

    Chua, Alvin J. K.; Gair, Jonathan R.

    2016-06-01

    One strategy for reducing the online computational cost of matched-filter searches for gravitational waves is to introduce a compressed basis for the waveform template bank in a grid-based search. In this paper, we propose and investigate several tunable compression schemes for a general template bank. Through offline compression, such schemes are shown to yield faster detection and localization of signals, along with moderately improved sensitivity and accuracy over coarsened banks at the same level of computational cost. This is potentially useful for any search involving template banks, and especially in the analysis of data from future space-based detectors such as eLISA, for which online grid searches are difficult due to the long-duration waveforms and large parameter spaces.
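
    One simple way to realize the offline compression idea described above is an SVD-based reduced basis: project the data once onto a small basis and approximate the overlaps with every template from that projection. The sketch below uses a toy family of sinusoidal "templates" rather than gravitational waveforms, and it is one generic scheme, not the specific tunable schemes proposed in the paper.

```python
# SVD-based compression of a toy template bank and approximate matched filtering.
import numpy as np

rng = np.random.default_rng(0)
n_templates, n_samples, rank = 500, 2048, 40

# Toy "template bank": unit-norm sinusoids over a narrow frequency band
# (a smooth, compressible family standing in for real waveform templates).
t = np.linspace(0.0, 1.0, n_samples)
freqs = np.linspace(20.0, 25.0, n_templates)
bank = np.sin(2 * np.pi * freqs[:, None] * t[None, :])
bank /= np.linalg.norm(bank, axis=1, keepdims=True)

# Offline: SVD of the bank; keep the leading right-singular vectors as a basis.
_, _, vt = np.linalg.svd(bank, full_matrices=False)
basis = vt[:rank]                       # (rank, n_samples) compressed basis
coeffs = bank @ basis.T                 # each template's coordinates in the basis

# Online: project the data once onto the small basis...
data = bank[123] + 0.01 * rng.normal(size=n_samples)
data_coeffs = basis @ data              # only `rank` inner products with the data

# ...then approximate all template overlaps from that single projection.
approx = coeffs @ data_coeffs
exact = bank @ data
print(np.abs(approx - exact).max())                 # overlap approximation error
print(int(approx.argmax()), int(exact.argmax()))    # both searches pick the same template
```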

  13. Advanced High-Level Waste Glass Research and Development Plan

    SciTech Connect

    Peeler, David K.; Vienna, John D.; Schweiger, Michael J.; Fox, Kevin M.

    2015-07-01

    The U.S. Department of Energy Office of River Protection (ORP) has implemented an integrated program to increase the loading of Hanford tank wastes in glass while meeting melter lifetime expectancies and process, regulatory, and product quality requirements. The integrated ORP program is focused on providing a technical, science-based foundation from which key decisions can be made regarding the successful operation of the Hanford Tank Waste Treatment and Immobilization Plant (WTP) facilities. The fundamental data stemming from this program will support development of advanced glass formulations, key process control models, and tactical processing strategies to ensure safe and successful operations for both the low-activity waste (LAW) and high-level waste (HLW) vitrification facilities with an appreciation toward reducing overall mission life. The purpose of this advanced HLW glass research and development plan is to identify the near-, mid-, and longer-term research and development activities required to develop and validate advanced HLW glasses and their associated models to support facility operations at WTP, including both direct feed and full pretreatment flowsheets. This plan also integrates technical support of facility operations and waste qualification activities to show the interdependence of these activities with the advanced waste glass (AWG) program to support the full WTP mission. Figure ES-1 shows these key ORP programmatic activities and their interfaces with both WTP facility operations and qualification needs. The plan is a living document that will be updated to reflect key advancements and mission strategy changes. The research outlined here is motivated by the potential for substantial economic benefits (e.g., significant increases in waste throughput and reductions in glass volumes) that will be realized when advancements in glass formulation continue and models supporting facility operations are implemented. Developing and applying advanced

  14. Templated Native Silk Smectic Gels

    NASA Technical Reports Server (NTRS)

    Jin, Hyoung-Joon (Inventor); Park, Jae-Hyung (Inventor); Valluzzi, Regina (Inventor)

    2016-01-01

    One aspect of the present invention relates to a method of preparing a fibrous protein smectic hydrogel by way of a solvent templating process, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; and collecting the resulting fibrous protein smectic hydrogel and allowing it to dry. Another aspect of the present invention relates to a method of obtaining predominantly one enantiomer from a racemic mixture, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; allowing the enantiomers of racemic mixture to diffuse selectively into the smectic hydrogel in solution; removing the smectic hydrogel from the solution; rinsing predominantly one enantiomer from the surface of the smectic hydrogel; and extracting predominantly one enantiomer from the interior of the smectic hydrogel. The present invention also relates to a smectic hydrogel prepared according to an aforementioned method.

  15. Templated Native Silk Smectic Gels

    NASA Technical Reports Server (NTRS)

    Jin, Hyoung-Joon (Inventor); Park, Jae-Hyung (Inventor); Valluzzi, Regina (Inventor)

    2013-01-01

    One aspect of the present invention relates to a method of preparing a fibrous protein smectic hydrogel by way of a solvent templating process, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; and collecting the resulting fibrous protein smectic hydrogel and allowing it to dry. Another aspect of the present invention relates to a method of obtaining predominantly one enantiomer from a racemic mixture, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; allowing the enantiomers of racemic mixture to diffuse selectively into the smectic hydrogel in solution; removing the smectic hydrogel from the solution; rinsing predominantly one enantiomer from the surface of the smectic hydrogel; and extracting predominantly one enantiomer from the interior of the smectic hydrogel. The present invention also relates to a smectic hydrogel prepared according to an aforementioned method.

  16. Biomineralization Guided by Paper Templates

    PubMed Central

    Camci-Unal, Gulden; Laromaine, Anna; Hong, Estrella; Derda, Ratmir; Whitesides, George M.

    2016-01-01

    This work demonstrates the fabrication of partially mineralized scaffolds in 3D shapes by folding paper and by supporting the deposition of calcium phosphate by osteoblasts cultured in these scaffolds. This process generates centimeter-scale free-standing structures composed of paper supporting regions of calcium phosphate deposited by osteoblasts. This work is the first demonstration that paper can be used as a scaffold to induce template-guided mineralization by osteoblasts. Because paper has a porous structure, it allows transport of O2 and nutrients across its entire thickness. Paper supports a uniform distribution of cells upon seeding in hydrogel matrices, and allows growth, remodelling, and proliferation of cells. Scaffolds made of paper make it possible to construct 3D tissue models easily by tuning material properties such as thickness, porosity, and density of chemical functional groups. Paper offers a new approach to study mechanisms of biomineralization, and perhaps ultimately new techniques to guide or accelerate the repair of bone. PMID:27277575

  17. Templated native silk smectic gels

    NASA Technical Reports Server (NTRS)

    Jin, Hyoung-Joon (Inventor); Park, Jae-Hyung (Inventor); Valluzzi, Regina (Inventor)

    2009-01-01

    One aspect of the present invention relates to a method of preparing a fibrous protein smectic hydrogel by way of a solvent templating process, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; and collecting the resulting fibrous protein smectic hydrogel and allowing it to dry. Another aspect of the present invention relates to a method of obtaining predominantly one enantiomer from a racemic mixture, comprising the steps of pouring an aqueous fibrous protein solution into a container comprising a solvent that is not miscible with water; sealing the container and allowing it to age at about room temperature; allowing the enantiomers of racemic mixture to diffuse selectively into the smectic hydrogel in solution; removing the smectic hydrogel from the solution; rinsing predominantly one enantiomer from the surface of the smectic hydrogel; and extracting predominantly one enantiomer from the interior of the smectic hydrogel. The present invention also relates to a smectic hydrogel prepared according to an aforementioned method.

  18. Templating irreversible covalent macrocyclization by using anions.

    PubMed

    Kataev, Evgeny A; Kolesnikov, Grigory V; Arnold, Rene; Lavrov, Herman V; Khrustalev, Victor N

    2013-03-11

    Inorganic anions were used as templates in the reaction between a diamine and an activated diacid to form macrocyclic amides. The reaction conditions were chosen so that the macrocyclization proceeds slowly enough for a template effect to be observed. A number of analytical methods were used to clarify the reaction mechanisms and to show that the structure of the intermediate plays a decisive role in determining the product distribution. For the macrocyclization under kinetic control, it was shown that the amount of template, the conformational rigidity of the building blocks, and the anion affinities of the reaction components and intermediates are important parameters that one should take into consideration to achieve high yields.

  19. NPTFit: Non-Poissonian Template Fitting

    NASA Astrophysics Data System (ADS)

    Safdi, Benjamin R.; Rodd, Nicholas L.; Mishra-Sharma, Siddharth

    2017-05-01

    NPTFit is a specialized Python/Cython package that implements Non-Poissonian Template Fitting (NPTF), originally developed for characterizing populations of unresolved point sources. It offers fast evaluation of likelihoods for NPTF analyses and has an easy-to-use interface for performing non-Poissonian (as well as standard Poissonian) template fits using MultiNest (ascl:1109.006) or other inference tools. It allows inclusion of an arbitrary number of point source templates, with an arbitrary number of degrees of freedom in the modeled flux distribution, and has modules for analyzing and plotting the results of an NPTF.
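
    For context, the sketch below shows a plain, purely Poissonian template fit on simulated counts, the kind of likelihood that NPTF generalizes. It deliberately does not use the NPTFit package or its non-Poissonian point-source likelihood; the templates, data, and optimizer choice are all illustrative assumptions.

```python
# Ordinary Poissonian template fit on a simulated counts map.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n_pix = 1000
templates = np.stack([np.ones(n_pix),                       # isotropic template
                      np.exp(-np.linspace(0, 5, n_pix))])   # toy "disk" template
true_norms = np.array([2.0, 30.0])
counts = rng.poisson(true_norms @ templates)                 # observed counts per pixel

def neg_log_like(log_norms):
    mu = np.exp(log_norms) @ templates                       # expected counts per pixel
    return float(np.sum(mu - counts * np.log(mu)))           # -log L up to a constant

result = minimize(neg_log_like, x0=np.zeros(2), method="Nelder-Mead")
print(np.exp(result.x))        # should be close to true_norms
```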

  20. A PANOPLY OF CEPHEID LIGHT CURVE TEMPLATES

    SciTech Connect

    Yoachim, Peter; McCommas, Les P.; Dalcanton, Julianne J.; Williams, Benjamin F.

    2009-06-15

    We have generated accurate V and I template light curves using a combination of Fourier decomposition and principal component analysis for a large sample of Cepheid light curves. Unlike previous studies, we include short-period Cepheids and stars pulsating in the first overtone mode in our analysis. Extensive Monte Carlo simulations show that our templates can be used to precisely measure Cepheid magnitudes and periods, even in cases where there are few observational epochs. These templates are ideal for characterizing serendipitously discovered Cepheids and can be used in conjunction with surveys such as Pan-STARRS and LSST, where the observational sampling may not be optimized for Cepheids.
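
    The recipe described above (Fourier decomposition followed by principal component analysis) can be sketched on synthetic data as follows: fit a low-order Fourier series to each phase-folded light curve, then run a PCA over the fitted curves to extract a few template shapes. The synthetic light curves and harmonic order are assumptions; this is not the paper's actual template set.

```python
# Fourier fits of synthetic phase-folded light curves followed by PCA templates.
import numpy as np

rng = np.random.default_rng(0)
n_stars, n_obs, n_harm = 200, 60, 4
phase_grid = np.linspace(0, 1, 100)

def fourier_design(phase, n_harm):
    cols = [np.ones_like(phase)]
    for k in range(1, n_harm + 1):
        cols += [np.cos(2 * np.pi * k * phase), np.sin(2 * np.pi * k * phase)]
    return np.stack(cols, axis=1)

fits = []
for _ in range(n_stars):
    phase = rng.random(n_obs)
    # Synthetic light curve with star-to-star shape variation and photometric noise.
    mag = np.sin(2 * np.pi * phase) + rng.normal(0, 0.3) * np.sin(4 * np.pi * phase)
    mag += rng.normal(0, 0.05, n_obs)
    coeffs, *_ = np.linalg.lstsq(fourier_design(phase, n_harm), mag, rcond=None)
    fits.append(fourier_design(phase_grid, n_harm) @ coeffs)   # smooth fit on a phase grid

fits = np.array(fits)
mean_curve = fits.mean(axis=0)
_, _, vt = np.linalg.svd(fits - mean_curve, full_matrices=False)
templates = vt[:3]             # first few principal light-curve shapes
print(templates.shape)         # (3, 100)
```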

  1. Template boundary definition in Tetrahymena telomerase.

    PubMed

    Lai, Cary K; Miller, Michael C; Collins, Kathleen

    2002-02-15

    Telomerase elongates chromosome ends by addition of telomeric DNA repeats. The telomerase ribonucleoprotein can copy only a short template sequence within the telomerase RNA subunit. Here, we identify a region of telomerase RNA that is necessary for both correct 5' template boundary definition and high affinity telomerase reverse transcriptase (TERT) interaction. We also demonstrate that TERT mutants in the RNA binding domain compromise both 5' boundary definition and RNA binding. Our results indicate that sequence-specific interaction of a telomerase RNA element with the TERT RNA binding domain, not the active site motifs, defines the template boundary.

  2. The Role of Color in Search Templates for Real-world Target Objects.

    PubMed

    Nako, Rebecca; Smith, Tim J; Eimer, Martin

    2016-11-01

    During visual search, target representations (attentional templates) control the allocation of attention to template-matching objects. The activation of new attentional templates can be prompted by verbal or pictorial target specifications. We measured the N2pc component of the ERP as a temporal marker of attentional target selection to determine the role of color signals in search templates for real-world search target objects that are set up in response to word or picture cues. On each trial run, a word cue (e.g., "apple") was followed by three search displays that contained the cued target object among three distractors. The selection of the first target was based on the word cue only, whereas selection of the two subsequent targets could be controlled by templates set up after the first visual presentation of the target (picture cue). In different trial runs, search displays either contained objects in their natural colors or monochromatic objects. These two display types were presented in different blocks (Experiment 1) or in random order within each block (Experiment 2). RTs were faster, and target N2pc components emerged earlier for the second and third display of each trial run relative to the first display, demonstrating that pictures are more effective than word cues in guiding search. N2pc components were triggered more rapidly for targets in the second and third display in trial runs with colored displays. This demonstrates that when visual target attributes are fully specified by picture cues, the additional presence of color signals in target templates facilitates the speed with which attention is allocated to template-matching objects. No such selection benefits for colored targets were found when search templates were set up in response to word cues. Experiment 2 showed that color templates activated by word cues can even impair the attentional selection of noncolored targets. Results provide new insights into the status of color during the guidance

  3. Corrosion issues in high-level nuclear waste containers

    NASA Astrophysics Data System (ADS)

    Asl, Samin Sharifi

    In this dissertation, different aspects of the corrosion and electrochemistry of copper, the candidate canister material in the Scandinavian high-level nuclear waste disposal program, including the thermodynamics and kinetics of the reactions that are predicted to occur in the practical system, have been studied. A comprehensive thermodynamic study of copper in contact with granitic groundwater of the type and composition that is expected in the Forsmark repository in Sweden has been performed. Our primary objective was to ascertain whether copper would exist in the thermodynamically immune state in the repository, in which case corrosion could not occur and the issue of corrosion in the assessment of the storage technology would be moot. In spite of the fact that metallic copper has been found to exist for geological times in granitic geological formations, copper is well-known to be activated from the immune state to corrode by specific species that may exist in the environment. The principal activator of copper is known to be sulfur in its various forms, including sulfide (H2S, HS-, S2-), polysulfide (H2Sx, HSx-, Sx2-), polysulfur thiosulfate (SxO32-), and polythionates (SxO62-). A comprehensive study of this aspect of copper chemistry has never been reported, and yet an understanding of this issue is vital for assessing whether copper is a suitable material for fabricating canisters for the disposal of HLNW. Our study identifies and explores those species that activate copper; these species include sulfur-containing entities as well as other, non-sulfur species that may be present in the repository. The effects of temperature, solution pH, and hydrogen pressure on the kinetics of the hydrogen electrode reaction (HER) on copper in borate buffer solution have been studied by means of steady-state polarization measurements and electrochemical impedance spectroscopy (EIS). In order to obtain electrokinetic parameters, such as the exchange current density and the

  4. Templated and template-free fabrication strategies for zero-dimensional hollow MOF superstructures.

    PubMed

    Kim, Hyehyun; Lah, Myoung Soo

    2017-05-16

    Various fabrication strategies for hollow metal-organic framework (MOF) superstructures are reviewed and classified using various types of external templates and their properties. Hollow MOF superstructures have also been prepared without external templates, wherein unstable intermediates obtained during reactions convert to the final hollow MOF superstructures. Many hollow MOF superstructures have been fabricated using hard templates. After the core-shell core@MOF structure was prepared using a hard template, the core was selectively etched to generate a hollow MOF superstructure. Another approach for generating hollow superstructures is to use a solid reactant as a sacrificial template; this method requires no additional etching process. Soft templates such as discontinuous liquid/emulsion droplets and gas bubbles in a continuous soft phase have also been employed to prepare hollow MOF superstructures.

  5. Dazomet Fumigant Management Plan Phase 2 Templates

    EPA Pesticide Factsheets

    These templates provide a framework for reporting application block information, buffer zones, emergency response plan, tarp plan, soil conditions, air monitoring, and more for pesticide products containing the active ingredient dazomet, such as Basamid G.

  6. Template definition by Tetrahymena telomerase reverse transcriptase.

    PubMed

    Miller, M C; Liu, J K; Collins, K

    2000-08-15

    The ribonucleoprotein enzyme telomerase extends chromosome ends by copying a specific template sequence within its integral RNA component. An active recombinant telomerase RNP is minimally composed of this RNA and the telomerase reverse transcriptase (TERT) protein, which contains sequence motifs conserved among viral reverse transcriptases (RTs), flanked by N- and C-terminal extensions specific to TERTs. We have used site-directed mutagenesis to explore the roles of Tetrahymena TERT in determining features of telomerase activity in general and in establishing the boundaries and use of an internal RNA template in specific. We identify a new ciliate-specific motif in the TERT N-terminus required for template definition. Moreover, several residues in reverse transcriptase motifs 1, 2, A and D are critical for specific aspects of internal template use. Our results indicate that the unique specificity of telomerase activity is conferred to a reverse transcriptase active site by TERT residues both within and beyond the RT motif region.

  7. Water Quality Exchange Web Template User Guide

    EPA Pesticide Factsheets

    This is a step by step guide to using the WQX Web Monitoring Data Entry Template for Physical/Chemical data to prepare your data for import into the WQX Web tool, and subsequent transfer to the STORET Data Warehouse.

  8. Nucleic Acid Templated Reactions for Chemical Biology.

    PubMed

    Di Pisa, Margherita; Seitz, Oliver

    2017-06-21

    Nucleic acid directed bioorthogonal reactions offer the fascinating opportunity to unveil and redirect a plethora of intracellular mechanisms. Nano- to picomolar amounts of specific RNA molecules serve as templates and catalyze the selective formation of molecules that 1) exert biological effects, or 2) provide measurable signals for RNA detection. Turnover of reactants on the template is a valuable asset when concentrations of RNA templates are low. The idea is to use RNA-templated reactions to fully control the biodistribution of drugs and to push the detection limits of DNA or RNA analytes to extraordinary sensitivities. Herein we review recent and instructive examples of conditional synthesis or release of compounds for in cellulo protein interference and intracellular nucleic acid imaging. © 2017 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  9. Using wavelets to learn pattern templates

    NASA Astrophysics Data System (ADS)

    Scott, Clayton D.; Nowak, Robert D.

    2002-07-01

    Despite the success of wavelet decompositions in other areas of statistical signal and image processing, current wavelet-based image models are inadequate for modeling patterns in images, due to the presence of unknown transformations (e.g., translation, rotation, location of lighting source) inherent in most pattern observations. In this paper we introduce a hierarchical wavelet-based framework for modeling patterns in digital images. This framework takes advantage of the efficient image representations afforded by wavelets, while accounting for unknown translation and rotation. Given a trained model, we can use this framework to synthesize pattern observations. If the model parameters are unknown, we can infer them from labeled training data using TEMPLAR (Template Learning from Atomic Representations), a novel template learning algorithm with linear complexity. TEMPLAR employs minimum description length (MDL) complexity regularization to learn a template with a sparse representation in the wavelet domain. We discuss several applications, including template learning, pattern classification, and image registration.
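
    A toy flavour of wavelet-domain template learning: average the wavelet coefficients of pre-aligned training images and keep only the largest-magnitude coefficients, giving a sparse template. This stands in for the general idea only; it is not TEMPLAR's MDL-regularized algorithm and it ignores the unknown translation and rotation that TEMPLAR models. It assumes the PyWavelets package is available.

```python
# Sparse wavelet-domain template from aligned training images (toy version).
import numpy as np
import pywt

def sparse_wavelet_template(images, wavelet="db2", level=3, keep=200):
    """images: iterable of equally sized 2-D arrays, assumed pre-aligned."""
    stacked, slices = [], None
    for img in images:
        coeffs = pywt.wavedec2(img, wavelet, level=level)
        arr, slices = pywt.coeffs_to_array(coeffs)
        stacked.append(arr)
    mean_arr = np.mean(stacked, axis=0)
    # Zero out everything except the `keep` largest-magnitude mean coefficients.
    cutoff = np.sort(np.abs(mean_arr).ravel())[-keep]
    sparse_arr = np.where(np.abs(mean_arr) >= cutoff, mean_arr, 0.0)
    coeffs = pywt.array_to_coeffs(sparse_arr, slices, output_format="wavedec2")
    return pywt.waverec2(coeffs, wavelet)    # template back in the pixel domain

# Toy usage: noisy copies of a square pattern.
rng = np.random.default_rng(0)
pattern = np.zeros((64, 64))
pattern[20:44, 20:44] = 1.0
train = [pattern + rng.normal(0, 0.5, pattern.shape) for _ in range(30)]
print(sparse_wavelet_template(train).shape)
```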

  10. Is Matching Innate?

    ERIC Educational Resources Information Center

    Gallistel, C. R.; King, Adam Philip; Gottlieb, Daniel; Balci, Fuat; Papachristos, Efstathios B.; Szalecki, Matthew; Carbone, Kimberly S.

    2007-01-01

    Experimentally naive mice matched the proportions of their temporal investments (visit durations) in two feeding hoppers to the proportions of the food income (pellets per unit session time) derived from them in three experiments that varied the coupling between the behavioral investment and food income, from no coupling to strict coupling.…

  11. Dewey Concentration Match.

    ERIC Educational Resources Information Center

    School Library Media Activities Monthly, 1996

    1996-01-01

    Giving students a chance to associate numbers with subjects can be useful in speeding their location of desired print or nonprint materials and helping students feel independent when browsing. A matching game for helping students learn the Dewey numbers is presented. Instructions for the library media specialist or teacher, instructions for…

  12. School Match Revisited.

    ERIC Educational Resources Information Center

    Lynch, Mary Jo; Weeks, Ann

    1988-01-01

    Reports an interview with the director of School Match, a firm that has created a database of information on public and private schools to assist relocated families in identifying the best local schools. The firm's finding that library and media services' expenditures relate most significantly to student achievement is discussed. (CLB)

  13. Is matching innate?

    PubMed

    Gallistel, C R; King, Adam Philip; Gottlieb, Daniel; Balci, Fuat; Papachristos, Efstathios B; Szalecki, Matthew; Carbone, Kimberly S

    2007-03-01

    Experimentally naive mice matched the proportions of their temporal investments (visit durations) in two feeding hoppers to the proportions of the food income (pellets per unit session time) derived from them in three experiments that varied the coupling between the behavioral investment and food income, from no coupling to strict coupling. Matching was observed from the outset; it did not improve with training. When the numbers of pellets received were proportional to time invested, investment was unstable, swinging abruptly from sustained, almost complete investment in one hopper, to sustained, almost complete investment in the other, in the absence of appropriate local fluctuations in returns (pellets obtained per time invested). The abruptness of the swings strongly constrains possible models. We suggest that matching reflects an innate (unconditioned) program that matches the ratio of expected visit durations to the ratio between the current estimates of expected incomes. A model that processes the income stream looking for changes in the income and generates discontinuous income estimates when a change is detected is shown to account for salient features of the data.

  14. Is Matching Innate?

    ERIC Educational Resources Information Center

    Gallistel, C. R.; King, Adam Philip; Gottlieb, Daniel; Balci, Fuat; Papachristos, Efstathios B.; Szalecki, Matthew; Carbone, Kimberly S.

    2007-01-01

    Experimentally naive mice matched the proportions of their temporal investments (visit durations) in two feeding hoppers to the proportions of the food income (pellets per unit session time) derived from them in three experiments that varied the coupling between the behavioral investment and food income, from no coupling to strict coupling.…

  15. Factorized Graph Matching.

    PubMed

    Zhou, Feng; de la Torre, Fernando

    2015-11-19

    Graph matching (GM) is a fundamental problem in computer science, and it plays a central role in solving correspondence problems in computer vision. GM problems that incorporate pairwise constraints can be formulated as a quadratic assignment problem (QAP). Although widely used, solving the correspondence problem through GM has two main limitations: (1) the QAP is NP-hard and difficult to approximate; (2) GM algorithms do not incorporate geometric constraints between nodes that are natural in computer vision problems. To address the aforementioned problems, this paper proposes factorized graph matching (FGM). FGM factorizes the large pairwise affinity matrix into smaller matrices that encode the local structure of each graph and the pairwise affinity between edges. Four benefits follow from this factorization: (1) There is no need to compute the costly (in space and time) pairwise affinity matrix; (2) The factorization allows the use of a path-following optimization algorithm that leads to improved optimization strategies and matching performance; (3) Given the factorization, it becomes straightforward to incorporate geometric transformations (rigid and non-rigid) into the GM problem; (4) Using a matrix formulation for the GM problem and the factorization, it is easy to reveal commonalities and differences between different GM methods. The factorization also provides a clean connection with other matching algorithms such as iterative closest point. Experimental results on synthetic and real databases illustrate how FGM outperforms state-of-the-art algorithms for GM. The code is available at http://humansensing.cs.cmu.edu/fgm.
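
    To make the QAP objective concrete, the sketch below runs the classic spectral relaxation on two tiny point sets: build the full pairwise affinity matrix over candidate correspondences, take its leading eigenvector, and discretize greedily. This is exactly the baseline whose cost FGM avoids (FGM never forms the full affinity matrix); it is shown for illustration, not as the factorized method, and the affinity kernel is an assumption.

```python
# Spectral relaxation of the graph-matching QAP on tiny point sets (baseline only).
import numpy as np

def spectral_graph_match(pts_a, pts_b, sigma=0.5):
    """Match point sets using pairwise distance consistency as the edge affinity."""
    n, m = len(pts_a), len(pts_b)
    cand = [(i, j) for i in range(n) for j in range(m)]       # candidate correspondences
    K = np.zeros((len(cand), len(cand)))
    for a, (i, j) in enumerate(cand):
        for b, (k, l) in enumerate(cand):
            if i == k or j == l:
                continue                                      # conflicting pairs get zero
            d_a = np.linalg.norm(pts_a[i] - pts_a[k])
            d_b = np.linalg.norm(pts_b[j] - pts_b[l])
            K[a, b] = np.exp(-(d_a - d_b) ** 2 / sigma ** 2)
    # The leading eigenvector of K scores each candidate correspondence.
    _, v = np.linalg.eigh(K)
    score = np.abs(v[:, -1])
    # Greedy one-to-one discretization of the relaxed solution.
    matches, used_i, used_j = {}, set(), set()
    for a in np.argsort(-score):
        i, j = cand[a]
        if i not in used_i and j not in used_j:
            matches[i] = j
            used_i.add(i)
            used_j.add(j)
    return matches

# Toy usage: a point set matched to a rotated, shuffled copy of itself.
rng = np.random.default_rng(0)
pts_a = rng.random((6, 2))
theta = 0.4
rot = np.array([[np.cos(theta), -np.sin(theta)], [np.sin(theta), np.cos(theta)]])
perm = rng.permutation(6)
pts_b = (pts_a @ rot.T)[perm]
print(spectral_graph_match(pts_a, pts_b))   # should recover the inverse permutation
```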

  16. Inter-image matching

    NASA Technical Reports Server (NTRS)

    Wolfe, R. H., Jr.; Juday, R. D.

    1982-01-01

    Interimage matching is the process of determining the geometric transformation required to conform spatially one image to another. In principle, the parameters of that transformation are varied until some measure of the difference between the two images is minimized or some measure of sameness (e.g., cross-correlation) is maximized. The number of such parameters to vary is fairly large (six for merely an affine transformation), and it is customary either to attempt an a priori transformation reducing the complexity of the residual transformation or to subdivide the image into match zones (control points or patches) small enough that a simple transformation (e.g., pure translation) is applicable, yet large enough to facilitate matching. In the latter case, a complex mapping function is fit to the results (e.g., translation offsets) in all the patches. The methods reviewed have all chosen one or both of the above options, ranging from a priori along-line correction for line-dependent effects (the high-frequency correction) to a full sensor-to-geobase transformation with subsequent subdivision into a grid of match points.
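
    The "pure translation within a small match patch" case described above can be sketched as an exhaustive search over offsets that maximizes normalized cross-correlation. The sketch below is illustrative only, with assumed array shapes and a brute-force search; it is not taken from any of the reviewed methods.

        import numpy as np

        def ncc(a, b):
            """Normalized cross-correlation between two equally shaped patches."""
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return (a * b).sum() / denom if denom > 0 else 0.0

        def match_patch(patch, search, step=1):
            """Return ((dy, dx), score) of the best-matching position of patch inside search."""
            ph, pw = patch.shape
            best, best_off = -np.inf, (0, 0)
            for y in range(0, search.shape[0] - ph + 1, step):
                for x in range(0, search.shape[1] - pw + 1, step):
                    score = ncc(patch, search[y:y + ph, x:x + pw])
                    if score > best:
                        best, best_off = score, (y, x)
            return best_off, best

    A complex mapping function can then be fit to the per-patch offsets returned by such a search, as the abstract describes.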

  17. Derivatives of Matching.

    ERIC Educational Resources Information Center

    Herrnstein, R. J.

    1979-01-01

    The matching law for reinforced behavior solves a differential equation relating infinitesimal changes in behavior to infinitesimal changes in reinforcement. The equation expresses plausible conceptions of behavior and reinforcement, yields a simple nonlinear operator model for acquisition, and suggests an alternative to the economic law of…
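
    For context, the matching law referred to in this abstract is conventionally written as

        \[
        \frac{B_1}{B_1 + B_2} = \frac{R_1}{R_1 + R_2},
        \qquad\text{equivalently}\qquad
        \frac{B_1}{B_2} = \frac{R_1}{R_2},
        \]

    where B1 and B2 are the response (or time-allocation) rates on the two alternatives and R1 and R2 are the obtained reinforcement rates.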

  18. Matching Supernovae to Galaxies

    NASA Astrophysics Data System (ADS)

    Kohler, Susanna

    2016-12-01

    Gupta and collaborators developed a new automated algorithm for matching supernovae to their host galaxies. Their work builds on existing algorithms, makes use of information about nearby galaxies, accounts for the uncertainty of the match, and includes a machine-learning component to improve matching accuracy. They test the algorithm on catalogs of galaxies and simulated supernova events to quantify how well it recovers the true hosts. [Figure: the matching algorithm's accuracy (purity) as a function of the true supernova-host separation, the supernova redshift, the true host's brightness, and the true host's size; Gupta et al. 2016.] The authors find that when the basic algorithm is run on catalog data, it matches supernovae to their hosts with 91% accuracy; including the machine-learning component, which runs after the initial matching algorithm, improves the accuracy to 97%. The encouraging results of this work, which was intended as a proof of concept, suggest that similar methods could prove very practical for tackling future survey data. The method explored here also has use beyond matching supernovae to their host galaxies: it could be applied to other extragalactic transients, such as gamma-ray bursts, tidal disruption events, or electromagnetic counterparts to gravitational-wave detections. Citation: Ravi R. Gupta et al. 2016, AJ, 152, 154. doi:10.3847/0004-6256/152/6/154
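
    One common heuristic in host-galaxy matching, shown here only to illustrate the general idea (it is not a reproduction of Gupta et al.'s algorithm), is to rank candidate galaxies by their angular separation from the supernova normalized by the galaxy's apparent size. A minimal Python sketch, with assumed field names and small-angle geometry:

        import math

        def normalized_separation(sn_ra, sn_dec, gal_ra, gal_dec, gal_radius_arcsec):
            """Small-angle separation (arcsec) divided by the galaxy's apparent radius."""
            dra = (sn_ra - gal_ra) * math.cos(math.radians(sn_dec)) * 3600.0
            ddec = (sn_dec - gal_dec) * 3600.0
            return math.hypot(dra, ddec) / gal_radius_arcsec

        def best_host(sn, galaxies):
            """Pick the candidate galaxy with the smallest size-normalized separation."""
            return min(galaxies, key=lambda g: normalized_separation(
                sn["ra"], sn["dec"], g["ra"], g["dec"], g["radius_arcsec"]))

    Coordinates are in degrees; a machine-learning stage such as the one described above would then re-score or veto these geometric matches.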

  19. The Template: A Way To Control

    ERIC Educational Resources Information Center

    Schueneman, Margot

    1977-01-01

    When beginning students first attempt coil pots, there is a tendency to rely on the design of the coil to cover up any irregularities in form. One of the ways to help students see whether or not a form is getting away from them is to use a template. Explains and demonstrates how the contour of the template helps to guide the placement of the…

  20. Lipid bilayers on nano-templates

    DOEpatents

    Noy, Aleksandr; Artyukhin, Alexander B.; Bakajin, Olgica; Stoeve, Pieter

    2009-08-04

    A lipid bilayer on a nano-template comprising a nanotube or nanowire and a lipid bilayer around the nanotube or nanowire. One embodiment provides a method of fabricating a lipid bilayer on a nano-template comprising the steps of providing a nanotube or nanowire and forming a lipid bilayer around the polymer cushion. One embodiment provides a protein pore in the lipid bilayer. In one embodiment the protein pore is sensitive to specific agents.

  1. Template synthesis of monodisperse carbon nanodots

    NASA Astrophysics Data System (ADS)

    Kurdyukov, D. A.; Eurov, D. A.; Stovpiaga, E. Yu.; Kirilenko, D. A.; Konyakhin, S. V.; Shvidchenko, A. V.; Golubev, V. G.

    2016-12-01

    Monodisperse carbon nanodots in pores of mesoporous silica particles are obtained by template synthesis. This method is based on introducing a precursor (organosilane) into pores, its thermal decomposition with formation of carbon nanodots, and the template removal. Structural analysis of the nanomaterial has been performed, which showed that carbon nanodots have an approximately spherical form and a graphite-like structure. According to dynamic light scattering data, the size of carbon nanodots is 3.3 ± 0.9 nm.

  2. Effect of digital template in the assistant of a giant condylar osteochondroma resection.

    PubMed

    Bai, Guo; He, Dongmei; Yang, Chi; Lu, Chuan; Huang, Dong; Chen, Minjie; Yuan, Jianbing

    2014-05-01

    In exostosis-type osteochondroma, the whole condyle is usually resected even if part of it is not involved. This study reports the effect of using a digital template to assist resection while protecting the uninvolved condyle. We used computer-aided design techniques to make the preoperative plan for a patient with a giant condylar osteochondroma of the exogenous type, including determining the boundary between the tumor and the articular surface of the condyle and designing the virtual tumor resection plane, the surgical approach, and the removal path of the tumor. The digital osteotomy template was made by rapid prototyping based on the preoperative plan. A postoperative CT scan was performed and merged with the preoperative CT in the Proplan 1.3 system to evaluate the accuracy of surgical resection guided by the digital template. The osteotomy template was attached accurately to the lateral surface of the condyle, and the tumor was removed completely under the guidance of the template without injury to adjacent nerves and vessels. Postoperative CT showed that the osteochondroma was removed completely and the unaffected articular surface of the condyle was preserved well. Merging the postoperative and preoperative CT in the Proplan 1.3 system showed that the outcome of the operation matched the preoperative plan well, with an error of 0.92 mm. There was no sign of recurrence after 6 months of follow-up. The application of a digital template can improve the accuracy of giant condylar tumor resection and help preserve the uninvolved condyle. The use of a digital template can also reduce injuries to nerves and vessels and save operating time.

  3. Anodic aluminum oxide-epoxy composite acoustic matching layers for ultrasonic transducer application.

    PubMed

    Fang, H J; Chen, Y; Wong, C M; Qiu, W B; Chan, H L W; Dai, J Y; Li, Q; Yan, Q F

    2016-08-01

    The goal of this work is to demonstrate the application of an anodic aluminum oxide (AAO) template as the matching layer of an ultrasonic transducer. The quarter-wavelength acoustic matching layer is a vital component of medical ultrasonic transducers, compensating the acoustic impedance mismatch between the piezoelectric element and the human body. The AAO matching layer is made of an anodic aluminum oxide template filled with epoxy resin, i.e., an AAO-epoxy 1-3 composite. Using this composite as the first matching layer, a ∼12 MHz ultrasonic transducer based on soft lead zirconate titanate piezoelectric ceramic is fabricated, and pulse-echo measurements show that the transducer exhibits very good performance, with a broad bandwidth of 68% (-6 dB) and a two-way insertion loss of -22.7 dB. A wire-phantom ultrasonic image is also used to evaluate the transducer's performance, and the results confirm the process feasibility and merit of the AAO-epoxy composite as a new matching material for ultrasonic transducer applications. This matching scheme provides a solution to the problems of the conventional 0-3 composite matching layer and suggests another useful application of AAO templates.
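
    The quarter-wavelength matching idea behind this work can be illustrated with two textbook relations: a single ideal matching layer has an acoustic impedance near the geometric mean of the piezoelectric element and the load, and a thickness of a quarter wavelength at the center frequency. The material values below are illustrative assumptions, not measurements from the paper.

        import math

        def ideal_matching_impedance(z_piezo, z_load):
            """Single quarter-wave layer: geometric mean of the two impedances (Rayl)."""
            return math.sqrt(z_piezo * z_load)

        def quarter_wave_thickness(sound_speed_m_per_s, frequency_hz):
            """Layer thickness = wavelength / 4 at the transducer centre frequency (m)."""
            return sound_speed_m_per_s / (4.0 * frequency_hz)

        # Assumed values: PZT ~34 MRayl, tissue ~1.5 MRayl, layer sound speed ~2500 m/s.
        z_match = ideal_matching_impedance(34e6, 1.5e6)     # ~7.1 MRayl
        t_layer = quarter_wave_thickness(2500.0, 12e6)      # ~52 micrometres at ~12 MHz
        print(z_match, t_layer)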

  4. [Propensity score matching in SPSS].

    PubMed

    Huang, Fuqiang; DU, Chunlin; Sun, Menghui; Ning, Bing; Luo, Ying; An, Shengli

    2015-11-01

    To realize propensity score matching in the PS Matching module of SPSS and interpret the analysis results. The R software, the plug-in linking R with the corresponding version of SPSS, and the propensity score matching package were installed. A PS Matching module was added to the SPSS interface, and its use was demonstrated with test data. Score estimation and nearest-neighbor matching were achieved with the PS Matching module, and the results of qualitative and quantitative statistical description and evaluation were presented graphically. Propensity score matching can be accomplished conveniently using SPSS software.
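
    Outside SPSS, the same workflow (propensity score estimation followed by nearest-neighbor matching) can be sketched in a few lines of Python. This is an equivalent illustration under assumed inputs, not the PS Matching module described in the article.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        def propensity_match(X, treated):
            """Greedy 1:1 matching without replacement.

            X: covariate matrix (n_samples, n_features); treated: 0/1 numpy array.
            Returns a list of (treated_index, control_index) pairs.
            """
            scores = LogisticRegression(max_iter=1000).fit(X, treated).predict_proba(X)[:, 1]
            treated_idx = np.where(treated == 1)[0]
            control_idx = list(np.where(treated == 0)[0])
            pairs = []
            for i in treated_idx:
                if not control_idx:
                    break
                j = min(control_idx, key=lambda c: abs(scores[i] - scores[c]))
                pairs.append((i, j))
                control_idx.remove(j)      # match each control at most once
            return pairs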

  5. Feasibility study of patient-specific surgical templates for the fixation of pedicle screws.

    PubMed

    Salako, F; Aubin, C-E; Fortin, C; Labelle, H

    2002-01-01

    Surgery for scoliosis, as well as other posterior spinal surgeries, frequently uses pedicle screws to fix an instrumentation on the spine. Misplacement of a screw can lead to intra- and post-operative complications. The objective of this study is to design patient-specific surgical templates to guide the drilling operation. From the CT scan of a vertebra, the optimal drilling direction and limit angles are computed from an inverse projection of the pedicle limits. The first template design uses a surface-to-surface registration method and was constructed in a CAD system by subtracting the vertebra from a rectangular prism and a cylinder with the optimal orientation. This template and the vertebra were built using rapid prototyping. The second design uses a point-to-surface registration method and has 6 adjustable screws to adjust the orientation and length of the drilling support device. A mechanism was designed to hold it in place on the spinous process. A virtual prototype was built with CATIA software. During the operation, the surgeon places either template on the patient's vertebra until a perfect match is obtained before drilling. The second design seems better than the first because it can be reused on different vertebrae and is less sensitive to registration errors. The next step is to build the second design and perform experimental and simulation tests to evaluate the benefits of this template during a scoliosis operation.

  6. Development of Korean Standard Brain Templates

    PubMed Central

    Lee, Jae Sung; Kim, Jinsu; Kim, Yu Kyeong; Kang, Eunjoo; Kang, Hyejin; Kang, Keon Wook; Lee, Jong Min; Kim, Jae-Jin; Park, Hae-Jeong; Kwon, Jun Soo; Kim, Sun I.; Yoo, Tae Woo; Chang, Kee-Hyun; Lee, Myung Chul

    2005-01-01

    We developed age, gender and ethnic specific brain templates based on MR and Positron-Emission Tomography (PET) images of Korean normal volunteers. Seventy-eight normal right-handed volunteers (M/F=49/29) underwent 3D T1-weighted SPGR MR and F-18-FDG PET scans. For the generation of standard templates, an optimal target brain that has the average global hemispheric shape was selected for each gender. MR images were then spatially normalized by linear transformation to the target brains, and normalization parameters were reapplied to PET images. Subjects were subdivided into 2 groups for each gender: the young/midlife (<55 yr) and the elderly groups. Young and elderly MRI/PET templates were composed by averaging the spatially normalized images. Korean templates showed different shapes and sizes (mean length, width, and height of the brains were 16.5, 14.3 and 12.1 cm for man, and 15.6, 13.5 and 11.4 cm for woman) from the template based on Caucasian (18.3, 14.2, and 13.3 cm). MRI and PET templates developed in this study will provide the framework for more accurate stereotactic standardization and anatomical localization. PMID:15953874
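
    The final template-building step described above reduces, once every subject's volume has been spatially normalized to the chosen target brain, to a voxel-wise average. A minimal sketch follows; the normalization itself is not shown, and the function name is illustrative.

        import numpy as np

        def build_template(normalized_volumes):
            """Voxel-wise mean of equally shaped, already spatially normalized 3-D arrays."""
            stack = np.stack(normalized_volumes, axis=0)
            return stack.mean(axis=0)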

  7. Polymerization of Diacetylene Using β-SHEET as a Template

    NASA Astrophysics Data System (ADS)

    Mori, Takeshi; Fukawa, Yoichi; Shimoyama, Kenji; Minagawa, Keiji; Tanaka, Masami

    In the parallel and anti-parallel β-sheet structures, hydrogen bonding between the amide bonds of the peptide chains arranges them at a distance of ca. 5 Å. That distance matches the repeating-unit distance of polydiacetylene. In this study, the effectiveness of the β-sheet as a template for the polymerization of diacetylene was examined by using diacetylene-introduced oligopeptides. The diacetylene-introduced amino acid (Thr(DA)) was synthesized from L-threonine. Although the peptides Ac-Thr(DA)-NHMe and Ac-[Thr(DA)]2-NHMe formed anti-parallel β-sheets, they showed slight or no polymerization in both the solid and solution states. On the other hand, Ac-[Thr(DA)]5-NHMe and an 11-mer peptide with a Thr(DA) in the center of the sequence contained the anti-parallel β-sheet structure and formed polydiacetylene with a high degree of polymerization and high conversion during cleavage of the peptide from the resin in solution. This result indicates that preorganization of the peptide through β-sheet formation is necessary for polymerization of the diacetylene group. Thus, the β-sheet motif is an effective template for the polymerization of diacetylene.

  8. Cost evaluation of a DSN high level real-time language

    NASA Technical Reports Server (NTRS)

    Mckenzie, M.

    1977-01-01

    The hypothesis that the implementation of a DSN High Level Real Time Language will reduce real time software expenditures is explored. The High Level Real Time Language is found to be both affordable and cost-effective.

  9. High-level waste program progress report, April 1, 1980-June 30, 1980

    SciTech Connect

    1980-08-01

    The highlights of this report are on: waste management analysis for nuclear fuel cycles; fixation of waste in concrete; study of ceramic and cermet waste forms; alternative high-level waste forms development; and high-level waste container development.

  10. Cost evaluation of a DSN high level real-time language

    NASA Technical Reports Server (NTRS)

    Mckenzie, M.

    1977-01-01

    The hypothesis that the implementation of a DSN High Level Real Time Language will reduce real time software expenditures is explored. The High Level Real Time Language is found to be both affordable and cost-effective.

  11. Matched designs and causal diagrams

    PubMed Central

    Mansournia, Mohammad A; Hernán, Miguel A; Greenland, Sander

    2013-01-01

    We use causal diagrams to illustrate the consequences of matching and the appropriate handling of matched variables in cohort and case-control studies. The matching process generally forces certain variables to be independent despite their being connected in the causal diagram, a phenomenon known as unfaithfulness. We show how causal diagrams can be used to visualize many previous results about matched studies. Cohort matching can prevent confounding by the matched variables, but censoring or other missing data and further adjustment may necessitate control of matching variables. Case-control matching generally does not prevent confounding by the matched variables, and control of matching variables may be necessary even if those were not confounders initially. Matching on variables that are affected by the exposure and the outcome, or intermediates between the exposure and the outcome, will ordinarily produce irremediable bias. PMID:23918854

  12. Adding a Zero-Crossing Count to Spectral Information in Template-Based Speech Recognition

    DTIC Science & Technology

    1982-01-01

    incorporation of zero-crossing information into the spectral representation used in a template-matching system (CICADA). An analysis of zero-crossing data for an...procedure to be used. The work described in this paper was done using the CICADA system developed at Carnegie-Mellon University [Alleva 81, Waibel 80]... CICADA uses a representation based on a compression of the short-term spectrum according to a 16 coefficient mel scale. Let us consider the CICADA
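
    The zero-crossing feature itself is simple to compute: count the sign changes of the waveform within each analysis frame. The sketch below is a generic illustration with assumed frame parameters, not the CICADA implementation.

        import numpy as np

        def zero_crossing_counts(signal, frame_len=256, hop=128):
            """Number of sign changes in each frame of a 1-D signal."""
            counts = []
            for start in range(0, len(signal) - frame_len + 1, hop):
                frame = np.asarray(signal[start:start + frame_len], dtype=float)
                signs = np.sign(frame)
                signs[signs == 0] = 1                      # treat exact zeros as positive
                counts.append(int(np.sum(signs[1:] != signs[:-1])))
            return counts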

  13. Pre-operative templating for trauma hemiarthroplasty (Thompson's)

    PubMed Central

    Green, Robert Nicholas; Rushton, Paul R.P.; Kramer, Derek; Inman, Dominic; Partington, Paul F.

    2015-01-01

    Introduction Surgical complications may be avoided by preoperative templating in trauma hemiarthroplasty. Materials and methods Digital templates for the Stryker™ range of Thompson's prostheses were created, and fifty trauma patients who had undergone cemented hemiarthroplasty were retrospectively templated by 2 blinded surgeons. Results Templating for prosthesis size was highly accurate, with excellent inter- and intra-observer reproducibility. Sensitivity for identifying femoral canals too narrow for a Thompson's was 100%. Conclusions Templating is a valuable tool and should be standard practice in trauma. We have demonstrated that it is possible to generate custom templates to allow accurate templating. PMID:26566327

  14. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 8 2010-04-01 2010-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  15. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 8 2014-04-01 2014-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  16. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 8 2013-04-01 2013-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  17. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 8 2012-04-01 2012-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  18. 21 CFR 880.6885 - Liquid chemical sterilants/high level disinfectants.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 8 2011-04-01 2011-04-01 false Liquid chemical sterilants/high level... and Personal Use Miscellaneous Devices § 880.6885 Liquid chemical sterilants/high level disinfectants. (a) Identification. A liquid chemical sterilant/high level disinfectant is a germicide that...

  19. 40 CFR 1065.725 - High-level ethanol-gasoline blends.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false High-level ethanol-gasoline blends... Calibration Standards § 1065.725 High-level ethanol-gasoline blends. For testing vehicles capable of operating on a high-level ethanol-gasoline blend, create a test fuel as follows: (a) Add ethanol to an E10...

  20. Matching controlled vocabulary words.

    PubMed

    Grabar, Natalia; Zweigenbaum, Pierre; Soualmia, Lina; Darmoni, Stéfan

    2003-01-01

    This study examines an enabling condition for natural-language access to medical knowledge resources (Medline, CISMeF) indexed with controlled vocabularies (e.g., the MeSH): is the vocabulary of user queries comparable with that of the index terms? The two vocabularies were compared in their original form, then under incrementally normalized forms, using character-based normalizations and then linguistic normalizations. Only 16.7% of the user vocabulary, in its original form, is in the MeSH. Progressive normalizations increase this proportion to 65.5%. Moreover, if the frequencies of occurrence of words are taken into account, 89.3% of user word occurrences can be matched to MeSH words. This shows the value of considering further matching methods between queries and index terms beyond those presented here.
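
    A minimal sketch of the incremental-normalization idea follows, with a character-level step (lowercasing, accent stripping) and a toy suffix-stripping step standing in for the linguistic normalization. The suffix list and function names are illustrative assumptions, not the procedures used in the study.

        import unicodedata

        def char_normalize(word):
            """Lowercase and remove diacritics (character-based normalization)."""
            decomposed = unicodedata.normalize("NFD", word.lower())
            return "".join(c for c in decomposed if unicodedata.category(c) != "Mn")

        def stem(word):
            """Toy suffix stripping standing in for a real linguistic normalizer."""
            for suffix in ("ies", "es", "s"):
                if word.endswith(suffix) and len(word) > len(suffix) + 2:
                    return word[: -len(suffix)]
            return word

        def match_rate(query_words, vocabulary_words):
            """Fraction of query words found in the vocabulary after normalization."""
            vocab = {stem(char_normalize(w)) for w in vocabulary_words}
            normalized = [stem(char_normalize(w)) for w in query_words]
            return sum(w in vocab for w in normalized) / len(normalized) if normalized else 0.0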